Oct 07 12:24:42 crc systemd[1]: Starting Kubernetes Kubelet...
Oct 07 12:24:42 crc restorecon[4725]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Oct 07 12:24:42 crc restorecon[4725]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 07 12:24:42 crc restorecon[4725]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 07 12:24:42 crc restorecon[4725]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 07 12:24:42 crc restorecon[4725]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 07 12:24:42 crc restorecon[4725]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 07 12:24:42 crc restorecon[4725]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 07 12:24:42 crc restorecon[4725]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 07 12:24:42 crc restorecon[4725]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 07 12:24:42 crc restorecon[4725]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 07 12:24:42 crc restorecon[4725]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 07 12:24:42 crc restorecon[4725]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 07 12:24:42 crc restorecon[4725]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 07 12:24:42 crc restorecon[4725]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 07 12:24:42 crc restorecon[4725]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 07 12:24:42 crc restorecon[4725]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 07 12:24:42 crc restorecon[4725]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 07 12:24:42 crc restorecon[4725]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 07 12:24:42 crc restorecon[4725]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 12:24:42 crc restorecon[4725]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 12:24:42 crc restorecon[4725]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 12:24:42 crc restorecon[4725]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 12:24:42 crc restorecon[4725]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 
12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 12:24:43 crc 
restorecon[4725]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 
12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 
12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc 
restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 12:24:43 crc restorecon[4725]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 12:24:43 crc restorecon[4725]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 12:24:43 crc restorecon[4725]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Oct 07 12:24:44 crc kubenswrapper[4854]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 07 12:24:44 crc kubenswrapper[4854]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Oct 07 12:24:44 crc kubenswrapper[4854]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 07 12:24:44 crc kubenswrapper[4854]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Oct 07 12:24:44 crc kubenswrapper[4854]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Oct 07 12:24:44 crc kubenswrapper[4854]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.446032 4854 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450041 4854 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450062 4854 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450067 4854 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450072 4854 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450076 4854 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450079 4854 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450083 4854 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450087 4854 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450091 4854 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450095 4854 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450100 4854 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
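[Editor's note] The deprecation warnings above direct operators to move these settings into the kubelet config file passed via --config (see the linked kubelet-config-file documentation) instead of command-line flags. Below is a minimal sketch of what such a drop-in could look like, assuming the upstream kubelet.config.k8s.io/v1beta1 KubeletConfiguration schema; every concrete value (CRI socket path, volume plugin dir, taint, reservations, eviction threshold) is an illustrative placeholder and is not taken from this node's actual configuration. --pod-infra-container-image has no config-file equivalent here, since the log notes the image garbage collector now gets the sandbox image from the CRI.

# hypothetical kubelet config file referenced by --config (values are placeholders)
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint (CRI-O socket path is an assumption)
containerRuntimeEndpoint: "unix:///var/run/crio/crio.sock"
# replaces --volume-plugin-dir
volumePluginDir: "/etc/kubernetes/kubelet-plugins/volume/exec"
# replaces --register-with-taints (example taint only)
registerWithTaints:
  - key: "node-role.kubernetes.io/master"
    effect: "NoSchedule"
# replaces --system-reserved (example reservations only)
systemReserved:
  cpu: "500m"
  memory: "1Gi"
# --minimum-container-ttl-duration is deprecated in favor of eviction settings
evictionHard:
  memory.available: "100Mi"
# feature gates, such as the deprecated KMSv1 gate logged above, are also set here
featureGates:
  KMSv1: true

[End editor's note]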
Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450106 4854 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450110 4854 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450115 4854 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450122 4854 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450126 4854 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450131 4854 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450136 4854 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450141 4854 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450167 4854 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450172 4854 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450177 4854 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450181 4854 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450185 4854 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450189 4854 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450193 4854 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450197 4854 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450201 4854 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450205 4854 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450209 4854 feature_gate.go:330] unrecognized feature gate: Example Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450213 4854 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450217 4854 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450223 4854 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
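[editor's note] The "Flag --... has been deprecated" notices above are command-line options this kubelet expects in the --config file instead, and the long runs of feature_gate.go:330 warnings appear to be cluster-level (OpenShift) feature gates that this kubelet's own gate registry does not know; judging from the log they are emitted as warnings only and startup continues. When auditing a boot like this it can help to collapse the repeated warnings into a unique list. A minimal sketch, assuming the journal output shown here has been saved to a plain-text file (the path is a placeholder, not taken from the log):

#!/usr/bin/env python3
"""Summarize kubelet startup warnings from a saved journal dump (sketch)."""
import re
from collections import Counter

LOG_PATH = "kubelet-journal.log"  # hypothetical file holding the journal text above

deprecated_flag = re.compile(r"Flag (--[\w-]+) has been deprecated")
unknown_gate = re.compile(r"unrecognized feature gate: (\w+)")

flags, gates = Counter(), Counter()
with open(LOG_PATH, encoding="utf-8") as fh:
    for line in fh:
        for m in deprecated_flag.finditer(line):
            flags[m.group(1)] += 1
        for m in unknown_gate.finditer(line):
            gates[m.group(1)] += 1

print("deprecated flags still passed on the command line:")
for name, count in sorted(flags.items()):
    print(f"  {name} (warned {count}x)")

print(f"\n{len(gates)} distinct unrecognized feature gates, first few:")
for name, count in sorted(gates.items())[:10]:
    print(f"  {name} (warned {count}x)")

(Gate names that the extraction wrapped onto a following line will be missed by this line-by-line scan; it is a rough tally, not an exact one.)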
Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450228 4854 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450232 4854 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450237 4854 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450241 4854 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450245 4854 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450249 4854 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450252 4854 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450256 4854 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450260 4854 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450264 4854 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450268 4854 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450271 4854 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450275 4854 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450278 4854 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450282 4854 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450286 4854 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450290 4854 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450294 4854 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450299 4854 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450302 4854 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450306 4854 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450309 4854 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450313 4854 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450316 4854 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450321 4854 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450324 4854 feature_gate.go:330] unrecognized feature gate: 
AzureWorkloadIdentity Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450327 4854 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450331 4854 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450334 4854 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450338 4854 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450341 4854 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450345 4854 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450348 4854 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450352 4854 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450355 4854 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450358 4854 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450361 4854 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.450365 4854 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450451 4854 flags.go:64] FLAG: --address="0.0.0.0" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450461 4854 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450469 4854 flags.go:64] FLAG: --anonymous-auth="true" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450474 4854 flags.go:64] FLAG: --application-metrics-count-limit="100" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450479 4854 flags.go:64] FLAG: --authentication-token-webhook="false" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450483 4854 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450489 4854 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450495 4854 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450500 4854 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450504 4854 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450510 4854 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450515 4854 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450521 4854 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450526 4854 flags.go:64] FLAG: --cgroup-root="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450530 4854 flags.go:64] FLAG: --cgroups-per-qos="true" Oct 07 12:24:44 
crc kubenswrapper[4854]: I1007 12:24:44.450534 4854 flags.go:64] FLAG: --client-ca-file="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450539 4854 flags.go:64] FLAG: --cloud-config="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450543 4854 flags.go:64] FLAG: --cloud-provider="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450547 4854 flags.go:64] FLAG: --cluster-dns="[]" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450554 4854 flags.go:64] FLAG: --cluster-domain="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450558 4854 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450562 4854 flags.go:64] FLAG: --config-dir="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450566 4854 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450570 4854 flags.go:64] FLAG: --container-log-max-files="5" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450576 4854 flags.go:64] FLAG: --container-log-max-size="10Mi" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450580 4854 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450584 4854 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450589 4854 flags.go:64] FLAG: --containerd-namespace="k8s.io" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450593 4854 flags.go:64] FLAG: --contention-profiling="false" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450597 4854 flags.go:64] FLAG: --cpu-cfs-quota="true" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450601 4854 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450606 4854 flags.go:64] FLAG: --cpu-manager-policy="none" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450610 4854 flags.go:64] FLAG: --cpu-manager-policy-options="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450616 4854 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450620 4854 flags.go:64] FLAG: --enable-controller-attach-detach="true" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450624 4854 flags.go:64] FLAG: --enable-debugging-handlers="true" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450628 4854 flags.go:64] FLAG: --enable-load-reader="false" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450633 4854 flags.go:64] FLAG: --enable-server="true" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450636 4854 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450642 4854 flags.go:64] FLAG: --event-burst="100" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450647 4854 flags.go:64] FLAG: --event-qps="50" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450654 4854 flags.go:64] FLAG: --event-storage-age-limit="default=0" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450660 4854 flags.go:64] FLAG: --event-storage-event-limit="default=0" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450666 4854 flags.go:64] FLAG: --eviction-hard="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450673 4854 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Oct 07 12:24:44 crc kubenswrapper[4854]: 
I1007 12:24:44.450678 4854 flags.go:64] FLAG: --eviction-minimum-reclaim="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450683 4854 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450687 4854 flags.go:64] FLAG: --eviction-soft="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450691 4854 flags.go:64] FLAG: --eviction-soft-grace-period="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450695 4854 flags.go:64] FLAG: --exit-on-lock-contention="false" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450700 4854 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450704 4854 flags.go:64] FLAG: --experimental-mounter-path="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450710 4854 flags.go:64] FLAG: --fail-cgroupv1="false" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450714 4854 flags.go:64] FLAG: --fail-swap-on="true" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450718 4854 flags.go:64] FLAG: --feature-gates="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450732 4854 flags.go:64] FLAG: --file-check-frequency="20s" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450737 4854 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450741 4854 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450746 4854 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450751 4854 flags.go:64] FLAG: --healthz-port="10248" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450756 4854 flags.go:64] FLAG: --help="false" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450760 4854 flags.go:64] FLAG: --hostname-override="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450765 4854 flags.go:64] FLAG: --housekeeping-interval="10s" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450770 4854 flags.go:64] FLAG: --http-check-frequency="20s" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450775 4854 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450779 4854 flags.go:64] FLAG: --image-credential-provider-config="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450783 4854 flags.go:64] FLAG: --image-gc-high-threshold="85" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450787 4854 flags.go:64] FLAG: --image-gc-low-threshold="80" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450792 4854 flags.go:64] FLAG: --image-service-endpoint="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450796 4854 flags.go:64] FLAG: --kernel-memcg-notification="false" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450800 4854 flags.go:64] FLAG: --kube-api-burst="100" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450805 4854 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450811 4854 flags.go:64] FLAG: --kube-api-qps="50" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450832 4854 flags.go:64] FLAG: --kube-reserved="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450838 4854 flags.go:64] FLAG: --kube-reserved-cgroup="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450842 4854 
flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450847 4854 flags.go:64] FLAG: --kubelet-cgroups="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450851 4854 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450856 4854 flags.go:64] FLAG: --lock-file="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450862 4854 flags.go:64] FLAG: --log-cadvisor-usage="false" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450867 4854 flags.go:64] FLAG: --log-flush-frequency="5s" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450871 4854 flags.go:64] FLAG: --log-json-info-buffer-size="0" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450878 4854 flags.go:64] FLAG: --log-json-split-stream="false" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450883 4854 flags.go:64] FLAG: --log-text-info-buffer-size="0" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450887 4854 flags.go:64] FLAG: --log-text-split-stream="false" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450892 4854 flags.go:64] FLAG: --logging-format="text" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450896 4854 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450901 4854 flags.go:64] FLAG: --make-iptables-util-chains="true" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450906 4854 flags.go:64] FLAG: --manifest-url="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450910 4854 flags.go:64] FLAG: --manifest-url-header="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450917 4854 flags.go:64] FLAG: --max-housekeeping-interval="15s" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450922 4854 flags.go:64] FLAG: --max-open-files="1000000" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450929 4854 flags.go:64] FLAG: --max-pods="110" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450934 4854 flags.go:64] FLAG: --maximum-dead-containers="-1" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450940 4854 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450945 4854 flags.go:64] FLAG: --memory-manager-policy="None" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450951 4854 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450957 4854 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450962 4854 flags.go:64] FLAG: --node-ip="192.168.126.11" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450968 4854 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450979 4854 flags.go:64] FLAG: --node-status-max-images="50" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450984 4854 flags.go:64] FLAG: --node-status-update-frequency="10s" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450988 4854 flags.go:64] FLAG: --oom-score-adj="-999" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450993 4854 flags.go:64] FLAG: --pod-cidr="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.450997 4854 flags.go:64] FLAG: 
--pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.451006 4854 flags.go:64] FLAG: --pod-manifest-path="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.451011 4854 flags.go:64] FLAG: --pod-max-pids="-1" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.451015 4854 flags.go:64] FLAG: --pods-per-core="0" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.451021 4854 flags.go:64] FLAG: --port="10250" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.451025 4854 flags.go:64] FLAG: --protect-kernel-defaults="false" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.451030 4854 flags.go:64] FLAG: --provider-id="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.451034 4854 flags.go:64] FLAG: --qos-reserved="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.451038 4854 flags.go:64] FLAG: --read-only-port="10255" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.451043 4854 flags.go:64] FLAG: --register-node="true" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.451048 4854 flags.go:64] FLAG: --register-schedulable="true" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.451053 4854 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.451060 4854 flags.go:64] FLAG: --registry-burst="10" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.451064 4854 flags.go:64] FLAG: --registry-qps="5" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.451068 4854 flags.go:64] FLAG: --reserved-cpus="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.451072 4854 flags.go:64] FLAG: --reserved-memory="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.451077 4854 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.451081 4854 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.451086 4854 flags.go:64] FLAG: --rotate-certificates="false" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.451090 4854 flags.go:64] FLAG: --rotate-server-certificates="false" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.451095 4854 flags.go:64] FLAG: --runonce="false" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.451100 4854 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.451105 4854 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.451110 4854 flags.go:64] FLAG: --seccomp-default="false" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.451115 4854 flags.go:64] FLAG: --serialize-image-pulls="true" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.451120 4854 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.451125 4854 flags.go:64] FLAG: --storage-driver-db="cadvisor" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.451130 4854 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.451162 4854 flags.go:64] FLAG: --storage-driver-password="root" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.451168 4854 flags.go:64] FLAG: --storage-driver-secure="false" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 
12:24:44.451173 4854 flags.go:64] FLAG: --storage-driver-table="stats" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.451178 4854 flags.go:64] FLAG: --storage-driver-user="root" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.451184 4854 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.451189 4854 flags.go:64] FLAG: --sync-frequency="1m0s" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.451197 4854 flags.go:64] FLAG: --system-cgroups="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.451202 4854 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.451211 4854 flags.go:64] FLAG: --system-reserved-cgroup="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.451216 4854 flags.go:64] FLAG: --tls-cert-file="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.451220 4854 flags.go:64] FLAG: --tls-cipher-suites="[]" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.451228 4854 flags.go:64] FLAG: --tls-min-version="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.451231 4854 flags.go:64] FLAG: --tls-private-key-file="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.451236 4854 flags.go:64] FLAG: --topology-manager-policy="none" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.451240 4854 flags.go:64] FLAG: --topology-manager-policy-options="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.451244 4854 flags.go:64] FLAG: --topology-manager-scope="container" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.451249 4854 flags.go:64] FLAG: --v="2" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.451254 4854 flags.go:64] FLAG: --version="false" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.451260 4854 flags.go:64] FLAG: --vmodule="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.451265 4854 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.451270 4854 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451378 4854 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451383 4854 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451387 4854 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451391 4854 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451396 4854 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
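[editor's note] The flags.go:64] FLAG: --name="value" block above is the kubelet echoing every command-line flag with its effective value, defaults included, which is useful for checking what actually reached the process versus what lives in /etc/kubernetes/kubelet.conf. A small sketch that turns that dump into a dictionary, assuming the same saved journal file as before:

import re

FLAG_RE = re.compile(r'flags\.go:\d+\] FLAG: (--[\w-]+)="(.*?)"')

def parse_flag_dump(text: str) -> dict[str, str]:
    """Collect the kubelet's FLAG: --name="value" echo lines into a dict."""
    return {m.group(1): m.group(2) for m in FLAG_RE.finditer(text)}

with open("kubelet-journal.log", encoding="utf-8") as fh:  # hypothetical path
    flags = parse_flag_dump(fh.read())

# A few values visible in the dump above:
print(flags.get("--config"))                      # /etc/kubernetes/kubelet.conf
print(flags.get("--container-runtime-endpoint"))  # /var/run/crio/crio.sock
print(flags.get("--node-ip"))                     # 192.168.126.11

Entries whose name and value were split across wrapped lines in this dump will be skipped by the regex; on an unwrapped journal each FLAG entry sits on one line.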
Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451401 4854 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451405 4854 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451409 4854 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451412 4854 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451416 4854 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451420 4854 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451423 4854 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451427 4854 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451430 4854 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451434 4854 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451437 4854 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451441 4854 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451446 4854 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451449 4854 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451453 4854 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451457 4854 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451461 4854 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451464 4854 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451468 4854 feature_gate.go:330] unrecognized feature gate: Example Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451471 4854 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451475 4854 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451478 4854 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451481 4854 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451485 4854 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451488 4854 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451492 4854 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 07 12:24:44 crc 
kubenswrapper[4854]: W1007 12:24:44.451496 4854 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451499 4854 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451502 4854 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451507 4854 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451510 4854 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451518 4854 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451521 4854 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451525 4854 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451528 4854 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451532 4854 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451535 4854 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451539 4854 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451542 4854 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451545 4854 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451549 4854 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451553 4854 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451556 4854 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451560 4854 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451563 4854 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451567 4854 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451571 4854 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451574 4854 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451579 4854 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451583 4854 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451587 4854 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451592 4854 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451595 4854 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451600 4854 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451604 4854 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451608 4854 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451612 4854 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451615 4854 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451619 4854 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451622 4854 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451625 4854 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451629 4854 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451632 4854 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451640 4854 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451645 4854 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.451656 4854 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.451673 4854 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.460687 4854 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.460726 4854 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.460789 4854 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.460796 4854 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.460800 4854 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.460805 4854 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.460810 4854 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.460814 4854 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.460818 4854 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.460821 4854 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.460825 4854 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.460829 4854 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.460833 4854 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.460837 4854 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.460842 4854 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.460846 4854 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.460851 4854 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.460855 4854 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.460860 4854 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.460866 4854 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.460870 4854 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 07 
12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.460875 4854 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.460879 4854 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.460884 4854 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.460888 4854 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.460892 4854 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.460896 4854 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.460899 4854 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.460903 4854 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.460906 4854 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.460910 4854 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.460913 4854 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.460920 4854 feature_gate.go:330] unrecognized feature gate: Example Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.460924 4854 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.460929 4854 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.460933 4854 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.460938 4854 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.460942 4854 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.460945 4854 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.460949 4854 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.460953 4854 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.460959 4854 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.460964 4854 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.460969 4854 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.460973 4854 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.460977 4854 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.460981 4854 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.460984 4854 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.460988 4854 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.460991 4854 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.460996 4854 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.460999 4854 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.461003 4854 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.461006 4854 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.461010 4854 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.461013 4854 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.461017 4854 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.461020 4854 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.461023 4854 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.461027 4854 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.461030 4854 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.461034 4854 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.461037 4854 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.461044 4854 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.461050 4854 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.461054 4854 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.461058 4854 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.461063 4854 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.461067 4854 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.461074 4854 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.461078 4854 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.461082 4854 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.461100 4854 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.461107 4854 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.461970 4854 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.461983 4854 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.461988 4854 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.461992 4854 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.461997 4854 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.462001 4854 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.462005 4854 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.462009 4854 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.462013 4854 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.462018 4854 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.462022 4854 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.462026 4854 feature_gate.go:330] unrecognized 
feature gate: MinimumKubeletVersion Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.462029 4854 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.462033 4854 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.462038 4854 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.462044 4854 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.462048 4854 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.462051 4854 feature_gate.go:330] unrecognized feature gate: Example Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.462056 4854 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.462061 4854 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.462065 4854 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.462070 4854 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.462074 4854 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.462078 4854 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.462082 4854 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.462086 4854 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.462090 4854 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.462094 4854 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.462098 4854 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.462101 4854 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.462105 4854 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.462109 4854 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.462113 4854 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.462116 4854 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.462120 4854 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.462124 4854 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.462128 4854 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 07 
12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.462131 4854 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.462134 4854 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.462138 4854 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.462155 4854 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.462159 4854 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.462163 4854 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.462166 4854 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.462170 4854 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.462173 4854 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.462177 4854 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.462180 4854 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.462183 4854 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.462187 4854 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.462191 4854 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.462195 4854 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.462199 4854 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.462204 4854 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.462208 4854 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.462213 4854 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.462217 4854 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.462221 4854 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.462225 4854 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.462229 4854 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.462233 4854 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.462237 4854 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.462242 4854 feature_gate.go:353] Setting GA feature gate 
ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.462247 4854 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.462251 4854 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.462255 4854 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.462259 4854 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.462263 4854 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.462267 4854 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.462270 4854 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.462276 4854 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.462281 4854 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.463344 4854 server.go:940] "Client rotation is on, will bootstrap in background" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.467017 4854 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.467118 4854 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
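[editor's note] The three feature_gate.go:386] "feature gates: {map[...]}" entries above print the same effective set each time: only gates the kubelet recognizes survive, with CloudDualStackNodeIPs, DisableKubeletCloudCredentialProviders, KMSv1 and ValidatingAdmissionPolicy ending up true and the rest false. A sketch for turning that Go-map-style dump into a Python dict, using an abridged copy of one such entry with the same syntax the kubelet prints:

import re

dump = ("feature gates: {map[CloudDualStackNodeIPs:true "
        "DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false "
        "EventedPLEG:false KMSv1:true ValidatingAdmissionPolicy:true "
        "VolumeAttributesClass:false]}")  # abridged from the log entries above

def parse_feature_gates(line: str) -> dict[str, bool]:
    """Parse the kubelet's Go-style 'feature gates: {map[...]}' log entry."""
    body = re.search(r"feature gates: \{map\[(.*?)\]\}", line).group(1)
    return {name: value == "true"
            for name, value in (pair.split(":") for pair in body.split())}

gates = parse_feature_gates(dump)
assert gates["KMSv1"] and not gates["EventedPLEG"]
print(sorted(name for name, enabled in gates.items() if enabled))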
Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.469093 4854 server.go:997] "Starting client certificate rotation" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.469116 4854 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.469460 4854 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-27 14:23:50.261724411 +0000 UTC Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.469632 4854 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 1945h59m5.792098356s for next certificate rotation Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.495909 4854 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.499704 4854 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.517325 4854 log.go:25] "Validated CRI v1 runtime API" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.556735 4854 log.go:25] "Validated CRI v1 image API" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.559756 4854 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.570190 4854 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-10-07-12-19-55-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.570270 4854 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.603746 4854 manager.go:217] Machine: {Timestamp:2025-10-07 12:24:44.600227665 +0000 UTC m=+0.588060000 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:9adad3a0-6916-40db-9f9d-6808f74fc783 BootID:5ce82c01-a615-4296-adac-17bc71cde602 Filesystems:[{Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 
DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:11:15:b2 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:11:15:b2 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:af:62:06 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:43:7e:f4 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:b8:91:8b Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:97:fe:62 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:77:a2:b6 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:06:a1:60:06:ca:4a Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:da:f9:ed:48:92:c0 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] 
UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.604132 4854 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.604395 4854 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.604968 4854 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.605303 4854 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.605365 4854 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.605761 4854 topology_manager.go:138] "Creating topology manager with none policy" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.605781 4854 container_manager_linux.go:303] "Creating device plugin manager" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.606533 4854 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.606601 4854 server.go:66] "Creating device plugin registration server" 
version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.606958 4854 state_mem.go:36] "Initialized new in-memory state store" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.607142 4854 server.go:1245] "Using root directory" path="/var/lib/kubelet" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.611427 4854 kubelet.go:418] "Attempting to sync node with API server" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.611616 4854 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.611672 4854 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.611699 4854 kubelet.go:324] "Adding apiserver pod source" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.611732 4854 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.617495 4854 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.618966 4854 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.619903 4854 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.243:6443: connect: connection refused Oct 07 12:24:44 crc kubenswrapper[4854]: E1007 12:24:44.620098 4854 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.243:6443: connect: connection refused" logger="UnhandledError" Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.620020 4854 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.243:6443: connect: connection refused Oct 07 12:24:44 crc kubenswrapper[4854]: E1007 12:24:44.620208 4854 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.243:6443: connect: connection refused" logger="UnhandledError" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.621227 4854 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.622602 4854 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.622637 4854 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.622645 4854 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.622658 4854 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/host-path" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.622670 4854 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.622677 4854 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.622685 4854 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.622697 4854 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.622705 4854 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.622713 4854 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.622741 4854 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.622753 4854 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.623309 4854 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.623761 4854 server.go:1280] "Started kubelet" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.624050 4854 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.624864 4854 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.624497 4854 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Oct 07 12:24:44 crc systemd[1]: Started Kubernetes Kubelet. 
Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.626265 4854 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.243:6443: connect: connection refused Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.626817 4854 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.626864 4854 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.626979 4854 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 08:28:38.66064772 +0000 UTC Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.630644 4854 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 1916h3m54.030020337s for next certificate rotation Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.633643 4854 volume_manager.go:287] "The desired_state_of_world populator starts" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.633675 4854 volume_manager.go:289] "Starting Kubelet Volume Manager" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.633968 4854 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Oct 07 12:24:44 crc kubenswrapper[4854]: E1007 12:24:44.634267 4854 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.638762 4854 server.go:460] "Adding debug handlers to kubelet server" Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.640130 4854 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.243:6443: connect: connection refused Oct 07 12:24:44 crc kubenswrapper[4854]: E1007 12:24:44.640306 4854 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.243:6443: connect: connection refused" logger="UnhandledError" Oct 07 12:24:44 crc kubenswrapper[4854]: E1007 12:24:44.639787 4854 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.243:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186c350d108fd3cb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-07 12:24:44.623729611 +0000 UTC m=+0.611561856,LastTimestamp:2025-10-07 12:24:44.623729611 +0000 UTC m=+0.611561856,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.641853 4854 factory.go:153] Registering CRI-O factory Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.641923 4854 factory.go:221] Registration of the crio container factory 
successfully Oct 07 12:24:44 crc kubenswrapper[4854]: E1007 12:24:44.641869 4854 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.243:6443: connect: connection refused" interval="200ms" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.642102 4854 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.642126 4854 factory.go:55] Registering systemd factory Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.642195 4854 factory.go:221] Registration of the systemd container factory successfully Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.642248 4854 factory.go:103] Registering Raw factory Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.642284 4854 manager.go:1196] Started watching for new ooms in manager Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.643668 4854 manager.go:319] Starting recovery of all containers Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.653705 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.653836 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.653861 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.656953 4854 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.657017 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.657044 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.657068 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.657096 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.657119 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.657226 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.657251 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.657280 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.657299 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.657318 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.657344 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.657362 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.657385 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.657405 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.657458 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.657478 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.657499 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.657520 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.657537 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.657553 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.657591 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.657612 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.657631 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.657654 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.657677 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.657694 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.657712 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.657732 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.657753 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.657772 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.657796 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.657817 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.657836 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.657860 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.657885 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.657906 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.657926 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.657944 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.657968 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.657987 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.658006 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.658025 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.658045 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.658065 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.658086 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.658104 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.658123 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.658227 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.658250 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.658280 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.658302 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.658324 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.658345 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.658365 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.658385 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.658406 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.658425 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.658446 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" 
volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.658468 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.658488 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.658509 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.658574 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.658594 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.658615 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.658633 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.658660 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.658677 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.658694 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.658714 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.658733 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.658754 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.658773 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.658789 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.658810 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.658833 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.658878 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.658898 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.658916 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.659118 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.659139 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.659180 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.659198 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.659216 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.659235 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.659256 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.659276 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.659294 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.659338 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.659359 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.659379 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.659399 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.659420 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.659439 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.659460 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.659481 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.659500 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.659517 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.659542 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.659561 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.659582 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.659603 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.659639 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.659662 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.659685 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.659704 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.659725 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.659747 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.659770 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.659794 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.659815 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.659836 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.659854 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.659875 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.659897 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.659916 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.659937 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.660006 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.660024 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.660043 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.660062 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.660078 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.660098 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.660119 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.660138 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.660181 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.660202 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.660218 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.660236 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.660253 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.660274 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.660292 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.660310 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.660327 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.660348 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.660365 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.660384 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.660402 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.660421 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.660442 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.660460 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.660480 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.660503 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.660521 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.660542 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.660560 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.660578 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" 
volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.660598 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.660617 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.660637 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.660656 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.660675 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.660694 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.660714 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.660732 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.660751 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.660772 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.660798 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" 
volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.660818 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.660837 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.660857 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.660877 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.660900 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.660919 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.660937 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.660957 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.661181 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.661200 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.661221 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.661242 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.661263 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.661283 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.661301 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.661322 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.661341 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.661360 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.661380 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.661398 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.661417 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.661439 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" 
volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.661459 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.661479 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.661498 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.661517 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.661535 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.661566 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.661587 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.661607 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.661628 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.661648 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.661668 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.661686 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.661705 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.661724 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.661743 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.661760 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.661779 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.661799 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.661819 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.661839 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.661859 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.661877 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.661895 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.661913 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.661933 4854 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.661953 4854 reconstruct.go:97] "Volume reconstruction finished" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.661967 4854 reconciler.go:26] "Reconciler: start to sync state" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.677407 4854 manager.go:324] Recovery completed Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.689338 4854 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.691861 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.691930 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.691943 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.695654 4854 cpu_manager.go:225] "Starting CPU manager" policy="none" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.695678 4854 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.695711 4854 state_mem.go:36] "Initialized new in-memory state store" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.698425 4854 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.700776 4854 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.700879 4854 status_manager.go:217] "Starting to sync pod status with apiserver" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.701340 4854 kubelet.go:2335] "Starting kubelet main sync loop" Oct 07 12:24:44 crc kubenswrapper[4854]: E1007 12:24:44.701720 4854 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Oct 07 12:24:44 crc kubenswrapper[4854]: W1007 12:24:44.702423 4854 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.243:6443: connect: connection refused Oct 07 12:24:44 crc kubenswrapper[4854]: E1007 12:24:44.702524 4854 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.243:6443: connect: connection refused" logger="UnhandledError" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.716208 4854 policy_none.go:49] "None policy: Start" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.717536 4854 memory_manager.go:170] "Starting memorymanager" policy="None" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.717607 4854 state_mem.go:35] "Initializing new in-memory state store" Oct 07 12:24:44 crc kubenswrapper[4854]: E1007 12:24:44.734379 4854 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.798559 4854 manager.go:334] "Starting Device Plugin manager" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.798634 4854 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.798656 4854 server.go:79] "Starting device plugin registration server" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.799434 4854 eviction_manager.go:189] "Eviction manager: starting control loop" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.799458 4854 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.799727 4854 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.799979 4854 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.800003 4854 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.802916 4854 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc"] Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.803054 4854 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.805937 4854 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.805979 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.805997 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.806218 4854 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.806381 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.806446 4854 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.807319 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.807377 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.807398 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.807612 4854 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.808490 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.808543 4854 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.809156 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.809191 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.809204 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.809800 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.809823 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.809832 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.809989 4854 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.810199 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.810262 4854 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 12:24:44 crc kubenswrapper[4854]: E1007 12:24:44.810400 4854 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.810554 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.810582 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.810596 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.810792 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.810813 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.810823 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.811002 4854 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.811135 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.811223 4854 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.811398 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.811446 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.811466 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.811911 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.811970 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.811990 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.812285 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.812340 4854 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.813417 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.813446 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.813460 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.813675 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.813723 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.813744 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 12:24:44 crc kubenswrapper[4854]: E1007 12:24:44.843236 4854 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.243:6443: connect: connection refused" interval="400ms" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.864373 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.864436 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.864464 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.864489 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.864513 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.864533 4854 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.864575 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.864602 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.864656 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.864690 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.864714 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.864733 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.864761 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.864789 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.864871 4854 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.900785 4854 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.903050 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.903086 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.903100 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.903124 4854 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 07 12:24:44 crc kubenswrapper[4854]: E1007 12:24:44.903674 4854 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.243:6443: connect: connection refused" node="crc" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.966313 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.966420 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.966465 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.966513 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.966558 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.966601 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.966640 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.966688 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.966751 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.966798 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.966841 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.966881 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.966920 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.966963 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.967007 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.967835 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.967985 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.968034 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.968059 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.968109 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.968107 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.968188 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.968072 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.968250 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.968264 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.968270 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.968186 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.968327 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.968328 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 12:24:44 crc kubenswrapper[4854]: I1007 12:24:44.968482 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 12:24:45 crc kubenswrapper[4854]: I1007 12:24:45.104708 4854 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 12:24:45 crc kubenswrapper[4854]: I1007 12:24:45.110951 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 12:24:45 crc kubenswrapper[4854]: I1007 12:24:45.111057 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 12:24:45 crc kubenswrapper[4854]: I1007 12:24:45.111088 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 12:24:45 crc kubenswrapper[4854]: I1007 12:24:45.111136 4854 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 07 12:24:45 crc kubenswrapper[4854]: E1007 12:24:45.112892 4854 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.243:6443: connect: connection refused" node="crc" Oct 07 12:24:45 crc kubenswrapper[4854]: I1007 12:24:45.170699 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 12:24:45 crc kubenswrapper[4854]: I1007 12:24:45.180483 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 12:24:45 crc kubenswrapper[4854]: I1007 12:24:45.195761 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 07 12:24:45 crc kubenswrapper[4854]: I1007 12:24:45.205301 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 07 12:24:45 crc kubenswrapper[4854]: I1007 12:24:45.210406 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 12:24:45 crc kubenswrapper[4854]: W1007 12:24:45.217178 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-4512c5dc9bb03e8469f4e862e0af66cff2f68a91b86703c9c0797d3887e5293e WatchSource:0}: Error finding container 4512c5dc9bb03e8469f4e862e0af66cff2f68a91b86703c9c0797d3887e5293e: Status 404 returned error can't find the container with id 4512c5dc9bb03e8469f4e862e0af66cff2f68a91b86703c9c0797d3887e5293e Oct 07 12:24:45 crc kubenswrapper[4854]: W1007 12:24:45.218062 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-f83aab3b318af4c3cdbaa62403d452edef45ba2ccec9279ef470e863b459b05f WatchSource:0}: Error finding container f83aab3b318af4c3cdbaa62403d452edef45ba2ccec9279ef470e863b459b05f: Status 404 returned error can't find the container with id f83aab3b318af4c3cdbaa62403d452edef45ba2ccec9279ef470e863b459b05f Oct 07 12:24:45 crc kubenswrapper[4854]: W1007 12:24:45.224631 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-ccdd43a409b2f6415a271885ff7150800e2cf3243039d592c994b1f11624b650 WatchSource:0}: Error finding container ccdd43a409b2f6415a271885ff7150800e2cf3243039d592c994b1f11624b650: Status 404 returned error can't find the container with id ccdd43a409b2f6415a271885ff7150800e2cf3243039d592c994b1f11624b650 Oct 07 12:24:45 crc kubenswrapper[4854]: W1007 12:24:45.227941 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-8a12c4906e014cc92684b022b9057a155137d6e647d71bb47b87b2c429feefc8 WatchSource:0}: Error finding container 8a12c4906e014cc92684b022b9057a155137d6e647d71bb47b87b2c429feefc8: Status 404 returned error can't find the container with id 8a12c4906e014cc92684b022b9057a155137d6e647d71bb47b87b2c429feefc8 Oct 07 12:24:45 crc kubenswrapper[4854]: E1007 12:24:45.244390 4854 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.243:6443: connect: connection refused" interval="800ms" Oct 07 12:24:45 crc kubenswrapper[4854]: I1007 12:24:45.514042 4854 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 12:24:45 crc kubenswrapper[4854]: I1007 12:24:45.515700 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 12:24:45 crc kubenswrapper[4854]: I1007 12:24:45.515774 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 12:24:45 crc kubenswrapper[4854]: I1007 12:24:45.515795 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 12:24:45 crc kubenswrapper[4854]: I1007 12:24:45.515841 4854 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 07 12:24:45 crc 
kubenswrapper[4854]: E1007 12:24:45.516544 4854 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.243:6443: connect: connection refused" node="crc" Oct 07 12:24:45 crc kubenswrapper[4854]: W1007 12:24:45.527088 4854 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.243:6443: connect: connection refused Oct 07 12:24:45 crc kubenswrapper[4854]: E1007 12:24:45.527187 4854 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.243:6443: connect: connection refused" logger="UnhandledError" Oct 07 12:24:45 crc kubenswrapper[4854]: I1007 12:24:45.627196 4854 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.243:6443: connect: connection refused Oct 07 12:24:45 crc kubenswrapper[4854]: I1007 12:24:45.706449 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f101a0fc7dc53359e93b3ef9f1082f677ddc5278fd84805e6bb512676955cb5e"} Oct 07 12:24:45 crc kubenswrapper[4854]: I1007 12:24:45.708201 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"8a12c4906e014cc92684b022b9057a155137d6e647d71bb47b87b2c429feefc8"} Oct 07 12:24:45 crc kubenswrapper[4854]: I1007 12:24:45.709668 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"ccdd43a409b2f6415a271885ff7150800e2cf3243039d592c994b1f11624b650"} Oct 07 12:24:45 crc kubenswrapper[4854]: I1007 12:24:45.710755 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"4512c5dc9bb03e8469f4e862e0af66cff2f68a91b86703c9c0797d3887e5293e"} Oct 07 12:24:45 crc kubenswrapper[4854]: I1007 12:24:45.712177 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f83aab3b318af4c3cdbaa62403d452edef45ba2ccec9279ef470e863b459b05f"} Oct 07 12:24:45 crc kubenswrapper[4854]: W1007 12:24:45.759358 4854 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.243:6443: connect: connection refused Oct 07 12:24:45 crc kubenswrapper[4854]: E1007 12:24:45.759617 4854 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.243:6443: connect: 
connection refused" logger="UnhandledError" Oct 07 12:24:45 crc kubenswrapper[4854]: W1007 12:24:45.887756 4854 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.243:6443: connect: connection refused Oct 07 12:24:45 crc kubenswrapper[4854]: E1007 12:24:45.888442 4854 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.243:6443: connect: connection refused" logger="UnhandledError" Oct 07 12:24:46 crc kubenswrapper[4854]: E1007 12:24:46.045106 4854 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.243:6443: connect: connection refused" interval="1.6s" Oct 07 12:24:46 crc kubenswrapper[4854]: W1007 12:24:46.278331 4854 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.243:6443: connect: connection refused Oct 07 12:24:46 crc kubenswrapper[4854]: E1007 12:24:46.278568 4854 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.243:6443: connect: connection refused" logger="UnhandledError" Oct 07 12:24:46 crc kubenswrapper[4854]: I1007 12:24:46.317264 4854 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 12:24:46 crc kubenswrapper[4854]: I1007 12:24:46.319515 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 12:24:46 crc kubenswrapper[4854]: I1007 12:24:46.319596 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 12:24:46 crc kubenswrapper[4854]: I1007 12:24:46.319627 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 12:24:46 crc kubenswrapper[4854]: I1007 12:24:46.319679 4854 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 07 12:24:46 crc kubenswrapper[4854]: E1007 12:24:46.320576 4854 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.243:6443: connect: connection refused" node="crc" Oct 07 12:24:46 crc kubenswrapper[4854]: E1007 12:24:46.563525 4854 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.243:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186c350d108fd3cb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-07 
12:24:44.623729611 +0000 UTC m=+0.611561856,LastTimestamp:2025-10-07 12:24:44.623729611 +0000 UTC m=+0.611561856,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 07 12:24:46 crc kubenswrapper[4854]: I1007 12:24:46.628048 4854 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.243:6443: connect: connection refused Oct 07 12:24:46 crc kubenswrapper[4854]: I1007 12:24:46.724385 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"8b62f5c2a8c2f43108e2d3b18eeb3878a8f4b16febf8684d60455625f6d27908"} Oct 07 12:24:46 crc kubenswrapper[4854]: I1007 12:24:46.724494 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d7b9e2d5a6c8a94b980a69e03f1fc9f838a9adaa1295de32eb39920f233d367f"} Oct 07 12:24:46 crc kubenswrapper[4854]: I1007 12:24:46.724532 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"8f02cecca02203b76ec49e16e42cfe2040326082ff6986292d9d8896c7611c5c"} Oct 07 12:24:46 crc kubenswrapper[4854]: I1007 12:24:46.724498 4854 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 12:24:46 crc kubenswrapper[4854]: I1007 12:24:46.724554 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"19529d7e55b4b6964e19282e2d5455e33eb2c477965a2107eb1c9709b11652b5"} Oct 07 12:24:46 crc kubenswrapper[4854]: I1007 12:24:46.725805 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 12:24:46 crc kubenswrapper[4854]: I1007 12:24:46.725856 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 12:24:46 crc kubenswrapper[4854]: I1007 12:24:46.725870 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 12:24:46 crc kubenswrapper[4854]: I1007 12:24:46.728043 4854 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="17fecde24011ce4cc70952a55756d12d2e3ae5be575d349c246bf678ea033408" exitCode=0 Oct 07 12:24:46 crc kubenswrapper[4854]: I1007 12:24:46.728181 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"17fecde24011ce4cc70952a55756d12d2e3ae5be575d349c246bf678ea033408"} Oct 07 12:24:46 crc kubenswrapper[4854]: I1007 12:24:46.728326 4854 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 12:24:46 crc kubenswrapper[4854]: I1007 12:24:46.729932 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 12:24:46 crc kubenswrapper[4854]: I1007 12:24:46.729971 4854 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 12:24:46 crc kubenswrapper[4854]: I1007 12:24:46.729990 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 12:24:46 crc kubenswrapper[4854]: I1007 12:24:46.730852 4854 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="433cd114994886a32865db55830797dea0a682d8f4db1583153246fabf4786df" exitCode=0 Oct 07 12:24:46 crc kubenswrapper[4854]: I1007 12:24:46.730919 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"433cd114994886a32865db55830797dea0a682d8f4db1583153246fabf4786df"} Oct 07 12:24:46 crc kubenswrapper[4854]: I1007 12:24:46.731057 4854 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 12:24:46 crc kubenswrapper[4854]: I1007 12:24:46.732540 4854 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 12:24:46 crc kubenswrapper[4854]: I1007 12:24:46.732597 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 12:24:46 crc kubenswrapper[4854]: I1007 12:24:46.732657 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 12:24:46 crc kubenswrapper[4854]: I1007 12:24:46.732684 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 12:24:46 crc kubenswrapper[4854]: I1007 12:24:46.733629 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 12:24:46 crc kubenswrapper[4854]: I1007 12:24:46.733671 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 12:24:46 crc kubenswrapper[4854]: I1007 12:24:46.733685 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 12:24:46 crc kubenswrapper[4854]: I1007 12:24:46.734265 4854 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="2e2f989ec7b6989274ce4275800975e3b1b618754aa48a8d658705cb22d76c9d" exitCode=0 Oct 07 12:24:46 crc kubenswrapper[4854]: I1007 12:24:46.734355 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"2e2f989ec7b6989274ce4275800975e3b1b618754aa48a8d658705cb22d76c9d"} Oct 07 12:24:46 crc kubenswrapper[4854]: I1007 12:24:46.734423 4854 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 12:24:46 crc kubenswrapper[4854]: I1007 12:24:46.736284 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 12:24:46 crc kubenswrapper[4854]: I1007 12:24:46.736352 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 12:24:46 crc kubenswrapper[4854]: I1007 12:24:46.736380 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 12:24:46 crc kubenswrapper[4854]: I1007 12:24:46.739075 4854 generic.go:334] "Generic (PLEG): container 
finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="8c7f90be4592842246dd3858b64b58ce9563c7011d62c0898ad7481baf19d5a8" exitCode=0 Oct 07 12:24:46 crc kubenswrapper[4854]: I1007 12:24:46.739117 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"8c7f90be4592842246dd3858b64b58ce9563c7011d62c0898ad7481baf19d5a8"} Oct 07 12:24:46 crc kubenswrapper[4854]: I1007 12:24:46.739501 4854 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 12:24:46 crc kubenswrapper[4854]: I1007 12:24:46.744487 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 12:24:46 crc kubenswrapper[4854]: I1007 12:24:46.744515 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 12:24:46 crc kubenswrapper[4854]: I1007 12:24:46.744528 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 12:24:46 crc kubenswrapper[4854]: I1007 12:24:46.968612 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 12:24:47 crc kubenswrapper[4854]: I1007 12:24:47.180760 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 12:24:47 crc kubenswrapper[4854]: I1007 12:24:47.627484 4854 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.243:6443: connect: connection refused Oct 07 12:24:47 crc kubenswrapper[4854]: E1007 12:24:47.645846 4854 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.243:6443: connect: connection refused" interval="3.2s" Oct 07 12:24:47 crc kubenswrapper[4854]: I1007 12:24:47.747271 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6b9739cf9b935b4af8578327ef6891884eae28ee59a768c30deab7cc49ed73ff"} Oct 07 12:24:47 crc kubenswrapper[4854]: I1007 12:24:47.747355 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"66f7ba3fd3bb29e8808c12161f5d6f919e85920f49b10d6c42490f3c1c2c5067"} Oct 07 12:24:47 crc kubenswrapper[4854]: I1007 12:24:47.747380 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2ba427dd9f64f0aaae3cab9458c9c7a64db6fe9b3acf4939fcc109a551a85892"} Oct 07 12:24:47 crc kubenswrapper[4854]: I1007 12:24:47.747402 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3ae34795ed2fc8c3fe6c000b7e45a2752aad9f160853bb5b214bcc2a25fb3d44"} Oct 07 12:24:47 crc kubenswrapper[4854]: I1007 12:24:47.749233 4854 generic.go:334] "Generic (PLEG): container 
finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="0b173752743efe3913f40ca7041786d0690b4b4e7a96439c35ce4ee931c6cb72" exitCode=0 Oct 07 12:24:47 crc kubenswrapper[4854]: I1007 12:24:47.749379 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"0b173752743efe3913f40ca7041786d0690b4b4e7a96439c35ce4ee931c6cb72"} Oct 07 12:24:47 crc kubenswrapper[4854]: I1007 12:24:47.749459 4854 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 12:24:47 crc kubenswrapper[4854]: I1007 12:24:47.751382 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 12:24:47 crc kubenswrapper[4854]: I1007 12:24:47.751428 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 12:24:47 crc kubenswrapper[4854]: I1007 12:24:47.751443 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 12:24:47 crc kubenswrapper[4854]: I1007 12:24:47.759047 4854 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 12:24:47 crc kubenswrapper[4854]: I1007 12:24:47.759140 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"9e9147b290cfc4bfa186bb1c5ef0cd95c5368b22fb008f7b34f444b57327b71b"} Oct 07 12:24:47 crc kubenswrapper[4854]: I1007 12:24:47.763801 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 12:24:47 crc kubenswrapper[4854]: I1007 12:24:47.763845 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 12:24:47 crc kubenswrapper[4854]: I1007 12:24:47.763856 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 12:24:47 crc kubenswrapper[4854]: I1007 12:24:47.767289 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"b580d229f52f3e331282be7b1b7c4e08e227d51a46775e76520b30fa444ef5b5"} Oct 07 12:24:47 crc kubenswrapper[4854]: I1007 12:24:47.767341 4854 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 12:24:47 crc kubenswrapper[4854]: I1007 12:24:47.767339 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"6aa0cad1d08e7a95f89da94b8f7426208015555e4831c87ff3dcfc748f542282"} Oct 07 12:24:47 crc kubenswrapper[4854]: I1007 12:24:47.767459 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"fc86fbbb61283b9e9c242bd68a721ae735ed0bd1a083b859d04430b3f39dfc07"} Oct 07 12:24:47 crc kubenswrapper[4854]: I1007 12:24:47.767478 4854 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 12:24:47 crc kubenswrapper[4854]: I1007 12:24:47.769686 4854 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientMemory" Oct 07 12:24:47 crc kubenswrapper[4854]: I1007 12:24:47.769712 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 12:24:47 crc kubenswrapper[4854]: I1007 12:24:47.769744 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 12:24:47 crc kubenswrapper[4854]: I1007 12:24:47.769747 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 12:24:47 crc kubenswrapper[4854]: I1007 12:24:47.769771 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 12:24:47 crc kubenswrapper[4854]: I1007 12:24:47.769781 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 12:24:47 crc kubenswrapper[4854]: I1007 12:24:47.921675 4854 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 12:24:47 crc kubenswrapper[4854]: I1007 12:24:47.923211 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 12:24:47 crc kubenswrapper[4854]: I1007 12:24:47.923263 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 12:24:47 crc kubenswrapper[4854]: I1007 12:24:47.923280 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 12:24:47 crc kubenswrapper[4854]: I1007 12:24:47.923319 4854 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 07 12:24:47 crc kubenswrapper[4854]: E1007 12:24:47.923992 4854 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.243:6443: connect: connection refused" node="crc" Oct 07 12:24:48 crc kubenswrapper[4854]: W1007 12:24:48.381955 4854 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.243:6443: connect: connection refused Oct 07 12:24:48 crc kubenswrapper[4854]: E1007 12:24:48.382126 4854 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.243:6443: connect: connection refused" logger="UnhandledError" Oct 07 12:24:48 crc kubenswrapper[4854]: I1007 12:24:48.440074 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 12:24:48 crc kubenswrapper[4854]: I1007 12:24:48.773460 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 07 12:24:48 crc kubenswrapper[4854]: I1007 12:24:48.775415 4854 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e1cb040918cc593d2eb1d6656be36d11565041ce87451b235d76783ad504a16c" exitCode=255 Oct 07 12:24:48 crc kubenswrapper[4854]: I1007 12:24:48.775472 4854 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"e1cb040918cc593d2eb1d6656be36d11565041ce87451b235d76783ad504a16c"} Oct 07 12:24:48 crc kubenswrapper[4854]: I1007 12:24:48.775611 4854 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 12:24:48 crc kubenswrapper[4854]: I1007 12:24:48.776675 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 12:24:48 crc kubenswrapper[4854]: I1007 12:24:48.776705 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 12:24:48 crc kubenswrapper[4854]: I1007 12:24:48.776714 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 12:24:48 crc kubenswrapper[4854]: I1007 12:24:48.777184 4854 scope.go:117] "RemoveContainer" containerID="e1cb040918cc593d2eb1d6656be36d11565041ce87451b235d76783ad504a16c" Oct 07 12:24:48 crc kubenswrapper[4854]: I1007 12:24:48.780884 4854 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="4fd0f2989a9078b0748a1277b249fcbef96bc332033968fdd20fab7cc02e5787" exitCode=0 Oct 07 12:24:48 crc kubenswrapper[4854]: I1007 12:24:48.780964 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"4fd0f2989a9078b0748a1277b249fcbef96bc332033968fdd20fab7cc02e5787"} Oct 07 12:24:48 crc kubenswrapper[4854]: I1007 12:24:48.781030 4854 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 12:24:48 crc kubenswrapper[4854]: I1007 12:24:48.781185 4854 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 12:24:48 crc kubenswrapper[4854]: I1007 12:24:48.781219 4854 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 12:24:48 crc kubenswrapper[4854]: I1007 12:24:48.781298 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 12:24:48 crc kubenswrapper[4854]: I1007 12:24:48.781241 4854 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 12:24:48 crc kubenswrapper[4854]: I1007 12:24:48.782352 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 12:24:48 crc kubenswrapper[4854]: I1007 12:24:48.782397 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 12:24:48 crc kubenswrapper[4854]: I1007 12:24:48.782411 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 12:24:48 crc kubenswrapper[4854]: I1007 12:24:48.782500 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 12:24:48 crc kubenswrapper[4854]: I1007 12:24:48.782523 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 12:24:48 crc kubenswrapper[4854]: I1007 12:24:48.782535 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 12:24:48 crc kubenswrapper[4854]: I1007 
12:24:48.782641 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 12:24:48 crc kubenswrapper[4854]: I1007 12:24:48.782654 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 12:24:48 crc kubenswrapper[4854]: I1007 12:24:48.782664 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 12:24:48 crc kubenswrapper[4854]: I1007 12:24:48.782934 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 12:24:48 crc kubenswrapper[4854]: I1007 12:24:48.782949 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 12:24:48 crc kubenswrapper[4854]: I1007 12:24:48.782961 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 12:24:49 crc kubenswrapper[4854]: I1007 12:24:49.788257 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 07 12:24:49 crc kubenswrapper[4854]: I1007 12:24:49.792168 4854 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 12:24:49 crc kubenswrapper[4854]: I1007 12:24:49.792257 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d3d145a2ec17b5518fe133d5ca43ac1e7a6a8b4ab17a95fd32d3a0c23b4be70a"} Oct 07 12:24:49 crc kubenswrapper[4854]: I1007 12:24:49.793245 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 12:24:49 crc kubenswrapper[4854]: I1007 12:24:49.793286 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 12:24:49 crc kubenswrapper[4854]: I1007 12:24:49.793299 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 12:24:49 crc kubenswrapper[4854]: I1007 12:24:49.798978 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"84af4237a3fc22d2fa6b12ad2decead9e87e4b3791b3e15bd25e8428921e1d00"} Oct 07 12:24:49 crc kubenswrapper[4854]: I1007 12:24:49.799022 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"dc7847d2b9a30513e924c947eace05fa35532504e8c2035b7293aeb678cb051a"} Oct 07 12:24:49 crc kubenswrapper[4854]: I1007 12:24:49.799038 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"76dc5e1e292c99636a6c71d3c9a8fa070207a36ded949654658083054dbc96a8"} Oct 07 12:24:49 crc kubenswrapper[4854]: I1007 12:24:49.799069 4854 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 12:24:49 crc kubenswrapper[4854]: I1007 12:24:49.799354 4854 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 12:24:49 crc kubenswrapper[4854]: I1007 12:24:49.800124 4854 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 12:24:49 crc kubenswrapper[4854]: I1007 12:24:49.800188 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 12:24:49 crc kubenswrapper[4854]: I1007 12:24:49.800203 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 12:24:49 crc kubenswrapper[4854]: I1007 12:24:49.800815 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 12:24:49 crc kubenswrapper[4854]: I1007 12:24:49.800864 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 12:24:49 crc kubenswrapper[4854]: I1007 12:24:49.800878 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 12:24:50 crc kubenswrapper[4854]: I1007 12:24:50.133985 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 12:24:50 crc kubenswrapper[4854]: I1007 12:24:50.586841 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 12:24:50 crc kubenswrapper[4854]: I1007 12:24:50.810003 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"dfba95fccbb1d4d4988e49ca93cb3db9bbdf9a64547b63b4524314304e8a225a"} Oct 07 12:24:50 crc kubenswrapper[4854]: I1007 12:24:50.810126 4854 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 07 12:24:50 crc kubenswrapper[4854]: I1007 12:24:50.810188 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ce7de701892d499b0fd914254736c6f2447d39087918af9a775462ddf180c4cf"} Oct 07 12:24:50 crc kubenswrapper[4854]: I1007 12:24:50.810232 4854 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 12:24:50 crc kubenswrapper[4854]: I1007 12:24:50.810233 4854 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 12:24:50 crc kubenswrapper[4854]: I1007 12:24:50.812252 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 12:24:50 crc kubenswrapper[4854]: I1007 12:24:50.812322 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 12:24:50 crc kubenswrapper[4854]: I1007 12:24:50.812354 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 12:24:50 crc kubenswrapper[4854]: I1007 12:24:50.812429 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 12:24:50 crc kubenswrapper[4854]: I1007 12:24:50.812470 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 12:24:50 crc kubenswrapper[4854]: I1007 12:24:50.812492 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 12:24:51 crc kubenswrapper[4854]: I1007 12:24:51.124487 4854 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" 
Oct 07 12:24:51 crc kubenswrapper[4854]: I1007 12:24:51.126571 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 12:24:51 crc kubenswrapper[4854]: I1007 12:24:51.126652 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 12:24:51 crc kubenswrapper[4854]: I1007 12:24:51.126674 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 12:24:51 crc kubenswrapper[4854]: I1007 12:24:51.126721 4854 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 07 12:24:51 crc kubenswrapper[4854]: I1007 12:24:51.440410 4854 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 07 12:24:51 crc kubenswrapper[4854]: I1007 12:24:51.440535 4854 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 07 12:24:51 crc kubenswrapper[4854]: I1007 12:24:51.814958 4854 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 12:24:51 crc kubenswrapper[4854]: I1007 12:24:51.814960 4854 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 07 12:24:51 crc kubenswrapper[4854]: I1007 12:24:51.815283 4854 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 12:24:51 crc kubenswrapper[4854]: I1007 12:24:51.816601 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 12:24:51 crc kubenswrapper[4854]: I1007 12:24:51.816658 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 12:24:51 crc kubenswrapper[4854]: I1007 12:24:51.816679 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 12:24:51 crc kubenswrapper[4854]: I1007 12:24:51.816804 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 12:24:51 crc kubenswrapper[4854]: I1007 12:24:51.816869 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 12:24:51 crc kubenswrapper[4854]: I1007 12:24:51.816890 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 12:24:51 crc kubenswrapper[4854]: I1007 12:24:51.877110 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 12:24:51 crc kubenswrapper[4854]: I1007 12:24:51.877410 4854 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 12:24:51 crc kubenswrapper[4854]: I1007 12:24:51.879375 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 12:24:51 crc kubenswrapper[4854]: 
I1007 12:24:51.879453 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 12:24:51 crc kubenswrapper[4854]: I1007 12:24:51.879492 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 12:24:51 crc kubenswrapper[4854]: I1007 12:24:51.889538 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 12:24:52 crc kubenswrapper[4854]: I1007 12:24:52.097395 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Oct 07 12:24:52 crc kubenswrapper[4854]: I1007 12:24:52.817334 4854 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 12:24:52 crc kubenswrapper[4854]: I1007 12:24:52.817654 4854 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 12:24:52 crc kubenswrapper[4854]: I1007 12:24:52.818519 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 12:24:52 crc kubenswrapper[4854]: I1007 12:24:52.818550 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 12:24:52 crc kubenswrapper[4854]: I1007 12:24:52.818563 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 12:24:52 crc kubenswrapper[4854]: I1007 12:24:52.823191 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 12:24:52 crc kubenswrapper[4854]: I1007 12:24:52.823266 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 12:24:52 crc kubenswrapper[4854]: I1007 12:24:52.823293 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 12:24:53 crc kubenswrapper[4854]: I1007 12:24:53.852578 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 12:24:53 crc kubenswrapper[4854]: I1007 12:24:53.852947 4854 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 12:24:53 crc kubenswrapper[4854]: I1007 12:24:53.854647 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 12:24:53 crc kubenswrapper[4854]: I1007 12:24:53.854724 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 12:24:53 crc kubenswrapper[4854]: I1007 12:24:53.854742 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 12:24:54 crc kubenswrapper[4854]: E1007 12:24:54.810748 4854 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 07 12:24:56 crc kubenswrapper[4854]: I1007 12:24:56.870616 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Oct 07 12:24:56 crc kubenswrapper[4854]: I1007 12:24:56.871651 4854 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 12:24:56 crc kubenswrapper[4854]: I1007 12:24:56.873656 4854 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Oct 07 12:24:56 crc kubenswrapper[4854]: I1007 12:24:56.873705 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 12:24:56 crc kubenswrapper[4854]: I1007 12:24:56.873726 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 12:24:57 crc kubenswrapper[4854]: I1007 12:24:57.188097 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 12:24:57 crc kubenswrapper[4854]: I1007 12:24:57.188319 4854 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 12:24:57 crc kubenswrapper[4854]: I1007 12:24:57.190127 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 12:24:57 crc kubenswrapper[4854]: I1007 12:24:57.190211 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 12:24:57 crc kubenswrapper[4854]: I1007 12:24:57.190232 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 12:24:58 crc kubenswrapper[4854]: W1007 12:24:58.525258 4854 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Oct 07 12:24:58 crc kubenswrapper[4854]: I1007 12:24:58.525980 4854 trace.go:236] Trace[168236573]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (07-Oct-2025 12:24:48.524) (total time: 10001ms): Oct 07 12:24:58 crc kubenswrapper[4854]: Trace[168236573]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (12:24:58.525) Oct 07 12:24:58 crc kubenswrapper[4854]: Trace[168236573]: [10.00176448s] [10.00176448s] END Oct 07 12:24:58 crc kubenswrapper[4854]: E1007 12:24:58.526122 4854 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 07 12:24:58 crc kubenswrapper[4854]: I1007 12:24:58.628237 4854 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Oct 07 12:24:58 crc kubenswrapper[4854]: W1007 12:24:58.929653 4854 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Oct 07 12:24:58 crc kubenswrapper[4854]: I1007 12:24:58.929782 4854 trace.go:236] Trace[2094496436]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (07-Oct-2025 12:24:48.927) (total time: 10002ms): Oct 07 12:24:58 crc kubenswrapper[4854]: Trace[2094496436]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10002ms (12:24:58.929) 
Oct 07 12:24:58 crc kubenswrapper[4854]: Trace[2094496436]: [10.002117871s] [10.002117871s] END Oct 07 12:24:58 crc kubenswrapper[4854]: E1007 12:24:58.929819 4854 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 07 12:24:59 crc kubenswrapper[4854]: I1007 12:24:59.129006 4854 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 07 12:24:59 crc kubenswrapper[4854]: I1007 12:24:59.129080 4854 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 07 12:24:59 crc kubenswrapper[4854]: I1007 12:24:59.134195 4854 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 07 12:24:59 crc kubenswrapper[4854]: I1007 12:24:59.134287 4854 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 07 12:25:00 crc kubenswrapper[4854]: I1007 12:25:00.210734 4854 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Oct 07 12:25:00 crc kubenswrapper[4854]: I1007 12:25:00.210859 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Oct 07 12:25:00 crc kubenswrapper[4854]: I1007 12:25:00.592878 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 12:25:00 crc kubenswrapper[4854]: I1007 12:25:00.593086 4854 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 12:25:00 crc kubenswrapper[4854]: I1007 12:25:00.593507 4854 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Oct 07 12:25:00 crc kubenswrapper[4854]: I1007 12:25:00.593568 
4854 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Oct 07 12:25:00 crc kubenswrapper[4854]: I1007 12:25:00.594319 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 12:25:00 crc kubenswrapper[4854]: I1007 12:25:00.594375 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 12:25:00 crc kubenswrapper[4854]: I1007 12:25:00.594389 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 12:25:00 crc kubenswrapper[4854]: I1007 12:25:00.597194 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 12:25:00 crc kubenswrapper[4854]: I1007 12:25:00.841729 4854 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 12:25:00 crc kubenswrapper[4854]: I1007 12:25:00.842591 4854 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Oct 07 12:25:00 crc kubenswrapper[4854]: I1007 12:25:00.842735 4854 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Oct 07 12:25:00 crc kubenswrapper[4854]: I1007 12:25:00.844111 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 12:25:00 crc kubenswrapper[4854]: I1007 12:25:00.844255 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 12:25:00 crc kubenswrapper[4854]: I1007 12:25:00.844276 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 12:25:01 crc kubenswrapper[4854]: I1007 12:25:01.441219 4854 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 07 12:25:01 crc kubenswrapper[4854]: I1007 12:25:01.441349 4854 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 07 12:25:02 crc kubenswrapper[4854]: I1007 12:25:02.923995 4854 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Oct 07 12:25:03 crc kubenswrapper[4854]: I1007 
12:25:03.164988 4854 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Oct 07 12:25:03 crc kubenswrapper[4854]: I1007 12:25:03.626924 4854 apiserver.go:52] "Watching apiserver" Oct 07 12:25:03 crc kubenswrapper[4854]: I1007 12:25:03.633051 4854 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Oct 07 12:25:03 crc kubenswrapper[4854]: I1007 12:25:03.633881 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Oct 07 12:25:03 crc kubenswrapper[4854]: I1007 12:25:03.634448 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 07 12:25:03 crc kubenswrapper[4854]: I1007 12:25:03.634589 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 12:25:03 crc kubenswrapper[4854]: I1007 12:25:03.634662 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 12:25:03 crc kubenswrapper[4854]: E1007 12:25:03.634733 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 12:25:03 crc kubenswrapper[4854]: E1007 12:25:03.634736 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 12:25:03 crc kubenswrapper[4854]: I1007 12:25:03.635098 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 07 12:25:03 crc kubenswrapper[4854]: I1007 12:25:03.635137 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 12:25:03 crc kubenswrapper[4854]: I1007 12:25:03.635388 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 12:25:03 crc kubenswrapper[4854]: E1007 12:25:03.635522 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 12:25:03 crc kubenswrapper[4854]: I1007 12:25:03.637595 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Oct 07 12:25:03 crc kubenswrapper[4854]: I1007 12:25:03.637948 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Oct 07 12:25:03 crc kubenswrapper[4854]: I1007 12:25:03.638075 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Oct 07 12:25:03 crc kubenswrapper[4854]: I1007 12:25:03.638322 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Oct 07 12:25:03 crc kubenswrapper[4854]: I1007 12:25:03.638562 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Oct 07 12:25:03 crc kubenswrapper[4854]: I1007 12:25:03.638566 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Oct 07 12:25:03 crc kubenswrapper[4854]: I1007 12:25:03.638567 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Oct 07 12:25:03 crc kubenswrapper[4854]: I1007 12:25:03.640130 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Oct 07 12:25:03 crc kubenswrapper[4854]: I1007 12:25:03.640351 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Oct 07 12:25:03 crc kubenswrapper[4854]: I1007 12:25:03.691369 4854 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 12:25:03 crc kubenswrapper[4854]: I1007 12:25:03.714517 4854 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 12:25:03 crc kubenswrapper[4854]: I1007 12:25:03.734307 4854 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 12:25:03 crc kubenswrapper[4854]: I1007 12:25:03.734953 4854 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Oct 07 12:25:03 crc kubenswrapper[4854]: I1007 12:25:03.753328 4854 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 12:25:03 crc kubenswrapper[4854]: I1007 12:25:03.772896 4854 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 12:25:03 crc kubenswrapper[4854]: I1007 12:25:03.787914 4854 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 12:25:03 crc kubenswrapper[4854]: I1007 12:25:03.803144 4854 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 12:25:03 crc kubenswrapper[4854]: I1007 12:25:03.816982 4854 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 12:25:03 crc kubenswrapper[4854]: I1007 12:25:03.836343 4854 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 12:25:03 crc kubenswrapper[4854]: I1007 12:25:03.853885 4854 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Oct 07 12:25:03 crc kubenswrapper[4854]: I1007 12:25:03.853989 4854 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Oct 07 12:25:04 crc kubenswrapper[4854]: E1007 12:25:04.115346 4854 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.117876 4854 trace.go:236] Trace[1355724444]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (07-Oct-2025 12:24:49.185) (total time: 14932ms): Oct 07 12:25:04 crc kubenswrapper[4854]: Trace[1355724444]: ---"Objects listed" error: 14932ms (12:25:04.117) Oct 07 12:25:04 crc kubenswrapper[4854]: Trace[1355724444]: [14.932571555s] [14.932571555s] END Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.117913 4854 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.119414 4854 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.119862 4854 trace.go:236] Trace[429026434]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (07-Oct-2025 12:24:53.461) (total time: 10658ms): Oct 07 12:25:04 crc kubenswrapper[4854]: Trace[429026434]: ---"Objects listed" error: 10658ms (12:25:04.119) Oct 07 12:25:04 crc kubenswrapper[4854]: Trace[429026434]: [10.65861605s] [10.65861605s] END Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.119884 4854 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Oct 07 12:25:04 crc kubenswrapper[4854]: E1007 12:25:04.120819 4854 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: 
autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.220493 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.220544 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.220566 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.220582 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.220600 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.220617 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.220632 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.220651 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.220674 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.220692 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.220781 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.220805 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.220825 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.220844 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.220869 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.220898 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.220913 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.220942 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.220997 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.221027 4854 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.221027 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.221075 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.221132 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.221206 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.221231 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.221271 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.221295 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.221338 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.221383 4854 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.221408 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.221413 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.221471 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.221498 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.221521 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.221543 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.221565 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.221593 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.221615 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 07 
12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.221638 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.221662 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.221692 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.221713 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.221738 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.221763 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.221813 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.221837 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.221861 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.221884 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 07 12:25:04 crc kubenswrapper[4854]: 
I1007 12:25:04.221909 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.221932 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.221955 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.221978 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.222000 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.222028 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.222049 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.222069 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.222093 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.222118 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 07 12:25:04 crc 
kubenswrapper[4854]: I1007 12:25:04.222143 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.222220 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.222243 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.221660 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.221890 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.222189 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.222247 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.222434 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.222535 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.222574 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.222602 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.222568 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.222659 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.222667 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.222743 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.222834 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.222907 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.222992 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.223010 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.223059 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.223075 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.223224 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.223306 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.223346 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.223414 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.223444 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.223444 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.223572 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.223604 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.223679 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.223690 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.223730 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.223769 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.223763 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.223890 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.224010 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.224021 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.224129 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.224167 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.222278 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.225240 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.225279 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.225307 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.225470 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.225501 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.225531 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.225559 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.225587 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.225612 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.225635 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.225659 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.225681 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 07 12:25:04 crc kubenswrapper[4854]: E1007 12:25:04.225723 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 12:25:04.72568 +0000 UTC m=+20.713512295 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.225795 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.225864 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.225902 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.225940 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.225978 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.226034 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.226072 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.226106 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.226144 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.226209 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.226246 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.226280 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.226316 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.226368 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.226405 4854 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.226438 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.226474 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.226509 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.226546 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.226547 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.226582 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.226630 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.226663 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.226698 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.226734 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.226781 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.226783 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.226815 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.226850 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.226887 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.226925 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.226957 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.226994 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.227006 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.227031 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.227074 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.227108 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.227134 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.227142 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.227269 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.227313 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.227352 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.227384 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.227420 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: 
\"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.227464 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.227501 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.227542 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.227601 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.227635 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.227673 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.227735 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.227793 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.227837 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.227856 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.227867 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.227871 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.227997 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.228042 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.228085 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.228121 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.228247 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.228292 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.228331 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.228367 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.228403 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.228440 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.228476 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.228512 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.228547 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.228584 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.228584 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.228621 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.228663 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.228735 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.228771 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.228808 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.228841 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.228875 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.228911 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.228915 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.228947 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.228984 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.229020 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.229026 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.229056 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.229095 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.229133 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.229207 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.229232 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.229248 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.229286 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.229324 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.229363 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.229399 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.229439 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.229473 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.229454 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.229521 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.229524 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.229593 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.229636 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.229670 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.229703 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.229742 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.229775 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.229826 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.229860 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.229819 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: 
"6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.229895 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.229936 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.229974 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.230010 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.230047 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.230083 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.230118 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.230213 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.230260 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.230888 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.230949 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.230992 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.231031 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.231067 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.231122 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.231195 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.231233 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.231272 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.231309 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.231347 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.231386 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.231422 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.231460 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.231474 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.231501 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.231539 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.231577 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.231614 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.231662 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.231699 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.231736 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.231773 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.231814 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.231884 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod 
"6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.231891 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.231944 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.231990 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.232034 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.233186 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.231898 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.232078 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.232260 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.232550 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.232886 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.245267 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.232903 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.232949 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.233097 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.233309 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.233438 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). 
InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.233478 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.234629 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.235186 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.235408 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.235954 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.235973 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.235468 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.236630 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.237123 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.238141 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.238337 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.238648 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.239915 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.240208 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.240234 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.240353 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.240380 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.240422 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.243179 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.244059 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.244319 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.244782 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.245082 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.246464 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.246726 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.246550 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.247220 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.247284 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.247613 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.249110 4854 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.249322 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.249653 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.249694 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.249729 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.249859 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.250256 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.250595 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.251012 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.251016 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.251315 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.251595 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.252206 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.253569 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.253986 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.254431 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.254462 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.254666 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). 
InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.255001 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.255118 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.255295 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.255342 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.255386 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.255404 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.255417 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.255471 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.255502 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.255533 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.255613 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.255789 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.255815 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.255829 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.255842 4854 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.255866 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Oct 07 
12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.255882 4854 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.255894 4854 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.255913 4854 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.255927 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.255951 4854 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.255963 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.255973 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.255983 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.256007 4854 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.256017 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.256026 4854 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.256036 4854 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.256048 4854 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node 
\"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.256026 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.256100 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: E1007 12:25:04.256173 4854 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.256361 4854 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.257168 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.257257 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: E1007 12:25:04.257426 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 12:25:04.757353537 +0000 UTC m=+20.745185802 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.258113 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.257989 4854 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.258655 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.258759 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 12:25:04 crc kubenswrapper[4854]: E1007 12:25:04.258904 4854 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 12:25:04 crc kubenswrapper[4854]: E1007 12:25:04.259034 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 12:25:04.759009685 +0000 UTC m=+20.746842120 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.260371 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.260663 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.260935 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.261258 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.261392 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.261594 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.261796 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.262018 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.262222 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.263116 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.263198 4854 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.263746 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.264037 4854 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.264083 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.264100 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.264116 4854 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.264130 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.264162 4854 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.264181 4854 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.264200 4854 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.264221 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.264237 4854 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc 
kubenswrapper[4854]: I1007 12:25:04.264236 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.264251 4854 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.264306 4854 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.264295 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.264324 4854 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.264385 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.264393 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.264402 4854 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.264679 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.264747 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.265124 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.265126 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.265187 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.265476 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: E1007 12:25:04.266540 4854 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 12:25:04 crc kubenswrapper[4854]: E1007 12:25:04.266599 4854 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 12:25:04 crc kubenswrapper[4854]: E1007 12:25:04.266617 4854 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.268866 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.268908 4854 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.268927 4854 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.268944 4854 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.268960 4854 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.268975 4854 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.268970 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.268992 4854 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.269010 4854 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.269025 4854 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.269551 4854 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.269599 4854 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.269612 4854 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.269630 4854 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.269644 4854 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: E1007 12:25:04.269679 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-07 12:25:04.769651936 +0000 UTC m=+20.757484321 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.269735 4854 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.269754 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.269770 4854 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.269955 4854 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.269981 4854 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.270020 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.270035 4854 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.270065 4854 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.270099 4854 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.270114 4854 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.270305 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.270327 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: 
\"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.270341 4854 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.270354 4854 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.270387 4854 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.270418 4854 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.270433 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.270494 4854 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.270509 4854 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.270521 4854 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.270588 4854 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.270667 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.270613 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.270717 4854 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.270736 4854 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.270730 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.270753 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.270771 4854 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.270788 4854 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.270842 4854 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.270857 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.270871 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.270885 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.270900 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 
12:25:04.270914 4854 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.270909 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.270928 4854 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.271111 4854 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.271137 4854 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.271167 4854 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.271184 4854 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.271199 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.271213 4854 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.271226 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.271240 4854 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.271305 4854 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.271323 4854 reconciler_common.go:293] "Volume detached for volume 
\"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.271338 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.271351 4854 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.271364 4854 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.271377 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.271392 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.271405 4854 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.271417 4854 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.271432 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.271445 4854 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.271458 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.271471 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.271485 4854 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.271500 4854 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.271516 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.271529 4854 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.271543 4854 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.271558 4854 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.271572 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.271591 4854 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.271605 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.271497 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.271379 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.271635 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.272539 4854 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.274114 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.274236 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.274604 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.274942 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.275231 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: E1007 12:25:04.275347 4854 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 12:25:04 crc kubenswrapper[4854]: E1007 12:25:04.275465 4854 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 12:25:04 crc kubenswrapper[4854]: E1007 12:25:04.275495 4854 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.275521 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 07 12:25:04 crc kubenswrapper[4854]: E1007 12:25:04.275584 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-07 12:25:04.775555359 +0000 UTC m=+20.763387624 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.275333 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.275686 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.278756 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.280416 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.280575 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.280885 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.281903 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.284354 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.284299 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.284752 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.285224 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.285615 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.285972 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.288028 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.289934 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.290710 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.292110 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.292182 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.292289 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.292425 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.292693 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.292848 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.292970 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.292984 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.293376 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.293789 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.293967 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.295139 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.296964 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.298006 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.300075 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.300197 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.300674 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.300682 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.300933 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.301414 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.301527 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.301972 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.302287 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.302718 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). 
InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.302729 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.303538 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.304646 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.306377 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.306562 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.321532 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.334072 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.374668 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.374765 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.374836 4854 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.374857 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.374871 4854 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.374884 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.374897 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.374911 4854 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.374923 4854 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.374937 4854 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.374948 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.374962 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node 
\"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.374972 4854 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.374984 4854 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.374997 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.375009 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.375022 4854 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.375035 4854 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.375047 4854 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.375059 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.375073 4854 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.375089 4854 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.375107 4854 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.375120 4854 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.375133 4854 reconciler_common.go:293] "Volume detached for volume 
\"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.375181 4854 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.375199 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.375213 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.375227 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.375240 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.375253 4854 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.375267 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.375280 4854 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.375294 4854 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.375307 4854 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.375322 4854 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.375334 4854 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.375347 4854 
reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.375380 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.375394 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.375407 4854 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.375420 4854 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.375433 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.375445 4854 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.375458 4854 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.375471 4854 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.375483 4854 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.375496 4854 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.375508 4854 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.375519 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 
12:25:04.375530 4854 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.375542 4854 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.375551 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.375562 4854 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.375574 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.375584 4854 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.375597 4854 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.375606 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.375619 4854 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.375633 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.375651 4854 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.375667 4854 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.375682 4854 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.375694 4854 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" 
(UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.375708 4854 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.375724 4854 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.375742 4854 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.375759 4854 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.375772 4854 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.375784 4854 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.375796 4854 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.375811 4854 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.375822 4854 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.375835 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.375845 4854 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.375855 4854 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.375866 4854 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.375876 4854 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.375888 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.375899 4854 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.375973 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.376018 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.553886 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.562485 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.566915 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 12:25:04 crc kubenswrapper[4854]: W1007 12:25:04.572086 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-a8d2d06e2cba742172f33f7c48c052dda83170a0d531a67646993e7bc3d6400c WatchSource:0}: Error finding container a8d2d06e2cba742172f33f7c48c052dda83170a0d531a67646993e7bc3d6400c: Status 404 returned error can't find the container with id a8d2d06e2cba742172f33f7c48c052dda83170a0d531a67646993e7bc3d6400c Oct 07 12:25:04 crc kubenswrapper[4854]: W1007 12:25:04.588962 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-c4bec367e7c2b2cdd8f548e910591f387feaac01c8b7dd9d1ab95dc955f6ada9 WatchSource:0}: Error finding container c4bec367e7c2b2cdd8f548e910591f387feaac01c8b7dd9d1ab95dc955f6ada9: Status 404 returned error can't find the container with id c4bec367e7c2b2cdd8f548e910591f387feaac01c8b7dd9d1ab95dc955f6ada9 Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.705661 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.706431 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.707591 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.708214 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.709167 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.709647 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.710241 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.711341 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.711940 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.712920 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.713497 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.715610 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.716196 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.716707 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.718079 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.719123 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.720361 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.720773 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.721375 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.722552 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.722986 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.724285 4854 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.728177 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.728667 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.729826 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.730376 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.730957 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.733409 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.734052 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.735181 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.735624 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.736614 4854 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.736659 4854 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.736730 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.738374 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.741021 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.741528 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.743484 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.744126 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.745038 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.745678 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.747925 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.750063 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.751354 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" 
path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.752037 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.755964 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.756077 4854 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.756528 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.757362 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.758322 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.759407 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.759947 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.760944 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.761478 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.762443 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.762999 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.763517 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 
12:25:04.772450 4854 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.778792 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.778863 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.778891 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.778913 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.778936 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 12:25:04 crc kubenswrapper[4854]: E1007 12:25:04.778988 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 12:25:05.778969555 +0000 UTC m=+21.766801810 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:04 crc kubenswrapper[4854]: E1007 12:25:04.779040 4854 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 12:25:04 crc kubenswrapper[4854]: E1007 12:25:04.779053 4854 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 12:25:04 crc kubenswrapper[4854]: E1007 12:25:04.779061 4854 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 12:25:04 crc kubenswrapper[4854]: E1007 12:25:04.779078 4854 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 12:25:04 crc kubenswrapper[4854]: E1007 12:25:04.779112 4854 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 12:25:04 crc kubenswrapper[4854]: E1007 12:25:04.779181 4854 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 12:25:04 crc kubenswrapper[4854]: E1007 12:25:04.779196 4854 projected.go:194] Error preparing data for projected 
volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 12:25:04 crc kubenswrapper[4854]: E1007 12:25:04.779118 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 12:25:05.779103319 +0000 UTC m=+21.766935564 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 12:25:04 crc kubenswrapper[4854]: E1007 12:25:04.779281 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-07 12:25:05.779264433 +0000 UTC m=+21.767096688 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 12:25:04 crc kubenswrapper[4854]: E1007 12:25:04.779299 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-07 12:25:05.779294074 +0000 UTC m=+21.767126329 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 12:25:04 crc kubenswrapper[4854]: E1007 12:25:04.779414 4854 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 12:25:04 crc kubenswrapper[4854]: E1007 12:25:04.779452 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 12:25:05.779444519 +0000 UTC m=+21.767276764 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.782714 4854 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.797963 4854 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.862643 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.863302 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.864988 4854 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d3d145a2ec17b5518fe133d5ca43ac1e7a6a8b4ab17a95fd32d3a0c23b4be70a" exitCode=255 Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.865073 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"d3d145a2ec17b5518fe133d5ca43ac1e7a6a8b4ab17a95fd32d3a0c23b4be70a"} Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.865187 4854 scope.go:117] "RemoveContainer" containerID="e1cb040918cc593d2eb1d6656be36d11565041ce87451b235d76783ad504a16c" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.868605 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"e80226dfb28fd9d2205ecba10801a554534a59e15355aabe8822c19affefa087"} Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.868687 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"c4bec367e7c2b2cdd8f548e910591f387feaac01c8b7dd9d1ab95dc955f6ada9"} Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.870400 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"ae5456dce313777b7f3283532294ecda1b8a7b2460fc6aa7ff94c4614ee92b89"} Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.872437 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"a8d2d06e2cba742172f33f7c48c052dda83170a0d531a67646993e7bc3d6400c"} Oct 07 12:25:04 crc 
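Editorial note: the repeated "Failed to update status for pod" entries above embed the attempted status patch as a %q-escaped JSON string inside the kubelet error message, which makes the container conditions hard to read in place. Below is a minimal Go sketch, assuming a shortened patch excerpt copied from one of those entries (the hypothetical `quoted` constant is not taken verbatim from any single line), that unquotes the payload and prints each container's waiting reason.

```go
// Sketch only: decode a %q-escaped status patch like the ones in the entries above.
package main

import (
	"encoding/json"
	"fmt"
	"log"
	"strconv"
)

func main() {
	// One escape level, as the patch appears inside the kubelet error message
	// (shortened; the real payloads also carry conditions, volumeMounts, etc.).
	quoted := `"{\"status\":{\"containerStatuses\":[{\"name\":\"networking-console-plugin\",\"ready\":false,\"restartCount\":3,\"state\":{\"waiting\":{\"reason\":\"ContainerCreating\"}}}]}}"`

	patch, err := strconv.Unquote(quoted)
	if err != nil {
		log.Fatalf("unquote: %v", err)
	}

	var p struct {
		Status struct {
			ContainerStatuses []struct {
				Name         string `json:"name"`
				Ready        bool   `json:"ready"`
				RestartCount int    `json:"restartCount"`
				State        struct {
					Waiting *struct {
						Reason string `json:"reason"`
					} `json:"waiting"`
				} `json:"state"`
			} `json:"containerStatuses"`
		} `json:"status"`
	}
	if err := json.Unmarshal([]byte(patch), &p); err != nil {
		log.Fatalf("unmarshal: %v", err)
	}
	for _, cs := range p.Status.ContainerStatuses {
		reason := ""
		if cs.State.Waiting != nil {
			reason = cs.State.Waiting.Reason
		}
		fmt.Printf("container=%s ready=%v restarts=%d waiting=%s\n",
			cs.Name, cs.Ready, cs.RestartCount, reason)
	}
}
```

The same approach applies to the longer payloads above; only the fields of interest need to be declared in the struct.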
kubenswrapper[4854]: I1007 12:25:04.878926 4854 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.898477 4854 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.923635 4854 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.937437 4854 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.948842 4854 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.961052 4854 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.988707 4854 scope.go:117] "RemoveContainer" containerID="d3d145a2ec17b5518fe133d5ca43ac1e7a6a8b4ab17a95fd32d3a0c23b4be70a" Oct 07 12:25:04 crc kubenswrapper[4854]: E1007 12:25:04.988949 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 07 12:25:04 crc kubenswrapper[4854]: I1007 12:25:04.990083 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.075908 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-4m45f"] Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.076369 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-4m45f" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.078435 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.079800 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.080624 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.096652 4854 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.127702 4854 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4m45f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc886f7c-d5f2-4e9e-baf4-95aa9ca9f32b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxnjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T12:25:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4m45f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 12:25:05 crc 
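Editorial note: every status patch in this stretch is rejected with the same root error — the pod.network-node-identity.openshift.io admission webhook at https://127.0.0.1:9743 refuses TCP connections — and a later entry in this section shows the same call failing x509 verification with an expired certificate instead. Below is a minimal Go probe, assuming it is run on the node itself, that distinguishes the two cases by dialing the endpoint with verification disabled and printing the presented certificate's validity window.

```go
// Sketch only: probe the webhook endpoint the kubelet is failing to reach above.
package main

import (
	"crypto/tls"
	"fmt"
	"net"
	"time"
)

func main() {
	addr := "127.0.0.1:9743"

	dialer := &net.Dialer{Timeout: 3 * time.Second}
	conn, err := tls.DialWithDialer(dialer, "tcp", addr, &tls.Config{
		// Verification is skipped only so an expired certificate can still be inspected.
		InsecureSkipVerify: true,
	})
	if err != nil {
		fmt.Printf("dial %s failed: %v\n", addr, err) // e.g. connect: connection refused
		return
	}
	defer conn.Close()

	now := time.Now()
	for _, cert := range conn.ConnectionState().PeerCertificates {
		fmt.Printf("subject=%q notBefore=%s notAfter=%s expired=%v\n",
			cert.Subject.CommonName,
			cert.NotBefore.Format(time.RFC3339),
			cert.NotAfter.Format(time.RFC3339),
			now.After(cert.NotAfter))
	}
}
```

While the endpoint is down this prints the dial error (the connection-refused case above); once the webhook serves again, a notAfter date in the past points at certificate rotation rather than the listener.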
kubenswrapper[4854]: I1007 12:25:05.150593 4854 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"540475fe-5ef7-4652-9102-3f6cf983f0d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:24:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:24:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:24:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ae34795ed2fc8c3fe6c000b7e45a2752aad9f160853bb5b214bcc2a25fb3d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T12:24:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66f7ba3fd3bb29e8808c12161f5d6f919e85920f49b10d6c42490f3c1c2c5067\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T12:24:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ba427dd9f64f0aaae3cab9458c9c7a64db6fe9b3acf4939fcc109a551a85892\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runnin
g\\\":{\\\"startedAt\\\":\\\"2025-10-07T12:24:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d145a2ec17b5518fe133d5ca43ac1e7a6a8b4ab17a95fd32d3a0c23b4be70a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1cb040918cc593d2eb1d6656be36d11565041ce87451b235d76783ad504a16c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T12:24:48Z\\\",\\\"message\\\":\\\"W1007 12:24:47.993886 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1007 12:24:47.994237 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759839887 cert, and key in /tmp/serving-cert-945873907/serving-signer.crt, /tmp/serving-cert-945873907/serving-signer.key\\\\nI1007 12:24:48.205910 1 observer_polling.go:159] Starting file observer\\\\nW1007 12:24:48.210822 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1007 12:24:48.211063 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 12:24:48.213807 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-945873907/tls.crt::/tmp/serving-cert-945873907/tls.key\\\\\\\"\\\\nF1007 12:24:48.440436 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T12:24:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3d145a2ec17b5518fe133d5ca43ac1e7a6a8b4ab17a95fd32d3a0c23b4be70a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T12:25:04Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 12:25:04.151841 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 12:25:04.152115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 12:25:04.153363 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3164975325/tls.crt::/tmp/serving-cert-3164975325/tls.key\\\\\\\"\\\\nI1007 12:25:04.710164 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 12:25:04.713643 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 12:25:04.713671 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 
12:25:04.713696 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 12:25:04.713703 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 12:25:04.728062 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 12:25:04.728092 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 12:25:04.728098 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 12:25:04.728102 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 12:25:04.728106 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 12:25:04.728109 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 12:25:04.728112 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 12:25:04.728420 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 12:25:04.732086 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T12:24:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b9739cf9b935b4af8578327ef6891884eae28ee59a768c30deab7cc49ed73ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T12:24:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17fecde24011ce4cc70952a55756d12d2e3ae5be575d349c246bf678ea033408\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17fecde24011ce4cc70952a55756d12d2e3ae5be575d349c246bf678ea033408\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T12:24:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T12:24:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T12:24:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.172001 4854 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.181765 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/dc886f7c-d5f2-4e9e-baf4-95aa9ca9f32b-hosts-file\") pod \"node-resolver-4m45f\" (UID: \"dc886f7c-d5f2-4e9e-baf4-95aa9ca9f32b\") " pod="openshift-dns/node-resolver-4m45f" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.181823 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxnjl\" (UniqueName: \"kubernetes.io/projected/dc886f7c-d5f2-4e9e-baf4-95aa9ca9f32b-kube-api-access-sxnjl\") pod \"node-resolver-4m45f\" (UID: \"dc886f7c-d5f2-4e9e-baf4-95aa9ca9f32b\") " pod="openshift-dns/node-resolver-4m45f" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.184655 4854 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.214479 4854 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod 
was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.230897 4854 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.243909 4854 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.282476 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/dc886f7c-d5f2-4e9e-baf4-95aa9ca9f32b-hosts-file\") pod \"node-resolver-4m45f\" (UID: \"dc886f7c-d5f2-4e9e-baf4-95aa9ca9f32b\") " pod="openshift-dns/node-resolver-4m45f" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.282538 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxnjl\" (UniqueName: \"kubernetes.io/projected/dc886f7c-d5f2-4e9e-baf4-95aa9ca9f32b-kube-api-access-sxnjl\") pod \"node-resolver-4m45f\" (UID: \"dc886f7c-d5f2-4e9e-baf4-95aa9ca9f32b\") " pod="openshift-dns/node-resolver-4m45f" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.282656 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/dc886f7c-d5f2-4e9e-baf4-95aa9ca9f32b-hosts-file\") pod \"node-resolver-4m45f\" (UID: \"dc886f7c-d5f2-4e9e-baf4-95aa9ca9f32b\") " pod="openshift-dns/node-resolver-4m45f" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.301322 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxnjl\" (UniqueName: \"kubernetes.io/projected/dc886f7c-d5f2-4e9e-baf4-95aa9ca9f32b-kube-api-access-sxnjl\") pod \"node-resolver-4m45f\" (UID: \"dc886f7c-d5f2-4e9e-baf4-95aa9ca9f32b\") " pod="openshift-dns/node-resolver-4m45f" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.389734 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-4m45f" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.461833 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-vbjnw"] Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.462246 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-nkr42"] Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.462377 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-m4vxt"] Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.462427 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.463518 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-nkr42" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.463565 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:05 crc kubenswrapper[4854]: W1007 12:25:05.468192 4854 reflector.go:561] object-"openshift-ovn-kubernetes"/"ovnkube-script-lib": failed to list *v1.ConfigMap: configmaps "ovnkube-script-lib" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Oct 07 12:25:05 crc kubenswrapper[4854]: E1007 12:25:05.468271 4854 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"ovnkube-script-lib\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.473568 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Oct 07 12:25:05 crc kubenswrapper[4854]: W1007 12:25:05.473632 4854 reflector.go:561] object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert": failed to list *v1.Secret: secrets "ovn-node-metrics-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Oct 07 12:25:05 crc kubenswrapper[4854]: E1007 12:25:05.473705 4854 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"ovn-node-metrics-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.473639 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.473813 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.473841 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Oct 07 12:25:05 crc kubenswrapper[4854]: W1007 12:25:05.473956 4854 reflector.go:561] object-"openshift-ovn-kubernetes"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Oct 07 12:25:05 crc kubenswrapper[4854]: E1007 12:25:05.473996 4854 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' 
and this object" logger="UnhandledError" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.474010 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.474027 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Oct 07 12:25:05 crc kubenswrapper[4854]: W1007 12:25:05.474047 4854 reflector.go:561] object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Oct 07 12:25:05 crc kubenswrapper[4854]: E1007 12:25:05.474063 4854 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.474072 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.474131 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Oct 07 12:25:05 crc kubenswrapper[4854]: W1007 12:25:05.474143 4854 reflector.go:561] object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl": failed to list *v1.Secret: secrets "ovn-kubernetes-node-dockercfg-pwtwl" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Oct 07 12:25:05 crc kubenswrapper[4854]: E1007 12:25:05.474184 4854 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-pwtwl\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"ovn-kubernetes-node-dockercfg-pwtwl\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.473815 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Oct 07 12:25:05 crc kubenswrapper[4854]: W1007 12:25:05.474196 4854 reflector.go:561] object-"openshift-ovn-kubernetes"/"ovnkube-config": failed to list *v1.ConfigMap: configmaps "ovnkube-config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Oct 07 12:25:05 crc kubenswrapper[4854]: E1007 12:25:05.474245 4854 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"ovnkube-config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace 
\"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 07 12:25:05 crc kubenswrapper[4854]: W1007 12:25:05.474261 4854 reflector.go:561] object-"openshift-ovn-kubernetes"/"env-overrides": failed to list *v1.ConfigMap: configmaps "env-overrides" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Oct 07 12:25:05 crc kubenswrapper[4854]: E1007 12:25:05.474320 4854 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"env-overrides\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"env-overrides\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.474359 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.481858 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-7hw6x"] Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.482795 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-7hw6x" Oct 07 12:25:05 crc kubenswrapper[4854]: W1007 12:25:05.484702 4854 reflector.go:561] object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz": failed to list *v1.Secret: secrets "multus-ancillary-tools-dockercfg-vnmsz" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Oct 07 12:25:05 crc kubenswrapper[4854]: E1007 12:25:05.484756 4854 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-vnmsz\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"multus-ancillary-tools-dockercfg-vnmsz\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 07 12:25:05 crc kubenswrapper[4854]: W1007 12:25:05.484812 4854 reflector.go:561] object-"openshift-multus"/"default-cni-sysctl-allowlist": failed to list *v1.ConfigMap: configmaps "default-cni-sysctl-allowlist" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Oct 07 12:25:05 crc kubenswrapper[4854]: E1007 12:25:05.484824 4854 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"default-cni-sysctl-allowlist\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.508100 4854 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T12:25:05Z is after 2025-08-24T17:21:41Z" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.533329 4854 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4m45f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc886f7c-d5f2-4e9e-baf4-95aa9ca9f32b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxnjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T12:25:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4m45f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T12:25:05Z is after 2025-08-24T17:21:41Z" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.558957 4854 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T12:25:05Z is after 2025-08-24T17:21:41Z" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.587664 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-systemd-units\") pod \"ovnkube-node-m4vxt\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.587726 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f46sm\" (UniqueName: \"kubernetes.io/projected/262977a1-789b-4e3a-b893-27b1eccc894b-kube-api-access-f46sm\") pod \"multus-additional-cni-plugins-7hw6x\" (UID: \"262977a1-789b-4e3a-b893-27b1eccc894b\") " pod="openshift-multus/multus-additional-cni-plugins-7hw6x" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.587746 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/260ab665-6a8a-44ee-9a16-5ff284b35eba-multus-conf-dir\") pod \"multus-nkr42\" (UID: \"260ab665-6a8a-44ee-9a16-5ff284b35eba\") " pod="openshift-multus/multus-nkr42" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.587784 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bczjh\" (UniqueName: \"kubernetes.io/projected/40b8b82d-cfd5-41d7-8673-5774db092c85-kube-api-access-bczjh\") pod \"machine-config-daemon-vbjnw\" (UID: \"40b8b82d-cfd5-41d7-8673-5774db092c85\") " pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.587803 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/260ab665-6a8a-44ee-9a16-5ff284b35eba-hostroot\") pod \"multus-nkr42\" (UID: \"260ab665-6a8a-44ee-9a16-5ff284b35eba\") " pod="openshift-multus/multus-nkr42" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.587818 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-host-slash\") pod \"ovnkube-node-m4vxt\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.587832 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-host-run-netns\") pod \"ovnkube-node-m4vxt\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.587847 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/40b8b82d-cfd5-41d7-8673-5774db092c85-rootfs\") pod \"machine-config-daemon-vbjnw\" (UID: \"40b8b82d-cfd5-41d7-8673-5774db092c85\") " pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.587866 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-run-openvswitch\") pod \"ovnkube-node-m4vxt\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.587909 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-host-cni-netd\") pod \"ovnkube-node-m4vxt\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.587928 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-host-cni-bin\") pod \"ovnkube-node-m4vxt\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.587943 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-m4vxt\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.587960 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/262977a1-789b-4e3a-b893-27b1eccc894b-cnibin\") pod \"multus-additional-cni-plugins-7hw6x\" (UID: \"262977a1-789b-4e3a-b893-27b1eccc894b\") " pod="openshift-multus/multus-additional-cni-plugins-7hw6x" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.587978 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9ccb7160-6ff2-43f3-927b-5bf4aced4993-ovnkube-config\") pod \"ovnkube-node-m4vxt\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.587993 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/260ab665-6a8a-44ee-9a16-5ff284b35eba-system-cni-dir\") pod \"multus-nkr42\" (UID: \"260ab665-6a8a-44ee-9a16-5ff284b35eba\") " pod="openshift-multus/multus-nkr42" Oct 07 12:25:05 crc kubenswrapper[4854]: 
I1007 12:25:05.588010 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/260ab665-6a8a-44ee-9a16-5ff284b35eba-cnibin\") pod \"multus-nkr42\" (UID: \"260ab665-6a8a-44ee-9a16-5ff284b35eba\") " pod="openshift-multus/multus-nkr42" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.588026 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/260ab665-6a8a-44ee-9a16-5ff284b35eba-host-var-lib-cni-multus\") pod \"multus-nkr42\" (UID: \"260ab665-6a8a-44ee-9a16-5ff284b35eba\") " pod="openshift-multus/multus-nkr42" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.588042 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/260ab665-6a8a-44ee-9a16-5ff284b35eba-host-var-lib-kubelet\") pod \"multus-nkr42\" (UID: \"260ab665-6a8a-44ee-9a16-5ff284b35eba\") " pod="openshift-multus/multus-nkr42" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.588058 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/262977a1-789b-4e3a-b893-27b1eccc894b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7hw6x\" (UID: \"262977a1-789b-4e3a-b893-27b1eccc894b\") " pod="openshift-multus/multus-additional-cni-plugins-7hw6x" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.588075 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/260ab665-6a8a-44ee-9a16-5ff284b35eba-host-var-lib-cni-bin\") pod \"multus-nkr42\" (UID: \"260ab665-6a8a-44ee-9a16-5ff284b35eba\") " pod="openshift-multus/multus-nkr42" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.588094 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khkwh\" (UniqueName: \"kubernetes.io/projected/260ab665-6a8a-44ee-9a16-5ff284b35eba-kube-api-access-khkwh\") pod \"multus-nkr42\" (UID: \"260ab665-6a8a-44ee-9a16-5ff284b35eba\") " pod="openshift-multus/multus-nkr42" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.588112 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-log-socket\") pod \"ovnkube-node-m4vxt\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.588127 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9ccb7160-6ff2-43f3-927b-5bf4aced4993-ovn-node-metrics-cert\") pod \"ovnkube-node-m4vxt\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.588182 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/260ab665-6a8a-44ee-9a16-5ff284b35eba-host-run-netns\") pod \"multus-nkr42\" (UID: \"260ab665-6a8a-44ee-9a16-5ff284b35eba\") " pod="openshift-multus/multus-nkr42" Oct 07 
12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.588199 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/260ab665-6a8a-44ee-9a16-5ff284b35eba-etc-kubernetes\") pod \"multus-nkr42\" (UID: \"260ab665-6a8a-44ee-9a16-5ff284b35eba\") " pod="openshift-multus/multus-nkr42" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.588221 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-var-lib-openvswitch\") pod \"ovnkube-node-m4vxt\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.588237 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9ccb7160-6ff2-43f3-927b-5bf4aced4993-env-overrides\") pod \"ovnkube-node-m4vxt\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.588253 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7zc7\" (UniqueName: \"kubernetes.io/projected/9ccb7160-6ff2-43f3-927b-5bf4aced4993-kube-api-access-b7zc7\") pod \"ovnkube-node-m4vxt\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.588288 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/262977a1-789b-4e3a-b893-27b1eccc894b-system-cni-dir\") pod \"multus-additional-cni-plugins-7hw6x\" (UID: \"262977a1-789b-4e3a-b893-27b1eccc894b\") " pod="openshift-multus/multus-additional-cni-plugins-7hw6x" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.588307 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-host-kubelet\") pod \"ovnkube-node-m4vxt\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.588324 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-run-systemd\") pod \"ovnkube-node-m4vxt\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.588340 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/40b8b82d-cfd5-41d7-8673-5774db092c85-proxy-tls\") pod \"machine-config-daemon-vbjnw\" (UID: \"40b8b82d-cfd5-41d7-8673-5774db092c85\") " pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.588358 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/40b8b82d-cfd5-41d7-8673-5774db092c85-mcd-auth-proxy-config\") pod 
\"machine-config-daemon-vbjnw\" (UID: \"40b8b82d-cfd5-41d7-8673-5774db092c85\") " pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.588374 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/260ab665-6a8a-44ee-9a16-5ff284b35eba-multus-socket-dir-parent\") pod \"multus-nkr42\" (UID: \"260ab665-6a8a-44ee-9a16-5ff284b35eba\") " pod="openshift-multus/multus-nkr42" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.588396 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-etc-openvswitch\") pod \"ovnkube-node-m4vxt\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.588411 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-host-run-ovn-kubernetes\") pod \"ovnkube-node-m4vxt\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.588427 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/260ab665-6a8a-44ee-9a16-5ff284b35eba-cni-binary-copy\") pod \"multus-nkr42\" (UID: \"260ab665-6a8a-44ee-9a16-5ff284b35eba\") " pod="openshift-multus/multus-nkr42" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.588443 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/262977a1-789b-4e3a-b893-27b1eccc894b-os-release\") pod \"multus-additional-cni-plugins-7hw6x\" (UID: \"262977a1-789b-4e3a-b893-27b1eccc894b\") " pod="openshift-multus/multus-additional-cni-plugins-7hw6x" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.588460 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/260ab665-6a8a-44ee-9a16-5ff284b35eba-multus-daemon-config\") pod \"multus-nkr42\" (UID: \"260ab665-6a8a-44ee-9a16-5ff284b35eba\") " pod="openshift-multus/multus-nkr42" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.588476 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/262977a1-789b-4e3a-b893-27b1eccc894b-cni-binary-copy\") pod \"multus-additional-cni-plugins-7hw6x\" (UID: \"262977a1-789b-4e3a-b893-27b1eccc894b\") " pod="openshift-multus/multus-additional-cni-plugins-7hw6x" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.588497 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/262977a1-789b-4e3a-b893-27b1eccc894b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7hw6x\" (UID: \"262977a1-789b-4e3a-b893-27b1eccc894b\") " pod="openshift-multus/multus-additional-cni-plugins-7hw6x" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.588515 4854 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/260ab665-6a8a-44ee-9a16-5ff284b35eba-os-release\") pod \"multus-nkr42\" (UID: \"260ab665-6a8a-44ee-9a16-5ff284b35eba\") " pod="openshift-multus/multus-nkr42" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.588530 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/260ab665-6a8a-44ee-9a16-5ff284b35eba-host-run-k8s-cni-cncf-io\") pod \"multus-nkr42\" (UID: \"260ab665-6a8a-44ee-9a16-5ff284b35eba\") " pod="openshift-multus/multus-nkr42" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.588547 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/260ab665-6a8a-44ee-9a16-5ff284b35eba-multus-cni-dir\") pod \"multus-nkr42\" (UID: \"260ab665-6a8a-44ee-9a16-5ff284b35eba\") " pod="openshift-multus/multus-nkr42" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.588564 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-run-ovn\") pod \"ovnkube-node-m4vxt\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.588578 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-node-log\") pod \"ovnkube-node-m4vxt\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.588595 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9ccb7160-6ff2-43f3-927b-5bf4aced4993-ovnkube-script-lib\") pod \"ovnkube-node-m4vxt\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.588634 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/260ab665-6a8a-44ee-9a16-5ff284b35eba-host-run-multus-certs\") pod \"multus-nkr42\" (UID: \"260ab665-6a8a-44ee-9a16-5ff284b35eba\") " pod="openshift-multus/multus-nkr42" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.589549 4854 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T12:25:05Z is after 2025-08-24T17:21:41Z" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.615099 4854 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"540475fe-5ef7-4652-9102-3f6cf983f0d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:24:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:24:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:24:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ae34795ed2fc8c3fe6c000b7e45a2752aad9f160853bb5b214bcc2a25fb3d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T12:24:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66f7ba3fd3bb29e8808c12161f5d6f919e85920f49b10d6c42490f3c1c2c5067\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T12:24:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ba427dd9f64f0aaae3cab9458c9c7a64db6fe9b3acf4939fcc109a551a85892\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T12:24:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d145a2ec17b5518fe133d5ca43ac1e7a6a8b4ab17a95fd32d3a0c23b4be70a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1cb040918cc593d2eb1d6656be36d11565041ce87451b235d76783ad504a16c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T12:24:48Z\\\",\\\"message\\\":\\\"W1007 12:24:47.993886 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1007 
12:24:47.994237 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759839887 cert, and key in /tmp/serving-cert-945873907/serving-signer.crt, /tmp/serving-cert-945873907/serving-signer.key\\\\nI1007 12:24:48.205910 1 observer_polling.go:159] Starting file observer\\\\nW1007 12:24:48.210822 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1007 12:24:48.211063 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 12:24:48.213807 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-945873907/tls.crt::/tmp/serving-cert-945873907/tls.key\\\\\\\"\\\\nF1007 12:24:48.440436 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T12:24:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3d145a2ec17b5518fe133d5ca43ac1e7a6a8b4ab17a95fd32d3a0c23b4be70a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T12:25:04Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 12:25:04.151841 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 12:25:04.152115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 12:25:04.153363 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3164975325/tls.crt::/tmp/serving-cert-3164975325/tls.key\\\\\\\"\\\\nI1007 12:25:04.710164 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 12:25:04.713643 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 12:25:04.713671 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 12:25:04.713696 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 12:25:04.713703 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 12:25:04.728062 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 12:25:04.728092 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 12:25:04.728098 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 12:25:04.728102 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 12:25:04.728106 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 12:25:04.728109 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 12:25:04.728112 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' 
detected.\\\\nI1007 12:25:04.728420 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 12:25:04.732086 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T12:24:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b9739cf9b935b4af8578327ef6891884eae28ee59a768c30deab7cc49ed73ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T12:24:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17fecde24011ce4cc70952a55756d12d2e3ae5be575d349c246bf678ea033408\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17fecde24011ce4cc70952a55756d12d2e3ae5be575d349c246bf678ea033408\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T12:24:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T12:24:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T12:24:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T12:25:05Z is after 2025-08-24T17:21:41Z" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.634253 4854 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T12:25:05Z is after 2025-08-24T17:21:41Z" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.648852 4854 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T12:25:05Z is after 2025-08-24T17:21:41Z" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.662469 4854 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T12:25:05Z is after 2025-08-24T17:21:41Z" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.676750 4854 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40b8b82d-cfd5-41d7-8673-5774db092c85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bczjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bczjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T12:25:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vbjnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T12:25:05Z is after 2025-08-24T17:21:41Z" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.689732 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9ccb7160-6ff2-43f3-927b-5bf4aced4993-env-overrides\") pod \"ovnkube-node-m4vxt\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.689787 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7zc7\" (UniqueName: \"kubernetes.io/projected/9ccb7160-6ff2-43f3-927b-5bf4aced4993-kube-api-access-b7zc7\") pod \"ovnkube-node-m4vxt\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.689811 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/262977a1-789b-4e3a-b893-27b1eccc894b-system-cni-dir\") pod \"multus-additional-cni-plugins-7hw6x\" (UID: \"262977a1-789b-4e3a-b893-27b1eccc894b\") " pod="openshift-multus/multus-additional-cni-plugins-7hw6x" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.689834 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/260ab665-6a8a-44ee-9a16-5ff284b35eba-etc-kubernetes\") pod \"multus-nkr42\" (UID: \"260ab665-6a8a-44ee-9a16-5ff284b35eba\") " pod="openshift-multus/multus-nkr42" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.689870 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-var-lib-openvswitch\") pod \"ovnkube-node-m4vxt\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.689896 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/40b8b82d-cfd5-41d7-8673-5774db092c85-proxy-tls\") pod \"machine-config-daemon-vbjnw\" (UID: \"40b8b82d-cfd5-41d7-8673-5774db092c85\") " pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.689982 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/260ab665-6a8a-44ee-9a16-5ff284b35eba-etc-kubernetes\") pod \"multus-nkr42\" (UID: \"260ab665-6a8a-44ee-9a16-5ff284b35eba\") " pod="openshift-multus/multus-nkr42" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.690018 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/40b8b82d-cfd5-41d7-8673-5774db092c85-mcd-auth-proxy-config\") pod \"machine-config-daemon-vbjnw\" (UID: \"40b8b82d-cfd5-41d7-8673-5774db092c85\") " pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.690047 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-host-kubelet\") pod \"ovnkube-node-m4vxt\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.690087 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-host-kubelet\") pod \"ovnkube-node-m4vxt\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.690103 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-run-systemd\") pod \"ovnkube-node-m4vxt\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.689985 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/262977a1-789b-4e3a-b893-27b1eccc894b-system-cni-dir\") pod \"multus-additional-cni-plugins-7hw6x\" (UID: \"262977a1-789b-4e3a-b893-27b1eccc894b\") " pod="openshift-multus/multus-additional-cni-plugins-7hw6x" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.690127 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-etc-openvswitch\") 
pod \"ovnkube-node-m4vxt\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.690160 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-host-run-ovn-kubernetes\") pod \"ovnkube-node-m4vxt\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.690180 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/260ab665-6a8a-44ee-9a16-5ff284b35eba-cni-binary-copy\") pod \"multus-nkr42\" (UID: \"260ab665-6a8a-44ee-9a16-5ff284b35eba\") " pod="openshift-multus/multus-nkr42" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.690196 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/260ab665-6a8a-44ee-9a16-5ff284b35eba-multus-socket-dir-parent\") pod \"multus-nkr42\" (UID: \"260ab665-6a8a-44ee-9a16-5ff284b35eba\") " pod="openshift-multus/multus-nkr42" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.690199 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-etc-openvswitch\") pod \"ovnkube-node-m4vxt\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.690219 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/262977a1-789b-4e3a-b893-27b1eccc894b-os-release\") pod \"multus-additional-cni-plugins-7hw6x\" (UID: \"262977a1-789b-4e3a-b893-27b1eccc894b\") " pod="openshift-multus/multus-additional-cni-plugins-7hw6x" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.689985 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-var-lib-openvswitch\") pod \"ovnkube-node-m4vxt\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.690311 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/260ab665-6a8a-44ee-9a16-5ff284b35eba-multus-daemon-config\") pod \"multus-nkr42\" (UID: \"260ab665-6a8a-44ee-9a16-5ff284b35eba\") " pod="openshift-multus/multus-nkr42" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.690333 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/260ab665-6a8a-44ee-9a16-5ff284b35eba-os-release\") pod \"multus-nkr42\" (UID: \"260ab665-6a8a-44ee-9a16-5ff284b35eba\") " pod="openshift-multus/multus-nkr42" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.690352 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-host-run-ovn-kubernetes\") pod \"ovnkube-node-m4vxt\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.690396 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/260ab665-6a8a-44ee-9a16-5ff284b35eba-host-run-k8s-cni-cncf-io\") pod \"multus-nkr42\" (UID: \"260ab665-6a8a-44ee-9a16-5ff284b35eba\") " pod="openshift-multus/multus-nkr42" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.690429 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/262977a1-789b-4e3a-b893-27b1eccc894b-cni-binary-copy\") pod \"multus-additional-cni-plugins-7hw6x\" (UID: \"262977a1-789b-4e3a-b893-27b1eccc894b\") " pod="openshift-multus/multus-additional-cni-plugins-7hw6x" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.690452 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/262977a1-789b-4e3a-b893-27b1eccc894b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7hw6x\" (UID: \"262977a1-789b-4e3a-b893-27b1eccc894b\") " pod="openshift-multus/multus-additional-cni-plugins-7hw6x" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.690496 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/260ab665-6a8a-44ee-9a16-5ff284b35eba-multus-cni-dir\") pod \"multus-nkr42\" (UID: \"260ab665-6a8a-44ee-9a16-5ff284b35eba\") " pod="openshift-multus/multus-nkr42" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.690525 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-node-log\") pod \"ovnkube-node-m4vxt\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.690545 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9ccb7160-6ff2-43f3-927b-5bf4aced4993-ovnkube-script-lib\") pod \"ovnkube-node-m4vxt\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.690566 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/260ab665-6a8a-44ee-9a16-5ff284b35eba-host-run-multus-certs\") pod \"multus-nkr42\" (UID: \"260ab665-6a8a-44ee-9a16-5ff284b35eba\") " pod="openshift-multus/multus-nkr42" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.690580 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/262977a1-789b-4e3a-b893-27b1eccc894b-os-release\") pod \"multus-additional-cni-plugins-7hw6x\" (UID: \"262977a1-789b-4e3a-b893-27b1eccc894b\") " pod="openshift-multus/multus-additional-cni-plugins-7hw6x" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.690591 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-run-ovn\") pod \"ovnkube-node-m4vxt\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:05 crc 
kubenswrapper[4854]: I1007 12:25:05.690619 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-systemd-units\") pod \"ovnkube-node-m4vxt\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.690643 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f46sm\" (UniqueName: \"kubernetes.io/projected/262977a1-789b-4e3a-b893-27b1eccc894b-kube-api-access-f46sm\") pod \"multus-additional-cni-plugins-7hw6x\" (UID: \"262977a1-789b-4e3a-b893-27b1eccc894b\") " pod="openshift-multus/multus-additional-cni-plugins-7hw6x" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.690641 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/260ab665-6a8a-44ee-9a16-5ff284b35eba-os-release\") pod \"multus-nkr42\" (UID: \"260ab665-6a8a-44ee-9a16-5ff284b35eba\") " pod="openshift-multus/multus-nkr42" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.690667 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/260ab665-6a8a-44ee-9a16-5ff284b35eba-multus-conf-dir\") pod \"multus-nkr42\" (UID: \"260ab665-6a8a-44ee-9a16-5ff284b35eba\") " pod="openshift-multus/multus-nkr42" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.690163 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-run-systemd\") pod \"ovnkube-node-m4vxt\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.690693 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bczjh\" (UniqueName: \"kubernetes.io/projected/40b8b82d-cfd5-41d7-8673-5774db092c85-kube-api-access-bczjh\") pod \"machine-config-daemon-vbjnw\" (UID: \"40b8b82d-cfd5-41d7-8673-5774db092c85\") " pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.690707 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-node-log\") pod \"ovnkube-node-m4vxt\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.690717 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/260ab665-6a8a-44ee-9a16-5ff284b35eba-hostroot\") pod \"multus-nkr42\" (UID: \"260ab665-6a8a-44ee-9a16-5ff284b35eba\") " pod="openshift-multus/multus-nkr42" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.690734 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/260ab665-6a8a-44ee-9a16-5ff284b35eba-host-run-k8s-cni-cncf-io\") pod \"multus-nkr42\" (UID: \"260ab665-6a8a-44ee-9a16-5ff284b35eba\") " pod="openshift-multus/multus-nkr42" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.690741 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: 
\"kubernetes.io/host-path/40b8b82d-cfd5-41d7-8673-5774db092c85-rootfs\") pod \"machine-config-daemon-vbjnw\" (UID: \"40b8b82d-cfd5-41d7-8673-5774db092c85\") " pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.690767 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-host-slash\") pod \"ovnkube-node-m4vxt\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.690790 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-host-run-netns\") pod \"ovnkube-node-m4vxt\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.690824 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-run-openvswitch\") pod \"ovnkube-node-m4vxt\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.690845 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-host-cni-netd\") pod \"ovnkube-node-m4vxt\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.690876 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-host-cni-bin\") pod \"ovnkube-node-m4vxt\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.690880 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/40b8b82d-cfd5-41d7-8673-5774db092c85-mcd-auth-proxy-config\") pod \"machine-config-daemon-vbjnw\" (UID: \"40b8b82d-cfd5-41d7-8673-5774db092c85\") " pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.690895 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/260ab665-6a8a-44ee-9a16-5ff284b35eba-cni-binary-copy\") pod \"multus-nkr42\" (UID: \"260ab665-6a8a-44ee-9a16-5ff284b35eba\") " pod="openshift-multus/multus-nkr42" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.690902 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-m4vxt\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.690930 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/262977a1-789b-4e3a-b893-27b1eccc894b-cnibin\") pod \"multus-additional-cni-plugins-7hw6x\" (UID: \"262977a1-789b-4e3a-b893-27b1eccc894b\") " pod="openshift-multus/multus-additional-cni-plugins-7hw6x" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.690939 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-systemd-units\") pod \"ovnkube-node-m4vxt\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.690956 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/260ab665-6a8a-44ee-9a16-5ff284b35eba-cnibin\") pod \"multus-nkr42\" (UID: \"260ab665-6a8a-44ee-9a16-5ff284b35eba\") " pod="openshift-multus/multus-nkr42" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.690971 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-run-ovn\") pod \"ovnkube-node-m4vxt\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.690983 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/260ab665-6a8a-44ee-9a16-5ff284b35eba-host-var-lib-cni-multus\") pod \"multus-nkr42\" (UID: \"260ab665-6a8a-44ee-9a16-5ff284b35eba\") " pod="openshift-multus/multus-nkr42" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.691008 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/260ab665-6a8a-44ee-9a16-5ff284b35eba-host-var-lib-kubelet\") pod \"multus-nkr42\" (UID: \"260ab665-6a8a-44ee-9a16-5ff284b35eba\") " pod="openshift-multus/multus-nkr42" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.691026 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-host-cni-netd\") pod \"ovnkube-node-m4vxt\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.691038 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9ccb7160-6ff2-43f3-927b-5bf4aced4993-ovnkube-config\") pod \"ovnkube-node-m4vxt\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.691065 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-host-slash\") pod \"ovnkube-node-m4vxt\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.691068 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/260ab665-6a8a-44ee-9a16-5ff284b35eba-system-cni-dir\") pod \"multus-nkr42\" (UID: \"260ab665-6a8a-44ee-9a16-5ff284b35eba\") " 
pod="openshift-multus/multus-nkr42" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.691093 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/260ab665-6a8a-44ee-9a16-5ff284b35eba-multus-cni-dir\") pod \"multus-nkr42\" (UID: \"260ab665-6a8a-44ee-9a16-5ff284b35eba\") " pod="openshift-multus/multus-nkr42" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.691100 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/262977a1-789b-4e3a-b893-27b1eccc894b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7hw6x\" (UID: \"262977a1-789b-4e3a-b893-27b1eccc894b\") " pod="openshift-multus/multus-additional-cni-plugins-7hw6x" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.691111 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/40b8b82d-cfd5-41d7-8673-5774db092c85-rootfs\") pod \"machine-config-daemon-vbjnw\" (UID: \"40b8b82d-cfd5-41d7-8673-5774db092c85\") " pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.691173 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-m4vxt\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.690429 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/260ab665-6a8a-44ee-9a16-5ff284b35eba-multus-socket-dir-parent\") pod \"multus-nkr42\" (UID: \"260ab665-6a8a-44ee-9a16-5ff284b35eba\") " pod="openshift-multus/multus-nkr42" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.691199 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/260ab665-6a8a-44ee-9a16-5ff284b35eba-host-var-lib-cni-bin\") pod \"multus-nkr42\" (UID: \"260ab665-6a8a-44ee-9a16-5ff284b35eba\") " pod="openshift-multus/multus-nkr42" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.691134 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/260ab665-6a8a-44ee-9a16-5ff284b35eba-host-var-lib-cni-bin\") pod \"multus-nkr42\" (UID: \"260ab665-6a8a-44ee-9a16-5ff284b35eba\") " pod="openshift-multus/multus-nkr42" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.691170 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-run-openvswitch\") pod \"ovnkube-node-m4vxt\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.691139 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-host-run-netns\") pod \"ovnkube-node-m4vxt\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 
12:25:05.691134 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-host-cni-bin\") pod \"ovnkube-node-m4vxt\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.690935 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/260ab665-6a8a-44ee-9a16-5ff284b35eba-host-run-multus-certs\") pod \"multus-nkr42\" (UID: \"260ab665-6a8a-44ee-9a16-5ff284b35eba\") " pod="openshift-multus/multus-nkr42" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.690875 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/260ab665-6a8a-44ee-9a16-5ff284b35eba-multus-conf-dir\") pod \"multus-nkr42\" (UID: \"260ab665-6a8a-44ee-9a16-5ff284b35eba\") " pod="openshift-multus/multus-nkr42" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.691204 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/262977a1-789b-4e3a-b893-27b1eccc894b-cnibin\") pod \"multus-additional-cni-plugins-7hw6x\" (UID: \"262977a1-789b-4e3a-b893-27b1eccc894b\") " pod="openshift-multus/multus-additional-cni-plugins-7hw6x" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.691237 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/260ab665-6a8a-44ee-9a16-5ff284b35eba-cnibin\") pod \"multus-nkr42\" (UID: \"260ab665-6a8a-44ee-9a16-5ff284b35eba\") " pod="openshift-multus/multus-nkr42" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.691365 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/260ab665-6a8a-44ee-9a16-5ff284b35eba-hostroot\") pod \"multus-nkr42\" (UID: \"260ab665-6a8a-44ee-9a16-5ff284b35eba\") " pod="openshift-multus/multus-nkr42" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.691415 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khkwh\" (UniqueName: \"kubernetes.io/projected/260ab665-6a8a-44ee-9a16-5ff284b35eba-kube-api-access-khkwh\") pod \"multus-nkr42\" (UID: \"260ab665-6a8a-44ee-9a16-5ff284b35eba\") " pod="openshift-multus/multus-nkr42" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.691430 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/262977a1-789b-4e3a-b893-27b1eccc894b-cni-binary-copy\") pod \"multus-additional-cni-plugins-7hw6x\" (UID: \"262977a1-789b-4e3a-b893-27b1eccc894b\") " pod="openshift-multus/multus-additional-cni-plugins-7hw6x" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.691464 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/260ab665-6a8a-44ee-9a16-5ff284b35eba-host-var-lib-cni-multus\") pod \"multus-nkr42\" (UID: \"260ab665-6a8a-44ee-9a16-5ff284b35eba\") " pod="openshift-multus/multus-nkr42" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.691470 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/260ab665-6a8a-44ee-9a16-5ff284b35eba-host-var-lib-kubelet\") pod 
\"multus-nkr42\" (UID: \"260ab665-6a8a-44ee-9a16-5ff284b35eba\") " pod="openshift-multus/multus-nkr42" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.691469 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/262977a1-789b-4e3a-b893-27b1eccc894b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7hw6x\" (UID: \"262977a1-789b-4e3a-b893-27b1eccc894b\") " pod="openshift-multus/multus-additional-cni-plugins-7hw6x" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.691505 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-log-socket\") pod \"ovnkube-node-m4vxt\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.691538 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/260ab665-6a8a-44ee-9a16-5ff284b35eba-system-cni-dir\") pod \"multus-nkr42\" (UID: \"260ab665-6a8a-44ee-9a16-5ff284b35eba\") " pod="openshift-multus/multus-nkr42" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.691550 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9ccb7160-6ff2-43f3-927b-5bf4aced4993-ovn-node-metrics-cert\") pod \"ovnkube-node-m4vxt\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.691588 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-log-socket\") pod \"ovnkube-node-m4vxt\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.691710 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/260ab665-6a8a-44ee-9a16-5ff284b35eba-host-run-netns\") pod \"multus-nkr42\" (UID: \"260ab665-6a8a-44ee-9a16-5ff284b35eba\") " pod="openshift-multus/multus-nkr42" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.691756 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/260ab665-6a8a-44ee-9a16-5ff284b35eba-host-run-netns\") pod \"multus-nkr42\" (UID: \"260ab665-6a8a-44ee-9a16-5ff284b35eba\") " pod="openshift-multus/multus-nkr42" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.692205 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/260ab665-6a8a-44ee-9a16-5ff284b35eba-multus-daemon-config\") pod \"multus-nkr42\" (UID: \"260ab665-6a8a-44ee-9a16-5ff284b35eba\") " pod="openshift-multus/multus-nkr42" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.697865 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/40b8b82d-cfd5-41d7-8673-5774db092c85-proxy-tls\") pod \"machine-config-daemon-vbjnw\" (UID: \"40b8b82d-cfd5-41d7-8673-5774db092c85\") " pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.701759 4854 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.701826 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 12:25:05 crc kubenswrapper[4854]: E1007 12:25:05.701891 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.701951 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 12:25:05 crc kubenswrapper[4854]: E1007 12:25:05.702011 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 12:25:05 crc kubenswrapper[4854]: E1007 12:25:05.702112 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.702246 4854 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hw6x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"262977a1-789b-4e3a-b893-27b1eccc894b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f46sm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f46sm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f46sm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f46sm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f46sm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f46sm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f671
3d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f46sm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T12:25:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hw6x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T12:25:05Z is after 2025-08-24T17:21:41Z" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.713297 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bczjh\" (UniqueName: \"kubernetes.io/projected/40b8b82d-cfd5-41d7-8673-5774db092c85-kube-api-access-bczjh\") pod \"machine-config-daemon-vbjnw\" (UID: \"40b8b82d-cfd5-41d7-8673-5774db092c85\") " pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.713457 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khkwh\" (UniqueName: \"kubernetes.io/projected/260ab665-6a8a-44ee-9a16-5ff284b35eba-kube-api-access-khkwh\") pod \"multus-nkr42\" (UID: \"260ab665-6a8a-44ee-9a16-5ff284b35eba\") " pod="openshift-multus/multus-nkr42" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.717298 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f46sm\" (UniqueName: \"kubernetes.io/projected/262977a1-789b-4e3a-b893-27b1eccc894b-kube-api-access-f46sm\") pod \"multus-additional-cni-plugins-7hw6x\" (UID: \"262977a1-789b-4e3a-b893-27b1eccc894b\") " pod="openshift-multus/multus-additional-cni-plugins-7hw6x" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.717970 4854 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T12:25:05Z is after 2025-08-24T17:21:41Z" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.735090 4854 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T12:25:05Z is after 2025-08-24T17:21:41Z" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.748437 4854 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T12:25:05Z is after 2025-08-24T17:21:41Z" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.768116 4854 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T12:25:05Z is after 2025-08-24T17:21:41Z" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.786221 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.787214 4854 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T12:25:05Z is after 2025-08-24T17:21:41Z" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.792777 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.792960 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.793001 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 12:25:05 crc kubenswrapper[4854]: E1007 12:25:05.793043 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 12:25:07.793014276 +0000 UTC m=+23.780846531 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.793093 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 12:25:05 crc kubenswrapper[4854]: E1007 12:25:05.793101 4854 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 12:25:05 crc kubenswrapper[4854]: E1007 12:25:05.793128 4854 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 12:25:05 crc kubenswrapper[4854]: E1007 12:25:05.793136 4854 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.793200 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 12:25:05 crc kubenswrapper[4854]: E1007 12:25:05.793244 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 12:25:07.793219762 +0000 UTC m=+23.781052027 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 12:25:05 crc kubenswrapper[4854]: E1007 12:25:05.793286 4854 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 12:25:05 crc kubenswrapper[4854]: E1007 12:25:05.793302 4854 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 12:25:05 crc kubenswrapper[4854]: E1007 12:25:05.793308 4854 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 12:25:05 crc kubenswrapper[4854]: E1007 12:25:05.793161 4854 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 12:25:05 crc kubenswrapper[4854]: E1007 12:25:05.793346 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 12:25:07.793338766 +0000 UTC m=+23.781171021 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 12:25:05 crc kubenswrapper[4854]: E1007 12:25:05.793313 4854 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 12:25:05 crc kubenswrapper[4854]: E1007 12:25:05.793370 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-07 12:25:07.793353196 +0000 UTC m=+23.781185451 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 12:25:05 crc kubenswrapper[4854]: E1007 12:25:05.793389 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-07 12:25:07.793383517 +0000 UTC m=+23.781215772 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.794262 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-nkr42" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.802568 4854 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4m45f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc886f7c-d5f2-4e9e-baf4-95aa9ca9f32b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxnjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T12:25:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4m45f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T12:25:05Z is after 2025-08-24T17:21:41Z" Oct 07 12:25:05 crc kubenswrapper[4854]: W1007 12:25:05.813175 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod260ab665_6a8a_44ee_9a16_5ff284b35eba.slice/crio-a35c8d2882af88836cb9d77265e8e39dadfb8c4f26388a7c95efa05d1831847b WatchSource:0}: Error finding container a35c8d2882af88836cb9d77265e8e39dadfb8c4f26388a7c95efa05d1831847b: Status 404 returned error can't find the container with id a35c8d2882af88836cb9d77265e8e39dadfb8c4f26388a7c95efa05d1831847b Oct 07 12:25:05 crc kubenswrapper[4854]: W1007 12:25:05.818403 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40b8b82d_cfd5_41d7_8673_5774db092c85.slice/crio-abc2a368d1c50b446c8e3aa5813093830254c7644b0a1574f0462e726ccbf2cb WatchSource:0}: Error finding container abc2a368d1c50b446c8e3aa5813093830254c7644b0a1574f0462e726ccbf2cb: Status 404 returned error can't find the container with id abc2a368d1c50b446c8e3aa5813093830254c7644b0a1574f0462e726ccbf2cb Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.826029 4854 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"540475fe-5ef7-4652-9102-3f6cf983f0d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:24:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:24:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:24:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ae34795ed2fc8c3fe6c000b7e45a2752aad9f160853bb5b214bcc2a25fb3d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T12:24:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66f7ba3fd3bb29e8808c12161f5d6f919e85920f49b10d6c42490f3c1c2c5067\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T12:24:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ba427dd9f64f0aaae3cab9458c9c7a64db6fe9b3acf4939fcc109a551a85892\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T12:24:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d145a2ec17b5518fe133d5ca43ac1e7a6a8b4ab17a95fd32d3a0c23b4be70a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1cb040918cc593d2eb1d6656be36d11565041ce87451b235d76783ad504a16c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T12:24:48Z\\\",\\\"message\\\":\\\"W1007 12:24:47.993886 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1007 
12:24:47.994237 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759839887 cert, and key in /tmp/serving-cert-945873907/serving-signer.crt, /tmp/serving-cert-945873907/serving-signer.key\\\\nI1007 12:24:48.205910 1 observer_polling.go:159] Starting file observer\\\\nW1007 12:24:48.210822 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1007 12:24:48.211063 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 12:24:48.213807 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-945873907/tls.crt::/tmp/serving-cert-945873907/tls.key\\\\\\\"\\\\nF1007 12:24:48.440436 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T12:24:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3d145a2ec17b5518fe133d5ca43ac1e7a6a8b4ab17a95fd32d3a0c23b4be70a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T12:25:04Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 12:25:04.151841 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 12:25:04.152115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 12:25:04.153363 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3164975325/tls.crt::/tmp/serving-cert-3164975325/tls.key\\\\\\\"\\\\nI1007 12:25:04.710164 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 12:25:04.713643 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 12:25:04.713671 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 12:25:04.713696 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 12:25:04.713703 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 12:25:04.728062 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 12:25:04.728092 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 12:25:04.728098 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 12:25:04.728102 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 12:25:04.728106 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 12:25:04.728109 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 12:25:04.728112 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' 
detected.\\\\nI1007 12:25:04.728420 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 12:25:04.732086 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T12:24:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b9739cf9b935b4af8578327ef6891884eae28ee59a768c30deab7cc49ed73ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T12:24:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17fecde24011ce4cc70952a55756d12d2e3ae5be575d349c246bf678ea033408\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17fecde24011ce4cc70952a55756d12d2e3ae5be575d349c246bf678ea033408\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T12:24:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T12:24:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T12:24:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T12:25:05Z is after 2025-08-24T17:21:41Z" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.841488 4854 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T12:25:05Z is after 2025-08-24T17:21:41Z" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.856671 4854 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40b8b82d-cfd5-41d7-8673-5774db092c85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bczjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bczjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T12:25:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vbjnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T12:25:05Z is after 2025-08-24T17:21:41Z" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.875248 4854 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nkr42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"260ab665-6a8a-44ee-9a16-5ff284b35eba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-khkwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T12:25:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nkr42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T12:25:05Z is after 2025-08-24T17:21:41Z" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.877207 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"2a3100ee29d9758c03f42747e30ca70f1f342430147e999be4d130452e0d54b1"} Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.878330 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" event={"ID":"40b8b82d-cfd5-41d7-8673-5774db092c85","Type":"ContainerStarted","Data":"abc2a368d1c50b446c8e3aa5813093830254c7644b0a1574f0462e726ccbf2cb"} Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.879324 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-nkr42" event={"ID":"260ab665-6a8a-44ee-9a16-5ff284b35eba","Type":"ContainerStarted","Data":"a35c8d2882af88836cb9d77265e8e39dadfb8c4f26388a7c95efa05d1831847b"} Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.880766 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.883130 4854 scope.go:117] "RemoveContainer" containerID="d3d145a2ec17b5518fe133d5ca43ac1e7a6a8b4ab17a95fd32d3a0c23b4be70a" Oct 07 12:25:05 crc kubenswrapper[4854]: E1007 12:25:05.883297 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.885429 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-4m45f" event={"ID":"dc886f7c-d5f2-4e9e-baf4-95aa9ca9f32b","Type":"ContainerStarted","Data":"0dd03f17293997d49594aaa1d1dd5cf367d839e179fc5587ff391c9800c47d6f"} Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.885516 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-4m45f" event={"ID":"dc886f7c-d5f2-4e9e-baf4-95aa9ca9f32b","Type":"ContainerStarted","Data":"ccc3d471e972b7a77fdbbe3d65d54a88ac04fa73451b55661482f0e4a8f030da"} Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.887868 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"d4de333fe49353da88c8c6f0785d1a8e13cda2e8faf8497f4f3e408b7cbac524"} Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.896917 4854 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ccb7160-6ff2-43f3-927b-5bf4aced4993\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7zc7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7zc7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7zc7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7zc7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7zc7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7zc7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni
/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7zc7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7zc7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7zc7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T12:25:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-m4vxt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T12:25:05Z is after 2025-08-24T17:21:41Z" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.910623 4854 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a3100ee29d9758c03f42747e30ca70f1f342430147e999be4d130452e0d54b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T12:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T12:25:05Z is after 2025-08-24T17:21:41Z" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.924331 4854 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T12:25:05Z is after 2025-08-24T17:21:41Z" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.940968 4854 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T12:25:05Z is after 2025-08-24T17:21:41Z" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.957048 4854 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4de333fe49353da88c8c6f0785d1a8e13cda2e8faf8497f4f3e408b7cbac524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T12:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e80226dfb28fd9d2205ecba10801a554534a59e15355aabe8822c19affefa087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T12:25:04Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T12:25:05Z is after 2025-08-24T17:21:41Z" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.982025 4854 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hw6x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"262977a1-789b-4e3a-b893-27b1eccc894b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f46sm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f46sm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f46sm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-f46sm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f46sm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f46sm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f46sm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T12:25:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hw6x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T12:25:05Z is after 2025-08-24T17:21:41Z" Oct 07 12:25:05 crc kubenswrapper[4854]: I1007 12:25:05.998074 4854 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T12:25:05Z is after 2025-08-24T17:21:41Z" Oct 07 12:25:06 crc kubenswrapper[4854]: I1007 12:25:06.009516 4854 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4m45f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc886f7c-d5f2-4e9e-baf4-95aa9ca9f32b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd03f17293997d49594aaa1d1dd5cf367d839e179fc5587ff391c9800c47d6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T12:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxnjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T12:25:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4m45f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T12:25:06Z is after 2025-08-24T17:21:41Z" Oct 07 12:25:06 crc kubenswrapper[4854]: I1007 12:25:06.028310 4854 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"540475fe-5ef7-4652-9102-3f6cf983f0d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:24:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:24:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:24:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ae34795ed2fc8c3fe6c000b7e45a2752aad9f160853bb5b214bcc2a25fb3d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T12:24:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66f7ba3fd3bb29e8808c12161f5d6f919e85920f49b10d6c42490f3c1c2c5067\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T12:24:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ba427dd9f64f0aaae3cab9458c9c7a64db6fe9b3acf4939fcc109a551a85892\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T12:24:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d145a2ec17b5518fe133d5ca43ac1e7a6a8b4ab17a95fd32d3a0c23b4be70a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3d145a2ec17b5518fe133d5ca43ac1e7a6a8b4ab17a95fd32d3a0c23b4be70a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T12:25:04Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 12:25:04.151841 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 12:25:04.152115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 12:25:04.153363 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3164975325/tls.crt::/tmp/serving-cert-3164975325/tls.key\\\\\\\"\\\\nI1007 12:25:04.710164 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 12:25:04.713643 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 12:25:04.713671 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 12:25:04.713696 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 12:25:04.713703 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 12:25:04.728062 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 12:25:04.728092 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 12:25:04.728098 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 12:25:04.728102 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 12:25:04.728106 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 12:25:04.728109 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 12:25:04.728112 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 12:25:04.728420 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 12:25:04.732086 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T12:24:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b9739cf9b935b4af8578327ef6891884eae28ee59a768c30deab7cc49ed73ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T12:24:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17fecde24011ce4cc70952a55756d12d2e3ae5be575d349c246bf678ea033408\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17fecde24011ce4cc70952a55756d12d2e3ae5be575d349c246bf678ea033408\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T12:24:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T12:24:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T12:24:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T12:25:06Z is after 2025-08-24T17:21:41Z" Oct 07 12:25:06 crc kubenswrapper[4854]: I1007 12:25:06.043376 4854 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T12:25:06Z is after 2025-08-24T17:21:41Z" Oct 07 12:25:06 crc kubenswrapper[4854]: I1007 12:25:06.058166 4854 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40b8b82d-cfd5-41d7-8673-5774db092c85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bczjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bczjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T12:25:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vbjnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T12:25:06Z is after 2025-08-24T17:21:41Z" Oct 07 12:25:06 crc kubenswrapper[4854]: I1007 12:25:06.076203 4854 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nkr42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"260ab665-6a8a-44ee-9a16-5ff284b35eba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-khkwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T12:25:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nkr42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T12:25:06Z is after 2025-08-24T17:21:41Z" Oct 07 12:25:06 crc kubenswrapper[4854]: I1007 12:25:06.097358 4854 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ccb7160-6ff2-43f3-927b-5bf4aced4993\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7zc7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7zc7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7zc7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7zc7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7zc7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7zc7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7zc7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7zc7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7zc7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T12:25:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-m4vxt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T12:25:06Z is after 2025-08-24T17:21:41Z" Oct 07 12:25:06 crc kubenswrapper[4854]: I1007 12:25:06.500416 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Oct 07 12:25:06 crc kubenswrapper[4854]: I1007 12:25:06.505507 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9ccb7160-6ff2-43f3-927b-5bf4aced4993-ovn-node-metrics-cert\") pod \"ovnkube-node-m4vxt\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:06 crc kubenswrapper[4854]: I1007 12:25:06.523301 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Oct 07 12:25:06 crc kubenswrapper[4854]: I1007 12:25:06.682461 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Oct 07 12:25:06 crc kubenswrapper[4854]: E1007 12:25:06.691160 4854 configmap.go:193] Couldn't get configMap openshift-ovn-kubernetes/env-overrides: failed to sync configmap cache: timed out waiting for the condition Oct 07 12:25:06 crc kubenswrapper[4854]: E1007 12:25:06.691314 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9ccb7160-6ff2-43f3-927b-5bf4aced4993-env-overrides podName:9ccb7160-6ff2-43f3-927b-5bf4aced4993 nodeName:}" failed. No retries permitted until 2025-10-07 12:25:07.191277542 +0000 UTC m=+23.179109797 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "env-overrides" (UniqueName: "kubernetes.io/configmap/9ccb7160-6ff2-43f3-927b-5bf4aced4993-env-overrides") pod "ovnkube-node-m4vxt" (UID: "9ccb7160-6ff2-43f3-927b-5bf4aced4993") : failed to sync configmap cache: timed out waiting for the condition Oct 07 12:25:06 crc kubenswrapper[4854]: E1007 12:25:06.691641 4854 configmap.go:193] Couldn't get configMap openshift-multus/default-cni-sysctl-allowlist: failed to sync configmap cache: timed out waiting for the condition Oct 07 12:25:06 crc kubenswrapper[4854]: E1007 12:25:06.691674 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/262977a1-789b-4e3a-b893-27b1eccc894b-cni-sysctl-allowlist podName:262977a1-789b-4e3a-b893-27b1eccc894b nodeName:}" failed. No retries permitted until 2025-10-07 12:25:07.191666203 +0000 UTC m=+23.179498458 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cni-sysctl-allowlist" (UniqueName: "kubernetes.io/configmap/262977a1-789b-4e3a-b893-27b1eccc894b-cni-sysctl-allowlist") pod "multus-additional-cni-plugins-7hw6x" (UID: "262977a1-789b-4e3a-b893-27b1eccc894b") : failed to sync configmap cache: timed out waiting for the condition Oct 07 12:25:06 crc kubenswrapper[4854]: E1007 12:25:06.691698 4854 configmap.go:193] Couldn't get configMap openshift-ovn-kubernetes/ovnkube-config: failed to sync configmap cache: timed out waiting for the condition Oct 07 12:25:06 crc kubenswrapper[4854]: E1007 12:25:06.691718 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9ccb7160-6ff2-43f3-927b-5bf4aced4993-ovnkube-config podName:9ccb7160-6ff2-43f3-927b-5bf4aced4993 nodeName:}" failed. No retries permitted until 2025-10-07 12:25:07.191712864 +0000 UTC m=+23.179545119 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "ovnkube-config" (UniqueName: "kubernetes.io/configmap/9ccb7160-6ff2-43f3-927b-5bf4aced4993-ovnkube-config") pod "ovnkube-node-m4vxt" (UID: "9ccb7160-6ff2-43f3-927b-5bf4aced4993") : failed to sync configmap cache: timed out waiting for the condition Oct 07 12:25:06 crc kubenswrapper[4854]: I1007 12:25:06.691810 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9ccb7160-6ff2-43f3-927b-5bf4aced4993-ovnkube-script-lib\") pod \"ovnkube-node-m4vxt\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:06 crc kubenswrapper[4854]: E1007 12:25:06.704486 4854 projected.go:288] Couldn't get configMap openshift-ovn-kubernetes/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Oct 07 12:25:06 crc kubenswrapper[4854]: E1007 12:25:06.704571 4854 projected.go:194] Error preparing data for projected volume kube-api-access-b7zc7 for pod openshift-ovn-kubernetes/ovnkube-node-m4vxt: failed to sync configmap cache: timed out waiting for the condition Oct 07 12:25:06 crc kubenswrapper[4854]: E1007 12:25:06.704644 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9ccb7160-6ff2-43f3-927b-5bf4aced4993-kube-api-access-b7zc7 podName:9ccb7160-6ff2-43f3-927b-5bf4aced4993 nodeName:}" failed. No retries permitted until 2025-10-07 12:25:07.204624492 +0000 UTC m=+23.192456737 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-b7zc7" (UniqueName: "kubernetes.io/projected/9ccb7160-6ff2-43f3-927b-5bf4aced4993-kube-api-access-b7zc7") pod "ovnkube-node-m4vxt" (UID: "9ccb7160-6ff2-43f3-927b-5bf4aced4993") : failed to sync configmap cache: timed out waiting for the condition Oct 07 12:25:06 crc kubenswrapper[4854]: I1007 12:25:06.754790 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Oct 07 12:25:06 crc kubenswrapper[4854]: I1007 12:25:06.818057 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Oct 07 12:25:06 crc kubenswrapper[4854]: I1007 12:25:06.892937 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" event={"ID":"40b8b82d-cfd5-41d7-8673-5774db092c85","Type":"ContainerStarted","Data":"53d8a42d287a22fa4f3c611f84f8cfb550161a803c578daca9c8e24c3cbf7ba0"} Oct 07 12:25:06 crc kubenswrapper[4854]: I1007 12:25:06.893017 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" event={"ID":"40b8b82d-cfd5-41d7-8673-5774db092c85","Type":"ContainerStarted","Data":"4999b61b3dbea84a514ca4a92eeb5610c1abb6cd234764d1084d95f3698e08c8"} Oct 07 12:25:06 crc kubenswrapper[4854]: I1007 12:25:06.894223 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Oct 07 12:25:06 crc kubenswrapper[4854]: I1007 12:25:06.894730 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nkr42" event={"ID":"260ab665-6a8a-44ee-9a16-5ff284b35eba","Type":"ContainerStarted","Data":"892e830d128143e462f68fe62bb8ad0c67d8a5f4e51a57e1710e037d34ff9c51"} Oct 07 12:25:06 crc kubenswrapper[4854]: I1007 12:25:06.908571 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Oct 07 12:25:06 crc kubenswrapper[4854]: I1007 12:25:06.911385 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Oct 07 12:25:06 crc kubenswrapper[4854]: I1007 12:25:06.912813 4854 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a3100ee29d9758c03f42747e30ca70f1f342430147e999be4d130452e0d54b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T12:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T12:25:06Z is after 2025-08-24T17:21:41Z" Oct 07 12:25:06 crc kubenswrapper[4854]: I1007 12:25:06.919089 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Oct 07 12:25:06 crc kubenswrapper[4854]: I1007 12:25:06.926860 4854 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T12:25:06Z is after 2025-08-24T17:21:41Z" Oct 07 12:25:06 crc kubenswrapper[4854]: I1007 12:25:06.937604 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Oct 07 12:25:06 crc kubenswrapper[4854]: I1007 12:25:06.941893 4854 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:03Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T12:25:06Z is after 2025-08-24T17:21:41Z" Oct 07 12:25:06 crc kubenswrapper[4854]: I1007 12:25:06.955542 4854 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4de333fe49353da88c8c6f0785d1a8e13cda2e8faf8497f4f3e408b7cbac524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T12:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e80226dfb28fd9d2205ecba10801a554534a59e15355aabe8822c19affefa087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T12:25:04Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T12:25:06Z is after 2025-08-24T17:21:41Z" Oct 07 12:25:06 crc kubenswrapper[4854]: I1007 12:25:06.958437 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Oct 07 12:25:06 crc kubenswrapper[4854]: I1007 12:25:06.974678 4854 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7hw6x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"262977a1-789b-4e3a-b893-27b1eccc894b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f46sm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f46sm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f46sm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-f46sm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f46sm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f46sm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f46sm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T12:25:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7hw6x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T12:25:06Z is after 2025-08-24T17:21:41Z" Oct 07 12:25:06 crc kubenswrapper[4854]: I1007 12:25:06.994085 4854 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4m45f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc886f7c-d5f2-4e9e-baf4-95aa9ca9f32b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T12:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd03f17293997d49594aaa1d1dd5cf367d839e179fc5587ff391c9800c47d6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T12:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxnjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T12:25:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4m45f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T12:25:06Z is after 2025-08-24T17:21:41Z" Oct 07 12:25:06 crc kubenswrapper[4854]: I1007 12:25:06.994452 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-vv8tc"] Oct 07 12:25:06 crc kubenswrapper[4854]: I1007 12:25:06.994857 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-vv8tc" Oct 07 12:25:06 crc kubenswrapper[4854]: I1007 12:25:06.996680 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Oct 07 12:25:06 crc kubenswrapper[4854]: I1007 12:25:06.996685 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Oct 07 12:25:06 crc kubenswrapper[4854]: I1007 12:25:06.996717 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Oct 07 12:25:06 crc kubenswrapper[4854]: I1007 12:25:06.996929 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Oct 07 12:25:07 crc kubenswrapper[4854]: I1007 12:25:07.045114 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Oct 07 12:25:07 crc kubenswrapper[4854]: I1007 12:25:07.066361 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podStartSLOduration=3.066335143 podStartE2EDuration="3.066335143s" podCreationTimestamp="2025-10-07 12:25:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:25:07.06590173 +0000 UTC m=+23.053733995" watchObservedRunningTime="2025-10-07 12:25:07.066335143 +0000 UTC m=+23.054167398" Oct 07 12:25:07 crc kubenswrapper[4854]: I1007 12:25:07.106813 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99zqt\" (UniqueName: \"kubernetes.io/projected/2f147cb3-67f8-4395-bf0f-80e285c68945-kube-api-access-99zqt\") pod \"node-ca-vv8tc\" (UID: \"2f147cb3-67f8-4395-bf0f-80e285c68945\") " pod="openshift-image-registry/node-ca-vv8tc" Oct 07 12:25:07 crc kubenswrapper[4854]: I1007 12:25:07.106854 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2f147cb3-67f8-4395-bf0f-80e285c68945-serviceca\") pod \"node-ca-vv8tc\" (UID: \"2f147cb3-67f8-4395-bf0f-80e285c68945\") " pod="openshift-image-registry/node-ca-vv8tc" Oct 07 12:25:07 crc kubenswrapper[4854]: I1007 12:25:07.106877 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2f147cb3-67f8-4395-bf0f-80e285c68945-host\") pod \"node-ca-vv8tc\" (UID: \"2f147cb3-67f8-4395-bf0f-80e285c68945\") " pod="openshift-image-registry/node-ca-vv8tc" Oct 07 12:25:07 crc kubenswrapper[4854]: I1007 12:25:07.147965 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-4m45f" podStartSLOduration=3.14794597 podStartE2EDuration="3.14794597s" podCreationTimestamp="2025-10-07 12:25:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:25:07.14761949 +0000 UTC m=+23.135451745" watchObservedRunningTime="2025-10-07 12:25:07.14794597 +0000 UTC m=+23.135778215" Oct 07 12:25:07 crc kubenswrapper[4854]: I1007 12:25:07.162244 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-nkr42" podStartSLOduration=3.162216347 podStartE2EDuration="3.162216347s" 
podCreationTimestamp="2025-10-07 12:25:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:25:07.161787985 +0000 UTC m=+23.149620240" watchObservedRunningTime="2025-10-07 12:25:07.162216347 +0000 UTC m=+23.150048602" Oct 07 12:25:07 crc kubenswrapper[4854]: I1007 12:25:07.208422 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99zqt\" (UniqueName: \"kubernetes.io/projected/2f147cb3-67f8-4395-bf0f-80e285c68945-kube-api-access-99zqt\") pod \"node-ca-vv8tc\" (UID: \"2f147cb3-67f8-4395-bf0f-80e285c68945\") " pod="openshift-image-registry/node-ca-vv8tc" Oct 07 12:25:07 crc kubenswrapper[4854]: I1007 12:25:07.208505 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2f147cb3-67f8-4395-bf0f-80e285c68945-serviceca\") pod \"node-ca-vv8tc\" (UID: \"2f147cb3-67f8-4395-bf0f-80e285c68945\") " pod="openshift-image-registry/node-ca-vv8tc" Oct 07 12:25:07 crc kubenswrapper[4854]: I1007 12:25:07.208537 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2f147cb3-67f8-4395-bf0f-80e285c68945-host\") pod \"node-ca-vv8tc\" (UID: \"2f147cb3-67f8-4395-bf0f-80e285c68945\") " pod="openshift-image-registry/node-ca-vv8tc" Oct 07 12:25:07 crc kubenswrapper[4854]: I1007 12:25:07.208568 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9ccb7160-6ff2-43f3-927b-5bf4aced4993-ovnkube-config\") pod \"ovnkube-node-m4vxt\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:07 crc kubenswrapper[4854]: I1007 12:25:07.208601 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/262977a1-789b-4e3a-b893-27b1eccc894b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7hw6x\" (UID: \"262977a1-789b-4e3a-b893-27b1eccc894b\") " pod="openshift-multus/multus-additional-cni-plugins-7hw6x" Oct 07 12:25:07 crc kubenswrapper[4854]: I1007 12:25:07.208662 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9ccb7160-6ff2-43f3-927b-5bf4aced4993-env-overrides\") pod \"ovnkube-node-m4vxt\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:07 crc kubenswrapper[4854]: I1007 12:25:07.208689 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7zc7\" (UniqueName: \"kubernetes.io/projected/9ccb7160-6ff2-43f3-927b-5bf4aced4993-kube-api-access-b7zc7\") pod \"ovnkube-node-m4vxt\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:07 crc kubenswrapper[4854]: I1007 12:25:07.208747 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2f147cb3-67f8-4395-bf0f-80e285c68945-host\") pod \"node-ca-vv8tc\" (UID: \"2f147cb3-67f8-4395-bf0f-80e285c68945\") " pod="openshift-image-registry/node-ca-vv8tc" Oct 07 12:25:07 crc kubenswrapper[4854]: I1007 12:25:07.209495 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/9ccb7160-6ff2-43f3-927b-5bf4aced4993-env-overrides\") pod \"ovnkube-node-m4vxt\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:07 crc kubenswrapper[4854]: I1007 12:25:07.209495 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/262977a1-789b-4e3a-b893-27b1eccc894b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7hw6x\" (UID: \"262977a1-789b-4e3a-b893-27b1eccc894b\") " pod="openshift-multus/multus-additional-cni-plugins-7hw6x" Oct 07 12:25:07 crc kubenswrapper[4854]: I1007 12:25:07.209585 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9ccb7160-6ff2-43f3-927b-5bf4aced4993-ovnkube-config\") pod \"ovnkube-node-m4vxt\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:07 crc kubenswrapper[4854]: I1007 12:25:07.210382 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2f147cb3-67f8-4395-bf0f-80e285c68945-serviceca\") pod \"node-ca-vv8tc\" (UID: \"2f147cb3-67f8-4395-bf0f-80e285c68945\") " pod="openshift-image-registry/node-ca-vv8tc" Oct 07 12:25:07 crc kubenswrapper[4854]: I1007 12:25:07.213758 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7zc7\" (UniqueName: \"kubernetes.io/projected/9ccb7160-6ff2-43f3-927b-5bf4aced4993-kube-api-access-b7zc7\") pod \"ovnkube-node-m4vxt\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:07 crc kubenswrapper[4854]: I1007 12:25:07.232782 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99zqt\" (UniqueName: \"kubernetes.io/projected/2f147cb3-67f8-4395-bf0f-80e285c68945-kube-api-access-99zqt\") pod \"node-ca-vv8tc\" (UID: \"2f147cb3-67f8-4395-bf0f-80e285c68945\") " pod="openshift-image-registry/node-ca-vv8tc" Oct 07 12:25:07 crc kubenswrapper[4854]: I1007 12:25:07.259704 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=1.259682648 podStartE2EDuration="1.259682648s" podCreationTimestamp="2025-10-07 12:25:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:25:07.259076991 +0000 UTC m=+23.246909246" watchObservedRunningTime="2025-10-07 12:25:07.259682648 +0000 UTC m=+23.247514903" Oct 07 12:25:07 crc kubenswrapper[4854]: I1007 12:25:07.307685 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:07 crc kubenswrapper[4854]: I1007 12:25:07.314788 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-7hw6x" Oct 07 12:25:07 crc kubenswrapper[4854]: W1007 12:25:07.329355 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod262977a1_789b_4e3a_b893_27b1eccc894b.slice/crio-26902e6d5545f7b7816fd21bd95ed762b346a4b7ff02a84f84944a7195154a63 WatchSource:0}: Error finding container 26902e6d5545f7b7816fd21bd95ed762b346a4b7ff02a84f84944a7195154a63: Status 404 returned error can't find the container with id 26902e6d5545f7b7816fd21bd95ed762b346a4b7ff02a84f84944a7195154a63 Oct 07 12:25:07 crc kubenswrapper[4854]: I1007 12:25:07.352258 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ppb86"] Oct 07 12:25:07 crc kubenswrapper[4854]: I1007 12:25:07.352881 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ppb86" Oct 07 12:25:07 crc kubenswrapper[4854]: I1007 12:25:07.354538 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Oct 07 12:25:07 crc kubenswrapper[4854]: I1007 12:25:07.355868 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Oct 07 12:25:07 crc kubenswrapper[4854]: I1007 12:25:07.379631 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-m6k45"] Oct 07 12:25:07 crc kubenswrapper[4854]: I1007 12:25:07.380095 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m6k45" Oct 07 12:25:07 crc kubenswrapper[4854]: E1007 12:25:07.380174 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-m6k45" podUID="f0099a86-9473-4618-9266-2eb460d09150" Oct 07 12:25:07 crc kubenswrapper[4854]: I1007 12:25:07.411832 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/260030f0-2db2-4d40-9567-2a43e0cef930-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-ppb86\" (UID: \"260030f0-2db2-4d40-9567-2a43e0cef930\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ppb86" Oct 07 12:25:07 crc kubenswrapper[4854]: I1007 12:25:07.411876 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/260030f0-2db2-4d40-9567-2a43e0cef930-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-ppb86\" (UID: \"260030f0-2db2-4d40-9567-2a43e0cef930\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ppb86" Oct 07 12:25:07 crc kubenswrapper[4854]: I1007 12:25:07.412775 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mdft\" (UniqueName: \"kubernetes.io/projected/260030f0-2db2-4d40-9567-2a43e0cef930-kube-api-access-7mdft\") pod \"ovnkube-control-plane-749d76644c-ppb86\" (UID: \"260030f0-2db2-4d40-9567-2a43e0cef930\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ppb86" Oct 07 12:25:07 crc kubenswrapper[4854]: I1007 12:25:07.412827 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/260030f0-2db2-4d40-9567-2a43e0cef930-env-overrides\") pod \"ovnkube-control-plane-749d76644c-ppb86\" (UID: \"260030f0-2db2-4d40-9567-2a43e0cef930\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ppb86" Oct 07 12:25:07 crc kubenswrapper[4854]: I1007 12:25:07.514364 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kwgc\" (UniqueName: \"kubernetes.io/projected/f0099a86-9473-4618-9266-2eb460d09150-kube-api-access-5kwgc\") pod \"network-metrics-daemon-m6k45\" (UID: \"f0099a86-9473-4618-9266-2eb460d09150\") " pod="openshift-multus/network-metrics-daemon-m6k45" Oct 07 12:25:07 crc kubenswrapper[4854]: I1007 12:25:07.514432 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mdft\" (UniqueName: \"kubernetes.io/projected/260030f0-2db2-4d40-9567-2a43e0cef930-kube-api-access-7mdft\") pod \"ovnkube-control-plane-749d76644c-ppb86\" (UID: \"260030f0-2db2-4d40-9567-2a43e0cef930\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ppb86" Oct 07 12:25:07 crc kubenswrapper[4854]: I1007 12:25:07.514469 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f0099a86-9473-4618-9266-2eb460d09150-metrics-certs\") pod \"network-metrics-daemon-m6k45\" (UID: \"f0099a86-9473-4618-9266-2eb460d09150\") " pod="openshift-multus/network-metrics-daemon-m6k45" Oct 07 12:25:07 crc kubenswrapper[4854]: I1007 12:25:07.514512 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/260030f0-2db2-4d40-9567-2a43e0cef930-env-overrides\") pod \"ovnkube-control-plane-749d76644c-ppb86\" (UID: \"260030f0-2db2-4d40-9567-2a43e0cef930\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ppb86" Oct 07 12:25:07 crc kubenswrapper[4854]: I1007 12:25:07.514639 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/260030f0-2db2-4d40-9567-2a43e0cef930-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-ppb86\" (UID: \"260030f0-2db2-4d40-9567-2a43e0cef930\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ppb86" Oct 07 12:25:07 crc kubenswrapper[4854]: I1007 12:25:07.514670 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/260030f0-2db2-4d40-9567-2a43e0cef930-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-ppb86\" (UID: \"260030f0-2db2-4d40-9567-2a43e0cef930\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ppb86" Oct 07 12:25:07 crc kubenswrapper[4854]: I1007 12:25:07.515739 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/260030f0-2db2-4d40-9567-2a43e0cef930-env-overrides\") pod \"ovnkube-control-plane-749d76644c-ppb86\" (UID: \"260030f0-2db2-4d40-9567-2a43e0cef930\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ppb86" Oct 07 12:25:07 crc kubenswrapper[4854]: I1007 12:25:07.515888 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/260030f0-2db2-4d40-9567-2a43e0cef930-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-ppb86\" (UID: \"260030f0-2db2-4d40-9567-2a43e0cef930\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ppb86" Oct 07 12:25:07 crc kubenswrapper[4854]: I1007 12:25:07.520418 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/260030f0-2db2-4d40-9567-2a43e0cef930-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-ppb86\" (UID: \"260030f0-2db2-4d40-9567-2a43e0cef930\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ppb86" Oct 07 12:25:07 crc kubenswrapper[4854]: I1007 12:25:07.535776 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mdft\" (UniqueName: \"kubernetes.io/projected/260030f0-2db2-4d40-9567-2a43e0cef930-kube-api-access-7mdft\") pod \"ovnkube-control-plane-749d76644c-ppb86\" (UID: \"260030f0-2db2-4d40-9567-2a43e0cef930\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ppb86" Oct 07 12:25:07 crc kubenswrapper[4854]: I1007 12:25:07.578919 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-vv8tc" Oct 07 12:25:07 crc kubenswrapper[4854]: I1007 12:25:07.586125 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ppb86" Oct 07 12:25:07 crc kubenswrapper[4854]: W1007 12:25:07.601496 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f147cb3_67f8_4395_bf0f_80e285c68945.slice/crio-cc492ad3db215dd063f51b415d0ef769593cef41c8cd25bf775c5aff54c35014 WatchSource:0}: Error finding container cc492ad3db215dd063f51b415d0ef769593cef41c8cd25bf775c5aff54c35014: Status 404 returned error can't find the container with id cc492ad3db215dd063f51b415d0ef769593cef41c8cd25bf775c5aff54c35014 Oct 07 12:25:07 crc kubenswrapper[4854]: W1007 12:25:07.608789 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod260030f0_2db2_4d40_9567_2a43e0cef930.slice/crio-3b08a764418ff66ccf82a50a9c480e190ba83e4eda479544bb7131ee9f5894a8 WatchSource:0}: Error finding container 3b08a764418ff66ccf82a50a9c480e190ba83e4eda479544bb7131ee9f5894a8: Status 404 returned error can't find the container with id 3b08a764418ff66ccf82a50a9c480e190ba83e4eda479544bb7131ee9f5894a8 Oct 07 12:25:07 crc kubenswrapper[4854]: I1007 12:25:07.616009 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kwgc\" (UniqueName: \"kubernetes.io/projected/f0099a86-9473-4618-9266-2eb460d09150-kube-api-access-5kwgc\") pod \"network-metrics-daemon-m6k45\" (UID: \"f0099a86-9473-4618-9266-2eb460d09150\") " pod="openshift-multus/network-metrics-daemon-m6k45" Oct 07 12:25:07 crc kubenswrapper[4854]: I1007 12:25:07.616059 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f0099a86-9473-4618-9266-2eb460d09150-metrics-certs\") pod \"network-metrics-daemon-m6k45\" (UID: \"f0099a86-9473-4618-9266-2eb460d09150\") " pod="openshift-multus/network-metrics-daemon-m6k45" Oct 07 12:25:07 crc kubenswrapper[4854]: E1007 12:25:07.616234 4854 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 12:25:07 crc kubenswrapper[4854]: E1007 12:25:07.616279 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0099a86-9473-4618-9266-2eb460d09150-metrics-certs podName:f0099a86-9473-4618-9266-2eb460d09150 nodeName:}" failed. No retries permitted until 2025-10-07 12:25:08.116264769 +0000 UTC m=+24.104097014 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f0099a86-9473-4618-9266-2eb460d09150-metrics-certs") pod "network-metrics-daemon-m6k45" (UID: "f0099a86-9473-4618-9266-2eb460d09150") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 12:25:07 crc kubenswrapper[4854]: I1007 12:25:07.638661 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kwgc\" (UniqueName: \"kubernetes.io/projected/f0099a86-9473-4618-9266-2eb460d09150-kube-api-access-5kwgc\") pod \"network-metrics-daemon-m6k45\" (UID: \"f0099a86-9473-4618-9266-2eb460d09150\") " pod="openshift-multus/network-metrics-daemon-m6k45" Oct 07 12:25:07 crc kubenswrapper[4854]: I1007 12:25:07.701759 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 12:25:07 crc kubenswrapper[4854]: I1007 12:25:07.701794 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 12:25:07 crc kubenswrapper[4854]: I1007 12:25:07.701874 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 12:25:07 crc kubenswrapper[4854]: E1007 12:25:07.701907 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 12:25:07 crc kubenswrapper[4854]: E1007 12:25:07.702051 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 12:25:07 crc kubenswrapper[4854]: E1007 12:25:07.702171 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 12:25:07 crc kubenswrapper[4854]: I1007 12:25:07.819493 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 12:25:07 crc kubenswrapper[4854]: E1007 12:25:07.819909 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 12:25:11.819868275 +0000 UTC m=+27.807700540 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:07 crc kubenswrapper[4854]: I1007 12:25:07.820375 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 12:25:07 crc kubenswrapper[4854]: I1007 12:25:07.820416 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 12:25:07 crc kubenswrapper[4854]: I1007 12:25:07.820446 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 12:25:07 crc kubenswrapper[4854]: I1007 12:25:07.820483 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 12:25:07 crc kubenswrapper[4854]: E1007 12:25:07.820639 4854 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 12:25:07 crc kubenswrapper[4854]: E1007 12:25:07.820692 4854 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 12:25:07 crc kubenswrapper[4854]: E1007 12:25:07.820731 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 12:25:11.820705579 +0000 UTC m=+27.808538044 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 12:25:07 crc kubenswrapper[4854]: E1007 12:25:07.820739 4854 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 12:25:07 crc kubenswrapper[4854]: E1007 12:25:07.820754 4854 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 12:25:07 crc kubenswrapper[4854]: E1007 12:25:07.820778 4854 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 12:25:07 crc kubenswrapper[4854]: E1007 12:25:07.820794 4854 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 12:25:07 crc kubenswrapper[4854]: E1007 12:25:07.820802 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 12:25:11.820777661 +0000 UTC m=+27.808610126 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 12:25:07 crc kubenswrapper[4854]: E1007 12:25:07.820810 4854 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 12:25:07 crc kubenswrapper[4854]: E1007 12:25:07.820876 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-07 12:25:11.820855914 +0000 UTC m=+27.808688169 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 12:25:07 crc kubenswrapper[4854]: E1007 12:25:07.820804 4854 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 12:25:07 crc kubenswrapper[4854]: E1007 12:25:07.821009 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-07 12:25:11.820976067 +0000 UTC m=+27.808808322 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 12:25:07 crc kubenswrapper[4854]: I1007 12:25:07.901515 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ppb86" event={"ID":"260030f0-2db2-4d40-9567-2a43e0cef930","Type":"ContainerStarted","Data":"1b12263eb577d85b578421f16220cb7353a629fb03e8d1a17c5eb6a89453584a"} Oct 07 12:25:07 crc kubenswrapper[4854]: I1007 12:25:07.901583 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ppb86" event={"ID":"260030f0-2db2-4d40-9567-2a43e0cef930","Type":"ContainerStarted","Data":"3b08a764418ff66ccf82a50a9c480e190ba83e4eda479544bb7131ee9f5894a8"} Oct 07 12:25:07 crc kubenswrapper[4854]: I1007 12:25:07.903583 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vv8tc" event={"ID":"2f147cb3-67f8-4395-bf0f-80e285c68945","Type":"ContainerStarted","Data":"0e2655a95a565db7d6107efdd8728dd6dca59b50cd0807af48ab6705009d9fc6"} Oct 07 12:25:07 crc kubenswrapper[4854]: I1007 12:25:07.903664 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vv8tc" event={"ID":"2f147cb3-67f8-4395-bf0f-80e285c68945","Type":"ContainerStarted","Data":"cc492ad3db215dd063f51b415d0ef769593cef41c8cd25bf775c5aff54c35014"} Oct 07 12:25:07 crc kubenswrapper[4854]: I1007 12:25:07.905073 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"41c3a524b7775ddac28b71bf76e985022fa8b3b320d8fc30f8ea21f48fcf4b14"} Oct 07 12:25:07 crc kubenswrapper[4854]: I1007 12:25:07.917565 4854 generic.go:334] "Generic (PLEG): container finished" podID="9ccb7160-6ff2-43f3-927b-5bf4aced4993" containerID="572357f77919a83d6f419fffd64a2c085e2744388f232638860c944a7eb76b75" exitCode=0 Oct 07 12:25:07 crc kubenswrapper[4854]: I1007 12:25:07.917659 4854 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" event={"ID":"9ccb7160-6ff2-43f3-927b-5bf4aced4993","Type":"ContainerDied","Data":"572357f77919a83d6f419fffd64a2c085e2744388f232638860c944a7eb76b75"} Oct 07 12:25:07 crc kubenswrapper[4854]: I1007 12:25:07.917747 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" event={"ID":"9ccb7160-6ff2-43f3-927b-5bf4aced4993","Type":"ContainerStarted","Data":"81a1b553acc5e91bccf70674e862b6a2c4174d17ab70276e16fefe8b99f48902"} Oct 07 12:25:07 crc kubenswrapper[4854]: I1007 12:25:07.919821 4854 generic.go:334] "Generic (PLEG): container finished" podID="262977a1-789b-4e3a-b893-27b1eccc894b" containerID="60c761aa0acc49c82a520db73587242df3526f22360c7f450fa32fc19210a7e2" exitCode=0 Oct 07 12:25:07 crc kubenswrapper[4854]: I1007 12:25:07.919931 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7hw6x" event={"ID":"262977a1-789b-4e3a-b893-27b1eccc894b","Type":"ContainerDied","Data":"60c761aa0acc49c82a520db73587242df3526f22360c7f450fa32fc19210a7e2"} Oct 07 12:25:07 crc kubenswrapper[4854]: I1007 12:25:07.919996 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7hw6x" event={"ID":"262977a1-789b-4e3a-b893-27b1eccc894b","Type":"ContainerStarted","Data":"26902e6d5545f7b7816fd21bd95ed762b346a4b7ff02a84f84944a7195154a63"} Oct 07 12:25:07 crc kubenswrapper[4854]: I1007 12:25:07.935016 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-vv8tc" podStartSLOduration=3.934991642 podStartE2EDuration="3.934991642s" podCreationTimestamp="2025-10-07 12:25:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:25:07.932663254 +0000 UTC m=+23.920495509" watchObservedRunningTime="2025-10-07 12:25:07.934991642 +0000 UTC m=+23.922823897" Oct 07 12:25:07 crc kubenswrapper[4854]: E1007 12:25:07.943186 4854 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-crc\" already exists" pod="openshift-etcd/etcd-crc" Oct 07 12:25:08 crc kubenswrapper[4854]: I1007 12:25:08.125130 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f0099a86-9473-4618-9266-2eb460d09150-metrics-certs\") pod \"network-metrics-daemon-m6k45\" (UID: \"f0099a86-9473-4618-9266-2eb460d09150\") " pod="openshift-multus/network-metrics-daemon-m6k45" Oct 07 12:25:08 crc kubenswrapper[4854]: E1007 12:25:08.125478 4854 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 12:25:08 crc kubenswrapper[4854]: E1007 12:25:08.125756 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0099a86-9473-4618-9266-2eb460d09150-metrics-certs podName:f0099a86-9473-4618-9266-2eb460d09150 nodeName:}" failed. No retries permitted until 2025-10-07 12:25:09.125736822 +0000 UTC m=+25.113569067 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f0099a86-9473-4618-9266-2eb460d09150-metrics-certs") pod "network-metrics-daemon-m6k45" (UID: "f0099a86-9473-4618-9266-2eb460d09150") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 12:25:08 crc kubenswrapper[4854]: I1007 12:25:08.445737 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 12:25:08 crc kubenswrapper[4854]: I1007 12:25:08.449433 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 12:25:08 crc kubenswrapper[4854]: I1007 12:25:08.457452 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Oct 07 12:25:08 crc kubenswrapper[4854]: I1007 12:25:08.928267 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" event={"ID":"9ccb7160-6ff2-43f3-927b-5bf4aced4993","Type":"ContainerStarted","Data":"6c57b33be29ca7545e4fc52914dd5701cf15a096a947d6332438f046450bed12"} Oct 07 12:25:08 crc kubenswrapper[4854]: I1007 12:25:08.928346 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" event={"ID":"9ccb7160-6ff2-43f3-927b-5bf4aced4993","Type":"ContainerStarted","Data":"89728864853aa241a579c1af1bf877a2d1c765c22b0a6e59ebdcd0c5287bf996"} Oct 07 12:25:08 crc kubenswrapper[4854]: I1007 12:25:08.928368 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" event={"ID":"9ccb7160-6ff2-43f3-927b-5bf4aced4993","Type":"ContainerStarted","Data":"e0e44ff6bb2d828e0c7ea0781e5df4b33d788969928f81f8c0cc612506a4ac35"} Oct 07 12:25:08 crc kubenswrapper[4854]: I1007 12:25:08.928385 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" event={"ID":"9ccb7160-6ff2-43f3-927b-5bf4aced4993","Type":"ContainerStarted","Data":"0f2ca50c8f0029b31b8fbe483d0ce72dfc8969ae85381c75e1c379d1f868447f"} Oct 07 12:25:08 crc kubenswrapper[4854]: I1007 12:25:08.928402 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" event={"ID":"9ccb7160-6ff2-43f3-927b-5bf4aced4993","Type":"ContainerStarted","Data":"21c4baccb3eb05829eb125222627fab79bca929659775b3eb7f11e24bc892906"} Oct 07 12:25:08 crc kubenswrapper[4854]: I1007 12:25:08.928417 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" event={"ID":"9ccb7160-6ff2-43f3-927b-5bf4aced4993","Type":"ContainerStarted","Data":"1dc2017bde039f1888538091f42cb032f1216ae9080e639ccff79fde98cc05ac"} Oct 07 12:25:08 crc kubenswrapper[4854]: I1007 12:25:08.930613 4854 generic.go:334] "Generic (PLEG): container finished" podID="262977a1-789b-4e3a-b893-27b1eccc894b" containerID="96ad06b2756584f0e70637506d5dc13be1094ac53524bd759e6f562b73783459" exitCode=0 Oct 07 12:25:08 crc kubenswrapper[4854]: I1007 12:25:08.930642 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7hw6x" event={"ID":"262977a1-789b-4e3a-b893-27b1eccc894b","Type":"ContainerDied","Data":"96ad06b2756584f0e70637506d5dc13be1094ac53524bd759e6f562b73783459"} Oct 07 12:25:08 crc kubenswrapper[4854]: I1007 12:25:08.934522 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ppb86" event={"ID":"260030f0-2db2-4d40-9567-2a43e0cef930","Type":"ContainerStarted","Data":"1b95b207f3b45a506e1534bf53d2cee85eef7ba0f772b73e6a3e0d793da86132"} Oct 07 12:25:08 crc kubenswrapper[4854]: E1007 12:25:08.944353 4854 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 12:25:08 crc kubenswrapper[4854]: I1007 12:25:08.976381 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=0.976356414 podStartE2EDuration="976.356414ms" podCreationTimestamp="2025-10-07 12:25:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:25:08.975093187 +0000 UTC m=+24.962925432" watchObservedRunningTime="2025-10-07 12:25:08.976356414 +0000 UTC m=+24.964188679" Oct 07 12:25:08 crc kubenswrapper[4854]: I1007 12:25:08.995012 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ppb86" podStartSLOduration=3.994983669 podStartE2EDuration="3.994983669s" podCreationTimestamp="2025-10-07 12:25:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:25:08.993885356 +0000 UTC m=+24.981717661" watchObservedRunningTime="2025-10-07 12:25:08.994983669 +0000 UTC m=+24.982815924" Oct 07 12:25:09 crc kubenswrapper[4854]: I1007 12:25:09.139083 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f0099a86-9473-4618-9266-2eb460d09150-metrics-certs\") pod \"network-metrics-daemon-m6k45\" (UID: \"f0099a86-9473-4618-9266-2eb460d09150\") " pod="openshift-multus/network-metrics-daemon-m6k45" Oct 07 12:25:09 crc kubenswrapper[4854]: E1007 12:25:09.139268 4854 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 12:25:09 crc kubenswrapper[4854]: E1007 12:25:09.139314 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0099a86-9473-4618-9266-2eb460d09150-metrics-certs podName:f0099a86-9473-4618-9266-2eb460d09150 nodeName:}" failed. No retries permitted until 2025-10-07 12:25:11.13929992 +0000 UTC m=+27.127132175 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f0099a86-9473-4618-9266-2eb460d09150-metrics-certs") pod "network-metrics-daemon-m6k45" (UID: "f0099a86-9473-4618-9266-2eb460d09150") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 12:25:09 crc kubenswrapper[4854]: I1007 12:25:09.702437 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 12:25:09 crc kubenswrapper[4854]: I1007 12:25:09.702462 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m6k45" Oct 07 12:25:09 crc kubenswrapper[4854]: I1007 12:25:09.702518 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 12:25:09 crc kubenswrapper[4854]: E1007 12:25:09.702580 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 12:25:09 crc kubenswrapper[4854]: I1007 12:25:09.702706 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 12:25:09 crc kubenswrapper[4854]: E1007 12:25:09.702698 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 12:25:09 crc kubenswrapper[4854]: E1007 12:25:09.702763 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m6k45" podUID="f0099a86-9473-4618-9266-2eb460d09150" Oct 07 12:25:09 crc kubenswrapper[4854]: E1007 12:25:09.702866 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 12:25:09 crc kubenswrapper[4854]: I1007 12:25:09.941325 4854 generic.go:334] "Generic (PLEG): container finished" podID="262977a1-789b-4e3a-b893-27b1eccc894b" containerID="fc55ffcfa2b751726e60bcbee62f680cb5686a070743c1e4df37cbc83ce1050f" exitCode=0 Oct 07 12:25:09 crc kubenswrapper[4854]: I1007 12:25:09.941395 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7hw6x" event={"ID":"262977a1-789b-4e3a-b893-27b1eccc894b","Type":"ContainerDied","Data":"fc55ffcfa2b751726e60bcbee62f680cb5686a070743c1e4df37cbc83ce1050f"} Oct 07 12:25:10 crc kubenswrapper[4854]: I1007 12:25:10.209798 4854 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 12:25:10 crc kubenswrapper[4854]: I1007 12:25:10.210703 4854 scope.go:117] "RemoveContainer" containerID="d3d145a2ec17b5518fe133d5ca43ac1e7a6a8b4ab17a95fd32d3a0c23b4be70a" Oct 07 12:25:10 crc kubenswrapper[4854]: E1007 12:25:10.210898 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 07 12:25:10 crc kubenswrapper[4854]: I1007 12:25:10.521279 4854 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 12:25:10 crc kubenswrapper[4854]: I1007 12:25:10.523608 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 12:25:10 crc kubenswrapper[4854]: I1007 12:25:10.523654 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 12:25:10 crc kubenswrapper[4854]: I1007 12:25:10.523666 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 12:25:10 crc kubenswrapper[4854]: I1007 12:25:10.523819 4854 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 07 12:25:10 crc kubenswrapper[4854]: I1007 12:25:10.533254 4854 kubelet_node_status.go:115] "Node was previously registered" node="crc" Oct 07 12:25:10 crc kubenswrapper[4854]: I1007 12:25:10.533642 4854 kubelet_node_status.go:79] "Successfully registered node" node="crc" Oct 07 12:25:10 crc kubenswrapper[4854]: I1007 12:25:10.534949 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 12:25:10 crc kubenswrapper[4854]: I1007 12:25:10.534986 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 12:25:10 crc kubenswrapper[4854]: I1007 12:25:10.534996 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 12:25:10 crc kubenswrapper[4854]: I1007 12:25:10.535014 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 12:25:10 crc kubenswrapper[4854]: I1007 12:25:10.535026 4854 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T12:25:10Z","lastTransitionTime":"2025-10-07T12:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 12:25:10 crc kubenswrapper[4854]: I1007 12:25:10.586523 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-nshdb"] Oct 07 12:25:10 crc kubenswrapper[4854]: I1007 12:25:10.587127 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nshdb" Oct 07 12:25:10 crc kubenswrapper[4854]: I1007 12:25:10.589232 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Oct 07 12:25:10 crc kubenswrapper[4854]: I1007 12:25:10.589972 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Oct 07 12:25:10 crc kubenswrapper[4854]: I1007 12:25:10.590193 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Oct 07 12:25:10 crc kubenswrapper[4854]: I1007 12:25:10.590231 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Oct 07 12:25:10 crc kubenswrapper[4854]: I1007 12:25:10.657661 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ad9c693-c3dd-4e37-9c35-4ac452cb28da-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-nshdb\" (UID: \"9ad9c693-c3dd-4e37-9c35-4ac452cb28da\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nshdb" Oct 07 12:25:10 crc kubenswrapper[4854]: I1007 12:25:10.657760 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9ad9c693-c3dd-4e37-9c35-4ac452cb28da-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-nshdb\" (UID: \"9ad9c693-c3dd-4e37-9c35-4ac452cb28da\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nshdb" Oct 07 12:25:10 crc kubenswrapper[4854]: I1007 12:25:10.658039 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9ad9c693-c3dd-4e37-9c35-4ac452cb28da-service-ca\") pod \"cluster-version-operator-5c965bbfc6-nshdb\" (UID: \"9ad9c693-c3dd-4e37-9c35-4ac452cb28da\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nshdb" Oct 07 12:25:10 crc kubenswrapper[4854]: I1007 12:25:10.658190 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/9ad9c693-c3dd-4e37-9c35-4ac452cb28da-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-nshdb\" (UID: \"9ad9c693-c3dd-4e37-9c35-4ac452cb28da\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nshdb" Oct 07 12:25:10 crc kubenswrapper[4854]: I1007 12:25:10.658258 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/9ad9c693-c3dd-4e37-9c35-4ac452cb28da-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-nshdb\" (UID: \"9ad9c693-c3dd-4e37-9c35-4ac452cb28da\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nshdb" Oct 07 12:25:10 crc kubenswrapper[4854]: I1007 12:25:10.759269 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/9ad9c693-c3dd-4e37-9c35-4ac452cb28da-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-nshdb\" (UID: \"9ad9c693-c3dd-4e37-9c35-4ac452cb28da\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nshdb" Oct 07 12:25:10 crc kubenswrapper[4854]: I1007 12:25:10.759346 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ad9c693-c3dd-4e37-9c35-4ac452cb28da-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-nshdb\" (UID: \"9ad9c693-c3dd-4e37-9c35-4ac452cb28da\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nshdb" Oct 07 12:25:10 crc kubenswrapper[4854]: I1007 12:25:10.759379 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9ad9c693-c3dd-4e37-9c35-4ac452cb28da-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-nshdb\" (UID: \"9ad9c693-c3dd-4e37-9c35-4ac452cb28da\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nshdb" Oct 07 12:25:10 crc kubenswrapper[4854]: I1007 12:25:10.759416 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/9ad9c693-c3dd-4e37-9c35-4ac452cb28da-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-nshdb\" (UID: \"9ad9c693-c3dd-4e37-9c35-4ac452cb28da\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nshdb" Oct 07 12:25:10 crc kubenswrapper[4854]: I1007 12:25:10.759443 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9ad9c693-c3dd-4e37-9c35-4ac452cb28da-service-ca\") pod \"cluster-version-operator-5c965bbfc6-nshdb\" (UID: \"9ad9c693-c3dd-4e37-9c35-4ac452cb28da\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nshdb" Oct 07 12:25:10 crc kubenswrapper[4854]: I1007 12:25:10.759532 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/9ad9c693-c3dd-4e37-9c35-4ac452cb28da-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-nshdb\" (UID: \"9ad9c693-c3dd-4e37-9c35-4ac452cb28da\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nshdb" Oct 07 12:25:10 crc kubenswrapper[4854]: I1007 12:25:10.759695 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/9ad9c693-c3dd-4e37-9c35-4ac452cb28da-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-nshdb\" (UID: \"9ad9c693-c3dd-4e37-9c35-4ac452cb28da\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nshdb" Oct 07 12:25:10 crc kubenswrapper[4854]: I1007 12:25:10.760658 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9ad9c693-c3dd-4e37-9c35-4ac452cb28da-service-ca\") pod 
\"cluster-version-operator-5c965bbfc6-nshdb\" (UID: \"9ad9c693-c3dd-4e37-9c35-4ac452cb28da\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nshdb" Oct 07 12:25:10 crc kubenswrapper[4854]: I1007 12:25:10.766745 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ad9c693-c3dd-4e37-9c35-4ac452cb28da-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-nshdb\" (UID: \"9ad9c693-c3dd-4e37-9c35-4ac452cb28da\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nshdb" Oct 07 12:25:10 crc kubenswrapper[4854]: I1007 12:25:10.779326 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9ad9c693-c3dd-4e37-9c35-4ac452cb28da-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-nshdb\" (UID: \"9ad9c693-c3dd-4e37-9c35-4ac452cb28da\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nshdb" Oct 07 12:25:10 crc kubenswrapper[4854]: I1007 12:25:10.936969 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nshdb" Oct 07 12:25:10 crc kubenswrapper[4854]: I1007 12:25:10.949960 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" event={"ID":"9ccb7160-6ff2-43f3-927b-5bf4aced4993","Type":"ContainerStarted","Data":"6dc8baaa306fa92a686458c8f93b90894976bf4702080f18018a4305b17c427d"} Oct 07 12:25:10 crc kubenswrapper[4854]: I1007 12:25:10.953749 4854 generic.go:334] "Generic (PLEG): container finished" podID="262977a1-789b-4e3a-b893-27b1eccc894b" containerID="118321c45b63146eb1672a2e907121254827b57fec77f0d2507a034a077f9b4f" exitCode=0 Oct 07 12:25:10 crc kubenswrapper[4854]: I1007 12:25:10.953807 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7hw6x" event={"ID":"262977a1-789b-4e3a-b893-27b1eccc894b","Type":"ContainerDied","Data":"118321c45b63146eb1672a2e907121254827b57fec77f0d2507a034a077f9b4f"} Oct 07 12:25:11 crc kubenswrapper[4854]: I1007 12:25:11.163419 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f0099a86-9473-4618-9266-2eb460d09150-metrics-certs\") pod \"network-metrics-daemon-m6k45\" (UID: \"f0099a86-9473-4618-9266-2eb460d09150\") " pod="openshift-multus/network-metrics-daemon-m6k45" Oct 07 12:25:11 crc kubenswrapper[4854]: E1007 12:25:11.163812 4854 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 12:25:11 crc kubenswrapper[4854]: E1007 12:25:11.163941 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0099a86-9473-4618-9266-2eb460d09150-metrics-certs podName:f0099a86-9473-4618-9266-2eb460d09150 nodeName:}" failed. No retries permitted until 2025-10-07 12:25:15.163907482 +0000 UTC m=+31.151739777 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f0099a86-9473-4618-9266-2eb460d09150-metrics-certs") pod "network-metrics-daemon-m6k45" (UID: "f0099a86-9473-4618-9266-2eb460d09150") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 12:25:11 crc kubenswrapper[4854]: I1007 12:25:11.702204 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 12:25:11 crc kubenswrapper[4854]: I1007 12:25:11.702235 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 12:25:11 crc kubenswrapper[4854]: I1007 12:25:11.702231 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m6k45" Oct 07 12:25:11 crc kubenswrapper[4854]: I1007 12:25:11.703100 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 12:25:11 crc kubenswrapper[4854]: E1007 12:25:11.703316 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 12:25:11 crc kubenswrapper[4854]: E1007 12:25:11.703527 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 12:25:11 crc kubenswrapper[4854]: E1007 12:25:11.703643 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m6k45" podUID="f0099a86-9473-4618-9266-2eb460d09150" Oct 07 12:25:11 crc kubenswrapper[4854]: E1007 12:25:11.704207 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 12:25:11 crc kubenswrapper[4854]: I1007 12:25:11.873950 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 12:25:11 crc kubenswrapper[4854]: I1007 12:25:11.874065 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 12:25:11 crc kubenswrapper[4854]: I1007 12:25:11.874090 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 12:25:11 crc kubenswrapper[4854]: I1007 12:25:11.874113 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 12:25:11 crc kubenswrapper[4854]: I1007 12:25:11.874141 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 12:25:11 crc kubenswrapper[4854]: E1007 12:25:11.874246 4854 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 12:25:11 crc kubenswrapper[4854]: E1007 12:25:11.874296 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 12:25:19.874282381 +0000 UTC m=+35.862114636 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 12:25:11 crc kubenswrapper[4854]: E1007 12:25:11.874339 4854 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 12:25:11 crc kubenswrapper[4854]: E1007 12:25:11.874389 4854 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 12:25:11 crc kubenswrapper[4854]: E1007 12:25:11.874406 4854 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 12:25:11 crc kubenswrapper[4854]: E1007 12:25:11.874501 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-07 12:25:19.874458277 +0000 UTC m=+35.862290712 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 12:25:11 crc kubenswrapper[4854]: E1007 12:25:11.874573 4854 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 12:25:11 crc kubenswrapper[4854]: E1007 12:25:11.874642 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 12:25:19.874628332 +0000 UTC m=+35.862460787 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:11 crc kubenswrapper[4854]: E1007 12:25:11.874655 4854 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 12:25:11 crc kubenswrapper[4854]: E1007 12:25:11.874691 4854 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 12:25:11 crc kubenswrapper[4854]: E1007 12:25:11.874841 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-07 12:25:19.874801107 +0000 UTC m=+35.862633442 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 12:25:11 crc kubenswrapper[4854]: E1007 12:25:11.874879 4854 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 12:25:11 crc kubenswrapper[4854]: E1007 12:25:11.874992 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 12:25:19.874962061 +0000 UTC m=+35.862794486 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 12:25:11 crc kubenswrapper[4854]: I1007 12:25:11.958307 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nshdb" event={"ID":"9ad9c693-c3dd-4e37-9c35-4ac452cb28da","Type":"ContainerStarted","Data":"b5f4c6f76b2a73a8ec8506787796b0af4ea45d00e463a5a23a8afb50a29eff35"} Oct 07 12:25:11 crc kubenswrapper[4854]: I1007 12:25:11.958373 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nshdb" event={"ID":"9ad9c693-c3dd-4e37-9c35-4ac452cb28da","Type":"ContainerStarted","Data":"bb7766083b788ab40a5723ca0745bd648c28839c7402ce8ce2ae76e99e9b684c"} Oct 07 12:25:11 crc kubenswrapper[4854]: I1007 12:25:11.963959 4854 generic.go:334] "Generic (PLEG): container finished" podID="262977a1-789b-4e3a-b893-27b1eccc894b" containerID="354b7b810d645bc1021d7da1c9e7287d92e65f8f2912f63233506f5a61d5b86e" exitCode=0 Oct 07 12:25:11 crc kubenswrapper[4854]: I1007 12:25:11.964035 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7hw6x" event={"ID":"262977a1-789b-4e3a-b893-27b1eccc894b","Type":"ContainerDied","Data":"354b7b810d645bc1021d7da1c9e7287d92e65f8f2912f63233506f5a61d5b86e"} Oct 07 12:25:11 crc kubenswrapper[4854]: I1007 12:25:11.978926 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nshdb" podStartSLOduration=7.978898282 podStartE2EDuration="7.978898282s" podCreationTimestamp="2025-10-07 12:25:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:25:11.97747326 +0000 UTC m=+27.965305505" watchObservedRunningTime="2025-10-07 12:25:11.978898282 +0000 UTC m=+27.966730587" Oct 07 12:25:12 crc kubenswrapper[4854]: I1007 12:25:12.973099 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" event={"ID":"9ccb7160-6ff2-43f3-927b-5bf4aced4993","Type":"ContainerStarted","Data":"680db6f5dd33c83cab0a5cc73d0eecb3f977af950a4ca4f00e7aded23db88b7d"} Oct 07 12:25:12 crc kubenswrapper[4854]: I1007 12:25:12.973862 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:12 crc kubenswrapper[4854]: I1007 12:25:12.973889 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:12 crc kubenswrapper[4854]: I1007 12:25:12.985444 4854 generic.go:334] "Generic (PLEG): container finished" podID="262977a1-789b-4e3a-b893-27b1eccc894b" containerID="c5ce149872df23478463c6564d0ad30d66eab285c885bb19947fe172372b857b" exitCode=0 Oct 07 12:25:12 crc kubenswrapper[4854]: I1007 12:25:12.985518 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7hw6x" event={"ID":"262977a1-789b-4e3a-b893-27b1eccc894b","Type":"ContainerDied","Data":"c5ce149872df23478463c6564d0ad30d66eab285c885bb19947fe172372b857b"} Oct 07 12:25:13 crc kubenswrapper[4854]: I1007 12:25:13.006516 
4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:13 crc kubenswrapper[4854]: I1007 12:25:13.012111 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:13 crc kubenswrapper[4854]: I1007 12:25:13.013516 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" podStartSLOduration=9.013484375 podStartE2EDuration="9.013484375s" podCreationTimestamp="2025-10-07 12:25:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:25:13.011457025 +0000 UTC m=+28.999289320" watchObservedRunningTime="2025-10-07 12:25:13.013484375 +0000 UTC m=+29.001316660" Oct 07 12:25:13 crc kubenswrapper[4854]: I1007 12:25:13.701744 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 12:25:13 crc kubenswrapper[4854]: I1007 12:25:13.701766 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 12:25:13 crc kubenswrapper[4854]: I1007 12:25:13.701861 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m6k45" Oct 07 12:25:13 crc kubenswrapper[4854]: E1007 12:25:13.701921 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 12:25:13 crc kubenswrapper[4854]: E1007 12:25:13.702089 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 12:25:13 crc kubenswrapper[4854]: E1007 12:25:13.702325 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m6k45" podUID="f0099a86-9473-4618-9266-2eb460d09150" Oct 07 12:25:13 crc kubenswrapper[4854]: I1007 12:25:13.702415 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 12:25:13 crc kubenswrapper[4854]: E1007 12:25:13.702487 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 12:25:13 crc kubenswrapper[4854]: I1007 12:25:13.995329 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7hw6x" event={"ID":"262977a1-789b-4e3a-b893-27b1eccc894b","Type":"ContainerStarted","Data":"49307a532823006e680cc1537d780156afc52245116c27fdb85b36737c9a5d4d"} Oct 07 12:25:13 crc kubenswrapper[4854]: I1007 12:25:13.995416 4854 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 07 12:25:14 crc kubenswrapper[4854]: I1007 12:25:14.033067 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-7hw6x" podStartSLOduration=10.033027438 podStartE2EDuration="10.033027438s" podCreationTimestamp="2025-10-07 12:25:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:25:14.032297707 +0000 UTC m=+30.020130012" watchObservedRunningTime="2025-10-07 12:25:14.033027438 +0000 UTC m=+30.020859743" Oct 07 12:25:14 crc kubenswrapper[4854]: I1007 12:25:14.998782 4854 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 07 12:25:15 crc kubenswrapper[4854]: I1007 12:25:15.218968 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f0099a86-9473-4618-9266-2eb460d09150-metrics-certs\") pod \"network-metrics-daemon-m6k45\" (UID: \"f0099a86-9473-4618-9266-2eb460d09150\") " pod="openshift-multus/network-metrics-daemon-m6k45" Oct 07 12:25:15 crc kubenswrapper[4854]: E1007 12:25:15.219260 4854 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 12:25:15 crc kubenswrapper[4854]: E1007 12:25:15.219377 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0099a86-9473-4618-9266-2eb460d09150-metrics-certs podName:f0099a86-9473-4618-9266-2eb460d09150 nodeName:}" failed. No retries permitted until 2025-10-07 12:25:23.21934978 +0000 UTC m=+39.207182225 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f0099a86-9473-4618-9266-2eb460d09150-metrics-certs") pod "network-metrics-daemon-m6k45" (UID: "f0099a86-9473-4618-9266-2eb460d09150") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 12:25:15 crc kubenswrapper[4854]: I1007 12:25:15.418509 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-m6k45"] Oct 07 12:25:15 crc kubenswrapper[4854]: I1007 12:25:15.419059 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m6k45" Oct 07 12:25:15 crc kubenswrapper[4854]: E1007 12:25:15.419189 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m6k45" podUID="f0099a86-9473-4618-9266-2eb460d09150" Oct 07 12:25:15 crc kubenswrapper[4854]: I1007 12:25:15.702389 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 12:25:15 crc kubenswrapper[4854]: I1007 12:25:15.702467 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 12:25:15 crc kubenswrapper[4854]: I1007 12:25:15.702504 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 12:25:15 crc kubenswrapper[4854]: E1007 12:25:15.702553 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 12:25:15 crc kubenswrapper[4854]: E1007 12:25:15.702623 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 12:25:15 crc kubenswrapper[4854]: E1007 12:25:15.702959 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 12:25:16 crc kubenswrapper[4854]: I1007 12:25:16.701857 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m6k45" Oct 07 12:25:16 crc kubenswrapper[4854]: E1007 12:25:16.702661 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m6k45" podUID="f0099a86-9473-4618-9266-2eb460d09150" Oct 07 12:25:17 crc kubenswrapper[4854]: I1007 12:25:17.702342 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 12:25:17 crc kubenswrapper[4854]: I1007 12:25:17.702394 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 12:25:17 crc kubenswrapper[4854]: I1007 12:25:17.702499 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 12:25:17 crc kubenswrapper[4854]: E1007 12:25:17.702490 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 12:25:17 crc kubenswrapper[4854]: E1007 12:25:17.702658 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 12:25:17 crc kubenswrapper[4854]: E1007 12:25:17.702893 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 12:25:17 crc kubenswrapper[4854]: I1007 12:25:17.925285 4854 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Oct 07 12:25:17 crc kubenswrapper[4854]: I1007 12:25:17.925536 4854 kubelet_node_status.go:538] "Fast updating node status as it just became ready" Oct 07 12:25:17 crc kubenswrapper[4854]: I1007 12:25:17.970277 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-p2nbp"] Oct 07 12:25:17 crc kubenswrapper[4854]: I1007 12:25:17.970959 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-p2nbp" Oct 07 12:25:17 crc kubenswrapper[4854]: I1007 12:25:17.970984 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6s458"] Oct 07 12:25:17 crc kubenswrapper[4854]: I1007 12:25:17.971301 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-6s458" Oct 07 12:25:17 crc kubenswrapper[4854]: I1007 12:25:17.972596 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-brrjc"] Oct 07 12:25:17 crc kubenswrapper[4854]: I1007 12:25:17.973351 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-972qh"] Oct 07 12:25:17 crc kubenswrapper[4854]: I1007 12:25:17.973444 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-brrjc" Oct 07 12:25:17 crc kubenswrapper[4854]: I1007 12:25:17.973854 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-972qh" Oct 07 12:25:17 crc kubenswrapper[4854]: I1007 12:25:17.976588 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-87pqd"] Oct 07 12:25:17 crc kubenswrapper[4854]: I1007 12:25:17.977831 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-87pqd" Oct 07 12:25:17 crc kubenswrapper[4854]: I1007 12:25:17.980945 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-557n4"] Oct 07 12:25:17 crc kubenswrapper[4854]: I1007 12:25:17.981467 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-557n4" Oct 07 12:25:17 crc kubenswrapper[4854]: I1007 12:25:17.985930 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.000028 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.000066 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.000232 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.000485 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.000581 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.000789 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.000887 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.002052 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.002198 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.010714 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.011859 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.018527 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.018911 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 07 
12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.019200 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.019380 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.019619 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.019788 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.020892 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.021029 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.021132 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.021235 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.021321 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.021415 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.021498 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.021583 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.021669 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.021747 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.021950 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.022013 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.022119 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.022271 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.022454 4854 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.022489 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.022552 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.022656 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.022702 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.022753 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.028720 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.037251 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-nrwsd"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.038284 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-nrwsd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.040267 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.040843 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.042108 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.044343 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-r4lpq"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.045095 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-64q88"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.045588 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-64q88" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.046028 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-r4lpq" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.067743 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.068321 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.070478 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.070908 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.071023 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.071841 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.071893 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.071939 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.071967 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.072008 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.074992 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2c45dd65-0c11-4c29-8f62-d667bd61d974-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-87pqd\" (UID: \"2c45dd65-0c11-4c29-8f62-d667bd61d974\") " pod="openshift-authentication/oauth-openshift-558db77b4-87pqd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.072063 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.075049 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/753a1abe-8f65-4721-ad3f-b207e3413ffa-config\") pod \"controller-manager-879f6c89f-6s458\" (UID: \"753a1abe-8f65-4721-ad3f-b207e3413ffa\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6s458" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.072117 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.075185 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebe3d149-62bb-4898-bee6-cfe16ad226ac-config\") pod \"openshift-apiserver-operator-796bbdcf4f-p2nbp\" (UID: \"ebe3d149-62bb-4898-bee6-cfe16ad226ac\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-p2nbp" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.075225 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2c45dd65-0c11-4c29-8f62-d667bd61d974-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-87pqd\" (UID: \"2c45dd65-0c11-4c29-8f62-d667bd61d974\") " pod="openshift-authentication/oauth-openshift-558db77b4-87pqd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.075258 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2dcf6deb-727c-4615-ac08-6c9f7ded10f4-machine-approver-tls\") pod \"machine-approver-56656f9798-brrjc\" (UID: \"2dcf6deb-727c-4615-ac08-6c9f7ded10f4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-brrjc" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.075290 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2msk\" (UniqueName: \"kubernetes.io/projected/2c45dd65-0c11-4c29-8f62-d667bd61d974-kube-api-access-d2msk\") pod \"oauth-openshift-558db77b4-87pqd\" (UID: \"2c45dd65-0c11-4c29-8f62-d667bd61d974\") " pod="openshift-authentication/oauth-openshift-558db77b4-87pqd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.072215 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.075318 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dcf6deb-727c-4615-ac08-6c9f7ded10f4-config\") pod \"machine-approver-56656f9798-brrjc\" (UID: \"2dcf6deb-727c-4615-ac08-6c9f7ded10f4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-brrjc" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.072416 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.075410 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjm7v\" (UniqueName: \"kubernetes.io/projected/15e62a94-cdbc-4474-9bb0-f62de50bd5e7-kube-api-access-fjm7v\") pod \"console-operator-58897d9998-557n4\" (UID: \"15e62a94-cdbc-4474-9bb0-f62de50bd5e7\") " pod="openshift-console-operator/console-operator-58897d9998-557n4" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.075434 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2c45dd65-0c11-4c29-8f62-d667bd61d974-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-87pqd\" (UID: \"2c45dd65-0c11-4c29-8f62-d667bd61d974\") " pod="openshift-authentication/oauth-openshift-558db77b4-87pqd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.072545 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.075490 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2s5b\" (UniqueName: 
\"kubernetes.io/projected/753a1abe-8f65-4721-ad3f-b207e3413ffa-kube-api-access-m2s5b\") pod \"controller-manager-879f6c89f-6s458\" (UID: \"753a1abe-8f65-4721-ad3f-b207e3413ffa\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6s458" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.072568 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.075549 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2c45dd65-0c11-4c29-8f62-d667bd61d974-audit-dir\") pod \"oauth-openshift-558db77b4-87pqd\" (UID: \"2c45dd65-0c11-4c29-8f62-d667bd61d974\") " pod="openshift-authentication/oauth-openshift-558db77b4-87pqd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.075572 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2c45dd65-0c11-4c29-8f62-d667bd61d974-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-87pqd\" (UID: \"2c45dd65-0c11-4c29-8f62-d667bd61d974\") " pod="openshift-authentication/oauth-openshift-558db77b4-87pqd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.075612 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/753a1abe-8f65-4721-ad3f-b207e3413ffa-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-6s458\" (UID: \"753a1abe-8f65-4721-ad3f-b207e3413ffa\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6s458" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.072653 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.072752 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.075670 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebe3d149-62bb-4898-bee6-cfe16ad226ac-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-p2nbp\" (UID: \"ebe3d149-62bb-4898-bee6-cfe16ad226ac\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-p2nbp" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.075701 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15e62a94-cdbc-4474-9bb0-f62de50bd5e7-config\") pod \"console-operator-58897d9998-557n4\" (UID: \"15e62a94-cdbc-4474-9bb0-f62de50bd5e7\") " pod="openshift-console-operator/console-operator-58897d9998-557n4" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.072781 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.071140 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.092804 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/2c45dd65-0c11-4c29-8f62-d667bd61d974-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-87pqd\" (UID: \"2c45dd65-0c11-4c29-8f62-d667bd61d974\") " pod="openshift-authentication/oauth-openshift-558db77b4-87pqd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.092871 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/15e62a94-cdbc-4474-9bb0-f62de50bd5e7-serving-cert\") pod \"console-operator-58897d9998-557n4\" (UID: \"15e62a94-cdbc-4474-9bb0-f62de50bd5e7\") " pod="openshift-console-operator/console-operator-58897d9998-557n4" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.092891 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6c4c0dff-7632-4208-ad0e-475eb69bbc3b-serving-cert\") pod \"route-controller-manager-6576b87f9c-972qh\" (UID: \"6c4c0dff-7632-4208-ad0e-475eb69bbc3b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-972qh" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.092928 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2dcf6deb-727c-4615-ac08-6c9f7ded10f4-auth-proxy-config\") pod \"machine-approver-56656f9798-brrjc\" (UID: \"2dcf6deb-727c-4615-ac08-6c9f7ded10f4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-brrjc" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.092966 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4mvc\" (UniqueName: \"kubernetes.io/projected/6c4c0dff-7632-4208-ad0e-475eb69bbc3b-kube-api-access-b4mvc\") pod \"route-controller-manager-6576b87f9c-972qh\" (UID: \"6c4c0dff-7632-4208-ad0e-475eb69bbc3b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-972qh" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.092990 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/753a1abe-8f65-4721-ad3f-b207e3413ffa-client-ca\") pod \"controller-manager-879f6c89f-6s458\" (UID: \"753a1abe-8f65-4721-ad3f-b207e3413ffa\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6s458" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.093037 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2c45dd65-0c11-4c29-8f62-d667bd61d974-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-87pqd\" (UID: \"2c45dd65-0c11-4c29-8f62-d667bd61d974\") " pod="openshift-authentication/oauth-openshift-558db77b4-87pqd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.093064 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c4c0dff-7632-4208-ad0e-475eb69bbc3b-config\") pod \"route-controller-manager-6576b87f9c-972qh\" (UID: \"6c4c0dff-7632-4208-ad0e-475eb69bbc3b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-972qh" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.093092 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2c45dd65-0c11-4c29-8f62-d667bd61d974-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-87pqd\" (UID: \"2c45dd65-0c11-4c29-8f62-d667bd61d974\") " pod="openshift-authentication/oauth-openshift-558db77b4-87pqd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.093124 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2c45dd65-0c11-4c29-8f62-d667bd61d974-audit-policies\") pod \"oauth-openshift-558db77b4-87pqd\" (UID: \"2c45dd65-0c11-4c29-8f62-d667bd61d974\") " pod="openshift-authentication/oauth-openshift-558db77b4-87pqd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.093175 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2c45dd65-0c11-4c29-8f62-d667bd61d974-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-87pqd\" (UID: \"2c45dd65-0c11-4c29-8f62-d667bd61d974\") " pod="openshift-authentication/oauth-openshift-558db77b4-87pqd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.093205 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2c45dd65-0c11-4c29-8f62-d667bd61d974-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-87pqd\" (UID: \"2c45dd65-0c11-4c29-8f62-d667bd61d974\") " pod="openshift-authentication/oauth-openshift-558db77b4-87pqd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.093226 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djwg4\" (UniqueName: \"kubernetes.io/projected/ebe3d149-62bb-4898-bee6-cfe16ad226ac-kube-api-access-djwg4\") pod \"openshift-apiserver-operator-796bbdcf4f-p2nbp\" (UID: \"ebe3d149-62bb-4898-bee6-cfe16ad226ac\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-p2nbp" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.093262 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6c4c0dff-7632-4208-ad0e-475eb69bbc3b-client-ca\") pod \"route-controller-manager-6576b87f9c-972qh\" (UID: \"6c4c0dff-7632-4208-ad0e-475eb69bbc3b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-972qh" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.093292 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/753a1abe-8f65-4721-ad3f-b207e3413ffa-serving-cert\") pod \"controller-manager-879f6c89f-6s458\" (UID: \"753a1abe-8f65-4721-ad3f-b207e3413ffa\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6s458" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.093318 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2c45dd65-0c11-4c29-8f62-d667bd61d974-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-87pqd\" (UID: \"2c45dd65-0c11-4c29-8f62-d667bd61d974\") " pod="openshift-authentication/oauth-openshift-558db77b4-87pqd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.093343 4854 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvrqq\" (UniqueName: \"kubernetes.io/projected/2dcf6deb-727c-4615-ac08-6c9f7ded10f4-kube-api-access-dvrqq\") pod \"machine-approver-56656f9798-brrjc\" (UID: \"2dcf6deb-727c-4615-ac08-6c9f7ded10f4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-brrjc" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.093370 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/15e62a94-cdbc-4474-9bb0-f62de50bd5e7-trusted-ca\") pod \"console-operator-58897d9998-557n4\" (UID: \"15e62a94-cdbc-4474-9bb0-f62de50bd5e7\") " pod="openshift-console-operator/console-operator-58897d9998-557n4" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.093433 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2c45dd65-0c11-4c29-8f62-d667bd61d974-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-87pqd\" (UID: \"2c45dd65-0c11-4c29-8f62-d667bd61d974\") " pod="openshift-authentication/oauth-openshift-558db77b4-87pqd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.094709 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.097641 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-n644j"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.098343 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-n644j" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.098489 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-c7nrq"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.099287 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-c7nrq" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.104947 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.105162 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.105335 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.105384 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tt4xp"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.105440 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.105714 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.105771 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.105891 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-tsdfx"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.105928 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.106041 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.106301 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.106356 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dwgwz"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.106054 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.106515 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.106585 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.106654 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tt4xp" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.106719 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.106754 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dwgwz" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.106782 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-pzzxz"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.106910 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-tsdfx" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.107425 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-pzzxz" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.110442 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.110589 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-7frlz"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.111376 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-9cjsc"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.111577 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-7frlz" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.111797 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qkppp"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.112087 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-bw66g"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.112498 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-bw66g" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.112735 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qkppp" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.113589 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-9cjsc" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.114689 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-44m5s"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.115530 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.115834 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vkkdd"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.116736 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vkkdd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.118593 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.118680 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.118828 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.118894 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.119022 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gf54n"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.119073 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.119175 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.119657 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-vwn4j"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.120071 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-vwn4j" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.120335 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gf54n" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.128651 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.128678 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.128928 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.129007 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.129053 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.129092 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.129192 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.129233 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.129294 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.129389 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.129433 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.129466 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.129489 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.129521 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.129091 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.129603 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.129685 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.129939 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Oct 07 12:25:18 crc 
kubenswrapper[4854]: I1007 12:25:18.131655 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-rhcf8"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.131714 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.134329 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rhcf8" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.136051 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.144306 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-hr45q"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.146372 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hr45q" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.149070 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kzt9x"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.149743 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kzt9x" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.150109 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.152545 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.156059 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-p2nbp"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.156509 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-24ztk"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.160626 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-p8fbp"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.160821 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-24ztk" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.162865 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330655-nh22s"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.162882 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-p8fbp" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.168289 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-nkh4b"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.168763 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-nkh4b" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.171021 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330655-nh22s" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.171175 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tj4lv"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.171904 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-wns76"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.172462 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.172605 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-gh8fl"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.172752 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wns76" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.172756 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tj4lv" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.173721 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-n75xz"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.174207 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-n75xz" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.174443 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gh8fl" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.174474 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-w9n84"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.175645 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-w9n84" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.178206 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dc2nk"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.178746 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-xhnnw"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.179036 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dc2nk" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.179135 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-npmk7"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.179579 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-xhnnw" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.179829 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-npmk7" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.180034 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rv6rb"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.181162 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-t99kd"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.181739 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-t99kd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.181942 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rv6rb" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.182289 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-972qh"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.183913 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-nrwsd"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.185318 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-557n4"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.185923 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6s458"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.186826 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-r4lpq"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.187775 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-n644j"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.188857 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-kqnvp"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.189971 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-64q88"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.190114 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-kqnvp" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.191506 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-7frlz"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.192037 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-hr45q"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.192869 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.192979 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-pzzxz"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.194050 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-nkh4b"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.194786 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2c45dd65-0c11-4c29-8f62-d667bd61d974-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-87pqd\" (UID: \"2c45dd65-0c11-4c29-8f62-d667bd61d974\") " pod="openshift-authentication/oauth-openshift-558db77b4-87pqd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.194844 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvrqq\" (UniqueName: \"kubernetes.io/projected/2dcf6deb-727c-4615-ac08-6c9f7ded10f4-kube-api-access-dvrqq\") pod \"machine-approver-56656f9798-brrjc\" (UID: \"2dcf6deb-727c-4615-ac08-6c9f7ded10f4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-brrjc" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.194890 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qkppp"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.194960 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlgm4\" (UniqueName: \"kubernetes.io/projected/43359b2f-63d6-4e81-888e-e219dca84ec3-kube-api-access-tlgm4\") pod \"downloads-7954f5f757-64q88\" (UID: \"43359b2f-63d6-4e81-888e-e219dca84ec3\") " pod="openshift-console/downloads-7954f5f757-64q88" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.195027 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rx4kw\" (UniqueName: \"kubernetes.io/projected/edc4e120-e930-4fbe-a871-bd63d3de85c9-kube-api-access-rx4kw\") pod \"apiserver-7bbb656c7d-c7nrq\" (UID: \"edc4e120-e930-4fbe-a871-bd63d3de85c9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-c7nrq" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.195086 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/753a1abe-8f65-4721-ad3f-b207e3413ffa-serving-cert\") pod \"controller-manager-879f6c89f-6s458\" (UID: \"753a1abe-8f65-4721-ad3f-b207e3413ffa\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6s458" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.195124 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/15e62a94-cdbc-4474-9bb0-f62de50bd5e7-trusted-ca\") pod \"console-operator-58897d9998-557n4\" (UID: \"15e62a94-cdbc-4474-9bb0-f62de50bd5e7\") " pod="openshift-console-operator/console-operator-58897d9998-557n4" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.195179 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/edc4e120-e930-4fbe-a871-bd63d3de85c9-audit-policies\") pod \"apiserver-7bbb656c7d-c7nrq\" (UID: \"edc4e120-e930-4fbe-a871-bd63d3de85c9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-c7nrq" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.195225 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2c45dd65-0c11-4c29-8f62-d667bd61d974-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-87pqd\" (UID: \"2c45dd65-0c11-4c29-8f62-d667bd61d974\") " pod="openshift-authentication/oauth-openshift-558db77b4-87pqd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.195285 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c4a5d028-220b-40aa-a907-95015d7880c8-node-pullsecrets\") pod \"apiserver-76f77b778f-nrwsd\" (UID: \"c4a5d028-220b-40aa-a907-95015d7880c8\") " pod="openshift-apiserver/apiserver-76f77b778f-nrwsd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.195333 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-776fh\" (UniqueName: \"kubernetes.io/projected/c4a5d028-220b-40aa-a907-95015d7880c8-kube-api-access-776fh\") pod \"apiserver-76f77b778f-nrwsd\" (UID: \"c4a5d028-220b-40aa-a907-95015d7880c8\") " pod="openshift-apiserver/apiserver-76f77b778f-nrwsd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.195366 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2c45dd65-0c11-4c29-8f62-d667bd61d974-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-87pqd\" (UID: \"2c45dd65-0c11-4c29-8f62-d667bd61d974\") " pod="openshift-authentication/oauth-openshift-558db77b4-87pqd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.195406 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4a5d028-220b-40aa-a907-95015d7880c8-config\") pod \"apiserver-76f77b778f-nrwsd\" (UID: \"c4a5d028-220b-40aa-a907-95015d7880c8\") " pod="openshift-apiserver/apiserver-76f77b778f-nrwsd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.195435 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/753a1abe-8f65-4721-ad3f-b207e3413ffa-config\") pod \"controller-manager-879f6c89f-6s458\" (UID: \"753a1abe-8f65-4721-ad3f-b207e3413ffa\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6s458" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.195459 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebe3d149-62bb-4898-bee6-cfe16ad226ac-config\") pod \"openshift-apiserver-operator-796bbdcf4f-p2nbp\" (UID: \"ebe3d149-62bb-4898-bee6-cfe16ad226ac\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-p2nbp" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.195503 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2c45dd65-0c11-4c29-8f62-d667bd61d974-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-87pqd\" (UID: \"2c45dd65-0c11-4c29-8f62-d667bd61d974\") " pod="openshift-authentication/oauth-openshift-558db77b4-87pqd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.195534 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2dcf6deb-727c-4615-ac08-6c9f7ded10f4-machine-approver-tls\") pod \"machine-approver-56656f9798-brrjc\" (UID: \"2dcf6deb-727c-4615-ac08-6c9f7ded10f4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-brrjc" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.195579 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0f10c8d-51a0-4e7a-b7ab-af11699298af-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-n644j\" (UID: \"e0f10c8d-51a0-4e7a-b7ab-af11699298af\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-n644j" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.195605 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dcf6deb-727c-4615-ac08-6c9f7ded10f4-config\") pod \"machine-approver-56656f9798-brrjc\" (UID: \"2dcf6deb-727c-4615-ac08-6c9f7ded10f4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-brrjc" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.195652 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2msk\" (UniqueName: \"kubernetes.io/projected/2c45dd65-0c11-4c29-8f62-d667bd61d974-kube-api-access-d2msk\") pod \"oauth-openshift-558db77b4-87pqd\" (UID: \"2c45dd65-0c11-4c29-8f62-d667bd61d974\") " pod="openshift-authentication/oauth-openshift-558db77b4-87pqd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.195680 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c4a5d028-220b-40aa-a907-95015d7880c8-audit-dir\") pod \"apiserver-76f77b778f-nrwsd\" (UID: \"c4a5d028-220b-40aa-a907-95015d7880c8\") " pod="openshift-apiserver/apiserver-76f77b778f-nrwsd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.195752 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/edc4e120-e930-4fbe-a871-bd63d3de85c9-etcd-client\") pod \"apiserver-7bbb656c7d-c7nrq\" (UID: \"edc4e120-e930-4fbe-a871-bd63d3de85c9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-c7nrq" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.195782 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0f10c8d-51a0-4e7a-b7ab-af11699298af-service-ca-bundle\") pod \"authentication-operator-69f744f599-n644j\" (UID: \"e0f10c8d-51a0-4e7a-b7ab-af11699298af\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-n644j" Oct 07 12:25:18 
crc kubenswrapper[4854]: I1007 12:25:18.195837 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjm7v\" (UniqueName: \"kubernetes.io/projected/15e62a94-cdbc-4474-9bb0-f62de50bd5e7-kube-api-access-fjm7v\") pod \"console-operator-58897d9998-557n4\" (UID: \"15e62a94-cdbc-4474-9bb0-f62de50bd5e7\") " pod="openshift-console-operator/console-operator-58897d9998-557n4" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.195864 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2c45dd65-0c11-4c29-8f62-d667bd61d974-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-87pqd\" (UID: \"2c45dd65-0c11-4c29-8f62-d667bd61d974\") " pod="openshift-authentication/oauth-openshift-558db77b4-87pqd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.195918 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c4a5d028-220b-40aa-a907-95015d7880c8-image-import-ca\") pod \"apiserver-76f77b778f-nrwsd\" (UID: \"c4a5d028-220b-40aa-a907-95015d7880c8\") " pod="openshift-apiserver/apiserver-76f77b778f-nrwsd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.195945 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c625acfa-7d7b-4b52-b3a2-55c5f817966b-serving-cert\") pod \"openshift-config-operator-7777fb866f-r4lpq\" (UID: \"c625acfa-7d7b-4b52-b3a2-55c5f817966b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-r4lpq" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.195988 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0f10c8d-51a0-4e7a-b7ab-af11699298af-config\") pod \"authentication-operator-69f744f599-n644j\" (UID: \"e0f10c8d-51a0-4e7a-b7ab-af11699298af\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-n644j" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.196015 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2s5b\" (UniqueName: \"kubernetes.io/projected/753a1abe-8f65-4721-ad3f-b207e3413ffa-kube-api-access-m2s5b\") pod \"controller-manager-879f6c89f-6s458\" (UID: \"753a1abe-8f65-4721-ad3f-b207e3413ffa\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6s458" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.196063 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c4a5d028-220b-40aa-a907-95015d7880c8-encryption-config\") pod \"apiserver-76f77b778f-nrwsd\" (UID: \"c4a5d028-220b-40aa-a907-95015d7880c8\") " pod="openshift-apiserver/apiserver-76f77b778f-nrwsd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.196125 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4a5d028-220b-40aa-a907-95015d7880c8-trusted-ca-bundle\") pod \"apiserver-76f77b778f-nrwsd\" (UID: \"c4a5d028-220b-40aa-a907-95015d7880c8\") " pod="openshift-apiserver/apiserver-76f77b778f-nrwsd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.196174 4854 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/edc4e120-e930-4fbe-a871-bd63d3de85c9-serving-cert\") pod \"apiserver-7bbb656c7d-c7nrq\" (UID: \"edc4e120-e930-4fbe-a871-bd63d3de85c9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-c7nrq" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.196201 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/edc4e120-e930-4fbe-a871-bd63d3de85c9-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-c7nrq\" (UID: \"edc4e120-e930-4fbe-a871-bd63d3de85c9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-c7nrq" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.196222 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsgmk\" (UniqueName: \"kubernetes.io/projected/c625acfa-7d7b-4b52-b3a2-55c5f817966b-kube-api-access-lsgmk\") pod \"openshift-config-operator-7777fb866f-r4lpq\" (UID: \"c625acfa-7d7b-4b52-b3a2-55c5f817966b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-r4lpq" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.196264 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2c45dd65-0c11-4c29-8f62-d667bd61d974-audit-dir\") pod \"oauth-openshift-558db77b4-87pqd\" (UID: \"2c45dd65-0c11-4c29-8f62-d667bd61d974\") " pod="openshift-authentication/oauth-openshift-558db77b4-87pqd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.196291 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2c45dd65-0c11-4c29-8f62-d667bd61d974-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-87pqd\" (UID: \"2c45dd65-0c11-4c29-8f62-d667bd61d974\") " pod="openshift-authentication/oauth-openshift-558db77b4-87pqd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.196341 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4a5d028-220b-40aa-a907-95015d7880c8-serving-cert\") pod \"apiserver-76f77b778f-nrwsd\" (UID: \"c4a5d028-220b-40aa-a907-95015d7880c8\") " pod="openshift-apiserver/apiserver-76f77b778f-nrwsd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.196369 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/753a1abe-8f65-4721-ad3f-b207e3413ffa-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-6s458\" (UID: \"753a1abe-8f65-4721-ad3f-b207e3413ffa\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6s458" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.196429 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15e62a94-cdbc-4474-9bb0-f62de50bd5e7-config\") pod \"console-operator-58897d9998-557n4\" (UID: \"15e62a94-cdbc-4474-9bb0-f62de50bd5e7\") " pod="openshift-console-operator/console-operator-58897d9998-557n4" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.196452 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0f10c8d-51a0-4e7a-b7ab-af11699298af-serving-cert\") pod 
\"authentication-operator-69f744f599-n644j\" (UID: \"e0f10c8d-51a0-4e7a-b7ab-af11699298af\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-n644j" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.196692 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebe3d149-62bb-4898-bee6-cfe16ad226ac-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-p2nbp\" (UID: \"ebe3d149-62bb-4898-bee6-cfe16ad226ac\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-p2nbp" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.196743 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2c45dd65-0c11-4c29-8f62-d667bd61d974-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-87pqd\" (UID: \"2c45dd65-0c11-4c29-8f62-d667bd61d974\") " pod="openshift-authentication/oauth-openshift-558db77b4-87pqd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.196020 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tt4xp"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.197645 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2dcf6deb-727c-4615-ac08-6c9f7ded10f4-auth-proxy-config\") pod \"machine-approver-56656f9798-brrjc\" (UID: \"2dcf6deb-727c-4615-ac08-6c9f7ded10f4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-brrjc" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.197721 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/15e62a94-cdbc-4474-9bb0-f62de50bd5e7-serving-cert\") pod \"console-operator-58897d9998-557n4\" (UID: \"15e62a94-cdbc-4474-9bb0-f62de50bd5e7\") " pod="openshift-console-operator/console-operator-58897d9998-557n4" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.197745 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6c4c0dff-7632-4208-ad0e-475eb69bbc3b-serving-cert\") pod \"route-controller-manager-6576b87f9c-972qh\" (UID: \"6c4c0dff-7632-4208-ad0e-475eb69bbc3b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-972qh" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.197769 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c4a5d028-220b-40aa-a907-95015d7880c8-audit\") pod \"apiserver-76f77b778f-nrwsd\" (UID: \"c4a5d028-220b-40aa-a907-95015d7880c8\") " pod="openshift-apiserver/apiserver-76f77b778f-nrwsd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.197789 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/edc4e120-e930-4fbe-a871-bd63d3de85c9-encryption-config\") pod \"apiserver-7bbb656c7d-c7nrq\" (UID: \"edc4e120-e930-4fbe-a871-bd63d3de85c9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-c7nrq" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.197819 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/753a1abe-8f65-4721-ad3f-b207e3413ffa-client-ca\") pod \"controller-manager-879f6c89f-6s458\" (UID: \"753a1abe-8f65-4721-ad3f-b207e3413ffa\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6s458" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.197838 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2c45dd65-0c11-4c29-8f62-d667bd61d974-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-87pqd\" (UID: \"2c45dd65-0c11-4c29-8f62-d667bd61d974\") " pod="openshift-authentication/oauth-openshift-558db77b4-87pqd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.197858 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c4a5d028-220b-40aa-a907-95015d7880c8-etcd-client\") pod \"apiserver-76f77b778f-nrwsd\" (UID: \"c4a5d028-220b-40aa-a907-95015d7880c8\") " pod="openshift-apiserver/apiserver-76f77b778f-nrwsd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.197883 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/c625acfa-7d7b-4b52-b3a2-55c5f817966b-available-featuregates\") pod \"openshift-config-operator-7777fb866f-r4lpq\" (UID: \"c625acfa-7d7b-4b52-b3a2-55c5f817966b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-r4lpq" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.197908 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4mvc\" (UniqueName: \"kubernetes.io/projected/6c4c0dff-7632-4208-ad0e-475eb69bbc3b-kube-api-access-b4mvc\") pod \"route-controller-manager-6576b87f9c-972qh\" (UID: \"6c4c0dff-7632-4208-ad0e-475eb69bbc3b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-972qh" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.197928 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/edc4e120-e930-4fbe-a871-bd63d3de85c9-audit-dir\") pod \"apiserver-7bbb656c7d-c7nrq\" (UID: \"edc4e120-e930-4fbe-a871-bd63d3de85c9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-c7nrq" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.197948 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c4a5d028-220b-40aa-a907-95015d7880c8-etcd-serving-ca\") pod \"apiserver-76f77b778f-nrwsd\" (UID: \"c4a5d028-220b-40aa-a907-95015d7880c8\") " pod="openshift-apiserver/apiserver-76f77b778f-nrwsd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.197968 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/edc4e120-e930-4fbe-a871-bd63d3de85c9-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-c7nrq\" (UID: \"edc4e120-e930-4fbe-a871-bd63d3de85c9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-c7nrq" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.197991 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c4c0dff-7632-4208-ad0e-475eb69bbc3b-config\") pod \"route-controller-manager-6576b87f9c-972qh\" 
(UID: \"6c4c0dff-7632-4208-ad0e-475eb69bbc3b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-972qh" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.198013 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2c45dd65-0c11-4c29-8f62-d667bd61d974-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-87pqd\" (UID: \"2c45dd65-0c11-4c29-8f62-d667bd61d974\") " pod="openshift-authentication/oauth-openshift-558db77b4-87pqd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.198037 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2c45dd65-0c11-4c29-8f62-d667bd61d974-audit-policies\") pod \"oauth-openshift-558db77b4-87pqd\" (UID: \"2c45dd65-0c11-4c29-8f62-d667bd61d974\") " pod="openshift-authentication/oauth-openshift-558db77b4-87pqd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.198057 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2c45dd65-0c11-4c29-8f62-d667bd61d974-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-87pqd\" (UID: \"2c45dd65-0c11-4c29-8f62-d667bd61d974\") " pod="openshift-authentication/oauth-openshift-558db77b4-87pqd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.198080 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2c45dd65-0c11-4c29-8f62-d667bd61d974-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-87pqd\" (UID: \"2c45dd65-0c11-4c29-8f62-d667bd61d974\") " pod="openshift-authentication/oauth-openshift-558db77b4-87pqd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.198100 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djwg4\" (UniqueName: \"kubernetes.io/projected/ebe3d149-62bb-4898-bee6-cfe16ad226ac-kube-api-access-djwg4\") pod \"openshift-apiserver-operator-796bbdcf4f-p2nbp\" (UID: \"ebe3d149-62bb-4898-bee6-cfe16ad226ac\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-p2nbp" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.198126 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjcb7\" (UniqueName: \"kubernetes.io/projected/e0f10c8d-51a0-4e7a-b7ab-af11699298af-kube-api-access-fjcb7\") pod \"authentication-operator-69f744f599-n644j\" (UID: \"e0f10c8d-51a0-4e7a-b7ab-af11699298af\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-n644j" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.198169 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6c4c0dff-7632-4208-ad0e-475eb69bbc3b-client-ca\") pod \"route-controller-manager-6576b87f9c-972qh\" (UID: \"6c4c0dff-7632-4208-ad0e-475eb69bbc3b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-972qh" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.199046 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6c4c0dff-7632-4208-ad0e-475eb69bbc3b-client-ca\") pod \"route-controller-manager-6576b87f9c-972qh\" (UID: 
\"6c4c0dff-7632-4208-ad0e-475eb69bbc3b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-972qh" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.199714 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-w6wh4"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.199804 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/15e62a94-cdbc-4474-9bb0-f62de50bd5e7-trusted-ca\") pod \"console-operator-58897d9998-557n4\" (UID: \"15e62a94-cdbc-4474-9bb0-f62de50bd5e7\") " pod="openshift-console-operator/console-operator-58897d9998-557n4" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.200574 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-24ztk"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.200595 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kzt9x"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.200573 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dcf6deb-727c-4615-ac08-6c9f7ded10f4-config\") pod \"machine-approver-56656f9798-brrjc\" (UID: \"2dcf6deb-727c-4615-ac08-6c9f7ded10f4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-brrjc" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.200749 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-w6wh4" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.200901 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebe3d149-62bb-4898-bee6-cfe16ad226ac-config\") pod \"openshift-apiserver-operator-796bbdcf4f-p2nbp\" (UID: \"ebe3d149-62bb-4898-bee6-cfe16ad226ac\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-p2nbp" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.200911 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/753a1abe-8f65-4721-ad3f-b207e3413ffa-config\") pod \"controller-manager-879f6c89f-6s458\" (UID: \"753a1abe-8f65-4721-ad3f-b207e3413ffa\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6s458" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.201958 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2dcf6deb-727c-4615-ac08-6c9f7ded10f4-auth-proxy-config\") pod \"machine-approver-56656f9798-brrjc\" (UID: \"2dcf6deb-727c-4615-ac08-6c9f7ded10f4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-brrjc" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.202664 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2c45dd65-0c11-4c29-8f62-d667bd61d974-audit-policies\") pod \"oauth-openshift-558db77b4-87pqd\" (UID: \"2c45dd65-0c11-4c29-8f62-d667bd61d974\") " pod="openshift-authentication/oauth-openshift-558db77b4-87pqd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.203363 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/753a1abe-8f65-4721-ad3f-b207e3413ffa-client-ca\") pod \"controller-manager-879f6c89f-6s458\" (UID: \"753a1abe-8f65-4721-ad3f-b207e3413ffa\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6s458" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.204033 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c4c0dff-7632-4208-ad0e-475eb69bbc3b-config\") pod \"route-controller-manager-6576b87f9c-972qh\" (UID: \"6c4c0dff-7632-4208-ad0e-475eb69bbc3b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-972qh" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.204056 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15e62a94-cdbc-4474-9bb0-f62de50bd5e7-config\") pod \"console-operator-58897d9998-557n4\" (UID: \"15e62a94-cdbc-4474-9bb0-f62de50bd5e7\") " pod="openshift-console-operator/console-operator-58897d9998-557n4" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.204094 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-rhcf8"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.204194 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2c45dd65-0c11-4c29-8f62-d667bd61d974-audit-dir\") pod \"oauth-openshift-558db77b4-87pqd\" (UID: \"2c45dd65-0c11-4c29-8f62-d667bd61d974\") " pod="openshift-authentication/oauth-openshift-558db77b4-87pqd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.204585 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-c7nrq"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.204778 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2c45dd65-0c11-4c29-8f62-d667bd61d974-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-87pqd\" (UID: \"2c45dd65-0c11-4c29-8f62-d667bd61d974\") " pod="openshift-authentication/oauth-openshift-558db77b4-87pqd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.204981 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/753a1abe-8f65-4721-ad3f-b207e3413ffa-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-6s458\" (UID: \"753a1abe-8f65-4721-ad3f-b207e3413ffa\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6s458" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.205960 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2c45dd65-0c11-4c29-8f62-d667bd61d974-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-87pqd\" (UID: \"2c45dd65-0c11-4c29-8f62-d667bd61d974\") " pod="openshift-authentication/oauth-openshift-558db77b4-87pqd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.204051 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2c45dd65-0c11-4c29-8f62-d667bd61d974-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-87pqd\" (UID: \"2c45dd65-0c11-4c29-8f62-d667bd61d974\") " pod="openshift-authentication/oauth-openshift-558db77b4-87pqd" Oct 07 12:25:18 crc 
kubenswrapper[4854]: I1007 12:25:18.206454 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2c45dd65-0c11-4c29-8f62-d667bd61d974-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-87pqd\" (UID: \"2c45dd65-0c11-4c29-8f62-d667bd61d974\") " pod="openshift-authentication/oauth-openshift-558db77b4-87pqd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.206564 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebe3d149-62bb-4898-bee6-cfe16ad226ac-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-p2nbp\" (UID: \"ebe3d149-62bb-4898-bee6-cfe16ad226ac\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-p2nbp" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.208641 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2dcf6deb-727c-4615-ac08-6c9f7ded10f4-machine-approver-tls\") pod \"machine-approver-56656f9798-brrjc\" (UID: \"2dcf6deb-727c-4615-ac08-6c9f7ded10f4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-brrjc" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.208667 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/15e62a94-cdbc-4474-9bb0-f62de50bd5e7-serving-cert\") pod \"console-operator-58897d9998-557n4\" (UID: \"15e62a94-cdbc-4474-9bb0-f62de50bd5e7\") " pod="openshift-console-operator/console-operator-58897d9998-557n4" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.209086 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2c45dd65-0c11-4c29-8f62-d667bd61d974-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-87pqd\" (UID: \"2c45dd65-0c11-4c29-8f62-d667bd61d974\") " pod="openshift-authentication/oauth-openshift-558db77b4-87pqd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.209315 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/753a1abe-8f65-4721-ad3f-b207e3413ffa-serving-cert\") pod \"controller-manager-879f6c89f-6s458\" (UID: \"753a1abe-8f65-4721-ad3f-b207e3413ffa\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6s458" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.209498 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2c45dd65-0c11-4c29-8f62-d667bd61d974-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-87pqd\" (UID: \"2c45dd65-0c11-4c29-8f62-d667bd61d974\") " pod="openshift-authentication/oauth-openshift-558db77b4-87pqd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.209496 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6c4c0dff-7632-4208-ad0e-475eb69bbc3b-serving-cert\") pod \"route-controller-manager-6576b87f9c-972qh\" (UID: \"6c4c0dff-7632-4208-ad0e-475eb69bbc3b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-972qh" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.209572 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2c45dd65-0c11-4c29-8f62-d667bd61d974-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-87pqd\" (UID: \"2c45dd65-0c11-4c29-8f62-d667bd61d974\") " pod="openshift-authentication/oauth-openshift-558db77b4-87pqd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.209728 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2c45dd65-0c11-4c29-8f62-d667bd61d974-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-87pqd\" (UID: \"2c45dd65-0c11-4c29-8f62-d667bd61d974\") " pod="openshift-authentication/oauth-openshift-558db77b4-87pqd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.211744 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2c45dd65-0c11-4c29-8f62-d667bd61d974-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-87pqd\" (UID: \"2c45dd65-0c11-4c29-8f62-d667bd61d974\") " pod="openshift-authentication/oauth-openshift-558db77b4-87pqd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.211994 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2c45dd65-0c11-4c29-8f62-d667bd61d974-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-87pqd\" (UID: \"2c45dd65-0c11-4c29-8f62-d667bd61d974\") " pod="openshift-authentication/oauth-openshift-558db77b4-87pqd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.213082 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2c45dd65-0c11-4c29-8f62-d667bd61d974-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-87pqd\" (UID: \"2c45dd65-0c11-4c29-8f62-d667bd61d974\") " pod="openshift-authentication/oauth-openshift-558db77b4-87pqd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.213793 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.217203 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330655-nh22s"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.222167 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-9cjsc"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.225251 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gf54n"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.230702 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vkkdd"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.231980 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-tsdfx"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.233473 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tj4lv"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.234628 4854 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.234641 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-87pqd"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.235997 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-wns76"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.237109 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-n75xz"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.238386 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-44m5s"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.239531 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-p8fbp"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.240903 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-t99kd"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.242234 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dwgwz"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.243575 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-gh8fl"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.244798 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-bw66g"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.245805 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dc2nk"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.247005 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-kqnvp"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.249522 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-w9n84"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.250797 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-npmk7"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.252063 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-xhvq4"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.252392 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.254493 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rv6rb"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.254587 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-xhvq4"] Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.254618 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-xhvq4" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.272198 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.292884 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.299398 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0f10c8d-51a0-4e7a-b7ab-af11699298af-serving-cert\") pod \"authentication-operator-69f744f599-n644j\" (UID: \"e0f10c8d-51a0-4e7a-b7ab-af11699298af\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-n644j" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.299465 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/98fd3935-23d7-4ee3-8325-222533852b78-signing-key\") pod \"service-ca-9c57cc56f-w9n84\" (UID: \"98fd3935-23d7-4ee3-8325-222533852b78\") " pod="openshift-service-ca/service-ca-9c57cc56f-w9n84" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.299503 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c4a5d028-220b-40aa-a907-95015d7880c8-audit\") pod \"apiserver-76f77b778f-nrwsd\" (UID: \"c4a5d028-220b-40aa-a907-95015d7880c8\") " pod="openshift-apiserver/apiserver-76f77b778f-nrwsd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.299536 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/edc4e120-e930-4fbe-a871-bd63d3de85c9-encryption-config\") pod \"apiserver-7bbb656c7d-c7nrq\" (UID: \"edc4e120-e930-4fbe-a871-bd63d3de85c9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-c7nrq" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.299565 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/c625acfa-7d7b-4b52-b3a2-55c5f817966b-available-featuregates\") pod \"openshift-config-operator-7777fb866f-r4lpq\" (UID: \"c625acfa-7d7b-4b52-b3a2-55c5f817966b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-r4lpq" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.299603 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c4a5d028-220b-40aa-a907-95015d7880c8-etcd-client\") pod \"apiserver-76f77b778f-nrwsd\" (UID: \"c4a5d028-220b-40aa-a907-95015d7880c8\") " pod="openshift-apiserver/apiserver-76f77b778f-nrwsd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.299635 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/edc4e120-e930-4fbe-a871-bd63d3de85c9-audit-dir\") pod \"apiserver-7bbb656c7d-c7nrq\" (UID: \"edc4e120-e930-4fbe-a871-bd63d3de85c9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-c7nrq" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.299672 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c4a5d028-220b-40aa-a907-95015d7880c8-etcd-serving-ca\") 
pod \"apiserver-76f77b778f-nrwsd\" (UID: \"c4a5d028-220b-40aa-a907-95015d7880c8\") " pod="openshift-apiserver/apiserver-76f77b778f-nrwsd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.300041 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/edc4e120-e930-4fbe-a871-bd63d3de85c9-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-c7nrq\" (UID: \"edc4e120-e930-4fbe-a871-bd63d3de85c9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-c7nrq" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.300113 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f2200948-73a4-4ee6-bd24-d8f4f8e4418f-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-qkppp\" (UID: \"f2200948-73a4-4ee6-bd24-d8f4f8e4418f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qkppp" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.300136 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/c625acfa-7d7b-4b52-b3a2-55c5f817966b-available-featuregates\") pod \"openshift-config-operator-7777fb866f-r4lpq\" (UID: \"c625acfa-7d7b-4b52-b3a2-55c5f817966b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-r4lpq" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.299790 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/edc4e120-e930-4fbe-a871-bd63d3de85c9-audit-dir\") pod \"apiserver-7bbb656c7d-c7nrq\" (UID: \"edc4e120-e930-4fbe-a871-bd63d3de85c9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-c7nrq" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.300354 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/98fd3935-23d7-4ee3-8325-222533852b78-signing-cabundle\") pod \"service-ca-9c57cc56f-w9n84\" (UID: \"98fd3935-23d7-4ee3-8325-222533852b78\") " pod="openshift-service-ca/service-ca-9c57cc56f-w9n84" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.300436 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c1f51bac-5e94-403b-a197-bf85caf57f23-default-certificate\") pod \"router-default-5444994796-xhnnw\" (UID: \"c1f51bac-5e94-403b-a197-bf85caf57f23\") " pod="openshift-ingress/router-default-5444994796-xhnnw" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.300832 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/edc4e120-e930-4fbe-a871-bd63d3de85c9-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-c7nrq\" (UID: \"edc4e120-e930-4fbe-a871-bd63d3de85c9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-c7nrq" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.300945 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c4a5d028-220b-40aa-a907-95015d7880c8-audit\") pod \"apiserver-76f77b778f-nrwsd\" (UID: \"c4a5d028-220b-40aa-a907-95015d7880c8\") " pod="openshift-apiserver/apiserver-76f77b778f-nrwsd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.300951 4854 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c4a5d028-220b-40aa-a907-95015d7880c8-etcd-serving-ca\") pod \"apiserver-76f77b778f-nrwsd\" (UID: \"c4a5d028-220b-40aa-a907-95015d7880c8\") " pod="openshift-apiserver/apiserver-76f77b778f-nrwsd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.301037 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjcb7\" (UniqueName: \"kubernetes.io/projected/e0f10c8d-51a0-4e7a-b7ab-af11699298af-kube-api-access-fjcb7\") pod \"authentication-operator-69f744f599-n644j\" (UID: \"e0f10c8d-51a0-4e7a-b7ab-af11699298af\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-n644j" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.301134 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/16abc87c-7821-4c71-89c6-d523fec8e4a8-apiservice-cert\") pod \"packageserver-d55dfcdfc-24ztk\" (UID: \"16abc87c-7821-4c71-89c6-d523fec8e4a8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-24ztk" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.301222 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rx4kw\" (UniqueName: \"kubernetes.io/projected/edc4e120-e930-4fbe-a871-bd63d3de85c9-kube-api-access-rx4kw\") pod \"apiserver-7bbb656c7d-c7nrq\" (UID: \"edc4e120-e930-4fbe-a871-bd63d3de85c9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-c7nrq" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.301293 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlgm4\" (UniqueName: \"kubernetes.io/projected/43359b2f-63d6-4e81-888e-e219dca84ec3-kube-api-access-tlgm4\") pod \"downloads-7954f5f757-64q88\" (UID: \"43359b2f-63d6-4e81-888e-e219dca84ec3\") " pod="openshift-console/downloads-7954f5f757-64q88" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.301440 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/edc4e120-e930-4fbe-a871-bd63d3de85c9-audit-policies\") pod \"apiserver-7bbb656c7d-c7nrq\" (UID: \"edc4e120-e930-4fbe-a871-bd63d3de85c9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-c7nrq" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.301494 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2200948-73a4-4ee6-bd24-d8f4f8e4418f-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-qkppp\" (UID: \"f2200948-73a4-4ee6-bd24-d8f4f8e4418f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qkppp" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.301564 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c4a5d028-220b-40aa-a907-95015d7880c8-node-pullsecrets\") pod \"apiserver-76f77b778f-nrwsd\" (UID: \"c4a5d028-220b-40aa-a907-95015d7880c8\") " pod="openshift-apiserver/apiserver-76f77b778f-nrwsd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.301592 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-776fh\" (UniqueName: \"kubernetes.io/projected/c4a5d028-220b-40aa-a907-95015d7880c8-kube-api-access-776fh\") pod 
\"apiserver-76f77b778f-nrwsd\" (UID: \"c4a5d028-220b-40aa-a907-95015d7880c8\") " pod="openshift-apiserver/apiserver-76f77b778f-nrwsd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.301613 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4a5d028-220b-40aa-a907-95015d7880c8-config\") pod \"apiserver-76f77b778f-nrwsd\" (UID: \"c4a5d028-220b-40aa-a907-95015d7880c8\") " pod="openshift-apiserver/apiserver-76f77b778f-nrwsd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.301688 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c1f51bac-5e94-403b-a197-bf85caf57f23-metrics-certs\") pod \"router-default-5444994796-xhnnw\" (UID: \"c1f51bac-5e94-403b-a197-bf85caf57f23\") " pod="openshift-ingress/router-default-5444994796-xhnnw" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.301733 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0f10c8d-51a0-4e7a-b7ab-af11699298af-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-n644j\" (UID: \"e0f10c8d-51a0-4e7a-b7ab-af11699298af\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-n644j" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.301753 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhfz7\" (UniqueName: \"kubernetes.io/projected/16abc87c-7821-4c71-89c6-d523fec8e4a8-kube-api-access-vhfz7\") pod \"packageserver-d55dfcdfc-24ztk\" (UID: \"16abc87c-7821-4c71-89c6-d523fec8e4a8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-24ztk" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.301799 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1f51bac-5e94-403b-a197-bf85caf57f23-service-ca-bundle\") pod \"router-default-5444994796-xhnnw\" (UID: \"c1f51bac-5e94-403b-a197-bf85caf57f23\") " pod="openshift-ingress/router-default-5444994796-xhnnw" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.301824 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c4a5d028-220b-40aa-a907-95015d7880c8-audit-dir\") pod \"apiserver-76f77b778f-nrwsd\" (UID: \"c4a5d028-220b-40aa-a907-95015d7880c8\") " pod="openshift-apiserver/apiserver-76f77b778f-nrwsd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.301839 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/edc4e120-e930-4fbe-a871-bd63d3de85c9-etcd-client\") pod \"apiserver-7bbb656c7d-c7nrq\" (UID: \"edc4e120-e930-4fbe-a871-bd63d3de85c9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-c7nrq" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.301874 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0f10c8d-51a0-4e7a-b7ab-af11699298af-service-ca-bundle\") pod \"authentication-operator-69f744f599-n644j\" (UID: \"e0f10c8d-51a0-4e7a-b7ab-af11699298af\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-n644j" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.301915 4854 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c4a5d028-220b-40aa-a907-95015d7880c8-image-import-ca\") pod \"apiserver-76f77b778f-nrwsd\" (UID: \"c4a5d028-220b-40aa-a907-95015d7880c8\") " pod="openshift-apiserver/apiserver-76f77b778f-nrwsd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.301951 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9n6z\" (UniqueName: \"kubernetes.io/projected/d9fa2d33-9f7a-4bbf-9c14-085ec357aafd-kube-api-access-r9n6z\") pod \"cluster-samples-operator-665b6dd947-vkkdd\" (UID: \"d9fa2d33-9f7a-4bbf-9c14-085ec357aafd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vkkdd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.301970 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/16abc87c-7821-4c71-89c6-d523fec8e4a8-tmpfs\") pod \"packageserver-d55dfcdfc-24ztk\" (UID: \"16abc87c-7821-4c71-89c6-d523fec8e4a8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-24ztk" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.301998 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c4a5d028-220b-40aa-a907-95015d7880c8-encryption-config\") pod \"apiserver-76f77b778f-nrwsd\" (UID: \"c4a5d028-220b-40aa-a907-95015d7880c8\") " pod="openshift-apiserver/apiserver-76f77b778f-nrwsd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.302033 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c625acfa-7d7b-4b52-b3a2-55c5f817966b-serving-cert\") pod \"openshift-config-operator-7777fb866f-r4lpq\" (UID: \"c625acfa-7d7b-4b52-b3a2-55c5f817966b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-r4lpq" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.302050 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0f10c8d-51a0-4e7a-b7ab-af11699298af-config\") pod \"authentication-operator-69f744f599-n644j\" (UID: \"e0f10c8d-51a0-4e7a-b7ab-af11699298af\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-n644j" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.302072 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54qj5\" (UniqueName: \"kubernetes.io/projected/c1f51bac-5e94-403b-a197-bf85caf57f23-kube-api-access-54qj5\") pod \"router-default-5444994796-xhnnw\" (UID: \"c1f51bac-5e94-403b-a197-bf85caf57f23\") " pod="openshift-ingress/router-default-5444994796-xhnnw" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.302129 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4a5d028-220b-40aa-a907-95015d7880c8-trusted-ca-bundle\") pod \"apiserver-76f77b778f-nrwsd\" (UID: \"c4a5d028-220b-40aa-a907-95015d7880c8\") " pod="openshift-apiserver/apiserver-76f77b778f-nrwsd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.302178 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/d9fa2d33-9f7a-4bbf-9c14-085ec357aafd-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-vkkdd\" (UID: \"d9fa2d33-9f7a-4bbf-9c14-085ec357aafd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vkkdd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.302201 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/16abc87c-7821-4c71-89c6-d523fec8e4a8-webhook-cert\") pod \"packageserver-d55dfcdfc-24ztk\" (UID: \"16abc87c-7821-4c71-89c6-d523fec8e4a8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-24ztk" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.302201 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/edc4e120-e930-4fbe-a871-bd63d3de85c9-audit-policies\") pod \"apiserver-7bbb656c7d-c7nrq\" (UID: \"edc4e120-e930-4fbe-a871-bd63d3de85c9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-c7nrq" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.302223 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rprrp\" (UniqueName: \"kubernetes.io/projected/98fd3935-23d7-4ee3-8325-222533852b78-kube-api-access-rprrp\") pod \"service-ca-9c57cc56f-w9n84\" (UID: \"98fd3935-23d7-4ee3-8325-222533852b78\") " pod="openshift-service-ca/service-ca-9c57cc56f-w9n84" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.302270 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/edc4e120-e930-4fbe-a871-bd63d3de85c9-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-c7nrq\" (UID: \"edc4e120-e930-4fbe-a871-bd63d3de85c9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-c7nrq" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.302290 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/edc4e120-e930-4fbe-a871-bd63d3de85c9-serving-cert\") pod \"apiserver-7bbb656c7d-c7nrq\" (UID: \"edc4e120-e930-4fbe-a871-bd63d3de85c9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-c7nrq" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.302308 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2200948-73a4-4ee6-bd24-d8f4f8e4418f-config\") pod \"kube-apiserver-operator-766d6c64bb-qkppp\" (UID: \"f2200948-73a4-4ee6-bd24-d8f4f8e4418f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qkppp" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.302343 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4a5d028-220b-40aa-a907-95015d7880c8-serving-cert\") pod \"apiserver-76f77b778f-nrwsd\" (UID: \"c4a5d028-220b-40aa-a907-95015d7880c8\") " pod="openshift-apiserver/apiserver-76f77b778f-nrwsd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.302365 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsgmk\" (UniqueName: \"kubernetes.io/projected/c625acfa-7d7b-4b52-b3a2-55c5f817966b-kube-api-access-lsgmk\") pod \"openshift-config-operator-7777fb866f-r4lpq\" (UID: \"c625acfa-7d7b-4b52-b3a2-55c5f817966b\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-r4lpq" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.302386 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c1f51bac-5e94-403b-a197-bf85caf57f23-stats-auth\") pod \"router-default-5444994796-xhnnw\" (UID: \"c1f51bac-5e94-403b-a197-bf85caf57f23\") " pod="openshift-ingress/router-default-5444994796-xhnnw" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.302567 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c4a5d028-220b-40aa-a907-95015d7880c8-node-pullsecrets\") pod \"apiserver-76f77b778f-nrwsd\" (UID: \"c4a5d028-220b-40aa-a907-95015d7880c8\") " pod="openshift-apiserver/apiserver-76f77b778f-nrwsd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.302946 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0f10c8d-51a0-4e7a-b7ab-af11699298af-service-ca-bundle\") pod \"authentication-operator-69f744f599-n644j\" (UID: \"e0f10c8d-51a0-4e7a-b7ab-af11699298af\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-n644j" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.303310 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0f10c8d-51a0-4e7a-b7ab-af11699298af-serving-cert\") pod \"authentication-operator-69f744f599-n644j\" (UID: \"e0f10c8d-51a0-4e7a-b7ab-af11699298af\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-n644j" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.303400 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c4a5d028-220b-40aa-a907-95015d7880c8-audit-dir\") pod \"apiserver-76f77b778f-nrwsd\" (UID: \"c4a5d028-220b-40aa-a907-95015d7880c8\") " pod="openshift-apiserver/apiserver-76f77b778f-nrwsd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.303445 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4a5d028-220b-40aa-a907-95015d7880c8-config\") pod \"apiserver-76f77b778f-nrwsd\" (UID: \"c4a5d028-220b-40aa-a907-95015d7880c8\") " pod="openshift-apiserver/apiserver-76f77b778f-nrwsd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.303913 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c4a5d028-220b-40aa-a907-95015d7880c8-image-import-ca\") pod \"apiserver-76f77b778f-nrwsd\" (UID: \"c4a5d028-220b-40aa-a907-95015d7880c8\") " pod="openshift-apiserver/apiserver-76f77b778f-nrwsd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.304031 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/edc4e120-e930-4fbe-a871-bd63d3de85c9-encryption-config\") pod \"apiserver-7bbb656c7d-c7nrq\" (UID: \"edc4e120-e930-4fbe-a871-bd63d3de85c9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-c7nrq" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.304055 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0f10c8d-51a0-4e7a-b7ab-af11699298af-config\") pod \"authentication-operator-69f744f599-n644j\" (UID: 
\"e0f10c8d-51a0-4e7a-b7ab-af11699298af\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-n644j" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.304404 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0f10c8d-51a0-4e7a-b7ab-af11699298af-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-n644j\" (UID: \"e0f10c8d-51a0-4e7a-b7ab-af11699298af\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-n644j" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.304584 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c4a5d028-220b-40aa-a907-95015d7880c8-etcd-client\") pod \"apiserver-76f77b778f-nrwsd\" (UID: \"c4a5d028-220b-40aa-a907-95015d7880c8\") " pod="openshift-apiserver/apiserver-76f77b778f-nrwsd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.304982 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/edc4e120-e930-4fbe-a871-bd63d3de85c9-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-c7nrq\" (UID: \"edc4e120-e930-4fbe-a871-bd63d3de85c9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-c7nrq" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.305173 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4a5d028-220b-40aa-a907-95015d7880c8-trusted-ca-bundle\") pod \"apiserver-76f77b778f-nrwsd\" (UID: \"c4a5d028-220b-40aa-a907-95015d7880c8\") " pod="openshift-apiserver/apiserver-76f77b778f-nrwsd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.305695 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c4a5d028-220b-40aa-a907-95015d7880c8-encryption-config\") pod \"apiserver-76f77b778f-nrwsd\" (UID: \"c4a5d028-220b-40aa-a907-95015d7880c8\") " pod="openshift-apiserver/apiserver-76f77b778f-nrwsd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.306627 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/edc4e120-e930-4fbe-a871-bd63d3de85c9-etcd-client\") pod \"apiserver-7bbb656c7d-c7nrq\" (UID: \"edc4e120-e930-4fbe-a871-bd63d3de85c9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-c7nrq" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.306630 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4a5d028-220b-40aa-a907-95015d7880c8-serving-cert\") pod \"apiserver-76f77b778f-nrwsd\" (UID: \"c4a5d028-220b-40aa-a907-95015d7880c8\") " pod="openshift-apiserver/apiserver-76f77b778f-nrwsd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.307019 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c625acfa-7d7b-4b52-b3a2-55c5f817966b-serving-cert\") pod \"openshift-config-operator-7777fb866f-r4lpq\" (UID: \"c625acfa-7d7b-4b52-b3a2-55c5f817966b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-r4lpq" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.307914 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/edc4e120-e930-4fbe-a871-bd63d3de85c9-serving-cert\") pod 
\"apiserver-7bbb656c7d-c7nrq\" (UID: \"edc4e120-e930-4fbe-a871-bd63d3de85c9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-c7nrq" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.313549 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.333216 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.352641 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.372818 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.393345 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.403336 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c1f51bac-5e94-403b-a197-bf85caf57f23-metrics-certs\") pod \"router-default-5444994796-xhnnw\" (UID: \"c1f51bac-5e94-403b-a197-bf85caf57f23\") " pod="openshift-ingress/router-default-5444994796-xhnnw" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.403373 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhfz7\" (UniqueName: \"kubernetes.io/projected/16abc87c-7821-4c71-89c6-d523fec8e4a8-kube-api-access-vhfz7\") pod \"packageserver-d55dfcdfc-24ztk\" (UID: \"16abc87c-7821-4c71-89c6-d523fec8e4a8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-24ztk" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.403409 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1f51bac-5e94-403b-a197-bf85caf57f23-service-ca-bundle\") pod \"router-default-5444994796-xhnnw\" (UID: \"c1f51bac-5e94-403b-a197-bf85caf57f23\") " pod="openshift-ingress/router-default-5444994796-xhnnw" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.403444 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9n6z\" (UniqueName: \"kubernetes.io/projected/d9fa2d33-9f7a-4bbf-9c14-085ec357aafd-kube-api-access-r9n6z\") pod \"cluster-samples-operator-665b6dd947-vkkdd\" (UID: \"d9fa2d33-9f7a-4bbf-9c14-085ec357aafd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vkkdd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.403470 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54qj5\" (UniqueName: \"kubernetes.io/projected/c1f51bac-5e94-403b-a197-bf85caf57f23-kube-api-access-54qj5\") pod \"router-default-5444994796-xhnnw\" (UID: \"c1f51bac-5e94-403b-a197-bf85caf57f23\") " pod="openshift-ingress/router-default-5444994796-xhnnw" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.403489 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/16abc87c-7821-4c71-89c6-d523fec8e4a8-tmpfs\") pod \"packageserver-d55dfcdfc-24ztk\" (UID: \"16abc87c-7821-4c71-89c6-d523fec8e4a8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-24ztk" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 
12:25:18.403507 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d9fa2d33-9f7a-4bbf-9c14-085ec357aafd-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-vkkdd\" (UID: \"d9fa2d33-9f7a-4bbf-9c14-085ec357aafd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vkkdd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.403527 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/16abc87c-7821-4c71-89c6-d523fec8e4a8-webhook-cert\") pod \"packageserver-d55dfcdfc-24ztk\" (UID: \"16abc87c-7821-4c71-89c6-d523fec8e4a8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-24ztk" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.403545 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rprrp\" (UniqueName: \"kubernetes.io/projected/98fd3935-23d7-4ee3-8325-222533852b78-kube-api-access-rprrp\") pod \"service-ca-9c57cc56f-w9n84\" (UID: \"98fd3935-23d7-4ee3-8325-222533852b78\") " pod="openshift-service-ca/service-ca-9c57cc56f-w9n84" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.403578 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2200948-73a4-4ee6-bd24-d8f4f8e4418f-config\") pod \"kube-apiserver-operator-766d6c64bb-qkppp\" (UID: \"f2200948-73a4-4ee6-bd24-d8f4f8e4418f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qkppp" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.403604 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c1f51bac-5e94-403b-a197-bf85caf57f23-stats-auth\") pod \"router-default-5444994796-xhnnw\" (UID: \"c1f51bac-5e94-403b-a197-bf85caf57f23\") " pod="openshift-ingress/router-default-5444994796-xhnnw" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.403631 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/98fd3935-23d7-4ee3-8325-222533852b78-signing-key\") pod \"service-ca-9c57cc56f-w9n84\" (UID: \"98fd3935-23d7-4ee3-8325-222533852b78\") " pod="openshift-service-ca/service-ca-9c57cc56f-w9n84" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.403658 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f2200948-73a4-4ee6-bd24-d8f4f8e4418f-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-qkppp\" (UID: \"f2200948-73a4-4ee6-bd24-d8f4f8e4418f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qkppp" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.403674 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/98fd3935-23d7-4ee3-8325-222533852b78-signing-cabundle\") pod \"service-ca-9c57cc56f-w9n84\" (UID: \"98fd3935-23d7-4ee3-8325-222533852b78\") " pod="openshift-service-ca/service-ca-9c57cc56f-w9n84" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.403691 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c1f51bac-5e94-403b-a197-bf85caf57f23-default-certificate\") pod 
\"router-default-5444994796-xhnnw\" (UID: \"c1f51bac-5e94-403b-a197-bf85caf57f23\") " pod="openshift-ingress/router-default-5444994796-xhnnw" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.403721 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/16abc87c-7821-4c71-89c6-d523fec8e4a8-apiservice-cert\") pod \"packageserver-d55dfcdfc-24ztk\" (UID: \"16abc87c-7821-4c71-89c6-d523fec8e4a8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-24ztk" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.403768 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2200948-73a4-4ee6-bd24-d8f4f8e4418f-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-qkppp\" (UID: \"f2200948-73a4-4ee6-bd24-d8f4f8e4418f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qkppp" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.404261 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/16abc87c-7821-4c71-89c6-d523fec8e4a8-tmpfs\") pod \"packageserver-d55dfcdfc-24ztk\" (UID: \"16abc87c-7821-4c71-89c6-d523fec8e4a8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-24ztk" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.404322 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2200948-73a4-4ee6-bd24-d8f4f8e4418f-config\") pod \"kube-apiserver-operator-766d6c64bb-qkppp\" (UID: \"f2200948-73a4-4ee6-bd24-d8f4f8e4418f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qkppp" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.407047 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2200948-73a4-4ee6-bd24-d8f4f8e4418f-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-qkppp\" (UID: \"f2200948-73a4-4ee6-bd24-d8f4f8e4418f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qkppp" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.454646 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.454719 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.457117 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.473617 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.493310 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.513080 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.533822 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.547092 
4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d9fa2d33-9f7a-4bbf-9c14-085ec357aafd-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-vkkdd\" (UID: \"d9fa2d33-9f7a-4bbf-9c14-085ec357aafd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vkkdd" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.553848 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.574313 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.594222 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.614108 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.633644 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.653631 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.675249 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.693874 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.701779 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-m6k45" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.712362 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.733405 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.754065 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.780983 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.794459 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.833978 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.854350 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.874029 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.892790 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.912794 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.933726 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.954502 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.974576 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Oct 07 12:25:18 crc kubenswrapper[4854]: I1007 12:25:18.993618 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.014281 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.033775 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.037941 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/16abc87c-7821-4c71-89c6-d523fec8e4a8-webhook-cert\") pod \"packageserver-d55dfcdfc-24ztk\" (UID: 
\"16abc87c-7821-4c71-89c6-d523fec8e4a8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-24ztk" Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.039624 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/16abc87c-7821-4c71-89c6-d523fec8e4a8-apiservice-cert\") pod \"packageserver-d55dfcdfc-24ztk\" (UID: \"16abc87c-7821-4c71-89c6-d523fec8e4a8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-24ztk" Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.054031 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.073771 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.094897 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.113002 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.142432 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.152533 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.171918 4854 request.go:700] Waited for 1.001094833s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/secrets?fieldSelector=metadata.name%3Dmarketplace-operator-metrics&limit=500&resourceVersion=0 Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.173251 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.194403 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.213110 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.233082 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.252646 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.272500 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.292946 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.317331 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Oct 
07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.332598 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.353593 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.371824 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.393002 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Oct 07 12:25:19 crc kubenswrapper[4854]: E1007 12:25:19.404230 4854 secret.go:188] Couldn't get secret openshift-ingress/router-stats-default: failed to sync secret cache: timed out waiting for the condition Oct 07 12:25:19 crc kubenswrapper[4854]: E1007 12:25:19.404312 4854 configmap.go:193] Couldn't get configMap openshift-ingress/service-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Oct 07 12:25:19 crc kubenswrapper[4854]: E1007 12:25:19.404314 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1f51bac-5e94-403b-a197-bf85caf57f23-stats-auth podName:c1f51bac-5e94-403b-a197-bf85caf57f23 nodeName:}" failed. No retries permitted until 2025-10-07 12:25:19.904287835 +0000 UTC m=+35.892120090 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "stats-auth" (UniqueName: "kubernetes.io/secret/c1f51bac-5e94-403b-a197-bf85caf57f23-stats-auth") pod "router-default-5444994796-xhnnw" (UID: "c1f51bac-5e94-403b-a197-bf85caf57f23") : failed to sync secret cache: timed out waiting for the condition Oct 07 12:25:19 crc kubenswrapper[4854]: E1007 12:25:19.404356 4854 secret.go:188] Couldn't get secret openshift-ingress/router-certs-default: failed to sync secret cache: timed out waiting for the condition Oct 07 12:25:19 crc kubenswrapper[4854]: E1007 12:25:19.404389 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c1f51bac-5e94-403b-a197-bf85caf57f23-service-ca-bundle podName:c1f51bac-5e94-403b-a197-bf85caf57f23 nodeName:}" failed. No retries permitted until 2025-10-07 12:25:19.904370447 +0000 UTC m=+35.892202712 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/c1f51bac-5e94-403b-a197-bf85caf57f23-service-ca-bundle") pod "router-default-5444994796-xhnnw" (UID: "c1f51bac-5e94-403b-a197-bf85caf57f23") : failed to sync configmap cache: timed out waiting for the condition Oct 07 12:25:19 crc kubenswrapper[4854]: E1007 12:25:19.404381 4854 configmap.go:193] Couldn't get configMap openshift-service-ca/signing-cabundle: failed to sync configmap cache: timed out waiting for the condition Oct 07 12:25:19 crc kubenswrapper[4854]: E1007 12:25:19.404413 4854 secret.go:188] Couldn't get secret openshift-service-ca/signing-key: failed to sync secret cache: timed out waiting for the condition Oct 07 12:25:19 crc kubenswrapper[4854]: E1007 12:25:19.404436 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/98fd3935-23d7-4ee3-8325-222533852b78-signing-cabundle podName:98fd3935-23d7-4ee3-8325-222533852b78 nodeName:}" failed. 
No retries permitted until 2025-10-07 12:25:19.904426699 +0000 UTC m=+35.892258964 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "signing-cabundle" (UniqueName: "kubernetes.io/configmap/98fd3935-23d7-4ee3-8325-222533852b78-signing-cabundle") pod "service-ca-9c57cc56f-w9n84" (UID: "98fd3935-23d7-4ee3-8325-222533852b78") : failed to sync configmap cache: timed out waiting for the condition Oct 07 12:25:19 crc kubenswrapper[4854]: E1007 12:25:19.404464 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98fd3935-23d7-4ee3-8325-222533852b78-signing-key podName:98fd3935-23d7-4ee3-8325-222533852b78 nodeName:}" failed. No retries permitted until 2025-10-07 12:25:19.904449089 +0000 UTC m=+35.892281344 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "signing-key" (UniqueName: "kubernetes.io/secret/98fd3935-23d7-4ee3-8325-222533852b78-signing-key") pod "service-ca-9c57cc56f-w9n84" (UID: "98fd3935-23d7-4ee3-8325-222533852b78") : failed to sync secret cache: timed out waiting for the condition Oct 07 12:25:19 crc kubenswrapper[4854]: E1007 12:25:19.404479 4854 secret.go:188] Couldn't get secret openshift-ingress/router-metrics-certs-default: failed to sync secret cache: timed out waiting for the condition Oct 07 12:25:19 crc kubenswrapper[4854]: E1007 12:25:19.404534 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1f51bac-5e94-403b-a197-bf85caf57f23-default-certificate podName:c1f51bac-5e94-403b-a197-bf85caf57f23 nodeName:}" failed. No retries permitted until 2025-10-07 12:25:19.90447894 +0000 UTC m=+35.892311225 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "default-certificate" (UniqueName: "kubernetes.io/secret/c1f51bac-5e94-403b-a197-bf85caf57f23-default-certificate") pod "router-default-5444994796-xhnnw" (UID: "c1f51bac-5e94-403b-a197-bf85caf57f23") : failed to sync secret cache: timed out waiting for the condition Oct 07 12:25:19 crc kubenswrapper[4854]: E1007 12:25:19.404598 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1f51bac-5e94-403b-a197-bf85caf57f23-metrics-certs podName:c1f51bac-5e94-403b-a197-bf85caf57f23 nodeName:}" failed. No retries permitted until 2025-10-07 12:25:19.904565643 +0000 UTC m=+35.892397928 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c1f51bac-5e94-403b-a197-bf85caf57f23-metrics-certs") pod "router-default-5444994796-xhnnw" (UID: "c1f51bac-5e94-403b-a197-bf85caf57f23") : failed to sync secret cache: timed out waiting for the condition Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.412638 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.432922 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.453085 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.473848 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.492872 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.513633 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.533466 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.552677 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.573382 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.592627 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.614023 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.633961 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.652880 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.673501 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.692854 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.701720 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.701720 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.702191 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.713984 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.733366 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.760977 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.773112 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.792050 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.813293 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.833048 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.853800 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.872863 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.892913 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.913202 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.929933 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 12:25:19 crc kubenswrapper[4854]: E1007 12:25:19.930130 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 12:25:35.930099116 +0000 UTC m=+51.917931371 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.930247 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.930286 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.930343 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c1f51bac-5e94-403b-a197-bf85caf57f23-stats-auth\") pod \"router-default-5444994796-xhnnw\" (UID: \"c1f51bac-5e94-403b-a197-bf85caf57f23\") " pod="openshift-ingress/router-default-5444994796-xhnnw" Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.930381 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.930417 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/98fd3935-23d7-4ee3-8325-222533852b78-signing-key\") pod \"service-ca-9c57cc56f-w9n84\" (UID: \"98fd3935-23d7-4ee3-8325-222533852b78\") " pod="openshift-service-ca/service-ca-9c57cc56f-w9n84" Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.930485 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/98fd3935-23d7-4ee3-8325-222533852b78-signing-cabundle\") pod \"service-ca-9c57cc56f-w9n84\" (UID: \"98fd3935-23d7-4ee3-8325-222533852b78\") " pod="openshift-service-ca/service-ca-9c57cc56f-w9n84" Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.930528 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c1f51bac-5e94-403b-a197-bf85caf57f23-default-certificate\") pod \"router-default-5444994796-xhnnw\" (UID: \"c1f51bac-5e94-403b-a197-bf85caf57f23\") " pod="openshift-ingress/router-default-5444994796-xhnnw" Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.930612 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.930667 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c1f51bac-5e94-403b-a197-bf85caf57f23-metrics-certs\") pod \"router-default-5444994796-xhnnw\" (UID: \"c1f51bac-5e94-403b-a197-bf85caf57f23\") " pod="openshift-ingress/router-default-5444994796-xhnnw" Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.930718 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1f51bac-5e94-403b-a197-bf85caf57f23-service-ca-bundle\") pod \"router-default-5444994796-xhnnw\" (UID: \"c1f51bac-5e94-403b-a197-bf85caf57f23\") " pod="openshift-ingress/router-default-5444994796-xhnnw" Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.931602 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1f51bac-5e94-403b-a197-bf85caf57f23-service-ca-bundle\") pod \"router-default-5444994796-xhnnw\" (UID: \"c1f51bac-5e94-403b-a197-bf85caf57f23\") " pod="openshift-ingress/router-default-5444994796-xhnnw" Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.931784 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/98fd3935-23d7-4ee3-8325-222533852b78-signing-cabundle\") pod \"service-ca-9c57cc56f-w9n84\" (UID: \"98fd3935-23d7-4ee3-8325-222533852b78\") " pod="openshift-service-ca/service-ca-9c57cc56f-w9n84" Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.933629 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.934521 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c1f51bac-5e94-403b-a197-bf85caf57f23-stats-auth\") pod \"router-default-5444994796-xhnnw\" (UID: \"c1f51bac-5e94-403b-a197-bf85caf57f23\") " pod="openshift-ingress/router-default-5444994796-xhnnw" Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.934853 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c1f51bac-5e94-403b-a197-bf85caf57f23-metrics-certs\") pod \"router-default-5444994796-xhnnw\" (UID: \"c1f51bac-5e94-403b-a197-bf85caf57f23\") " pod="openshift-ingress/router-default-5444994796-xhnnw" Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.935305 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c1f51bac-5e94-403b-a197-bf85caf57f23-default-certificate\") pod \"router-default-5444994796-xhnnw\" (UID: \"c1f51bac-5e94-403b-a197-bf85caf57f23\") " pod="openshift-ingress/router-default-5444994796-xhnnw" Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.935854 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/98fd3935-23d7-4ee3-8325-222533852b78-signing-key\") pod \"service-ca-9c57cc56f-w9n84\" (UID: \"98fd3935-23d7-4ee3-8325-222533852b78\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-w9n84" Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.972514 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.972776 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2msk\" (UniqueName: \"kubernetes.io/projected/2c45dd65-0c11-4c29-8f62-d667bd61d974-kube-api-access-d2msk\") pod \"oauth-openshift-558db77b4-87pqd\" (UID: \"2c45dd65-0c11-4c29-8f62-d667bd61d974\") " pod="openshift-authentication/oauth-openshift-558db77b4-87pqd" Oct 07 12:25:19 crc kubenswrapper[4854]: I1007 12:25:19.993287 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.012772 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.075673 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4mvc\" (UniqueName: \"kubernetes.io/projected/6c4c0dff-7632-4208-ad0e-475eb69bbc3b-kube-api-access-b4mvc\") pod \"route-controller-manager-6576b87f9c-972qh\" (UID: \"6c4c0dff-7632-4208-ad0e-475eb69bbc3b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-972qh" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.079177 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjm7v\" (UniqueName: \"kubernetes.io/projected/15e62a94-cdbc-4474-9bb0-f62de50bd5e7-kube-api-access-fjm7v\") pod \"console-operator-58897d9998-557n4\" (UID: \"15e62a94-cdbc-4474-9bb0-f62de50bd5e7\") " pod="openshift-console-operator/console-operator-58897d9998-557n4" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.090886 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djwg4\" (UniqueName: \"kubernetes.io/projected/ebe3d149-62bb-4898-bee6-cfe16ad226ac-kube-api-access-djwg4\") pod \"openshift-apiserver-operator-796bbdcf4f-p2nbp\" (UID: \"ebe3d149-62bb-4898-bee6-cfe16ad226ac\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-p2nbp" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.114320 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvrqq\" (UniqueName: \"kubernetes.io/projected/2dcf6deb-727c-4615-ac08-6c9f7ded10f4-kube-api-access-dvrqq\") pod \"machine-approver-56656f9798-brrjc\" (UID: \"2dcf6deb-727c-4615-ac08-6c9f7ded10f4\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-brrjc" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.117250 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-p2nbp" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.145934 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2s5b\" (UniqueName: \"kubernetes.io/projected/753a1abe-8f65-4721-ad3f-b207e3413ffa-kube-api-access-m2s5b\") pod \"controller-manager-879f6c89f-6s458\" (UID: \"753a1abe-8f65-4721-ad3f-b207e3413ffa\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6s458" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.147985 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-brrjc" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.153279 4854 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.174182 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.174557 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-972qh" Oct 07 12:25:20 crc kubenswrapper[4854]: W1007 12:25:20.174857 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2dcf6deb_727c_4615_ac08_6c9f7ded10f4.slice/crio-aa401d64289d994dd0d119622a9f1f886dd8d2a0e914063f1f5912522ffebf45 WatchSource:0}: Error finding container aa401d64289d994dd0d119622a9f1f886dd8d2a0e914063f1f5912522ffebf45: Status 404 returned error can't find the container with id aa401d64289d994dd0d119622a9f1f886dd8d2a0e914063f1f5912522ffebf45 Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.190998 4854 request.go:700] Waited for 1.935868816s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/hostpath-provisioner/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0 Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.193048 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.199935 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-87pqd" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.217384 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-557n4" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.238933 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjcb7\" (UniqueName: \"kubernetes.io/projected/e0f10c8d-51a0-4e7a-b7ab-af11699298af-kube-api-access-fjcb7\") pod \"authentication-operator-69f744f599-n644j\" (UID: \"e0f10c8d-51a0-4e7a-b7ab-af11699298af\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-n644j" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.248952 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rx4kw\" (UniqueName: \"kubernetes.io/projected/edc4e120-e930-4fbe-a871-bd63d3de85c9-kube-api-access-rx4kw\") pod \"apiserver-7bbb656c7d-c7nrq\" (UID: \"edc4e120-e930-4fbe-a871-bd63d3de85c9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-c7nrq" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.272621 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-n644j" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.281073 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlgm4\" (UniqueName: \"kubernetes.io/projected/43359b2f-63d6-4e81-888e-e219dca84ec3-kube-api-access-tlgm4\") pod \"downloads-7954f5f757-64q88\" (UID: \"43359b2f-63d6-4e81-888e-e219dca84ec3\") " pod="openshift-console/downloads-7954f5f757-64q88" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.284747 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-c7nrq" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.290548 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-776fh\" (UniqueName: \"kubernetes.io/projected/c4a5d028-220b-40aa-a907-95015d7880c8-kube-api-access-776fh\") pod \"apiserver-76f77b778f-nrwsd\" (UID: \"c4a5d028-220b-40aa-a907-95015d7880c8\") " pod="openshift-apiserver/apiserver-76f77b778f-nrwsd" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.310809 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsgmk\" (UniqueName: \"kubernetes.io/projected/c625acfa-7d7b-4b52-b3a2-55c5f817966b-kube-api-access-lsgmk\") pod \"openshift-config-operator-7777fb866f-r4lpq\" (UID: \"c625acfa-7d7b-4b52-b3a2-55c5f817966b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-r4lpq" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.333687 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhfz7\" (UniqueName: \"kubernetes.io/projected/16abc87c-7821-4c71-89c6-d523fec8e4a8-kube-api-access-vhfz7\") pod \"packageserver-d55dfcdfc-24ztk\" (UID: \"16abc87c-7821-4c71-89c6-d523fec8e4a8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-24ztk" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.357024 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9n6z\" (UniqueName: \"kubernetes.io/projected/d9fa2d33-9f7a-4bbf-9c14-085ec357aafd-kube-api-access-r9n6z\") pod \"cluster-samples-operator-665b6dd947-vkkdd\" (UID: \"d9fa2d33-9f7a-4bbf-9c14-085ec357aafd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vkkdd" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.369103 4854 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rprrp\" (UniqueName: \"kubernetes.io/projected/98fd3935-23d7-4ee3-8325-222533852b78-kube-api-access-rprrp\") pod \"service-ca-9c57cc56f-w9n84\" (UID: \"98fd3935-23d7-4ee3-8325-222533852b78\") " pod="openshift-service-ca/service-ca-9c57cc56f-w9n84" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.389974 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f2200948-73a4-4ee6-bd24-d8f4f8e4418f-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-qkppp\" (UID: \"f2200948-73a4-4ee6-bd24-d8f4f8e4418f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qkppp" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.397282 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vkkdd" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.409619 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54qj5\" (UniqueName: \"kubernetes.io/projected/c1f51bac-5e94-403b-a197-bf85caf57f23-kube-api-access-54qj5\") pod \"router-default-5444994796-xhnnw\" (UID: \"c1f51bac-5e94-403b-a197-bf85caf57f23\") " pod="openshift-ingress/router-default-5444994796-xhnnw" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.413839 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.418823 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-p2nbp"] Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.430400 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-6s458" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.433163 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.434433 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-972qh"] Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.475819 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-24ztk" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.478050 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.501017 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.508385 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.511139 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.512983 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.524159 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-557n4"] Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.525182 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.533963 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.540535 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-nrwsd" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.541761 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.544082 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-87pqd"] Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.544649 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-64q88" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.560042 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-r4lpq" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.591766 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-w9n84" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.603592 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-xhnnw" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.605970 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-n644j"] Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.620804 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-c7nrq"] Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.623666 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.634668 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.643446 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/8db95f90-9b16-440f-8329-be3ec6d7c1a0-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-dwgwz\" (UID: \"8db95f90-9b16-440f-8329-be3ec6d7c1a0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dwgwz" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.643668 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8db95f90-9b16-440f-8329-be3ec6d7c1a0-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-dwgwz\" (UID: \"8db95f90-9b16-440f-8329-be3ec6d7c1a0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dwgwz" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.643695 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb27cf86-2629-4ee4-85d0-a56129166c65-config\") pod \"kube-controller-manager-operator-78b949d7b-gf54n\" (UID: \"cb27cf86-2629-4ee4-85d0-a56129166c65\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gf54n" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.643778 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.643900 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8cqq\" (UniqueName: \"kubernetes.io/projected/a4d3106d-58e9-4cb6-bcfd-6e151b16969b-kube-api-access-l8cqq\") pod \"machine-api-operator-5694c8668f-tsdfx\" (UID: \"a4d3106d-58e9-4cb6-bcfd-6e151b16969b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tsdfx" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.644205 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/a4d3106d-58e9-4cb6-bcfd-6e151b16969b-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-tsdfx\" (UID: \"a4d3106d-58e9-4cb6-bcfd-6e151b16969b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tsdfx" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.644256 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0f41f688-ca0b-4a67-99b6-7108b7980bcd-proxy-tls\") pod \"machine-config-operator-74547568cd-hr45q\" (UID: \"0f41f688-ca0b-4a67-99b6-7108b7980bcd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hr45q" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.644292 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a5796307-74e5-4cb6-99db-b1ba95dacb54-trusted-ca-bundle\") pod \"console-f9d7485db-9cjsc\" (UID: \"a5796307-74e5-4cb6-99db-b1ba95dacb54\") " pod="openshift-console/console-f9d7485db-9cjsc" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.644326 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cb27cf86-2629-4ee4-85d0-a56129166c65-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-gf54n\" (UID: \"cb27cf86-2629-4ee4-85d0-a56129166c65\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gf54n" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.644371 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qtsr\" (UniqueName: \"kubernetes.io/projected/c9afb6d4-a946-4c1c-995e-330f39a1f346-kube-api-access-7qtsr\") pod \"control-plane-machine-set-operator-78cbb6b69f-dc2nk\" (UID: \"c9afb6d4-a946-4c1c-995e-330f39a1f346\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dc2nk" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.644409 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/329204da-6485-459a-bf30-0ae870c46ca2-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-nkh4b\" (UID: \"329204da-6485-459a-bf30-0ae870c46ca2\") " pod="openshift-marketplace/marketplace-operator-79b997595-nkh4b" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.644430 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c4aa321-2b80-4272-a6e3-ecfbefcbc10e-config\") pod 
\"service-ca-operator-777779d784-n75xz\" (UID: \"3c4aa321-2b80-4272-a6e3-ecfbefcbc10e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-n75xz" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.644450 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfbst\" (UniqueName: \"kubernetes.io/projected/329204da-6485-459a-bf30-0ae870c46ca2-kube-api-access-kfbst\") pod \"marketplace-operator-79b997595-nkh4b\" (UID: \"329204da-6485-459a-bf30-0ae870c46ca2\") " pod="openshift-marketplace/marketplace-operator-79b997595-nkh4b" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.644491 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/b533ed73-fa56-4742-b10c-34beb63d4bba-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-rv6rb\" (UID: \"b533ed73-fa56-4742-b10c-34beb63d4bba\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rv6rb" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.644521 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mtd4\" (UniqueName: \"kubernetes.io/projected/3e06c8ed-2c8c-4c09-a352-31432fbd7d40-kube-api-access-4mtd4\") pod \"cni-sysctl-allowlist-ds-vwn4j\" (UID: \"3e06c8ed-2c8c-4c09-a352-31432fbd7d40\") " pod="openshift-multus/cni-sysctl-allowlist-ds-vwn4j" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.644567 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c115670-445c-4d56-83ed-9e8155a08d89-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-kzt9x\" (UID: \"4c115670-445c-4d56-83ed-9e8155a08d89\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kzt9x" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.644607 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxqs4\" (UniqueName: \"kubernetes.io/projected/b533ed73-fa56-4742-b10c-34beb63d4bba-kube-api-access-pxqs4\") pod \"package-server-manager-789f6589d5-rv6rb\" (UID: \"b533ed73-fa56-4742-b10c-34beb63d4bba\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rv6rb" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.644626 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/63387347-a234-4371-a7b9-c70d0ef73574-etcd-service-ca\") pod \"etcd-operator-b45778765-pzzxz\" (UID: \"63387347-a234-4371-a7b9-c70d0ef73574\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pzzxz" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.644646 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4cz6\" (UniqueName: \"kubernetes.io/projected/e83624ac-e8aa-429f-b313-8c5c4fc6c88e-kube-api-access-q4cz6\") pod \"openshift-controller-manager-operator-756b6f6bc6-tt4xp\" (UID: \"e83624ac-e8aa-429f-b313-8c5c4fc6c88e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tt4xp" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.644665 4854 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4mzd\" (UniqueName: \"kubernetes.io/projected/a791f903-64e0-4312-a4ac-f5639200f261-kube-api-access-t4mzd\") pod \"olm-operator-6b444d44fb-tj4lv\" (UID: \"a791f903-64e0-4312-a4ac-f5639200f261\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tj4lv" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.644695 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8552bac7-9651-4a2c-b65e-28b2c091a0f5-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-bw66g\" (UID: \"8552bac7-9651-4a2c-b65e-28b2c091a0f5\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-bw66g" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.644714 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-976hn\" (UniqueName: \"kubernetes.io/projected/a28154d9-7951-4d75-80be-4bc5cac0ffee-kube-api-access-976hn\") pod \"dns-default-t99kd\" (UID: \"a28154d9-7951-4d75-80be-4bc5cac0ffee\") " pod="openshift-dns/dns-default-t99kd" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.644734 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8bfa807d-ccf5-4c95-9912-9b039b0bad42-profile-collector-cert\") pod \"catalog-operator-68c6474976-npmk7\" (UID: \"8bfa807d-ccf5-4c95-9912-9b039b0bad42\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-npmk7" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.644938 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtrt7\" (UniqueName: \"kubernetes.io/projected/a5796307-74e5-4cb6-99db-b1ba95dacb54-kube-api-access-xtrt7\") pod \"console-f9d7485db-9cjsc\" (UID: \"a5796307-74e5-4cb6-99db-b1ba95dacb54\") " pod="openshift-console/console-f9d7485db-9cjsc" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.644961 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3e06c8ed-2c8c-4c09-a352-31432fbd7d40-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-vwn4j\" (UID: \"3e06c8ed-2c8c-4c09-a352-31432fbd7d40\") " pod="openshift-multus/cni-sysctl-allowlist-ds-vwn4j" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.644978 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b2b320c8-834c-43b3-ab9b-c97bba4fb2b2-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-p8fbp\" (UID: \"b2b320c8-834c-43b3-ab9b-c97bba4fb2b2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-p8fbp" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.644996 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdhsh\" (UniqueName: \"kubernetes.io/projected/d06ab521-e811-4ddf-93ec-d076d089db0c-kube-api-access-jdhsh\") pod \"ingress-canary-kqnvp\" (UID: \"d06ab521-e811-4ddf-93ec-d076d089db0c\") " pod="openshift-ingress-canary/ingress-canary-kqnvp" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.645013 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f83bc29b-b3be-4578-bae7-d2867242278c-ca-trust-extracted\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.645029 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/c9afb6d4-a946-4c1c-995e-330f39a1f346-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-dc2nk\" (UID: \"c9afb6d4-a946-4c1c-995e-330f39a1f346\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dc2nk" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.645057 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a5796307-74e5-4cb6-99db-b1ba95dacb54-oauth-serving-cert\") pod \"console-f9d7485db-9cjsc\" (UID: \"a5796307-74e5-4cb6-99db-b1ba95dacb54\") " pod="openshift-console/console-f9d7485db-9cjsc" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.645073 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8636dc3b-78f1-4098-81ca-1ae0ef4c441b-config-volume\") pod \"collect-profiles-29330655-nh22s\" (UID: \"8636dc3b-78f1-4098-81ca-1ae0ef4c441b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330655-nh22s" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.645092 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c115670-445c-4d56-83ed-9e8155a08d89-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-kzt9x\" (UID: \"4c115670-445c-4d56-83ed-9e8155a08d89\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kzt9x" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.645114 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f83bc29b-b3be-4578-bae7-d2867242278c-installation-pull-secrets\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.645131 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb27cf86-2629-4ee4-85d0-a56129166c65-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-gf54n\" (UID: \"cb27cf86-2629-4ee4-85d0-a56129166c65\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gf54n" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.645166 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9fdfad7e-e478-4167-b2d6-5b8295354328-metrics-tls\") pod \"dns-operator-744455d44c-7frlz\" (UID: \"9fdfad7e-e478-4167-b2d6-5b8295354328\") " pod="openshift-dns-operator/dns-operator-744455d44c-7frlz" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.645184 4854 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mh5nt\" (UniqueName: \"kubernetes.io/projected/8db95f90-9b16-440f-8329-be3ec6d7c1a0-kube-api-access-mh5nt\") pod \"cluster-image-registry-operator-dc59b4c8b-dwgwz\" (UID: \"8db95f90-9b16-440f-8329-be3ec6d7c1a0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dwgwz" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.645203 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0f41f688-ca0b-4a67-99b6-7108b7980bcd-auth-proxy-config\") pod \"machine-config-operator-74547568cd-hr45q\" (UID: \"0f41f688-ca0b-4a67-99b6-7108b7980bcd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hr45q" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.645237 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kjj5\" (UniqueName: \"kubernetes.io/projected/0f41f688-ca0b-4a67-99b6-7108b7980bcd-kube-api-access-6kjj5\") pod \"machine-config-operator-74547568cd-hr45q\" (UID: \"0f41f688-ca0b-4a67-99b6-7108b7980bcd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hr45q" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.645257 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tkb2\" (UniqueName: \"kubernetes.io/projected/615e917b-47ba-4399-a549-496561ca164e-kube-api-access-5tkb2\") pod \"ingress-operator-5b745b69d9-rhcf8\" (UID: \"615e917b-47ba-4399-a549-496561ca164e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rhcf8" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.645289 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e83624ac-e8aa-429f-b313-8c5c4fc6c88e-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-tt4xp\" (UID: \"e83624ac-e8aa-429f-b313-8c5c4fc6c88e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tt4xp" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.645323 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/63387347-a234-4371-a7b9-c70d0ef73574-etcd-client\") pod \"etcd-operator-b45778765-pzzxz\" (UID: \"63387347-a234-4371-a7b9-c70d0ef73574\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pzzxz" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.645337 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8bfa807d-ccf5-4c95-9912-9b039b0bad42-srv-cert\") pod \"catalog-operator-68c6474976-npmk7\" (UID: \"8bfa807d-ccf5-4c95-9912-9b039b0bad42\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-npmk7" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.645363 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f83bc29b-b3be-4578-bae7-d2867242278c-bound-sa-token\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 
12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.645382 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63387347-a234-4371-a7b9-c70d0ef73574-serving-cert\") pod \"etcd-operator-b45778765-pzzxz\" (UID: \"63387347-a234-4371-a7b9-c70d0ef73574\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pzzxz" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.645418 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f83bc29b-b3be-4578-bae7-d2867242278c-registry-certificates\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.645438 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nf48v\" (UniqueName: \"kubernetes.io/projected/f918bb49-d2c5-48f2-ae8f-235bf17c2328-kube-api-access-nf48v\") pod \"machine-config-server-w6wh4\" (UID: \"f918bb49-d2c5-48f2-ae8f-235bf17c2328\") " pod="openshift-machine-config-operator/machine-config-server-w6wh4" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.645454 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8552bac7-9651-4a2c-b65e-28b2c091a0f5-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-bw66g\" (UID: \"8552bac7-9651-4a2c-b65e-28b2c091a0f5\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-bw66g" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.645509 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/329204da-6485-459a-bf30-0ae870c46ca2-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-nkh4b\" (UID: \"329204da-6485-459a-bf30-0ae870c46ca2\") " pod="openshift-marketplace/marketplace-operator-79b997595-nkh4b" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.645526 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qxhv\" (UniqueName: \"kubernetes.io/projected/b2b320c8-834c-43b3-ab9b-c97bba4fb2b2-kube-api-access-8qxhv\") pod \"multus-admission-controller-857f4d67dd-p8fbp\" (UID: \"b2b320c8-834c-43b3-ab9b-c97bba4fb2b2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-p8fbp" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.645541 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e83624ac-e8aa-429f-b313-8c5c4fc6c88e-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-tt4xp\" (UID: \"e83624ac-e8aa-429f-b313-8c5c4fc6c88e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tt4xp" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.645556 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/615e917b-47ba-4399-a549-496561ca164e-trusted-ca\") pod \"ingress-operator-5b745b69d9-rhcf8\" (UID: \"615e917b-47ba-4399-a549-496561ca164e\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rhcf8" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.645581 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rg6vc\" (UniqueName: \"kubernetes.io/projected/4c115670-445c-4d56-83ed-9e8155a08d89-kube-api-access-rg6vc\") pod \"kube-storage-version-migrator-operator-b67b599dd-kzt9x\" (UID: \"4c115670-445c-4d56-83ed-9e8155a08d89\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kzt9x" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.645599 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsbh7\" (UniqueName: \"kubernetes.io/projected/8bfa807d-ccf5-4c95-9912-9b039b0bad42-kube-api-access-dsbh7\") pod \"catalog-operator-68c6474976-npmk7\" (UID: \"8bfa807d-ccf5-4c95-9912-9b039b0bad42\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-npmk7" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.645616 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmm84\" (UniqueName: \"kubernetes.io/projected/f83bc29b-b3be-4578-bae7-d2867242278c-kube-api-access-zmm84\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.645640 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/63387347-a234-4371-a7b9-c70d0ef73574-etcd-ca\") pod \"etcd-operator-b45778765-pzzxz\" (UID: \"63387347-a234-4371-a7b9-c70d0ef73574\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pzzxz" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.645660 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpx52\" (UniqueName: \"kubernetes.io/projected/c35b7e55-a7c4-47ed-a58a-559247edfaca-kube-api-access-vpx52\") pod \"machine-config-controller-84d6567774-gh8fl\" (UID: \"c35b7e55-a7c4-47ed-a58a-559247edfaca\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gh8fl" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.645681 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a791f903-64e0-4312-a4ac-f5639200f261-srv-cert\") pod \"olm-operator-6b444d44fb-tj4lv\" (UID: \"a791f903-64e0-4312-a4ac-f5639200f261\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tj4lv" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.645698 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a791f903-64e0-4312-a4ac-f5639200f261-profile-collector-cert\") pod \"olm-operator-6b444d44fb-tj4lv\" (UID: \"a791f903-64e0-4312-a4ac-f5639200f261\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tj4lv" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.645726 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c4aa321-2b80-4272-a6e3-ecfbefcbc10e-serving-cert\") pod \"service-ca-operator-777779d784-n75xz\" 
(UID: \"3c4aa321-2b80-4272-a6e3-ecfbefcbc10e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-n75xz" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.645744 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a28154d9-7951-4d75-80be-4bc5cac0ffee-config-volume\") pod \"dns-default-t99kd\" (UID: \"a28154d9-7951-4d75-80be-4bc5cac0ffee\") " pod="openshift-dns/dns-default-t99kd" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.645806 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a5796307-74e5-4cb6-99db-b1ba95dacb54-console-serving-cert\") pod \"console-f9d7485db-9cjsc\" (UID: \"a5796307-74e5-4cb6-99db-b1ba95dacb54\") " pod="openshift-console/console-f9d7485db-9cjsc" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.645828 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a5796307-74e5-4cb6-99db-b1ba95dacb54-console-oauth-config\") pod \"console-f9d7485db-9cjsc\" (UID: \"a5796307-74e5-4cb6-99db-b1ba95dacb54\") " pod="openshift-console/console-f9d7485db-9cjsc" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.645849 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a5796307-74e5-4cb6-99db-b1ba95dacb54-service-ca\") pod \"console-f9d7485db-9cjsc\" (UID: \"a5796307-74e5-4cb6-99db-b1ba95dacb54\") " pod="openshift-console/console-f9d7485db-9cjsc" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.645869 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2242m\" (UniqueName: \"kubernetes.io/projected/9fdfad7e-e478-4167-b2d6-5b8295354328-kube-api-access-2242m\") pod \"dns-operator-744455d44c-7frlz\" (UID: \"9fdfad7e-e478-4167-b2d6-5b8295354328\") " pod="openshift-dns-operator/dns-operator-744455d44c-7frlz" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.645903 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a5796307-74e5-4cb6-99db-b1ba95dacb54-console-config\") pod \"console-f9d7485db-9cjsc\" (UID: \"a5796307-74e5-4cb6-99db-b1ba95dacb54\") " pod="openshift-console/console-f9d7485db-9cjsc" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.645923 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a28154d9-7951-4d75-80be-4bc5cac0ffee-metrics-tls\") pod \"dns-default-t99kd\" (UID: \"a28154d9-7951-4d75-80be-4bc5cac0ffee\") " pod="openshift-dns/dns-default-t99kd" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.645941 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6qnf\" (UniqueName: \"kubernetes.io/projected/3c4aa321-2b80-4272-a6e3-ecfbefcbc10e-kube-api-access-b6qnf\") pod \"service-ca-operator-777779d784-n75xz\" (UID: \"3c4aa321-2b80-4272-a6e3-ecfbefcbc10e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-n75xz" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.645976 4854 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3e06c8ed-2c8c-4c09-a352-31432fbd7d40-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-vwn4j\" (UID: \"3e06c8ed-2c8c-4c09-a352-31432fbd7d40\") " pod="openshift-multus/cni-sysctl-allowlist-ds-vwn4j" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.645997 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/3e06c8ed-2c8c-4c09-a352-31432fbd7d40-ready\") pod \"cni-sysctl-allowlist-ds-vwn4j\" (UID: \"3e06c8ed-2c8c-4c09-a352-31432fbd7d40\") " pod="openshift-multus/cni-sysctl-allowlist-ds-vwn4j" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.646024 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8db95f90-9b16-440f-8329-be3ec6d7c1a0-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-dwgwz\" (UID: \"8db95f90-9b16-440f-8329-be3ec6d7c1a0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dwgwz" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.646045 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8636dc3b-78f1-4098-81ca-1ae0ef4c441b-secret-volume\") pod \"collect-profiles-29330655-nh22s\" (UID: \"8636dc3b-78f1-4098-81ca-1ae0ef4c441b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330655-nh22s" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.647343 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c35b7e55-a7c4-47ed-a58a-559247edfaca-proxy-tls\") pod \"machine-config-controller-84d6567774-gh8fl\" (UID: \"c35b7e55-a7c4-47ed-a58a-559247edfaca\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gh8fl" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.647377 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c35b7e55-a7c4-47ed-a58a-559247edfaca-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-gh8fl\" (UID: \"c35b7e55-a7c4-47ed-a58a-559247edfaca\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gh8fl" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.647405 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/615e917b-47ba-4399-a549-496561ca164e-metrics-tls\") pod \"ingress-operator-5b745b69d9-rhcf8\" (UID: \"615e917b-47ba-4399-a549-496561ca164e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rhcf8" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.647434 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/f918bb49-d2c5-48f2-ae8f-235bf17c2328-certs\") pod \"machine-config-server-w6wh4\" (UID: \"f918bb49-d2c5-48f2-ae8f-235bf17c2328\") " pod="openshift-machine-config-operator/machine-config-server-w6wh4" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.647457 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" 
(UniqueName: \"kubernetes.io/configmap/0f41f688-ca0b-4a67-99b6-7108b7980bcd-images\") pod \"machine-config-operator-74547568cd-hr45q\" (UID: \"0f41f688-ca0b-4a67-99b6-7108b7980bcd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hr45q" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.647481 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/615e917b-47ba-4399-a549-496561ca164e-bound-sa-token\") pod \"ingress-operator-5b745b69d9-rhcf8\" (UID: \"615e917b-47ba-4399-a549-496561ca164e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rhcf8" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.647537 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.647632 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63387347-a234-4371-a7b9-c70d0ef73574-config\") pod \"etcd-operator-b45778765-pzzxz\" (UID: \"63387347-a234-4371-a7b9-c70d0ef73574\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pzzxz" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.647654 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqz6c\" (UniqueName: \"kubernetes.io/projected/8636dc3b-78f1-4098-81ca-1ae0ef4c441b-kube-api-access-wqz6c\") pod \"collect-profiles-29330655-nh22s\" (UID: \"8636dc3b-78f1-4098-81ca-1ae0ef4c441b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330655-nh22s" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.647685 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8552bac7-9651-4a2c-b65e-28b2c091a0f5-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-bw66g\" (UID: \"8552bac7-9651-4a2c-b65e-28b2c091a0f5\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-bw66g" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.647711 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kpxh\" (UniqueName: \"kubernetes.io/projected/df20a6de-dc59-4af7-b284-9fef4b249be9-kube-api-access-2kpxh\") pod \"migrator-59844c95c7-wns76\" (UID: \"df20a6de-dc59-4af7-b284-9fef4b249be9\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wns76" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.647730 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcr6z\" (UniqueName: \"kubernetes.io/projected/63387347-a234-4371-a7b9-c70d0ef73574-kube-api-access-fcr6z\") pod \"etcd-operator-b45778765-pzzxz\" (UID: \"63387347-a234-4371-a7b9-c70d0ef73574\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pzzxz" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.647749 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/f83bc29b-b3be-4578-bae7-d2867242278c-registry-tls\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.647770 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a4d3106d-58e9-4cb6-bcfd-6e151b16969b-images\") pod \"machine-api-operator-5694c8668f-tsdfx\" (UID: \"a4d3106d-58e9-4cb6-bcfd-6e151b16969b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tsdfx" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.647791 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d06ab521-e811-4ddf-93ec-d076d089db0c-cert\") pod \"ingress-canary-kqnvp\" (UID: \"d06ab521-e811-4ddf-93ec-d076d089db0c\") " pod="openshift-ingress-canary/ingress-canary-kqnvp" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.647837 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f83bc29b-b3be-4578-bae7-d2867242278c-trusted-ca\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.647867 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4d3106d-58e9-4cb6-bcfd-6e151b16969b-config\") pod \"machine-api-operator-5694c8668f-tsdfx\" (UID: \"a4d3106d-58e9-4cb6-bcfd-6e151b16969b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tsdfx" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.647899 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/f918bb49-d2c5-48f2-ae8f-235bf17c2328-node-bootstrap-token\") pod \"machine-config-server-w6wh4\" (UID: \"f918bb49-d2c5-48f2-ae8f-235bf17c2328\") " pod="openshift-machine-config-operator/machine-config-server-w6wh4" Oct 07 12:25:20 crc kubenswrapper[4854]: E1007 12:25:20.648371 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 12:25:21.148340434 +0000 UTC m=+37.136172689 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-44m5s" (UID: "f83bc29b-b3be-4578-bae7-d2867242278c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.666055 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qkppp" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.750393 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 12:25:20 crc kubenswrapper[4854]: E1007 12:25:20.751196 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 12:25:21.251179212 +0000 UTC m=+37.239011467 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.751870 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8552bac7-9651-4a2c-b65e-28b2c091a0f5-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-bw66g\" (UID: \"8552bac7-9651-4a2c-b65e-28b2c091a0f5\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-bw66g" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.751906 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f83bc29b-b3be-4578-bae7-d2867242278c-registry-certificates\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.751924 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nf48v\" (UniqueName: \"kubernetes.io/projected/f918bb49-d2c5-48f2-ae8f-235bf17c2328-kube-api-access-nf48v\") pod \"machine-config-server-w6wh4\" (UID: \"f918bb49-d2c5-48f2-ae8f-235bf17c2328\") " pod="openshift-machine-config-operator/machine-config-server-w6wh4" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.751942 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/329204da-6485-459a-bf30-0ae870c46ca2-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-nkh4b\" (UID: \"329204da-6485-459a-bf30-0ae870c46ca2\") " pod="openshift-marketplace/marketplace-operator-79b997595-nkh4b" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.751959 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qxhv\" (UniqueName: \"kubernetes.io/projected/b2b320c8-834c-43b3-ab9b-c97bba4fb2b2-kube-api-access-8qxhv\") pod \"multus-admission-controller-857f4d67dd-p8fbp\" (UID: \"b2b320c8-834c-43b3-ab9b-c97bba4fb2b2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-p8fbp" Oct 07 12:25:20 crc 
kubenswrapper[4854]: I1007 12:25:20.751978 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e83624ac-e8aa-429f-b313-8c5c4fc6c88e-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-tt4xp\" (UID: \"e83624ac-e8aa-429f-b313-8c5c4fc6c88e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tt4xp" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.751996 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/615e917b-47ba-4399-a549-496561ca164e-trusted-ca\") pod \"ingress-operator-5b745b69d9-rhcf8\" (UID: \"615e917b-47ba-4399-a549-496561ca164e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rhcf8" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.752025 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rg6vc\" (UniqueName: \"kubernetes.io/projected/4c115670-445c-4d56-83ed-9e8155a08d89-kube-api-access-rg6vc\") pod \"kube-storage-version-migrator-operator-b67b599dd-kzt9x\" (UID: \"4c115670-445c-4d56-83ed-9e8155a08d89\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kzt9x" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.752041 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsbh7\" (UniqueName: \"kubernetes.io/projected/8bfa807d-ccf5-4c95-9912-9b039b0bad42-kube-api-access-dsbh7\") pod \"catalog-operator-68c6474976-npmk7\" (UID: \"8bfa807d-ccf5-4c95-9912-9b039b0bad42\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-npmk7" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.752056 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmm84\" (UniqueName: \"kubernetes.io/projected/f83bc29b-b3be-4578-bae7-d2867242278c-kube-api-access-zmm84\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.752072 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/63387347-a234-4371-a7b9-c70d0ef73574-etcd-ca\") pod \"etcd-operator-b45778765-pzzxz\" (UID: \"63387347-a234-4371-a7b9-c70d0ef73574\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pzzxz" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.752091 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpx52\" (UniqueName: \"kubernetes.io/projected/c35b7e55-a7c4-47ed-a58a-559247edfaca-kube-api-access-vpx52\") pod \"machine-config-controller-84d6567774-gh8fl\" (UID: \"c35b7e55-a7c4-47ed-a58a-559247edfaca\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gh8fl" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.752116 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a791f903-64e0-4312-a4ac-f5639200f261-srv-cert\") pod \"olm-operator-6b444d44fb-tj4lv\" (UID: \"a791f903-64e0-4312-a4ac-f5639200f261\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tj4lv" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.752132 4854 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a791f903-64e0-4312-a4ac-f5639200f261-profile-collector-cert\") pod \"olm-operator-6b444d44fb-tj4lv\" (UID: \"a791f903-64e0-4312-a4ac-f5639200f261\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tj4lv" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.752167 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c4aa321-2b80-4272-a6e3-ecfbefcbc10e-serving-cert\") pod \"service-ca-operator-777779d784-n75xz\" (UID: \"3c4aa321-2b80-4272-a6e3-ecfbefcbc10e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-n75xz" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.752183 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a28154d9-7951-4d75-80be-4bc5cac0ffee-config-volume\") pod \"dns-default-t99kd\" (UID: \"a28154d9-7951-4d75-80be-4bc5cac0ffee\") " pod="openshift-dns/dns-default-t99kd" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.752199 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2242m\" (UniqueName: \"kubernetes.io/projected/9fdfad7e-e478-4167-b2d6-5b8295354328-kube-api-access-2242m\") pod \"dns-operator-744455d44c-7frlz\" (UID: \"9fdfad7e-e478-4167-b2d6-5b8295354328\") " pod="openshift-dns-operator/dns-operator-744455d44c-7frlz" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.752219 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a5796307-74e5-4cb6-99db-b1ba95dacb54-console-serving-cert\") pod \"console-f9d7485db-9cjsc\" (UID: \"a5796307-74e5-4cb6-99db-b1ba95dacb54\") " pod="openshift-console/console-f9d7485db-9cjsc" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.752234 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a5796307-74e5-4cb6-99db-b1ba95dacb54-console-oauth-config\") pod \"console-f9d7485db-9cjsc\" (UID: \"a5796307-74e5-4cb6-99db-b1ba95dacb54\") " pod="openshift-console/console-f9d7485db-9cjsc" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.752249 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a5796307-74e5-4cb6-99db-b1ba95dacb54-service-ca\") pod \"console-f9d7485db-9cjsc\" (UID: \"a5796307-74e5-4cb6-99db-b1ba95dacb54\") " pod="openshift-console/console-f9d7485db-9cjsc" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.752265 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6qnf\" (UniqueName: \"kubernetes.io/projected/3c4aa321-2b80-4272-a6e3-ecfbefcbc10e-kube-api-access-b6qnf\") pod \"service-ca-operator-777779d784-n75xz\" (UID: \"3c4aa321-2b80-4272-a6e3-ecfbefcbc10e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-n75xz" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.752281 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a5796307-74e5-4cb6-99db-b1ba95dacb54-console-config\") pod \"console-f9d7485db-9cjsc\" (UID: \"a5796307-74e5-4cb6-99db-b1ba95dacb54\") " pod="openshift-console/console-f9d7485db-9cjsc" Oct 07 12:25:20 
crc kubenswrapper[4854]: I1007 12:25:20.752299 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a28154d9-7951-4d75-80be-4bc5cac0ffee-metrics-tls\") pod \"dns-default-t99kd\" (UID: \"a28154d9-7951-4d75-80be-4bc5cac0ffee\") " pod="openshift-dns/dns-default-t99kd" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.752326 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3e06c8ed-2c8c-4c09-a352-31432fbd7d40-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-vwn4j\" (UID: \"3e06c8ed-2c8c-4c09-a352-31432fbd7d40\") " pod="openshift-multus/cni-sysctl-allowlist-ds-vwn4j" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.752342 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/3e06c8ed-2c8c-4c09-a352-31432fbd7d40-ready\") pod \"cni-sysctl-allowlist-ds-vwn4j\" (UID: \"3e06c8ed-2c8c-4c09-a352-31432fbd7d40\") " pod="openshift-multus/cni-sysctl-allowlist-ds-vwn4j" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.752385 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8db95f90-9b16-440f-8329-be3ec6d7c1a0-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-dwgwz\" (UID: \"8db95f90-9b16-440f-8329-be3ec6d7c1a0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dwgwz" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.754079 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3e06c8ed-2c8c-4c09-a352-31432fbd7d40-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-vwn4j\" (UID: \"3e06c8ed-2c8c-4c09-a352-31432fbd7d40\") " pod="openshift-multus/cni-sysctl-allowlist-ds-vwn4j" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.754684 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/6d0a97ca-97f0-4bc3-8ee2-bf043accc158-csi-data-dir\") pod \"csi-hostpathplugin-xhvq4\" (UID: \"6d0a97ca-97f0-4bc3-8ee2-bf043accc158\") " pod="hostpath-provisioner/csi-hostpathplugin-xhvq4" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.754814 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8636dc3b-78f1-4098-81ca-1ae0ef4c441b-secret-volume\") pod \"collect-profiles-29330655-nh22s\" (UID: \"8636dc3b-78f1-4098-81ca-1ae0ef4c441b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330655-nh22s" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.754863 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6d0a97ca-97f0-4bc3-8ee2-bf043accc158-socket-dir\") pod \"csi-hostpathplugin-xhvq4\" (UID: \"6d0a97ca-97f0-4bc3-8ee2-bf043accc158\") " pod="hostpath-provisioner/csi-hostpathplugin-xhvq4" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.754885 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c35b7e55-a7c4-47ed-a58a-559247edfaca-proxy-tls\") pod \"machine-config-controller-84d6567774-gh8fl\" (UID: \"c35b7e55-a7c4-47ed-a58a-559247edfaca\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gh8fl" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.754904 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c35b7e55-a7c4-47ed-a58a-559247edfaca-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-gh8fl\" (UID: \"c35b7e55-a7c4-47ed-a58a-559247edfaca\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gh8fl" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.754923 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/615e917b-47ba-4399-a549-496561ca164e-metrics-tls\") pod \"ingress-operator-5b745b69d9-rhcf8\" (UID: \"615e917b-47ba-4399-a549-496561ca164e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rhcf8" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.754943 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/f918bb49-d2c5-48f2-ae8f-235bf17c2328-certs\") pod \"machine-config-server-w6wh4\" (UID: \"f918bb49-d2c5-48f2-ae8f-235bf17c2328\") " pod="openshift-machine-config-operator/machine-config-server-w6wh4" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.754964 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0f41f688-ca0b-4a67-99b6-7108b7980bcd-images\") pod \"machine-config-operator-74547568cd-hr45q\" (UID: \"0f41f688-ca0b-4a67-99b6-7108b7980bcd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hr45q" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.754984 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/615e917b-47ba-4399-a549-496561ca164e-bound-sa-token\") pod \"ingress-operator-5b745b69d9-rhcf8\" (UID: \"615e917b-47ba-4399-a549-496561ca164e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rhcf8" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.755038 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.755061 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63387347-a234-4371-a7b9-c70d0ef73574-config\") pod \"etcd-operator-b45778765-pzzxz\" (UID: \"63387347-a234-4371-a7b9-c70d0ef73574\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pzzxz" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.755082 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqz6c\" (UniqueName: \"kubernetes.io/projected/8636dc3b-78f1-4098-81ca-1ae0ef4c441b-kube-api-access-wqz6c\") pod \"collect-profiles-29330655-nh22s\" (UID: \"8636dc3b-78f1-4098-81ca-1ae0ef4c441b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330655-nh22s" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.755111 4854 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8552bac7-9651-4a2c-b65e-28b2c091a0f5-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-bw66g\" (UID: \"8552bac7-9651-4a2c-b65e-28b2c091a0f5\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-bw66g" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.755130 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6d0a97ca-97f0-4bc3-8ee2-bf043accc158-registration-dir\") pod \"csi-hostpathplugin-xhvq4\" (UID: \"6d0a97ca-97f0-4bc3-8ee2-bf043accc158\") " pod="hostpath-provisioner/csi-hostpathplugin-xhvq4" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.755204 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kpxh\" (UniqueName: \"kubernetes.io/projected/df20a6de-dc59-4af7-b284-9fef4b249be9-kube-api-access-2kpxh\") pod \"migrator-59844c95c7-wns76\" (UID: \"df20a6de-dc59-4af7-b284-9fef4b249be9\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wns76" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.755228 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcr6z\" (UniqueName: \"kubernetes.io/projected/63387347-a234-4371-a7b9-c70d0ef73574-kube-api-access-fcr6z\") pod \"etcd-operator-b45778765-pzzxz\" (UID: \"63387347-a234-4371-a7b9-c70d0ef73574\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pzzxz" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.755252 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-222k4\" (UniqueName: \"kubernetes.io/projected/6d0a97ca-97f0-4bc3-8ee2-bf043accc158-kube-api-access-222k4\") pod \"csi-hostpathplugin-xhvq4\" (UID: \"6d0a97ca-97f0-4bc3-8ee2-bf043accc158\") " pod="hostpath-provisioner/csi-hostpathplugin-xhvq4" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.755279 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f83bc29b-b3be-4578-bae7-d2867242278c-registry-tls\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.755302 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a4d3106d-58e9-4cb6-bcfd-6e151b16969b-images\") pod \"machine-api-operator-5694c8668f-tsdfx\" (UID: \"a4d3106d-58e9-4cb6-bcfd-6e151b16969b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tsdfx" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.755323 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d06ab521-e811-4ddf-93ec-d076d089db0c-cert\") pod \"ingress-canary-kqnvp\" (UID: \"d06ab521-e811-4ddf-93ec-d076d089db0c\") " pod="openshift-ingress-canary/ingress-canary-kqnvp" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.755344 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/f918bb49-d2c5-48f2-ae8f-235bf17c2328-node-bootstrap-token\") pod \"machine-config-server-w6wh4\" (UID: 
\"f918bb49-d2c5-48f2-ae8f-235bf17c2328\") " pod="openshift-machine-config-operator/machine-config-server-w6wh4" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.755386 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f83bc29b-b3be-4578-bae7-d2867242278c-trusted-ca\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.755408 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4d3106d-58e9-4cb6-bcfd-6e151b16969b-config\") pod \"machine-api-operator-5694c8668f-tsdfx\" (UID: \"a4d3106d-58e9-4cb6-bcfd-6e151b16969b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tsdfx" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.755431 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/8db95f90-9b16-440f-8329-be3ec6d7c1a0-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-dwgwz\" (UID: \"8db95f90-9b16-440f-8329-be3ec6d7c1a0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dwgwz" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.755452 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8db95f90-9b16-440f-8329-be3ec6d7c1a0-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-dwgwz\" (UID: \"8db95f90-9b16-440f-8329-be3ec6d7c1a0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dwgwz" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.755475 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb27cf86-2629-4ee4-85d0-a56129166c65-config\") pod \"kube-controller-manager-operator-78b949d7b-gf54n\" (UID: \"cb27cf86-2629-4ee4-85d0-a56129166c65\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gf54n" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.755509 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8cqq\" (UniqueName: \"kubernetes.io/projected/a4d3106d-58e9-4cb6-bcfd-6e151b16969b-kube-api-access-l8cqq\") pod \"machine-api-operator-5694c8668f-tsdfx\" (UID: \"a4d3106d-58e9-4cb6-bcfd-6e151b16969b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tsdfx" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.755530 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0f41f688-ca0b-4a67-99b6-7108b7980bcd-proxy-tls\") pod \"machine-config-operator-74547568cd-hr45q\" (UID: \"0f41f688-ca0b-4a67-99b6-7108b7980bcd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hr45q" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.755564 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/a4d3106d-58e9-4cb6-bcfd-6e151b16969b-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-tsdfx\" (UID: \"a4d3106d-58e9-4cb6-bcfd-6e151b16969b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tsdfx" 
Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.755635 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a5796307-74e5-4cb6-99db-b1ba95dacb54-trusted-ca-bundle\") pod \"console-f9d7485db-9cjsc\" (UID: \"a5796307-74e5-4cb6-99db-b1ba95dacb54\") " pod="openshift-console/console-f9d7485db-9cjsc" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.755653 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cb27cf86-2629-4ee4-85d0-a56129166c65-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-gf54n\" (UID: \"cb27cf86-2629-4ee4-85d0-a56129166c65\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gf54n" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.755688 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qtsr\" (UniqueName: \"kubernetes.io/projected/c9afb6d4-a946-4c1c-995e-330f39a1f346-kube-api-access-7qtsr\") pod \"control-plane-machine-set-operator-78cbb6b69f-dc2nk\" (UID: \"c9afb6d4-a946-4c1c-995e-330f39a1f346\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dc2nk" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.755713 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/329204da-6485-459a-bf30-0ae870c46ca2-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-nkh4b\" (UID: \"329204da-6485-459a-bf30-0ae870c46ca2\") " pod="openshift-marketplace/marketplace-operator-79b997595-nkh4b" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.755730 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c4aa321-2b80-4272-a6e3-ecfbefcbc10e-config\") pod \"service-ca-operator-777779d784-n75xz\" (UID: \"3c4aa321-2b80-4272-a6e3-ecfbefcbc10e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-n75xz" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.755747 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfbst\" (UniqueName: \"kubernetes.io/projected/329204da-6485-459a-bf30-0ae870c46ca2-kube-api-access-kfbst\") pod \"marketplace-operator-79b997595-nkh4b\" (UID: \"329204da-6485-459a-bf30-0ae870c46ca2\") " pod="openshift-marketplace/marketplace-operator-79b997595-nkh4b" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.755780 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/b533ed73-fa56-4742-b10c-34beb63d4bba-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-rv6rb\" (UID: \"b533ed73-fa56-4742-b10c-34beb63d4bba\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rv6rb" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.755799 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c115670-445c-4d56-83ed-9e8155a08d89-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-kzt9x\" (UID: \"4c115670-445c-4d56-83ed-9e8155a08d89\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kzt9x" Oct 07 
12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.755821 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mtd4\" (UniqueName: \"kubernetes.io/projected/3e06c8ed-2c8c-4c09-a352-31432fbd7d40-kube-api-access-4mtd4\") pod \"cni-sysctl-allowlist-ds-vwn4j\" (UID: \"3e06c8ed-2c8c-4c09-a352-31432fbd7d40\") " pod="openshift-multus/cni-sysctl-allowlist-ds-vwn4j" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.755852 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxqs4\" (UniqueName: \"kubernetes.io/projected/b533ed73-fa56-4742-b10c-34beb63d4bba-kube-api-access-pxqs4\") pod \"package-server-manager-789f6589d5-rv6rb\" (UID: \"b533ed73-fa56-4742-b10c-34beb63d4bba\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rv6rb" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.755868 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/63387347-a234-4371-a7b9-c70d0ef73574-etcd-service-ca\") pod \"etcd-operator-b45778765-pzzxz\" (UID: \"63387347-a234-4371-a7b9-c70d0ef73574\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pzzxz" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.755886 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4cz6\" (UniqueName: \"kubernetes.io/projected/e83624ac-e8aa-429f-b313-8c5c4fc6c88e-kube-api-access-q4cz6\") pod \"openshift-controller-manager-operator-756b6f6bc6-tt4xp\" (UID: \"e83624ac-e8aa-429f-b313-8c5c4fc6c88e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tt4xp" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.755905 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4mzd\" (UniqueName: \"kubernetes.io/projected/a791f903-64e0-4312-a4ac-f5639200f261-kube-api-access-t4mzd\") pod \"olm-operator-6b444d44fb-tj4lv\" (UID: \"a791f903-64e0-4312-a4ac-f5639200f261\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tj4lv" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.755897 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a5796307-74e5-4cb6-99db-b1ba95dacb54-console-config\") pod \"console-f9d7485db-9cjsc\" (UID: \"a5796307-74e5-4cb6-99db-b1ba95dacb54\") " pod="openshift-console/console-f9d7485db-9cjsc" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.755927 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8552bac7-9651-4a2c-b65e-28b2c091a0f5-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-bw66g\" (UID: \"8552bac7-9651-4a2c-b65e-28b2c091a0f5\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-bw66g" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.756006 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-976hn\" (UniqueName: \"kubernetes.io/projected/a28154d9-7951-4d75-80be-4bc5cac0ffee-kube-api-access-976hn\") pod \"dns-default-t99kd\" (UID: \"a28154d9-7951-4d75-80be-4bc5cac0ffee\") " pod="openshift-dns/dns-default-t99kd" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.756033 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8bfa807d-ccf5-4c95-9912-9b039b0bad42-profile-collector-cert\") pod \"catalog-operator-68c6474976-npmk7\" (UID: \"8bfa807d-ccf5-4c95-9912-9b039b0bad42\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-npmk7" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.756073 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/6d0a97ca-97f0-4bc3-8ee2-bf043accc158-plugins-dir\") pod \"csi-hostpathplugin-xhvq4\" (UID: \"6d0a97ca-97f0-4bc3-8ee2-bf043accc158\") " pod="hostpath-provisioner/csi-hostpathplugin-xhvq4" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.756096 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtrt7\" (UniqueName: \"kubernetes.io/projected/a5796307-74e5-4cb6-99db-b1ba95dacb54-kube-api-access-xtrt7\") pod \"console-f9d7485db-9cjsc\" (UID: \"a5796307-74e5-4cb6-99db-b1ba95dacb54\") " pod="openshift-console/console-f9d7485db-9cjsc" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.756114 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3e06c8ed-2c8c-4c09-a352-31432fbd7d40-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-vwn4j\" (UID: \"3e06c8ed-2c8c-4c09-a352-31432fbd7d40\") " pod="openshift-multus/cni-sysctl-allowlist-ds-vwn4j" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.756182 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b2b320c8-834c-43b3-ab9b-c97bba4fb2b2-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-p8fbp\" (UID: \"b2b320c8-834c-43b3-ab9b-c97bba4fb2b2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-p8fbp" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.756213 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdhsh\" (UniqueName: \"kubernetes.io/projected/d06ab521-e811-4ddf-93ec-d076d089db0c-kube-api-access-jdhsh\") pod \"ingress-canary-kqnvp\" (UID: \"d06ab521-e811-4ddf-93ec-d076d089db0c\") " pod="openshift-ingress-canary/ingress-canary-kqnvp" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.756235 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f83bc29b-b3be-4578-bae7-d2867242278c-ca-trust-extracted\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.756283 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/c9afb6d4-a946-4c1c-995e-330f39a1f346-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-dc2nk\" (UID: \"c9afb6d4-a946-4c1c-995e-330f39a1f346\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dc2nk" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.756308 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8636dc3b-78f1-4098-81ca-1ae0ef4c441b-config-volume\") pod \"collect-profiles-29330655-nh22s\" (UID: 
\"8636dc3b-78f1-4098-81ca-1ae0ef4c441b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330655-nh22s" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.756367 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a5796307-74e5-4cb6-99db-b1ba95dacb54-oauth-serving-cert\") pod \"console-f9d7485db-9cjsc\" (UID: \"a5796307-74e5-4cb6-99db-b1ba95dacb54\") " pod="openshift-console/console-f9d7485db-9cjsc" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.756393 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c115670-445c-4d56-83ed-9e8155a08d89-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-kzt9x\" (UID: \"4c115670-445c-4d56-83ed-9e8155a08d89\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kzt9x" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.756443 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f83bc29b-b3be-4578-bae7-d2867242278c-installation-pull-secrets\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.756463 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb27cf86-2629-4ee4-85d0-a56129166c65-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-gf54n\" (UID: \"cb27cf86-2629-4ee4-85d0-a56129166c65\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gf54n" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.756523 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9fdfad7e-e478-4167-b2d6-5b8295354328-metrics-tls\") pod \"dns-operator-744455d44c-7frlz\" (UID: \"9fdfad7e-e478-4167-b2d6-5b8295354328\") " pod="openshift-dns-operator/dns-operator-744455d44c-7frlz" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.756548 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mh5nt\" (UniqueName: \"kubernetes.io/projected/8db95f90-9b16-440f-8329-be3ec6d7c1a0-kube-api-access-mh5nt\") pod \"cluster-image-registry-operator-dc59b4c8b-dwgwz\" (UID: \"8db95f90-9b16-440f-8329-be3ec6d7c1a0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dwgwz" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.756591 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0f41f688-ca0b-4a67-99b6-7108b7980bcd-auth-proxy-config\") pod \"machine-config-operator-74547568cd-hr45q\" (UID: \"0f41f688-ca0b-4a67-99b6-7108b7980bcd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hr45q" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.756611 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kjj5\" (UniqueName: \"kubernetes.io/projected/0f41f688-ca0b-4a67-99b6-7108b7980bcd-kube-api-access-6kjj5\") pod \"machine-config-operator-74547568cd-hr45q\" (UID: \"0f41f688-ca0b-4a67-99b6-7108b7980bcd\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hr45q" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.758366 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tkb2\" (UniqueName: \"kubernetes.io/projected/615e917b-47ba-4399-a549-496561ca164e-kube-api-access-5tkb2\") pod \"ingress-operator-5b745b69d9-rhcf8\" (UID: \"615e917b-47ba-4399-a549-496561ca164e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rhcf8" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.758496 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e83624ac-e8aa-429f-b313-8c5c4fc6c88e-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-tt4xp\" (UID: \"e83624ac-e8aa-429f-b313-8c5c4fc6c88e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tt4xp" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.758545 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/6d0a97ca-97f0-4bc3-8ee2-bf043accc158-mountpoint-dir\") pod \"csi-hostpathplugin-xhvq4\" (UID: \"6d0a97ca-97f0-4bc3-8ee2-bf043accc158\") " pod="hostpath-provisioner/csi-hostpathplugin-xhvq4" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.758604 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/63387347-a234-4371-a7b9-c70d0ef73574-etcd-client\") pod \"etcd-operator-b45778765-pzzxz\" (UID: \"63387347-a234-4371-a7b9-c70d0ef73574\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pzzxz" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.758644 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8bfa807d-ccf5-4c95-9912-9b039b0bad42-srv-cert\") pod \"catalog-operator-68c6474976-npmk7\" (UID: \"8bfa807d-ccf5-4c95-9912-9b039b0bad42\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-npmk7" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.758687 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f83bc29b-b3be-4578-bae7-d2867242278c-bound-sa-token\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.758729 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63387347-a234-4371-a7b9-c70d0ef73574-serving-cert\") pod \"etcd-operator-b45778765-pzzxz\" (UID: \"63387347-a234-4371-a7b9-c70d0ef73574\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pzzxz" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.758755 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6s458"] Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.758904 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a28154d9-7951-4d75-80be-4bc5cac0ffee-config-volume\") pod \"dns-default-t99kd\" (UID: \"a28154d9-7951-4d75-80be-4bc5cac0ffee\") " pod="openshift-dns/dns-default-t99kd" Oct 
07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.759662 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/3e06c8ed-2c8c-4c09-a352-31432fbd7d40-ready\") pod \"cni-sysctl-allowlist-ds-vwn4j\" (UID: \"3e06c8ed-2c8c-4c09-a352-31432fbd7d40\") " pod="openshift-multus/cni-sysctl-allowlist-ds-vwn4j" Oct 07 12:25:20 crc kubenswrapper[4854]: E1007 12:25:20.760059 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 12:25:21.260039402 +0000 UTC m=+37.247871657 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-44m5s" (UID: "f83bc29b-b3be-4578-bae7-d2867242278c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.761980 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a5796307-74e5-4cb6-99db-b1ba95dacb54-trusted-ca-bundle\") pod \"console-f9d7485db-9cjsc\" (UID: \"a5796307-74e5-4cb6-99db-b1ba95dacb54\") " pod="openshift-console/console-f9d7485db-9cjsc" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.762016 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8636dc3b-78f1-4098-81ca-1ae0ef4c441b-config-volume\") pod \"collect-profiles-29330655-nh22s\" (UID: \"8636dc3b-78f1-4098-81ca-1ae0ef4c441b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330655-nh22s" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.765889 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f83bc29b-b3be-4578-bae7-d2867242278c-trusted-ca\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.766823 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0f41f688-ca0b-4a67-99b6-7108b7980bcd-images\") pod \"machine-config-operator-74547568cd-hr45q\" (UID: \"0f41f688-ca0b-4a67-99b6-7108b7980bcd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hr45q" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.766903 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c35b7e55-a7c4-47ed-a58a-559247edfaca-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-gh8fl\" (UID: \"c35b7e55-a7c4-47ed-a58a-559247edfaca\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gh8fl" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.767712 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4d3106d-58e9-4cb6-bcfd-6e151b16969b-config\") pod \"machine-api-operator-5694c8668f-tsdfx\" (UID: 
\"a4d3106d-58e9-4cb6-bcfd-6e151b16969b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tsdfx" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.770270 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a5796307-74e5-4cb6-99db-b1ba95dacb54-oauth-serving-cert\") pod \"console-f9d7485db-9cjsc\" (UID: \"a5796307-74e5-4cb6-99db-b1ba95dacb54\") " pod="openshift-console/console-f9d7485db-9cjsc" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.773476 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a791f903-64e0-4312-a4ac-f5639200f261-srv-cert\") pod \"olm-operator-6b444d44fb-tj4lv\" (UID: \"a791f903-64e0-4312-a4ac-f5639200f261\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tj4lv" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.774187 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c35b7e55-a7c4-47ed-a58a-559247edfaca-proxy-tls\") pod \"machine-config-controller-84d6567774-gh8fl\" (UID: \"c35b7e55-a7c4-47ed-a58a-559247edfaca\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gh8fl" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.774593 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63387347-a234-4371-a7b9-c70d0ef73574-serving-cert\") pod \"etcd-operator-b45778765-pzzxz\" (UID: \"63387347-a234-4371-a7b9-c70d0ef73574\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pzzxz" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.777782 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3e06c8ed-2c8c-4c09-a352-31432fbd7d40-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-vwn4j\" (UID: \"3e06c8ed-2c8c-4c09-a352-31432fbd7d40\") " pod="openshift-multus/cni-sysctl-allowlist-ds-vwn4j" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.777822 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8bfa807d-ccf5-4c95-9912-9b039b0bad42-profile-collector-cert\") pod \"catalog-operator-68c6474976-npmk7\" (UID: \"8bfa807d-ccf5-4c95-9912-9b039b0bad42\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-npmk7" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.778247 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a791f903-64e0-4312-a4ac-f5639200f261-profile-collector-cert\") pod \"olm-operator-6b444d44fb-tj4lv\" (UID: \"a791f903-64e0-4312-a4ac-f5639200f261\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tj4lv" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.778718 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c115670-445c-4d56-83ed-9e8155a08d89-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-kzt9x\" (UID: \"4c115670-445c-4d56-83ed-9e8155a08d89\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kzt9x" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.782009 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/63387347-a234-4371-a7b9-c70d0ef73574-etcd-ca\") pod \"etcd-operator-b45778765-pzzxz\" (UID: \"63387347-a234-4371-a7b9-c70d0ef73574\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pzzxz" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.782534 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f83bc29b-b3be-4578-bae7-d2867242278c-registry-certificates\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.782889 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/615e917b-47ba-4399-a549-496561ca164e-trusted-ca\") pod \"ingress-operator-5b745b69d9-rhcf8\" (UID: \"615e917b-47ba-4399-a549-496561ca164e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rhcf8" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.783291 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8db95f90-9b16-440f-8329-be3ec6d7c1a0-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-dwgwz\" (UID: \"8db95f90-9b16-440f-8329-be3ec6d7c1a0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dwgwz" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.784044 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a28154d9-7951-4d75-80be-4bc5cac0ffee-metrics-tls\") pod \"dns-default-t99kd\" (UID: \"a28154d9-7951-4d75-80be-4bc5cac0ffee\") " pod="openshift-dns/dns-default-t99kd" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.784932 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8552bac7-9651-4a2c-b65e-28b2c091a0f5-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-bw66g\" (UID: \"8552bac7-9651-4a2c-b65e-28b2c091a0f5\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-bw66g" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.785426 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c4aa321-2b80-4272-a6e3-ecfbefcbc10e-serving-cert\") pod \"service-ca-operator-777779d784-n75xz\" (UID: \"3c4aa321-2b80-4272-a6e3-ecfbefcbc10e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-n75xz" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.785557 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb27cf86-2629-4ee4-85d0-a56129166c65-config\") pod \"kube-controller-manager-operator-78b949d7b-gf54n\" (UID: \"cb27cf86-2629-4ee4-85d0-a56129166c65\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gf54n" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.785683 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f83bc29b-b3be-4578-bae7-d2867242278c-ca-trust-extracted\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 
12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.786608 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e83624ac-e8aa-429f-b313-8c5c4fc6c88e-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-tt4xp\" (UID: \"e83624ac-e8aa-429f-b313-8c5c4fc6c88e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tt4xp" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.787101 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/329204da-6485-459a-bf30-0ae870c46ca2-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-nkh4b\" (UID: \"329204da-6485-459a-bf30-0ae870c46ca2\") " pod="openshift-marketplace/marketplace-operator-79b997595-nkh4b" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.787867 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63387347-a234-4371-a7b9-c70d0ef73574-config\") pod \"etcd-operator-b45778765-pzzxz\" (UID: \"63387347-a234-4371-a7b9-c70d0ef73574\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pzzxz" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.788244 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a5796307-74e5-4cb6-99db-b1ba95dacb54-service-ca\") pod \"console-f9d7485db-9cjsc\" (UID: \"a5796307-74e5-4cb6-99db-b1ba95dacb54\") " pod="openshift-console/console-f9d7485db-9cjsc" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.790158 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a4d3106d-58e9-4cb6-bcfd-6e151b16969b-images\") pod \"machine-api-operator-5694c8668f-tsdfx\" (UID: \"a4d3106d-58e9-4cb6-bcfd-6e151b16969b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tsdfx" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.789961 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c4aa321-2b80-4272-a6e3-ecfbefcbc10e-config\") pod \"service-ca-operator-777779d784-n75xz\" (UID: \"3c4aa321-2b80-4272-a6e3-ecfbefcbc10e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-n75xz" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.791641 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0f41f688-ca0b-4a67-99b6-7108b7980bcd-auth-proxy-config\") pod \"machine-config-operator-74547568cd-hr45q\" (UID: \"0f41f688-ca0b-4a67-99b6-7108b7980bcd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hr45q" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.793544 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/63387347-a234-4371-a7b9-c70d0ef73574-etcd-service-ca\") pod \"etcd-operator-b45778765-pzzxz\" (UID: \"63387347-a234-4371-a7b9-c70d0ef73574\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pzzxz" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.798786 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/b533ed73-fa56-4742-b10c-34beb63d4bba-package-server-manager-serving-cert\") pod 
\"package-server-manager-789f6589d5-rv6rb\" (UID: \"b533ed73-fa56-4742-b10c-34beb63d4bba\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rv6rb" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.798879 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/615e917b-47ba-4399-a549-496561ca164e-metrics-tls\") pod \"ingress-operator-5b745b69d9-rhcf8\" (UID: \"615e917b-47ba-4399-a549-496561ca164e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rhcf8" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.799259 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f83bc29b-b3be-4578-bae7-d2867242278c-installation-pull-secrets\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.799268 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/a4d3106d-58e9-4cb6-bcfd-6e151b16969b-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-tsdfx\" (UID: \"a4d3106d-58e9-4cb6-bcfd-6e151b16969b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tsdfx" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.799380 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/f918bb49-d2c5-48f2-ae8f-235bf17c2328-certs\") pod \"machine-config-server-w6wh4\" (UID: \"f918bb49-d2c5-48f2-ae8f-235bf17c2328\") " pod="openshift-machine-config-operator/machine-config-server-w6wh4" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.799450 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a5796307-74e5-4cb6-99db-b1ba95dacb54-console-oauth-config\") pod \"console-f9d7485db-9cjsc\" (UID: \"a5796307-74e5-4cb6-99db-b1ba95dacb54\") " pod="openshift-console/console-f9d7485db-9cjsc" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.799660 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb27cf86-2629-4ee4-85d0-a56129166c65-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-gf54n\" (UID: \"cb27cf86-2629-4ee4-85d0-a56129166c65\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gf54n" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.800191 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8636dc3b-78f1-4098-81ca-1ae0ef4c441b-secret-volume\") pod \"collect-profiles-29330655-nh22s\" (UID: \"8636dc3b-78f1-4098-81ca-1ae0ef4c441b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330655-nh22s" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.800333 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/8db95f90-9b16-440f-8329-be3ec6d7c1a0-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-dwgwz\" (UID: \"8db95f90-9b16-440f-8329-be3ec6d7c1a0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dwgwz" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 
12:25:20.800876 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-24ztk"] Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.800978 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/63387347-a234-4371-a7b9-c70d0ef73574-etcd-client\") pod \"etcd-operator-b45778765-pzzxz\" (UID: \"63387347-a234-4371-a7b9-c70d0ef73574\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pzzxz" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.801409 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9fdfad7e-e478-4167-b2d6-5b8295354328-metrics-tls\") pod \"dns-operator-744455d44c-7frlz\" (UID: \"9fdfad7e-e478-4167-b2d6-5b8295354328\") " pod="openshift-dns-operator/dns-operator-744455d44c-7frlz" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.802158 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a5796307-74e5-4cb6-99db-b1ba95dacb54-console-serving-cert\") pod \"console-f9d7485db-9cjsc\" (UID: \"a5796307-74e5-4cb6-99db-b1ba95dacb54\") " pod="openshift-console/console-f9d7485db-9cjsc" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.802507 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6qnf\" (UniqueName: \"kubernetes.io/projected/3c4aa321-2b80-4272-a6e3-ecfbefcbc10e-kube-api-access-b6qnf\") pod \"service-ca-operator-777779d784-n75xz\" (UID: \"3c4aa321-2b80-4272-a6e3-ecfbefcbc10e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-n75xz" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.802742 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b2b320c8-834c-43b3-ab9b-c97bba4fb2b2-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-p8fbp\" (UID: \"b2b320c8-834c-43b3-ab9b-c97bba4fb2b2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-p8fbp" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.803276 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f83bc29b-b3be-4578-bae7-d2867242278c-registry-tls\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.804527 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/f918bb49-d2c5-48f2-ae8f-235bf17c2328-node-bootstrap-token\") pod \"machine-config-server-w6wh4\" (UID: \"f918bb49-d2c5-48f2-ae8f-235bf17c2328\") " pod="openshift-machine-config-operator/machine-config-server-w6wh4" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.804572 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8bfa807d-ccf5-4c95-9912-9b039b0bad42-srv-cert\") pod \"catalog-operator-68c6474976-npmk7\" (UID: \"8bfa807d-ccf5-4c95-9912-9b039b0bad42\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-npmk7" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.804759 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e83624ac-e8aa-429f-b313-8c5c4fc6c88e-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-tt4xp\" (UID: \"e83624ac-e8aa-429f-b313-8c5c4fc6c88e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tt4xp" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.805168 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d06ab521-e811-4ddf-93ec-d076d089db0c-cert\") pod \"ingress-canary-kqnvp\" (UID: \"d06ab521-e811-4ddf-93ec-d076d089db0c\") " pod="openshift-ingress-canary/ingress-canary-kqnvp" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.805376 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/329204da-6485-459a-bf30-0ae870c46ca2-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-nkh4b\" (UID: \"329204da-6485-459a-bf30-0ae870c46ca2\") " pod="openshift-marketplace/marketplace-operator-79b997595-nkh4b" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.806066 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c115670-445c-4d56-83ed-9e8155a08d89-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-kzt9x\" (UID: \"4c115670-445c-4d56-83ed-9e8155a08d89\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kzt9x" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.809789 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8552bac7-9651-4a2c-b65e-28b2c091a0f5-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-bw66g\" (UID: \"8552bac7-9651-4a2c-b65e-28b2c091a0f5\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-bw66g" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.810001 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0f41f688-ca0b-4a67-99b6-7108b7980bcd-proxy-tls\") pod \"machine-config-operator-74547568cd-hr45q\" (UID: \"0f41f688-ca0b-4a67-99b6-7108b7980bcd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hr45q" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.810087 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/c9afb6d4-a946-4c1c-995e-330f39a1f346-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-dc2nk\" (UID: \"c9afb6d4-a946-4c1c-995e-330f39a1f346\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dc2nk" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.830755 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2242m\" (UniqueName: \"kubernetes.io/projected/9fdfad7e-e478-4167-b2d6-5b8295354328-kube-api-access-2242m\") pod \"dns-operator-744455d44c-7frlz\" (UID: \"9fdfad7e-e478-4167-b2d6-5b8295354328\") " pod="openshift-dns-operator/dns-operator-744455d44c-7frlz" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.831688 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/615e917b-47ba-4399-a549-496561ca164e-bound-sa-token\") pod 
\"ingress-operator-5b745b69d9-rhcf8\" (UID: \"615e917b-47ba-4399-a549-496561ca164e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rhcf8" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.860868 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8db95f90-9b16-440f-8329-be3ec6d7c1a0-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-dwgwz\" (UID: \"8db95f90-9b16-440f-8329-be3ec6d7c1a0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dwgwz" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.874547 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-n75xz" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.878306 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 12:25:20 crc kubenswrapper[4854]: E1007 12:25:20.878835 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 12:25:21.378804286 +0000 UTC m=+37.366636541 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.891443 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/6d0a97ca-97f0-4bc3-8ee2-bf043accc158-csi-data-dir\") pod \"csi-hostpathplugin-xhvq4\" (UID: \"6d0a97ca-97f0-4bc3-8ee2-bf043accc158\") " pod="hostpath-provisioner/csi-hostpathplugin-xhvq4" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.891508 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6d0a97ca-97f0-4bc3-8ee2-bf043accc158-socket-dir\") pod \"csi-hostpathplugin-xhvq4\" (UID: \"6d0a97ca-97f0-4bc3-8ee2-bf043accc158\") " pod="hostpath-provisioner/csi-hostpathplugin-xhvq4" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.891551 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.891589 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6d0a97ca-97f0-4bc3-8ee2-bf043accc158-registration-dir\") pod 
\"csi-hostpathplugin-xhvq4\" (UID: \"6d0a97ca-97f0-4bc3-8ee2-bf043accc158\") " pod="hostpath-provisioner/csi-hostpathplugin-xhvq4" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.891623 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-222k4\" (UniqueName: \"kubernetes.io/projected/6d0a97ca-97f0-4bc3-8ee2-bf043accc158-kube-api-access-222k4\") pod \"csi-hostpathplugin-xhvq4\" (UID: \"6d0a97ca-97f0-4bc3-8ee2-bf043accc158\") " pod="hostpath-provisioner/csi-hostpathplugin-xhvq4" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.891869 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/6d0a97ca-97f0-4bc3-8ee2-bf043accc158-plugins-dir\") pod \"csi-hostpathplugin-xhvq4\" (UID: \"6d0a97ca-97f0-4bc3-8ee2-bf043accc158\") " pod="hostpath-provisioner/csi-hostpathplugin-xhvq4" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.891944 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/6d0a97ca-97f0-4bc3-8ee2-bf043accc158-mountpoint-dir\") pod \"csi-hostpathplugin-xhvq4\" (UID: \"6d0a97ca-97f0-4bc3-8ee2-bf043accc158\") " pod="hostpath-provisioner/csi-hostpathplugin-xhvq4" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.892054 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6d0a97ca-97f0-4bc3-8ee2-bf043accc158-socket-dir\") pod \"csi-hostpathplugin-xhvq4\" (UID: \"6d0a97ca-97f0-4bc3-8ee2-bf043accc158\") " pod="hostpath-provisioner/csi-hostpathplugin-xhvq4" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.892088 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/6d0a97ca-97f0-4bc3-8ee2-bf043accc158-mountpoint-dir\") pod \"csi-hostpathplugin-xhvq4\" (UID: \"6d0a97ca-97f0-4bc3-8ee2-bf043accc158\") " pod="hostpath-provisioner/csi-hostpathplugin-xhvq4" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.892183 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/6d0a97ca-97f0-4bc3-8ee2-bf043accc158-csi-data-dir\") pod \"csi-hostpathplugin-xhvq4\" (UID: \"6d0a97ca-97f0-4bc3-8ee2-bf043accc158\") " pod="hostpath-provisioner/csi-hostpathplugin-xhvq4" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.892245 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6d0a97ca-97f0-4bc3-8ee2-bf043accc158-registration-dir\") pod \"csi-hostpathplugin-xhvq4\" (UID: \"6d0a97ca-97f0-4bc3-8ee2-bf043accc158\") " pod="hostpath-provisioner/csi-hostpathplugin-xhvq4" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.892321 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/6d0a97ca-97f0-4bc3-8ee2-bf043accc158-plugins-dir\") pod \"csi-hostpathplugin-xhvq4\" (UID: \"6d0a97ca-97f0-4bc3-8ee2-bf043accc158\") " pod="hostpath-provisioner/csi-hostpathplugin-xhvq4" Oct 07 12:25:20 crc kubenswrapper[4854]: E1007 12:25:20.892576 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-07 12:25:21.392557088 +0000 UTC m=+37.380389333 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-44m5s" (UID: "f83bc29b-b3be-4578-bae7-d2867242278c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.902595 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-976hn\" (UniqueName: \"kubernetes.io/projected/a28154d9-7951-4d75-80be-4bc5cac0ffee-kube-api-access-976hn\") pod \"dns-default-t99kd\" (UID: \"a28154d9-7951-4d75-80be-4bc5cac0ffee\") " pod="openshift-dns/dns-default-t99kd" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.910796 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nf48v\" (UniqueName: \"kubernetes.io/projected/f918bb49-d2c5-48f2-ae8f-235bf17c2328-kube-api-access-nf48v\") pod \"machine-config-server-w6wh4\" (UID: \"f918bb49-d2c5-48f2-ae8f-235bf17c2328\") " pod="openshift-machine-config-operator/machine-config-server-w6wh4" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.930640 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-t99kd" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.939794 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-w6wh4" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.940360 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vkkdd"] Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.953733 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-7frlz" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.955225 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8552bac7-9651-4a2c-b65e-28b2c091a0f5-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-bw66g\" (UID: \"8552bac7-9651-4a2c-b65e-28b2c091a0f5\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-bw66g" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.958768 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-bw66g" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.976654 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtrt7\" (UniqueName: \"kubernetes.io/projected/a5796307-74e5-4cb6-99db-b1ba95dacb54-kube-api-access-xtrt7\") pod \"console-f9d7485db-9cjsc\" (UID: \"a5796307-74e5-4cb6-99db-b1ba95dacb54\") " pod="openshift-console/console-f9d7485db-9cjsc" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.977762 4854 util.go:30] "No sandbox for pod can be found. 
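The failure repeated above is the notable one in this stretch of the log: both MountVolume.MountDevice for the image-registry PVC and UnmountVolume.TearDown for the old pod fail with "driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers". The kubelet only knows about a CSI driver after its node plugin has registered over the kubelet plugin-registration socket, and the csi-hostpathplugin-xhvq4 pod that provides this driver is itself still being set up in the same log, so the errors are consistent with a startup ordering race rather than a broken volume. One way to see which drivers a node's kubelet has registered is the CSINode object; a minimal client-go sketch follows, again assuming a kubeconfig at the default location and a reachable cluster.

    package main

    import (
        "context"
        "fmt"

        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
        if err != nil {
            panic(err)
        }
        cs, err := kubernetes.NewForConfig(cfg)
        if err != nil {
            panic(err)
        }

        // CSINode lists, per node, the CSI drivers that have completed
        // plugin registration with that node's kubelet.
        csiNodes, err := cs.StorageV1().CSINodes().List(context.TODO(), metav1.ListOptions{})
        if err != nil {
            panic(err)
        }
        for _, n := range csiNodes.Items {
            if len(n.Spec.Drivers) == 0 {
                fmt.Printf("node %s: no registered CSI drivers yet\n", n.Name)
                continue
            }
            for _, d := range n.Spec.Drivers {
                fmt.Printf("node %s: driver %s registered\n", n.Name, d.Name)
            }
        }
    }

Once kubevirt.io.hostpath-provisioner shows up in that list, the mount and unmount retries in this log should start succeeding.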
Need to start a new one" pod="openshift-console/console-f9d7485db-9cjsc" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.980936 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-w9n84"] Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.983678 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-64q88"] Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.984231 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxqs4\" (UniqueName: \"kubernetes.io/projected/b533ed73-fa56-4742-b10c-34beb63d4bba-kube-api-access-pxqs4\") pod \"package-server-manager-789f6589d5-rv6rb\" (UID: \"b533ed73-fa56-4742-b10c-34beb63d4bba\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rv6rb" Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.993821 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 12:25:20 crc kubenswrapper[4854]: E1007 12:25:20.994393 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 12:25:21.494368386 +0000 UTC m=+37.482200651 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:20 crc kubenswrapper[4854]: I1007 12:25:20.996017 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cb27cf86-2629-4ee4-85d0-a56129166c65-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-gf54n\" (UID: \"cb27cf86-2629-4ee4-85d0-a56129166c65\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gf54n" Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.019768 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qtsr\" (UniqueName: \"kubernetes.io/projected/c9afb6d4-a946-4c1c-995e-330f39a1f346-kube-api-access-7qtsr\") pod \"control-plane-machine-set-operator-78cbb6b69f-dc2nk\" (UID: \"c9afb6d4-a946-4c1c-995e-330f39a1f346\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dc2nk" Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.033271 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mtd4\" (UniqueName: \"kubernetes.io/projected/3e06c8ed-2c8c-4c09-a352-31432fbd7d40-kube-api-access-4mtd4\") pod \"cni-sysctl-allowlist-ds-vwn4j\" (UID: \"3e06c8ed-2c8c-4c09-a352-31432fbd7d40\") " pod="openshift-multus/cni-sysctl-allowlist-ds-vwn4j" Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.036194 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-config-operator/openshift-config-operator-7777fb866f-r4lpq"] Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.039700 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gf54n" Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.057515 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpx52\" (UniqueName: \"kubernetes.io/projected/c35b7e55-a7c4-47ed-a58a-559247edfaca-kube-api-access-vpx52\") pod \"machine-config-controller-84d6567774-gh8fl\" (UID: \"c35b7e55-a7c4-47ed-a58a-559247edfaca\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gh8fl" Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.061863 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-c7nrq" event={"ID":"edc4e120-e930-4fbe-a871-bd63d3de85c9","Type":"ContainerStarted","Data":"25870ff1e1deef1cc982c299b5d616c66bbb322dbd917dcd2c6c7625b1ab1cc4"} Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.071004 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-24ztk" event={"ID":"16abc87c-7821-4c71-89c6-d523fec8e4a8","Type":"ContainerStarted","Data":"cb332c9ea5969b3720c84c9c583e903e9f698c84f364567258fbeff117687106"} Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.077417 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rg6vc\" (UniqueName: \"kubernetes.io/projected/4c115670-445c-4d56-83ed-9e8155a08d89-kube-api-access-rg6vc\") pod \"kube-storage-version-migrator-operator-b67b599dd-kzt9x\" (UID: \"4c115670-445c-4d56-83ed-9e8155a08d89\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kzt9x" Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.091828 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-nrwsd"] Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.093434 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-brrjc" event={"ID":"2dcf6deb-727c-4615-ac08-6c9f7ded10f4","Type":"ContainerStarted","Data":"e362651e56107987634bfa67ac4ee03778a79b05a5feb90d10351d65a40a59ad"} Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.093485 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-brrjc" event={"ID":"2dcf6deb-727c-4615-ac08-6c9f7ded10f4","Type":"ContainerStarted","Data":"aa401d64289d994dd0d119622a9f1f886dd8d2a0e914063f1f5912522ffebf45"} Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.094082 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kjj5\" (UniqueName: \"kubernetes.io/projected/0f41f688-ca0b-4a67-99b6-7108b7980bcd-kube-api-access-6kjj5\") pod \"machine-config-operator-74547568cd-hr45q\" (UID: \"0f41f688-ca0b-4a67-99b6-7108b7980bcd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hr45q" Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.096291 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:21 crc kubenswrapper[4854]: E1007 12:25:21.096744 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 12:25:21.59671162 +0000 UTC m=+37.584543875 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-44m5s" (UID: "f83bc29b-b3be-4578-bae7-d2867242278c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.097104 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-6s458" event={"ID":"753a1abe-8f65-4721-ad3f-b207e3413ffa","Type":"ContainerStarted","Data":"0cb4a09de0bcf16e9ed65dfb0112334da7e29bb8f18795347b3c98e9471dcfe8"} Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.099427 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-n644j" event={"ID":"e0f10c8d-51a0-4e7a-b7ab-af11699298af","Type":"ContainerStarted","Data":"f7abe2d0387c3fac3673c10c36ccd97b074dd440a32f874a11b0323af0a837d0"} Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.109957 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-87pqd" event={"ID":"2c45dd65-0c11-4c29-8f62-d667bd61d974","Type":"ContainerStarted","Data":"61254e358b2dac7f661aecb8afb88edc26eed1baa32f0e9beb48f5e20a2e4840"} Oct 07 12:25:21 crc kubenswrapper[4854]: W1007 12:25:21.110867 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc625acfa_7d7b_4b52_b3a2_55c5f817966b.slice/crio-d4a2cf31b7383801d574bcd1bdba04748b432af05c9765318af1034a87207365 WatchSource:0}: Error finding container d4a2cf31b7383801d574bcd1bdba04748b432af05c9765318af1034a87207365: Status 404 returned error can't find the container with id d4a2cf31b7383801d574bcd1bdba04748b432af05c9765318af1034a87207365 Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.116393 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-xhnnw" event={"ID":"c1f51bac-5e94-403b-a197-bf85caf57f23","Type":"ContainerStarted","Data":"cacbbef2a991cfca9886e30e361efba2496faff99102db8cc887db31b672e924"} Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.126604 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmm84\" (UniqueName: \"kubernetes.io/projected/f83bc29b-b3be-4578-bae7-d2867242278c-kube-api-access-zmm84\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.131774 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsbh7\" (UniqueName: \"kubernetes.io/projected/8bfa807d-ccf5-4c95-9912-9b039b0bad42-kube-api-access-dsbh7\") pod 
\"catalog-operator-68c6474976-npmk7\" (UID: \"8bfa807d-ccf5-4c95-9912-9b039b0bad42\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-npmk7" Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.147808 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-557n4" event={"ID":"15e62a94-cdbc-4474-9bb0-f62de50bd5e7","Type":"ContainerStarted","Data":"ce017ed1f5fbbc104749e1cb322a1f95e00f4a206737f61f84c58308a9b992ac"} Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.147870 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-557n4" event={"ID":"15e62a94-cdbc-4474-9bb0-f62de50bd5e7","Type":"ContainerStarted","Data":"7bb5d3b5d5109c3c570703d9d563d0da0ac8fa6bbd160d035a5f89797a9a8291"} Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.149085 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-557n4" Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.154865 4854 patch_prober.go:28] interesting pod/console-operator-58897d9998-557n4 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/readyz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.154930 4854 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-557n4" podUID="15e62a94-cdbc-4474-9bb0-f62de50bd5e7" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.8:8443/readyz\": dial tcp 10.217.0.8:8443: connect: connection refused" Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.158599 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqz6c\" (UniqueName: \"kubernetes.io/projected/8636dc3b-78f1-4098-81ca-1ae0ef4c441b-kube-api-access-wqz6c\") pod \"collect-profiles-29330655-nh22s\" (UID: \"8636dc3b-78f1-4098-81ca-1ae0ef4c441b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330655-nh22s" Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.163766 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-972qh" event={"ID":"6c4c0dff-7632-4208-ad0e-475eb69bbc3b","Type":"ContainerStarted","Data":"f93699bad30ab9b4213570ff13f662f7a74fa1a38cf616478d0940f9e5ddc952"} Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.163815 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-972qh" event={"ID":"6c4c0dff-7632-4208-ad0e-475eb69bbc3b","Type":"ContainerStarted","Data":"089f5d1cf5985d174a1b74c24feac01758cd10004d5ef269bc5a5a233eaa8232"} Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.164397 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-972qh" Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.172109 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-p2nbp" event={"ID":"ebe3d149-62bb-4898-bee6-cfe16ad226ac","Type":"ContainerStarted","Data":"eca031ba31f8d49c87871da8558186695525a99ad2bfc31c56b771a0b58aa088"} Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.172207 4854 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-p2nbp" event={"ID":"ebe3d149-62bb-4898-bee6-cfe16ad226ac","Type":"ContainerStarted","Data":"6e17a9fb37b42d0e273366fab9d60d561df04047ae9b6ebd8e30946fe73ef8a9"} Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.176523 4854 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-972qh container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.176601 4854 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-972qh" podUID="6c4c0dff-7632-4208-ad0e-475eb69bbc3b" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.184207 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gh8fl" Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.188924 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8cqq\" (UniqueName: \"kubernetes.io/projected/a4d3106d-58e9-4cb6-bcfd-6e151b16969b-kube-api-access-l8cqq\") pod \"machine-api-operator-5694c8668f-tsdfx\" (UID: \"a4d3106d-58e9-4cb6-bcfd-6e151b16969b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tsdfx" Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.196097 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dc2nk" Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.197037 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qkppp"] Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.197475 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 12:25:21 crc kubenswrapper[4854]: E1007 12:25:21.197658 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 12:25:21.697631382 +0000 UTC m=+37.685463637 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.197844 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.199193 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdhsh\" (UniqueName: \"kubernetes.io/projected/d06ab521-e811-4ddf-93ec-d076d089db0c-kube-api-access-jdhsh\") pod \"ingress-canary-kqnvp\" (UID: \"d06ab521-e811-4ddf-93ec-d076d089db0c\") " pod="openshift-ingress-canary/ingress-canary-kqnvp" Oct 07 12:25:21 crc kubenswrapper[4854]: E1007 12:25:21.199998 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 12:25:21.699987511 +0000 UTC m=+37.687819766 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-44m5s" (UID: "f83bc29b-b3be-4578-bae7-d2867242278c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:21 crc kubenswrapper[4854]: W1007 12:25:21.204707 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-ee39464b2a7c02c5b8fb312a097481618a87d3923448d3caf7455502d290fc1e WatchSource:0}: Error finding container ee39464b2a7c02c5b8fb312a097481618a87d3923448d3caf7455502d290fc1e: Status 404 returned error can't find the container with id ee39464b2a7c02c5b8fb312a097481618a87d3923448d3caf7455502d290fc1e Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.210205 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-npmk7" Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.212733 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mh5nt\" (UniqueName: \"kubernetes.io/projected/8db95f90-9b16-440f-8329-be3ec6d7c1a0-kube-api-access-mh5nt\") pod \"cluster-image-registry-operator-dc59b4c8b-dwgwz\" (UID: \"8db95f90-9b16-440f-8329-be3ec6d7c1a0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dwgwz" Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.225277 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rv6rb" Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.229615 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4mzd\" (UniqueName: \"kubernetes.io/projected/a791f903-64e0-4312-a4ac-f5639200f261-kube-api-access-t4mzd\") pod \"olm-operator-6b444d44fb-tj4lv\" (UID: \"a791f903-64e0-4312-a4ac-f5639200f261\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tj4lv" Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.232418 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-kqnvp" Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.237524 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-tsdfx" Oct 07 12:25:21 crc kubenswrapper[4854]: W1007 12:25:21.237775 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2200948_73a4_4ee6_bd24_d8f4f8e4418f.slice/crio-ed79b3a7b7b204173f640e3c7d6043f1c94587c975d7a8bff8142fb8a2f5093c WatchSource:0}: Error finding container ed79b3a7b7b204173f640e3c7d6043f1c94587c975d7a8bff8142fb8a2f5093c: Status 404 returned error can't find the container with id ed79b3a7b7b204173f640e3c7d6043f1c94587c975d7a8bff8142fb8a2f5093c Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.256830 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f83bc29b-b3be-4578-bae7-d2867242278c-bound-sa-token\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.273590 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcr6z\" (UniqueName: \"kubernetes.io/projected/63387347-a234-4371-a7b9-c70d0ef73574-kube-api-access-fcr6z\") pod \"etcd-operator-b45778765-pzzxz\" (UID: \"63387347-a234-4371-a7b9-c70d0ef73574\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pzzxz" Oct 07 12:25:21 crc kubenswrapper[4854]: W1007 12:25:21.288493 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-12c14fe994c1331ed43abef08f4e687c3495d51be106fe36fe89d193f80562b0 WatchSource:0}: Error finding container 12c14fe994c1331ed43abef08f4e687c3495d51be106fe36fe89d193f80562b0: Status 404 returned error can't find the container with id 12c14fe994c1331ed43abef08f4e687c3495d51be106fe36fe89d193f80562b0 Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.300078 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 12:25:21 crc kubenswrapper[4854]: E1007 12:25:21.300561 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-07 12:25:21.800526002 +0000 UTC m=+37.788358257 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.300656 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:21 crc kubenswrapper[4854]: E1007 12:25:21.301917 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 12:25:21.801895162 +0000 UTC m=+37.789727417 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-44m5s" (UID: "f83bc29b-b3be-4578-bae7-d2867242278c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.307688 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-vwn4j" Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.308510 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kpxh\" (UniqueName: \"kubernetes.io/projected/df20a6de-dc59-4af7-b284-9fef4b249be9-kube-api-access-2kpxh\") pod \"migrator-59844c95c7-wns76\" (UID: \"df20a6de-dc59-4af7-b284-9fef4b249be9\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wns76" Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.315638 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qxhv\" (UniqueName: \"kubernetes.io/projected/b2b320c8-834c-43b3-ab9b-c97bba4fb2b2-kube-api-access-8qxhv\") pod \"multus-admission-controller-857f4d67dd-p8fbp\" (UID: \"b2b320c8-834c-43b3-ab9b-c97bba4fb2b2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-p8fbp" Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.317591 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-n75xz"] Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.355157 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tkb2\" (UniqueName: \"kubernetes.io/projected/615e917b-47ba-4399-a549-496561ca164e-kube-api-access-5tkb2\") pod \"ingress-operator-5b745b69d9-rhcf8\" (UID: \"615e917b-47ba-4399-a549-496561ca164e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rhcf8" Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.357423 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hr45q" Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.365843 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kzt9x" Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.370446 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4cz6\" (UniqueName: \"kubernetes.io/projected/e83624ac-e8aa-429f-b313-8c5c4fc6c88e-kube-api-access-q4cz6\") pod \"openshift-controller-manager-operator-756b6f6bc6-tt4xp\" (UID: \"e83624ac-e8aa-429f-b313-8c5c4fc6c88e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tt4xp" Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.375368 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfbst\" (UniqueName: \"kubernetes.io/projected/329204da-6485-459a-bf30-0ae870c46ca2-kube-api-access-kfbst\") pod \"marketplace-operator-79b997595-nkh4b\" (UID: \"329204da-6485-459a-bf30-0ae870c46ca2\") " pod="openshift-marketplace/marketplace-operator-79b997595-nkh4b" Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.383009 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-p8fbp" Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.390670 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-nkh4b" Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.400119 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330655-nh22s" Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.407581 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 12:25:21 crc kubenswrapper[4854]: E1007 12:25:21.408236 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 12:25:21.908194481 +0000 UTC m=+37.896026746 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.411649 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-222k4\" (UniqueName: \"kubernetes.io/projected/6d0a97ca-97f0-4bc3-8ee2-bf043accc158-kube-api-access-222k4\") pod \"csi-hostpathplugin-xhvq4\" (UID: \"6d0a97ca-97f0-4bc3-8ee2-bf043accc158\") " pod="hostpath-provisioner/csi-hostpathplugin-xhvq4" Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.460588 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wns76" Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.470092 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tj4lv" Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.491854 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tt4xp" Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.499577 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dwgwz" Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.509879 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:21 crc kubenswrapper[4854]: E1007 12:25:21.510308 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 12:25:22.010288797 +0000 UTC m=+37.998121052 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-44m5s" (UID: "f83bc29b-b3be-4578-bae7-d2867242278c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.544188 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-pzzxz" Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.554258 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-xhvq4" Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.559386 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-t99kd"] Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.560139 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gf54n"] Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.616884 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 12:25:21 crc kubenswrapper[4854]: E1007 12:25:21.617758 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 12:25:22.11773463 +0000 UTC m=+38.105566885 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.624296 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-bw66g"] Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.636972 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-9cjsc"] Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.652094 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rhcf8" Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.682900 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-7frlz"] Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.722011 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:21 crc kubenswrapper[4854]: E1007 12:25:21.723288 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 12:25:22.223264807 +0000 UTC m=+38.211097062 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-44m5s" (UID: "f83bc29b-b3be-4578-bae7-d2867242278c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.734600 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-p2nbp" podStartSLOduration=17.734569078 podStartE2EDuration="17.734569078s" podCreationTimestamp="2025-10-07 12:25:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:25:21.729891431 +0000 UTC m=+37.717723686" watchObservedRunningTime="2025-10-07 12:25:21.734569078 +0000 UTC m=+37.722401323" Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.768259 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dc2nk"] Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.822855 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 12:25:21 crc kubenswrapper[4854]: E1007 12:25:21.825872 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 12:25:22.325836038 +0000 UTC m=+38.313668293 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.826561 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:21 crc kubenswrapper[4854]: E1007 12:25:21.827021 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 12:25:22.327011192 +0000 UTC m=+38.314843447 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-44m5s" (UID: "f83bc29b-b3be-4578-bae7-d2867242278c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:21 crc kubenswrapper[4854]: W1007 12:25:21.829312 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda28154d9_7951_4d75_80be_4bc5cac0ffee.slice/crio-e425b3cf9248bb1b385dfe91d4794ce3008c42ba198df43ac16ae73ddad6c2a6 WatchSource:0}: Error finding container e425b3cf9248bb1b385dfe91d4794ce3008c42ba198df43ac16ae73ddad6c2a6: Status 404 returned error can't find the container with id e425b3cf9248bb1b385dfe91d4794ce3008c42ba198df43ac16ae73ddad6c2a6 Oct 07 12:25:21 crc kubenswrapper[4854]: W1007 12:25:21.868223 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9afb6d4_a946_4c1c_995e_330f39a1f346.slice/crio-489018e4136ea6179e34721237724d8707676a1487f31121a703af9dc2095728 WatchSource:0}: Error finding container 489018e4136ea6179e34721237724d8707676a1487f31121a703af9dc2095728: Status 404 returned error can't find the container with id 489018e4136ea6179e34721237724d8707676a1487f31121a703af9dc2095728 Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.944018 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 12:25:21 crc kubenswrapper[4854]: E1007 12:25:21.944503 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-07 12:25:22.444477748 +0000 UTC m=+38.432310003 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.944764 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:21 crc kubenswrapper[4854]: E1007 12:25:21.945197 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 12:25:22.445189299 +0000 UTC m=+38.433021554 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-44m5s" (UID: "f83bc29b-b3be-4578-bae7-d2867242278c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:21 crc kubenswrapper[4854]: I1007 12:25:21.973092 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rv6rb"] Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.021233 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-kqnvp"] Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.026712 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-gh8fl"] Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.045642 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 12:25:22 crc kubenswrapper[4854]: E1007 12:25:22.046710 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 12:25:22.546682158 +0000 UTC m=+38.534514413 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.074910 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330655-nh22s"] Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.121028 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-972qh" podStartSLOduration=17.120993702 podStartE2EDuration="17.120993702s" podCreationTimestamp="2025-10-07 12:25:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:25:22.09768058 +0000 UTC m=+38.085512835" watchObservedRunningTime="2025-10-07 12:25:22.120993702 +0000 UTC m=+38.108825947" Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.127806 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-tsdfx"] Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.127898 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-p8fbp"] Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.148597 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:22 crc kubenswrapper[4854]: E1007 12:25:22.149427 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 12:25:22.649414843 +0000 UTC m=+38.637247098 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-44m5s" (UID: "f83bc29b-b3be-4578-bae7-d2867242278c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.188428 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-bw66g" event={"ID":"8552bac7-9651-4a2c-b65e-28b2c091a0f5","Type":"ContainerStarted","Data":"c5dac8fc4b8e5f92d3f2fdc610b96118bef9f6ec091b80d9e9eb7e2785b4faa6"} Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.206992 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vkkdd" event={"ID":"d9fa2d33-9f7a-4bbf-9c14-085ec357aafd","Type":"ContainerStarted","Data":"a6d38ef447f5f75555b668784acdc16f26854520a37c1ebf514e89547e18b3ad"} Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.207078 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vkkdd" event={"ID":"d9fa2d33-9f7a-4bbf-9c14-085ec357aafd","Type":"ContainerStarted","Data":"6d2e4d6513f52ba4fa8e55ec09e82f721d54bc11a6d96f8dd41866a9b2005617"} Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.208946 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-npmk7"] Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.237797 4854 generic.go:334] "Generic (PLEG): container finished" podID="c625acfa-7d7b-4b52-b3a2-55c5f817966b" containerID="a7903ddf3ebfc247d2de757bdec54d38a9fd166db149f875ee7a881bae8bb024" exitCode=0 Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.237887 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-r4lpq" event={"ID":"c625acfa-7d7b-4b52-b3a2-55c5f817966b","Type":"ContainerDied","Data":"a7903ddf3ebfc247d2de757bdec54d38a9fd166db149f875ee7a881bae8bb024"} Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.237924 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-r4lpq" event={"ID":"c625acfa-7d7b-4b52-b3a2-55c5f817966b","Type":"ContainerStarted","Data":"d4a2cf31b7383801d574bcd1bdba04748b432af05c9765318af1034a87207365"} Oct 07 12:25:22 crc kubenswrapper[4854]: W1007 12:25:22.240716 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb533ed73_fa56_4742_b10c_34beb63d4bba.slice/crio-dc50c23c5a27720522a2266fca6b338d6de7159cf45e030f9b9c35068ec63473 WatchSource:0}: Error finding container dc50c23c5a27720522a2266fca6b338d6de7159cf45e030f9b9c35068ec63473: Status 404 returned error can't find the container with id dc50c23c5a27720522a2266fca6b338d6de7159cf45e030f9b9c35068ec63473 Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.248542 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-24ztk" event={"ID":"16abc87c-7821-4c71-89c6-d523fec8e4a8","Type":"ContainerStarted","Data":"e9f7058e0f26adcf68105549d634efcc2c7c1f14efad91a2c7b8f90b26f16d20"} Oct 07 
12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.249798 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-24ztk" Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.251989 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-64q88" event={"ID":"43359b2f-63d6-4e81-888e-e219dca84ec3","Type":"ContainerStarted","Data":"54ffacbc5f93f3acbe3bc6ee7d88312fe52ab059177a06f4a447c29966186a1b"} Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.252045 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-64q88" event={"ID":"43359b2f-63d6-4e81-888e-e219dca84ec3","Type":"ContainerStarted","Data":"2a9a070225a64049ef093c6e98d67c8be9cafe96321168973118c91bd1893d3c"} Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.253115 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-64q88" Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.254330 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-6s458" event={"ID":"753a1abe-8f65-4721-ad3f-b207e3413ffa","Type":"ContainerStarted","Data":"0841c829e516db789b99eb47c1714434662b71c48a0c67251407ed29307c248e"} Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.254900 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-6s458" Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.256663 4854 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-24ztk container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.21:5443/healthz\": dial tcp 10.217.0.21:5443: connect: connection refused" start-of-body= Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.256705 4854 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-24ztk" podUID="16abc87c-7821-4c71-89c6-d523fec8e4a8" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.21:5443/healthz\": dial tcp 10.217.0.21:5443: connect: connection refused" Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.262321 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 12:25:22 crc kubenswrapper[4854]: E1007 12:25:22.263190 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 12:25:22.763135409 +0000 UTC m=+38.750967664 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.279383 4854 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-6s458 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.279386 4854 patch_prober.go:28] interesting pod/downloads-7954f5f757-64q88 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.279423 4854 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-6s458" podUID="753a1abe-8f65-4721-ad3f-b207e3413ffa" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.279475 4854 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-64q88" podUID="43359b2f-63d6-4e81-888e-e219dca84ec3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.308908 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-87pqd" event={"ID":"2c45dd65-0c11-4c29-8f62-d667bd61d974","Type":"ContainerStarted","Data":"9c29b9fce1b630c9742dd9195d65dcf3c9d269806911501797983305c3954b05"} Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.310600 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-87pqd" Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.332435 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dc2nk" event={"ID":"c9afb6d4-a946-4c1c-995e-330f39a1f346","Type":"ContainerStarted","Data":"489018e4136ea6179e34721237724d8707676a1487f31121a703af9dc2095728"} Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.368619 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:22 crc kubenswrapper[4854]: E1007 12:25:22.387487 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-07 12:25:22.887446326 +0000 UTC m=+38.875278591 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-44m5s" (UID: "f83bc29b-b3be-4578-bae7-d2867242278c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.394623 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-7frlz" event={"ID":"9fdfad7e-e478-4167-b2d6-5b8295354328","Type":"ContainerStarted","Data":"dfefb3b58fa2b87d0bdbdc36dca227bbe86606b505698b902c3c696e787f46ec"} Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.411750 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-hr45q"] Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.421356 4854 generic.go:334] "Generic (PLEG): container finished" podID="edc4e120-e930-4fbe-a871-bd63d3de85c9" containerID="8eba6c84dc9b9cf1a4f8b8d43809f13bb17d02d165acfea4444e3720162fcfc0" exitCode=0 Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.421480 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-c7nrq" event={"ID":"edc4e120-e930-4fbe-a871-bd63d3de85c9","Type":"ContainerDied","Data":"8eba6c84dc9b9cf1a4f8b8d43809f13bb17d02d165acfea4444e3720162fcfc0"} Oct 07 12:25:22 crc kubenswrapper[4854]: W1007 12:25:22.435197 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4d3106d_58e9_4cb6_bcfd_6e151b16969b.slice/crio-13bb31de8c9a34ad43690c838caccb0c04ad6d15e51cadf3be167a3fdb1a9b37 WatchSource:0}: Error finding container 13bb31de8c9a34ad43690c838caccb0c04ad6d15e51cadf3be167a3fdb1a9b37: Status 404 returned error can't find the container with id 13bb31de8c9a34ad43690c838caccb0c04ad6d15e51cadf3be167a3fdb1a9b37 Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.439820 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kzt9x"] Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.462918 4854 generic.go:334] "Generic (PLEG): container finished" podID="c4a5d028-220b-40aa-a907-95015d7880c8" containerID="cdbaf138fe052ffd4381dbe57fd774942f840f697866b178ac2cda42c4a26f3e" exitCode=0 Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.463023 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-nrwsd" event={"ID":"c4a5d028-220b-40aa-a907-95015d7880c8","Type":"ContainerDied","Data":"cdbaf138fe052ffd4381dbe57fd774942f840f697866b178ac2cda42c4a26f3e"} Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.463056 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-nrwsd" event={"ID":"c4a5d028-220b-40aa-a907-95015d7880c8","Type":"ContainerStarted","Data":"6c16aaca0693f927ed6459267881831525ce52f3b7c0856dbb6b3d127cb54666"} Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.470401 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 12:25:22 crc kubenswrapper[4854]: E1007 12:25:22.471198 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 12:25:22.971140704 +0000 UTC m=+38.958972959 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.473684 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-n75xz" event={"ID":"3c4aa321-2b80-4272-a6e3-ecfbefcbc10e","Type":"ContainerStarted","Data":"3c8f6ac21c7b7623b55508e6363c22137bc53611dfadf01cb75916667a983b2b"} Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.475756 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-9cjsc" event={"ID":"a5796307-74e5-4cb6-99db-b1ba95dacb54","Type":"ContainerStarted","Data":"a1aae0653c19084f5bed47cfc33a4b3081c9efd4c13c949be506d3b8e14500ed"} Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.477858 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-w6wh4" event={"ID":"f918bb49-d2c5-48f2-ae8f-235bf17c2328","Type":"ContainerStarted","Data":"9ce42aa208240add2248648ec9aaedbbaf42352c98bf90d4083a178bca184129"} Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.477880 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-w6wh4" event={"ID":"f918bb49-d2c5-48f2-ae8f-235bf17c2328","Type":"ContainerStarted","Data":"a6b2ef11c18ffb99f6cd73f52c3f51d61c0d9a626e59a171c88c38e3fa75f5c0"} Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.481606 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-brrjc" event={"ID":"2dcf6deb-727c-4615-ac08-6c9f7ded10f4","Type":"ContainerStarted","Data":"8e8f025504ed03e348eba2162f02d4cdbe1383bd13179835614972e9c6dedf6d"} Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.486178 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-w9n84" event={"ID":"98fd3935-23d7-4ee3-8325-222533852b78","Type":"ContainerStarted","Data":"a6e10632989ef8fc9015aac3450938a79fce512a7c448578bd3d56a0e16ccbd1"} Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.486247 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-w9n84" event={"ID":"98fd3935-23d7-4ee3-8325-222533852b78","Type":"ContainerStarted","Data":"286be4d40ec9e85b3b5f4e4965dbab996a6d257206d36eb11ed2204321afb471"} Oct 07 12:25:22 crc kubenswrapper[4854]: W1007 12:25:22.493275 4854 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f41f688_ca0b_4a67_99b6_7108b7980bcd.slice/crio-95ef419a7bfa9c3fd7444c799f8d90bf48ba6d6b84db48e74b0968e5d77bdcf7 WatchSource:0}: Error finding container 95ef419a7bfa9c3fd7444c799f8d90bf48ba6d6b84db48e74b0968e5d77bdcf7: Status 404 returned error can't find the container with id 95ef419a7bfa9c3fd7444c799f8d90bf48ba6d6b84db48e74b0968e5d77bdcf7 Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.496047 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-n644j" event={"ID":"e0f10c8d-51a0-4e7a-b7ab-af11699298af","Type":"ContainerStarted","Data":"df6246aa6c883e279cb8c9fc5a74ae74aa9d9acf2855a1979f86bb779ec7d379"} Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.500941 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-vwn4j" event={"ID":"3e06c8ed-2c8c-4c09-a352-31432fbd7d40","Type":"ContainerStarted","Data":"2f67d006864cc4fe132246b02b59392011d20bfb3335b1bd06418da479e56e96"} Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.506429 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gf54n" event={"ID":"cb27cf86-2629-4ee4-85d0-a56129166c65","Type":"ContainerStarted","Data":"1de16b7c4477de3e0601dbced35d9c53f536bd2577a9495c4cf43a021e7c62a2"} Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.509576 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qkppp" event={"ID":"f2200948-73a4-4ee6-bd24-d8f4f8e4418f","Type":"ContainerStarted","Data":"ed79b3a7b7b204173f640e3c7d6043f1c94587c975d7a8bff8142fb8a2f5093c"} Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.520627 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"4c67c9f2e19831a5ccc105fcf75476af620d1b1f8e9e7c07dabf866d2a093aaa"} Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.520713 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"ee39464b2a7c02c5b8fb312a097481618a87d3923448d3caf7455502d290fc1e"} Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.527419 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-557n4" podStartSLOduration=18.527300267 podStartE2EDuration="18.527300267s" podCreationTimestamp="2025-10-07 12:25:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:25:22.526418411 +0000 UTC m=+38.514250666" watchObservedRunningTime="2025-10-07 12:25:22.527300267 +0000 UTC m=+38.515132522" Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.538095 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-t99kd" event={"ID":"a28154d9-7951-4d75-80be-4bc5cac0ffee","Type":"ContainerStarted","Data":"e425b3cf9248bb1b385dfe91d4794ce3008c42ba198df43ac16ae73ddad6c2a6"} Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.584178 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:22 crc kubenswrapper[4854]: E1007 12:25:22.596381 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 12:25:23.096356207 +0000 UTC m=+39.084188502 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-44m5s" (UID: "f83bc29b-b3be-4578-bae7-d2867242278c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.600105 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-wns76"] Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.600193 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-nkh4b"] Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.685111 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"aa880e76fc2a7734d596e0ff10a6fe59695bb912a6b9b2baa0ee52676fddee11"} Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.685940 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 12:25:22 crc kubenswrapper[4854]: E1007 12:25:22.686964 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 12:25:23.186925536 +0000 UTC m=+39.174757791 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.687166 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:22 crc kubenswrapper[4854]: E1007 12:25:22.687768 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 12:25:23.18775178 +0000 UTC m=+39.175584045 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-44m5s" (UID: "f83bc29b-b3be-4578-bae7-d2867242278c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.694196 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"12c14fe994c1331ed43abef08f4e687c3495d51be106fe36fe89d193f80562b0"} Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.701226 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-xhnnw" event={"ID":"c1f51bac-5e94-403b-a197-bf85caf57f23","Type":"ContainerStarted","Data":"18b1414275709c7fc71583bed7a73866cf4e5e4a3faf0d9b68809f3076d3cb99"} Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.715098 4854 patch_prober.go:28] interesting pod/console-operator-58897d9998-557n4 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/readyz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.715281 4854 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-557n4" podUID="15e62a94-cdbc-4474-9bb0-f62de50bd5e7" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.8:8443/readyz\": dial tcp 10.217.0.8:8443: connect: connection refused" Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.789818 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 12:25:22 crc 
kubenswrapper[4854]: E1007 12:25:22.791505 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 12:25:23.291476304 +0000 UTC m=+39.279308559 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.895249 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:22 crc kubenswrapper[4854]: E1007 12:25:22.895808 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 12:25:23.395793556 +0000 UTC m=+39.383625811 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-44m5s" (UID: "f83bc29b-b3be-4578-bae7-d2867242278c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.942911 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-972qh" Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.943605 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-87pqd" Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.970945 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-w6wh4" podStartSLOduration=4.970926223 podStartE2EDuration="4.970926223s" podCreationTimestamp="2025-10-07 12:25:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:25:22.927309398 +0000 UTC m=+38.915141653" watchObservedRunningTime="2025-10-07 12:25:22.970926223 +0000 UTC m=+38.958758478" Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.973882 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-64q88" podStartSLOduration=18.97387204 podStartE2EDuration="18.97387204s" podCreationTimestamp="2025-10-07 12:25:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 
12:25:22.969170242 +0000 UTC m=+38.957002507" watchObservedRunningTime="2025-10-07 12:25:22.97387204 +0000 UTC m=+38.961704315" Oct 07 12:25:22 crc kubenswrapper[4854]: I1007 12:25:22.998866 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 12:25:22 crc kubenswrapper[4854]: E1007 12:25:22.999768 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 12:25:23.499750597 +0000 UTC m=+39.487582852 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:23 crc kubenswrapper[4854]: I1007 12:25:23.070861 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-6s458" podStartSLOduration=18.070842206 podStartE2EDuration="18.070842206s" podCreationTimestamp="2025-10-07 12:25:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:25:23.068638242 +0000 UTC m=+39.056470497" watchObservedRunningTime="2025-10-07 12:25:23.070842206 +0000 UTC m=+39.058674461" Oct 07 12:25:23 crc kubenswrapper[4854]: I1007 12:25:23.102171 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:23 crc kubenswrapper[4854]: E1007 12:25:23.102550 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 12:25:23.602533963 +0000 UTC m=+39.590366218 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-44m5s" (UID: "f83bc29b-b3be-4578-bae7-d2867242278c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:23 crc kubenswrapper[4854]: I1007 12:25:23.118999 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-24ztk" podStartSLOduration=18.118976034 podStartE2EDuration="18.118976034s" podCreationTimestamp="2025-10-07 12:25:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:25:23.114131802 +0000 UTC m=+39.101964057" watchObservedRunningTime="2025-10-07 12:25:23.118976034 +0000 UTC m=+39.106808289" Oct 07 12:25:23 crc kubenswrapper[4854]: I1007 12:25:23.204469 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 12:25:23 crc kubenswrapper[4854]: E1007 12:25:23.205697 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 12:25:23.70567775 +0000 UTC m=+39.693510005 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:23 crc kubenswrapper[4854]: I1007 12:25:23.237638 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-xhnnw" podStartSLOduration=18.237618065 podStartE2EDuration="18.237618065s" podCreationTimestamp="2025-10-07 12:25:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:25:23.235303507 +0000 UTC m=+39.223135772" watchObservedRunningTime="2025-10-07 12:25:23.237618065 +0000 UTC m=+39.225450320" Oct 07 12:25:23 crc kubenswrapper[4854]: I1007 12:25:23.258121 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tj4lv"] Oct 07 12:25:23 crc kubenswrapper[4854]: I1007 12:25:23.309564 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:23 crc kubenswrapper[4854]: E1007 12:25:23.309944 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 12:25:23.8099294 +0000 UTC m=+39.797761645 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-44m5s" (UID: "f83bc29b-b3be-4578-bae7-d2867242278c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:23 crc kubenswrapper[4854]: I1007 12:25:23.310657 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f0099a86-9473-4618-9266-2eb460d09150-metrics-certs\") pod \"network-metrics-daemon-m6k45\" (UID: \"f0099a86-9473-4618-9266-2eb460d09150\") " pod="openshift-multus/network-metrics-daemon-m6k45" Oct 07 12:25:23 crc kubenswrapper[4854]: I1007 12:25:23.318783 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-w9n84" podStartSLOduration=18.318755688 podStartE2EDuration="18.318755688s" podCreationTimestamp="2025-10-07 12:25:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:25:23.31404743 +0000 UTC m=+39.301879685" watchObservedRunningTime="2025-10-07 12:25:23.318755688 +0000 UTC m=+39.306587943" Oct 07 12:25:23 crc kubenswrapper[4854]: I1007 12:25:23.347179 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f0099a86-9473-4618-9266-2eb460d09150-metrics-certs\") pod \"network-metrics-daemon-m6k45\" (UID: \"f0099a86-9473-4618-9266-2eb460d09150\") " pod="openshift-multus/network-metrics-daemon-m6k45" Oct 07 12:25:23 crc kubenswrapper[4854]: I1007 12:25:23.394392 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-n644j" podStartSLOduration=19.39436149 podStartE2EDuration="19.39436149s" podCreationTimestamp="2025-10-07 12:25:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:25:23.389115156 +0000 UTC m=+39.376947431" watchObservedRunningTime="2025-10-07 12:25:23.39436149 +0000 UTC m=+39.382193735" Oct 07 12:25:23 crc kubenswrapper[4854]: I1007 12:25:23.395265 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-87pqd" podStartSLOduration=19.395258466 podStartE2EDuration="19.395258466s" podCreationTimestamp="2025-10-07 12:25:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:25:23.353102663 +0000 UTC m=+39.340934918" watchObservedRunningTime="2025-10-07 12:25:23.395258466 +0000 UTC m=+39.383090721" Oct 07 12:25:23 crc kubenswrapper[4854]: I1007 12:25:23.431664 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-brrjc" podStartSLOduration=19.43163326 podStartE2EDuration="19.43163326s" podCreationTimestamp="2025-10-07 12:25:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:25:23.430396794 +0000 UTC m=+39.418229049" watchObservedRunningTime="2025-10-07 
12:25:23.43163326 +0000 UTC m=+39.419465515" Oct 07 12:25:23 crc kubenswrapper[4854]: I1007 12:25:23.432011 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 12:25:23 crc kubenswrapper[4854]: E1007 12:25:23.432654 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 12:25:23.932631209 +0000 UTC m=+39.920463464 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:23 crc kubenswrapper[4854]: I1007 12:25:23.518277 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m6k45" Oct 07 12:25:23 crc kubenswrapper[4854]: I1007 12:25:23.535430 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:23 crc kubenswrapper[4854]: E1007 12:25:23.536195 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 12:25:24.036176478 +0000 UTC m=+40.024008733 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-44m5s" (UID: "f83bc29b-b3be-4578-bae7-d2867242278c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:23 crc kubenswrapper[4854]: I1007 12:25:23.581357 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dwgwz"] Oct 07 12:25:23 crc kubenswrapper[4854]: I1007 12:25:23.605785 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-pzzxz"] Oct 07 12:25:23 crc kubenswrapper[4854]: I1007 12:25:23.606172 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-rhcf8"] Oct 07 12:25:23 crc kubenswrapper[4854]: I1007 12:25:23.614483 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-xhnnw" Oct 07 12:25:23 crc kubenswrapper[4854]: I1007 12:25:23.623543 4854 patch_prober.go:28] interesting pod/router-default-5444994796-xhnnw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 12:25:23 crc kubenswrapper[4854]: [-]has-synced failed: reason withheld Oct 07 12:25:23 crc kubenswrapper[4854]: [+]process-running ok Oct 07 12:25:23 crc kubenswrapper[4854]: healthz check failed Oct 07 12:25:23 crc kubenswrapper[4854]: I1007 12:25:23.623759 4854 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xhnnw" podUID="c1f51bac-5e94-403b-a197-bf85caf57f23" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 12:25:23 crc kubenswrapper[4854]: I1007 12:25:23.640423 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 12:25:23 crc kubenswrapper[4854]: E1007 12:25:23.644513 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 12:25:24.144488596 +0000 UTC m=+40.132320851 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:23 crc kubenswrapper[4854]: I1007 12:25:23.662238 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-xhvq4"] Oct 07 12:25:23 crc kubenswrapper[4854]: I1007 12:25:23.665330 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tt4xp"] Oct 07 12:25:23 crc kubenswrapper[4854]: I1007 12:25:23.727950 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gf54n" event={"ID":"cb27cf86-2629-4ee4-85d0-a56129166c65","Type":"ContainerStarted","Data":"8778e82304c53b1e206e0e7d1e5c9cade2413b03b4a827a807e43b0b5590f073"} Oct 07 12:25:23 crc kubenswrapper[4854]: I1007 12:25:23.729510 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kzt9x" event={"ID":"4c115670-445c-4d56-83ed-9e8155a08d89","Type":"ContainerStarted","Data":"7ea757e79d289dcc5505f5bf454a620d553c5e2e9d3d499a5e456305a0513d42"} Oct 07 12:25:23 crc kubenswrapper[4854]: I1007 12:25:23.746111 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:23 crc kubenswrapper[4854]: E1007 12:25:23.746654 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 12:25:24.246637323 +0000 UTC m=+40.234469578 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-44m5s" (UID: "f83bc29b-b3be-4578-bae7-d2867242278c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:23 crc kubenswrapper[4854]: I1007 12:25:23.761966 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gh8fl" event={"ID":"c35b7e55-a7c4-47ed-a58a-559247edfaca","Type":"ContainerStarted","Data":"55391e9987b0304d388c3b173ff3a322285f3a0ec3a7ad03f70a420cccb5d940"} Oct 07 12:25:23 crc kubenswrapper[4854]: I1007 12:25:23.762054 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gh8fl" event={"ID":"c35b7e55-a7c4-47ed-a58a-559247edfaca","Type":"ContainerStarted","Data":"50826e08643446e2b41516704cf05cb6f5b02f0949789a3e1ee97eea5b06da3e"} Oct 07 12:25:23 crc kubenswrapper[4854]: I1007 12:25:23.782483 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-t99kd" event={"ID":"a28154d9-7951-4d75-80be-4bc5cac0ffee","Type":"ContainerStarted","Data":"6fe680e95c71577b818b9f54d3ed1241458aaa006abc76580d40e275c33008bd"} Oct 07 12:25:23 crc kubenswrapper[4854]: I1007 12:25:23.836254 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"be2676dfd3b63e436e04f67435ee48071680d145288a8b2165065d6c55f8d9ff"} Oct 07 12:25:23 crc kubenswrapper[4854]: I1007 12:25:23.837293 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 12:25:23 crc kubenswrapper[4854]: E1007 12:25:23.853495 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 12:25:24.353456318 +0000 UTC m=+40.341288563 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:23 crc kubenswrapper[4854]: I1007 12:25:23.854251 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 12:25:23 crc kubenswrapper[4854]: I1007 12:25:23.854604 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:23 crc kubenswrapper[4854]: E1007 12:25:23.855089 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 12:25:24.355074295 +0000 UTC m=+40.342906550 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-44m5s" (UID: "f83bc29b-b3be-4578-bae7-d2867242278c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:23 crc kubenswrapper[4854]: I1007 12:25:23.871986 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-9cjsc" event={"ID":"a5796307-74e5-4cb6-99db-b1ba95dacb54","Type":"ContainerStarted","Data":"dc06d1a97fc67438f36a0185d1fc220345e6e9084964dad503012af7e5bb3a61"} Oct 07 12:25:23 crc kubenswrapper[4854]: I1007 12:25:23.927584 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-p8fbp" event={"ID":"b2b320c8-834c-43b3-ab9b-c97bba4fb2b2","Type":"ContainerStarted","Data":"3911d5d69a2283ef6f541aba868f891b03192e43a343410c9ff0093f8e22bf0a"} Oct 07 12:25:23 crc kubenswrapper[4854]: I1007 12:25:23.962214 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 12:25:23 crc kubenswrapper[4854]: E1007 12:25:23.964125 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-07 12:25:24.464103234 +0000 UTC m=+40.451935499 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:23 crc kubenswrapper[4854]: I1007 12:25:23.994770 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-9cjsc" podStartSLOduration=19.994747421 podStartE2EDuration="19.994747421s" podCreationTimestamp="2025-10-07 12:25:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:25:23.993971118 +0000 UTC m=+39.981803373" watchObservedRunningTime="2025-10-07 12:25:23.994747421 +0000 UTC m=+39.982579676" Oct 07 12:25:24 crc kubenswrapper[4854]: I1007 12:25:24.020959 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qkppp" event={"ID":"f2200948-73a4-4ee6-bd24-d8f4f8e4418f","Type":"ContainerStarted","Data":"ea313603f1ada85271cb23d63ae031cca8f6dc978415965623b15af47eddc80f"} Oct 07 12:25:24 crc kubenswrapper[4854]: I1007 12:25:24.046969 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-kqnvp" event={"ID":"d06ab521-e811-4ddf-93ec-d076d089db0c","Type":"ContainerStarted","Data":"e5d54bddc22cc155f8b2d551df029d37b2ff5033e96dccf1bab569742f7898aa"} Oct 07 12:25:24 crc kubenswrapper[4854]: I1007 12:25:24.064486 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:24 crc kubenswrapper[4854]: E1007 12:25:24.066017 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 12:25:24.565986305 +0000 UTC m=+40.553818560 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-44m5s" (UID: "f83bc29b-b3be-4578-bae7-d2867242278c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:24 crc kubenswrapper[4854]: I1007 12:25:24.083168 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qkppp" podStartSLOduration=19.083127666 podStartE2EDuration="19.083127666s" podCreationTimestamp="2025-10-07 12:25:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:25:24.07540618 +0000 UTC m=+40.063238435" watchObservedRunningTime="2025-10-07 12:25:24.083127666 +0000 UTC m=+40.070959911" Oct 07 12:25:24 crc kubenswrapper[4854]: I1007 12:25:24.134247 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-kqnvp" podStartSLOduration=6.1342101 podStartE2EDuration="6.1342101s" podCreationTimestamp="2025-10-07 12:25:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:25:24.131719287 +0000 UTC m=+40.119551542" watchObservedRunningTime="2025-10-07 12:25:24.1342101 +0000 UTC m=+40.122042355" Oct 07 12:25:24 crc kubenswrapper[4854]: I1007 12:25:24.155854 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tj4lv" event={"ID":"a791f903-64e0-4312-a4ac-f5639200f261","Type":"ContainerStarted","Data":"172f62630abb981371fad7ba7ffed078c6599581c856b5f09a97a47f3889b518"} Oct 07 12:25:24 crc kubenswrapper[4854]: I1007 12:25:24.169829 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 12:25:24 crc kubenswrapper[4854]: E1007 12:25:24.170518 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 12:25:24.670484501 +0000 UTC m=+40.658316756 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:24 crc kubenswrapper[4854]: I1007 12:25:24.177929 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"3f9eb8431903921d0bd6e6fbac1ed50cba4548000369ea10168dc2e2b200b9af"} Oct 07 12:25:24 crc kubenswrapper[4854]: I1007 12:25:24.253814 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wns76" event={"ID":"df20a6de-dc59-4af7-b284-9fef4b249be9","Type":"ContainerStarted","Data":"e4103595fdd6190cf2d2c4343b2858833257e7d84a9516e084f34522e9b5cd5c"} Oct 07 12:25:24 crc kubenswrapper[4854]: I1007 12:25:24.256379 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hr45q" event={"ID":"0f41f688-ca0b-4a67-99b6-7108b7980bcd","Type":"ContainerStarted","Data":"95ef419a7bfa9c3fd7444c799f8d90bf48ba6d6b84db48e74b0968e5d77bdcf7"} Oct 07 12:25:24 crc kubenswrapper[4854]: I1007 12:25:24.258025 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dc2nk" event={"ID":"c9afb6d4-a946-4c1c-995e-330f39a1f346","Type":"ContainerStarted","Data":"810a45b1318075b2ad3465938412696be43ff4e6d9004b5c994f1f29e9e2c457"} Oct 07 12:25:24 crc kubenswrapper[4854]: I1007 12:25:24.271285 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:24 crc kubenswrapper[4854]: E1007 12:25:24.277747 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 12:25:24.777726958 +0000 UTC m=+40.765559213 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-44m5s" (UID: "f83bc29b-b3be-4578-bae7-d2867242278c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:24 crc kubenswrapper[4854]: I1007 12:25:24.296816 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dc2nk" podStartSLOduration=19.296797586 podStartE2EDuration="19.296797586s" podCreationTimestamp="2025-10-07 12:25:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:25:24.296656572 +0000 UTC m=+40.284488817" watchObservedRunningTime="2025-10-07 12:25:24.296797586 +0000 UTC m=+40.284629841" Oct 07 12:25:24 crc kubenswrapper[4854]: I1007 12:25:24.322625 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-vwn4j" event={"ID":"3e06c8ed-2c8c-4c09-a352-31432fbd7d40","Type":"ContainerStarted","Data":"f3648a5c341ce89898f7d66f52949e8fd83866185edfb80a8936501d40f83cee"} Oct 07 12:25:24 crc kubenswrapper[4854]: I1007 12:25:24.323208 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-vwn4j" Oct 07 12:25:24 crc kubenswrapper[4854]: I1007 12:25:24.368095 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-n75xz" event={"ID":"3c4aa321-2b80-4272-a6e3-ecfbefcbc10e","Type":"ContainerStarted","Data":"8bd9653b85470038aa9c1f5744eb6b24cc43e644a17bb73a5b888e9bdaa1e730"} Oct 07 12:25:24 crc kubenswrapper[4854]: I1007 12:25:24.371427 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-vwn4j" podStartSLOduration=7.371414829 podStartE2EDuration="7.371414829s" podCreationTimestamp="2025-10-07 12:25:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:25:24.370731719 +0000 UTC m=+40.358563974" watchObservedRunningTime="2025-10-07 12:25:24.371414829 +0000 UTC m=+40.359247074" Oct 07 12:25:24 crc kubenswrapper[4854]: I1007 12:25:24.380396 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 12:25:24 crc kubenswrapper[4854]: E1007 12:25:24.404171 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 12:25:24.904103375 +0000 UTC m=+40.891935630 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:24 crc kubenswrapper[4854]: I1007 12:25:24.414985 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-npmk7" event={"ID":"8bfa807d-ccf5-4c95-9912-9b039b0bad42","Type":"ContainerStarted","Data":"e97c695a3c55ccc4cbf63c0a9086237590914c39f3f5c373e7c8941425f4507d"} Oct 07 12:25:24 crc kubenswrapper[4854]: I1007 12:25:24.415085 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-npmk7" event={"ID":"8bfa807d-ccf5-4c95-9912-9b039b0bad42","Type":"ContainerStarted","Data":"9cfb767eaf5f4f31e87a801275553422df7c56770f12e631afe4fd2c67490384"} Oct 07 12:25:24 crc kubenswrapper[4854]: I1007 12:25:24.415660 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-npmk7" Oct 07 12:25:24 crc kubenswrapper[4854]: I1007 12:25:24.422394 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rv6rb" event={"ID":"b533ed73-fa56-4742-b10c-34beb63d4bba","Type":"ContainerStarted","Data":"dc50c23c5a27720522a2266fca6b338d6de7159cf45e030f9b9c35068ec63473"} Oct 07 12:25:24 crc kubenswrapper[4854]: I1007 12:25:24.453345 4854 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-npmk7 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.41:8443/healthz\": dial tcp 10.217.0.41:8443: connect: connection refused" start-of-body= Oct 07 12:25:24 crc kubenswrapper[4854]: I1007 12:25:24.453805 4854 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-npmk7" podUID="8bfa807d-ccf5-4c95-9912-9b039b0bad42" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.41:8443/healthz\": dial tcp 10.217.0.41:8443: connect: connection refused" Oct 07 12:25:24 crc kubenswrapper[4854]: I1007 12:25:24.484686 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vkkdd" event={"ID":"d9fa2d33-9f7a-4bbf-9c14-085ec357aafd","Type":"ContainerStarted","Data":"f542cbe262a754319fc46ab4076b0de6a86988e624b9e141422123c22da083c3"} Oct 07 12:25:24 crc kubenswrapper[4854]: I1007 12:25:24.485938 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:24 crc kubenswrapper[4854]: E1007 12:25:24.494452 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-07 12:25:24.994426607 +0000 UTC m=+40.982258862 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-44m5s" (UID: "f83bc29b-b3be-4578-bae7-d2867242278c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:24 crc kubenswrapper[4854]: I1007 12:25:24.505512 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-npmk7" podStartSLOduration=19.505489531 podStartE2EDuration="19.505489531s" podCreationTimestamp="2025-10-07 12:25:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:25:24.504014778 +0000 UTC m=+40.491847033" watchObservedRunningTime="2025-10-07 12:25:24.505489531 +0000 UTC m=+40.493321786" Oct 07 12:25:24 crc kubenswrapper[4854]: I1007 12:25:24.509409 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-n75xz" podStartSLOduration=19.509392925 podStartE2EDuration="19.509392925s" podCreationTimestamp="2025-10-07 12:25:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:25:24.460172475 +0000 UTC m=+40.448004730" watchObservedRunningTime="2025-10-07 12:25:24.509392925 +0000 UTC m=+40.497225180" Oct 07 12:25:24 crc kubenswrapper[4854]: I1007 12:25:24.534588 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330655-nh22s" event={"ID":"8636dc3b-78f1-4098-81ca-1ae0ef4c441b","Type":"ContainerStarted","Data":"83c7d7761eed2717d67730fef786b60ebccf645f50e1bde8cb11ec56fa3300ef"} Oct 07 12:25:24 crc kubenswrapper[4854]: I1007 12:25:24.534654 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330655-nh22s" event={"ID":"8636dc3b-78f1-4098-81ca-1ae0ef4c441b","Type":"ContainerStarted","Data":"30993f8078b16ac73b4d54090af217659a8d8f9092d30a2d8b8ba13513e649de"} Oct 07 12:25:24 crc kubenswrapper[4854]: I1007 12:25:24.547401 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vkkdd" podStartSLOduration=20.547378056 podStartE2EDuration="20.547378056s" podCreationTimestamp="2025-10-07 12:25:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:25:24.545244444 +0000 UTC m=+40.533076699" watchObservedRunningTime="2025-10-07 12:25:24.547378056 +0000 UTC m=+40.535210311" Oct 07 12:25:24 crc kubenswrapper[4854]: I1007 12:25:24.557139 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-m6k45"] Oct 07 12:25:24 crc kubenswrapper[4854]: I1007 12:25:24.577356 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-nkh4b" 
event={"ID":"329204da-6485-459a-bf30-0ae870c46ca2","Type":"ContainerStarted","Data":"0762e223565550e4607d4d0cfa4a10e11783430320c8bf8ff08ab67d03b93d50"} Oct 07 12:25:24 crc kubenswrapper[4854]: I1007 12:25:24.578291 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-nkh4b" Oct 07 12:25:24 crc kubenswrapper[4854]: I1007 12:25:24.583520 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29330655-nh22s" podStartSLOduration=20.583482832 podStartE2EDuration="20.583482832s" podCreationTimestamp="2025-10-07 12:25:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:25:24.582777192 +0000 UTC m=+40.570609457" watchObservedRunningTime="2025-10-07 12:25:24.583482832 +0000 UTC m=+40.571315087" Oct 07 12:25:24 crc kubenswrapper[4854]: I1007 12:25:24.585576 4854 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-nkh4b container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/healthz\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Oct 07 12:25:24 crc kubenswrapper[4854]: I1007 12:25:24.585661 4854 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-nkh4b" podUID="329204da-6485-459a-bf30-0ae870c46ca2" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.22:8080/healthz\": dial tcp 10.217.0.22:8080: connect: connection refused" Oct 07 12:25:24 crc kubenswrapper[4854]: I1007 12:25:24.585824 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-tsdfx" event={"ID":"a4d3106d-58e9-4cb6-bcfd-6e151b16969b","Type":"ContainerStarted","Data":"13bb31de8c9a34ad43690c838caccb0c04ad6d15e51cadf3be167a3fdb1a9b37"} Oct 07 12:25:24 crc kubenswrapper[4854]: I1007 12:25:24.587554 4854 patch_prober.go:28] interesting pod/downloads-7954f5f757-64q88 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Oct 07 12:25:24 crc kubenswrapper[4854]: I1007 12:25:24.587579 4854 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-64q88" podUID="43359b2f-63d6-4e81-888e-e219dca84ec3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Oct 07 12:25:24 crc kubenswrapper[4854]: I1007 12:25:24.589994 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 12:25:24 crc kubenswrapper[4854]: E1007 12:25:24.592606 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 12:25:25.092576748 +0000 UTC m=+41.080409003 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:24 crc kubenswrapper[4854]: I1007 12:25:24.593018 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:24 crc kubenswrapper[4854]: E1007 12:25:24.597502 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 12:25:25.097481092 +0000 UTC m=+41.085313347 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-44m5s" (UID: "f83bc29b-b3be-4578-bae7-d2867242278c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:24 crc kubenswrapper[4854]: I1007 12:25:24.604028 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-557n4" Oct 07 12:25:24 crc kubenswrapper[4854]: I1007 12:25:24.607759 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-6s458" Oct 07 12:25:24 crc kubenswrapper[4854]: I1007 12:25:24.613413 4854 patch_prober.go:28] interesting pod/router-default-5444994796-xhnnw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 12:25:24 crc kubenswrapper[4854]: [-]has-synced failed: reason withheld Oct 07 12:25:24 crc kubenswrapper[4854]: [+]process-running ok Oct 07 12:25:24 crc kubenswrapper[4854]: healthz check failed Oct 07 12:25:24 crc kubenswrapper[4854]: I1007 12:25:24.613483 4854 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xhnnw" podUID="c1f51bac-5e94-403b-a197-bf85caf57f23" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 12:25:24 crc kubenswrapper[4854]: I1007 12:25:24.630961 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-nkh4b" podStartSLOduration=19.630947591 podStartE2EDuration="19.630947591s" podCreationTimestamp="2025-10-07 12:25:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:25:24.627749927 +0000 UTC m=+40.615582182" watchObservedRunningTime="2025-10-07 12:25:24.630947591 +0000 UTC m=+40.618779846" Oct 07 
12:25:24 crc kubenswrapper[4854]: I1007 12:25:24.665309 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-vwn4j" Oct 07 12:25:24 crc kubenswrapper[4854]: I1007 12:25:24.695012 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 12:25:24 crc kubenswrapper[4854]: E1007 12:25:24.701107 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 12:25:25.201075412 +0000 UTC m=+41.188907667 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:24 crc kubenswrapper[4854]: I1007 12:25:24.709367 4854 scope.go:117] "RemoveContainer" containerID="d3d145a2ec17b5518fe133d5ca43ac1e7a6a8b4ab17a95fd32d3a0c23b4be70a" Oct 07 12:25:24 crc kubenswrapper[4854]: I1007 12:25:24.806375 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:24 crc kubenswrapper[4854]: I1007 12:25:24.807495 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-24ztk" Oct 07 12:25:24 crc kubenswrapper[4854]: E1007 12:25:24.807726 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 12:25:25.307707221 +0000 UTC m=+41.295539496 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-44m5s" (UID: "f83bc29b-b3be-4578-bae7-d2867242278c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:24 crc kubenswrapper[4854]: W1007 12:25:24.815438 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0099a86_9473_4618_9266_2eb460d09150.slice/crio-eff640e4cd5d2c8d6f8274c7ffec23a1db545fa0186471fd0f9f60c41af06bee WatchSource:0}: Error finding container eff640e4cd5d2c8d6f8274c7ffec23a1db545fa0186471fd0f9f60c41af06bee: Status 404 returned error can't find the container with id eff640e4cd5d2c8d6f8274c7ffec23a1db545fa0186471fd0f9f60c41af06bee Oct 07 12:25:24 crc kubenswrapper[4854]: I1007 12:25:24.914445 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 12:25:24 crc kubenswrapper[4854]: E1007 12:25:24.914872 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 12:25:25.414855595 +0000 UTC m=+41.402687850 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:25 crc kubenswrapper[4854]: I1007 12:25:25.025232 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:25 crc kubenswrapper[4854]: E1007 12:25:25.025813 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 12:25:25.52579177 +0000 UTC m=+41.513624015 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-44m5s" (UID: "f83bc29b-b3be-4578-bae7-d2867242278c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:25 crc kubenswrapper[4854]: I1007 12:25:25.127895 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 12:25:25 crc kubenswrapper[4854]: E1007 12:25:25.128373 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 12:25:25.62834801 +0000 UTC m=+41.616180255 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:25 crc kubenswrapper[4854]: I1007 12:25:25.230303 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:25 crc kubenswrapper[4854]: E1007 12:25:25.230734 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 12:25:25.730719025 +0000 UTC m=+41.718551280 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-44m5s" (UID: "f83bc29b-b3be-4578-bae7-d2867242278c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:25 crc kubenswrapper[4854]: E1007 12:25:25.334129 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 12:25:25.834106469 +0000 UTC m=+41.821938714 (durationBeforeRetry 500ms). 
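Every MountDevice/TearDownAt failure in this stretch reduces to the same condition: the kubelet only executes CSI operations for drivers whose node plugin has completed plugin registration, and kubevirt.io.hostpath-provisioner has not registered yet, so the image-registry mount and the teardown for pod 8f668bae-… are rejected and re-queued. The csi-hostpathplugin pod's containers start later in this log, after which the driver should appear in the node's registered list and these errors should stop. The sketch below is one way to confirm registration from outside the node with client-go: it reads the CSINode object for the host and the cluster's CSIDriver list. The kubeconfig path is a placeholder; the node name "crc" is taken from the hostname in these log lines.

```go
// csicheck: list the CSI drivers the kubelet on node "crc" has registered
// (CSINode.spec.drivers) and the CSIDriver objects known to the cluster.
// Minimal sketch for illustration; the kubeconfig path is an assumption.
package main

import (
	"context"
	"fmt"
	"log"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", "/path/to/kubeconfig") // placeholder path
	if err != nil {
		log.Fatal(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		log.Fatal(err)
	}
	ctx := context.Background()

	// Drivers whose node plugins have completed kubelet plugin registration on this node.
	csiNode, err := cs.StorageV1().CSINodes().Get(ctx, "crc", metav1.GetOptions{})
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println("drivers registered on node crc:")
	for _, d := range csiNode.Spec.Drivers {
		fmt.Printf("  %s (nodeID=%s)\n", d.Name, d.NodeID)
	}

	// Cluster-scoped CSIDriver objects (installed drivers, whether or not registered yet).
	drivers, err := cs.StorageV1().CSIDrivers().List(ctx, metav1.ListOptions{})
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println("CSIDriver objects in the cluster:")
	for _, d := range drivers.Items {
		fmt.Printf("  %s\n", d.Name)
	}
}
```

Until kubevirt.io.hostpath-provisioner shows up in the node's driver list, the mount for image-registry-697d97f7c8-44m5s and the unmount for pod 8f668bae-… keep failing exactly as logged.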
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:25 crc kubenswrapper[4854]: I1007 12:25:25.334008 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 12:25:25 crc kubenswrapper[4854]: I1007 12:25:25.335104 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:25 crc kubenswrapper[4854]: E1007 12:25:25.335641 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 12:25:25.835630034 +0000 UTC m=+41.823462289 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-44m5s" (UID: "f83bc29b-b3be-4578-bae7-d2867242278c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:25 crc kubenswrapper[4854]: I1007 12:25:25.443439 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 12:25:25 crc kubenswrapper[4854]: E1007 12:25:25.444058 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 12:25:25.944035025 +0000 UTC m=+41.931867280 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:25 crc kubenswrapper[4854]: I1007 12:25:25.565998 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:25 crc kubenswrapper[4854]: E1007 12:25:25.566874 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 12:25:26.066861807 +0000 UTC m=+42.054694062 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-44m5s" (UID: "f83bc29b-b3be-4578-bae7-d2867242278c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:25 crc kubenswrapper[4854]: I1007 12:25:25.624422 4854 patch_prober.go:28] interesting pod/router-default-5444994796-xhnnw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 12:25:25 crc kubenswrapper[4854]: [-]has-synced failed: reason withheld Oct 07 12:25:25 crc kubenswrapper[4854]: [+]process-running ok Oct 07 12:25:25 crc kubenswrapper[4854]: healthz check failed Oct 07 12:25:25 crc kubenswrapper[4854]: I1007 12:25:25.624461 4854 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xhnnw" podUID="c1f51bac-5e94-403b-a197-bf85caf57f23" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 12:25:25 crc kubenswrapper[4854]: I1007 12:25:25.669634 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rhcf8" event={"ID":"615e917b-47ba-4399-a549-496561ca164e","Type":"ContainerStarted","Data":"6099da2e98137eca522a8e203fef0ce7e1884d1a1511b01d062eafd37ee0ceb0"} Oct 07 12:25:25 crc kubenswrapper[4854]: I1007 12:25:25.669732 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rhcf8" event={"ID":"615e917b-47ba-4399-a549-496561ca164e","Type":"ContainerStarted","Data":"bbecd688bf4a7f1038479fdd9388afc0035ac4eeec776f353a877618933978c6"} Oct 07 12:25:25 crc kubenswrapper[4854]: I1007 12:25:25.670393 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 12:25:25 crc kubenswrapper[4854]: E1007 12:25:25.672163 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 12:25:26.172108736 +0000 UTC m=+42.159940991 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:25 crc kubenswrapper[4854]: I1007 12:25:25.672364 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:25 crc kubenswrapper[4854]: E1007 12:25:25.672734 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 12:25:26.172715724 +0000 UTC m=+42.160547979 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-44m5s" (UID: "f83bc29b-b3be-4578-bae7-d2867242278c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:25 crc kubenswrapper[4854]: I1007 12:25:25.683562 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-nkh4b" event={"ID":"329204da-6485-459a-bf30-0ae870c46ca2","Type":"ContainerStarted","Data":"33fcf1ba3846e030da024080cd90003baf5c10189156f3e2006d7d778b9aa2bb"} Oct 07 12:25:25 crc kubenswrapper[4854]: I1007 12:25:25.688483 4854 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-nkh4b container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/healthz\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Oct 07 12:25:25 crc kubenswrapper[4854]: I1007 12:25:25.690043 4854 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-nkh4b" podUID="329204da-6485-459a-bf30-0ae870c46ca2" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.22:8080/healthz\": dial tcp 10.217.0.22:8080: connect: connection refused" Oct 07 12:25:25 crc kubenswrapper[4854]: I1007 12:25:25.703119 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-vwn4j"] Oct 07 12:25:25 crc kubenswrapper[4854]: I1007 12:25:25.706346 4854 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-nrwsd" event={"ID":"c4a5d028-220b-40aa-a907-95015d7880c8","Type":"ContainerStarted","Data":"1c027cfd0e2f5a05a0c7585a1a879ce288608484071a2d3251d724c248207955"} Oct 07 12:25:25 crc kubenswrapper[4854]: I1007 12:25:25.714103 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-7frlz" event={"ID":"9fdfad7e-e478-4167-b2d6-5b8295354328","Type":"ContainerStarted","Data":"60c35e0af4429ffb647e036b4d89157ff7634a336aabc84022d177911d9781aa"} Oct 07 12:25:25 crc kubenswrapper[4854]: I1007 12:25:25.727961 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kzt9x" event={"ID":"4c115670-445c-4d56-83ed-9e8155a08d89","Type":"ContainerStarted","Data":"24d1840b213fe827dbf4e5c5c12f2b794b42b0b48b2f4805deabf6e7bafafbff"} Oct 07 12:25:25 crc kubenswrapper[4854]: I1007 12:25:25.744046 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wns76" event={"ID":"df20a6de-dc59-4af7-b284-9fef4b249be9","Type":"ContainerStarted","Data":"ec9cae44c0070fb753345df290a6115d38164c42fb9763c5635e85a648856d5e"} Oct 07 12:25:25 crc kubenswrapper[4854]: I1007 12:25:25.768773 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kzt9x" podStartSLOduration=20.768751243 podStartE2EDuration="20.768751243s" podCreationTimestamp="2025-10-07 12:25:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:25:25.763453938 +0000 UTC m=+41.751286183" watchObservedRunningTime="2025-10-07 12:25:25.768751243 +0000 UTC m=+41.756583498" Oct 07 12:25:25 crc kubenswrapper[4854]: I1007 12:25:25.779249 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-t99kd" event={"ID":"a28154d9-7951-4d75-80be-4bc5cac0ffee","Type":"ContainerStarted","Data":"b509b0694419fc5115c298321cf2c03013d819e96d8cb467082dcf59be575159"} Oct 07 12:25:25 crc kubenswrapper[4854]: I1007 12:25:25.781361 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-t99kd" Oct 07 12:25:25 crc kubenswrapper[4854]: I1007 12:25:25.785063 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 12:25:25 crc kubenswrapper[4854]: E1007 12:25:25.792327 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 12:25:26.292295402 +0000 UTC m=+42.280127657 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:25 crc kubenswrapper[4854]: I1007 12:25:25.806485 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dwgwz" event={"ID":"8db95f90-9b16-440f-8329-be3ec6d7c1a0","Type":"ContainerStarted","Data":"c7caa8b5639d733e1f5c1a6a7a330417d502d4bffa4b5777a7f838d18762a9ca"} Oct 07 12:25:25 crc kubenswrapper[4854]: I1007 12:25:25.806811 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dwgwz" event={"ID":"8db95f90-9b16-440f-8329-be3ec6d7c1a0","Type":"ContainerStarted","Data":"72d59e33968d581cc4dccfab9a99005655c0f4acc8bdef1892793986eb2a5b31"} Oct 07 12:25:25 crc kubenswrapper[4854]: I1007 12:25:25.866480 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-t99kd" podStartSLOduration=8.86642997 podStartE2EDuration="8.86642997s" podCreationTimestamp="2025-10-07 12:25:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:25:25.862176426 +0000 UTC m=+41.850008681" watchObservedRunningTime="2025-10-07 12:25:25.86642997 +0000 UTC m=+41.854262225" Oct 07 12:25:25 crc kubenswrapper[4854]: I1007 12:25:25.887094 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-tsdfx" event={"ID":"a4d3106d-58e9-4cb6-bcfd-6e151b16969b","Type":"ContainerStarted","Data":"087b6f3bf987620e27586f088145f36ac664742f0ad02a3768e207da1401111a"} Oct 07 12:25:25 crc kubenswrapper[4854]: I1007 12:25:25.900097 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:25 crc kubenswrapper[4854]: E1007 12:25:25.902461 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 12:25:26.402446274 +0000 UTC m=+42.390278529 (durationBeforeRetry 500ms). 
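Each failure above arms a per-volume deadline in the kubelet's pending-operations tracker; the "No retries permitted until … (durationBeforeRetry 500ms)" lines report that deadline, which is why the identical Mount/Unmount pair reappears roughly every half second. In this log the delay stays at 500ms for every attempt. The sketch below only illustrates the generic arm-a-deadline-and-grow pattern; the constants and doubling policy are illustrative, not kubelet's actual nestedpendingoperations code.

```go
// backoff: a minimal sketch of the "No retries permitted until ..." pattern
// seen above -- after a failure, the same operation is not started again
// before a per-volume deadline. Constants and growth policy are illustrative.
package main

import (
	"fmt"
	"time"
)

type backoff struct {
	initial, max time.Duration // first delay and upper bound
	current      time.Duration // delay armed by the most recent failure
	notBefore    time.Time     // earliest time the next attempt is permitted
}

func (b *backoff) failed(now time.Time) {
	if b.current == 0 {
		b.current = b.initial
	} else {
		b.current *= 2
		if b.current > b.max {
			b.current = b.max
		}
	}
	b.notBefore = now.Add(b.current)
}

func (b *backoff) allowed(now time.Time) bool { return !now.Before(b.notBefore) }

func main() {
	b := &backoff{initial: 500 * time.Millisecond, max: 2 * time.Minute}
	now := time.Now()
	for attempt := 1; attempt <= 4; attempt++ {
		b.failed(now)
		fmt.Printf("attempt %d failed; no retries permitted for %v\n", attempt, b.notBefore.Sub(now))
		now = b.notBefore // assume the next attempt happens exactly at the deadline
	}
	fmt.Println("retry allowed now:", b.allowed(now))
}
```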
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-44m5s" (UID: "f83bc29b-b3be-4578-bae7-d2867242278c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:25 crc kubenswrapper[4854]: I1007 12:25:25.912938 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dwgwz" podStartSLOduration=20.91292042 podStartE2EDuration="20.91292042s" podCreationTimestamp="2025-10-07 12:25:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:25:25.911336604 +0000 UTC m=+41.899168859" watchObservedRunningTime="2025-10-07 12:25:25.91292042 +0000 UTC m=+41.900752675" Oct 07 12:25:25 crc kubenswrapper[4854]: I1007 12:25:25.934714 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-r4lpq" event={"ID":"c625acfa-7d7b-4b52-b3a2-55c5f817966b","Type":"ContainerStarted","Data":"be96494c81af3d507ea0ced94469dc62e0902adcbd6d709c313c4cc4f7c1100c"} Oct 07 12:25:25 crc kubenswrapper[4854]: I1007 12:25:25.935115 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-r4lpq" Oct 07 12:25:25 crc kubenswrapper[4854]: I1007 12:25:25.958402 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tj4lv" event={"ID":"a791f903-64e0-4312-a4ac-f5639200f261","Type":"ContainerStarted","Data":"5a8c8776106b90f8d486d1c1896f959cde5102a11012718f2570124319ea3a48"} Oct 07 12:25:25 crc kubenswrapper[4854]: I1007 12:25:25.959555 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tj4lv" Oct 07 12:25:25 crc kubenswrapper[4854]: I1007 12:25:25.972923 4854 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-tj4lv container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body= Oct 07 12:25:25 crc kubenswrapper[4854]: I1007 12:25:25.973006 4854 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tj4lv" podUID="a791f903-64e0-4312-a4ac-f5639200f261" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" Oct 07 12:25:25 crc kubenswrapper[4854]: I1007 12:25:25.991665 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-tsdfx" podStartSLOduration=20.991647183 podStartE2EDuration="20.991647183s" podCreationTimestamp="2025-10-07 12:25:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:25:25.956705431 +0000 UTC m=+41.944537686" watchObservedRunningTime="2025-10-07 12:25:25.991647183 +0000 UTC m=+41.979479438" Oct 07 12:25:26 crc kubenswrapper[4854]: I1007 
12:25:26.005028 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 12:25:26 crc kubenswrapper[4854]: E1007 12:25:26.006646 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 12:25:26.506627241 +0000 UTC m=+42.494459496 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:26 crc kubenswrapper[4854]: I1007 12:25:26.020659 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-pzzxz" event={"ID":"63387347-a234-4371-a7b9-c70d0ef73574","Type":"ContainerStarted","Data":"b03a440a25ca9a88e0b5170a64fea7ec9dbbb460a064d629c515e62142cdbf3b"} Oct 07 12:25:26 crc kubenswrapper[4854]: I1007 12:25:26.023190 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-r4lpq" podStartSLOduration=22.023172915 podStartE2EDuration="22.023172915s" podCreationTimestamp="2025-10-07 12:25:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:25:25.996023961 +0000 UTC m=+41.983856216" watchObservedRunningTime="2025-10-07 12:25:26.023172915 +0000 UTC m=+42.011005170" Oct 07 12:25:26 crc kubenswrapper[4854]: I1007 12:25:26.060945 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-bw66g" event={"ID":"8552bac7-9651-4a2c-b65e-28b2c091a0f5","Type":"ContainerStarted","Data":"6bc4672b8d9c85ccee464d1fbbec7bdc4c0b94ccc6bd280a59a9e19b841bc055"} Oct 07 12:25:26 crc kubenswrapper[4854]: I1007 12:25:26.065564 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-m6k45" event={"ID":"f0099a86-9473-4618-9266-2eb460d09150","Type":"ContainerStarted","Data":"eff640e4cd5d2c8d6f8274c7ffec23a1db545fa0186471fd0f9f60c41af06bee"} Oct 07 12:25:26 crc kubenswrapper[4854]: I1007 12:25:26.089004 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-xhvq4" event={"ID":"6d0a97ca-97f0-4bc3-8ee2-bf043accc158","Type":"ContainerStarted","Data":"dead10e55e6eedf37b519daed143a7e7c5aed3a5945eaae371c2e57372f03c5d"} Oct 07 12:25:26 crc kubenswrapper[4854]: I1007 12:25:26.089871 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-pzzxz" podStartSLOduration=21.089859276 podStartE2EDuration="21.089859276s" podCreationTimestamp="2025-10-07 12:25:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-07 12:25:26.084392036 +0000 UTC m=+42.072224291" watchObservedRunningTime="2025-10-07 12:25:26.089859276 +0000 UTC m=+42.077691531" Oct 07 12:25:26 crc kubenswrapper[4854]: I1007 12:25:26.093926 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tj4lv" podStartSLOduration=21.093916834 podStartE2EDuration="21.093916834s" podCreationTimestamp="2025-10-07 12:25:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:25:26.037496204 +0000 UTC m=+42.025328469" watchObservedRunningTime="2025-10-07 12:25:26.093916834 +0000 UTC m=+42.081749089" Oct 07 12:25:26 crc kubenswrapper[4854]: I1007 12:25:26.101281 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tt4xp" event={"ID":"e83624ac-e8aa-429f-b313-8c5c4fc6c88e","Type":"ContainerStarted","Data":"862c049ac54d9750619dec53f7e6c32fce864220214612cff8d81f9bf2362f76"} Oct 07 12:25:26 crc kubenswrapper[4854]: I1007 12:25:26.101457 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tt4xp" event={"ID":"e83624ac-e8aa-429f-b313-8c5c4fc6c88e","Type":"ContainerStarted","Data":"8c774c2f034207f9d62ef088fc7db93b9d2bc015aec1d5f541077f5ce02c9161"} Oct 07 12:25:26 crc kubenswrapper[4854]: I1007 12:25:26.107069 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:26 crc kubenswrapper[4854]: E1007 12:25:26.109850 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 12:25:26.60983052 +0000 UTC m=+42.597662775 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-44m5s" (UID: "f83bc29b-b3be-4578-bae7-d2867242278c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:26 crc kubenswrapper[4854]: I1007 12:25:26.137627 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rv6rb" event={"ID":"b533ed73-fa56-4742-b10c-34beb63d4bba","Type":"ContainerStarted","Data":"775a49fcec67988272e4e458563986aafb6ba4b117b9b8ccbf0b36638c367e1c"} Oct 07 12:25:26 crc kubenswrapper[4854]: I1007 12:25:26.138341 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rv6rb" event={"ID":"b533ed73-fa56-4742-b10c-34beb63d4bba","Type":"ContainerStarted","Data":"cc8478cf45825dce4ba85ac70d1baafd0ffffd8a61f60a56917f8b85f8a145c9"} Oct 07 12:25:26 crc kubenswrapper[4854]: I1007 12:25:26.138884 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rv6rb" Oct 07 12:25:26 crc kubenswrapper[4854]: I1007 12:25:26.153967 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gh8fl" event={"ID":"c35b7e55-a7c4-47ed-a58a-559247edfaca","Type":"ContainerStarted","Data":"cb08d770ae8040a260c65bf3323f1225b035207597e006282ff7ca93d1ab5078"} Oct 07 12:25:26 crc kubenswrapper[4854]: I1007 12:25:26.191764 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hr45q" event={"ID":"0f41f688-ca0b-4a67-99b6-7108b7980bcd","Type":"ContainerStarted","Data":"aa887818b3075bd86edaeac46d9214ca41eea84895424a9b6b5c94e836db87df"} Oct 07 12:25:26 crc kubenswrapper[4854]: I1007 12:25:26.191817 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hr45q" event={"ID":"0f41f688-ca0b-4a67-99b6-7108b7980bcd","Type":"ContainerStarted","Data":"fdc80bfb29fb160ea84240770930069b79f5e9d572cbf05e50bc265f96642604"} Oct 07 12:25:26 crc kubenswrapper[4854]: I1007 12:25:26.208415 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 12:25:26 crc kubenswrapper[4854]: E1007 12:25:26.209767 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 12:25:26.709741923 +0000 UTC m=+42.697574178 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:26 crc kubenswrapper[4854]: I1007 12:25:26.222473 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-bw66g" podStartSLOduration=21.222444194 podStartE2EDuration="21.222444194s" podCreationTimestamp="2025-10-07 12:25:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:25:26.214756139 +0000 UTC m=+42.202588394" watchObservedRunningTime="2025-10-07 12:25:26.222444194 +0000 UTC m=+42.210276449" Oct 07 12:25:26 crc kubenswrapper[4854]: I1007 12:25:26.230349 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-c7nrq" event={"ID":"edc4e120-e930-4fbe-a871-bd63d3de85c9","Type":"ContainerStarted","Data":"c204763bac812cb289bc61c297aeb76afe91ac48bc3aaf0e61e85519b6eb0ade"} Oct 07 12:25:26 crc kubenswrapper[4854]: I1007 12:25:26.252499 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-kqnvp" event={"ID":"d06ab521-e811-4ddf-93ec-d076d089db0c","Type":"ContainerStarted","Data":"eab2da21337a90c80bf2f41489233b8e8d4dd350a087191678d8d3f7b2c0daad"} Oct 07 12:25:26 crc kubenswrapper[4854]: I1007 12:25:26.312496 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hr45q" podStartSLOduration=21.312464157 podStartE2EDuration="21.312464157s" podCreationTimestamp="2025-10-07 12:25:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:25:26.288896768 +0000 UTC m=+42.276729023" watchObservedRunningTime="2025-10-07 12:25:26.312464157 +0000 UTC m=+42.300296412" Oct 07 12:25:26 crc kubenswrapper[4854]: I1007 12:25:26.314124 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-p8fbp" event={"ID":"b2b320c8-834c-43b3-ab9b-c97bba4fb2b2","Type":"ContainerStarted","Data":"e25c6326a65c613744b20a8fc0076309e97568d453f9778b0dd45b7a76e18f28"} Oct 07 12:25:26 crc kubenswrapper[4854]: I1007 12:25:26.314584 4854 patch_prober.go:28] interesting pod/downloads-7954f5f757-64q88 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Oct 07 12:25:26 crc kubenswrapper[4854]: I1007 12:25:26.314627 4854 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-64q88" podUID="43359b2f-63d6-4e81-888e-e219dca84ec3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Oct 07 12:25:26 crc kubenswrapper[4854]: I1007 12:25:26.314982 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:26 crc kubenswrapper[4854]: E1007 12:25:26.317642 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 12:25:26.817617728 +0000 UTC m=+42.805450143 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-44m5s" (UID: "f83bc29b-b3be-4578-bae7-d2867242278c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:26 crc kubenswrapper[4854]: I1007 12:25:26.352292 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-npmk7" Oct 07 12:25:26 crc kubenswrapper[4854]: I1007 12:25:26.352732 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tt4xp" podStartSLOduration=22.352705854 podStartE2EDuration="22.352705854s" podCreationTimestamp="2025-10-07 12:25:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:25:26.352277082 +0000 UTC m=+42.340109347" watchObservedRunningTime="2025-10-07 12:25:26.352705854 +0000 UTC m=+42.340538119" Oct 07 12:25:26 crc kubenswrapper[4854]: I1007 12:25:26.418390 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 12:25:26 crc kubenswrapper[4854]: E1007 12:25:26.420472 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 12:25:26.920448056 +0000 UTC m=+42.908280311 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:26 crc kubenswrapper[4854]: I1007 12:25:26.447807 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gh8fl" podStartSLOduration=21.447783356 podStartE2EDuration="21.447783356s" podCreationTimestamp="2025-10-07 12:25:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:25:26.437640779 +0000 UTC m=+42.425473034" watchObservedRunningTime="2025-10-07 12:25:26.447783356 +0000 UTC m=+42.435615631" Oct 07 12:25:26 crc kubenswrapper[4854]: I1007 12:25:26.504164 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rv6rb" podStartSLOduration=21.504128524 podStartE2EDuration="21.504128524s" podCreationTimestamp="2025-10-07 12:25:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:25:26.500553369 +0000 UTC m=+42.488385624" watchObservedRunningTime="2025-10-07 12:25:26.504128524 +0000 UTC m=+42.491960779" Oct 07 12:25:26 crc kubenswrapper[4854]: I1007 12:25:26.523027 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:26 crc kubenswrapper[4854]: E1007 12:25:26.523453 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 12:25:27.023440189 +0000 UTC m=+43.011272434 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-44m5s" (UID: "f83bc29b-b3be-4578-bae7-d2867242278c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:26 crc kubenswrapper[4854]: I1007 12:25:26.589512 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-p8fbp" podStartSLOduration=21.589484191 podStartE2EDuration="21.589484191s" podCreationTimestamp="2025-10-07 12:25:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:25:26.588799001 +0000 UTC m=+42.576631256" watchObservedRunningTime="2025-10-07 12:25:26.589484191 +0000 UTC m=+42.577316446" Oct 07 12:25:26 crc kubenswrapper[4854]: I1007 12:25:26.619019 4854 patch_prober.go:28] interesting pod/router-default-5444994796-xhnnw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 12:25:26 crc kubenswrapper[4854]: [-]has-synced failed: reason withheld Oct 07 12:25:26 crc kubenswrapper[4854]: [+]process-running ok Oct 07 12:25:26 crc kubenswrapper[4854]: healthz check failed Oct 07 12:25:26 crc kubenswrapper[4854]: I1007 12:25:26.619089 4854 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xhnnw" podUID="c1f51bac-5e94-403b-a197-bf85caf57f23" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 12:25:26 crc kubenswrapper[4854]: I1007 12:25:26.624243 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 12:25:26 crc kubenswrapper[4854]: E1007 12:25:26.624660 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 12:25:27.124646309 +0000 UTC m=+43.112478564 (durationBeforeRetry 500ms). 
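The router startup-probe failures in this window all share the same body shape: the probed endpoint prints one line per named sub-check ([+] passing, [-] failing with the reason withheld) and returns HTTP 500 until every check passes, so kubelet keeps logging the failure and re-probing. The handler below only illustrates that aggregated-healthz output format, with check names taken from the probe output above; it is not the openshift-router implementation.

```go
// healthz: illustrative aggregated health endpoint producing the
// "[+]name ok" / "[-]name failed: reason withheld" style seen in the
// startup-probe output above. Check names are assumptions from that output.
package main

import (
	"fmt"
	"net/http"
)

type check struct {
	name string
	run  func() error
}

func healthzHandler(checks []check) http.HandlerFunc {
	return func(w http.ResponseWriter, r *http.Request) {
		body := ""
		failed := false
		for _, c := range checks {
			if err := c.run(); err != nil {
				failed = true
				body += fmt.Sprintf("[-]%s failed: reason withheld\n", c.name)
			} else {
				body += fmt.Sprintf("[+]%s ok\n", c.name)
			}
		}
		if failed {
			w.WriteHeader(http.StatusInternalServerError) // kubelet logs "statuscode: 500"
			body += "healthz check failed\n"              // trailer matching the probe output
		} else {
			body += "ok\n"
		}
		fmt.Fprint(w, body)
	}
}

func main() {
	checks := []check{
		{name: "backend-http", run: func() error { return fmt.Errorf("not ready") }},
		{name: "has-synced", run: func() error { return fmt.Errorf("not ready") }},
		{name: "process-running", run: func() error { return nil }},
	}
	http.HandleFunc("/healthz", healthzHandler(checks))
	_ = http.ListenAndServe(":8080", nil)
}
```

Once all sub-checks pass, the endpoint returns 200 and the startup probe succeeds, so the repeated prober.go "Probe failed" entries for router-default-5444994796-xhnnw stop.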
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:26 crc kubenswrapper[4854]: I1007 12:25:26.641326 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-c7nrq" podStartSLOduration=21.641303526 podStartE2EDuration="21.641303526s" podCreationTimestamp="2025-10-07 12:25:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:25:26.63357637 +0000 UTC m=+42.621408625" watchObservedRunningTime="2025-10-07 12:25:26.641303526 +0000 UTC m=+42.629135791" Oct 07 12:25:26 crc kubenswrapper[4854]: I1007 12:25:26.729399 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:26 crc kubenswrapper[4854]: E1007 12:25:26.729889 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 12:25:27.229872857 +0000 UTC m=+43.217705112 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-44m5s" (UID: "f83bc29b-b3be-4578-bae7-d2867242278c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:26 crc kubenswrapper[4854]: I1007 12:25:26.830838 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 12:25:26 crc kubenswrapper[4854]: E1007 12:25:26.831825 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 12:25:27.331777928 +0000 UTC m=+43.319610193 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:26 crc kubenswrapper[4854]: I1007 12:25:26.933091 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:26 crc kubenswrapper[4854]: E1007 12:25:26.933640 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 12:25:27.433616727 +0000 UTC m=+43.421448982 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-44m5s" (UID: "f83bc29b-b3be-4578-bae7-d2867242278c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.034327 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 12:25:27 crc kubenswrapper[4854]: E1007 12:25:27.034660 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 12:25:27.534642252 +0000 UTC m=+43.522474507 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.136364 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:27 crc kubenswrapper[4854]: E1007 12:25:27.136782 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 12:25:27.636766109 +0000 UTC m=+43.624598364 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-44m5s" (UID: "f83bc29b-b3be-4578-bae7-d2867242278c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.237983 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 12:25:27 crc kubenswrapper[4854]: E1007 12:25:27.238289 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 12:25:27.738243187 +0000 UTC m=+43.726075452 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.239683 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:27 crc kubenswrapper[4854]: E1007 12:25:27.240092 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 12:25:27.7400759 +0000 UTC m=+43.727908155 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-44m5s" (UID: "f83bc29b-b3be-4578-bae7-d2867242278c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.329221 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-7frlz" event={"ID":"9fdfad7e-e478-4167-b2d6-5b8295354328","Type":"ContainerStarted","Data":"3cd038af3e5de8e81f46acc4cb31b6c8a5922293493a6d8d7e694ca567786a13"} Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.340870 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-p8fbp" event={"ID":"b2b320c8-834c-43b3-ab9b-c97bba4fb2b2","Type":"ContainerStarted","Data":"a8d95e685bc80211cbf53ee91377964233d54f2af060fcdf36178181a74a22f1"} Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.342306 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 12:25:27 crc kubenswrapper[4854]: E1007 12:25:27.342561 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 12:25:27.842518407 +0000 UTC m=+43.830350672 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.342919 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:27 crc kubenswrapper[4854]: E1007 12:25:27.343375 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 12:25:27.843366562 +0000 UTC m=+43.831198827 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-44m5s" (UID: "f83bc29b-b3be-4578-bae7-d2867242278c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.347782 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rhcf8" event={"ID":"615e917b-47ba-4399-a549-496561ca164e","Type":"ContainerStarted","Data":"8474d1568e70a32acc2ef91e58fe91e1fdf93003d861468f7c128c378a160691"} Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.351738 4854 generic.go:334] "Generic (PLEG): container finished" podID="8636dc3b-78f1-4098-81ca-1ae0ef4c441b" containerID="83c7d7761eed2717d67730fef786b60ebccf645f50e1bde8cb11ec56fa3300ef" exitCode=0 Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.351887 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330655-nh22s" event={"ID":"8636dc3b-78f1-4098-81ca-1ae0ef4c441b","Type":"ContainerDied","Data":"83c7d7761eed2717d67730fef786b60ebccf645f50e1bde8cb11ec56fa3300ef"} Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.354008 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-pzzxz" event={"ID":"63387347-a234-4371-a7b9-c70d0ef73574","Type":"ContainerStarted","Data":"6326b8703b40e22458eda413f268326ff67bea77f2685fa7c96e0946618d09c9"} Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.362917 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-7frlz" podStartSLOduration=22.362896663 podStartE2EDuration="22.362896663s" podCreationTimestamp="2025-10-07 12:25:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:25:27.361540073 +0000 UTC m=+43.349372328" watchObservedRunningTime="2025-10-07 12:25:27.362896663 +0000 UTC 
m=+43.350728918" Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.363610 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wns76" event={"ID":"df20a6de-dc59-4af7-b284-9fef4b249be9","Type":"ContainerStarted","Data":"c8d57bb2df34d3d2092f6892e5384e9a41814509bd332da46c6671da8779a3f9"} Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.363681 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gf54n" podStartSLOduration=22.363671696 podStartE2EDuration="22.363671696s" podCreationTimestamp="2025-10-07 12:25:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:25:26.700306502 +0000 UTC m=+42.688138757" watchObservedRunningTime="2025-10-07 12:25:27.363671696 +0000 UTC m=+43.351503951" Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.393319 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-tsdfx" event={"ID":"a4d3106d-58e9-4cb6-bcfd-6e151b16969b","Type":"ContainerStarted","Data":"f260d836d4d71d4d174e5a069405568cb26ccd534986998f28160fbe651f61ea"} Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.410798 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7s59d"] Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.411861 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7s59d" Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.413259 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-m6k45" event={"ID":"f0099a86-9473-4618-9266-2eb460d09150","Type":"ContainerStarted","Data":"c5d83470670e3a52117be62e5ce78f89127f443f8a231b83ceb17ab2bd6528c4"} Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.413326 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-m6k45" event={"ID":"f0099a86-9473-4618-9266-2eb460d09150","Type":"ContainerStarted","Data":"2752dee671245c5a7593019d4aec6a0b65ead49076c2822364d5958d1aaa8ccc"} Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.416500 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.424235 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.425888 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f7851394c8fd9e735909cc002de8a06f80f49934a468c7d0210bb4a44f105ed7"} Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.426305 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.446029 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 12:25:27 crc kubenswrapper[4854]: E1007 12:25:27.448550 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 12:25:27.948532858 +0000 UTC m=+43.936365113 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.460713 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-xhvq4" event={"ID":"6d0a97ca-97f0-4bc3-8ee2-bf043accc158","Type":"ContainerStarted","Data":"197d69f0bc7c86f37b77e04e384ce7ed186870b0cc8dfff5918cc75659950ace"} Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.462968 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7s59d"] Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.463029 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-nrwsd" event={"ID":"c4a5d028-220b-40aa-a907-95015d7880c8","Type":"ContainerStarted","Data":"2632c57e962beeec26eb494fccd7c545c256528efea4e4d64aaeb922f8c8a0f1"} Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.463872 4854 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-nkh4b container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/healthz\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.463940 4854 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-nkh4b" podUID="329204da-6485-459a-bf30-0ae870c46ca2" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.22:8080/healthz\": dial tcp 10.217.0.22:8080: connect: connection refused" Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.473220 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-vwn4j" podUID="3e06c8ed-2c8c-4c09-a352-31432fbd7d40" containerName="kube-multus-additional-cni-plugins" containerID="cri-o://f3648a5c341ce89898f7d66f52949e8fd83866185edfb80a8936501d40f83cee" gracePeriod=30 Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.496515 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tj4lv" Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.550290 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87c2c08e-dd9a-47b9-8e82-5419fdb6cda8-utilities\") pod \"certified-operators-7s59d\" (UID: \"87c2c08e-dd9a-47b9-8e82-5419fdb6cda8\") 
" pod="openshift-marketplace/certified-operators-7s59d" Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.550365 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87c2c08e-dd9a-47b9-8e82-5419fdb6cda8-catalog-content\") pod \"certified-operators-7s59d\" (UID: \"87c2c08e-dd9a-47b9-8e82-5419fdb6cda8\") " pod="openshift-marketplace/certified-operators-7s59d" Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.550419 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.550482 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wxz5\" (UniqueName: \"kubernetes.io/projected/87c2c08e-dd9a-47b9-8e82-5419fdb6cda8-kube-api-access-6wxz5\") pod \"certified-operators-7s59d\" (UID: \"87c2c08e-dd9a-47b9-8e82-5419fdb6cda8\") " pod="openshift-marketplace/certified-operators-7s59d" Oct 07 12:25:27 crc kubenswrapper[4854]: E1007 12:25:27.551656 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 12:25:28.051633184 +0000 UTC m=+44.039465439 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-44m5s" (UID: "f83bc29b-b3be-4578-bae7-d2867242278c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.561424 4854 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-r4lpq container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.561485 4854 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-r4lpq" podUID="c625acfa-7d7b-4b52-b3a2-55c5f817966b" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.10:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.562132 4854 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-r4lpq container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.10:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.562275 4854 prober.go:107] 
"Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-r4lpq" podUID="c625acfa-7d7b-4b52-b3a2-55c5f817966b" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.10:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.569401 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rhcf8" podStartSLOduration=22.569374723 podStartE2EDuration="22.569374723s" podCreationTimestamp="2025-10-07 12:25:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:25:27.50465655 +0000 UTC m=+43.492488795" watchObservedRunningTime="2025-10-07 12:25:27.569374723 +0000 UTC m=+43.557206978" Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.572183 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gnz24"] Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.573758 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gnz24" Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.582526 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.586749 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gnz24"] Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.624484 4854 patch_prober.go:28] interesting pod/router-default-5444994796-xhnnw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 12:25:27 crc kubenswrapper[4854]: [-]has-synced failed: reason withheld Oct 07 12:25:27 crc kubenswrapper[4854]: [+]process-running ok Oct 07 12:25:27 crc kubenswrapper[4854]: healthz check failed Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.624920 4854 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xhnnw" podUID="c1f51bac-5e94-403b-a197-bf85caf57f23" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.652296 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.652596 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wxz5\" (UniqueName: \"kubernetes.io/projected/87c2c08e-dd9a-47b9-8e82-5419fdb6cda8-kube-api-access-6wxz5\") pod \"certified-operators-7s59d\" (UID: \"87c2c08e-dd9a-47b9-8e82-5419fdb6cda8\") " pod="openshift-marketplace/certified-operators-7s59d" Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.652729 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/f1de49b7-71c6-450c-ab89-0c37fb18b32a-catalog-content\") pod \"community-operators-gnz24\" (UID: \"f1de49b7-71c6-450c-ab89-0c37fb18b32a\") " pod="openshift-marketplace/community-operators-gnz24" Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.652902 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1de49b7-71c6-450c-ab89-0c37fb18b32a-utilities\") pod \"community-operators-gnz24\" (UID: \"f1de49b7-71c6-450c-ab89-0c37fb18b32a\") " pod="openshift-marketplace/community-operators-gnz24" Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.652928 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mnxx\" (UniqueName: \"kubernetes.io/projected/f1de49b7-71c6-450c-ab89-0c37fb18b32a-kube-api-access-2mnxx\") pod \"community-operators-gnz24\" (UID: \"f1de49b7-71c6-450c-ab89-0c37fb18b32a\") " pod="openshift-marketplace/community-operators-gnz24" Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.653082 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87c2c08e-dd9a-47b9-8e82-5419fdb6cda8-utilities\") pod \"certified-operators-7s59d\" (UID: \"87c2c08e-dd9a-47b9-8e82-5419fdb6cda8\") " pod="openshift-marketplace/certified-operators-7s59d" Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.653163 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87c2c08e-dd9a-47b9-8e82-5419fdb6cda8-catalog-content\") pod \"certified-operators-7s59d\" (UID: \"87c2c08e-dd9a-47b9-8e82-5419fdb6cda8\") " pod="openshift-marketplace/certified-operators-7s59d" Oct 07 12:25:27 crc kubenswrapper[4854]: E1007 12:25:27.653997 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 12:25:28.153974788 +0000 UTC m=+44.141807043 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.657329 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87c2c08e-dd9a-47b9-8e82-5419fdb6cda8-catalog-content\") pod \"certified-operators-7s59d\" (UID: \"87c2c08e-dd9a-47b9-8e82-5419fdb6cda8\") " pod="openshift-marketplace/certified-operators-7s59d" Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.659087 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87c2c08e-dd9a-47b9-8e82-5419fdb6cda8-utilities\") pod \"certified-operators-7s59d\" (UID: \"87c2c08e-dd9a-47b9-8e82-5419fdb6cda8\") " pod="openshift-marketplace/certified-operators-7s59d" Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.706163 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wns76" podStartSLOduration=22.706127133 podStartE2EDuration="22.706127133s" podCreationTimestamp="2025-10-07 12:25:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:25:27.704366392 +0000 UTC m=+43.692198647" watchObservedRunningTime="2025-10-07 12:25:27.706127133 +0000 UTC m=+43.693959388" Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.716216 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wxz5\" (UniqueName: \"kubernetes.io/projected/87c2c08e-dd9a-47b9-8e82-5419fdb6cda8-kube-api-access-6wxz5\") pod \"certified-operators-7s59d\" (UID: \"87c2c08e-dd9a-47b9-8e82-5419fdb6cda8\") " pod="openshift-marketplace/certified-operators-7s59d" Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.723853 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-m6k45" podStartSLOduration=23.723817191 podStartE2EDuration="23.723817191s" podCreationTimestamp="2025-10-07 12:25:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:25:27.722802831 +0000 UTC m=+43.710635086" watchObservedRunningTime="2025-10-07 12:25:27.723817191 +0000 UTC m=+43.711649446" Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.761406 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7s59d" Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.761941 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1de49b7-71c6-450c-ab89-0c37fb18b32a-utilities\") pod \"community-operators-gnz24\" (UID: \"f1de49b7-71c6-450c-ab89-0c37fb18b32a\") " pod="openshift-marketplace/community-operators-gnz24" Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.761978 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mnxx\" (UniqueName: \"kubernetes.io/projected/f1de49b7-71c6-450c-ab89-0c37fb18b32a-kube-api-access-2mnxx\") pod \"community-operators-gnz24\" (UID: \"f1de49b7-71c6-450c-ab89-0c37fb18b32a\") " pod="openshift-marketplace/community-operators-gnz24" Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.762042 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.762081 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1de49b7-71c6-450c-ab89-0c37fb18b32a-catalog-content\") pod \"community-operators-gnz24\" (UID: \"f1de49b7-71c6-450c-ab89-0c37fb18b32a\") " pod="openshift-marketplace/community-operators-gnz24" Oct 07 12:25:27 crc kubenswrapper[4854]: E1007 12:25:27.762899 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 12:25:28.262877983 +0000 UTC m=+44.250710238 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-44m5s" (UID: "f83bc29b-b3be-4578-bae7-d2867242278c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.763210 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1de49b7-71c6-450c-ab89-0c37fb18b32a-utilities\") pod \"community-operators-gnz24\" (UID: \"f1de49b7-71c6-450c-ab89-0c37fb18b32a\") " pod="openshift-marketplace/community-operators-gnz24" Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.767224 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1de49b7-71c6-450c-ab89-0c37fb18b32a-catalog-content\") pod \"community-operators-gnz24\" (UID: \"f1de49b7-71c6-450c-ab89-0c37fb18b32a\") " pod="openshift-marketplace/community-operators-gnz24" Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.778605 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-nrwsd" podStartSLOduration=23.778583313 podStartE2EDuration="23.778583313s" podCreationTimestamp="2025-10-07 12:25:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:25:27.77506119 +0000 UTC m=+43.762893445" watchObservedRunningTime="2025-10-07 12:25:27.778583313 +0000 UTC m=+43.766415568" Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.796823 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-r4lpq" Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.817460 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mnxx\" (UniqueName: \"kubernetes.io/projected/f1de49b7-71c6-450c-ab89-0c37fb18b32a-kube-api-access-2mnxx\") pod \"community-operators-gnz24\" (UID: \"f1de49b7-71c6-450c-ab89-0c37fb18b32a\") " pod="openshift-marketplace/community-operators-gnz24" Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.817571 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7vxvx"] Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.821889 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7vxvx" Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.842038 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7vxvx"] Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.844486 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=23.84445847 podStartE2EDuration="23.84445847s" podCreationTimestamp="2025-10-07 12:25:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:25:27.843864862 +0000 UTC m=+43.831697137" watchObservedRunningTime="2025-10-07 12:25:27.84445847 +0000 UTC m=+43.832290725" Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.887851 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 12:25:27 crc kubenswrapper[4854]: E1007 12:25:27.888308 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 12:25:28.388286262 +0000 UTC m=+44.376118517 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.903710 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gnz24" Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.989307 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6z89\" (UniqueName: \"kubernetes.io/projected/bc3ab00d-65d3-4281-8fd5-65dda5839c8d-kube-api-access-k6z89\") pod \"certified-operators-7vxvx\" (UID: \"bc3ab00d-65d3-4281-8fd5-65dda5839c8d\") " pod="openshift-marketplace/certified-operators-7vxvx" Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.989349 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.989414 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc3ab00d-65d3-4281-8fd5-65dda5839c8d-catalog-content\") pod \"certified-operators-7vxvx\" (UID: \"bc3ab00d-65d3-4281-8fd5-65dda5839c8d\") " pod="openshift-marketplace/certified-operators-7vxvx" Oct 07 12:25:27 crc kubenswrapper[4854]: I1007 12:25:27.989586 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc3ab00d-65d3-4281-8fd5-65dda5839c8d-utilities\") pod \"certified-operators-7vxvx\" (UID: \"bc3ab00d-65d3-4281-8fd5-65dda5839c8d\") " pod="openshift-marketplace/certified-operators-7vxvx" Oct 07 12:25:27 crc kubenswrapper[4854]: E1007 12:25:27.989722 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 12:25:28.489706978 +0000 UTC m=+44.477539233 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-44m5s" (UID: "f83bc29b-b3be-4578-bae7-d2867242278c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:28 crc kubenswrapper[4854]: I1007 12:25:28.003769 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-66kwk"] Oct 07 12:25:28 crc kubenswrapper[4854]: I1007 12:25:28.005576 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-66kwk" Oct 07 12:25:28 crc kubenswrapper[4854]: I1007 12:25:28.036333 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-66kwk"] Oct 07 12:25:28 crc kubenswrapper[4854]: I1007 12:25:28.092434 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 12:25:28 crc kubenswrapper[4854]: E1007 12:25:28.092854 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 12:25:28.592637659 +0000 UTC m=+44.580469914 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:28 crc kubenswrapper[4854]: I1007 12:25:28.093015 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc3ab00d-65d3-4281-8fd5-65dda5839c8d-catalog-content\") pod \"certified-operators-7vxvx\" (UID: \"bc3ab00d-65d3-4281-8fd5-65dda5839c8d\") " pod="openshift-marketplace/certified-operators-7vxvx" Oct 07 12:25:28 crc kubenswrapper[4854]: I1007 12:25:28.093216 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc3ab00d-65d3-4281-8fd5-65dda5839c8d-utilities\") pod \"certified-operators-7vxvx\" (UID: \"bc3ab00d-65d3-4281-8fd5-65dda5839c8d\") " pod="openshift-marketplace/certified-operators-7vxvx" Oct 07 12:25:28 crc kubenswrapper[4854]: I1007 12:25:28.093448 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6z89\" (UniqueName: \"kubernetes.io/projected/bc3ab00d-65d3-4281-8fd5-65dda5839c8d-kube-api-access-k6z89\") pod \"certified-operators-7vxvx\" (UID: \"bc3ab00d-65d3-4281-8fd5-65dda5839c8d\") " pod="openshift-marketplace/certified-operators-7vxvx" Oct 07 12:25:28 crc kubenswrapper[4854]: I1007 12:25:28.093545 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:28 crc kubenswrapper[4854]: E1007 12:25:28.093962 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 12:25:28.593951828 +0000 UTC m=+44.581784083 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-44m5s" (UID: "f83bc29b-b3be-4578-bae7-d2867242278c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:28 crc kubenswrapper[4854]: I1007 12:25:28.093972 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc3ab00d-65d3-4281-8fd5-65dda5839c8d-catalog-content\") pod \"certified-operators-7vxvx\" (UID: \"bc3ab00d-65d3-4281-8fd5-65dda5839c8d\") " pod="openshift-marketplace/certified-operators-7vxvx" Oct 07 12:25:28 crc kubenswrapper[4854]: I1007 12:25:28.094251 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc3ab00d-65d3-4281-8fd5-65dda5839c8d-utilities\") pod \"certified-operators-7vxvx\" (UID: \"bc3ab00d-65d3-4281-8fd5-65dda5839c8d\") " pod="openshift-marketplace/certified-operators-7vxvx" Oct 07 12:25:28 crc kubenswrapper[4854]: I1007 12:25:28.123453 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6z89\" (UniqueName: \"kubernetes.io/projected/bc3ab00d-65d3-4281-8fd5-65dda5839c8d-kube-api-access-k6z89\") pod \"certified-operators-7vxvx\" (UID: \"bc3ab00d-65d3-4281-8fd5-65dda5839c8d\") " pod="openshift-marketplace/certified-operators-7vxvx" Oct 07 12:25:28 crc kubenswrapper[4854]: I1007 12:25:28.194930 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 12:25:28 crc kubenswrapper[4854]: E1007 12:25:28.195353 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 12:25:28.695309473 +0000 UTC m=+44.683141728 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:28 crc kubenswrapper[4854]: I1007 12:25:28.195499 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2519a25-7584-44fa-8751-eb0160c6eb14-utilities\") pod \"community-operators-66kwk\" (UID: \"f2519a25-7584-44fa-8751-eb0160c6eb14\") " pod="openshift-marketplace/community-operators-66kwk" Oct 07 12:25:28 crc kubenswrapper[4854]: I1007 12:25:28.195594 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:28 crc kubenswrapper[4854]: I1007 12:25:28.195626 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2519a25-7584-44fa-8751-eb0160c6eb14-catalog-content\") pod \"community-operators-66kwk\" (UID: \"f2519a25-7584-44fa-8751-eb0160c6eb14\") " pod="openshift-marketplace/community-operators-66kwk" Oct 07 12:25:28 crc kubenswrapper[4854]: I1007 12:25:28.195654 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdnwf\" (UniqueName: \"kubernetes.io/projected/f2519a25-7584-44fa-8751-eb0160c6eb14-kube-api-access-sdnwf\") pod \"community-operators-66kwk\" (UID: \"f2519a25-7584-44fa-8751-eb0160c6eb14\") " pod="openshift-marketplace/community-operators-66kwk" Oct 07 12:25:28 crc kubenswrapper[4854]: E1007 12:25:28.196048 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 12:25:28.696033084 +0000 UTC m=+44.683865339 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-44m5s" (UID: "f83bc29b-b3be-4578-bae7-d2867242278c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:28 crc kubenswrapper[4854]: I1007 12:25:28.208658 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7vxvx" Oct 07 12:25:28 crc kubenswrapper[4854]: I1007 12:25:28.297484 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 12:25:28 crc kubenswrapper[4854]: I1007 12:25:28.297747 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2519a25-7584-44fa-8751-eb0160c6eb14-catalog-content\") pod \"community-operators-66kwk\" (UID: \"f2519a25-7584-44fa-8751-eb0160c6eb14\") " pod="openshift-marketplace/community-operators-66kwk" Oct 07 12:25:28 crc kubenswrapper[4854]: I1007 12:25:28.297784 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdnwf\" (UniqueName: \"kubernetes.io/projected/f2519a25-7584-44fa-8751-eb0160c6eb14-kube-api-access-sdnwf\") pod \"community-operators-66kwk\" (UID: \"f2519a25-7584-44fa-8751-eb0160c6eb14\") " pod="openshift-marketplace/community-operators-66kwk" Oct 07 12:25:28 crc kubenswrapper[4854]: I1007 12:25:28.297802 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2519a25-7584-44fa-8751-eb0160c6eb14-utilities\") pod \"community-operators-66kwk\" (UID: \"f2519a25-7584-44fa-8751-eb0160c6eb14\") " pod="openshift-marketplace/community-operators-66kwk" Oct 07 12:25:28 crc kubenswrapper[4854]: I1007 12:25:28.298236 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2519a25-7584-44fa-8751-eb0160c6eb14-utilities\") pod \"community-operators-66kwk\" (UID: \"f2519a25-7584-44fa-8751-eb0160c6eb14\") " pod="openshift-marketplace/community-operators-66kwk" Oct 07 12:25:28 crc kubenswrapper[4854]: E1007 12:25:28.298590 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 12:25:28.798555083 +0000 UTC m=+44.786387338 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:28 crc kubenswrapper[4854]: I1007 12:25:28.298631 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2519a25-7584-44fa-8751-eb0160c6eb14-catalog-content\") pod \"community-operators-66kwk\" (UID: \"f2519a25-7584-44fa-8751-eb0160c6eb14\") " pod="openshift-marketplace/community-operators-66kwk" Oct 07 12:25:28 crc kubenswrapper[4854]: I1007 12:25:28.352125 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdnwf\" (UniqueName: \"kubernetes.io/projected/f2519a25-7584-44fa-8751-eb0160c6eb14-kube-api-access-sdnwf\") pod \"community-operators-66kwk\" (UID: \"f2519a25-7584-44fa-8751-eb0160c6eb14\") " pod="openshift-marketplace/community-operators-66kwk" Oct 07 12:25:28 crc kubenswrapper[4854]: I1007 12:25:28.399111 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:28 crc kubenswrapper[4854]: E1007 12:25:28.399509 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 12:25:28.899497815 +0000 UTC m=+44.887330070 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-44m5s" (UID: "f83bc29b-b3be-4578-bae7-d2867242278c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:28 crc kubenswrapper[4854]: I1007 12:25:28.476526 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-xhvq4" event={"ID":"6d0a97ca-97f0-4bc3-8ee2-bf043accc158","Type":"ContainerStarted","Data":"8a08301d1ee9021d3def4fa40c77072dd44fecf2f99e8dad63a1aba7a1209feb"} Oct 07 12:25:28 crc kubenswrapper[4854]: I1007 12:25:28.510855 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 12:25:28 crc kubenswrapper[4854]: E1007 12:25:28.511219 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-07 12:25:29.011195963 +0000 UTC m=+44.999028218 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:28 crc kubenswrapper[4854]: I1007 12:25:28.590336 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7s59d"] Oct 07 12:25:28 crc kubenswrapper[4854]: I1007 12:25:28.591932 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gnz24"] Oct 07 12:25:28 crc kubenswrapper[4854]: I1007 12:25:28.610997 4854 patch_prober.go:28] interesting pod/router-default-5444994796-xhnnw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 12:25:28 crc kubenswrapper[4854]: [-]has-synced failed: reason withheld Oct 07 12:25:28 crc kubenswrapper[4854]: [+]process-running ok Oct 07 12:25:28 crc kubenswrapper[4854]: healthz check failed Oct 07 12:25:28 crc kubenswrapper[4854]: I1007 12:25:28.611092 4854 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xhnnw" podUID="c1f51bac-5e94-403b-a197-bf85caf57f23" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 12:25:28 crc kubenswrapper[4854]: I1007 12:25:28.612058 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:28 crc kubenswrapper[4854]: E1007 12:25:28.615328 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 12:25:29.115309918 +0000 UTC m=+45.103142173 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-44m5s" (UID: "f83bc29b-b3be-4578-bae7-d2867242278c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:28 crc kubenswrapper[4854]: I1007 12:25:28.630066 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-66kwk" Oct 07 12:25:28 crc kubenswrapper[4854]: W1007 12:25:28.663260 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87c2c08e_dd9a_47b9_8e82_5419fdb6cda8.slice/crio-fab199944ed004b66c0ca79d88cea00a61018e26c8491205911fa9d2ff7b6658 WatchSource:0}: Error finding container fab199944ed004b66c0ca79d88cea00a61018e26c8491205911fa9d2ff7b6658: Status 404 returned error can't find the container with id fab199944ed004b66c0ca79d88cea00a61018e26c8491205911fa9d2ff7b6658 Oct 07 12:25:28 crc kubenswrapper[4854]: I1007 12:25:28.714210 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 12:25:28 crc kubenswrapper[4854]: E1007 12:25:28.714614 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 12:25:29.214564151 +0000 UTC m=+45.202396406 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:28 crc kubenswrapper[4854]: I1007 12:25:28.714944 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:28 crc kubenswrapper[4854]: E1007 12:25:28.715368 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 12:25:29.215356985 +0000 UTC m=+45.203189240 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-44m5s" (UID: "f83bc29b-b3be-4578-bae7-d2867242278c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:28 crc kubenswrapper[4854]: I1007 12:25:28.817137 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 12:25:28 crc kubenswrapper[4854]: E1007 12:25:28.817579 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 12:25:29.317551564 +0000 UTC m=+45.305383809 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:28 crc kubenswrapper[4854]: I1007 12:25:28.920094 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:28 crc kubenswrapper[4854]: E1007 12:25:28.920445 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 12:25:29.420434403 +0000 UTC m=+45.408266658 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-44m5s" (UID: "f83bc29b-b3be-4578-bae7-d2867242278c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.023064 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 12:25:29 crc kubenswrapper[4854]: E1007 12:25:29.023550 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 12:25:29.523515479 +0000 UTC m=+45.511347734 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.023826 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:29 crc kubenswrapper[4854]: E1007 12:25:29.024200 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 12:25:29.524192049 +0000 UTC m=+45.512024304 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-44m5s" (UID: "f83bc29b-b3be-4578-bae7-d2867242278c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.110897 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-66kwk"] Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.132507 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 12:25:29 crc kubenswrapper[4854]: E1007 12:25:29.134374 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 12:25:29.634339051 +0000 UTC m=+45.622171306 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.134622 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:29 crc kubenswrapper[4854]: E1007 12:25:29.135197 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 12:25:29.635177665 +0000 UTC m=+45.623009920 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-44m5s" (UID: "f83bc29b-b3be-4578-bae7-d2867242278c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.168914 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330655-nh22s" Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.170206 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 07 12:25:29 crc kubenswrapper[4854]: E1007 12:25:29.170584 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8636dc3b-78f1-4098-81ca-1ae0ef4c441b" containerName="collect-profiles" Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.170606 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="8636dc3b-78f1-4098-81ca-1ae0ef4c441b" containerName="collect-profiles" Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.170724 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="8636dc3b-78f1-4098-81ca-1ae0ef4c441b" containerName="collect-profiles" Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.171337 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.174214 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.174400 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.184781 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.196692 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7vxvx"] Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.228794 4854 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.237803 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 12:25:29 crc kubenswrapper[4854]: E1007 12:25:29.240036 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 12:25:29.740020862 +0000 UTC m=+45.727853117 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.240572 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:29 crc kubenswrapper[4854]: E1007 12:25:29.241354 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 12:25:29.74133722 +0000 UTC m=+45.729169475 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-44m5s" (UID: "f83bc29b-b3be-4578-bae7-d2867242278c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.343733 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqz6c\" (UniqueName: \"kubernetes.io/projected/8636dc3b-78f1-4098-81ca-1ae0ef4c441b-kube-api-access-wqz6c\") pod \"8636dc3b-78f1-4098-81ca-1ae0ef4c441b\" (UID: \"8636dc3b-78f1-4098-81ca-1ae0ef4c441b\") " Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.344511 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.344619 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8636dc3b-78f1-4098-81ca-1ae0ef4c441b-config-volume\") pod \"8636dc3b-78f1-4098-81ca-1ae0ef4c441b\" (UID: \"8636dc3b-78f1-4098-81ca-1ae0ef4c441b\") " Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.344669 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8636dc3b-78f1-4098-81ca-1ae0ef4c441b-secret-volume\") pod \"8636dc3b-78f1-4098-81ca-1ae0ef4c441b\" (UID: \"8636dc3b-78f1-4098-81ca-1ae0ef4c441b\") " Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.345036 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c66ec6fc-5daf-4f6f-b057-80f0eb1a7584-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c66ec6fc-5daf-4f6f-b057-80f0eb1a7584\") " 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.345166 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c66ec6fc-5daf-4f6f-b057-80f0eb1a7584-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c66ec6fc-5daf-4f6f-b057-80f0eb1a7584\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 12:25:29 crc kubenswrapper[4854]: E1007 12:25:29.345358 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 12:25:29.845340193 +0000 UTC m=+45.833172448 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.346215 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8636dc3b-78f1-4098-81ca-1ae0ef4c441b-config-volume" (OuterVolumeSpecName: "config-volume") pod "8636dc3b-78f1-4098-81ca-1ae0ef4c441b" (UID: "8636dc3b-78f1-4098-81ca-1ae0ef4c441b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.352469 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8636dc3b-78f1-4098-81ca-1ae0ef4c441b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8636dc3b-78f1-4098-81ca-1ae0ef4c441b" (UID: "8636dc3b-78f1-4098-81ca-1ae0ef4c441b"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.353065 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8636dc3b-78f1-4098-81ca-1ae0ef4c441b-kube-api-access-wqz6c" (OuterVolumeSpecName: "kube-api-access-wqz6c") pod "8636dc3b-78f1-4098-81ca-1ae0ef4c441b" (UID: "8636dc3b-78f1-4098-81ca-1ae0ef4c441b"). InnerVolumeSpecName "kube-api-access-wqz6c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.446505 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c66ec6fc-5daf-4f6f-b057-80f0eb1a7584-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c66ec6fc-5daf-4f6f-b057-80f0eb1a7584\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.446597 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.446628 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c66ec6fc-5daf-4f6f-b057-80f0eb1a7584-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c66ec6fc-5daf-4f6f-b057-80f0eb1a7584\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.446681 4854 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8636dc3b-78f1-4098-81ca-1ae0ef4c441b-config-volume\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.446693 4854 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8636dc3b-78f1-4098-81ca-1ae0ef4c441b-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.446706 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqz6c\" (UniqueName: \"kubernetes.io/projected/8636dc3b-78f1-4098-81ca-1ae0ef4c441b-kube-api-access-wqz6c\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.446753 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c66ec6fc-5daf-4f6f-b057-80f0eb1a7584-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c66ec6fc-5daf-4f6f-b057-80f0eb1a7584\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 12:25:29 crc kubenswrapper[4854]: E1007 12:25:29.447046 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 12:25:29.947033807 +0000 UTC m=+45.934866062 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-44m5s" (UID: "f83bc29b-b3be-4578-bae7-d2867242278c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.464282 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c66ec6fc-5daf-4f6f-b057-80f0eb1a7584-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c66ec6fc-5daf-4f6f-b057-80f0eb1a7584\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.492220 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330655-nh22s" event={"ID":"8636dc3b-78f1-4098-81ca-1ae0ef4c441b","Type":"ContainerDied","Data":"30993f8078b16ac73b4d54090af217659a8d8f9092d30a2d8b8ba13513e649de"} Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.492269 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30993f8078b16ac73b4d54090af217659a8d8f9092d30a2d8b8ba13513e649de" Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.492289 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330655-nh22s" Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.493597 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7vxvx" event={"ID":"bc3ab00d-65d3-4281-8fd5-65dda5839c8d","Type":"ContainerStarted","Data":"30c5fdb84f77f6a19f4b58d8b1328da738094ee4761882d1c20f5c9af9ee4b22"} Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.495193 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.496935 4854 generic.go:334] "Generic (PLEG): container finished" podID="f1de49b7-71c6-450c-ab89-0c37fb18b32a" containerID="f08b48fcdb02df154fab97a3b33d81a3092f36ffbb8de343cef5d7b9c953a38a" exitCode=0 Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.497222 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gnz24" event={"ID":"f1de49b7-71c6-450c-ab89-0c37fb18b32a","Type":"ContainerDied","Data":"f08b48fcdb02df154fab97a3b33d81a3092f36ffbb8de343cef5d7b9c953a38a"} Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.497256 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gnz24" event={"ID":"f1de49b7-71c6-450c-ab89-0c37fb18b32a","Type":"ContainerStarted","Data":"43c7c4bb6c694f43e57e4cc6ede701032fd69701a586a2b87ff859d1d33387fc"} Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.506198 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-xhvq4" event={"ID":"6d0a97ca-97f0-4bc3-8ee2-bf043accc158","Type":"ContainerStarted","Data":"380b0ae9e26531015a5e1aee2ff6eb002a58580c6e7ce07234d4ce8a27203b2a"} Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.508738 4854 generic.go:334] "Generic (PLEG): container finished" podID="f2519a25-7584-44fa-8751-eb0160c6eb14" containerID="a57f6cc64b7307b131531d2dda49e8b1eab2a7d2e0f9634e98c5c20596a98c3a" exitCode=0 Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.508798 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-66kwk" event={"ID":"f2519a25-7584-44fa-8751-eb0160c6eb14","Type":"ContainerDied","Data":"a57f6cc64b7307b131531d2dda49e8b1eab2a7d2e0f9634e98c5c20596a98c3a"} Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.508816 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-66kwk" event={"ID":"f2519a25-7584-44fa-8751-eb0160c6eb14","Type":"ContainerStarted","Data":"007fbc4465578bf538e399b2de1c093d6746065913ddb4c430e133dec67c3f11"} Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.511619 4854 generic.go:334] "Generic (PLEG): container finished" podID="87c2c08e-dd9a-47b9-8e82-5419fdb6cda8" containerID="e49df6e67a8514d29c88c5c1c96e817b2fd25df6d80184c7d2c1b670a3952d64" exitCode=0 Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.511802 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7s59d" event={"ID":"87c2c08e-dd9a-47b9-8e82-5419fdb6cda8","Type":"ContainerDied","Data":"e49df6e67a8514d29c88c5c1c96e817b2fd25df6d80184c7d2c1b670a3952d64"} Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.511888 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7s59d" event={"ID":"87c2c08e-dd9a-47b9-8e82-5419fdb6cda8","Type":"ContainerStarted","Data":"fab199944ed004b66c0ca79d88cea00a61018e26c8491205911fa9d2ff7b6658"} Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.523336 4854 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.547958 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 12:25:29 crc kubenswrapper[4854]: E1007 12:25:29.548108 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 12:25:30.048078113 +0000 UTC m=+46.035910368 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.548175 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:29 crc kubenswrapper[4854]: E1007 12:25:29.548930 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 12:25:30.048913087 +0000 UTC m=+46.036745342 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-44m5s" (UID: "f83bc29b-b3be-4578-bae7-d2867242278c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.563398 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wn5m7"] Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.564619 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wn5m7" Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.571626 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.580634 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wn5m7"] Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.610215 4854 patch_prober.go:28] interesting pod/router-default-5444994796-xhnnw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 12:25:29 crc kubenswrapper[4854]: [-]has-synced failed: reason withheld Oct 07 12:25:29 crc kubenswrapper[4854]: [+]process-running ok Oct 07 12:25:29 crc kubenswrapper[4854]: healthz check failed Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.610302 4854 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xhnnw" podUID="c1f51bac-5e94-403b-a197-bf85caf57f23" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.649756 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 12:25:29 crc kubenswrapper[4854]: E1007 12:25:29.650095 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 12:25:30.150023775 +0000 UTC m=+46.137856030 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.650470 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/573bfe9e-2b5e-47dc-9d08-9bccdca9a7b8-utilities\") pod \"redhat-marketplace-wn5m7\" (UID: \"573bfe9e-2b5e-47dc-9d08-9bccdca9a7b8\") " pod="openshift-marketplace/redhat-marketplace-wn5m7" Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.650504 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5w2t\" (UniqueName: \"kubernetes.io/projected/573bfe9e-2b5e-47dc-9d08-9bccdca9a7b8-kube-api-access-j5w2t\") pod \"redhat-marketplace-wn5m7\" (UID: \"573bfe9e-2b5e-47dc-9d08-9bccdca9a7b8\") " pod="openshift-marketplace/redhat-marketplace-wn5m7" Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.650608 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.650653 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/573bfe9e-2b5e-47dc-9d08-9bccdca9a7b8-catalog-content\") pod \"redhat-marketplace-wn5m7\" (UID: \"573bfe9e-2b5e-47dc-9d08-9bccdca9a7b8\") " pod="openshift-marketplace/redhat-marketplace-wn5m7" Oct 07 12:25:29 crc kubenswrapper[4854]: E1007 12:25:29.651136 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 12:25:30.151114597 +0000 UTC m=+46.138946852 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-44m5s" (UID: "f83bc29b-b3be-4578-bae7-d2867242278c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.752521 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.752813 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/573bfe9e-2b5e-47dc-9d08-9bccdca9a7b8-catalog-content\") pod \"redhat-marketplace-wn5m7\" (UID: \"573bfe9e-2b5e-47dc-9d08-9bccdca9a7b8\") " pod="openshift-marketplace/redhat-marketplace-wn5m7" Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.752850 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5w2t\" (UniqueName: \"kubernetes.io/projected/573bfe9e-2b5e-47dc-9d08-9bccdca9a7b8-kube-api-access-j5w2t\") pod \"redhat-marketplace-wn5m7\" (UID: \"573bfe9e-2b5e-47dc-9d08-9bccdca9a7b8\") " pod="openshift-marketplace/redhat-marketplace-wn5m7" Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.752871 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/573bfe9e-2b5e-47dc-9d08-9bccdca9a7b8-utilities\") pod \"redhat-marketplace-wn5m7\" (UID: \"573bfe9e-2b5e-47dc-9d08-9bccdca9a7b8\") " pod="openshift-marketplace/redhat-marketplace-wn5m7" Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.753313 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/573bfe9e-2b5e-47dc-9d08-9bccdca9a7b8-utilities\") pod \"redhat-marketplace-wn5m7\" (UID: \"573bfe9e-2b5e-47dc-9d08-9bccdca9a7b8\") " pod="openshift-marketplace/redhat-marketplace-wn5m7" Oct 07 12:25:29 crc kubenswrapper[4854]: E1007 12:25:29.753383 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 12:25:30.253367938 +0000 UTC m=+46.241200193 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.753578 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/573bfe9e-2b5e-47dc-9d08-9bccdca9a7b8-catalog-content\") pod \"redhat-marketplace-wn5m7\" (UID: \"573bfe9e-2b5e-47dc-9d08-9bccdca9a7b8\") " pod="openshift-marketplace/redhat-marketplace-wn5m7" Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.774800 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5w2t\" (UniqueName: \"kubernetes.io/projected/573bfe9e-2b5e-47dc-9d08-9bccdca9a7b8-kube-api-access-j5w2t\") pod \"redhat-marketplace-wn5m7\" (UID: \"573bfe9e-2b5e-47dc-9d08-9bccdca9a7b8\") " pod="openshift-marketplace/redhat-marketplace-wn5m7" Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.775006 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 07 12:25:29 crc kubenswrapper[4854]: W1007 12:25:29.792315 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podc66ec6fc_5daf_4f6f_b057_80f0eb1a7584.slice/crio-baf39102dcfa7234c01ada7696984159bf8c25ab35e03620d715f79e4a3a8de8 WatchSource:0}: Error finding container baf39102dcfa7234c01ada7696984159bf8c25ab35e03620d715f79e4a3a8de8: Status 404 returned error can't find the container with id baf39102dcfa7234c01ada7696984159bf8c25ab35e03620d715f79e4a3a8de8 Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.846952 4854 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-10-07T12:25:29.228912457Z","Handler":null,"Name":""} Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.850439 4854 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.850482 4854 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.854231 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.857731 4854 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
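
The run of failures above is a startup ordering race rather than a persistent fault: every MountVolume.MountDevice attempt for "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (pod image-registry-697d97f7c8-44m5s) and every UnmountVolume.TearDown attempt for the same volume (pod 8f668bae-612b-4b75-9490-919e737c6a3b) fails with "driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers" and is requeued with a 500ms durationBeforeRetry, until the plugin watcher picks up /var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock (12:25:29.228794), the kubelet validates and registers the driver (csi_plugin.go:100 and :113 at 12:25:29.850), and the next attempt skips MountDevice because the driver does not advertise STAGE_UNSTAGE_VOLUME; the successful MountDevice/SetUp for the image-registry pod and the TearDown for the old pod follow immediately below. The sketch that follows is illustrative only: plain Go, standard library, polling for the registration socket named in the log with the same 500ms delay. It is not the kubelet's reconciler (that lives in operation_generator.go and nestedpendingoperations.go); it only shows why the errors stop as soon as the driver registers.

package main

import (
	"fmt"
	"os"
	"time"
)

// Socket path copied from the plugin_watcher entry in the log above.
const regSocket = "/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"

// Mirrors the 500ms requeue delay reported by nestedpendingoperations.go above;
// the constant here is purely for illustration.
const durationBeforeRetry = 500 * time.Millisecond

func main() {
	for attempt := 1; ; attempt++ {
		if _, err := os.Stat(regSocket); err == nil {
			// Once the registration socket exists the kubelet can validate and
			// register the driver, and the pending mount goes through.
			fmt.Printf("attempt %d: %s present, mount can proceed\n", attempt, regSocket)
			return
		}
		// Same failure mode as the repeated errors above: the driver is not yet
		// in the list of registered CSI drivers, so wait and retry.
		fmt.Printf("attempt %d: driver not registered yet, retrying in %s\n", attempt, durationBeforeRetry)
		time.Sleep(durationBeforeRetry)
	}
}

In the log the equivalent transition happens between 12:25:29.228794 (socket added to the desired state cache) and 12:25:29.850482 (driver registered), after which the entries for this volume report MountDevice, SetUp, and TearDown success.
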
Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.857787 4854 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.885553 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-44m5s\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.944001 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wn5m7" Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.956014 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.961199 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.968607 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zh2kv"] Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.969857 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zh2kv" Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.981778 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zh2kv"] Oct 07 12:25:29 crc kubenswrapper[4854]: I1007 12:25:29.985986 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:30 crc kubenswrapper[4854]: I1007 12:25:30.057691 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7c51857-fb64-4aeb-9091-b50d3268b4ec-catalog-content\") pod \"redhat-marketplace-zh2kv\" (UID: \"d7c51857-fb64-4aeb-9091-b50d3268b4ec\") " pod="openshift-marketplace/redhat-marketplace-zh2kv" Oct 07 12:25:30 crc kubenswrapper[4854]: I1007 12:25:30.057768 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwxd5\" (UniqueName: \"kubernetes.io/projected/d7c51857-fb64-4aeb-9091-b50d3268b4ec-kube-api-access-cwxd5\") pod \"redhat-marketplace-zh2kv\" (UID: \"d7c51857-fb64-4aeb-9091-b50d3268b4ec\") " pod="openshift-marketplace/redhat-marketplace-zh2kv" Oct 07 12:25:30 crc kubenswrapper[4854]: I1007 12:25:30.057826 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7c51857-fb64-4aeb-9091-b50d3268b4ec-utilities\") pod \"redhat-marketplace-zh2kv\" (UID: \"d7c51857-fb64-4aeb-9091-b50d3268b4ec\") " pod="openshift-marketplace/redhat-marketplace-zh2kv" Oct 07 12:25:30 crc kubenswrapper[4854]: I1007 12:25:30.159528 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7c51857-fb64-4aeb-9091-b50d3268b4ec-utilities\") pod \"redhat-marketplace-zh2kv\" (UID: \"d7c51857-fb64-4aeb-9091-b50d3268b4ec\") " pod="openshift-marketplace/redhat-marketplace-zh2kv" Oct 07 12:25:30 crc kubenswrapper[4854]: I1007 12:25:30.159576 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7c51857-fb64-4aeb-9091-b50d3268b4ec-catalog-content\") pod \"redhat-marketplace-zh2kv\" (UID: \"d7c51857-fb64-4aeb-9091-b50d3268b4ec\") " pod="openshift-marketplace/redhat-marketplace-zh2kv" Oct 07 12:25:30 crc kubenswrapper[4854]: I1007 12:25:30.159638 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwxd5\" (UniqueName: \"kubernetes.io/projected/d7c51857-fb64-4aeb-9091-b50d3268b4ec-kube-api-access-cwxd5\") pod \"redhat-marketplace-zh2kv\" (UID: \"d7c51857-fb64-4aeb-9091-b50d3268b4ec\") " pod="openshift-marketplace/redhat-marketplace-zh2kv" Oct 07 12:25:30 crc kubenswrapper[4854]: I1007 12:25:30.161101 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7c51857-fb64-4aeb-9091-b50d3268b4ec-utilities\") pod \"redhat-marketplace-zh2kv\" (UID: \"d7c51857-fb64-4aeb-9091-b50d3268b4ec\") " pod="openshift-marketplace/redhat-marketplace-zh2kv" Oct 07 12:25:30 crc kubenswrapper[4854]: I1007 12:25:30.161430 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7c51857-fb64-4aeb-9091-b50d3268b4ec-catalog-content\") pod \"redhat-marketplace-zh2kv\" (UID: \"d7c51857-fb64-4aeb-9091-b50d3268b4ec\") " pod="openshift-marketplace/redhat-marketplace-zh2kv" Oct 07 12:25:30 crc kubenswrapper[4854]: I1007 12:25:30.178437 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwxd5\" (UniqueName: \"kubernetes.io/projected/d7c51857-fb64-4aeb-9091-b50d3268b4ec-kube-api-access-cwxd5\") pod 
\"redhat-marketplace-zh2kv\" (UID: \"d7c51857-fb64-4aeb-9091-b50d3268b4ec\") " pod="openshift-marketplace/redhat-marketplace-zh2kv" Oct 07 12:25:30 crc kubenswrapper[4854]: I1007 12:25:30.255654 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wn5m7"] Oct 07 12:25:30 crc kubenswrapper[4854]: I1007 12:25:30.285879 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-c7nrq" Oct 07 12:25:30 crc kubenswrapper[4854]: I1007 12:25:30.286558 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-c7nrq" Oct 07 12:25:30 crc kubenswrapper[4854]: I1007 12:25:30.309884 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-c7nrq" Oct 07 12:25:30 crc kubenswrapper[4854]: I1007 12:25:30.333614 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zh2kv" Oct 07 12:25:30 crc kubenswrapper[4854]: I1007 12:25:30.337103 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-44m5s"] Oct 07 12:25:30 crc kubenswrapper[4854]: W1007 12:25:30.376615 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf83bc29b_b3be_4578_bae7_d2867242278c.slice/crio-8d51c65b3bcbb17d2ecc397697168e77c67e04eaeeea2a01026039811b7925f7 WatchSource:0}: Error finding container 8d51c65b3bcbb17d2ecc397697168e77c67e04eaeeea2a01026039811b7925f7: Status 404 returned error can't find the container with id 8d51c65b3bcbb17d2ecc397697168e77c67e04eaeeea2a01026039811b7925f7 Oct 07 12:25:30 crc kubenswrapper[4854]: I1007 12:25:30.542617 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-nrwsd" Oct 07 12:25:30 crc kubenswrapper[4854]: I1007 12:25:30.543330 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-nrwsd" Oct 07 12:25:30 crc kubenswrapper[4854]: I1007 12:25:30.545642 4854 patch_prober.go:28] interesting pod/downloads-7954f5f757-64q88 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Oct 07 12:25:30 crc kubenswrapper[4854]: I1007 12:25:30.545693 4854 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-64q88" podUID="43359b2f-63d6-4e81-888e-e219dca84ec3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Oct 07 12:25:30 crc kubenswrapper[4854]: I1007 12:25:30.545776 4854 patch_prober.go:28] interesting pod/downloads-7954f5f757-64q88 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Oct 07 12:25:30 crc kubenswrapper[4854]: I1007 12:25:30.545825 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-64q88" podUID="43359b2f-63d6-4e81-888e-e219dca84ec3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: 
connection refused" Oct 07 12:25:30 crc kubenswrapper[4854]: I1007 12:25:30.554118 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-nrwsd" Oct 07 12:25:30 crc kubenswrapper[4854]: I1007 12:25:30.560953 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"c66ec6fc-5daf-4f6f-b057-80f0eb1a7584","Type":"ContainerStarted","Data":"a2e313ff5bad53e19ba45e6dbbcd61ba9edcf9a7f75f51e454e092c950e542da"} Oct 07 12:25:30 crc kubenswrapper[4854]: I1007 12:25:30.561017 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"c66ec6fc-5daf-4f6f-b057-80f0eb1a7584","Type":"ContainerStarted","Data":"baf39102dcfa7234c01ada7696984159bf8c25ab35e03620d715f79e4a3a8de8"} Oct 07 12:25:30 crc kubenswrapper[4854]: I1007 12:25:30.569040 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zst5f"] Oct 07 12:25:30 crc kubenswrapper[4854]: I1007 12:25:30.572420 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zst5f" Oct 07 12:25:30 crc kubenswrapper[4854]: I1007 12:25:30.583314 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" event={"ID":"f83bc29b-b3be-4578-bae7-d2867242278c","Type":"ContainerStarted","Data":"8d51c65b3bcbb17d2ecc397697168e77c67e04eaeeea2a01026039811b7925f7"} Oct 07 12:25:30 crc kubenswrapper[4854]: I1007 12:25:30.584482 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 07 12:25:30 crc kubenswrapper[4854]: I1007 12:25:30.588566 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zst5f"] Oct 07 12:25:30 crc kubenswrapper[4854]: I1007 12:25:30.596018 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wn5m7" event={"ID":"573bfe9e-2b5e-47dc-9d08-9bccdca9a7b8","Type":"ContainerStarted","Data":"eda3f62d6b983fbc852aad92fdb3709c93fa8d5013037eb80a4c84a7e956ae69"} Oct 07 12:25:30 crc kubenswrapper[4854]: I1007 12:25:30.600034 4854 generic.go:334] "Generic (PLEG): container finished" podID="bc3ab00d-65d3-4281-8fd5-65dda5839c8d" containerID="0c58d9c5b2b83485bb4e76c00aa37d0d526e5ee83b4a5759593efc6de8d6516d" exitCode=0 Oct 07 12:25:30 crc kubenswrapper[4854]: I1007 12:25:30.600108 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7vxvx" event={"ID":"bc3ab00d-65d3-4281-8fd5-65dda5839c8d","Type":"ContainerDied","Data":"0c58d9c5b2b83485bb4e76c00aa37d0d526e5ee83b4a5759593efc6de8d6516d"} Oct 07 12:25:30 crc kubenswrapper[4854]: I1007 12:25:30.604212 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-xhnnw" Oct 07 12:25:30 crc kubenswrapper[4854]: I1007 12:25:30.611114 4854 patch_prober.go:28] interesting pod/router-default-5444994796-xhnnw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 12:25:30 crc kubenswrapper[4854]: [-]has-synced failed: reason withheld Oct 07 12:25:30 crc kubenswrapper[4854]: [+]process-running ok Oct 07 12:25:30 crc kubenswrapper[4854]: healthz check failed Oct 07 12:25:30 crc 
kubenswrapper[4854]: I1007 12:25:30.611734 4854 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xhnnw" podUID="c1f51bac-5e94-403b-a197-bf85caf57f23" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 12:25:30 crc kubenswrapper[4854]: I1007 12:25:30.612735 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=1.6127021350000001 podStartE2EDuration="1.612702135s" podCreationTimestamp="2025-10-07 12:25:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:25:30.606015299 +0000 UTC m=+46.593847574" watchObservedRunningTime="2025-10-07 12:25:30.612702135 +0000 UTC m=+46.600534380" Oct 07 12:25:30 crc kubenswrapper[4854]: I1007 12:25:30.647087 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-xhvq4" event={"ID":"6d0a97ca-97f0-4bc3-8ee2-bf043accc158","Type":"ContainerStarted","Data":"8b33cd2af007254214c227e85a26d89404800900a473a4c2b878fcd43095f9dd"} Oct 07 12:25:30 crc kubenswrapper[4854]: I1007 12:25:30.658922 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-c7nrq" Oct 07 12:25:30 crc kubenswrapper[4854]: I1007 12:25:30.682719 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e31cf8e1-9bcc-4bf0-9189-950e81595d38-catalog-content\") pod \"redhat-operators-zst5f\" (UID: \"e31cf8e1-9bcc-4bf0-9189-950e81595d38\") " pod="openshift-marketplace/redhat-operators-zst5f" Oct 07 12:25:30 crc kubenswrapper[4854]: I1007 12:25:30.684127 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e31cf8e1-9bcc-4bf0-9189-950e81595d38-utilities\") pod \"redhat-operators-zst5f\" (UID: \"e31cf8e1-9bcc-4bf0-9189-950e81595d38\") " pod="openshift-marketplace/redhat-operators-zst5f" Oct 07 12:25:30 crc kubenswrapper[4854]: I1007 12:25:30.684333 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsps7\" (UniqueName: \"kubernetes.io/projected/e31cf8e1-9bcc-4bf0-9189-950e81595d38-kube-api-access-xsps7\") pod \"redhat-operators-zst5f\" (UID: \"e31cf8e1-9bcc-4bf0-9189-950e81595d38\") " pod="openshift-marketplace/redhat-operators-zst5f" Oct 07 12:25:30 crc kubenswrapper[4854]: I1007 12:25:30.686733 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zh2kv"] Oct 07 12:25:30 crc kubenswrapper[4854]: I1007 12:25:30.742500 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Oct 07 12:25:30 crc kubenswrapper[4854]: I1007 12:25:30.786194 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e31cf8e1-9bcc-4bf0-9189-950e81595d38-catalog-content\") pod \"redhat-operators-zst5f\" (UID: \"e31cf8e1-9bcc-4bf0-9189-950e81595d38\") " pod="openshift-marketplace/redhat-operators-zst5f" Oct 07 12:25:30 crc kubenswrapper[4854]: I1007 12:25:30.786348 4854 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e31cf8e1-9bcc-4bf0-9189-950e81595d38-utilities\") pod \"redhat-operators-zst5f\" (UID: \"e31cf8e1-9bcc-4bf0-9189-950e81595d38\") " pod="openshift-marketplace/redhat-operators-zst5f" Oct 07 12:25:30 crc kubenswrapper[4854]: I1007 12:25:30.786402 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsps7\" (UniqueName: \"kubernetes.io/projected/e31cf8e1-9bcc-4bf0-9189-950e81595d38-kube-api-access-xsps7\") pod \"redhat-operators-zst5f\" (UID: \"e31cf8e1-9bcc-4bf0-9189-950e81595d38\") " pod="openshift-marketplace/redhat-operators-zst5f" Oct 07 12:25:30 crc kubenswrapper[4854]: I1007 12:25:30.786947 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e31cf8e1-9bcc-4bf0-9189-950e81595d38-utilities\") pod \"redhat-operators-zst5f\" (UID: \"e31cf8e1-9bcc-4bf0-9189-950e81595d38\") " pod="openshift-marketplace/redhat-operators-zst5f" Oct 07 12:25:30 crc kubenswrapper[4854]: I1007 12:25:30.787278 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e31cf8e1-9bcc-4bf0-9189-950e81595d38-catalog-content\") pod \"redhat-operators-zst5f\" (UID: \"e31cf8e1-9bcc-4bf0-9189-950e81595d38\") " pod="openshift-marketplace/redhat-operators-zst5f" Oct 07 12:25:30 crc kubenswrapper[4854]: I1007 12:25:30.792838 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-xhvq4" podStartSLOduration=12.792819794 podStartE2EDuration="12.792819794s" podCreationTimestamp="2025-10-07 12:25:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:25:30.790721362 +0000 UTC m=+46.778553607" watchObservedRunningTime="2025-10-07 12:25:30.792819794 +0000 UTC m=+46.780652049" Oct 07 12:25:30 crc kubenswrapper[4854]: I1007 12:25:30.819096 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsps7\" (UniqueName: \"kubernetes.io/projected/e31cf8e1-9bcc-4bf0-9189-950e81595d38-kube-api-access-xsps7\") pod \"redhat-operators-zst5f\" (UID: \"e31cf8e1-9bcc-4bf0-9189-950e81595d38\") " pod="openshift-marketplace/redhat-operators-zst5f" Oct 07 12:25:30 crc kubenswrapper[4854]: I1007 12:25:30.952106 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zst5f" Oct 07 12:25:30 crc kubenswrapper[4854]: I1007 12:25:30.964892 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rcw26"] Oct 07 12:25:30 crc kubenswrapper[4854]: I1007 12:25:30.966049 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rcw26" Oct 07 12:25:30 crc kubenswrapper[4854]: I1007 12:25:30.978948 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-9cjsc" Oct 07 12:25:30 crc kubenswrapper[4854]: I1007 12:25:30.979380 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-9cjsc" Oct 07 12:25:30 crc kubenswrapper[4854]: I1007 12:25:30.980236 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rcw26"] Oct 07 12:25:30 crc kubenswrapper[4854]: I1007 12:25:30.981944 4854 patch_prober.go:28] interesting pod/console-f9d7485db-9cjsc container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.27:8443/health\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body= Oct 07 12:25:30 crc kubenswrapper[4854]: I1007 12:25:30.982022 4854 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-9cjsc" podUID="a5796307-74e5-4cb6-99db-b1ba95dacb54" containerName="console" probeResult="failure" output="Get \"https://10.217.0.27:8443/health\": dial tcp 10.217.0.27:8443: connect: connection refused" Oct 07 12:25:30 crc kubenswrapper[4854]: I1007 12:25:30.997942 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n49qp\" (UniqueName: \"kubernetes.io/projected/12ccfbc2-fd11-4014-80a1-ab815ea4521d-kube-api-access-n49qp\") pod \"redhat-operators-rcw26\" (UID: \"12ccfbc2-fd11-4014-80a1-ab815ea4521d\") " pod="openshift-marketplace/redhat-operators-rcw26" Oct 07 12:25:30 crc kubenswrapper[4854]: I1007 12:25:30.998016 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12ccfbc2-fd11-4014-80a1-ab815ea4521d-utilities\") pod \"redhat-operators-rcw26\" (UID: \"12ccfbc2-fd11-4014-80a1-ab815ea4521d\") " pod="openshift-marketplace/redhat-operators-rcw26" Oct 07 12:25:30 crc kubenswrapper[4854]: I1007 12:25:30.998046 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12ccfbc2-fd11-4014-80a1-ab815ea4521d-catalog-content\") pod \"redhat-operators-rcw26\" (UID: \"12ccfbc2-fd11-4014-80a1-ab815ea4521d\") " pod="openshift-marketplace/redhat-operators-rcw26" Oct 07 12:25:31 crc kubenswrapper[4854]: I1007 12:25:31.100777 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12ccfbc2-fd11-4014-80a1-ab815ea4521d-utilities\") pod \"redhat-operators-rcw26\" (UID: \"12ccfbc2-fd11-4014-80a1-ab815ea4521d\") " pod="openshift-marketplace/redhat-operators-rcw26" Oct 07 12:25:31 crc kubenswrapper[4854]: I1007 12:25:31.100853 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12ccfbc2-fd11-4014-80a1-ab815ea4521d-catalog-content\") pod \"redhat-operators-rcw26\" (UID: \"12ccfbc2-fd11-4014-80a1-ab815ea4521d\") " pod="openshift-marketplace/redhat-operators-rcw26" Oct 07 12:25:31 crc kubenswrapper[4854]: I1007 12:25:31.100946 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n49qp\" (UniqueName: 
\"kubernetes.io/projected/12ccfbc2-fd11-4014-80a1-ab815ea4521d-kube-api-access-n49qp\") pod \"redhat-operators-rcw26\" (UID: \"12ccfbc2-fd11-4014-80a1-ab815ea4521d\") " pod="openshift-marketplace/redhat-operators-rcw26" Oct 07 12:25:31 crc kubenswrapper[4854]: I1007 12:25:31.101569 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12ccfbc2-fd11-4014-80a1-ab815ea4521d-catalog-content\") pod \"redhat-operators-rcw26\" (UID: \"12ccfbc2-fd11-4014-80a1-ab815ea4521d\") " pod="openshift-marketplace/redhat-operators-rcw26" Oct 07 12:25:31 crc kubenswrapper[4854]: I1007 12:25:31.101817 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12ccfbc2-fd11-4014-80a1-ab815ea4521d-utilities\") pod \"redhat-operators-rcw26\" (UID: \"12ccfbc2-fd11-4014-80a1-ab815ea4521d\") " pod="openshift-marketplace/redhat-operators-rcw26" Oct 07 12:25:31 crc kubenswrapper[4854]: I1007 12:25:31.119699 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n49qp\" (UniqueName: \"kubernetes.io/projected/12ccfbc2-fd11-4014-80a1-ab815ea4521d-kube-api-access-n49qp\") pod \"redhat-operators-rcw26\" (UID: \"12ccfbc2-fd11-4014-80a1-ab815ea4521d\") " pod="openshift-marketplace/redhat-operators-rcw26" Oct 07 12:25:31 crc kubenswrapper[4854]: I1007 12:25:31.247439 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:31 crc kubenswrapper[4854]: I1007 12:25:31.247661 4854 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 07 12:25:31 crc kubenswrapper[4854]: I1007 12:25:31.303621 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rcw26" Oct 07 12:25:31 crc kubenswrapper[4854]: E1007 12:25:31.313491 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f3648a5c341ce89898f7d66f52949e8fd83866185edfb80a8936501d40f83cee" cmd=["/bin/bash","-c","test -f /ready/ready"] Oct 07 12:25:31 crc kubenswrapper[4854]: I1007 12:25:31.316448 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:25:31 crc kubenswrapper[4854]: E1007 12:25:31.320282 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f3648a5c341ce89898f7d66f52949e8fd83866185edfb80a8936501d40f83cee" cmd=["/bin/bash","-c","test -f /ready/ready"] Oct 07 12:25:31 crc kubenswrapper[4854]: E1007 12:25:31.322117 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f3648a5c341ce89898f7d66f52949e8fd83866185edfb80a8936501d40f83cee" cmd=["/bin/bash","-c","test -f /ready/ready"] Oct 07 12:25:31 crc kubenswrapper[4854]: E1007 12:25:31.322185 4854 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-vwn4j" podUID="3e06c8ed-2c8c-4c09-a352-31432fbd7d40" containerName="kube-multus-additional-cni-plugins" Oct 07 12:25:31 crc kubenswrapper[4854]: I1007 12:25:31.375078 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zst5f"] Oct 07 12:25:31 crc kubenswrapper[4854]: I1007 12:25:31.405384 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-nkh4b" Oct 07 12:25:31 crc kubenswrapper[4854]: W1007 12:25:31.428038 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode31cf8e1_9bcc_4bf0_9189_950e81595d38.slice/crio-c492240cad7e765ce2fe16a4b023e385ba4cf0dfde1fa0466ec057dbe587b061 WatchSource:0}: Error finding container c492240cad7e765ce2fe16a4b023e385ba4cf0dfde1fa0466ec057dbe587b061: Status 404 returned error can't find the container with id c492240cad7e765ce2fe16a4b023e385ba4cf0dfde1fa0466ec057dbe587b061 Oct 07 12:25:31 crc kubenswrapper[4854]: I1007 12:25:31.615834 4854 patch_prober.go:28] interesting pod/router-default-5444994796-xhnnw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 12:25:31 crc kubenswrapper[4854]: [-]has-synced failed: reason withheld Oct 07 12:25:31 crc kubenswrapper[4854]: [+]process-running ok Oct 07 12:25:31 crc kubenswrapper[4854]: healthz check failed Oct 07 12:25:31 crc kubenswrapper[4854]: I1007 12:25:31.615897 4854 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xhnnw" podUID="c1f51bac-5e94-403b-a197-bf85caf57f23" containerName="router" probeResult="failure" output="HTTP probe failed with 
statuscode: 500" Oct 07 12:25:31 crc kubenswrapper[4854]: I1007 12:25:31.695652 4854 generic.go:334] "Generic (PLEG): container finished" podID="573bfe9e-2b5e-47dc-9d08-9bccdca9a7b8" containerID="7d444d8cc24a4a36c65b7e6c8ddc040ce526c4fa501eec3a85478dd346f2542d" exitCode=0 Oct 07 12:25:31 crc kubenswrapper[4854]: I1007 12:25:31.695785 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wn5m7" event={"ID":"573bfe9e-2b5e-47dc-9d08-9bccdca9a7b8","Type":"ContainerDied","Data":"7d444d8cc24a4a36c65b7e6c8ddc040ce526c4fa501eec3a85478dd346f2542d"} Oct 07 12:25:31 crc kubenswrapper[4854]: I1007 12:25:31.717690 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rcw26"] Oct 07 12:25:31 crc kubenswrapper[4854]: I1007 12:25:31.733784 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"c66ec6fc-5daf-4f6f-b057-80f0eb1a7584","Type":"ContainerDied","Data":"a2e313ff5bad53e19ba45e6dbbcd61ba9edcf9a7f75f51e454e092c950e542da"} Oct 07 12:25:31 crc kubenswrapper[4854]: I1007 12:25:31.736327 4854 generic.go:334] "Generic (PLEG): container finished" podID="c66ec6fc-5daf-4f6f-b057-80f0eb1a7584" containerID="a2e313ff5bad53e19ba45e6dbbcd61ba9edcf9a7f75f51e454e092c950e542da" exitCode=0 Oct 07 12:25:31 crc kubenswrapper[4854]: I1007 12:25:31.749406 4854 generic.go:334] "Generic (PLEG): container finished" podID="d7c51857-fb64-4aeb-9091-b50d3268b4ec" containerID="bb8802c8d410a115d053ddb00a9c55b7ecdd50ae1d6e2eef305df5f851f01606" exitCode=0 Oct 07 12:25:31 crc kubenswrapper[4854]: I1007 12:25:31.749505 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zh2kv" event={"ID":"d7c51857-fb64-4aeb-9091-b50d3268b4ec","Type":"ContainerDied","Data":"bb8802c8d410a115d053ddb00a9c55b7ecdd50ae1d6e2eef305df5f851f01606"} Oct 07 12:25:31 crc kubenswrapper[4854]: I1007 12:25:31.749541 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zh2kv" event={"ID":"d7c51857-fb64-4aeb-9091-b50d3268b4ec","Type":"ContainerStarted","Data":"6ee9d784e9d650fb7f3e6a618d515ffa40812240935e1c55845d6273c2ea224a"} Oct 07 12:25:31 crc kubenswrapper[4854]: I1007 12:25:31.775039 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" event={"ID":"f83bc29b-b3be-4578-bae7-d2867242278c","Type":"ContainerStarted","Data":"15aa6a910972caaf604e16d798f8c368f58a51c9265446916463df8595e8c834"} Oct 07 12:25:31 crc kubenswrapper[4854]: I1007 12:25:31.775538 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:31 crc kubenswrapper[4854]: I1007 12:25:31.790101 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zst5f" event={"ID":"e31cf8e1-9bcc-4bf0-9189-950e81595d38","Type":"ContainerStarted","Data":"c492240cad7e765ce2fe16a4b023e385ba4cf0dfde1fa0466ec057dbe587b061"} Oct 07 12:25:31 crc kubenswrapper[4854]: I1007 12:25:31.798418 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-nrwsd" Oct 07 12:25:31 crc kubenswrapper[4854]: I1007 12:25:31.825663 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" podStartSLOduration=26.825640874 podStartE2EDuration="26.825640874s" 
podCreationTimestamp="2025-10-07 12:25:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:25:31.82070248 +0000 UTC m=+47.808534735" watchObservedRunningTime="2025-10-07 12:25:31.825640874 +0000 UTC m=+47.813473129" Oct 07 12:25:32 crc kubenswrapper[4854]: I1007 12:25:32.232105 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 07 12:25:32 crc kubenswrapper[4854]: I1007 12:25:32.233536 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 12:25:32 crc kubenswrapper[4854]: I1007 12:25:32.248441 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Oct 07 12:25:32 crc kubenswrapper[4854]: I1007 12:25:32.250302 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Oct 07 12:25:32 crc kubenswrapper[4854]: I1007 12:25:32.254727 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 07 12:25:32 crc kubenswrapper[4854]: I1007 12:25:32.340138 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a57c2262-320e-419b-97cb-47bc43675fe7-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"a57c2262-320e-419b-97cb-47bc43675fe7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 12:25:32 crc kubenswrapper[4854]: I1007 12:25:32.340346 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a57c2262-320e-419b-97cb-47bc43675fe7-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"a57c2262-320e-419b-97cb-47bc43675fe7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 12:25:32 crc kubenswrapper[4854]: I1007 12:25:32.442084 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a57c2262-320e-419b-97cb-47bc43675fe7-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"a57c2262-320e-419b-97cb-47bc43675fe7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 12:25:32 crc kubenswrapper[4854]: I1007 12:25:32.442167 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a57c2262-320e-419b-97cb-47bc43675fe7-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"a57c2262-320e-419b-97cb-47bc43675fe7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 12:25:32 crc kubenswrapper[4854]: I1007 12:25:32.442281 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a57c2262-320e-419b-97cb-47bc43675fe7-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"a57c2262-320e-419b-97cb-47bc43675fe7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 12:25:32 crc kubenswrapper[4854]: I1007 12:25:32.474158 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a57c2262-320e-419b-97cb-47bc43675fe7-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"a57c2262-320e-419b-97cb-47bc43675fe7\") " 
pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 12:25:32 crc kubenswrapper[4854]: I1007 12:25:32.607693 4854 patch_prober.go:28] interesting pod/router-default-5444994796-xhnnw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 12:25:32 crc kubenswrapper[4854]: [-]has-synced failed: reason withheld Oct 07 12:25:32 crc kubenswrapper[4854]: [+]process-running ok Oct 07 12:25:32 crc kubenswrapper[4854]: healthz check failed Oct 07 12:25:32 crc kubenswrapper[4854]: I1007 12:25:32.607773 4854 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xhnnw" podUID="c1f51bac-5e94-403b-a197-bf85caf57f23" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 12:25:32 crc kubenswrapper[4854]: I1007 12:25:32.678136 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 12:25:32 crc kubenswrapper[4854]: I1007 12:25:32.830290 4854 generic.go:334] "Generic (PLEG): container finished" podID="e31cf8e1-9bcc-4bf0-9189-950e81595d38" containerID="d10aa26c8cb4dc0a99da5a7b94b543cf071ce1012b41962157f4f22f89263455" exitCode=0 Oct 07 12:25:32 crc kubenswrapper[4854]: I1007 12:25:32.830383 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zst5f" event={"ID":"e31cf8e1-9bcc-4bf0-9189-950e81595d38","Type":"ContainerDied","Data":"d10aa26c8cb4dc0a99da5a7b94b543cf071ce1012b41962157f4f22f89263455"} Oct 07 12:25:32 crc kubenswrapper[4854]: I1007 12:25:32.852025 4854 generic.go:334] "Generic (PLEG): container finished" podID="12ccfbc2-fd11-4014-80a1-ab815ea4521d" containerID="11a9875e79dccf68c2165292dfdf0be25eee1790704cf52bb7283c3564cdf236" exitCode=0 Oct 07 12:25:32 crc kubenswrapper[4854]: I1007 12:25:32.853220 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rcw26" event={"ID":"12ccfbc2-fd11-4014-80a1-ab815ea4521d","Type":"ContainerDied","Data":"11a9875e79dccf68c2165292dfdf0be25eee1790704cf52bb7283c3564cdf236"} Oct 07 12:25:32 crc kubenswrapper[4854]: I1007 12:25:32.853256 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rcw26" event={"ID":"12ccfbc2-fd11-4014-80a1-ab815ea4521d","Type":"ContainerStarted","Data":"4c015d34fd8f483a4c85e1ed5be6e4ae452407aec48a9ede3d1fa9348ce37069"} Oct 07 12:25:33 crc kubenswrapper[4854]: I1007 12:25:33.216297 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 12:25:33 crc kubenswrapper[4854]: I1007 12:25:33.282858 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 07 12:25:33 crc kubenswrapper[4854]: I1007 12:25:33.368661 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c66ec6fc-5daf-4f6f-b057-80f0eb1a7584-kube-api-access\") pod \"c66ec6fc-5daf-4f6f-b057-80f0eb1a7584\" (UID: \"c66ec6fc-5daf-4f6f-b057-80f0eb1a7584\") " Oct 07 12:25:33 crc kubenswrapper[4854]: I1007 12:25:33.368730 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c66ec6fc-5daf-4f6f-b057-80f0eb1a7584-kubelet-dir\") pod \"c66ec6fc-5daf-4f6f-b057-80f0eb1a7584\" (UID: \"c66ec6fc-5daf-4f6f-b057-80f0eb1a7584\") " Oct 07 12:25:33 crc kubenswrapper[4854]: I1007 12:25:33.369556 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c66ec6fc-5daf-4f6f-b057-80f0eb1a7584-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "c66ec6fc-5daf-4f6f-b057-80f0eb1a7584" (UID: "c66ec6fc-5daf-4f6f-b057-80f0eb1a7584"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 12:25:33 crc kubenswrapper[4854]: I1007 12:25:33.384964 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c66ec6fc-5daf-4f6f-b057-80f0eb1a7584-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c66ec6fc-5daf-4f6f-b057-80f0eb1a7584" (UID: "c66ec6fc-5daf-4f6f-b057-80f0eb1a7584"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:25:33 crc kubenswrapper[4854]: I1007 12:25:33.474471 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c66ec6fc-5daf-4f6f-b057-80f0eb1a7584-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:33 crc kubenswrapper[4854]: I1007 12:25:33.474524 4854 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c66ec6fc-5daf-4f6f-b057-80f0eb1a7584-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:33 crc kubenswrapper[4854]: I1007 12:25:33.611457 4854 patch_prober.go:28] interesting pod/router-default-5444994796-xhnnw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 12:25:33 crc kubenswrapper[4854]: [-]has-synced failed: reason withheld Oct 07 12:25:33 crc kubenswrapper[4854]: [+]process-running ok Oct 07 12:25:33 crc kubenswrapper[4854]: healthz check failed Oct 07 12:25:33 crc kubenswrapper[4854]: I1007 12:25:33.611512 4854 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xhnnw" podUID="c1f51bac-5e94-403b-a197-bf85caf57f23" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 12:25:33 crc kubenswrapper[4854]: I1007 12:25:33.877648 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 12:25:33 crc kubenswrapper[4854]: I1007 12:25:33.877749 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"c66ec6fc-5daf-4f6f-b057-80f0eb1a7584","Type":"ContainerDied","Data":"baf39102dcfa7234c01ada7696984159bf8c25ab35e03620d715f79e4a3a8de8"} Oct 07 12:25:33 crc kubenswrapper[4854]: I1007 12:25:33.878796 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="baf39102dcfa7234c01ada7696984159bf8c25ab35e03620d715f79e4a3a8de8" Oct 07 12:25:33 crc kubenswrapper[4854]: I1007 12:25:33.900124 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"a57c2262-320e-419b-97cb-47bc43675fe7","Type":"ContainerStarted","Data":"1edab420a0b805084c1342c4227605bdcc78006130898b08f07a184c2443400c"} Oct 07 12:25:34 crc kubenswrapper[4854]: I1007 12:25:34.610678 4854 patch_prober.go:28] interesting pod/router-default-5444994796-xhnnw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 12:25:34 crc kubenswrapper[4854]: [-]has-synced failed: reason withheld Oct 07 12:25:34 crc kubenswrapper[4854]: [+]process-running ok Oct 07 12:25:34 crc kubenswrapper[4854]: healthz check failed Oct 07 12:25:34 crc kubenswrapper[4854]: I1007 12:25:34.610917 4854 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xhnnw" podUID="c1f51bac-5e94-403b-a197-bf85caf57f23" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 12:25:34 crc kubenswrapper[4854]: I1007 12:25:34.947586 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"a57c2262-320e-419b-97cb-47bc43675fe7","Type":"ContainerStarted","Data":"16b4476256601dd490054ef7eeee6ee5720ac91bd95a9b24beb31252f5b8de7c"} Oct 07 12:25:34 crc kubenswrapper[4854]: I1007 12:25:34.967067 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.967048464 podStartE2EDuration="2.967048464s" podCreationTimestamp="2025-10-07 12:25:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:25:34.964061047 +0000 UTC m=+50.951893302" watchObservedRunningTime="2025-10-07 12:25:34.967048464 +0000 UTC m=+50.954880729" Oct 07 12:25:35 crc kubenswrapper[4854]: E1007 12:25:35.272765 4854 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-poda57c2262_320e_419b_97cb_47bc43675fe7.slice/crio-conmon-16b4476256601dd490054ef7eeee6ee5720ac91bd95a9b24beb31252f5b8de7c.scope\": RecentStats: unable to find data in memory cache]" Oct 07 12:25:35 crc kubenswrapper[4854]: I1007 12:25:35.607633 4854 patch_prober.go:28] interesting pod/router-default-5444994796-xhnnw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 12:25:35 crc kubenswrapper[4854]: [-]has-synced failed: reason withheld Oct 07 12:25:35 crc kubenswrapper[4854]: [+]process-running ok Oct 07 12:25:35 crc 
kubenswrapper[4854]: healthz check failed Oct 07 12:25:35 crc kubenswrapper[4854]: I1007 12:25:35.607701 4854 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xhnnw" podUID="c1f51bac-5e94-403b-a197-bf85caf57f23" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 12:25:35 crc kubenswrapper[4854]: I1007 12:25:35.934470 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-t99kd" Oct 07 12:25:35 crc kubenswrapper[4854]: I1007 12:25:35.969764 4854 generic.go:334] "Generic (PLEG): container finished" podID="a57c2262-320e-419b-97cb-47bc43675fe7" containerID="16b4476256601dd490054ef7eeee6ee5720ac91bd95a9b24beb31252f5b8de7c" exitCode=0 Oct 07 12:25:35 crc kubenswrapper[4854]: I1007 12:25:35.969816 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"a57c2262-320e-419b-97cb-47bc43675fe7","Type":"ContainerDied","Data":"16b4476256601dd490054ef7eeee6ee5720ac91bd95a9b24beb31252f5b8de7c"} Oct 07 12:25:36 crc kubenswrapper[4854]: I1007 12:25:36.525360 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 12:25:36 crc kubenswrapper[4854]: I1007 12:25:36.544050 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 07 12:25:36 crc kubenswrapper[4854]: I1007 12:25:36.609012 4854 patch_prober.go:28] interesting pod/router-default-5444994796-xhnnw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 12:25:36 crc kubenswrapper[4854]: [-]has-synced failed: reason withheld Oct 07 12:25:36 crc kubenswrapper[4854]: [+]process-running ok Oct 07 12:25:36 crc kubenswrapper[4854]: healthz check failed Oct 07 12:25:36 crc kubenswrapper[4854]: I1007 12:25:36.609097 4854 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xhnnw" podUID="c1f51bac-5e94-403b-a197-bf85caf57f23" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 12:25:37 crc kubenswrapper[4854]: I1007 12:25:37.427806 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 12:25:37 crc kubenswrapper[4854]: I1007 12:25:37.480012 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=1.479987691 podStartE2EDuration="1.479987691s" podCreationTimestamp="2025-10-07 12:25:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:25:37.473284705 +0000 UTC m=+53.461116960" watchObservedRunningTime="2025-10-07 12:25:37.479987691 +0000 UTC m=+53.467819936" Oct 07 12:25:37 crc kubenswrapper[4854]: I1007 12:25:37.577894 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a57c2262-320e-419b-97cb-47bc43675fe7-kube-api-access\") pod \"a57c2262-320e-419b-97cb-47bc43675fe7\" (UID: \"a57c2262-320e-419b-97cb-47bc43675fe7\") " Oct 07 12:25:37 crc kubenswrapper[4854]: I1007 12:25:37.578043 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a57c2262-320e-419b-97cb-47bc43675fe7-kubelet-dir\") pod \"a57c2262-320e-419b-97cb-47bc43675fe7\" (UID: \"a57c2262-320e-419b-97cb-47bc43675fe7\") " Oct 07 12:25:37 crc kubenswrapper[4854]: I1007 12:25:37.578208 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a57c2262-320e-419b-97cb-47bc43675fe7-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a57c2262-320e-419b-97cb-47bc43675fe7" (UID: "a57c2262-320e-419b-97cb-47bc43675fe7"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 12:25:37 crc kubenswrapper[4854]: I1007 12:25:37.578433 4854 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a57c2262-320e-419b-97cb-47bc43675fe7-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:37 crc kubenswrapper[4854]: I1007 12:25:37.586883 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a57c2262-320e-419b-97cb-47bc43675fe7-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a57c2262-320e-419b-97cb-47bc43675fe7" (UID: "a57c2262-320e-419b-97cb-47bc43675fe7"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:25:37 crc kubenswrapper[4854]: I1007 12:25:37.608544 4854 patch_prober.go:28] interesting pod/router-default-5444994796-xhnnw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 12:25:37 crc kubenswrapper[4854]: [-]has-synced failed: reason withheld Oct 07 12:25:37 crc kubenswrapper[4854]: [+]process-running ok Oct 07 12:25:37 crc kubenswrapper[4854]: healthz check failed Oct 07 12:25:37 crc kubenswrapper[4854]: I1007 12:25:37.608700 4854 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xhnnw" podUID="c1f51bac-5e94-403b-a197-bf85caf57f23" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 12:25:37 crc kubenswrapper[4854]: I1007 12:25:37.679682 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a57c2262-320e-419b-97cb-47bc43675fe7-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 07 12:25:38 crc kubenswrapper[4854]: I1007 12:25:38.049193 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"a57c2262-320e-419b-97cb-47bc43675fe7","Type":"ContainerDied","Data":"1edab420a0b805084c1342c4227605bdcc78006130898b08f07a184c2443400c"} Oct 07 12:25:38 crc kubenswrapper[4854]: I1007 12:25:38.049550 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1edab420a0b805084c1342c4227605bdcc78006130898b08f07a184c2443400c" Oct 07 12:25:38 crc kubenswrapper[4854]: I1007 12:25:38.049342 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 12:25:38 crc kubenswrapper[4854]: I1007 12:25:38.609405 4854 patch_prober.go:28] interesting pod/router-default-5444994796-xhnnw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 12:25:38 crc kubenswrapper[4854]: [-]has-synced failed: reason withheld Oct 07 12:25:38 crc kubenswrapper[4854]: [+]process-running ok Oct 07 12:25:38 crc kubenswrapper[4854]: healthz check failed Oct 07 12:25:38 crc kubenswrapper[4854]: I1007 12:25:38.609495 4854 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xhnnw" podUID="c1f51bac-5e94-403b-a197-bf85caf57f23" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 12:25:39 crc kubenswrapper[4854]: I1007 12:25:39.609650 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-xhnnw" Oct 07 12:25:39 crc kubenswrapper[4854]: I1007 12:25:39.615520 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-xhnnw" Oct 07 12:25:40 crc kubenswrapper[4854]: I1007 12:25:40.552130 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-64q88" Oct 07 12:25:40 crc kubenswrapper[4854]: I1007 12:25:40.983762 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-9cjsc" Oct 07 12:25:40 crc kubenswrapper[4854]: I1007 12:25:40.988673 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-9cjsc" Oct 07 12:25:41 crc kubenswrapper[4854]: E1007 12:25:41.312175 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f3648a5c341ce89898f7d66f52949e8fd83866185edfb80a8936501d40f83cee" cmd=["/bin/bash","-c","test -f /ready/ready"] Oct 07 12:25:41 crc kubenswrapper[4854]: E1007 12:25:41.314021 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f3648a5c341ce89898f7d66f52949e8fd83866185edfb80a8936501d40f83cee" cmd=["/bin/bash","-c","test -f /ready/ready"] Oct 07 12:25:41 crc kubenswrapper[4854]: E1007 12:25:41.316277 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f3648a5c341ce89898f7d66f52949e8fd83866185edfb80a8936501d40f83cee" cmd=["/bin/bash","-c","test -f /ready/ready"] Oct 07 12:25:41 crc kubenswrapper[4854]: E1007 12:25:41.316383 4854 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-vwn4j" podUID="3e06c8ed-2c8c-4c09-a352-31432fbd7d40" containerName="kube-multus-additional-cni-plugins" Oct 07 12:25:43 crc kubenswrapper[4854]: I1007 12:25:43.857005 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 12:25:49 crc kubenswrapper[4854]: I1007 12:25:49.992678 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:25:51 crc kubenswrapper[4854]: E1007 12:25:51.312423 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f3648a5c341ce89898f7d66f52949e8fd83866185edfb80a8936501d40f83cee" cmd=["/bin/bash","-c","test -f /ready/ready"] Oct 07 12:25:51 crc kubenswrapper[4854]: E1007 12:25:51.314159 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f3648a5c341ce89898f7d66f52949e8fd83866185edfb80a8936501d40f83cee" cmd=["/bin/bash","-c","test -f /ready/ready"] Oct 07 12:25:51 crc kubenswrapper[4854]: E1007 12:25:51.315388 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f3648a5c341ce89898f7d66f52949e8fd83866185edfb80a8936501d40f83cee" cmd=["/bin/bash","-c","test -f /ready/ready"] Oct 07 12:25:51 crc kubenswrapper[4854]: E1007 12:25:51.315424 4854 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-vwn4j" podUID="3e06c8ed-2c8c-4c09-a352-31432fbd7d40" containerName="kube-multus-additional-cni-plugins" Oct 07 12:25:55 crc kubenswrapper[4854]: I1007 12:25:55.722198 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Oct 07 12:25:58 crc kubenswrapper[4854]: I1007 12:25:58.187570 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-vwn4j_3e06c8ed-2c8c-4c09-a352-31432fbd7d40/kube-multus-additional-cni-plugins/0.log" Oct 07 12:25:58 crc kubenswrapper[4854]: I1007 12:25:58.187987 4854 generic.go:334] "Generic (PLEG): container finished" podID="3e06c8ed-2c8c-4c09-a352-31432fbd7d40" containerID="f3648a5c341ce89898f7d66f52949e8fd83866185edfb80a8936501d40f83cee" exitCode=137 Oct 07 12:25:58 crc kubenswrapper[4854]: I1007 12:25:58.188025 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-vwn4j" event={"ID":"3e06c8ed-2c8c-4c09-a352-31432fbd7d40","Type":"ContainerDied","Data":"f3648a5c341ce89898f7d66f52949e8fd83866185edfb80a8936501d40f83cee"} Oct 07 12:26:00 crc kubenswrapper[4854]: I1007 12:26:00.688683 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 12:26:00 crc kubenswrapper[4854]: I1007 12:26:00.734793 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=5.734759836 podStartE2EDuration="5.734759836s" podCreationTimestamp="2025-10-07 12:25:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:26:00.731488382 +0000 UTC m=+76.719320647" 
watchObservedRunningTime="2025-10-07 12:26:00.734759836 +0000 UTC m=+76.722592091" Oct 07 12:26:01 crc kubenswrapper[4854]: I1007 12:26:01.238748 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rv6rb" Oct 07 12:26:01 crc kubenswrapper[4854]: E1007 12:26:01.309714 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f3648a5c341ce89898f7d66f52949e8fd83866185edfb80a8936501d40f83cee is running failed: container process not found" containerID="f3648a5c341ce89898f7d66f52949e8fd83866185edfb80a8936501d40f83cee" cmd=["/bin/bash","-c","test -f /ready/ready"] Oct 07 12:26:01 crc kubenswrapper[4854]: E1007 12:26:01.310426 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f3648a5c341ce89898f7d66f52949e8fd83866185edfb80a8936501d40f83cee is running failed: container process not found" containerID="f3648a5c341ce89898f7d66f52949e8fd83866185edfb80a8936501d40f83cee" cmd=["/bin/bash","-c","test -f /ready/ready"] Oct 07 12:26:01 crc kubenswrapper[4854]: E1007 12:26:01.311011 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f3648a5c341ce89898f7d66f52949e8fd83866185edfb80a8936501d40f83cee is running failed: container process not found" containerID="f3648a5c341ce89898f7d66f52949e8fd83866185edfb80a8936501d40f83cee" cmd=["/bin/bash","-c","test -f /ready/ready"] Oct 07 12:26:01 crc kubenswrapper[4854]: E1007 12:26:01.311088 4854 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f3648a5c341ce89898f7d66f52949e8fd83866185edfb80a8936501d40f83cee is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-vwn4j" podUID="3e06c8ed-2c8c-4c09-a352-31432fbd7d40" containerName="kube-multus-additional-cni-plugins" Oct 07 12:26:03 crc kubenswrapper[4854]: E1007 12:26:03.006350 4854 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 07 12:26:03 crc kubenswrapper[4854]: E1007 12:26:03.006887 4854 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-n49qp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-rcw26_openshift-marketplace(12ccfbc2-fd11-4014-80a1-ab815ea4521d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 07 12:26:03 crc kubenswrapper[4854]: E1007 12:26:03.008074 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-rcw26" podUID="12ccfbc2-fd11-4014-80a1-ab815ea4521d" Oct 07 12:26:04 crc kubenswrapper[4854]: E1007 12:26:04.062534 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-rcw26" podUID="12ccfbc2-fd11-4014-80a1-ab815ea4521d" Oct 07 12:26:05 crc kubenswrapper[4854]: E1007 12:26:05.359119 4854 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 07 12:26:05 crc kubenswrapper[4854]: E1007 12:26:05.359359 4854 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xsps7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-zst5f_openshift-marketplace(e31cf8e1-9bcc-4bf0-9189-950e81595d38): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 07 12:26:05 crc kubenswrapper[4854]: E1007 12:26:05.360538 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-zst5f" podUID="e31cf8e1-9bcc-4bf0-9189-950e81595d38" Oct 07 12:26:07 crc kubenswrapper[4854]: E1007 12:26:07.282809 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-zst5f" podUID="e31cf8e1-9bcc-4bf0-9189-950e81595d38" Oct 07 12:26:07 crc kubenswrapper[4854]: E1007 12:26:07.367843 4854 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 07 12:26:07 crc kubenswrapper[4854]: E1007 12:26:07.368485 4854 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2mnxx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-gnz24_openshift-marketplace(f1de49b7-71c6-450c-ab89-0c37fb18b32a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 07 12:26:07 crc kubenswrapper[4854]: E1007 12:26:07.369912 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-gnz24" podUID="f1de49b7-71c6-450c-ab89-0c37fb18b32a" Oct 07 12:26:08 crc kubenswrapper[4854]: E1007 12:26:08.703895 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-gnz24" podUID="f1de49b7-71c6-450c-ab89-0c37fb18b32a" Oct 07 12:26:08 crc kubenswrapper[4854]: E1007 12:26:08.784779 4854 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 07 12:26:08 crc kubenswrapper[4854]: E1007 12:26:08.785297 4854 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6wxz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-7s59d_openshift-marketplace(87c2c08e-dd9a-47b9-8e82-5419fdb6cda8): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 07 12:26:08 crc kubenswrapper[4854]: E1007 12:26:08.786452 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-7s59d" podUID="87c2c08e-dd9a-47b9-8e82-5419fdb6cda8" Oct 07 12:26:11 crc kubenswrapper[4854]: E1007 12:26:11.309102 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f3648a5c341ce89898f7d66f52949e8fd83866185edfb80a8936501d40f83cee is running failed: container process not found" containerID="f3648a5c341ce89898f7d66f52949e8fd83866185edfb80a8936501d40f83cee" cmd=["/bin/bash","-c","test -f /ready/ready"] Oct 07 12:26:11 crc kubenswrapper[4854]: E1007 12:26:11.310571 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f3648a5c341ce89898f7d66f52949e8fd83866185edfb80a8936501d40f83cee is running failed: container process not found" containerID="f3648a5c341ce89898f7d66f52949e8fd83866185edfb80a8936501d40f83cee" cmd=["/bin/bash","-c","test -f /ready/ready"] Oct 07 12:26:11 crc kubenswrapper[4854]: E1007 12:26:11.310958 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f3648a5c341ce89898f7d66f52949e8fd83866185edfb80a8936501d40f83cee is running failed: container process not found" containerID="f3648a5c341ce89898f7d66f52949e8fd83866185edfb80a8936501d40f83cee" cmd=["/bin/bash","-c","test -f /ready/ready"] Oct 07 12:26:11 crc kubenswrapper[4854]: E1007 12:26:11.310991 4854 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
f3648a5c341ce89898f7d66f52949e8fd83866185edfb80a8936501d40f83cee is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-vwn4j" podUID="3e06c8ed-2c8c-4c09-a352-31432fbd7d40" containerName="kube-multus-additional-cni-plugins" Oct 07 12:26:11 crc kubenswrapper[4854]: E1007 12:26:11.579612 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-7s59d" podUID="87c2c08e-dd9a-47b9-8e82-5419fdb6cda8" Oct 07 12:26:11 crc kubenswrapper[4854]: I1007 12:26:11.622900 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-vwn4j_3e06c8ed-2c8c-4c09-a352-31432fbd7d40/kube-multus-additional-cni-plugins/0.log" Oct 07 12:26:11 crc kubenswrapper[4854]: I1007 12:26:11.623034 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-vwn4j" Oct 07 12:26:11 crc kubenswrapper[4854]: E1007 12:26:11.654661 4854 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 07 12:26:11 crc kubenswrapper[4854]: E1007 12:26:11.654908 4854 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-k6z89,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-7vxvx_openshift-marketplace(bc3ab00d-65d3-4281-8fd5-65dda5839c8d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 07 12:26:11 crc kubenswrapper[4854]: E1007 12:26:11.656239 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc 
error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-7vxvx" podUID="bc3ab00d-65d3-4281-8fd5-65dda5839c8d" Oct 07 12:26:11 crc kubenswrapper[4854]: E1007 12:26:11.687191 4854 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 07 12:26:11 crc kubenswrapper[4854]: E1007 12:26:11.687401 4854 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sdnwf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-66kwk_openshift-marketplace(f2519a25-7584-44fa-8751-eb0160c6eb14): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 07 12:26:11 crc kubenswrapper[4854]: E1007 12:26:11.688656 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-66kwk" podUID="f2519a25-7584-44fa-8751-eb0160c6eb14" Oct 07 12:26:11 crc kubenswrapper[4854]: I1007 12:26:11.753643 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3e06c8ed-2c8c-4c09-a352-31432fbd7d40-tuning-conf-dir\") pod \"3e06c8ed-2c8c-4c09-a352-31432fbd7d40\" (UID: \"3e06c8ed-2c8c-4c09-a352-31432fbd7d40\") " Oct 07 12:26:11 crc kubenswrapper[4854]: I1007 12:26:11.753732 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mtd4\" (UniqueName: \"kubernetes.io/projected/3e06c8ed-2c8c-4c09-a352-31432fbd7d40-kube-api-access-4mtd4\") pod \"3e06c8ed-2c8c-4c09-a352-31432fbd7d40\" (UID: 
\"3e06c8ed-2c8c-4c09-a352-31432fbd7d40\") " Oct 07 12:26:11 crc kubenswrapper[4854]: I1007 12:26:11.753795 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3e06c8ed-2c8c-4c09-a352-31432fbd7d40-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "3e06c8ed-2c8c-4c09-a352-31432fbd7d40" (UID: "3e06c8ed-2c8c-4c09-a352-31432fbd7d40"). InnerVolumeSpecName "tuning-conf-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 12:26:11 crc kubenswrapper[4854]: I1007 12:26:11.754176 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/3e06c8ed-2c8c-4c09-a352-31432fbd7d40-ready\") pod \"3e06c8ed-2c8c-4c09-a352-31432fbd7d40\" (UID: \"3e06c8ed-2c8c-4c09-a352-31432fbd7d40\") " Oct 07 12:26:11 crc kubenswrapper[4854]: I1007 12:26:11.754233 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3e06c8ed-2c8c-4c09-a352-31432fbd7d40-cni-sysctl-allowlist\") pod \"3e06c8ed-2c8c-4c09-a352-31432fbd7d40\" (UID: \"3e06c8ed-2c8c-4c09-a352-31432fbd7d40\") " Oct 07 12:26:11 crc kubenswrapper[4854]: I1007 12:26:11.754600 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e06c8ed-2c8c-4c09-a352-31432fbd7d40-ready" (OuterVolumeSpecName: "ready") pod "3e06c8ed-2c8c-4c09-a352-31432fbd7d40" (UID: "3e06c8ed-2c8c-4c09-a352-31432fbd7d40"). InnerVolumeSpecName "ready". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:26:11 crc kubenswrapper[4854]: I1007 12:26:11.754922 4854 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3e06c8ed-2c8c-4c09-a352-31432fbd7d40-tuning-conf-dir\") on node \"crc\" DevicePath \"\"" Oct 07 12:26:11 crc kubenswrapper[4854]: I1007 12:26:11.754944 4854 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/3e06c8ed-2c8c-4c09-a352-31432fbd7d40-ready\") on node \"crc\" DevicePath \"\"" Oct 07 12:26:11 crc kubenswrapper[4854]: I1007 12:26:11.755205 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e06c8ed-2c8c-4c09-a352-31432fbd7d40-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "3e06c8ed-2c8c-4c09-a352-31432fbd7d40" (UID: "3e06c8ed-2c8c-4c09-a352-31432fbd7d40"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:26:11 crc kubenswrapper[4854]: I1007 12:26:11.765020 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e06c8ed-2c8c-4c09-a352-31432fbd7d40-kube-api-access-4mtd4" (OuterVolumeSpecName: "kube-api-access-4mtd4") pod "3e06c8ed-2c8c-4c09-a352-31432fbd7d40" (UID: "3e06c8ed-2c8c-4c09-a352-31432fbd7d40"). InnerVolumeSpecName "kube-api-access-4mtd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:26:11 crc kubenswrapper[4854]: I1007 12:26:11.855601 4854 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3e06c8ed-2c8c-4c09-a352-31432fbd7d40-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Oct 07 12:26:11 crc kubenswrapper[4854]: I1007 12:26:11.855642 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4mtd4\" (UniqueName: \"kubernetes.io/projected/3e06c8ed-2c8c-4c09-a352-31432fbd7d40-kube-api-access-4mtd4\") on node \"crc\" DevicePath \"\"" Oct 07 12:26:12 crc kubenswrapper[4854]: I1007 12:26:12.280463 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-vwn4j_3e06c8ed-2c8c-4c09-a352-31432fbd7d40/kube-multus-additional-cni-plugins/0.log" Oct 07 12:26:12 crc kubenswrapper[4854]: I1007 12:26:12.280565 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-vwn4j" event={"ID":"3e06c8ed-2c8c-4c09-a352-31432fbd7d40","Type":"ContainerDied","Data":"2f67d006864cc4fe132246b02b59392011d20bfb3335b1bd06418da479e56e96"} Oct 07 12:26:12 crc kubenswrapper[4854]: I1007 12:26:12.280647 4854 scope.go:117] "RemoveContainer" containerID="f3648a5c341ce89898f7d66f52949e8fd83866185edfb80a8936501d40f83cee" Oct 07 12:26:12 crc kubenswrapper[4854]: I1007 12:26:12.280810 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-vwn4j" Oct 07 12:26:12 crc kubenswrapper[4854]: E1007 12:26:12.326365 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-7vxvx" podUID="bc3ab00d-65d3-4281-8fd5-65dda5839c8d" Oct 07 12:26:12 crc kubenswrapper[4854]: E1007 12:26:12.326390 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-66kwk" podUID="f2519a25-7584-44fa-8751-eb0160c6eb14" Oct 07 12:26:12 crc kubenswrapper[4854]: I1007 12:26:12.347639 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-vwn4j"] Oct 07 12:26:12 crc kubenswrapper[4854]: I1007 12:26:12.349913 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-vwn4j"] Oct 07 12:26:12 crc kubenswrapper[4854]: E1007 12:26:12.393416 4854 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 07 12:26:12 crc kubenswrapper[4854]: E1007 12:26:12.393590 4854 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cwxd5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-zh2kv_openshift-marketplace(d7c51857-fb64-4aeb-9091-b50d3268b4ec): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 07 12:26:12 crc kubenswrapper[4854]: E1007 12:26:12.394796 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-zh2kv" podUID="d7c51857-fb64-4aeb-9091-b50d3268b4ec" Oct 07 12:26:12 crc kubenswrapper[4854]: E1007 12:26:12.408715 4854 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 07 12:26:12 crc kubenswrapper[4854]: E1007 12:26:12.408895 4854 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-j5w2t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-wn5m7_openshift-marketplace(573bfe9e-2b5e-47dc-9d08-9bccdca9a7b8): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 07 12:26:12 crc kubenswrapper[4854]: E1007 12:26:12.410107 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-wn5m7" podUID="573bfe9e-2b5e-47dc-9d08-9bccdca9a7b8" Oct 07 12:26:12 crc kubenswrapper[4854]: I1007 12:26:12.716217 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e06c8ed-2c8c-4c09-a352-31432fbd7d40" path="/var/lib/kubelet/pods/3e06c8ed-2c8c-4c09-a352-31432fbd7d40/volumes" Oct 07 12:26:13 crc kubenswrapper[4854]: E1007 12:26:13.292846 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-zh2kv" podUID="d7c51857-fb64-4aeb-9091-b50d3268b4ec" Oct 07 12:26:13 crc kubenswrapper[4854]: E1007 12:26:13.293170 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-wn5m7" podUID="573bfe9e-2b5e-47dc-9d08-9bccdca9a7b8" Oct 07 12:26:20 crc kubenswrapper[4854]: I1007 12:26:20.346411 4854 generic.go:334] "Generic (PLEG): container finished" podID="12ccfbc2-fd11-4014-80a1-ab815ea4521d" containerID="fc797e6a0898bdaabb9575d6f3fd1145efb6bfe4ca4ef5b15c741e89045b45b3" exitCode=0 Oct 07 12:26:20 crc kubenswrapper[4854]: I1007 12:26:20.346529 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rcw26" 
event={"ID":"12ccfbc2-fd11-4014-80a1-ab815ea4521d","Type":"ContainerDied","Data":"fc797e6a0898bdaabb9575d6f3fd1145efb6bfe4ca4ef5b15c741e89045b45b3"} Oct 07 12:26:21 crc kubenswrapper[4854]: I1007 12:26:21.359296 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rcw26" event={"ID":"12ccfbc2-fd11-4014-80a1-ab815ea4521d","Type":"ContainerStarted","Data":"8df528dbf160be0df56081b283a2a9f89d9dabed9022964e955dc539da70044e"} Oct 07 12:26:21 crc kubenswrapper[4854]: I1007 12:26:21.362236 4854 generic.go:334] "Generic (PLEG): container finished" podID="f1de49b7-71c6-450c-ab89-0c37fb18b32a" containerID="c630e58e6d22169a2e0a3cf2846de0e3376f92c7acf2122ea3a11efff281b537" exitCode=0 Oct 07 12:26:21 crc kubenswrapper[4854]: I1007 12:26:21.362302 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gnz24" event={"ID":"f1de49b7-71c6-450c-ab89-0c37fb18b32a","Type":"ContainerDied","Data":"c630e58e6d22169a2e0a3cf2846de0e3376f92c7acf2122ea3a11efff281b537"} Oct 07 12:26:21 crc kubenswrapper[4854]: I1007 12:26:21.395879 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rcw26" podStartSLOduration=3.48704337 podStartE2EDuration="51.395855331s" podCreationTimestamp="2025-10-07 12:25:30 +0000 UTC" firstStartedPulling="2025-10-07 12:25:32.873093794 +0000 UTC m=+48.860926049" lastFinishedPulling="2025-10-07 12:26:20.781905715 +0000 UTC m=+96.769738010" observedRunningTime="2025-10-07 12:26:21.385564555 +0000 UTC m=+97.373396870" watchObservedRunningTime="2025-10-07 12:26:21.395855331 +0000 UTC m=+97.383687596" Oct 07 12:26:22 crc kubenswrapper[4854]: I1007 12:26:22.372508 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gnz24" event={"ID":"f1de49b7-71c6-450c-ab89-0c37fb18b32a","Type":"ContainerStarted","Data":"c07ac8dabbf7fe77a854d7d589e8a659fbacb855f2896a58bfdf65cc02402a32"} Oct 07 12:26:22 crc kubenswrapper[4854]: I1007 12:26:22.403586 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gnz24" podStartSLOduration=3.088973702 podStartE2EDuration="55.403563955s" podCreationTimestamp="2025-10-07 12:25:27 +0000 UTC" firstStartedPulling="2025-10-07 12:25:29.522026921 +0000 UTC m=+45.509859176" lastFinishedPulling="2025-10-07 12:26:21.836617164 +0000 UTC m=+97.824449429" observedRunningTime="2025-10-07 12:26:22.401458475 +0000 UTC m=+98.389290730" watchObservedRunningTime="2025-10-07 12:26:22.403563955 +0000 UTC m=+98.391396210" Oct 07 12:26:24 crc kubenswrapper[4854]: I1007 12:26:24.388186 4854 generic.go:334] "Generic (PLEG): container finished" podID="e31cf8e1-9bcc-4bf0-9189-950e81595d38" containerID="ff1af110163ccec3e98465b0e7935881e3fa0b1fde56ce13b5d7f87e73f42c1a" exitCode=0 Oct 07 12:26:24 crc kubenswrapper[4854]: I1007 12:26:24.388307 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zst5f" event={"ID":"e31cf8e1-9bcc-4bf0-9189-950e81595d38","Type":"ContainerDied","Data":"ff1af110163ccec3e98465b0e7935881e3fa0b1fde56ce13b5d7f87e73f42c1a"} Oct 07 12:26:25 crc kubenswrapper[4854]: I1007 12:26:25.400142 4854 generic.go:334] "Generic (PLEG): container finished" podID="bc3ab00d-65d3-4281-8fd5-65dda5839c8d" containerID="6986d79fdfe18c3b08f851401bba07c0ae302c98d4964e8f48fde5e83c98a118" exitCode=0 Oct 07 12:26:25 crc kubenswrapper[4854]: I1007 12:26:25.400222 4854 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/certified-operators-7vxvx" event={"ID":"bc3ab00d-65d3-4281-8fd5-65dda5839c8d","Type":"ContainerDied","Data":"6986d79fdfe18c3b08f851401bba07c0ae302c98d4964e8f48fde5e83c98a118"} Oct 07 12:26:25 crc kubenswrapper[4854]: I1007 12:26:25.403395 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zst5f" event={"ID":"e31cf8e1-9bcc-4bf0-9189-950e81595d38","Type":"ContainerStarted","Data":"53deea10f18e0e76fdc0a3d62084e00d88c92a3336e2a713042001c764541007"} Oct 07 12:26:25 crc kubenswrapper[4854]: I1007 12:26:25.448636 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zst5f" podStartSLOduration=2.371409112 podStartE2EDuration="55.448604889s" podCreationTimestamp="2025-10-07 12:25:30 +0000 UTC" firstStartedPulling="2025-10-07 12:25:31.866828529 +0000 UTC m=+47.854660784" lastFinishedPulling="2025-10-07 12:26:24.944024296 +0000 UTC m=+100.931856561" observedRunningTime="2025-10-07 12:26:25.445869 +0000 UTC m=+101.433701265" watchObservedRunningTime="2025-10-07 12:26:25.448604889 +0000 UTC m=+101.436437184" Oct 07 12:26:25 crc kubenswrapper[4854]: E1007 12:26:25.933693 4854 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod573bfe9e_2b5e_47dc_9d08_9bccdca9a7b8.slice/crio-conmon-def927a5514997fdbc27dc01f48d1b411872978049138270a5748babe4f131a4.scope\": RecentStats: unable to find data in memory cache]" Oct 07 12:26:26 crc kubenswrapper[4854]: I1007 12:26:26.414837 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7vxvx" event={"ID":"bc3ab00d-65d3-4281-8fd5-65dda5839c8d","Type":"ContainerStarted","Data":"bd1cb0747c08a3270721022677bfc265a508b0a5578aee852ffdb4105ad2d112"} Oct 07 12:26:26 crc kubenswrapper[4854]: I1007 12:26:26.420446 4854 generic.go:334] "Generic (PLEG): container finished" podID="573bfe9e-2b5e-47dc-9d08-9bccdca9a7b8" containerID="def927a5514997fdbc27dc01f48d1b411872978049138270a5748babe4f131a4" exitCode=0 Oct 07 12:26:26 crc kubenswrapper[4854]: I1007 12:26:26.420528 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wn5m7" event={"ID":"573bfe9e-2b5e-47dc-9d08-9bccdca9a7b8","Type":"ContainerDied","Data":"def927a5514997fdbc27dc01f48d1b411872978049138270a5748babe4f131a4"} Oct 07 12:26:26 crc kubenswrapper[4854]: I1007 12:26:26.437613 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7vxvx" podStartSLOduration=4.17922166 podStartE2EDuration="59.437583842s" podCreationTimestamp="2025-10-07 12:25:27 +0000 UTC" firstStartedPulling="2025-10-07 12:25:30.601867688 +0000 UTC m=+46.589699943" lastFinishedPulling="2025-10-07 12:26:25.86022987 +0000 UTC m=+101.848062125" observedRunningTime="2025-10-07 12:26:26.436507181 +0000 UTC m=+102.424339446" watchObservedRunningTime="2025-10-07 12:26:26.437583842 +0000 UTC m=+102.425416107" Oct 07 12:26:27 crc kubenswrapper[4854]: I1007 12:26:27.430133 4854 generic.go:334] "Generic (PLEG): container finished" podID="87c2c08e-dd9a-47b9-8e82-5419fdb6cda8" containerID="09d9a54021829f0632b9eaaaa56ef1adecdd184afc0fe3cd415e5242727ac350" exitCode=0 Oct 07 12:26:27 crc kubenswrapper[4854]: I1007 12:26:27.430250 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7s59d" 
event={"ID":"87c2c08e-dd9a-47b9-8e82-5419fdb6cda8","Type":"ContainerDied","Data":"09d9a54021829f0632b9eaaaa56ef1adecdd184afc0fe3cd415e5242727ac350"} Oct 07 12:26:27 crc kubenswrapper[4854]: I1007 12:26:27.439054 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wn5m7" event={"ID":"573bfe9e-2b5e-47dc-9d08-9bccdca9a7b8","Type":"ContainerStarted","Data":"e935299c2c78782e238a464493dbeb4ee7c683b72dea3552fea0953ccb46b0b0"} Oct 07 12:26:27 crc kubenswrapper[4854]: I1007 12:26:27.493719 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wn5m7" podStartSLOduration=3.130913692 podStartE2EDuration="58.493694703s" podCreationTimestamp="2025-10-07 12:25:29 +0000 UTC" firstStartedPulling="2025-10-07 12:25:31.70822257 +0000 UTC m=+47.696054825" lastFinishedPulling="2025-10-07 12:26:27.071003581 +0000 UTC m=+103.058835836" observedRunningTime="2025-10-07 12:26:27.491784787 +0000 UTC m=+103.479617042" watchObservedRunningTime="2025-10-07 12:26:27.493694703 +0000 UTC m=+103.481526958" Oct 07 12:26:27 crc kubenswrapper[4854]: I1007 12:26:27.905418 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gnz24" Oct 07 12:26:27 crc kubenswrapper[4854]: I1007 12:26:27.905889 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gnz24" Oct 07 12:26:28 crc kubenswrapper[4854]: I1007 12:26:28.123459 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gnz24" Oct 07 12:26:28 crc kubenswrapper[4854]: I1007 12:26:28.209920 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7vxvx" Oct 07 12:26:28 crc kubenswrapper[4854]: I1007 12:26:28.209997 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7vxvx" Oct 07 12:26:28 crc kubenswrapper[4854]: I1007 12:26:28.275019 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7vxvx" Oct 07 12:26:28 crc kubenswrapper[4854]: I1007 12:26:28.447113 4854 generic.go:334] "Generic (PLEG): container finished" podID="f2519a25-7584-44fa-8751-eb0160c6eb14" containerID="01a4009660ae2ed3d46606e3f15ba2827eed6bd0ea74e553cfefcf7a5ce0776a" exitCode=0 Oct 07 12:26:28 crc kubenswrapper[4854]: I1007 12:26:28.447206 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-66kwk" event={"ID":"f2519a25-7584-44fa-8751-eb0160c6eb14","Type":"ContainerDied","Data":"01a4009660ae2ed3d46606e3f15ba2827eed6bd0ea74e553cfefcf7a5ce0776a"} Oct 07 12:26:28 crc kubenswrapper[4854]: I1007 12:26:28.452273 4854 generic.go:334] "Generic (PLEG): container finished" podID="d7c51857-fb64-4aeb-9091-b50d3268b4ec" containerID="ced4ce77f9388502a9219c77bd21e1de0ace7ff06396bf631d6414185ce18a60" exitCode=0 Oct 07 12:26:28 crc kubenswrapper[4854]: I1007 12:26:28.453102 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zh2kv" event={"ID":"d7c51857-fb64-4aeb-9091-b50d3268b4ec","Type":"ContainerDied","Data":"ced4ce77f9388502a9219c77bd21e1de0ace7ff06396bf631d6414185ce18a60"} Oct 07 12:26:28 crc kubenswrapper[4854]: I1007 12:26:28.460752 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-7s59d" event={"ID":"87c2c08e-dd9a-47b9-8e82-5419fdb6cda8","Type":"ContainerStarted","Data":"3a3a0ebfb1d077bddbca365dd95b49deca87b1158bfc34ef7de03f7c3c27a7e2"} Oct 07 12:26:28 crc kubenswrapper[4854]: I1007 12:26:28.503100 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gnz24" Oct 07 12:26:28 crc kubenswrapper[4854]: I1007 12:26:28.509635 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7s59d" podStartSLOduration=2.9651800919999998 podStartE2EDuration="1m1.509612702s" podCreationTimestamp="2025-10-07 12:25:27 +0000 UTC" firstStartedPulling="2025-10-07 12:25:29.525286906 +0000 UTC m=+45.513119161" lastFinishedPulling="2025-10-07 12:26:28.069719516 +0000 UTC m=+104.057551771" observedRunningTime="2025-10-07 12:26:28.506878624 +0000 UTC m=+104.494710909" watchObservedRunningTime="2025-10-07 12:26:28.509612702 +0000 UTC m=+104.497444957" Oct 07 12:26:29 crc kubenswrapper[4854]: I1007 12:26:29.200322 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-87pqd"] Oct 07 12:26:29 crc kubenswrapper[4854]: I1007 12:26:29.468573 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-66kwk" event={"ID":"f2519a25-7584-44fa-8751-eb0160c6eb14","Type":"ContainerStarted","Data":"d72d46497894a1b682d8cca4abc2011cd54015111e4c8118183bb91d055c383d"} Oct 07 12:26:29 crc kubenswrapper[4854]: I1007 12:26:29.471293 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zh2kv" event={"ID":"d7c51857-fb64-4aeb-9091-b50d3268b4ec","Type":"ContainerStarted","Data":"8d7ea341ad948ef3cacb9060a71ed776df3a20ef93189f5aad45d2a44059de9d"} Oct 07 12:26:29 crc kubenswrapper[4854]: I1007 12:26:29.497257 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-66kwk" podStartSLOduration=2.882292932 podStartE2EDuration="1m2.497229267s" podCreationTimestamp="2025-10-07 12:25:27 +0000 UTC" firstStartedPulling="2025-10-07 12:25:29.52198709 +0000 UTC m=+45.509819345" lastFinishedPulling="2025-10-07 12:26:29.136923425 +0000 UTC m=+105.124755680" observedRunningTime="2025-10-07 12:26:29.4935319 +0000 UTC m=+105.481364165" watchObservedRunningTime="2025-10-07 12:26:29.497229267 +0000 UTC m=+105.485061522" Oct 07 12:26:29 crc kubenswrapper[4854]: I1007 12:26:29.531448 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zh2kv" podStartSLOduration=3.192203737 podStartE2EDuration="1m0.531413223s" podCreationTimestamp="2025-10-07 12:25:29 +0000 UTC" firstStartedPulling="2025-10-07 12:25:31.754028479 +0000 UTC m=+47.741860734" lastFinishedPulling="2025-10-07 12:26:29.093237975 +0000 UTC m=+105.081070220" observedRunningTime="2025-10-07 12:26:29.522259719 +0000 UTC m=+105.510091974" watchObservedRunningTime="2025-10-07 12:26:29.531413223 +0000 UTC m=+105.519245468" Oct 07 12:26:29 crc kubenswrapper[4854]: I1007 12:26:29.944404 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wn5m7" Oct 07 12:26:29 crc kubenswrapper[4854]: I1007 12:26:29.944786 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wn5m7" Oct 07 12:26:29 crc kubenswrapper[4854]: I1007 
12:26:29.998895 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wn5m7" Oct 07 12:26:30 crc kubenswrapper[4854]: I1007 12:26:30.335275 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zh2kv" Oct 07 12:26:30 crc kubenswrapper[4854]: I1007 12:26:30.335327 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zh2kv" Oct 07 12:26:30 crc kubenswrapper[4854]: I1007 12:26:30.386100 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zh2kv" Oct 07 12:26:30 crc kubenswrapper[4854]: I1007 12:26:30.952663 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zst5f" Oct 07 12:26:30 crc kubenswrapper[4854]: I1007 12:26:30.952724 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zst5f" Oct 07 12:26:30 crc kubenswrapper[4854]: I1007 12:26:30.999234 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zst5f" Oct 07 12:26:31 crc kubenswrapper[4854]: I1007 12:26:31.304815 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rcw26" Oct 07 12:26:31 crc kubenswrapper[4854]: I1007 12:26:31.304875 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rcw26" Oct 07 12:26:31 crc kubenswrapper[4854]: I1007 12:26:31.356247 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rcw26" Oct 07 12:26:31 crc kubenswrapper[4854]: I1007 12:26:31.630005 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rcw26" Oct 07 12:26:31 crc kubenswrapper[4854]: I1007 12:26:31.652231 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zst5f" Oct 07 12:26:34 crc kubenswrapper[4854]: I1007 12:26:34.947935 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rcw26"] Oct 07 12:26:34 crc kubenswrapper[4854]: I1007 12:26:34.949427 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rcw26" podUID="12ccfbc2-fd11-4014-80a1-ab815ea4521d" containerName="registry-server" containerID="cri-o://8df528dbf160be0df56081b283a2a9f89d9dabed9022964e955dc539da70044e" gracePeriod=2 Oct 07 12:26:36 crc kubenswrapper[4854]: I1007 12:26:36.519607 4854 generic.go:334] "Generic (PLEG): container finished" podID="12ccfbc2-fd11-4014-80a1-ab815ea4521d" containerID="8df528dbf160be0df56081b283a2a9f89d9dabed9022964e955dc539da70044e" exitCode=0 Oct 07 12:26:36 crc kubenswrapper[4854]: I1007 12:26:36.519689 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rcw26" event={"ID":"12ccfbc2-fd11-4014-80a1-ab815ea4521d","Type":"ContainerDied","Data":"8df528dbf160be0df56081b283a2a9f89d9dabed9022964e955dc539da70044e"} Oct 07 12:26:37 crc kubenswrapper[4854]: I1007 12:26:37.279530 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rcw26" Oct 07 12:26:37 crc kubenswrapper[4854]: I1007 12:26:37.400395 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12ccfbc2-fd11-4014-80a1-ab815ea4521d-utilities\") pod \"12ccfbc2-fd11-4014-80a1-ab815ea4521d\" (UID: \"12ccfbc2-fd11-4014-80a1-ab815ea4521d\") " Oct 07 12:26:37 crc kubenswrapper[4854]: I1007 12:26:37.400563 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n49qp\" (UniqueName: \"kubernetes.io/projected/12ccfbc2-fd11-4014-80a1-ab815ea4521d-kube-api-access-n49qp\") pod \"12ccfbc2-fd11-4014-80a1-ab815ea4521d\" (UID: \"12ccfbc2-fd11-4014-80a1-ab815ea4521d\") " Oct 07 12:26:37 crc kubenswrapper[4854]: I1007 12:26:37.400592 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12ccfbc2-fd11-4014-80a1-ab815ea4521d-catalog-content\") pod \"12ccfbc2-fd11-4014-80a1-ab815ea4521d\" (UID: \"12ccfbc2-fd11-4014-80a1-ab815ea4521d\") " Oct 07 12:26:37 crc kubenswrapper[4854]: I1007 12:26:37.401527 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12ccfbc2-fd11-4014-80a1-ab815ea4521d-utilities" (OuterVolumeSpecName: "utilities") pod "12ccfbc2-fd11-4014-80a1-ab815ea4521d" (UID: "12ccfbc2-fd11-4014-80a1-ab815ea4521d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:26:37 crc kubenswrapper[4854]: I1007 12:26:37.410317 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12ccfbc2-fd11-4014-80a1-ab815ea4521d-kube-api-access-n49qp" (OuterVolumeSpecName: "kube-api-access-n49qp") pod "12ccfbc2-fd11-4014-80a1-ab815ea4521d" (UID: "12ccfbc2-fd11-4014-80a1-ab815ea4521d"). InnerVolumeSpecName "kube-api-access-n49qp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:26:37 crc kubenswrapper[4854]: I1007 12:26:37.496179 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12ccfbc2-fd11-4014-80a1-ab815ea4521d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "12ccfbc2-fd11-4014-80a1-ab815ea4521d" (UID: "12ccfbc2-fd11-4014-80a1-ab815ea4521d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:26:37 crc kubenswrapper[4854]: I1007 12:26:37.502348 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12ccfbc2-fd11-4014-80a1-ab815ea4521d-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 12:26:37 crc kubenswrapper[4854]: I1007 12:26:37.502400 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n49qp\" (UniqueName: \"kubernetes.io/projected/12ccfbc2-fd11-4014-80a1-ab815ea4521d-kube-api-access-n49qp\") on node \"crc\" DevicePath \"\"" Oct 07 12:26:37 crc kubenswrapper[4854]: I1007 12:26:37.502417 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12ccfbc2-fd11-4014-80a1-ab815ea4521d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 12:26:37 crc kubenswrapper[4854]: I1007 12:26:37.530070 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rcw26" event={"ID":"12ccfbc2-fd11-4014-80a1-ab815ea4521d","Type":"ContainerDied","Data":"4c015d34fd8f483a4c85e1ed5be6e4ae452407aec48a9ede3d1fa9348ce37069"} Oct 07 12:26:37 crc kubenswrapper[4854]: I1007 12:26:37.530164 4854 scope.go:117] "RemoveContainer" containerID="8df528dbf160be0df56081b283a2a9f89d9dabed9022964e955dc539da70044e" Oct 07 12:26:37 crc kubenswrapper[4854]: I1007 12:26:37.530181 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rcw26" Oct 07 12:26:37 crc kubenswrapper[4854]: I1007 12:26:37.551029 4854 scope.go:117] "RemoveContainer" containerID="fc797e6a0898bdaabb9575d6f3fd1145efb6bfe4ca4ef5b15c741e89045b45b3" Oct 07 12:26:37 crc kubenswrapper[4854]: I1007 12:26:37.570542 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rcw26"] Oct 07 12:26:37 crc kubenswrapper[4854]: I1007 12:26:37.572941 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rcw26"] Oct 07 12:26:37 crc kubenswrapper[4854]: I1007 12:26:37.589038 4854 scope.go:117] "RemoveContainer" containerID="11a9875e79dccf68c2165292dfdf0be25eee1790704cf52bb7283c3564cdf236" Oct 07 12:26:37 crc kubenswrapper[4854]: I1007 12:26:37.762449 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7s59d" Oct 07 12:26:37 crc kubenswrapper[4854]: I1007 12:26:37.762519 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7s59d" Oct 07 12:26:37 crc kubenswrapper[4854]: I1007 12:26:37.805319 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7s59d" Oct 07 12:26:38 crc kubenswrapper[4854]: I1007 12:26:38.272118 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7vxvx" Oct 07 12:26:38 crc kubenswrapper[4854]: I1007 12:26:38.582833 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7s59d" Oct 07 12:26:38 crc kubenswrapper[4854]: I1007 12:26:38.630932 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-66kwk" Oct 07 12:26:38 crc kubenswrapper[4854]: I1007 12:26:38.631016 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-66kwk" Oct 07 12:26:38 crc kubenswrapper[4854]: I1007 12:26:38.690583 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-66kwk" Oct 07 12:26:38 crc kubenswrapper[4854]: I1007 12:26:38.712453 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12ccfbc2-fd11-4014-80a1-ab815ea4521d" path="/var/lib/kubelet/pods/12ccfbc2-fd11-4014-80a1-ab815ea4521d/volumes" Oct 07 12:26:39 crc kubenswrapper[4854]: I1007 12:26:39.588837 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-66kwk" Oct 07 12:26:39 crc kubenswrapper[4854]: I1007 12:26:39.991455 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wn5m7" Oct 07 12:26:40 crc kubenswrapper[4854]: I1007 12:26:40.148906 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7vxvx"] Oct 07 12:26:40 crc kubenswrapper[4854]: I1007 12:26:40.149177 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7vxvx" podUID="bc3ab00d-65d3-4281-8fd5-65dda5839c8d" containerName="registry-server" containerID="cri-o://bd1cb0747c08a3270721022677bfc265a508b0a5578aee852ffdb4105ad2d112" gracePeriod=2 Oct 07 12:26:40 crc kubenswrapper[4854]: I1007 12:26:40.381822 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zh2kv" Oct 07 12:26:41 crc kubenswrapper[4854]: I1007 12:26:41.566051 4854 generic.go:334] "Generic (PLEG): container finished" podID="bc3ab00d-65d3-4281-8fd5-65dda5839c8d" containerID="bd1cb0747c08a3270721022677bfc265a508b0a5578aee852ffdb4105ad2d112" exitCode=0 Oct 07 12:26:41 crc kubenswrapper[4854]: I1007 12:26:41.566106 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7vxvx" event={"ID":"bc3ab00d-65d3-4281-8fd5-65dda5839c8d","Type":"ContainerDied","Data":"bd1cb0747c08a3270721022677bfc265a508b0a5578aee852ffdb4105ad2d112"} Oct 07 12:26:41 crc kubenswrapper[4854]: I1007 12:26:41.892914 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7vxvx" Oct 07 12:26:41 crc kubenswrapper[4854]: I1007 12:26:41.970705 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc3ab00d-65d3-4281-8fd5-65dda5839c8d-utilities\") pod \"bc3ab00d-65d3-4281-8fd5-65dda5839c8d\" (UID: \"bc3ab00d-65d3-4281-8fd5-65dda5839c8d\") " Oct 07 12:26:41 crc kubenswrapper[4854]: I1007 12:26:41.970817 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc3ab00d-65d3-4281-8fd5-65dda5839c8d-catalog-content\") pod \"bc3ab00d-65d3-4281-8fd5-65dda5839c8d\" (UID: \"bc3ab00d-65d3-4281-8fd5-65dda5839c8d\") " Oct 07 12:26:41 crc kubenswrapper[4854]: I1007 12:26:41.970874 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6z89\" (UniqueName: \"kubernetes.io/projected/bc3ab00d-65d3-4281-8fd5-65dda5839c8d-kube-api-access-k6z89\") pod \"bc3ab00d-65d3-4281-8fd5-65dda5839c8d\" (UID: \"bc3ab00d-65d3-4281-8fd5-65dda5839c8d\") " Oct 07 12:26:41 crc kubenswrapper[4854]: I1007 12:26:41.975920 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc3ab00d-65d3-4281-8fd5-65dda5839c8d-utilities" (OuterVolumeSpecName: "utilities") pod "bc3ab00d-65d3-4281-8fd5-65dda5839c8d" (UID: "bc3ab00d-65d3-4281-8fd5-65dda5839c8d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:26:41 crc kubenswrapper[4854]: I1007 12:26:41.981209 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc3ab00d-65d3-4281-8fd5-65dda5839c8d-kube-api-access-k6z89" (OuterVolumeSpecName: "kube-api-access-k6z89") pod "bc3ab00d-65d3-4281-8fd5-65dda5839c8d" (UID: "bc3ab00d-65d3-4281-8fd5-65dda5839c8d"). InnerVolumeSpecName "kube-api-access-k6z89". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:26:42 crc kubenswrapper[4854]: I1007 12:26:42.015332 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc3ab00d-65d3-4281-8fd5-65dda5839c8d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bc3ab00d-65d3-4281-8fd5-65dda5839c8d" (UID: "bc3ab00d-65d3-4281-8fd5-65dda5839c8d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:26:42 crc kubenswrapper[4854]: I1007 12:26:42.073582 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc3ab00d-65d3-4281-8fd5-65dda5839c8d-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 12:26:42 crc kubenswrapper[4854]: I1007 12:26:42.074019 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc3ab00d-65d3-4281-8fd5-65dda5839c8d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 12:26:42 crc kubenswrapper[4854]: I1007 12:26:42.074223 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6z89\" (UniqueName: \"kubernetes.io/projected/bc3ab00d-65d3-4281-8fd5-65dda5839c8d-kube-api-access-k6z89\") on node \"crc\" DevicePath \"\"" Oct 07 12:26:42 crc kubenswrapper[4854]: I1007 12:26:42.547033 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-66kwk"] Oct 07 12:26:42 crc kubenswrapper[4854]: I1007 12:26:42.547320 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-66kwk" podUID="f2519a25-7584-44fa-8751-eb0160c6eb14" containerName="registry-server" containerID="cri-o://d72d46497894a1b682d8cca4abc2011cd54015111e4c8118183bb91d055c383d" gracePeriod=2 Oct 07 12:26:42 crc kubenswrapper[4854]: I1007 12:26:42.575424 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7vxvx" event={"ID":"bc3ab00d-65d3-4281-8fd5-65dda5839c8d","Type":"ContainerDied","Data":"30c5fdb84f77f6a19f4b58d8b1328da738094ee4761882d1c20f5c9af9ee4b22"} Oct 07 12:26:42 crc kubenswrapper[4854]: I1007 12:26:42.575493 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7vxvx" Oct 07 12:26:42 crc kubenswrapper[4854]: I1007 12:26:42.575498 4854 scope.go:117] "RemoveContainer" containerID="bd1cb0747c08a3270721022677bfc265a508b0a5578aee852ffdb4105ad2d112" Oct 07 12:26:42 crc kubenswrapper[4854]: I1007 12:26:42.596202 4854 scope.go:117] "RemoveContainer" containerID="6986d79fdfe18c3b08f851401bba07c0ae302c98d4964e8f48fde5e83c98a118" Oct 07 12:26:42 crc kubenswrapper[4854]: I1007 12:26:42.658740 4854 scope.go:117] "RemoveContainer" containerID="0c58d9c5b2b83485bb4e76c00aa37d0d526e5ee83b4a5759593efc6de8d6516d" Oct 07 12:26:42 crc kubenswrapper[4854]: I1007 12:26:42.660490 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7vxvx"] Oct 07 12:26:42 crc kubenswrapper[4854]: I1007 12:26:42.663958 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7vxvx"] Oct 07 12:26:42 crc kubenswrapper[4854]: I1007 12:26:42.714634 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc3ab00d-65d3-4281-8fd5-65dda5839c8d" path="/var/lib/kubelet/pods/bc3ab00d-65d3-4281-8fd5-65dda5839c8d/volumes" Oct 07 12:26:42 crc kubenswrapper[4854]: I1007 12:26:42.745900 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zh2kv"] Oct 07 12:26:42 crc kubenswrapper[4854]: I1007 12:26:42.746552 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zh2kv" podUID="d7c51857-fb64-4aeb-9091-b50d3268b4ec" containerName="registry-server" containerID="cri-o://8d7ea341ad948ef3cacb9060a71ed776df3a20ef93189f5aad45d2a44059de9d" gracePeriod=2 Oct 07 12:26:42 crc kubenswrapper[4854]: I1007 12:26:42.947405 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-66kwk" Oct 07 12:26:42 crc kubenswrapper[4854]: I1007 12:26:42.988552 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2519a25-7584-44fa-8751-eb0160c6eb14-utilities\") pod \"f2519a25-7584-44fa-8751-eb0160c6eb14\" (UID: \"f2519a25-7584-44fa-8751-eb0160c6eb14\") " Oct 07 12:26:42 crc kubenswrapper[4854]: I1007 12:26:42.988673 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2519a25-7584-44fa-8751-eb0160c6eb14-catalog-content\") pod \"f2519a25-7584-44fa-8751-eb0160c6eb14\" (UID: \"f2519a25-7584-44fa-8751-eb0160c6eb14\") " Oct 07 12:26:42 crc kubenswrapper[4854]: I1007 12:26:42.988803 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdnwf\" (UniqueName: \"kubernetes.io/projected/f2519a25-7584-44fa-8751-eb0160c6eb14-kube-api-access-sdnwf\") pod \"f2519a25-7584-44fa-8751-eb0160c6eb14\" (UID: \"f2519a25-7584-44fa-8751-eb0160c6eb14\") " Oct 07 12:26:42 crc kubenswrapper[4854]: I1007 12:26:42.989804 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2519a25-7584-44fa-8751-eb0160c6eb14-utilities" (OuterVolumeSpecName: "utilities") pod "f2519a25-7584-44fa-8751-eb0160c6eb14" (UID: "f2519a25-7584-44fa-8751-eb0160c6eb14"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:26:42 crc kubenswrapper[4854]: I1007 12:26:42.995481 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2519a25-7584-44fa-8751-eb0160c6eb14-kube-api-access-sdnwf" (OuterVolumeSpecName: "kube-api-access-sdnwf") pod "f2519a25-7584-44fa-8751-eb0160c6eb14" (UID: "f2519a25-7584-44fa-8751-eb0160c6eb14"). InnerVolumeSpecName "kube-api-access-sdnwf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:26:43 crc kubenswrapper[4854]: I1007 12:26:43.064863 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2519a25-7584-44fa-8751-eb0160c6eb14-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f2519a25-7584-44fa-8751-eb0160c6eb14" (UID: "f2519a25-7584-44fa-8751-eb0160c6eb14"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:26:43 crc kubenswrapper[4854]: I1007 12:26:43.090787 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2519a25-7584-44fa-8751-eb0160c6eb14-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 12:26:43 crc kubenswrapper[4854]: I1007 12:26:43.090817 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2519a25-7584-44fa-8751-eb0160c6eb14-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 12:26:43 crc kubenswrapper[4854]: I1007 12:26:43.090829 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdnwf\" (UniqueName: \"kubernetes.io/projected/f2519a25-7584-44fa-8751-eb0160c6eb14-kube-api-access-sdnwf\") on node \"crc\" DevicePath \"\"" Oct 07 12:26:43 crc kubenswrapper[4854]: I1007 12:26:43.155764 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zh2kv" Oct 07 12:26:43 crc kubenswrapper[4854]: I1007 12:26:43.191642 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7c51857-fb64-4aeb-9091-b50d3268b4ec-utilities\") pod \"d7c51857-fb64-4aeb-9091-b50d3268b4ec\" (UID: \"d7c51857-fb64-4aeb-9091-b50d3268b4ec\") " Oct 07 12:26:43 crc kubenswrapper[4854]: I1007 12:26:43.191810 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7c51857-fb64-4aeb-9091-b50d3268b4ec-catalog-content\") pod \"d7c51857-fb64-4aeb-9091-b50d3268b4ec\" (UID: \"d7c51857-fb64-4aeb-9091-b50d3268b4ec\") " Oct 07 12:26:43 crc kubenswrapper[4854]: I1007 12:26:43.191921 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwxd5\" (UniqueName: \"kubernetes.io/projected/d7c51857-fb64-4aeb-9091-b50d3268b4ec-kube-api-access-cwxd5\") pod \"d7c51857-fb64-4aeb-9091-b50d3268b4ec\" (UID: \"d7c51857-fb64-4aeb-9091-b50d3268b4ec\") " Oct 07 12:26:43 crc kubenswrapper[4854]: I1007 12:26:43.193162 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7c51857-fb64-4aeb-9091-b50d3268b4ec-utilities" (OuterVolumeSpecName: "utilities") pod "d7c51857-fb64-4aeb-9091-b50d3268b4ec" (UID: "d7c51857-fb64-4aeb-9091-b50d3268b4ec"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:26:43 crc kubenswrapper[4854]: I1007 12:26:43.195633 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7c51857-fb64-4aeb-9091-b50d3268b4ec-kube-api-access-cwxd5" (OuterVolumeSpecName: "kube-api-access-cwxd5") pod "d7c51857-fb64-4aeb-9091-b50d3268b4ec" (UID: "d7c51857-fb64-4aeb-9091-b50d3268b4ec"). InnerVolumeSpecName "kube-api-access-cwxd5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:26:43 crc kubenswrapper[4854]: I1007 12:26:43.204875 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7c51857-fb64-4aeb-9091-b50d3268b4ec-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d7c51857-fb64-4aeb-9091-b50d3268b4ec" (UID: "d7c51857-fb64-4aeb-9091-b50d3268b4ec"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:26:43 crc kubenswrapper[4854]: I1007 12:26:43.294347 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7c51857-fb64-4aeb-9091-b50d3268b4ec-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 12:26:43 crc kubenswrapper[4854]: I1007 12:26:43.294409 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwxd5\" (UniqueName: \"kubernetes.io/projected/d7c51857-fb64-4aeb-9091-b50d3268b4ec-kube-api-access-cwxd5\") on node \"crc\" DevicePath \"\"" Oct 07 12:26:43 crc kubenswrapper[4854]: I1007 12:26:43.294432 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7c51857-fb64-4aeb-9091-b50d3268b4ec-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 12:26:43 crc kubenswrapper[4854]: I1007 12:26:43.584981 4854 generic.go:334] "Generic (PLEG): container finished" podID="d7c51857-fb64-4aeb-9091-b50d3268b4ec" containerID="8d7ea341ad948ef3cacb9060a71ed776df3a20ef93189f5aad45d2a44059de9d" exitCode=0 Oct 07 12:26:43 crc kubenswrapper[4854]: I1007 12:26:43.585118 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zh2kv" Oct 07 12:26:43 crc kubenswrapper[4854]: I1007 12:26:43.585185 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zh2kv" event={"ID":"d7c51857-fb64-4aeb-9091-b50d3268b4ec","Type":"ContainerDied","Data":"8d7ea341ad948ef3cacb9060a71ed776df3a20ef93189f5aad45d2a44059de9d"} Oct 07 12:26:43 crc kubenswrapper[4854]: I1007 12:26:43.585297 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zh2kv" event={"ID":"d7c51857-fb64-4aeb-9091-b50d3268b4ec","Type":"ContainerDied","Data":"6ee9d784e9d650fb7f3e6a618d515ffa40812240935e1c55845d6273c2ea224a"} Oct 07 12:26:43 crc kubenswrapper[4854]: I1007 12:26:43.585395 4854 scope.go:117] "RemoveContainer" containerID="8d7ea341ad948ef3cacb9060a71ed776df3a20ef93189f5aad45d2a44059de9d" Oct 07 12:26:43 crc kubenswrapper[4854]: I1007 12:26:43.589834 4854 generic.go:334] "Generic (PLEG): container finished" podID="f2519a25-7584-44fa-8751-eb0160c6eb14" containerID="d72d46497894a1b682d8cca4abc2011cd54015111e4c8118183bb91d055c383d" exitCode=0 Oct 07 12:26:43 crc kubenswrapper[4854]: I1007 12:26:43.589915 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-66kwk" event={"ID":"f2519a25-7584-44fa-8751-eb0160c6eb14","Type":"ContainerDied","Data":"d72d46497894a1b682d8cca4abc2011cd54015111e4c8118183bb91d055c383d"} Oct 07 12:26:43 crc kubenswrapper[4854]: I1007 12:26:43.589946 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-66kwk" Oct 07 12:26:43 crc kubenswrapper[4854]: I1007 12:26:43.589976 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-66kwk" event={"ID":"f2519a25-7584-44fa-8751-eb0160c6eb14","Type":"ContainerDied","Data":"007fbc4465578bf538e399b2de1c093d6746065913ddb4c430e133dec67c3f11"} Oct 07 12:26:43 crc kubenswrapper[4854]: I1007 12:26:43.607673 4854 scope.go:117] "RemoveContainer" containerID="ced4ce77f9388502a9219c77bd21e1de0ace7ff06396bf631d6414185ce18a60" Oct 07 12:26:43 crc kubenswrapper[4854]: I1007 12:26:43.621499 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zh2kv"] Oct 07 12:26:43 crc kubenswrapper[4854]: I1007 12:26:43.623655 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zh2kv"] Oct 07 12:26:43 crc kubenswrapper[4854]: I1007 12:26:43.638269 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-66kwk"] Oct 07 12:26:43 crc kubenswrapper[4854]: I1007 12:26:43.641865 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-66kwk"] Oct 07 12:26:43 crc kubenswrapper[4854]: I1007 12:26:43.651172 4854 scope.go:117] "RemoveContainer" containerID="bb8802c8d410a115d053ddb00a9c55b7ecdd50ae1d6e2eef305df5f851f01606" Oct 07 12:26:43 crc kubenswrapper[4854]: I1007 12:26:43.669226 4854 scope.go:117] "RemoveContainer" containerID="8d7ea341ad948ef3cacb9060a71ed776df3a20ef93189f5aad45d2a44059de9d" Oct 07 12:26:43 crc kubenswrapper[4854]: E1007 12:26:43.669814 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d7ea341ad948ef3cacb9060a71ed776df3a20ef93189f5aad45d2a44059de9d\": container with ID starting with 
8d7ea341ad948ef3cacb9060a71ed776df3a20ef93189f5aad45d2a44059de9d not found: ID does not exist" containerID="8d7ea341ad948ef3cacb9060a71ed776df3a20ef93189f5aad45d2a44059de9d" Oct 07 12:26:43 crc kubenswrapper[4854]: I1007 12:26:43.669862 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d7ea341ad948ef3cacb9060a71ed776df3a20ef93189f5aad45d2a44059de9d"} err="failed to get container status \"8d7ea341ad948ef3cacb9060a71ed776df3a20ef93189f5aad45d2a44059de9d\": rpc error: code = NotFound desc = could not find container \"8d7ea341ad948ef3cacb9060a71ed776df3a20ef93189f5aad45d2a44059de9d\": container with ID starting with 8d7ea341ad948ef3cacb9060a71ed776df3a20ef93189f5aad45d2a44059de9d not found: ID does not exist" Oct 07 12:26:43 crc kubenswrapper[4854]: I1007 12:26:43.669930 4854 scope.go:117] "RemoveContainer" containerID="ced4ce77f9388502a9219c77bd21e1de0ace7ff06396bf631d6414185ce18a60" Oct 07 12:26:43 crc kubenswrapper[4854]: E1007 12:26:43.670362 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ced4ce77f9388502a9219c77bd21e1de0ace7ff06396bf631d6414185ce18a60\": container with ID starting with ced4ce77f9388502a9219c77bd21e1de0ace7ff06396bf631d6414185ce18a60 not found: ID does not exist" containerID="ced4ce77f9388502a9219c77bd21e1de0ace7ff06396bf631d6414185ce18a60" Oct 07 12:26:43 crc kubenswrapper[4854]: I1007 12:26:43.670391 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ced4ce77f9388502a9219c77bd21e1de0ace7ff06396bf631d6414185ce18a60"} err="failed to get container status \"ced4ce77f9388502a9219c77bd21e1de0ace7ff06396bf631d6414185ce18a60\": rpc error: code = NotFound desc = could not find container \"ced4ce77f9388502a9219c77bd21e1de0ace7ff06396bf631d6414185ce18a60\": container with ID starting with ced4ce77f9388502a9219c77bd21e1de0ace7ff06396bf631d6414185ce18a60 not found: ID does not exist" Oct 07 12:26:43 crc kubenswrapper[4854]: I1007 12:26:43.670408 4854 scope.go:117] "RemoveContainer" containerID="bb8802c8d410a115d053ddb00a9c55b7ecdd50ae1d6e2eef305df5f851f01606" Oct 07 12:26:43 crc kubenswrapper[4854]: E1007 12:26:43.670743 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb8802c8d410a115d053ddb00a9c55b7ecdd50ae1d6e2eef305df5f851f01606\": container with ID starting with bb8802c8d410a115d053ddb00a9c55b7ecdd50ae1d6e2eef305df5f851f01606 not found: ID does not exist" containerID="bb8802c8d410a115d053ddb00a9c55b7ecdd50ae1d6e2eef305df5f851f01606" Oct 07 12:26:43 crc kubenswrapper[4854]: I1007 12:26:43.670790 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb8802c8d410a115d053ddb00a9c55b7ecdd50ae1d6e2eef305df5f851f01606"} err="failed to get container status \"bb8802c8d410a115d053ddb00a9c55b7ecdd50ae1d6e2eef305df5f851f01606\": rpc error: code = NotFound desc = could not find container \"bb8802c8d410a115d053ddb00a9c55b7ecdd50ae1d6e2eef305df5f851f01606\": container with ID starting with bb8802c8d410a115d053ddb00a9c55b7ecdd50ae1d6e2eef305df5f851f01606 not found: ID does not exist" Oct 07 12:26:43 crc kubenswrapper[4854]: I1007 12:26:43.670818 4854 scope.go:117] "RemoveContainer" containerID="d72d46497894a1b682d8cca4abc2011cd54015111e4c8118183bb91d055c383d" Oct 07 12:26:43 crc kubenswrapper[4854]: I1007 12:26:43.687568 4854 scope.go:117] "RemoveContainer" 
containerID="01a4009660ae2ed3d46606e3f15ba2827eed6bd0ea74e553cfefcf7a5ce0776a" Oct 07 12:26:43 crc kubenswrapper[4854]: I1007 12:26:43.707544 4854 scope.go:117] "RemoveContainer" containerID="a57f6cc64b7307b131531d2dda49e8b1eab2a7d2e0f9634e98c5c20596a98c3a" Oct 07 12:26:43 crc kubenswrapper[4854]: I1007 12:26:43.725135 4854 scope.go:117] "RemoveContainer" containerID="d72d46497894a1b682d8cca4abc2011cd54015111e4c8118183bb91d055c383d" Oct 07 12:26:43 crc kubenswrapper[4854]: E1007 12:26:43.725705 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d72d46497894a1b682d8cca4abc2011cd54015111e4c8118183bb91d055c383d\": container with ID starting with d72d46497894a1b682d8cca4abc2011cd54015111e4c8118183bb91d055c383d not found: ID does not exist" containerID="d72d46497894a1b682d8cca4abc2011cd54015111e4c8118183bb91d055c383d" Oct 07 12:26:43 crc kubenswrapper[4854]: I1007 12:26:43.725752 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d72d46497894a1b682d8cca4abc2011cd54015111e4c8118183bb91d055c383d"} err="failed to get container status \"d72d46497894a1b682d8cca4abc2011cd54015111e4c8118183bb91d055c383d\": rpc error: code = NotFound desc = could not find container \"d72d46497894a1b682d8cca4abc2011cd54015111e4c8118183bb91d055c383d\": container with ID starting with d72d46497894a1b682d8cca4abc2011cd54015111e4c8118183bb91d055c383d not found: ID does not exist" Oct 07 12:26:43 crc kubenswrapper[4854]: I1007 12:26:43.725788 4854 scope.go:117] "RemoveContainer" containerID="01a4009660ae2ed3d46606e3f15ba2827eed6bd0ea74e553cfefcf7a5ce0776a" Oct 07 12:26:43 crc kubenswrapper[4854]: E1007 12:26:43.726109 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01a4009660ae2ed3d46606e3f15ba2827eed6bd0ea74e553cfefcf7a5ce0776a\": container with ID starting with 01a4009660ae2ed3d46606e3f15ba2827eed6bd0ea74e553cfefcf7a5ce0776a not found: ID does not exist" containerID="01a4009660ae2ed3d46606e3f15ba2827eed6bd0ea74e553cfefcf7a5ce0776a" Oct 07 12:26:43 crc kubenswrapper[4854]: I1007 12:26:43.726139 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01a4009660ae2ed3d46606e3f15ba2827eed6bd0ea74e553cfefcf7a5ce0776a"} err="failed to get container status \"01a4009660ae2ed3d46606e3f15ba2827eed6bd0ea74e553cfefcf7a5ce0776a\": rpc error: code = NotFound desc = could not find container \"01a4009660ae2ed3d46606e3f15ba2827eed6bd0ea74e553cfefcf7a5ce0776a\": container with ID starting with 01a4009660ae2ed3d46606e3f15ba2827eed6bd0ea74e553cfefcf7a5ce0776a not found: ID does not exist" Oct 07 12:26:43 crc kubenswrapper[4854]: I1007 12:26:43.726173 4854 scope.go:117] "RemoveContainer" containerID="a57f6cc64b7307b131531d2dda49e8b1eab2a7d2e0f9634e98c5c20596a98c3a" Oct 07 12:26:43 crc kubenswrapper[4854]: E1007 12:26:43.726466 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a57f6cc64b7307b131531d2dda49e8b1eab2a7d2e0f9634e98c5c20596a98c3a\": container with ID starting with a57f6cc64b7307b131531d2dda49e8b1eab2a7d2e0f9634e98c5c20596a98c3a not found: ID does not exist" containerID="a57f6cc64b7307b131531d2dda49e8b1eab2a7d2e0f9634e98c5c20596a98c3a" Oct 07 12:26:43 crc kubenswrapper[4854]: I1007 12:26:43.726523 4854 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a57f6cc64b7307b131531d2dda49e8b1eab2a7d2e0f9634e98c5c20596a98c3a"} err="failed to get container status \"a57f6cc64b7307b131531d2dda49e8b1eab2a7d2e0f9634e98c5c20596a98c3a\": rpc error: code = NotFound desc = could not find container \"a57f6cc64b7307b131531d2dda49e8b1eab2a7d2e0f9634e98c5c20596a98c3a\": container with ID starting with a57f6cc64b7307b131531d2dda49e8b1eab2a7d2e0f9634e98c5c20596a98c3a not found: ID does not exist" Oct 07 12:26:44 crc kubenswrapper[4854]: I1007 12:26:44.715633 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7c51857-fb64-4aeb-9091-b50d3268b4ec" path="/var/lib/kubelet/pods/d7c51857-fb64-4aeb-9091-b50d3268b4ec/volumes" Oct 07 12:26:44 crc kubenswrapper[4854]: I1007 12:26:44.717371 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2519a25-7584-44fa-8751-eb0160c6eb14" path="/var/lib/kubelet/pods/f2519a25-7584-44fa-8751-eb0160c6eb14/volumes" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.241858 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-87pqd" podUID="2c45dd65-0c11-4c29-8f62-d667bd61d974" containerName="oauth-openshift" containerID="cri-o://9c29b9fce1b630c9742dd9195d65dcf3c9d269806911501797983305c3954b05" gracePeriod=15 Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.618287 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-87pqd" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.649886 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-7494c98dcc-k8tbg"] Oct 07 12:26:54 crc kubenswrapper[4854]: E1007 12:26:54.650242 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12ccfbc2-fd11-4014-80a1-ab815ea4521d" containerName="extract-content" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.650263 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="12ccfbc2-fd11-4014-80a1-ab815ea4521d" containerName="extract-content" Oct 07 12:26:54 crc kubenswrapper[4854]: E1007 12:26:54.650283 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc3ab00d-65d3-4281-8fd5-65dda5839c8d" containerName="extract-content" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.650294 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc3ab00d-65d3-4281-8fd5-65dda5839c8d" containerName="extract-content" Oct 07 12:26:54 crc kubenswrapper[4854]: E1007 12:26:54.650311 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2519a25-7584-44fa-8751-eb0160c6eb14" containerName="extract-utilities" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.650322 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2519a25-7584-44fa-8751-eb0160c6eb14" containerName="extract-utilities" Oct 07 12:26:54 crc kubenswrapper[4854]: E1007 12:26:54.650362 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7c51857-fb64-4aeb-9091-b50d3268b4ec" containerName="extract-content" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.650372 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7c51857-fb64-4aeb-9091-b50d3268b4ec" containerName="extract-content" Oct 07 12:26:54 crc kubenswrapper[4854]: E1007 12:26:54.650388 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a57c2262-320e-419b-97cb-47bc43675fe7" containerName="pruner" Oct 07 12:26:54 crc 
kubenswrapper[4854]: I1007 12:26:54.650398 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="a57c2262-320e-419b-97cb-47bc43675fe7" containerName="pruner" Oct 07 12:26:54 crc kubenswrapper[4854]: E1007 12:26:54.650414 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c66ec6fc-5daf-4f6f-b057-80f0eb1a7584" containerName="pruner" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.650424 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="c66ec6fc-5daf-4f6f-b057-80f0eb1a7584" containerName="pruner" Oct 07 12:26:54 crc kubenswrapper[4854]: E1007 12:26:54.650441 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2519a25-7584-44fa-8751-eb0160c6eb14" containerName="extract-content" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.650451 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2519a25-7584-44fa-8751-eb0160c6eb14" containerName="extract-content" Oct 07 12:26:54 crc kubenswrapper[4854]: E1007 12:26:54.650467 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12ccfbc2-fd11-4014-80a1-ab815ea4521d" containerName="registry-server" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.650477 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="12ccfbc2-fd11-4014-80a1-ab815ea4521d" containerName="registry-server" Oct 07 12:26:54 crc kubenswrapper[4854]: E1007 12:26:54.650490 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc3ab00d-65d3-4281-8fd5-65dda5839c8d" containerName="registry-server" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.650500 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc3ab00d-65d3-4281-8fd5-65dda5839c8d" containerName="registry-server" Oct 07 12:26:54 crc kubenswrapper[4854]: E1007 12:26:54.650515 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c45dd65-0c11-4c29-8f62-d667bd61d974" containerName="oauth-openshift" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.650524 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c45dd65-0c11-4c29-8f62-d667bd61d974" containerName="oauth-openshift" Oct 07 12:26:54 crc kubenswrapper[4854]: E1007 12:26:54.650538 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e06c8ed-2c8c-4c09-a352-31432fbd7d40" containerName="kube-multus-additional-cni-plugins" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.650548 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e06c8ed-2c8c-4c09-a352-31432fbd7d40" containerName="kube-multus-additional-cni-plugins" Oct 07 12:26:54 crc kubenswrapper[4854]: E1007 12:26:54.650562 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7c51857-fb64-4aeb-9091-b50d3268b4ec" containerName="extract-utilities" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.650572 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7c51857-fb64-4aeb-9091-b50d3268b4ec" containerName="extract-utilities" Oct 07 12:26:54 crc kubenswrapper[4854]: E1007 12:26:54.650589 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc3ab00d-65d3-4281-8fd5-65dda5839c8d" containerName="extract-utilities" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.650598 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc3ab00d-65d3-4281-8fd5-65dda5839c8d" containerName="extract-utilities" Oct 07 12:26:54 crc kubenswrapper[4854]: E1007 12:26:54.650614 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2519a25-7584-44fa-8751-eb0160c6eb14" 
containerName="registry-server" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.650624 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2519a25-7584-44fa-8751-eb0160c6eb14" containerName="registry-server" Oct 07 12:26:54 crc kubenswrapper[4854]: E1007 12:26:54.650640 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12ccfbc2-fd11-4014-80a1-ab815ea4521d" containerName="extract-utilities" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.650651 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="12ccfbc2-fd11-4014-80a1-ab815ea4521d" containerName="extract-utilities" Oct 07 12:26:54 crc kubenswrapper[4854]: E1007 12:26:54.650664 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7c51857-fb64-4aeb-9091-b50d3268b4ec" containerName="registry-server" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.650674 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7c51857-fb64-4aeb-9091-b50d3268b4ec" containerName="registry-server" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.650826 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="12ccfbc2-fd11-4014-80a1-ab815ea4521d" containerName="registry-server" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.650844 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="c66ec6fc-5daf-4f6f-b057-80f0eb1a7584" containerName="pruner" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.650858 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="a57c2262-320e-419b-97cb-47bc43675fe7" containerName="pruner" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.650872 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e06c8ed-2c8c-4c09-a352-31432fbd7d40" containerName="kube-multus-additional-cni-plugins" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.650885 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2519a25-7584-44fa-8751-eb0160c6eb14" containerName="registry-server" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.650905 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc3ab00d-65d3-4281-8fd5-65dda5839c8d" containerName="registry-server" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.650920 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c45dd65-0c11-4c29-8f62-d667bd61d974" containerName="oauth-openshift" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.650934 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7c51857-fb64-4aeb-9091-b50d3268b4ec" containerName="registry-server" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.651594 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-7494c98dcc-k8tbg" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.674672 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7494c98dcc-k8tbg"] Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.675045 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2msk\" (UniqueName: \"kubernetes.io/projected/2c45dd65-0c11-4c29-8f62-d667bd61d974-kube-api-access-d2msk\") pod \"2c45dd65-0c11-4c29-8f62-d667bd61d974\" (UID: \"2c45dd65-0c11-4c29-8f62-d667bd61d974\") " Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.675180 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2c45dd65-0c11-4c29-8f62-d667bd61d974-v4-0-config-user-template-login\") pod \"2c45dd65-0c11-4c29-8f62-d667bd61d974\" (UID: \"2c45dd65-0c11-4c29-8f62-d667bd61d974\") " Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.675219 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2c45dd65-0c11-4c29-8f62-d667bd61d974-v4-0-config-system-cliconfig\") pod \"2c45dd65-0c11-4c29-8f62-d667bd61d974\" (UID: \"2c45dd65-0c11-4c29-8f62-d667bd61d974\") " Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.675275 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2c45dd65-0c11-4c29-8f62-d667bd61d974-v4-0-config-system-router-certs\") pod \"2c45dd65-0c11-4c29-8f62-d667bd61d974\" (UID: \"2c45dd65-0c11-4c29-8f62-d667bd61d974\") " Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.675300 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2c45dd65-0c11-4c29-8f62-d667bd61d974-v4-0-config-user-idp-0-file-data\") pod \"2c45dd65-0c11-4c29-8f62-d667bd61d974\" (UID: \"2c45dd65-0c11-4c29-8f62-d667bd61d974\") " Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.675340 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2c45dd65-0c11-4c29-8f62-d667bd61d974-v4-0-config-system-trusted-ca-bundle\") pod \"2c45dd65-0c11-4c29-8f62-d667bd61d974\" (UID: \"2c45dd65-0c11-4c29-8f62-d667bd61d974\") " Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.675361 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2c45dd65-0c11-4c29-8f62-d667bd61d974-v4-0-config-system-serving-cert\") pod \"2c45dd65-0c11-4c29-8f62-d667bd61d974\" (UID: \"2c45dd65-0c11-4c29-8f62-d667bd61d974\") " Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.675392 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2c45dd65-0c11-4c29-8f62-d667bd61d974-v4-0-config-system-session\") pod \"2c45dd65-0c11-4c29-8f62-d667bd61d974\" (UID: \"2c45dd65-0c11-4c29-8f62-d667bd61d974\") " Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.675587 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/2c45dd65-0c11-4c29-8f62-d667bd61d974-v4-0-config-user-template-provider-selection\") pod \"2c45dd65-0c11-4c29-8f62-d667bd61d974\" (UID: \"2c45dd65-0c11-4c29-8f62-d667bd61d974\") " Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.675645 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2c45dd65-0c11-4c29-8f62-d667bd61d974-audit-dir\") pod \"2c45dd65-0c11-4c29-8f62-d667bd61d974\" (UID: \"2c45dd65-0c11-4c29-8f62-d667bd61d974\") " Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.675694 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2c45dd65-0c11-4c29-8f62-d667bd61d974-audit-policies\") pod \"2c45dd65-0c11-4c29-8f62-d667bd61d974\" (UID: \"2c45dd65-0c11-4c29-8f62-d667bd61d974\") " Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.675745 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2c45dd65-0c11-4c29-8f62-d667bd61d974-v4-0-config-user-template-error\") pod \"2c45dd65-0c11-4c29-8f62-d667bd61d974\" (UID: \"2c45dd65-0c11-4c29-8f62-d667bd61d974\") " Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.675814 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2c45dd65-0c11-4c29-8f62-d667bd61d974-v4-0-config-system-service-ca\") pod \"2c45dd65-0c11-4c29-8f62-d667bd61d974\" (UID: \"2c45dd65-0c11-4c29-8f62-d667bd61d974\") " Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.675855 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2c45dd65-0c11-4c29-8f62-d667bd61d974-v4-0-config-system-ocp-branding-template\") pod \"2c45dd65-0c11-4c29-8f62-d667bd61d974\" (UID: \"2c45dd65-0c11-4c29-8f62-d667bd61d974\") " Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.676092 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6fdc3070-fc0c-4fd1-baf6-b70439fe08dc-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7494c98dcc-k8tbg\" (UID: \"6fdc3070-fc0c-4fd1-baf6-b70439fe08dc\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-k8tbg" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.676130 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6fdc3070-fc0c-4fd1-baf6-b70439fe08dc-audit-dir\") pod \"oauth-openshift-7494c98dcc-k8tbg\" (UID: \"6fdc3070-fc0c-4fd1-baf6-b70439fe08dc\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-k8tbg" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.676177 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6fdc3070-fc0c-4fd1-baf6-b70439fe08dc-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7494c98dcc-k8tbg\" (UID: \"6fdc3070-fc0c-4fd1-baf6-b70439fe08dc\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-k8tbg" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.676210 4854 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6fdc3070-fc0c-4fd1-baf6-b70439fe08dc-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7494c98dcc-k8tbg\" (UID: \"6fdc3070-fc0c-4fd1-baf6-b70439fe08dc\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-k8tbg" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.676375 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6fdc3070-fc0c-4fd1-baf6-b70439fe08dc-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7494c98dcc-k8tbg\" (UID: \"6fdc3070-fc0c-4fd1-baf6-b70439fe08dc\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-k8tbg" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.676411 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6fdc3070-fc0c-4fd1-baf6-b70439fe08dc-v4-0-config-system-session\") pod \"oauth-openshift-7494c98dcc-k8tbg\" (UID: \"6fdc3070-fc0c-4fd1-baf6-b70439fe08dc\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-k8tbg" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.677233 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c45dd65-0c11-4c29-8f62-d667bd61d974-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "2c45dd65-0c11-4c29-8f62-d667bd61d974" (UID: "2c45dd65-0c11-4c29-8f62-d667bd61d974"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.677217 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2c45dd65-0c11-4c29-8f62-d667bd61d974-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "2c45dd65-0c11-4c29-8f62-d667bd61d974" (UID: "2c45dd65-0c11-4c29-8f62-d667bd61d974"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.677294 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c45dd65-0c11-4c29-8f62-d667bd61d974-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "2c45dd65-0c11-4c29-8f62-d667bd61d974" (UID: "2c45dd65-0c11-4c29-8f62-d667bd61d974"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.677711 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c45dd65-0c11-4c29-8f62-d667bd61d974-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "2c45dd65-0c11-4c29-8f62-d667bd61d974" (UID: "2c45dd65-0c11-4c29-8f62-d667bd61d974"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.677763 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x756r\" (UniqueName: \"kubernetes.io/projected/6fdc3070-fc0c-4fd1-baf6-b70439fe08dc-kube-api-access-x756r\") pod \"oauth-openshift-7494c98dcc-k8tbg\" (UID: \"6fdc3070-fc0c-4fd1-baf6-b70439fe08dc\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-k8tbg" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.677865 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6fdc3070-fc0c-4fd1-baf6-b70439fe08dc-v4-0-config-system-service-ca\") pod \"oauth-openshift-7494c98dcc-k8tbg\" (UID: \"6fdc3070-fc0c-4fd1-baf6-b70439fe08dc\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-k8tbg" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.677899 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6fdc3070-fc0c-4fd1-baf6-b70439fe08dc-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7494c98dcc-k8tbg\" (UID: \"6fdc3070-fc0c-4fd1-baf6-b70439fe08dc\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-k8tbg" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.677946 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6fdc3070-fc0c-4fd1-baf6-b70439fe08dc-audit-policies\") pod \"oauth-openshift-7494c98dcc-k8tbg\" (UID: \"6fdc3070-fc0c-4fd1-baf6-b70439fe08dc\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-k8tbg" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.677965 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6fdc3070-fc0c-4fd1-baf6-b70439fe08dc-v4-0-config-user-template-login\") pod \"oauth-openshift-7494c98dcc-k8tbg\" (UID: \"6fdc3070-fc0c-4fd1-baf6-b70439fe08dc\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-k8tbg" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.677986 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6fdc3070-fc0c-4fd1-baf6-b70439fe08dc-v4-0-config-user-template-error\") pod \"oauth-openshift-7494c98dcc-k8tbg\" (UID: \"6fdc3070-fc0c-4fd1-baf6-b70439fe08dc\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-k8tbg" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.677933 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c45dd65-0c11-4c29-8f62-d667bd61d974-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "2c45dd65-0c11-4c29-8f62-d667bd61d974" (UID: "2c45dd65-0c11-4c29-8f62-d667bd61d974"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.678329 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6fdc3070-fc0c-4fd1-baf6-b70439fe08dc-v4-0-config-system-router-certs\") pod \"oauth-openshift-7494c98dcc-k8tbg\" (UID: \"6fdc3070-fc0c-4fd1-baf6-b70439fe08dc\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-k8tbg" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.678378 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6fdc3070-fc0c-4fd1-baf6-b70439fe08dc-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7494c98dcc-k8tbg\" (UID: \"6fdc3070-fc0c-4fd1-baf6-b70439fe08dc\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-k8tbg" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.678598 4854 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2c45dd65-0c11-4c29-8f62-d667bd61d974-audit-dir\") on node \"crc\" DevicePath \"\"" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.678626 4854 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2c45dd65-0c11-4c29-8f62-d667bd61d974-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.678644 4854 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2c45dd65-0c11-4c29-8f62-d667bd61d974-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.678710 4854 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2c45dd65-0c11-4c29-8f62-d667bd61d974-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.678731 4854 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2c45dd65-0c11-4c29-8f62-d667bd61d974-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.682825 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c45dd65-0c11-4c29-8f62-d667bd61d974-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "2c45dd65-0c11-4c29-8f62-d667bd61d974" (UID: "2c45dd65-0c11-4c29-8f62-d667bd61d974"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.683029 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c45dd65-0c11-4c29-8f62-d667bd61d974-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "2c45dd65-0c11-4c29-8f62-d667bd61d974" (UID: "2c45dd65-0c11-4c29-8f62-d667bd61d974"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.683563 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c45dd65-0c11-4c29-8f62-d667bd61d974-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "2c45dd65-0c11-4c29-8f62-d667bd61d974" (UID: "2c45dd65-0c11-4c29-8f62-d667bd61d974"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.684209 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c45dd65-0c11-4c29-8f62-d667bd61d974-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "2c45dd65-0c11-4c29-8f62-d667bd61d974" (UID: "2c45dd65-0c11-4c29-8f62-d667bd61d974"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.685989 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c45dd65-0c11-4c29-8f62-d667bd61d974-kube-api-access-d2msk" (OuterVolumeSpecName: "kube-api-access-d2msk") pod "2c45dd65-0c11-4c29-8f62-d667bd61d974" (UID: "2c45dd65-0c11-4c29-8f62-d667bd61d974"). InnerVolumeSpecName "kube-api-access-d2msk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.686338 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c45dd65-0c11-4c29-8f62-d667bd61d974-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "2c45dd65-0c11-4c29-8f62-d667bd61d974" (UID: "2c45dd65-0c11-4c29-8f62-d667bd61d974"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.688835 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c45dd65-0c11-4c29-8f62-d667bd61d974-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "2c45dd65-0c11-4c29-8f62-d667bd61d974" (UID: "2c45dd65-0c11-4c29-8f62-d667bd61d974"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.697599 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c45dd65-0c11-4c29-8f62-d667bd61d974-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "2c45dd65-0c11-4c29-8f62-d667bd61d974" (UID: "2c45dd65-0c11-4c29-8f62-d667bd61d974"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.698028 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c45dd65-0c11-4c29-8f62-d667bd61d974-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "2c45dd65-0c11-4c29-8f62-d667bd61d974" (UID: "2c45dd65-0c11-4c29-8f62-d667bd61d974"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.705052 4854 generic.go:334] "Generic (PLEG): container finished" podID="2c45dd65-0c11-4c29-8f62-d667bd61d974" containerID="9c29b9fce1b630c9742dd9195d65dcf3c9d269806911501797983305c3954b05" exitCode=0 Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.705099 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-87pqd" event={"ID":"2c45dd65-0c11-4c29-8f62-d667bd61d974","Type":"ContainerDied","Data":"9c29b9fce1b630c9742dd9195d65dcf3c9d269806911501797983305c3954b05"} Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.705129 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-87pqd" event={"ID":"2c45dd65-0c11-4c29-8f62-d667bd61d974","Type":"ContainerDied","Data":"61254e358b2dac7f661aecb8afb88edc26eed1baa32f0e9beb48f5e20a2e4840"} Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.705164 4854 scope.go:117] "RemoveContainer" containerID="9c29b9fce1b630c9742dd9195d65dcf3c9d269806911501797983305c3954b05" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.705275 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-87pqd" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.743757 4854 scope.go:117] "RemoveContainer" containerID="9c29b9fce1b630c9742dd9195d65dcf3c9d269806911501797983305c3954b05" Oct 07 12:26:54 crc kubenswrapper[4854]: E1007 12:26:54.744318 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c29b9fce1b630c9742dd9195d65dcf3c9d269806911501797983305c3954b05\": container with ID starting with 9c29b9fce1b630c9742dd9195d65dcf3c9d269806911501797983305c3954b05 not found: ID does not exist" containerID="9c29b9fce1b630c9742dd9195d65dcf3c9d269806911501797983305c3954b05" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.744357 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c29b9fce1b630c9742dd9195d65dcf3c9d269806911501797983305c3954b05"} err="failed to get container status \"9c29b9fce1b630c9742dd9195d65dcf3c9d269806911501797983305c3954b05\": rpc error: code = NotFound desc = could not find container \"9c29b9fce1b630c9742dd9195d65dcf3c9d269806911501797983305c3954b05\": container with ID starting with 9c29b9fce1b630c9742dd9195d65dcf3c9d269806911501797983305c3954b05 not found: ID does not exist" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.780061 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6fdc3070-fc0c-4fd1-baf6-b70439fe08dc-audit-policies\") pod \"oauth-openshift-7494c98dcc-k8tbg\" (UID: \"6fdc3070-fc0c-4fd1-baf6-b70439fe08dc\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-k8tbg" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.780103 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6fdc3070-fc0c-4fd1-baf6-b70439fe08dc-v4-0-config-user-template-login\") pod \"oauth-openshift-7494c98dcc-k8tbg\" (UID: \"6fdc3070-fc0c-4fd1-baf6-b70439fe08dc\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-k8tbg" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.780122 4854 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6fdc3070-fc0c-4fd1-baf6-b70439fe08dc-v4-0-config-user-template-error\") pod \"oauth-openshift-7494c98dcc-k8tbg\" (UID: \"6fdc3070-fc0c-4fd1-baf6-b70439fe08dc\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-k8tbg" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.780172 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6fdc3070-fc0c-4fd1-baf6-b70439fe08dc-v4-0-config-system-router-certs\") pod \"oauth-openshift-7494c98dcc-k8tbg\" (UID: \"6fdc3070-fc0c-4fd1-baf6-b70439fe08dc\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-k8tbg" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.780875 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6fdc3070-fc0c-4fd1-baf6-b70439fe08dc-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7494c98dcc-k8tbg\" (UID: \"6fdc3070-fc0c-4fd1-baf6-b70439fe08dc\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-k8tbg" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.780915 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6fdc3070-fc0c-4fd1-baf6-b70439fe08dc-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7494c98dcc-k8tbg\" (UID: \"6fdc3070-fc0c-4fd1-baf6-b70439fe08dc\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-k8tbg" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.780939 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6fdc3070-fc0c-4fd1-baf6-b70439fe08dc-audit-dir\") pod \"oauth-openshift-7494c98dcc-k8tbg\" (UID: \"6fdc3070-fc0c-4fd1-baf6-b70439fe08dc\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-k8tbg" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.780961 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6fdc3070-fc0c-4fd1-baf6-b70439fe08dc-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7494c98dcc-k8tbg\" (UID: \"6fdc3070-fc0c-4fd1-baf6-b70439fe08dc\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-k8tbg" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.780992 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6fdc3070-fc0c-4fd1-baf6-b70439fe08dc-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7494c98dcc-k8tbg\" (UID: \"6fdc3070-fc0c-4fd1-baf6-b70439fe08dc\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-k8tbg" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.781011 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6fdc3070-fc0c-4fd1-baf6-b70439fe08dc-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7494c98dcc-k8tbg\" (UID: \"6fdc3070-fc0c-4fd1-baf6-b70439fe08dc\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-k8tbg" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.781023 4854 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6fdc3070-fc0c-4fd1-baf6-b70439fe08dc-audit-policies\") pod \"oauth-openshift-7494c98dcc-k8tbg\" (UID: \"6fdc3070-fc0c-4fd1-baf6-b70439fe08dc\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-k8tbg" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.781054 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6fdc3070-fc0c-4fd1-baf6-b70439fe08dc-v4-0-config-system-session\") pod \"oauth-openshift-7494c98dcc-k8tbg\" (UID: \"6fdc3070-fc0c-4fd1-baf6-b70439fe08dc\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-k8tbg" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.781106 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6fdc3070-fc0c-4fd1-baf6-b70439fe08dc-audit-dir\") pod \"oauth-openshift-7494c98dcc-k8tbg\" (UID: \"6fdc3070-fc0c-4fd1-baf6-b70439fe08dc\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-k8tbg" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.781165 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x756r\" (UniqueName: \"kubernetes.io/projected/6fdc3070-fc0c-4fd1-baf6-b70439fe08dc-kube-api-access-x756r\") pod \"oauth-openshift-7494c98dcc-k8tbg\" (UID: \"6fdc3070-fc0c-4fd1-baf6-b70439fe08dc\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-k8tbg" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.781991 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6fdc3070-fc0c-4fd1-baf6-b70439fe08dc-v4-0-config-system-service-ca\") pod \"oauth-openshift-7494c98dcc-k8tbg\" (UID: \"6fdc3070-fc0c-4fd1-baf6-b70439fe08dc\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-k8tbg" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.782070 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6fdc3070-fc0c-4fd1-baf6-b70439fe08dc-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7494c98dcc-k8tbg\" (UID: \"6fdc3070-fc0c-4fd1-baf6-b70439fe08dc\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-k8tbg" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.782249 4854 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2c45dd65-0c11-4c29-8f62-d667bd61d974-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.782264 4854 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2c45dd65-0c11-4c29-8f62-d667bd61d974-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.782302 4854 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2c45dd65-0c11-4c29-8f62-d667bd61d974-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.782321 4854 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/2c45dd65-0c11-4c29-8f62-d667bd61d974-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.782333 4854 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2c45dd65-0c11-4c29-8f62-d667bd61d974-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.782345 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2msk\" (UniqueName: \"kubernetes.io/projected/2c45dd65-0c11-4c29-8f62-d667bd61d974-kube-api-access-d2msk\") on node \"crc\" DevicePath \"\"" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.782378 4854 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2c45dd65-0c11-4c29-8f62-d667bd61d974-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.782395 4854 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2c45dd65-0c11-4c29-8f62-d667bd61d974-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.782409 4854 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2c45dd65-0c11-4c29-8f62-d667bd61d974-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.782408 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6fdc3070-fc0c-4fd1-baf6-b70439fe08dc-v4-0-config-system-service-ca\") pod \"oauth-openshift-7494c98dcc-k8tbg\" (UID: \"6fdc3070-fc0c-4fd1-baf6-b70439fe08dc\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-k8tbg" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.782716 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6fdc3070-fc0c-4fd1-baf6-b70439fe08dc-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7494c98dcc-k8tbg\" (UID: \"6fdc3070-fc0c-4fd1-baf6-b70439fe08dc\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-k8tbg" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.782956 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6fdc3070-fc0c-4fd1-baf6-b70439fe08dc-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7494c98dcc-k8tbg\" (UID: \"6fdc3070-fc0c-4fd1-baf6-b70439fe08dc\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-k8tbg" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.784055 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6fdc3070-fc0c-4fd1-baf6-b70439fe08dc-v4-0-config-user-template-error\") pod \"oauth-openshift-7494c98dcc-k8tbg\" (UID: \"6fdc3070-fc0c-4fd1-baf6-b70439fe08dc\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-k8tbg" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.785039 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" 
(UniqueName: \"kubernetes.io/secret/6fdc3070-fc0c-4fd1-baf6-b70439fe08dc-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7494c98dcc-k8tbg\" (UID: \"6fdc3070-fc0c-4fd1-baf6-b70439fe08dc\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-k8tbg" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.785260 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6fdc3070-fc0c-4fd1-baf6-b70439fe08dc-v4-0-config-system-session\") pod \"oauth-openshift-7494c98dcc-k8tbg\" (UID: \"6fdc3070-fc0c-4fd1-baf6-b70439fe08dc\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-k8tbg" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.786226 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6fdc3070-fc0c-4fd1-baf6-b70439fe08dc-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7494c98dcc-k8tbg\" (UID: \"6fdc3070-fc0c-4fd1-baf6-b70439fe08dc\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-k8tbg" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.786369 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6fdc3070-fc0c-4fd1-baf6-b70439fe08dc-v4-0-config-user-template-login\") pod \"oauth-openshift-7494c98dcc-k8tbg\" (UID: \"6fdc3070-fc0c-4fd1-baf6-b70439fe08dc\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-k8tbg" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.786647 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6fdc3070-fc0c-4fd1-baf6-b70439fe08dc-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7494c98dcc-k8tbg\" (UID: \"6fdc3070-fc0c-4fd1-baf6-b70439fe08dc\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-k8tbg" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.786849 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6fdc3070-fc0c-4fd1-baf6-b70439fe08dc-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7494c98dcc-k8tbg\" (UID: \"6fdc3070-fc0c-4fd1-baf6-b70439fe08dc\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-k8tbg" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.787909 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6fdc3070-fc0c-4fd1-baf6-b70439fe08dc-v4-0-config-system-router-certs\") pod \"oauth-openshift-7494c98dcc-k8tbg\" (UID: \"6fdc3070-fc0c-4fd1-baf6-b70439fe08dc\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-k8tbg" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.806236 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x756r\" (UniqueName: \"kubernetes.io/projected/6fdc3070-fc0c-4fd1-baf6-b70439fe08dc-kube-api-access-x756r\") pod \"oauth-openshift-7494c98dcc-k8tbg\" (UID: \"6fdc3070-fc0c-4fd1-baf6-b70439fe08dc\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-k8tbg" Oct 07 12:26:54 crc kubenswrapper[4854]: I1007 12:26:54.974299 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-7494c98dcc-k8tbg" Oct 07 12:26:55 crc kubenswrapper[4854]: I1007 12:26:55.227115 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7494c98dcc-k8tbg"] Oct 07 12:26:55 crc kubenswrapper[4854]: W1007 12:26:55.239273 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fdc3070_fc0c_4fd1_baf6_b70439fe08dc.slice/crio-fb608b3ae660ae6662bd1105fdf03e3a261c1ddd50b0b727d416014bf47c2fd3 WatchSource:0}: Error finding container fb608b3ae660ae6662bd1105fdf03e3a261c1ddd50b0b727d416014bf47c2fd3: Status 404 returned error can't find the container with id fb608b3ae660ae6662bd1105fdf03e3a261c1ddd50b0b727d416014bf47c2fd3 Oct 07 12:26:55 crc kubenswrapper[4854]: I1007 12:26:55.715908 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7494c98dcc-k8tbg" event={"ID":"6fdc3070-fc0c-4fd1-baf6-b70439fe08dc","Type":"ContainerStarted","Data":"dcabfbf1bbda460d8e49b62a4912c1195a2ff101b97e14c269ff46fe9ed3e91b"} Oct 07 12:26:55 crc kubenswrapper[4854]: I1007 12:26:55.716000 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7494c98dcc-k8tbg" event={"ID":"6fdc3070-fc0c-4fd1-baf6-b70439fe08dc","Type":"ContainerStarted","Data":"fb608b3ae660ae6662bd1105fdf03e3a261c1ddd50b0b727d416014bf47c2fd3"} Oct 07 12:26:55 crc kubenswrapper[4854]: I1007 12:26:55.716332 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-7494c98dcc-k8tbg" Oct 07 12:26:55 crc kubenswrapper[4854]: I1007 12:26:55.743809 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-7494c98dcc-k8tbg" podStartSLOduration=26.743772425 podStartE2EDuration="26.743772425s" podCreationTimestamp="2025-10-07 12:26:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:26:55.739956745 +0000 UTC m=+131.727789010" watchObservedRunningTime="2025-10-07 12:26:55.743772425 +0000 UTC m=+131.731604690" Oct 07 12:26:56 crc kubenswrapper[4854]: I1007 12:26:56.302082 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-7494c98dcc-k8tbg" Oct 07 12:27:08 crc kubenswrapper[4854]: I1007 12:27:08.452802 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7s59d"] Oct 07 12:27:08 crc kubenswrapper[4854]: I1007 12:27:08.453877 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7s59d" podUID="87c2c08e-dd9a-47b9-8e82-5419fdb6cda8" containerName="registry-server" containerID="cri-o://3a3a0ebfb1d077bddbca365dd95b49deca87b1158bfc34ef7de03f7c3c27a7e2" gracePeriod=30 Oct 07 12:27:08 crc kubenswrapper[4854]: I1007 12:27:08.460650 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gnz24"] Oct 07 12:27:08 crc kubenswrapper[4854]: I1007 12:27:08.461043 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gnz24" podUID="f1de49b7-71c6-450c-ab89-0c37fb18b32a" containerName="registry-server" containerID="cri-o://c07ac8dabbf7fe77a854d7d589e8a659fbacb855f2896a58bfdf65cc02402a32" gracePeriod=30 
Oct 07 12:27:08 crc kubenswrapper[4854]: I1007 12:27:08.470659 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-nkh4b"] Oct 07 12:27:08 crc kubenswrapper[4854]: I1007 12:27:08.470956 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-nkh4b" podUID="329204da-6485-459a-bf30-0ae870c46ca2" containerName="marketplace-operator" containerID="cri-o://33fcf1ba3846e030da024080cd90003baf5c10189156f3e2006d7d778b9aa2bb" gracePeriod=30 Oct 07 12:27:08 crc kubenswrapper[4854]: I1007 12:27:08.479978 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wn5m7"] Oct 07 12:27:08 crc kubenswrapper[4854]: I1007 12:27:08.480337 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wn5m7" podUID="573bfe9e-2b5e-47dc-9d08-9bccdca9a7b8" containerName="registry-server" containerID="cri-o://e935299c2c78782e238a464493dbeb4ee7c683b72dea3552fea0953ccb46b0b0" gracePeriod=30 Oct 07 12:27:08 crc kubenswrapper[4854]: I1007 12:27:08.508315 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bplpj"] Oct 07 12:27:08 crc kubenswrapper[4854]: I1007 12:27:08.509396 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-bplpj" Oct 07 12:27:08 crc kubenswrapper[4854]: I1007 12:27:08.512250 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zst5f"] Oct 07 12:27:08 crc kubenswrapper[4854]: I1007 12:27:08.512681 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zst5f" podUID="e31cf8e1-9bcc-4bf0-9189-950e81595d38" containerName="registry-server" containerID="cri-o://53deea10f18e0e76fdc0a3d62084e00d88c92a3336e2a713042001c764541007" gracePeriod=30 Oct 07 12:27:08 crc kubenswrapper[4854]: I1007 12:27:08.529922 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bplpj"] Oct 07 12:27:08 crc kubenswrapper[4854]: I1007 12:27:08.585269 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rv5r6\" (UniqueName: \"kubernetes.io/projected/edab8120-5f75-4055-8538-bba0045cd1f2-kube-api-access-rv5r6\") pod \"marketplace-operator-79b997595-bplpj\" (UID: \"edab8120-5f75-4055-8538-bba0045cd1f2\") " pod="openshift-marketplace/marketplace-operator-79b997595-bplpj" Oct 07 12:27:08 crc kubenswrapper[4854]: I1007 12:27:08.585330 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/edab8120-5f75-4055-8538-bba0045cd1f2-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-bplpj\" (UID: \"edab8120-5f75-4055-8538-bba0045cd1f2\") " pod="openshift-marketplace/marketplace-operator-79b997595-bplpj" Oct 07 12:27:08 crc kubenswrapper[4854]: I1007 12:27:08.585391 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/edab8120-5f75-4055-8538-bba0045cd1f2-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-bplpj\" (UID: \"edab8120-5f75-4055-8538-bba0045cd1f2\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-bplpj" Oct 07 12:27:08 crc kubenswrapper[4854]: I1007 12:27:08.686895 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/edab8120-5f75-4055-8538-bba0045cd1f2-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-bplpj\" (UID: \"edab8120-5f75-4055-8538-bba0045cd1f2\") " pod="openshift-marketplace/marketplace-operator-79b997595-bplpj" Oct 07 12:27:08 crc kubenswrapper[4854]: I1007 12:27:08.686971 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rv5r6\" (UniqueName: \"kubernetes.io/projected/edab8120-5f75-4055-8538-bba0045cd1f2-kube-api-access-rv5r6\") pod \"marketplace-operator-79b997595-bplpj\" (UID: \"edab8120-5f75-4055-8538-bba0045cd1f2\") " pod="openshift-marketplace/marketplace-operator-79b997595-bplpj" Oct 07 12:27:08 crc kubenswrapper[4854]: I1007 12:27:08.687002 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/edab8120-5f75-4055-8538-bba0045cd1f2-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-bplpj\" (UID: \"edab8120-5f75-4055-8538-bba0045cd1f2\") " pod="openshift-marketplace/marketplace-operator-79b997595-bplpj" Oct 07 12:27:08 crc kubenswrapper[4854]: I1007 12:27:08.688716 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/edab8120-5f75-4055-8538-bba0045cd1f2-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-bplpj\" (UID: \"edab8120-5f75-4055-8538-bba0045cd1f2\") " pod="openshift-marketplace/marketplace-operator-79b997595-bplpj" Oct 07 12:27:08 crc kubenswrapper[4854]: I1007 12:27:08.707347 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/edab8120-5f75-4055-8538-bba0045cd1f2-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-bplpj\" (UID: \"edab8120-5f75-4055-8538-bba0045cd1f2\") " pod="openshift-marketplace/marketplace-operator-79b997595-bplpj" Oct 07 12:27:08 crc kubenswrapper[4854]: I1007 12:27:08.714240 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rv5r6\" (UniqueName: \"kubernetes.io/projected/edab8120-5f75-4055-8538-bba0045cd1f2-kube-api-access-rv5r6\") pod \"marketplace-operator-79b997595-bplpj\" (UID: \"edab8120-5f75-4055-8538-bba0045cd1f2\") " pod="openshift-marketplace/marketplace-operator-79b997595-bplpj" Oct 07 12:27:08 crc kubenswrapper[4854]: I1007 12:27:08.816303 4854 generic.go:334] "Generic (PLEG): container finished" podID="e31cf8e1-9bcc-4bf0-9189-950e81595d38" containerID="53deea10f18e0e76fdc0a3d62084e00d88c92a3336e2a713042001c764541007" exitCode=0 Oct 07 12:27:08 crc kubenswrapper[4854]: I1007 12:27:08.816396 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zst5f" event={"ID":"e31cf8e1-9bcc-4bf0-9189-950e81595d38","Type":"ContainerDied","Data":"53deea10f18e0e76fdc0a3d62084e00d88c92a3336e2a713042001c764541007"} Oct 07 12:27:08 crc kubenswrapper[4854]: I1007 12:27:08.820174 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wn5m7" event={"ID":"573bfe9e-2b5e-47dc-9d08-9bccdca9a7b8","Type":"ContainerDied","Data":"e935299c2c78782e238a464493dbeb4ee7c683b72dea3552fea0953ccb46b0b0"} Oct 07 
12:27:08 crc kubenswrapper[4854]: I1007 12:27:08.820195 4854 generic.go:334] "Generic (PLEG): container finished" podID="573bfe9e-2b5e-47dc-9d08-9bccdca9a7b8" containerID="e935299c2c78782e238a464493dbeb4ee7c683b72dea3552fea0953ccb46b0b0" exitCode=0 Oct 07 12:27:08 crc kubenswrapper[4854]: I1007 12:27:08.821962 4854 generic.go:334] "Generic (PLEG): container finished" podID="329204da-6485-459a-bf30-0ae870c46ca2" containerID="33fcf1ba3846e030da024080cd90003baf5c10189156f3e2006d7d778b9aa2bb" exitCode=0 Oct 07 12:27:08 crc kubenswrapper[4854]: I1007 12:27:08.822030 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-nkh4b" event={"ID":"329204da-6485-459a-bf30-0ae870c46ca2","Type":"ContainerDied","Data":"33fcf1ba3846e030da024080cd90003baf5c10189156f3e2006d7d778b9aa2bb"} Oct 07 12:27:08 crc kubenswrapper[4854]: I1007 12:27:08.824418 4854 generic.go:334] "Generic (PLEG): container finished" podID="f1de49b7-71c6-450c-ab89-0c37fb18b32a" containerID="c07ac8dabbf7fe77a854d7d589e8a659fbacb855f2896a58bfdf65cc02402a32" exitCode=0 Oct 07 12:27:08 crc kubenswrapper[4854]: I1007 12:27:08.824481 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gnz24" event={"ID":"f1de49b7-71c6-450c-ab89-0c37fb18b32a","Type":"ContainerDied","Data":"c07ac8dabbf7fe77a854d7d589e8a659fbacb855f2896a58bfdf65cc02402a32"} Oct 07 12:27:08 crc kubenswrapper[4854]: I1007 12:27:08.827406 4854 generic.go:334] "Generic (PLEG): container finished" podID="87c2c08e-dd9a-47b9-8e82-5419fdb6cda8" containerID="3a3a0ebfb1d077bddbca365dd95b49deca87b1158bfc34ef7de03f7c3c27a7e2" exitCode=0 Oct 07 12:27:08 crc kubenswrapper[4854]: I1007 12:27:08.827442 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7s59d" event={"ID":"87c2c08e-dd9a-47b9-8e82-5419fdb6cda8","Type":"ContainerDied","Data":"3a3a0ebfb1d077bddbca365dd95b49deca87b1158bfc34ef7de03f7c3c27a7e2"} Oct 07 12:27:08 crc kubenswrapper[4854]: I1007 12:27:08.827485 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7s59d" event={"ID":"87c2c08e-dd9a-47b9-8e82-5419fdb6cda8","Type":"ContainerDied","Data":"fab199944ed004b66c0ca79d88cea00a61018e26c8491205911fa9d2ff7b6658"} Oct 07 12:27:08 crc kubenswrapper[4854]: I1007 12:27:08.827499 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fab199944ed004b66c0ca79d88cea00a61018e26c8491205911fa9d2ff7b6658" Oct 07 12:27:08 crc kubenswrapper[4854]: I1007 12:27:08.903763 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-bplpj" Oct 07 12:27:08 crc kubenswrapper[4854]: I1007 12:27:08.909423 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7s59d" Oct 07 12:27:08 crc kubenswrapper[4854]: I1007 12:27:08.929899 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wn5m7" Oct 07 12:27:08 crc kubenswrapper[4854]: I1007 12:27:08.932207 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-nkh4b" Oct 07 12:27:08 crc kubenswrapper[4854]: I1007 12:27:08.946902 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gnz24" Oct 07 12:27:08 crc kubenswrapper[4854]: I1007 12:27:08.994102 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfbst\" (UniqueName: \"kubernetes.io/projected/329204da-6485-459a-bf30-0ae870c46ca2-kube-api-access-kfbst\") pod \"329204da-6485-459a-bf30-0ae870c46ca2\" (UID: \"329204da-6485-459a-bf30-0ae870c46ca2\") " Oct 07 12:27:08 crc kubenswrapper[4854]: I1007 12:27:08.994162 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1de49b7-71c6-450c-ab89-0c37fb18b32a-catalog-content\") pod \"f1de49b7-71c6-450c-ab89-0c37fb18b32a\" (UID: \"f1de49b7-71c6-450c-ab89-0c37fb18b32a\") " Oct 07 12:27:08 crc kubenswrapper[4854]: I1007 12:27:08.994191 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87c2c08e-dd9a-47b9-8e82-5419fdb6cda8-catalog-content\") pod \"87c2c08e-dd9a-47b9-8e82-5419fdb6cda8\" (UID: \"87c2c08e-dd9a-47b9-8e82-5419fdb6cda8\") " Oct 07 12:27:08 crc kubenswrapper[4854]: I1007 12:27:08.994214 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/329204da-6485-459a-bf30-0ae870c46ca2-marketplace-trusted-ca\") pod \"329204da-6485-459a-bf30-0ae870c46ca2\" (UID: \"329204da-6485-459a-bf30-0ae870c46ca2\") " Oct 07 12:27:08 crc kubenswrapper[4854]: I1007 12:27:08.998463 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/329204da-6485-459a-bf30-0ae870c46ca2-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "329204da-6485-459a-bf30-0ae870c46ca2" (UID: "329204da-6485-459a-bf30-0ae870c46ca2"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.002081 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/329204da-6485-459a-bf30-0ae870c46ca2-kube-api-access-kfbst" (OuterVolumeSpecName: "kube-api-access-kfbst") pod "329204da-6485-459a-bf30-0ae870c46ca2" (UID: "329204da-6485-459a-bf30-0ae870c46ca2"). InnerVolumeSpecName "kube-api-access-kfbst". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.068513 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1de49b7-71c6-450c-ab89-0c37fb18b32a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f1de49b7-71c6-450c-ab89-0c37fb18b32a" (UID: "f1de49b7-71c6-450c-ab89-0c37fb18b32a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.068720 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87c2c08e-dd9a-47b9-8e82-5419fdb6cda8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "87c2c08e-dd9a-47b9-8e82-5419fdb6cda8" (UID: "87c2c08e-dd9a-47b9-8e82-5419fdb6cda8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.095928 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87c2c08e-dd9a-47b9-8e82-5419fdb6cda8-utilities\") pod \"87c2c08e-dd9a-47b9-8e82-5419fdb6cda8\" (UID: \"87c2c08e-dd9a-47b9-8e82-5419fdb6cda8\") " Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.096100 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mnxx\" (UniqueName: \"kubernetes.io/projected/f1de49b7-71c6-450c-ab89-0c37fb18b32a-kube-api-access-2mnxx\") pod \"f1de49b7-71c6-450c-ab89-0c37fb18b32a\" (UID: \"f1de49b7-71c6-450c-ab89-0c37fb18b32a\") " Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.096243 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/573bfe9e-2b5e-47dc-9d08-9bccdca9a7b8-catalog-content\") pod \"573bfe9e-2b5e-47dc-9d08-9bccdca9a7b8\" (UID: \"573bfe9e-2b5e-47dc-9d08-9bccdca9a7b8\") " Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.096365 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5w2t\" (UniqueName: \"kubernetes.io/projected/573bfe9e-2b5e-47dc-9d08-9bccdca9a7b8-kube-api-access-j5w2t\") pod \"573bfe9e-2b5e-47dc-9d08-9bccdca9a7b8\" (UID: \"573bfe9e-2b5e-47dc-9d08-9bccdca9a7b8\") " Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.096413 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/329204da-6485-459a-bf30-0ae870c46ca2-marketplace-operator-metrics\") pod \"329204da-6485-459a-bf30-0ae870c46ca2\" (UID: \"329204da-6485-459a-bf30-0ae870c46ca2\") " Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.096473 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wxz5\" (UniqueName: \"kubernetes.io/projected/87c2c08e-dd9a-47b9-8e82-5419fdb6cda8-kube-api-access-6wxz5\") pod \"87c2c08e-dd9a-47b9-8e82-5419fdb6cda8\" (UID: \"87c2c08e-dd9a-47b9-8e82-5419fdb6cda8\") " Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.096535 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/573bfe9e-2b5e-47dc-9d08-9bccdca9a7b8-utilities\") pod \"573bfe9e-2b5e-47dc-9d08-9bccdca9a7b8\" (UID: \"573bfe9e-2b5e-47dc-9d08-9bccdca9a7b8\") " Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.096567 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1de49b7-71c6-450c-ab89-0c37fb18b32a-utilities\") pod \"f1de49b7-71c6-450c-ab89-0c37fb18b32a\" (UID: \"f1de49b7-71c6-450c-ab89-0c37fb18b32a\") " Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.096875 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87c2c08e-dd9a-47b9-8e82-5419fdb6cda8-utilities" (OuterVolumeSpecName: "utilities") pod "87c2c08e-dd9a-47b9-8e82-5419fdb6cda8" (UID: "87c2c08e-dd9a-47b9-8e82-5419fdb6cda8"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.097120 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87c2c08e-dd9a-47b9-8e82-5419fdb6cda8-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.097135 4854 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/329204da-6485-459a-bf30-0ae870c46ca2-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.097169 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87c2c08e-dd9a-47b9-8e82-5419fdb6cda8-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.097183 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1de49b7-71c6-450c-ab89-0c37fb18b32a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.097196 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfbst\" (UniqueName: \"kubernetes.io/projected/329204da-6485-459a-bf30-0ae870c46ca2-kube-api-access-kfbst\") on node \"crc\" DevicePath \"\"" Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.097863 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1de49b7-71c6-450c-ab89-0c37fb18b32a-utilities" (OuterVolumeSpecName: "utilities") pod "f1de49b7-71c6-450c-ab89-0c37fb18b32a" (UID: "f1de49b7-71c6-450c-ab89-0c37fb18b32a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.097909 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/573bfe9e-2b5e-47dc-9d08-9bccdca9a7b8-utilities" (OuterVolumeSpecName: "utilities") pod "573bfe9e-2b5e-47dc-9d08-9bccdca9a7b8" (UID: "573bfe9e-2b5e-47dc-9d08-9bccdca9a7b8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.100348 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87c2c08e-dd9a-47b9-8e82-5419fdb6cda8-kube-api-access-6wxz5" (OuterVolumeSpecName: "kube-api-access-6wxz5") pod "87c2c08e-dd9a-47b9-8e82-5419fdb6cda8" (UID: "87c2c08e-dd9a-47b9-8e82-5419fdb6cda8"). InnerVolumeSpecName "kube-api-access-6wxz5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.101859 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1de49b7-71c6-450c-ab89-0c37fb18b32a-kube-api-access-2mnxx" (OuterVolumeSpecName: "kube-api-access-2mnxx") pod "f1de49b7-71c6-450c-ab89-0c37fb18b32a" (UID: "f1de49b7-71c6-450c-ab89-0c37fb18b32a"). InnerVolumeSpecName "kube-api-access-2mnxx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.101945 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/573bfe9e-2b5e-47dc-9d08-9bccdca9a7b8-kube-api-access-j5w2t" (OuterVolumeSpecName: "kube-api-access-j5w2t") pod "573bfe9e-2b5e-47dc-9d08-9bccdca9a7b8" (UID: "573bfe9e-2b5e-47dc-9d08-9bccdca9a7b8"). InnerVolumeSpecName "kube-api-access-j5w2t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.103846 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/329204da-6485-459a-bf30-0ae870c46ca2-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "329204da-6485-459a-bf30-0ae870c46ca2" (UID: "329204da-6485-459a-bf30-0ae870c46ca2"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.113696 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/573bfe9e-2b5e-47dc-9d08-9bccdca9a7b8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "573bfe9e-2b5e-47dc-9d08-9bccdca9a7b8" (UID: "573bfe9e-2b5e-47dc-9d08-9bccdca9a7b8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.165564 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bplpj"] Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.199546 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/573bfe9e-2b5e-47dc-9d08-9bccdca9a7b8-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.199581 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1de49b7-71c6-450c-ab89-0c37fb18b32a-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.199594 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mnxx\" (UniqueName: \"kubernetes.io/projected/f1de49b7-71c6-450c-ab89-0c37fb18b32a-kube-api-access-2mnxx\") on node \"crc\" DevicePath \"\"" Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.199606 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/573bfe9e-2b5e-47dc-9d08-9bccdca9a7b8-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.199615 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5w2t\" (UniqueName: \"kubernetes.io/projected/573bfe9e-2b5e-47dc-9d08-9bccdca9a7b8-kube-api-access-j5w2t\") on node \"crc\" DevicePath \"\"" Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.199625 4854 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/329204da-6485-459a-bf30-0ae870c46ca2-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.199637 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wxz5\" (UniqueName: \"kubernetes.io/projected/87c2c08e-dd9a-47b9-8e82-5419fdb6cda8-kube-api-access-6wxz5\") on node \"crc\" DevicePath \"\"" Oct 
07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.349482 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zst5f" Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.501763 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e31cf8e1-9bcc-4bf0-9189-950e81595d38-catalog-content\") pod \"e31cf8e1-9bcc-4bf0-9189-950e81595d38\" (UID: \"e31cf8e1-9bcc-4bf0-9189-950e81595d38\") " Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.501827 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e31cf8e1-9bcc-4bf0-9189-950e81595d38-utilities\") pod \"e31cf8e1-9bcc-4bf0-9189-950e81595d38\" (UID: \"e31cf8e1-9bcc-4bf0-9189-950e81595d38\") " Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.501947 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xsps7\" (UniqueName: \"kubernetes.io/projected/e31cf8e1-9bcc-4bf0-9189-950e81595d38-kube-api-access-xsps7\") pod \"e31cf8e1-9bcc-4bf0-9189-950e81595d38\" (UID: \"e31cf8e1-9bcc-4bf0-9189-950e81595d38\") " Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.502729 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e31cf8e1-9bcc-4bf0-9189-950e81595d38-utilities" (OuterVolumeSpecName: "utilities") pod "e31cf8e1-9bcc-4bf0-9189-950e81595d38" (UID: "e31cf8e1-9bcc-4bf0-9189-950e81595d38"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.503200 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e31cf8e1-9bcc-4bf0-9189-950e81595d38-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.505370 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e31cf8e1-9bcc-4bf0-9189-950e81595d38-kube-api-access-xsps7" (OuterVolumeSpecName: "kube-api-access-xsps7") pod "e31cf8e1-9bcc-4bf0-9189-950e81595d38" (UID: "e31cf8e1-9bcc-4bf0-9189-950e81595d38"). InnerVolumeSpecName "kube-api-access-xsps7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.602705 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e31cf8e1-9bcc-4bf0-9189-950e81595d38-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e31cf8e1-9bcc-4bf0-9189-950e81595d38" (UID: "e31cf8e1-9bcc-4bf0-9189-950e81595d38"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.604910 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e31cf8e1-9bcc-4bf0-9189-950e81595d38-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.604951 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xsps7\" (UniqueName: \"kubernetes.io/projected/e31cf8e1-9bcc-4bf0-9189-950e81595d38-kube-api-access-xsps7\") on node \"crc\" DevicePath \"\"" Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.837119 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zst5f" event={"ID":"e31cf8e1-9bcc-4bf0-9189-950e81595d38","Type":"ContainerDied","Data":"c492240cad7e765ce2fe16a4b023e385ba4cf0dfde1fa0466ec057dbe587b061"} Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.837180 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zst5f" Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.837229 4854 scope.go:117] "RemoveContainer" containerID="53deea10f18e0e76fdc0a3d62084e00d88c92a3336e2a713042001c764541007" Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.839860 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wn5m7" event={"ID":"573bfe9e-2b5e-47dc-9d08-9bccdca9a7b8","Type":"ContainerDied","Data":"eda3f62d6b983fbc852aad92fdb3709c93fa8d5013037eb80a4c84a7e956ae69"} Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.839922 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wn5m7" Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.841490 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-nkh4b" Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.841527 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-nkh4b" event={"ID":"329204da-6485-459a-bf30-0ae870c46ca2","Type":"ContainerDied","Data":"0762e223565550e4607d4d0cfa4a10e11783430320c8bf8ff08ab67d03b93d50"} Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.845186 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gnz24" event={"ID":"f1de49b7-71c6-450c-ab89-0c37fb18b32a","Type":"ContainerDied","Data":"43c7c4bb6c694f43e57e4cc6ede701032fd69701a586a2b87ff859d1d33387fc"} Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.845308 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gnz24" Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.851650 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-bplpj" event={"ID":"edab8120-5f75-4055-8538-bba0045cd1f2","Type":"ContainerStarted","Data":"49cb92460696c4502cad78a3a17a03c9f3e7afa74bf13e6e696e3ab912ac4ed6"} Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.851705 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-bplpj" event={"ID":"edab8120-5f75-4055-8538-bba0045cd1f2","Type":"ContainerStarted","Data":"8d34ed8b6cbbc78b68e38b687006f3ef10f8cb13872f3c765cc3bcbc011f8736"} Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.851725 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7s59d" Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.855673 4854 scope.go:117] "RemoveContainer" containerID="ff1af110163ccec3e98465b0e7935881e3fa0b1fde56ce13b5d7f87e73f42c1a" Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.901396 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-bplpj" podStartSLOduration=1.901366201 podStartE2EDuration="1.901366201s" podCreationTimestamp="2025-10-07 12:27:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:27:09.893741001 +0000 UTC m=+145.881573256" watchObservedRunningTime="2025-10-07 12:27:09.901366201 +0000 UTC m=+145.889198456" Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.903582 4854 scope.go:117] "RemoveContainer" containerID="d10aa26c8cb4dc0a99da5a7b94b543cf071ce1012b41962157f4f22f89263455" Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.916699 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wn5m7"] Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.920438 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wn5m7"] Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.934891 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-nkh4b"] Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.939821 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-nkh4b"] Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.940450 4854 scope.go:117] "RemoveContainer" containerID="e935299c2c78782e238a464493dbeb4ee7c683b72dea3552fea0953ccb46b0b0" Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.943480 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gnz24"] Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.945429 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gnz24"] Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.954780 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zst5f"] Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.958095 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zst5f"] Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.961351 4854 scope.go:117] 
"RemoveContainer" containerID="def927a5514997fdbc27dc01f48d1b411872978049138270a5748babe4f131a4" Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.963735 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7s59d"] Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.968917 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7s59d"] Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.975645 4854 scope.go:117] "RemoveContainer" containerID="7d444d8cc24a4a36c65b7e6c8ddc040ce526c4fa501eec3a85478dd346f2542d" Oct 07 12:27:09 crc kubenswrapper[4854]: I1007 12:27:09.996169 4854 scope.go:117] "RemoveContainer" containerID="33fcf1ba3846e030da024080cd90003baf5c10189156f3e2006d7d778b9aa2bb" Oct 07 12:27:10 crc kubenswrapper[4854]: I1007 12:27:10.010904 4854 scope.go:117] "RemoveContainer" containerID="c07ac8dabbf7fe77a854d7d589e8a659fbacb855f2896a58bfdf65cc02402a32" Oct 07 12:27:10 crc kubenswrapper[4854]: I1007 12:27:10.025182 4854 scope.go:117] "RemoveContainer" containerID="c630e58e6d22169a2e0a3cf2846de0e3376f92c7acf2122ea3a11efff281b537" Oct 07 12:27:10 crc kubenswrapper[4854]: I1007 12:27:10.050739 4854 scope.go:117] "RemoveContainer" containerID="f08b48fcdb02df154fab97a3b33d81a3092f36ffbb8de343cef5d7b9c953a38a" Oct 07 12:27:10 crc kubenswrapper[4854]: I1007 12:27:10.714092 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="329204da-6485-459a-bf30-0ae870c46ca2" path="/var/lib/kubelet/pods/329204da-6485-459a-bf30-0ae870c46ca2/volumes" Oct 07 12:27:10 crc kubenswrapper[4854]: I1007 12:27:10.715685 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="573bfe9e-2b5e-47dc-9d08-9bccdca9a7b8" path="/var/lib/kubelet/pods/573bfe9e-2b5e-47dc-9d08-9bccdca9a7b8/volumes" Oct 07 12:27:10 crc kubenswrapper[4854]: I1007 12:27:10.716560 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87c2c08e-dd9a-47b9-8e82-5419fdb6cda8" path="/var/lib/kubelet/pods/87c2c08e-dd9a-47b9-8e82-5419fdb6cda8/volumes" Oct 07 12:27:10 crc kubenswrapper[4854]: I1007 12:27:10.717999 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e31cf8e1-9bcc-4bf0-9189-950e81595d38" path="/var/lib/kubelet/pods/e31cf8e1-9bcc-4bf0-9189-950e81595d38/volumes" Oct 07 12:27:10 crc kubenswrapper[4854]: I1007 12:27:10.718802 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1de49b7-71c6-450c-ab89-0c37fb18b32a" path="/var/lib/kubelet/pods/f1de49b7-71c6-450c-ab89-0c37fb18b32a/volumes" Oct 07 12:27:10 crc kubenswrapper[4854]: I1007 12:27:10.807340 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 12:27:10 crc kubenswrapper[4854]: I1007 12:27:10.807727 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 12:27:10 crc kubenswrapper[4854]: I1007 12:27:10.865075 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-bplpj" Oct 07 12:27:10 crc 
kubenswrapper[4854]: I1007 12:27:10.873102 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-bplpj" Oct 07 12:27:11 crc kubenswrapper[4854]: I1007 12:27:11.558305 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fddbx"] Oct 07 12:27:11 crc kubenswrapper[4854]: E1007 12:27:11.558860 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1de49b7-71c6-450c-ab89-0c37fb18b32a" containerName="extract-utilities" Oct 07 12:27:11 crc kubenswrapper[4854]: I1007 12:27:11.558897 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1de49b7-71c6-450c-ab89-0c37fb18b32a" containerName="extract-utilities" Oct 07 12:27:11 crc kubenswrapper[4854]: E1007 12:27:11.558925 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1de49b7-71c6-450c-ab89-0c37fb18b32a" containerName="registry-server" Oct 07 12:27:11 crc kubenswrapper[4854]: I1007 12:27:11.558945 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1de49b7-71c6-450c-ab89-0c37fb18b32a" containerName="registry-server" Oct 07 12:27:11 crc kubenswrapper[4854]: E1007 12:27:11.559008 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="573bfe9e-2b5e-47dc-9d08-9bccdca9a7b8" containerName="registry-server" Oct 07 12:27:11 crc kubenswrapper[4854]: I1007 12:27:11.559026 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="573bfe9e-2b5e-47dc-9d08-9bccdca9a7b8" containerName="registry-server" Oct 07 12:27:11 crc kubenswrapper[4854]: E1007 12:27:11.559050 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e31cf8e1-9bcc-4bf0-9189-950e81595d38" containerName="extract-utilities" Oct 07 12:27:11 crc kubenswrapper[4854]: I1007 12:27:11.559066 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="e31cf8e1-9bcc-4bf0-9189-950e81595d38" containerName="extract-utilities" Oct 07 12:27:11 crc kubenswrapper[4854]: E1007 12:27:11.559090 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87c2c08e-dd9a-47b9-8e82-5419fdb6cda8" containerName="registry-server" Oct 07 12:27:11 crc kubenswrapper[4854]: I1007 12:27:11.559130 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="87c2c08e-dd9a-47b9-8e82-5419fdb6cda8" containerName="registry-server" Oct 07 12:27:11 crc kubenswrapper[4854]: E1007 12:27:11.559214 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e31cf8e1-9bcc-4bf0-9189-950e81595d38" containerName="registry-server" Oct 07 12:27:11 crc kubenswrapper[4854]: I1007 12:27:11.559244 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="e31cf8e1-9bcc-4bf0-9189-950e81595d38" containerName="registry-server" Oct 07 12:27:11 crc kubenswrapper[4854]: E1007 12:27:11.559261 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1de49b7-71c6-450c-ab89-0c37fb18b32a" containerName="extract-content" Oct 07 12:27:11 crc kubenswrapper[4854]: I1007 12:27:11.559274 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1de49b7-71c6-450c-ab89-0c37fb18b32a" containerName="extract-content" Oct 07 12:27:11 crc kubenswrapper[4854]: E1007 12:27:11.559287 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="573bfe9e-2b5e-47dc-9d08-9bccdca9a7b8" containerName="extract-content" Oct 07 12:27:11 crc kubenswrapper[4854]: I1007 12:27:11.559300 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="573bfe9e-2b5e-47dc-9d08-9bccdca9a7b8" containerName="extract-content" Oct 07 12:27:11 crc 
kubenswrapper[4854]: E1007 12:27:11.559319 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="573bfe9e-2b5e-47dc-9d08-9bccdca9a7b8" containerName="extract-utilities" Oct 07 12:27:11 crc kubenswrapper[4854]: I1007 12:27:11.559333 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="573bfe9e-2b5e-47dc-9d08-9bccdca9a7b8" containerName="extract-utilities" Oct 07 12:27:11 crc kubenswrapper[4854]: E1007 12:27:11.559347 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87c2c08e-dd9a-47b9-8e82-5419fdb6cda8" containerName="extract-utilities" Oct 07 12:27:11 crc kubenswrapper[4854]: I1007 12:27:11.559362 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="87c2c08e-dd9a-47b9-8e82-5419fdb6cda8" containerName="extract-utilities" Oct 07 12:27:11 crc kubenswrapper[4854]: E1007 12:27:11.559379 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87c2c08e-dd9a-47b9-8e82-5419fdb6cda8" containerName="extract-content" Oct 07 12:27:11 crc kubenswrapper[4854]: I1007 12:27:11.559403 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="87c2c08e-dd9a-47b9-8e82-5419fdb6cda8" containerName="extract-content" Oct 07 12:27:11 crc kubenswrapper[4854]: E1007 12:27:11.559429 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e31cf8e1-9bcc-4bf0-9189-950e81595d38" containerName="extract-content" Oct 07 12:27:11 crc kubenswrapper[4854]: I1007 12:27:11.559444 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="e31cf8e1-9bcc-4bf0-9189-950e81595d38" containerName="extract-content" Oct 07 12:27:11 crc kubenswrapper[4854]: E1007 12:27:11.559470 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="329204da-6485-459a-bf30-0ae870c46ca2" containerName="marketplace-operator" Oct 07 12:27:11 crc kubenswrapper[4854]: I1007 12:27:11.559486 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="329204da-6485-459a-bf30-0ae870c46ca2" containerName="marketplace-operator" Oct 07 12:27:11 crc kubenswrapper[4854]: I1007 12:27:11.559713 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="329204da-6485-459a-bf30-0ae870c46ca2" containerName="marketplace-operator" Oct 07 12:27:11 crc kubenswrapper[4854]: I1007 12:27:11.559747 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1de49b7-71c6-450c-ab89-0c37fb18b32a" containerName="registry-server" Oct 07 12:27:11 crc kubenswrapper[4854]: I1007 12:27:11.559769 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="e31cf8e1-9bcc-4bf0-9189-950e81595d38" containerName="registry-server" Oct 07 12:27:11 crc kubenswrapper[4854]: I1007 12:27:11.559786 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="573bfe9e-2b5e-47dc-9d08-9bccdca9a7b8" containerName="registry-server" Oct 07 12:27:11 crc kubenswrapper[4854]: I1007 12:27:11.559802 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="87c2c08e-dd9a-47b9-8e82-5419fdb6cda8" containerName="registry-server" Oct 07 12:27:11 crc kubenswrapper[4854]: I1007 12:27:11.561446 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fddbx" Oct 07 12:27:11 crc kubenswrapper[4854]: I1007 12:27:11.564318 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 07 12:27:11 crc kubenswrapper[4854]: I1007 12:27:11.571223 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fddbx"] Oct 07 12:27:11 crc kubenswrapper[4854]: I1007 12:27:11.733648 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzc7m\" (UniqueName: \"kubernetes.io/projected/7078ad02-317f-4f7f-a11b-7c1d24adac56-kube-api-access-zzc7m\") pod \"redhat-marketplace-fddbx\" (UID: \"7078ad02-317f-4f7f-a11b-7c1d24adac56\") " pod="openshift-marketplace/redhat-marketplace-fddbx" Oct 07 12:27:11 crc kubenswrapper[4854]: I1007 12:27:11.733717 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7078ad02-317f-4f7f-a11b-7c1d24adac56-catalog-content\") pod \"redhat-marketplace-fddbx\" (UID: \"7078ad02-317f-4f7f-a11b-7c1d24adac56\") " pod="openshift-marketplace/redhat-marketplace-fddbx" Oct 07 12:27:11 crc kubenswrapper[4854]: I1007 12:27:11.733749 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7078ad02-317f-4f7f-a11b-7c1d24adac56-utilities\") pod \"redhat-marketplace-fddbx\" (UID: \"7078ad02-317f-4f7f-a11b-7c1d24adac56\") " pod="openshift-marketplace/redhat-marketplace-fddbx" Oct 07 12:27:11 crc kubenswrapper[4854]: I1007 12:27:11.756590 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-d4k2f"] Oct 07 12:27:11 crc kubenswrapper[4854]: I1007 12:27:11.758264 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-d4k2f" Oct 07 12:27:11 crc kubenswrapper[4854]: I1007 12:27:11.762333 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 07 12:27:11 crc kubenswrapper[4854]: I1007 12:27:11.770746 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d4k2f"] Oct 07 12:27:11 crc kubenswrapper[4854]: I1007 12:27:11.834732 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzc7m\" (UniqueName: \"kubernetes.io/projected/7078ad02-317f-4f7f-a11b-7c1d24adac56-kube-api-access-zzc7m\") pod \"redhat-marketplace-fddbx\" (UID: \"7078ad02-317f-4f7f-a11b-7c1d24adac56\") " pod="openshift-marketplace/redhat-marketplace-fddbx" Oct 07 12:27:11 crc kubenswrapper[4854]: I1007 12:27:11.834791 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7078ad02-317f-4f7f-a11b-7c1d24adac56-catalog-content\") pod \"redhat-marketplace-fddbx\" (UID: \"7078ad02-317f-4f7f-a11b-7c1d24adac56\") " pod="openshift-marketplace/redhat-marketplace-fddbx" Oct 07 12:27:11 crc kubenswrapper[4854]: I1007 12:27:11.834820 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7078ad02-317f-4f7f-a11b-7c1d24adac56-utilities\") pod \"redhat-marketplace-fddbx\" (UID: \"7078ad02-317f-4f7f-a11b-7c1d24adac56\") " pod="openshift-marketplace/redhat-marketplace-fddbx" Oct 07 12:27:11 crc kubenswrapper[4854]: I1007 12:27:11.835430 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7078ad02-317f-4f7f-a11b-7c1d24adac56-utilities\") pod \"redhat-marketplace-fddbx\" (UID: \"7078ad02-317f-4f7f-a11b-7c1d24adac56\") " pod="openshift-marketplace/redhat-marketplace-fddbx" Oct 07 12:27:11 crc kubenswrapper[4854]: I1007 12:27:11.835673 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7078ad02-317f-4f7f-a11b-7c1d24adac56-catalog-content\") pod \"redhat-marketplace-fddbx\" (UID: \"7078ad02-317f-4f7f-a11b-7c1d24adac56\") " pod="openshift-marketplace/redhat-marketplace-fddbx" Oct 07 12:27:11 crc kubenswrapper[4854]: I1007 12:27:11.857589 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzc7m\" (UniqueName: \"kubernetes.io/projected/7078ad02-317f-4f7f-a11b-7c1d24adac56-kube-api-access-zzc7m\") pod \"redhat-marketplace-fddbx\" (UID: \"7078ad02-317f-4f7f-a11b-7c1d24adac56\") " pod="openshift-marketplace/redhat-marketplace-fddbx" Oct 07 12:27:11 crc kubenswrapper[4854]: I1007 12:27:11.887928 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fddbx" Oct 07 12:27:11 crc kubenswrapper[4854]: I1007 12:27:11.936613 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69a424ee-6bfe-4135-95fa-beb839c92eab-catalog-content\") pod \"redhat-operators-d4k2f\" (UID: \"69a424ee-6bfe-4135-95fa-beb839c92eab\") " pod="openshift-marketplace/redhat-operators-d4k2f" Oct 07 12:27:11 crc kubenswrapper[4854]: I1007 12:27:11.936709 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmqfc\" (UniqueName: \"kubernetes.io/projected/69a424ee-6bfe-4135-95fa-beb839c92eab-kube-api-access-rmqfc\") pod \"redhat-operators-d4k2f\" (UID: \"69a424ee-6bfe-4135-95fa-beb839c92eab\") " pod="openshift-marketplace/redhat-operators-d4k2f" Oct 07 12:27:11 crc kubenswrapper[4854]: I1007 12:27:11.936777 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69a424ee-6bfe-4135-95fa-beb839c92eab-utilities\") pod \"redhat-operators-d4k2f\" (UID: \"69a424ee-6bfe-4135-95fa-beb839c92eab\") " pod="openshift-marketplace/redhat-operators-d4k2f" Oct 07 12:27:12 crc kubenswrapper[4854]: I1007 12:27:12.037724 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmqfc\" (UniqueName: \"kubernetes.io/projected/69a424ee-6bfe-4135-95fa-beb839c92eab-kube-api-access-rmqfc\") pod \"redhat-operators-d4k2f\" (UID: \"69a424ee-6bfe-4135-95fa-beb839c92eab\") " pod="openshift-marketplace/redhat-operators-d4k2f" Oct 07 12:27:12 crc kubenswrapper[4854]: I1007 12:27:12.038106 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69a424ee-6bfe-4135-95fa-beb839c92eab-utilities\") pod \"redhat-operators-d4k2f\" (UID: \"69a424ee-6bfe-4135-95fa-beb839c92eab\") " pod="openshift-marketplace/redhat-operators-d4k2f" Oct 07 12:27:12 crc kubenswrapper[4854]: I1007 12:27:12.038177 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69a424ee-6bfe-4135-95fa-beb839c92eab-catalog-content\") pod \"redhat-operators-d4k2f\" (UID: \"69a424ee-6bfe-4135-95fa-beb839c92eab\") " pod="openshift-marketplace/redhat-operators-d4k2f" Oct 07 12:27:12 crc kubenswrapper[4854]: I1007 12:27:12.038730 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69a424ee-6bfe-4135-95fa-beb839c92eab-catalog-content\") pod \"redhat-operators-d4k2f\" (UID: \"69a424ee-6bfe-4135-95fa-beb839c92eab\") " pod="openshift-marketplace/redhat-operators-d4k2f" Oct 07 12:27:12 crc kubenswrapper[4854]: I1007 12:27:12.038758 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69a424ee-6bfe-4135-95fa-beb839c92eab-utilities\") pod \"redhat-operators-d4k2f\" (UID: \"69a424ee-6bfe-4135-95fa-beb839c92eab\") " pod="openshift-marketplace/redhat-operators-d4k2f" Oct 07 12:27:12 crc kubenswrapper[4854]: I1007 12:27:12.058685 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmqfc\" (UniqueName: \"kubernetes.io/projected/69a424ee-6bfe-4135-95fa-beb839c92eab-kube-api-access-rmqfc\") pod \"redhat-operators-d4k2f\" (UID: 
\"69a424ee-6bfe-4135-95fa-beb839c92eab\") " pod="openshift-marketplace/redhat-operators-d4k2f" Oct 07 12:27:12 crc kubenswrapper[4854]: I1007 12:27:12.082247 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d4k2f" Oct 07 12:27:12 crc kubenswrapper[4854]: I1007 12:27:12.085907 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fddbx"] Oct 07 12:27:12 crc kubenswrapper[4854]: W1007 12:27:12.103752 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7078ad02_317f_4f7f_a11b_7c1d24adac56.slice/crio-0b1b67bc8c5b8cebb4e0450d8ede4e0ce79471a1a1d0a9060264d73715c09ed5 WatchSource:0}: Error finding container 0b1b67bc8c5b8cebb4e0450d8ede4e0ce79471a1a1d0a9060264d73715c09ed5: Status 404 returned error can't find the container with id 0b1b67bc8c5b8cebb4e0450d8ede4e0ce79471a1a1d0a9060264d73715c09ed5 Oct 07 12:27:12 crc kubenswrapper[4854]: I1007 12:27:12.296883 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d4k2f"] Oct 07 12:27:12 crc kubenswrapper[4854]: W1007 12:27:12.305349 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69a424ee_6bfe_4135_95fa_beb839c92eab.slice/crio-e87ceee1739e194c836c798cc3e6398d4ece72ce033693c629e6fb17c2e64e9e WatchSource:0}: Error finding container e87ceee1739e194c836c798cc3e6398d4ece72ce033693c629e6fb17c2e64e9e: Status 404 returned error can't find the container with id e87ceee1739e194c836c798cc3e6398d4ece72ce033693c629e6fb17c2e64e9e Oct 07 12:27:12 crc kubenswrapper[4854]: I1007 12:27:12.884470 4854 generic.go:334] "Generic (PLEG): container finished" podID="69a424ee-6bfe-4135-95fa-beb839c92eab" containerID="be230a3a61a190bf95dfbb12f22694c8dfa5da3b195f7816283e3a2a173aab98" exitCode=0 Oct 07 12:27:12 crc kubenswrapper[4854]: I1007 12:27:12.884598 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d4k2f" event={"ID":"69a424ee-6bfe-4135-95fa-beb839c92eab","Type":"ContainerDied","Data":"be230a3a61a190bf95dfbb12f22694c8dfa5da3b195f7816283e3a2a173aab98"} Oct 07 12:27:12 crc kubenswrapper[4854]: I1007 12:27:12.885025 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d4k2f" event={"ID":"69a424ee-6bfe-4135-95fa-beb839c92eab","Type":"ContainerStarted","Data":"e87ceee1739e194c836c798cc3e6398d4ece72ce033693c629e6fb17c2e64e9e"} Oct 07 12:27:12 crc kubenswrapper[4854]: I1007 12:27:12.889120 4854 generic.go:334] "Generic (PLEG): container finished" podID="7078ad02-317f-4f7f-a11b-7c1d24adac56" containerID="3ca1de371275f3a3629e7ba306eef371e320c317b1388f0cd759abad03cb3d82" exitCode=0 Oct 07 12:27:12 crc kubenswrapper[4854]: I1007 12:27:12.889175 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fddbx" event={"ID":"7078ad02-317f-4f7f-a11b-7c1d24adac56","Type":"ContainerDied","Data":"3ca1de371275f3a3629e7ba306eef371e320c317b1388f0cd759abad03cb3d82"} Oct 07 12:27:12 crc kubenswrapper[4854]: I1007 12:27:12.889251 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fddbx" event={"ID":"7078ad02-317f-4f7f-a11b-7c1d24adac56","Type":"ContainerStarted","Data":"0b1b67bc8c5b8cebb4e0450d8ede4e0ce79471a1a1d0a9060264d73715c09ed5"} Oct 07 12:27:13 crc kubenswrapper[4854]: I1007 
12:27:13.898294 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d4k2f" event={"ID":"69a424ee-6bfe-4135-95fa-beb839c92eab","Type":"ContainerStarted","Data":"6d81f5b74f5c6c9c6d2c649ce83cb82ba41e4735fb037fc1637c5fa75e94ca3c"} Oct 07 12:27:13 crc kubenswrapper[4854]: I1007 12:27:13.900566 4854 generic.go:334] "Generic (PLEG): container finished" podID="7078ad02-317f-4f7f-a11b-7c1d24adac56" containerID="18addda6f2d2f19a16a320d5461f0f0baabadf5cf4eb7379eeafc44394a6623a" exitCode=0 Oct 07 12:27:13 crc kubenswrapper[4854]: I1007 12:27:13.900625 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fddbx" event={"ID":"7078ad02-317f-4f7f-a11b-7c1d24adac56","Type":"ContainerDied","Data":"18addda6f2d2f19a16a320d5461f0f0baabadf5cf4eb7379eeafc44394a6623a"} Oct 07 12:27:13 crc kubenswrapper[4854]: I1007 12:27:13.966424 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rvktg"] Oct 07 12:27:13 crc kubenswrapper[4854]: I1007 12:27:13.977366 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rvktg"] Oct 07 12:27:13 crc kubenswrapper[4854]: I1007 12:27:13.977570 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rvktg" Oct 07 12:27:13 crc kubenswrapper[4854]: I1007 12:27:13.985454 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 07 12:27:14 crc kubenswrapper[4854]: I1007 12:27:14.074049 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx6qf\" (UniqueName: \"kubernetes.io/projected/d9fd837b-e039-4782-8fa7-272f368cc9cd-kube-api-access-bx6qf\") pod \"certified-operators-rvktg\" (UID: \"d9fd837b-e039-4782-8fa7-272f368cc9cd\") " pod="openshift-marketplace/certified-operators-rvktg" Oct 07 12:27:14 crc kubenswrapper[4854]: I1007 12:27:14.074546 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9fd837b-e039-4782-8fa7-272f368cc9cd-utilities\") pod \"certified-operators-rvktg\" (UID: \"d9fd837b-e039-4782-8fa7-272f368cc9cd\") " pod="openshift-marketplace/certified-operators-rvktg" Oct 07 12:27:14 crc kubenswrapper[4854]: I1007 12:27:14.074824 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9fd837b-e039-4782-8fa7-272f368cc9cd-catalog-content\") pod \"certified-operators-rvktg\" (UID: \"d9fd837b-e039-4782-8fa7-272f368cc9cd\") " pod="openshift-marketplace/certified-operators-rvktg" Oct 07 12:27:14 crc kubenswrapper[4854]: I1007 12:27:14.155843 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ttrh6"] Oct 07 12:27:14 crc kubenswrapper[4854]: I1007 12:27:14.157088 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ttrh6" Oct 07 12:27:14 crc kubenswrapper[4854]: I1007 12:27:14.160630 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 07 12:27:14 crc kubenswrapper[4854]: I1007 12:27:14.173052 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ttrh6"] Oct 07 12:27:14 crc kubenswrapper[4854]: I1007 12:27:14.177362 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9fd837b-e039-4782-8fa7-272f368cc9cd-utilities\") pod \"certified-operators-rvktg\" (UID: \"d9fd837b-e039-4782-8fa7-272f368cc9cd\") " pod="openshift-marketplace/certified-operators-rvktg" Oct 07 12:27:14 crc kubenswrapper[4854]: I1007 12:27:14.177436 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9fd837b-e039-4782-8fa7-272f368cc9cd-catalog-content\") pod \"certified-operators-rvktg\" (UID: \"d9fd837b-e039-4782-8fa7-272f368cc9cd\") " pod="openshift-marketplace/certified-operators-rvktg" Oct 07 12:27:14 crc kubenswrapper[4854]: I1007 12:27:14.177489 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bx6qf\" (UniqueName: \"kubernetes.io/projected/d9fd837b-e039-4782-8fa7-272f368cc9cd-kube-api-access-bx6qf\") pod \"certified-operators-rvktg\" (UID: \"d9fd837b-e039-4782-8fa7-272f368cc9cd\") " pod="openshift-marketplace/certified-operators-rvktg" Oct 07 12:27:14 crc kubenswrapper[4854]: I1007 12:27:14.178450 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9fd837b-e039-4782-8fa7-272f368cc9cd-utilities\") pod \"certified-operators-rvktg\" (UID: \"d9fd837b-e039-4782-8fa7-272f368cc9cd\") " pod="openshift-marketplace/certified-operators-rvktg" Oct 07 12:27:14 crc kubenswrapper[4854]: I1007 12:27:14.178751 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9fd837b-e039-4782-8fa7-272f368cc9cd-catalog-content\") pod \"certified-operators-rvktg\" (UID: \"d9fd837b-e039-4782-8fa7-272f368cc9cd\") " pod="openshift-marketplace/certified-operators-rvktg" Oct 07 12:27:14 crc kubenswrapper[4854]: I1007 12:27:14.201540 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bx6qf\" (UniqueName: \"kubernetes.io/projected/d9fd837b-e039-4782-8fa7-272f368cc9cd-kube-api-access-bx6qf\") pod \"certified-operators-rvktg\" (UID: \"d9fd837b-e039-4782-8fa7-272f368cc9cd\") " pod="openshift-marketplace/certified-operators-rvktg" Oct 07 12:27:14 crc kubenswrapper[4854]: I1007 12:27:14.279830 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdgbs\" (UniqueName: \"kubernetes.io/projected/539f954c-bee8-4661-af57-1ba452d3dddb-kube-api-access-gdgbs\") pod \"community-operators-ttrh6\" (UID: \"539f954c-bee8-4661-af57-1ba452d3dddb\") " pod="openshift-marketplace/community-operators-ttrh6" Oct 07 12:27:14 crc kubenswrapper[4854]: I1007 12:27:14.280014 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/539f954c-bee8-4661-af57-1ba452d3dddb-utilities\") pod \"community-operators-ttrh6\" (UID: 
\"539f954c-bee8-4661-af57-1ba452d3dddb\") " pod="openshift-marketplace/community-operators-ttrh6" Oct 07 12:27:14 crc kubenswrapper[4854]: I1007 12:27:14.280209 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/539f954c-bee8-4661-af57-1ba452d3dddb-catalog-content\") pod \"community-operators-ttrh6\" (UID: \"539f954c-bee8-4661-af57-1ba452d3dddb\") " pod="openshift-marketplace/community-operators-ttrh6" Oct 07 12:27:14 crc kubenswrapper[4854]: I1007 12:27:14.308628 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rvktg" Oct 07 12:27:14 crc kubenswrapper[4854]: I1007 12:27:14.381628 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/539f954c-bee8-4661-af57-1ba452d3dddb-utilities\") pod \"community-operators-ttrh6\" (UID: \"539f954c-bee8-4661-af57-1ba452d3dddb\") " pod="openshift-marketplace/community-operators-ttrh6" Oct 07 12:27:14 crc kubenswrapper[4854]: I1007 12:27:14.382158 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/539f954c-bee8-4661-af57-1ba452d3dddb-catalog-content\") pod \"community-operators-ttrh6\" (UID: \"539f954c-bee8-4661-af57-1ba452d3dddb\") " pod="openshift-marketplace/community-operators-ttrh6" Oct 07 12:27:14 crc kubenswrapper[4854]: I1007 12:27:14.382259 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdgbs\" (UniqueName: \"kubernetes.io/projected/539f954c-bee8-4661-af57-1ba452d3dddb-kube-api-access-gdgbs\") pod \"community-operators-ttrh6\" (UID: \"539f954c-bee8-4661-af57-1ba452d3dddb\") " pod="openshift-marketplace/community-operators-ttrh6" Oct 07 12:27:14 crc kubenswrapper[4854]: I1007 12:27:14.383459 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/539f954c-bee8-4661-af57-1ba452d3dddb-utilities\") pod \"community-operators-ttrh6\" (UID: \"539f954c-bee8-4661-af57-1ba452d3dddb\") " pod="openshift-marketplace/community-operators-ttrh6" Oct 07 12:27:14 crc kubenswrapper[4854]: I1007 12:27:14.383784 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/539f954c-bee8-4661-af57-1ba452d3dddb-catalog-content\") pod \"community-operators-ttrh6\" (UID: \"539f954c-bee8-4661-af57-1ba452d3dddb\") " pod="openshift-marketplace/community-operators-ttrh6" Oct 07 12:27:14 crc kubenswrapper[4854]: I1007 12:27:14.412855 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdgbs\" (UniqueName: \"kubernetes.io/projected/539f954c-bee8-4661-af57-1ba452d3dddb-kube-api-access-gdgbs\") pod \"community-operators-ttrh6\" (UID: \"539f954c-bee8-4661-af57-1ba452d3dddb\") " pod="openshift-marketplace/community-operators-ttrh6" Oct 07 12:27:14 crc kubenswrapper[4854]: I1007 12:27:14.477978 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ttrh6" Oct 07 12:27:14 crc kubenswrapper[4854]: I1007 12:27:14.557260 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rvktg"] Oct 07 12:27:14 crc kubenswrapper[4854]: W1007 12:27:14.566641 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9fd837b_e039_4782_8fa7_272f368cc9cd.slice/crio-3400c1de6b29ab77f339fc7237b3488743bb9ffa1bd34c0ca443fab73ccc23dd WatchSource:0}: Error finding container 3400c1de6b29ab77f339fc7237b3488743bb9ffa1bd34c0ca443fab73ccc23dd: Status 404 returned error can't find the container with id 3400c1de6b29ab77f339fc7237b3488743bb9ffa1bd34c0ca443fab73ccc23dd Oct 07 12:27:14 crc kubenswrapper[4854]: W1007 12:27:14.715900 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod539f954c_bee8_4661_af57_1ba452d3dddb.slice/crio-629487324c355f3d5ed2f81586f0f9be0dbaf1416b98c821ee244a637a703215 WatchSource:0}: Error finding container 629487324c355f3d5ed2f81586f0f9be0dbaf1416b98c821ee244a637a703215: Status 404 returned error can't find the container with id 629487324c355f3d5ed2f81586f0f9be0dbaf1416b98c821ee244a637a703215 Oct 07 12:27:14 crc kubenswrapper[4854]: I1007 12:27:14.717285 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ttrh6"] Oct 07 12:27:14 crc kubenswrapper[4854]: I1007 12:27:14.910625 4854 generic.go:334] "Generic (PLEG): container finished" podID="d9fd837b-e039-4782-8fa7-272f368cc9cd" containerID="72d3f007478b952194d08c4b1d7617d92263456a89b670ef5d8a3cac50d3a1a4" exitCode=0 Oct 07 12:27:14 crc kubenswrapper[4854]: I1007 12:27:14.910743 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rvktg" event={"ID":"d9fd837b-e039-4782-8fa7-272f368cc9cd","Type":"ContainerDied","Data":"72d3f007478b952194d08c4b1d7617d92263456a89b670ef5d8a3cac50d3a1a4"} Oct 07 12:27:14 crc kubenswrapper[4854]: I1007 12:27:14.911289 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rvktg" event={"ID":"d9fd837b-e039-4782-8fa7-272f368cc9cd","Type":"ContainerStarted","Data":"3400c1de6b29ab77f339fc7237b3488743bb9ffa1bd34c0ca443fab73ccc23dd"} Oct 07 12:27:14 crc kubenswrapper[4854]: I1007 12:27:14.917070 4854 generic.go:334] "Generic (PLEG): container finished" podID="69a424ee-6bfe-4135-95fa-beb839c92eab" containerID="6d81f5b74f5c6c9c6d2c649ce83cb82ba41e4735fb037fc1637c5fa75e94ca3c" exitCode=0 Oct 07 12:27:14 crc kubenswrapper[4854]: I1007 12:27:14.917131 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d4k2f" event={"ID":"69a424ee-6bfe-4135-95fa-beb839c92eab","Type":"ContainerDied","Data":"6d81f5b74f5c6c9c6d2c649ce83cb82ba41e4735fb037fc1637c5fa75e94ca3c"} Oct 07 12:27:14 crc kubenswrapper[4854]: I1007 12:27:14.925019 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ttrh6" event={"ID":"539f954c-bee8-4661-af57-1ba452d3dddb","Type":"ContainerStarted","Data":"901fdfd2ea0022078bf45523bfb5e108326cf7a377710d971c19646bfc4d884d"} Oct 07 12:27:14 crc kubenswrapper[4854]: I1007 12:27:14.925189 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ttrh6" 
event={"ID":"539f954c-bee8-4661-af57-1ba452d3dddb","Type":"ContainerStarted","Data":"629487324c355f3d5ed2f81586f0f9be0dbaf1416b98c821ee244a637a703215"} Oct 07 12:27:14 crc kubenswrapper[4854]: I1007 12:27:14.938649 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fddbx" event={"ID":"7078ad02-317f-4f7f-a11b-7c1d24adac56","Type":"ContainerStarted","Data":"c9a16a720b2ad89022c032574468d8161fe36ba2923526dc523f9ffa28a8cffe"} Oct 07 12:27:14 crc kubenswrapper[4854]: I1007 12:27:14.994403 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fddbx" podStartSLOduration=2.440600386 podStartE2EDuration="3.99438061s" podCreationTimestamp="2025-10-07 12:27:11 +0000 UTC" firstStartedPulling="2025-10-07 12:27:12.89080508 +0000 UTC m=+148.878637335" lastFinishedPulling="2025-10-07 12:27:14.444585314 +0000 UTC m=+150.432417559" observedRunningTime="2025-10-07 12:27:14.994134283 +0000 UTC m=+150.981966538" watchObservedRunningTime="2025-10-07 12:27:14.99438061 +0000 UTC m=+150.982212865" Oct 07 12:27:15 crc kubenswrapper[4854]: I1007 12:27:15.955933 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d4k2f" event={"ID":"69a424ee-6bfe-4135-95fa-beb839c92eab","Type":"ContainerStarted","Data":"6c8196f19e2a2c277b182e834a5ac2434f12faaab9046e44e3b13b259d22192e"} Oct 07 12:27:15 crc kubenswrapper[4854]: I1007 12:27:15.957681 4854 generic.go:334] "Generic (PLEG): container finished" podID="539f954c-bee8-4661-af57-1ba452d3dddb" containerID="901fdfd2ea0022078bf45523bfb5e108326cf7a377710d971c19646bfc4d884d" exitCode=0 Oct 07 12:27:15 crc kubenswrapper[4854]: I1007 12:27:15.957889 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ttrh6" event={"ID":"539f954c-bee8-4661-af57-1ba452d3dddb","Type":"ContainerDied","Data":"901fdfd2ea0022078bf45523bfb5e108326cf7a377710d971c19646bfc4d884d"} Oct 07 12:27:15 crc kubenswrapper[4854]: I1007 12:27:15.989002 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-d4k2f" podStartSLOduration=2.529675416 podStartE2EDuration="4.988937305s" podCreationTimestamp="2025-10-07 12:27:11 +0000 UTC" firstStartedPulling="2025-10-07 12:27:12.889034029 +0000 UTC m=+148.876866294" lastFinishedPulling="2025-10-07 12:27:15.348295938 +0000 UTC m=+151.336128183" observedRunningTime="2025-10-07 12:27:15.98530058 +0000 UTC m=+151.973132845" watchObservedRunningTime="2025-10-07 12:27:15.988937305 +0000 UTC m=+151.976769570" Oct 07 12:27:16 crc kubenswrapper[4854]: I1007 12:27:16.973228 4854 generic.go:334] "Generic (PLEG): container finished" podID="d9fd837b-e039-4782-8fa7-272f368cc9cd" containerID="ce19bb1f5b13bd5c033c578b4cef57cf5881c876b7e2938850502d8660340225" exitCode=0 Oct 07 12:27:16 crc kubenswrapper[4854]: I1007 12:27:16.973334 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rvktg" event={"ID":"d9fd837b-e039-4782-8fa7-272f368cc9cd","Type":"ContainerDied","Data":"ce19bb1f5b13bd5c033c578b4cef57cf5881c876b7e2938850502d8660340225"} Oct 07 12:27:17 crc kubenswrapper[4854]: I1007 12:27:17.983600 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rvktg" event={"ID":"d9fd837b-e039-4782-8fa7-272f368cc9cd","Type":"ContainerStarted","Data":"6af263cee8cd6e2a56ca9b03164e63c28347aadfb6812058fc8e42504ee00bae"} Oct 07 12:27:17 crc 
kubenswrapper[4854]: I1007 12:27:17.988064 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ttrh6" event={"ID":"539f954c-bee8-4661-af57-1ba452d3dddb","Type":"ContainerStarted","Data":"330dfaac06987d2e152c11b9799a9ab8398adb7f2e1b6795b99209fa5f102b89"} Oct 07 12:27:18 crc kubenswrapper[4854]: I1007 12:27:18.002353 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rvktg" podStartSLOduration=2.333529451 podStartE2EDuration="5.002329164s" podCreationTimestamp="2025-10-07 12:27:13 +0000 UTC" firstStartedPulling="2025-10-07 12:27:14.913819776 +0000 UTC m=+150.901652031" lastFinishedPulling="2025-10-07 12:27:17.582619499 +0000 UTC m=+153.570451744" observedRunningTime="2025-10-07 12:27:17.998517395 +0000 UTC m=+153.986349690" watchObservedRunningTime="2025-10-07 12:27:18.002329164 +0000 UTC m=+153.990161419" Oct 07 12:27:18 crc kubenswrapper[4854]: I1007 12:27:18.997069 4854 generic.go:334] "Generic (PLEG): container finished" podID="539f954c-bee8-4661-af57-1ba452d3dddb" containerID="330dfaac06987d2e152c11b9799a9ab8398adb7f2e1b6795b99209fa5f102b89" exitCode=0 Oct 07 12:27:18 crc kubenswrapper[4854]: I1007 12:27:18.997194 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ttrh6" event={"ID":"539f954c-bee8-4661-af57-1ba452d3dddb","Type":"ContainerDied","Data":"330dfaac06987d2e152c11b9799a9ab8398adb7f2e1b6795b99209fa5f102b89"} Oct 07 12:27:20 crc kubenswrapper[4854]: I1007 12:27:20.005620 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ttrh6" event={"ID":"539f954c-bee8-4661-af57-1ba452d3dddb","Type":"ContainerStarted","Data":"bbc68da619105374d4c86c6dd2ab0766919479932c7052d48583a5b7c742fe8c"} Oct 07 12:27:20 crc kubenswrapper[4854]: I1007 12:27:20.027323 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ttrh6" podStartSLOduration=2.529642839 podStartE2EDuration="6.027302467s" podCreationTimestamp="2025-10-07 12:27:14 +0000 UTC" firstStartedPulling="2025-10-07 12:27:15.959411543 +0000 UTC m=+151.947243838" lastFinishedPulling="2025-10-07 12:27:19.457071221 +0000 UTC m=+155.444903466" observedRunningTime="2025-10-07 12:27:20.023654052 +0000 UTC m=+156.011486307" watchObservedRunningTime="2025-10-07 12:27:20.027302467 +0000 UTC m=+156.015134722" Oct 07 12:27:21 crc kubenswrapper[4854]: I1007 12:27:21.888626 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fddbx" Oct 07 12:27:21 crc kubenswrapper[4854]: I1007 12:27:21.888701 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fddbx" Oct 07 12:27:21 crc kubenswrapper[4854]: I1007 12:27:21.935258 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fddbx" Oct 07 12:27:22 crc kubenswrapper[4854]: I1007 12:27:22.057703 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fddbx" Oct 07 12:27:22 crc kubenswrapper[4854]: I1007 12:27:22.082449 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-d4k2f" Oct 07 12:27:22 crc kubenswrapper[4854]: I1007 12:27:22.082509 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-d4k2f" Oct 07 12:27:22 crc kubenswrapper[4854]: I1007 12:27:22.127421 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-d4k2f" Oct 07 12:27:23 crc kubenswrapper[4854]: I1007 12:27:23.071134 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-d4k2f" Oct 07 12:27:24 crc kubenswrapper[4854]: I1007 12:27:24.310596 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rvktg" Oct 07 12:27:24 crc kubenswrapper[4854]: I1007 12:27:24.311737 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rvktg" Oct 07 12:27:24 crc kubenswrapper[4854]: I1007 12:27:24.369708 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rvktg" Oct 07 12:27:24 crc kubenswrapper[4854]: I1007 12:27:24.478627 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ttrh6" Oct 07 12:27:24 crc kubenswrapper[4854]: I1007 12:27:24.478692 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ttrh6" Oct 07 12:27:24 crc kubenswrapper[4854]: I1007 12:27:24.521761 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ttrh6" Oct 07 12:27:24 crc kubenswrapper[4854]: I1007 12:27:24.730812 4854 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","burstable","pod2c45dd65-0c11-4c29-8f62-d667bd61d974"] err="unable to destroy cgroup paths for cgroup [kubepods burstable pod2c45dd65-0c11-4c29-8f62-d667bd61d974] : Timed out while waiting for systemd to remove kubepods-burstable-pod2c45dd65_0c11_4c29_8f62_d667bd61d974.slice" Oct 07 12:27:24 crc kubenswrapper[4854]: E1007 12:27:24.730909 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods burstable pod2c45dd65-0c11-4c29-8f62-d667bd61d974] : unable to destroy cgroup paths for cgroup [kubepods burstable pod2c45dd65-0c11-4c29-8f62-d667bd61d974] : Timed out while waiting for systemd to remove kubepods-burstable-pod2c45dd65_0c11_4c29_8f62_d667bd61d974.slice" pod="openshift-authentication/oauth-openshift-558db77b4-87pqd" podUID="2c45dd65-0c11-4c29-8f62-d667bd61d974" Oct 07 12:27:25 crc kubenswrapper[4854]: I1007 12:27:25.034728 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-87pqd" Oct 07 12:27:25 crc kubenswrapper[4854]: I1007 12:27:25.065758 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-87pqd"] Oct 07 12:27:25 crc kubenswrapper[4854]: I1007 12:27:25.076606 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-87pqd"] Oct 07 12:27:25 crc kubenswrapper[4854]: I1007 12:27:25.082327 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rvktg" Oct 07 12:27:25 crc kubenswrapper[4854]: I1007 12:27:25.094351 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ttrh6" Oct 07 12:27:26 crc kubenswrapper[4854]: I1007 12:27:26.710529 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c45dd65-0c11-4c29-8f62-d667bd61d974" path="/var/lib/kubelet/pods/2c45dd65-0c11-4c29-8f62-d667bd61d974/volumes" Oct 07 12:27:40 crc kubenswrapper[4854]: I1007 12:27:40.808368 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 12:27:40 crc kubenswrapper[4854]: I1007 12:27:40.809288 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 12:28:10 crc kubenswrapper[4854]: I1007 12:28:10.808126 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 12:28:10 crc kubenswrapper[4854]: I1007 12:28:10.808813 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 12:28:10 crc kubenswrapper[4854]: I1007 12:28:10.808895 4854 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" Oct 07 12:28:10 crc kubenswrapper[4854]: I1007 12:28:10.809603 4854 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4999b61b3dbea84a514ca4a92eeb5610c1abb6cd234764d1084d95f3698e08c8"} pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 12:28:10 crc kubenswrapper[4854]: I1007 12:28:10.809680 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" 
containerID="cri-o://4999b61b3dbea84a514ca4a92eeb5610c1abb6cd234764d1084d95f3698e08c8" gracePeriod=600 Oct 07 12:28:11 crc kubenswrapper[4854]: I1007 12:28:11.390435 4854 generic.go:334] "Generic (PLEG): container finished" podID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerID="4999b61b3dbea84a514ca4a92eeb5610c1abb6cd234764d1084d95f3698e08c8" exitCode=0 Oct 07 12:28:11 crc kubenswrapper[4854]: I1007 12:28:11.390500 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" event={"ID":"40b8b82d-cfd5-41d7-8673-5774db092c85","Type":"ContainerDied","Data":"4999b61b3dbea84a514ca4a92eeb5610c1abb6cd234764d1084d95f3698e08c8"} Oct 07 12:28:12 crc kubenswrapper[4854]: I1007 12:28:12.402097 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" event={"ID":"40b8b82d-cfd5-41d7-8673-5774db092c85","Type":"ContainerStarted","Data":"2384c1ed3c6c1a4d362b7cd3182c7ae041c6ae68b38e9626a42db83d715f023f"} Oct 07 12:30:00 crc kubenswrapper[4854]: I1007 12:30:00.153546 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330670-ql624"] Oct 07 12:30:00 crc kubenswrapper[4854]: I1007 12:30:00.155032 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330670-ql624" Oct 07 12:30:00 crc kubenswrapper[4854]: I1007 12:30:00.160044 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 07 12:30:00 crc kubenswrapper[4854]: I1007 12:30:00.160509 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 07 12:30:00 crc kubenswrapper[4854]: I1007 12:30:00.165228 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330670-ql624"] Oct 07 12:30:00 crc kubenswrapper[4854]: I1007 12:30:00.249052 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/23b08e27-c5ae-4332-80ad-df3db344f0ee-secret-volume\") pod \"collect-profiles-29330670-ql624\" (UID: \"23b08e27-c5ae-4332-80ad-df3db344f0ee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330670-ql624" Oct 07 12:30:00 crc kubenswrapper[4854]: I1007 12:30:00.249148 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjmjh\" (UniqueName: \"kubernetes.io/projected/23b08e27-c5ae-4332-80ad-df3db344f0ee-kube-api-access-kjmjh\") pod \"collect-profiles-29330670-ql624\" (UID: \"23b08e27-c5ae-4332-80ad-df3db344f0ee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330670-ql624" Oct 07 12:30:00 crc kubenswrapper[4854]: I1007 12:30:00.249556 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/23b08e27-c5ae-4332-80ad-df3db344f0ee-config-volume\") pod \"collect-profiles-29330670-ql624\" (UID: \"23b08e27-c5ae-4332-80ad-df3db344f0ee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330670-ql624" Oct 07 12:30:00 crc kubenswrapper[4854]: I1007 12:30:00.350431 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/23b08e27-c5ae-4332-80ad-df3db344f0ee-secret-volume\") pod \"collect-profiles-29330670-ql624\" (UID: \"23b08e27-c5ae-4332-80ad-df3db344f0ee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330670-ql624" Oct 07 12:30:00 crc kubenswrapper[4854]: I1007 12:30:00.350816 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjmjh\" (UniqueName: \"kubernetes.io/projected/23b08e27-c5ae-4332-80ad-df3db344f0ee-kube-api-access-kjmjh\") pod \"collect-profiles-29330670-ql624\" (UID: \"23b08e27-c5ae-4332-80ad-df3db344f0ee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330670-ql624" Oct 07 12:30:00 crc kubenswrapper[4854]: I1007 12:30:00.350839 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/23b08e27-c5ae-4332-80ad-df3db344f0ee-config-volume\") pod \"collect-profiles-29330670-ql624\" (UID: \"23b08e27-c5ae-4332-80ad-df3db344f0ee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330670-ql624" Oct 07 12:30:00 crc kubenswrapper[4854]: I1007 12:30:00.351826 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/23b08e27-c5ae-4332-80ad-df3db344f0ee-config-volume\") pod \"collect-profiles-29330670-ql624\" (UID: \"23b08e27-c5ae-4332-80ad-df3db344f0ee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330670-ql624" Oct 07 12:30:00 crc kubenswrapper[4854]: I1007 12:30:00.357944 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/23b08e27-c5ae-4332-80ad-df3db344f0ee-secret-volume\") pod \"collect-profiles-29330670-ql624\" (UID: \"23b08e27-c5ae-4332-80ad-df3db344f0ee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330670-ql624" Oct 07 12:30:00 crc kubenswrapper[4854]: I1007 12:30:00.371013 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjmjh\" (UniqueName: \"kubernetes.io/projected/23b08e27-c5ae-4332-80ad-df3db344f0ee-kube-api-access-kjmjh\") pod \"collect-profiles-29330670-ql624\" (UID: \"23b08e27-c5ae-4332-80ad-df3db344f0ee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330670-ql624" Oct 07 12:30:01 crc kubenswrapper[4854]: I1007 12:30:01.226227 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330670-ql624" Oct 07 12:30:01 crc kubenswrapper[4854]: I1007 12:30:01.636796 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330670-ql624"] Oct 07 12:30:02 crc kubenswrapper[4854]: I1007 12:30:02.128868 4854 generic.go:334] "Generic (PLEG): container finished" podID="23b08e27-c5ae-4332-80ad-df3db344f0ee" containerID="33d79c6fba47059b7e54f946e88de2dbd8995c0a3f1e10b4e0fdc606546f0061" exitCode=0 Oct 07 12:30:02 crc kubenswrapper[4854]: I1007 12:30:02.128995 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330670-ql624" event={"ID":"23b08e27-c5ae-4332-80ad-df3db344f0ee","Type":"ContainerDied","Data":"33d79c6fba47059b7e54f946e88de2dbd8995c0a3f1e10b4e0fdc606546f0061"} Oct 07 12:30:02 crc kubenswrapper[4854]: I1007 12:30:02.129568 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330670-ql624" event={"ID":"23b08e27-c5ae-4332-80ad-df3db344f0ee","Type":"ContainerStarted","Data":"ae72c954ce6fb0a7f46725d0a9e2439b3c9b66bd6872aee9b53831dafe470b87"} Oct 07 12:30:03 crc kubenswrapper[4854]: I1007 12:30:03.377920 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330670-ql624" Oct 07 12:30:03 crc kubenswrapper[4854]: I1007 12:30:03.394056 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/23b08e27-c5ae-4332-80ad-df3db344f0ee-secret-volume\") pod \"23b08e27-c5ae-4332-80ad-df3db344f0ee\" (UID: \"23b08e27-c5ae-4332-80ad-df3db344f0ee\") " Oct 07 12:30:03 crc kubenswrapper[4854]: I1007 12:30:03.394131 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/23b08e27-c5ae-4332-80ad-df3db344f0ee-config-volume\") pod \"23b08e27-c5ae-4332-80ad-df3db344f0ee\" (UID: \"23b08e27-c5ae-4332-80ad-df3db344f0ee\") " Oct 07 12:30:03 crc kubenswrapper[4854]: I1007 12:30:03.394255 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjmjh\" (UniqueName: \"kubernetes.io/projected/23b08e27-c5ae-4332-80ad-df3db344f0ee-kube-api-access-kjmjh\") pod \"23b08e27-c5ae-4332-80ad-df3db344f0ee\" (UID: \"23b08e27-c5ae-4332-80ad-df3db344f0ee\") " Oct 07 12:30:03 crc kubenswrapper[4854]: I1007 12:30:03.395808 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23b08e27-c5ae-4332-80ad-df3db344f0ee-config-volume" (OuterVolumeSpecName: "config-volume") pod "23b08e27-c5ae-4332-80ad-df3db344f0ee" (UID: "23b08e27-c5ae-4332-80ad-df3db344f0ee"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:30:03 crc kubenswrapper[4854]: I1007 12:30:03.402599 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23b08e27-c5ae-4332-80ad-df3db344f0ee-kube-api-access-kjmjh" (OuterVolumeSpecName: "kube-api-access-kjmjh") pod "23b08e27-c5ae-4332-80ad-df3db344f0ee" (UID: "23b08e27-c5ae-4332-80ad-df3db344f0ee"). InnerVolumeSpecName "kube-api-access-kjmjh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:30:03 crc kubenswrapper[4854]: I1007 12:30:03.404810 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23b08e27-c5ae-4332-80ad-df3db344f0ee-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "23b08e27-c5ae-4332-80ad-df3db344f0ee" (UID: "23b08e27-c5ae-4332-80ad-df3db344f0ee"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:30:03 crc kubenswrapper[4854]: I1007 12:30:03.495846 4854 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/23b08e27-c5ae-4332-80ad-df3db344f0ee-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 07 12:30:03 crc kubenswrapper[4854]: I1007 12:30:03.495905 4854 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/23b08e27-c5ae-4332-80ad-df3db344f0ee-config-volume\") on node \"crc\" DevicePath \"\"" Oct 07 12:30:03 crc kubenswrapper[4854]: I1007 12:30:03.495932 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjmjh\" (UniqueName: \"kubernetes.io/projected/23b08e27-c5ae-4332-80ad-df3db344f0ee-kube-api-access-kjmjh\") on node \"crc\" DevicePath \"\"" Oct 07 12:30:04 crc kubenswrapper[4854]: I1007 12:30:04.162798 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330670-ql624" event={"ID":"23b08e27-c5ae-4332-80ad-df3db344f0ee","Type":"ContainerDied","Data":"ae72c954ce6fb0a7f46725d0a9e2439b3c9b66bd6872aee9b53831dafe470b87"} Oct 07 12:30:04 crc kubenswrapper[4854]: I1007 12:30:04.162976 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330670-ql624" Oct 07 12:30:04 crc kubenswrapper[4854]: I1007 12:30:04.163040 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae72c954ce6fb0a7f46725d0a9e2439b3c9b66bd6872aee9b53831dafe470b87" Oct 07 12:30:40 crc kubenswrapper[4854]: I1007 12:30:40.808538 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 12:30:40 crc kubenswrapper[4854]: I1007 12:30:40.809363 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 12:31:10 crc kubenswrapper[4854]: I1007 12:31:10.807574 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 12:31:10 crc kubenswrapper[4854]: I1007 12:31:10.808497 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" Oct 07 12:31:13 crc kubenswrapper[4854]: I1007 12:31:13.923529 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-w7c7q"] Oct 07 12:31:13 crc kubenswrapper[4854]: E1007 12:31:13.923829 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23b08e27-c5ae-4332-80ad-df3db344f0ee" containerName="collect-profiles" Oct 07 12:31:13 crc kubenswrapper[4854]: I1007 12:31:13.923844 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="23b08e27-c5ae-4332-80ad-df3db344f0ee" containerName="collect-profiles" Oct 07 12:31:13 crc kubenswrapper[4854]: I1007 12:31:13.923967 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="23b08e27-c5ae-4332-80ad-df3db344f0ee" containerName="collect-profiles" Oct 07 12:31:13 crc kubenswrapper[4854]: I1007 12:31:13.924483 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-w7c7q" Oct 07 12:31:13 crc kubenswrapper[4854]: I1007 12:31:13.939007 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-w7c7q"] Oct 07 12:31:14 crc kubenswrapper[4854]: I1007 12:31:14.028274 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4cdfa8c7-e0e8-43f7-a291-935333ecc328-trusted-ca\") pod \"image-registry-66df7c8f76-w7c7q\" (UID: \"4cdfa8c7-e0e8-43f7-a291-935333ecc328\") " pod="openshift-image-registry/image-registry-66df7c8f76-w7c7q" Oct 07 12:31:14 crc kubenswrapper[4854]: I1007 12:31:14.029020 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4cdfa8c7-e0e8-43f7-a291-935333ecc328-registry-tls\") pod \"image-registry-66df7c8f76-w7c7q\" (UID: \"4cdfa8c7-e0e8-43f7-a291-935333ecc328\") " pod="openshift-image-registry/image-registry-66df7c8f76-w7c7q" Oct 07 12:31:14 crc kubenswrapper[4854]: I1007 12:31:14.029124 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4cdfa8c7-e0e8-43f7-a291-935333ecc328-installation-pull-secrets\") pod \"image-registry-66df7c8f76-w7c7q\" (UID: \"4cdfa8c7-e0e8-43f7-a291-935333ecc328\") " pod="openshift-image-registry/image-registry-66df7c8f76-w7c7q" Oct 07 12:31:14 crc kubenswrapper[4854]: I1007 12:31:14.029279 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4cdfa8c7-e0e8-43f7-a291-935333ecc328-bound-sa-token\") pod \"image-registry-66df7c8f76-w7c7q\" (UID: \"4cdfa8c7-e0e8-43f7-a291-935333ecc328\") " pod="openshift-image-registry/image-registry-66df7c8f76-w7c7q" Oct 07 12:31:14 crc kubenswrapper[4854]: I1007 12:31:14.029388 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfd2c\" (UniqueName: \"kubernetes.io/projected/4cdfa8c7-e0e8-43f7-a291-935333ecc328-kube-api-access-lfd2c\") pod \"image-registry-66df7c8f76-w7c7q\" (UID: \"4cdfa8c7-e0e8-43f7-a291-935333ecc328\") " pod="openshift-image-registry/image-registry-66df7c8f76-w7c7q" Oct 07 12:31:14 crc kubenswrapper[4854]: I1007 12:31:14.029531 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/4cdfa8c7-e0e8-43f7-a291-935333ecc328-registry-certificates\") pod \"image-registry-66df7c8f76-w7c7q\" (UID: \"4cdfa8c7-e0e8-43f7-a291-935333ecc328\") " pod="openshift-image-registry/image-registry-66df7c8f76-w7c7q" Oct 07 12:31:14 crc kubenswrapper[4854]: I1007 12:31:14.029631 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4cdfa8c7-e0e8-43f7-a291-935333ecc328-ca-trust-extracted\") pod \"image-registry-66df7c8f76-w7c7q\" (UID: \"4cdfa8c7-e0e8-43f7-a291-935333ecc328\") " pod="openshift-image-registry/image-registry-66df7c8f76-w7c7q" Oct 07 12:31:14 crc kubenswrapper[4854]: I1007 12:31:14.029742 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-w7c7q\" (UID: \"4cdfa8c7-e0e8-43f7-a291-935333ecc328\") " pod="openshift-image-registry/image-registry-66df7c8f76-w7c7q" Oct 07 12:31:14 crc kubenswrapper[4854]: I1007 12:31:14.057984 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-w7c7q\" (UID: \"4cdfa8c7-e0e8-43f7-a291-935333ecc328\") " pod="openshift-image-registry/image-registry-66df7c8f76-w7c7q" Oct 07 12:31:14 crc kubenswrapper[4854]: I1007 12:31:14.131524 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4cdfa8c7-e0e8-43f7-a291-935333ecc328-registry-tls\") pod \"image-registry-66df7c8f76-w7c7q\" (UID: \"4cdfa8c7-e0e8-43f7-a291-935333ecc328\") " pod="openshift-image-registry/image-registry-66df7c8f76-w7c7q" Oct 07 12:31:14 crc kubenswrapper[4854]: I1007 12:31:14.131585 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4cdfa8c7-e0e8-43f7-a291-935333ecc328-installation-pull-secrets\") pod \"image-registry-66df7c8f76-w7c7q\" (UID: \"4cdfa8c7-e0e8-43f7-a291-935333ecc328\") " pod="openshift-image-registry/image-registry-66df7c8f76-w7c7q" Oct 07 12:31:14 crc kubenswrapper[4854]: I1007 12:31:14.131646 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4cdfa8c7-e0e8-43f7-a291-935333ecc328-bound-sa-token\") pod \"image-registry-66df7c8f76-w7c7q\" (UID: \"4cdfa8c7-e0e8-43f7-a291-935333ecc328\") " pod="openshift-image-registry/image-registry-66df7c8f76-w7c7q" Oct 07 12:31:14 crc kubenswrapper[4854]: I1007 12:31:14.132295 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfd2c\" (UniqueName: \"kubernetes.io/projected/4cdfa8c7-e0e8-43f7-a291-935333ecc328-kube-api-access-lfd2c\") pod \"image-registry-66df7c8f76-w7c7q\" (UID: \"4cdfa8c7-e0e8-43f7-a291-935333ecc328\") " pod="openshift-image-registry/image-registry-66df7c8f76-w7c7q" Oct 07 12:31:14 crc kubenswrapper[4854]: I1007 12:31:14.132687 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4cdfa8c7-e0e8-43f7-a291-935333ecc328-registry-certificates\") pod \"image-registry-66df7c8f76-w7c7q\" 
(UID: \"4cdfa8c7-e0e8-43f7-a291-935333ecc328\") " pod="openshift-image-registry/image-registry-66df7c8f76-w7c7q" Oct 07 12:31:14 crc kubenswrapper[4854]: I1007 12:31:14.132738 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4cdfa8c7-e0e8-43f7-a291-935333ecc328-ca-trust-extracted\") pod \"image-registry-66df7c8f76-w7c7q\" (UID: \"4cdfa8c7-e0e8-43f7-a291-935333ecc328\") " pod="openshift-image-registry/image-registry-66df7c8f76-w7c7q" Oct 07 12:31:14 crc kubenswrapper[4854]: I1007 12:31:14.132943 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4cdfa8c7-e0e8-43f7-a291-935333ecc328-trusted-ca\") pod \"image-registry-66df7c8f76-w7c7q\" (UID: \"4cdfa8c7-e0e8-43f7-a291-935333ecc328\") " pod="openshift-image-registry/image-registry-66df7c8f76-w7c7q" Oct 07 12:31:14 crc kubenswrapper[4854]: I1007 12:31:14.133255 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4cdfa8c7-e0e8-43f7-a291-935333ecc328-ca-trust-extracted\") pod \"image-registry-66df7c8f76-w7c7q\" (UID: \"4cdfa8c7-e0e8-43f7-a291-935333ecc328\") " pod="openshift-image-registry/image-registry-66df7c8f76-w7c7q" Oct 07 12:31:14 crc kubenswrapper[4854]: I1007 12:31:14.134477 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4cdfa8c7-e0e8-43f7-a291-935333ecc328-trusted-ca\") pod \"image-registry-66df7c8f76-w7c7q\" (UID: \"4cdfa8c7-e0e8-43f7-a291-935333ecc328\") " pod="openshift-image-registry/image-registry-66df7c8f76-w7c7q" Oct 07 12:31:14 crc kubenswrapper[4854]: I1007 12:31:14.134663 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4cdfa8c7-e0e8-43f7-a291-935333ecc328-registry-certificates\") pod \"image-registry-66df7c8f76-w7c7q\" (UID: \"4cdfa8c7-e0e8-43f7-a291-935333ecc328\") " pod="openshift-image-registry/image-registry-66df7c8f76-w7c7q" Oct 07 12:31:14 crc kubenswrapper[4854]: I1007 12:31:14.139166 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4cdfa8c7-e0e8-43f7-a291-935333ecc328-installation-pull-secrets\") pod \"image-registry-66df7c8f76-w7c7q\" (UID: \"4cdfa8c7-e0e8-43f7-a291-935333ecc328\") " pod="openshift-image-registry/image-registry-66df7c8f76-w7c7q" Oct 07 12:31:14 crc kubenswrapper[4854]: I1007 12:31:14.139244 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4cdfa8c7-e0e8-43f7-a291-935333ecc328-registry-tls\") pod \"image-registry-66df7c8f76-w7c7q\" (UID: \"4cdfa8c7-e0e8-43f7-a291-935333ecc328\") " pod="openshift-image-registry/image-registry-66df7c8f76-w7c7q" Oct 07 12:31:14 crc kubenswrapper[4854]: I1007 12:31:14.149026 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4cdfa8c7-e0e8-43f7-a291-935333ecc328-bound-sa-token\") pod \"image-registry-66df7c8f76-w7c7q\" (UID: \"4cdfa8c7-e0e8-43f7-a291-935333ecc328\") " pod="openshift-image-registry/image-registry-66df7c8f76-w7c7q" Oct 07 12:31:14 crc kubenswrapper[4854]: I1007 12:31:14.149471 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfd2c\" (UniqueName: 
\"kubernetes.io/projected/4cdfa8c7-e0e8-43f7-a291-935333ecc328-kube-api-access-lfd2c\") pod \"image-registry-66df7c8f76-w7c7q\" (UID: \"4cdfa8c7-e0e8-43f7-a291-935333ecc328\") " pod="openshift-image-registry/image-registry-66df7c8f76-w7c7q" Oct 07 12:31:14 crc kubenswrapper[4854]: I1007 12:31:14.245706 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-w7c7q" Oct 07 12:31:14 crc kubenswrapper[4854]: I1007 12:31:14.497728 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-w7c7q"] Oct 07 12:31:14 crc kubenswrapper[4854]: I1007 12:31:14.646324 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-w7c7q" event={"ID":"4cdfa8c7-e0e8-43f7-a291-935333ecc328","Type":"ContainerStarted","Data":"e45e22a4e6680a8d1362b5057f8c9f5b7152e92c1497fa2d24be7e9dd4f893c4"} Oct 07 12:31:15 crc kubenswrapper[4854]: I1007 12:31:15.655496 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-w7c7q" event={"ID":"4cdfa8c7-e0e8-43f7-a291-935333ecc328","Type":"ContainerStarted","Data":"b7b188e838ddea916f6e37e7e0bf14f7f922602221b6725107fe08314806a7be"} Oct 07 12:31:15 crc kubenswrapper[4854]: I1007 12:31:15.655969 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-w7c7q" Oct 07 12:31:15 crc kubenswrapper[4854]: I1007 12:31:15.690974 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-w7c7q" podStartSLOduration=2.6909405189999998 podStartE2EDuration="2.690940519s" podCreationTimestamp="2025-10-07 12:31:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:31:15.689846814 +0000 UTC m=+391.677679109" watchObservedRunningTime="2025-10-07 12:31:15.690940519 +0000 UTC m=+391.678772804" Oct 07 12:31:34 crc kubenswrapper[4854]: I1007 12:31:34.252751 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-w7c7q" Oct 07 12:31:34 crc kubenswrapper[4854]: I1007 12:31:34.330034 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-44m5s"] Oct 07 12:31:40 crc kubenswrapper[4854]: I1007 12:31:40.807488 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 12:31:40 crc kubenswrapper[4854]: I1007 12:31:40.807901 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 12:31:40 crc kubenswrapper[4854]: I1007 12:31:40.807973 4854 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" Oct 07 12:31:40 crc kubenswrapper[4854]: I1007 12:31:40.808744 4854 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2384c1ed3c6c1a4d362b7cd3182c7ae041c6ae68b38e9626a42db83d715f023f"} pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 12:31:40 crc kubenswrapper[4854]: I1007 12:31:40.808847 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" containerID="cri-o://2384c1ed3c6c1a4d362b7cd3182c7ae041c6ae68b38e9626a42db83d715f023f" gracePeriod=600 Oct 07 12:31:41 crc kubenswrapper[4854]: I1007 12:31:41.841709 4854 generic.go:334] "Generic (PLEG): container finished" podID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerID="2384c1ed3c6c1a4d362b7cd3182c7ae041c6ae68b38e9626a42db83d715f023f" exitCode=0 Oct 07 12:31:41 crc kubenswrapper[4854]: I1007 12:31:41.841832 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" event={"ID":"40b8b82d-cfd5-41d7-8673-5774db092c85","Type":"ContainerDied","Data":"2384c1ed3c6c1a4d362b7cd3182c7ae041c6ae68b38e9626a42db83d715f023f"} Oct 07 12:31:41 crc kubenswrapper[4854]: I1007 12:31:41.842279 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" event={"ID":"40b8b82d-cfd5-41d7-8673-5774db092c85","Type":"ContainerStarted","Data":"f5e36bd11ba8e83568e3659962294b47149b50c3ca170d0c33beb4ab9960604f"} Oct 07 12:31:41 crc kubenswrapper[4854]: I1007 12:31:41.842326 4854 scope.go:117] "RemoveContainer" containerID="4999b61b3dbea84a514ca4a92eeb5610c1abb6cd234764d1084d95f3698e08c8" Oct 07 12:31:44 crc kubenswrapper[4854]: I1007 12:31:44.954109 4854 scope.go:117] "RemoveContainer" containerID="e49df6e67a8514d29c88c5c1c96e817b2fd25df6d80184c7d2c1b670a3952d64" Oct 07 12:31:59 crc kubenswrapper[4854]: I1007 12:31:59.414421 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" podUID="f83bc29b-b3be-4578-bae7-d2867242278c" containerName="registry" containerID="cri-o://15aa6a910972caaf604e16d798f8c368f58a51c9265446916463df8595e8c834" gracePeriod=30 Oct 07 12:31:59 crc kubenswrapper[4854]: I1007 12:31:59.889460 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:31:59 crc kubenswrapper[4854]: I1007 12:31:59.981100 4854 generic.go:334] "Generic (PLEG): container finished" podID="f83bc29b-b3be-4578-bae7-d2867242278c" containerID="15aa6a910972caaf604e16d798f8c368f58a51c9265446916463df8595e8c834" exitCode=0 Oct 07 12:31:59 crc kubenswrapper[4854]: I1007 12:31:59.981204 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" event={"ID":"f83bc29b-b3be-4578-bae7-d2867242278c","Type":"ContainerDied","Data":"15aa6a910972caaf604e16d798f8c368f58a51c9265446916463df8595e8c834"} Oct 07 12:31:59 crc kubenswrapper[4854]: I1007 12:31:59.981247 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" Oct 07 12:31:59 crc kubenswrapper[4854]: I1007 12:31:59.981280 4854 scope.go:117] "RemoveContainer" containerID="15aa6a910972caaf604e16d798f8c368f58a51c9265446916463df8595e8c834" Oct 07 12:31:59 crc kubenswrapper[4854]: I1007 12:31:59.981256 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-44m5s" event={"ID":"f83bc29b-b3be-4578-bae7-d2867242278c","Type":"ContainerDied","Data":"8d51c65b3bcbb17d2ecc397697168e77c67e04eaeeea2a01026039811b7925f7"} Oct 07 12:32:00 crc kubenswrapper[4854]: I1007 12:32:00.005064 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f83bc29b-b3be-4578-bae7-d2867242278c-trusted-ca\") pod \"f83bc29b-b3be-4578-bae7-d2867242278c\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " Oct 07 12:32:00 crc kubenswrapper[4854]: I1007 12:32:00.005236 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f83bc29b-b3be-4578-bae7-d2867242278c-registry-certificates\") pod \"f83bc29b-b3be-4578-bae7-d2867242278c\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " Oct 07 12:32:00 crc kubenswrapper[4854]: I1007 12:32:00.005277 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f83bc29b-b3be-4578-bae7-d2867242278c-registry-tls\") pod \"f83bc29b-b3be-4578-bae7-d2867242278c\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " Oct 07 12:32:00 crc kubenswrapper[4854]: I1007 12:32:00.005371 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmm84\" (UniqueName: \"kubernetes.io/projected/f83bc29b-b3be-4578-bae7-d2867242278c-kube-api-access-zmm84\") pod \"f83bc29b-b3be-4578-bae7-d2867242278c\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " Oct 07 12:32:00 crc kubenswrapper[4854]: I1007 12:32:00.005406 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f83bc29b-b3be-4578-bae7-d2867242278c-ca-trust-extracted\") pod \"f83bc29b-b3be-4578-bae7-d2867242278c\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " Oct 07 12:32:00 crc kubenswrapper[4854]: I1007 12:32:00.005439 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f83bc29b-b3be-4578-bae7-d2867242278c-installation-pull-secrets\") pod \"f83bc29b-b3be-4578-bae7-d2867242278c\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " Oct 07 12:32:00 crc kubenswrapper[4854]: I1007 12:32:00.005464 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f83bc29b-b3be-4578-bae7-d2867242278c-bound-sa-token\") pod \"f83bc29b-b3be-4578-bae7-d2867242278c\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " Oct 07 12:32:00 crc kubenswrapper[4854]: I1007 12:32:00.005670 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"f83bc29b-b3be-4578-bae7-d2867242278c\" (UID: \"f83bc29b-b3be-4578-bae7-d2867242278c\") " Oct 07 12:32:00 crc kubenswrapper[4854]: I1007 12:32:00.007188 
4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f83bc29b-b3be-4578-bae7-d2867242278c-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "f83bc29b-b3be-4578-bae7-d2867242278c" (UID: "f83bc29b-b3be-4578-bae7-d2867242278c"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:32:00 crc kubenswrapper[4854]: I1007 12:32:00.007424 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f83bc29b-b3be-4578-bae7-d2867242278c-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "f83bc29b-b3be-4578-bae7-d2867242278c" (UID: "f83bc29b-b3be-4578-bae7-d2867242278c"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:32:00 crc kubenswrapper[4854]: I1007 12:32:00.010965 4854 scope.go:117] "RemoveContainer" containerID="15aa6a910972caaf604e16d798f8c368f58a51c9265446916463df8595e8c834" Oct 07 12:32:00 crc kubenswrapper[4854]: E1007 12:32:00.011738 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15aa6a910972caaf604e16d798f8c368f58a51c9265446916463df8595e8c834\": container with ID starting with 15aa6a910972caaf604e16d798f8c368f58a51c9265446916463df8595e8c834 not found: ID does not exist" containerID="15aa6a910972caaf604e16d798f8c368f58a51c9265446916463df8595e8c834" Oct 07 12:32:00 crc kubenswrapper[4854]: I1007 12:32:00.011895 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15aa6a910972caaf604e16d798f8c368f58a51c9265446916463df8595e8c834"} err="failed to get container status \"15aa6a910972caaf604e16d798f8c368f58a51c9265446916463df8595e8c834\": rpc error: code = NotFound desc = could not find container \"15aa6a910972caaf604e16d798f8c368f58a51c9265446916463df8595e8c834\": container with ID starting with 15aa6a910972caaf604e16d798f8c368f58a51c9265446916463df8595e8c834 not found: ID does not exist" Oct 07 12:32:00 crc kubenswrapper[4854]: I1007 12:32:00.020253 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "f83bc29b-b3be-4578-bae7-d2867242278c" (UID: "f83bc29b-b3be-4578-bae7-d2867242278c"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 07 12:32:00 crc kubenswrapper[4854]: I1007 12:32:00.022493 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f83bc29b-b3be-4578-bae7-d2867242278c-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "f83bc29b-b3be-4578-bae7-d2867242278c" (UID: "f83bc29b-b3be-4578-bae7-d2867242278c"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:32:00 crc kubenswrapper[4854]: I1007 12:32:00.022576 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f83bc29b-b3be-4578-bae7-d2867242278c-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "f83bc29b-b3be-4578-bae7-d2867242278c" (UID: "f83bc29b-b3be-4578-bae7-d2867242278c"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:32:00 crc kubenswrapper[4854]: I1007 12:32:00.022894 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f83bc29b-b3be-4578-bae7-d2867242278c-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "f83bc29b-b3be-4578-bae7-d2867242278c" (UID: "f83bc29b-b3be-4578-bae7-d2867242278c"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:32:00 crc kubenswrapper[4854]: I1007 12:32:00.023144 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f83bc29b-b3be-4578-bae7-d2867242278c-kube-api-access-zmm84" (OuterVolumeSpecName: "kube-api-access-zmm84") pod "f83bc29b-b3be-4578-bae7-d2867242278c" (UID: "f83bc29b-b3be-4578-bae7-d2867242278c"). InnerVolumeSpecName "kube-api-access-zmm84". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:32:00 crc kubenswrapper[4854]: I1007 12:32:00.041919 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f83bc29b-b3be-4578-bae7-d2867242278c-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "f83bc29b-b3be-4578-bae7-d2867242278c" (UID: "f83bc29b-b3be-4578-bae7-d2867242278c"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:32:00 crc kubenswrapper[4854]: I1007 12:32:00.107874 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmm84\" (UniqueName: \"kubernetes.io/projected/f83bc29b-b3be-4578-bae7-d2867242278c-kube-api-access-zmm84\") on node \"crc\" DevicePath \"\"" Oct 07 12:32:00 crc kubenswrapper[4854]: I1007 12:32:00.107935 4854 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f83bc29b-b3be-4578-bae7-d2867242278c-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 07 12:32:00 crc kubenswrapper[4854]: I1007 12:32:00.107948 4854 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f83bc29b-b3be-4578-bae7-d2867242278c-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 07 12:32:00 crc kubenswrapper[4854]: I1007 12:32:00.107960 4854 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f83bc29b-b3be-4578-bae7-d2867242278c-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 07 12:32:00 crc kubenswrapper[4854]: I1007 12:32:00.107973 4854 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f83bc29b-b3be-4578-bae7-d2867242278c-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 07 12:32:00 crc kubenswrapper[4854]: I1007 12:32:00.107985 4854 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f83bc29b-b3be-4578-bae7-d2867242278c-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 07 12:32:00 crc kubenswrapper[4854]: I1007 12:32:00.107997 4854 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f83bc29b-b3be-4578-bae7-d2867242278c-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 07 12:32:00 crc kubenswrapper[4854]: I1007 12:32:00.323098 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-44m5s"] Oct 07 12:32:00 crc kubenswrapper[4854]: I1007 
12:32:00.329750 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-44m5s"] Oct 07 12:32:00 crc kubenswrapper[4854]: I1007 12:32:00.716279 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f83bc29b-b3be-4578-bae7-d2867242278c" path="/var/lib/kubelet/pods/f83bc29b-b3be-4578-bae7-d2867242278c/volumes" Oct 07 12:32:44 crc kubenswrapper[4854]: I1007 12:32:44.989433 4854 scope.go:117] "RemoveContainer" containerID="09d9a54021829f0632b9eaaaa56ef1adecdd184afc0fe3cd415e5242727ac350" Oct 07 12:32:45 crc kubenswrapper[4854]: I1007 12:32:45.022617 4854 scope.go:117] "RemoveContainer" containerID="3a3a0ebfb1d077bddbca365dd95b49deca87b1158bfc34ef7de03f7c3c27a7e2" Oct 07 12:34:08 crc kubenswrapper[4854]: I1007 12:34:08.840123 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-m4vxt"] Oct 07 12:34:08 crc kubenswrapper[4854]: I1007 12:34:08.848960 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" podUID="9ccb7160-6ff2-43f3-927b-5bf4aced4993" containerName="ovn-controller" containerID="cri-o://1dc2017bde039f1888538091f42cb032f1216ae9080e639ccff79fde98cc05ac" gracePeriod=30 Oct 07 12:34:08 crc kubenswrapper[4854]: I1007 12:34:08.849637 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" podUID="9ccb7160-6ff2-43f3-927b-5bf4aced4993" containerName="sbdb" containerID="cri-o://6dc8baaa306fa92a686458c8f93b90894976bf4702080f18018a4305b17c427d" gracePeriod=30 Oct 07 12:34:08 crc kubenswrapper[4854]: I1007 12:34:08.849712 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" podUID="9ccb7160-6ff2-43f3-927b-5bf4aced4993" containerName="nbdb" containerID="cri-o://6c57b33be29ca7545e4fc52914dd5701cf15a096a947d6332438f046450bed12" gracePeriod=30 Oct 07 12:34:08 crc kubenswrapper[4854]: I1007 12:34:08.849769 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" podUID="9ccb7160-6ff2-43f3-927b-5bf4aced4993" containerName="northd" containerID="cri-o://89728864853aa241a579c1af1bf877a2d1c765c22b0a6e59ebdcd0c5287bf996" gracePeriod=30 Oct 07 12:34:08 crc kubenswrapper[4854]: I1007 12:34:08.849825 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" podUID="9ccb7160-6ff2-43f3-927b-5bf4aced4993" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://e0e44ff6bb2d828e0c7ea0781e5df4b33d788969928f81f8c0cc612506a4ac35" gracePeriod=30 Oct 07 12:34:08 crc kubenswrapper[4854]: I1007 12:34:08.849878 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" podUID="9ccb7160-6ff2-43f3-927b-5bf4aced4993" containerName="kube-rbac-proxy-node" containerID="cri-o://0f2ca50c8f0029b31b8fbe483d0ce72dfc8969ae85381c75e1c379d1f868447f" gracePeriod=30 Oct 07 12:34:08 crc kubenswrapper[4854]: I1007 12:34:08.849965 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" podUID="9ccb7160-6ff2-43f3-927b-5bf4aced4993" containerName="ovn-acl-logging" containerID="cri-o://21c4baccb3eb05829eb125222627fab79bca929659775b3eb7f11e24bc892906" gracePeriod=30 Oct 07 12:34:08 crc kubenswrapper[4854]: I1007 12:34:08.909475 4854 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" podUID="9ccb7160-6ff2-43f3-927b-5bf4aced4993" containerName="ovnkube-controller" containerID="cri-o://680db6f5dd33c83cab0a5cc73d0eecb3f977af950a4ca4f00e7aded23db88b7d" gracePeriod=30 Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.721861 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m4vxt_9ccb7160-6ff2-43f3-927b-5bf4aced4993/ovn-acl-logging/0.log" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.722974 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m4vxt_9ccb7160-6ff2-43f3-927b-5bf4aced4993/ovn-controller/0.log" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.723741 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.822000 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-vsbpg"] Oct 07 12:34:09 crc kubenswrapper[4854]: E1007 12:34:09.822387 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ccb7160-6ff2-43f3-927b-5bf4aced4993" containerName="northd" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.822412 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ccb7160-6ff2-43f3-927b-5bf4aced4993" containerName="northd" Oct 07 12:34:09 crc kubenswrapper[4854]: E1007 12:34:09.822435 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ccb7160-6ff2-43f3-927b-5bf4aced4993" containerName="ovn-controller" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.822448 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ccb7160-6ff2-43f3-927b-5bf4aced4993" containerName="ovn-controller" Oct 07 12:34:09 crc kubenswrapper[4854]: E1007 12:34:09.822461 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ccb7160-6ff2-43f3-927b-5bf4aced4993" containerName="kube-rbac-proxy-node" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.822475 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ccb7160-6ff2-43f3-927b-5bf4aced4993" containerName="kube-rbac-proxy-node" Oct 07 12:34:09 crc kubenswrapper[4854]: E1007 12:34:09.822494 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ccb7160-6ff2-43f3-927b-5bf4aced4993" containerName="sbdb" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.822509 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ccb7160-6ff2-43f3-927b-5bf4aced4993" containerName="sbdb" Oct 07 12:34:09 crc kubenswrapper[4854]: E1007 12:34:09.822529 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ccb7160-6ff2-43f3-927b-5bf4aced4993" containerName="kubecfg-setup" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.822541 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ccb7160-6ff2-43f3-927b-5bf4aced4993" containerName="kubecfg-setup" Oct 07 12:34:09 crc kubenswrapper[4854]: E1007 12:34:09.822559 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f83bc29b-b3be-4578-bae7-d2867242278c" containerName="registry" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.822571 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="f83bc29b-b3be-4578-bae7-d2867242278c" containerName="registry" Oct 07 12:34:09 crc kubenswrapper[4854]: E1007 12:34:09.822585 4854 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="9ccb7160-6ff2-43f3-927b-5bf4aced4993" containerName="ovnkube-controller" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.822597 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ccb7160-6ff2-43f3-927b-5bf4aced4993" containerName="ovnkube-controller" Oct 07 12:34:09 crc kubenswrapper[4854]: E1007 12:34:09.822612 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ccb7160-6ff2-43f3-927b-5bf4aced4993" containerName="nbdb" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.822623 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ccb7160-6ff2-43f3-927b-5bf4aced4993" containerName="nbdb" Oct 07 12:34:09 crc kubenswrapper[4854]: E1007 12:34:09.822641 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ccb7160-6ff2-43f3-927b-5bf4aced4993" containerName="ovn-acl-logging" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.822654 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ccb7160-6ff2-43f3-927b-5bf4aced4993" containerName="ovn-acl-logging" Oct 07 12:34:09 crc kubenswrapper[4854]: E1007 12:34:09.822670 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ccb7160-6ff2-43f3-927b-5bf4aced4993" containerName="kube-rbac-proxy-ovn-metrics" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.822682 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ccb7160-6ff2-43f3-927b-5bf4aced4993" containerName="kube-rbac-proxy-ovn-metrics" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.822871 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="f83bc29b-b3be-4578-bae7-d2867242278c" containerName="registry" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.822890 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ccb7160-6ff2-43f3-927b-5bf4aced4993" containerName="northd" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.822911 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ccb7160-6ff2-43f3-927b-5bf4aced4993" containerName="ovnkube-controller" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.822929 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ccb7160-6ff2-43f3-927b-5bf4aced4993" containerName="ovn-acl-logging" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.822943 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ccb7160-6ff2-43f3-927b-5bf4aced4993" containerName="kube-rbac-proxy-node" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.822961 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ccb7160-6ff2-43f3-927b-5bf4aced4993" containerName="ovn-controller" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.822981 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ccb7160-6ff2-43f3-927b-5bf4aced4993" containerName="kube-rbac-proxy-ovn-metrics" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.822995 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ccb7160-6ff2-43f3-927b-5bf4aced4993" containerName="sbdb" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.823016 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ccb7160-6ff2-43f3-927b-5bf4aced4993" containerName="nbdb" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.827547 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.919894 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-run-openvswitch\") pod \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.920018 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "9ccb7160-6ff2-43f3-927b-5bf4aced4993" (UID: "9ccb7160-6ff2-43f3-927b-5bf4aced4993"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.920418 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9ccb7160-6ff2-43f3-927b-5bf4aced4993-ovnkube-script-lib\") pod \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.920591 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-node-log\") pod \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.920791 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-node-log" (OuterVolumeSpecName: "node-log") pod "9ccb7160-6ff2-43f3-927b-5bf4aced4993" (UID: "9ccb7160-6ff2-43f3-927b-5bf4aced4993"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.920913 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-log-socket\") pod \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.920941 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-host-run-netns\") pod \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.921013 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-run-ovn\") pod \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.921078 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9ccb7160-6ff2-43f3-927b-5bf4aced4993-env-overrides\") pod \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.921114 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-run-systemd\") pod \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.921124 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "9ccb7160-6ff2-43f3-927b-5bf4aced4993" (UID: "9ccb7160-6ff2-43f3-927b-5bf4aced4993"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.921137 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-host-run-ovn-kubernetes\") pod \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.921189 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9ccb7160-6ff2-43f3-927b-5bf4aced4993-ovn-node-metrics-cert\") pod \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.921211 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-log-socket" (OuterVolumeSpecName: "log-socket") pod "9ccb7160-6ff2-43f3-927b-5bf4aced4993" (UID: "9ccb7160-6ff2-43f3-927b-5bf4aced4993"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.921212 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-host-cni-netd\") pod \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.921237 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "9ccb7160-6ff2-43f3-927b-5bf4aced4993" (UID: "9ccb7160-6ff2-43f3-927b-5bf4aced4993"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.921256 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-host-var-lib-cni-networks-ovn-kubernetes\") pod \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.921268 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "9ccb7160-6ff2-43f3-927b-5bf4aced4993" (UID: "9ccb7160-6ff2-43f3-927b-5bf4aced4993"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.921281 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-systemd-units\") pod \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.921297 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-var-lib-openvswitch\") pod \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.921315 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7zc7\" (UniqueName: \"kubernetes.io/projected/9ccb7160-6ff2-43f3-927b-5bf4aced4993-kube-api-access-b7zc7\") pod \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.921331 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-etc-openvswitch\") pod \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.921347 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-host-cni-bin\") pod \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.921366 4854 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-host-slash\") pod \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.921383 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-host-kubelet\") pod \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.921402 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9ccb7160-6ff2-43f3-927b-5bf4aced4993-ovnkube-config\") pod \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\" (UID: \"9ccb7160-6ff2-43f3-927b-5bf4aced4993\") " Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.921401 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ccb7160-6ff2-43f3-927b-5bf4aced4993-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "9ccb7160-6ff2-43f3-927b-5bf4aced4993" (UID: "9ccb7160-6ff2-43f3-927b-5bf4aced4993"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.921507 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0ae09042-a894-400b-987b-6fca25bbbcb9-ovn-node-metrics-cert\") pod \"ovnkube-node-vsbpg\" (UID: \"0ae09042-a894-400b-987b-6fca25bbbcb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.921530 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0ae09042-a894-400b-987b-6fca25bbbcb9-node-log\") pod \"ovnkube-node-vsbpg\" (UID: \"0ae09042-a894-400b-987b-6fca25bbbcb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.921548 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0ae09042-a894-400b-987b-6fca25bbbcb9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vsbpg\" (UID: \"0ae09042-a894-400b-987b-6fca25bbbcb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.921565 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0ae09042-a894-400b-987b-6fca25bbbcb9-host-cni-netd\") pod \"ovnkube-node-vsbpg\" (UID: \"0ae09042-a894-400b-987b-6fca25bbbcb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.921589 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0ae09042-a894-400b-987b-6fca25bbbcb9-ovnkube-script-lib\") pod \"ovnkube-node-vsbpg\" (UID: \"0ae09042-a894-400b-987b-6fca25bbbcb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 
12:34:09.921622 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0ae09042-a894-400b-987b-6fca25bbbcb9-var-lib-openvswitch\") pod \"ovnkube-node-vsbpg\" (UID: \"0ae09042-a894-400b-987b-6fca25bbbcb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.921640 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0ae09042-a894-400b-987b-6fca25bbbcb9-run-openvswitch\") pod \"ovnkube-node-vsbpg\" (UID: \"0ae09042-a894-400b-987b-6fca25bbbcb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.921676 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0ae09042-a894-400b-987b-6fca25bbbcb9-env-overrides\") pod \"ovnkube-node-vsbpg\" (UID: \"0ae09042-a894-400b-987b-6fca25bbbcb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.921691 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0ae09042-a894-400b-987b-6fca25bbbcb9-log-socket\") pod \"ovnkube-node-vsbpg\" (UID: \"0ae09042-a894-400b-987b-6fca25bbbcb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.921713 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0ae09042-a894-400b-987b-6fca25bbbcb9-host-kubelet\") pod \"ovnkube-node-vsbpg\" (UID: \"0ae09042-a894-400b-987b-6fca25bbbcb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.921716 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ccb7160-6ff2-43f3-927b-5bf4aced4993-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "9ccb7160-6ff2-43f3-927b-5bf4aced4993" (UID: "9ccb7160-6ff2-43f3-927b-5bf4aced4993"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.921731 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0ae09042-a894-400b-987b-6fca25bbbcb9-host-run-netns\") pod \"ovnkube-node-vsbpg\" (UID: \"0ae09042-a894-400b-987b-6fca25bbbcb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.921745 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0ae09042-a894-400b-987b-6fca25bbbcb9-run-systemd\") pod \"ovnkube-node-vsbpg\" (UID: \"0ae09042-a894-400b-987b-6fca25bbbcb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.921768 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0ae09042-a894-400b-987b-6fca25bbbcb9-systemd-units\") pod \"ovnkube-node-vsbpg\" (UID: \"0ae09042-a894-400b-987b-6fca25bbbcb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.921799 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0ae09042-a894-400b-987b-6fca25bbbcb9-host-run-ovn-kubernetes\") pod \"ovnkube-node-vsbpg\" (UID: \"0ae09042-a894-400b-987b-6fca25bbbcb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.921815 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0ae09042-a894-400b-987b-6fca25bbbcb9-host-cni-bin\") pod \"ovnkube-node-vsbpg\" (UID: \"0ae09042-a894-400b-987b-6fca25bbbcb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.921835 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0ae09042-a894-400b-987b-6fca25bbbcb9-etc-openvswitch\") pod \"ovnkube-node-vsbpg\" (UID: \"0ae09042-a894-400b-987b-6fca25bbbcb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.921862 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8j28\" (UniqueName: \"kubernetes.io/projected/0ae09042-a894-400b-987b-6fca25bbbcb9-kube-api-access-r8j28\") pod \"ovnkube-node-vsbpg\" (UID: \"0ae09042-a894-400b-987b-6fca25bbbcb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.921880 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0ae09042-a894-400b-987b-6fca25bbbcb9-run-ovn\") pod \"ovnkube-node-vsbpg\" (UID: \"0ae09042-a894-400b-987b-6fca25bbbcb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.921902 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/0ae09042-a894-400b-987b-6fca25bbbcb9-ovnkube-config\") pod \"ovnkube-node-vsbpg\" (UID: \"0ae09042-a894-400b-987b-6fca25bbbcb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.921921 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0ae09042-a894-400b-987b-6fca25bbbcb9-host-slash\") pod \"ovnkube-node-vsbpg\" (UID: \"0ae09042-a894-400b-987b-6fca25bbbcb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.921957 4854 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9ccb7160-6ff2-43f3-927b-5bf4aced4993-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.921967 4854 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-run-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.921975 4854 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-node-log\") on node \"crc\" DevicePath \"\"" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.921984 4854 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-log-socket\") on node \"crc\" DevicePath \"\"" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.921992 4854 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-host-run-netns\") on node \"crc\" DevicePath \"\"" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.922002 4854 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.922013 4854 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9ccb7160-6ff2-43f3-927b-5bf4aced4993-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.922021 4854 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-host-cni-netd\") on node \"crc\" DevicePath \"\"" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.922052 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "9ccb7160-6ff2-43f3-927b-5bf4aced4993" (UID: "9ccb7160-6ff2-43f3-927b-5bf4aced4993"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.922072 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "9ccb7160-6ff2-43f3-927b-5bf4aced4993" (UID: "9ccb7160-6ff2-43f3-927b-5bf4aced4993"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.922089 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-host-slash" (OuterVolumeSpecName: "host-slash") pod "9ccb7160-6ff2-43f3-927b-5bf4aced4993" (UID: "9ccb7160-6ff2-43f3-927b-5bf4aced4993"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.922107 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "9ccb7160-6ff2-43f3-927b-5bf4aced4993" (UID: "9ccb7160-6ff2-43f3-927b-5bf4aced4993"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.922431 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "9ccb7160-6ff2-43f3-927b-5bf4aced4993" (UID: "9ccb7160-6ff2-43f3-927b-5bf4aced4993"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.922458 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "9ccb7160-6ff2-43f3-927b-5bf4aced4993" (UID: "9ccb7160-6ff2-43f3-927b-5bf4aced4993"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.922543 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ccb7160-6ff2-43f3-927b-5bf4aced4993-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "9ccb7160-6ff2-43f3-927b-5bf4aced4993" (UID: "9ccb7160-6ff2-43f3-927b-5bf4aced4993"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.922535 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "9ccb7160-6ff2-43f3-927b-5bf4aced4993" (UID: "9ccb7160-6ff2-43f3-927b-5bf4aced4993"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.923199 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "9ccb7160-6ff2-43f3-927b-5bf4aced4993" (UID: "9ccb7160-6ff2-43f3-927b-5bf4aced4993"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.928566 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ccb7160-6ff2-43f3-927b-5bf4aced4993-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "9ccb7160-6ff2-43f3-927b-5bf4aced4993" (UID: "9ccb7160-6ff2-43f3-927b-5bf4aced4993"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.928965 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m4vxt_9ccb7160-6ff2-43f3-927b-5bf4aced4993/ovn-acl-logging/0.log" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.929777 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m4vxt_9ccb7160-6ff2-43f3-927b-5bf4aced4993/ovn-controller/0.log" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.930350 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ccb7160-6ff2-43f3-927b-5bf4aced4993-kube-api-access-b7zc7" (OuterVolumeSpecName: "kube-api-access-b7zc7") pod "9ccb7160-6ff2-43f3-927b-5bf4aced4993" (UID: "9ccb7160-6ff2-43f3-927b-5bf4aced4993"). InnerVolumeSpecName "kube-api-access-b7zc7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.930611 4854 generic.go:334] "Generic (PLEG): container finished" podID="9ccb7160-6ff2-43f3-927b-5bf4aced4993" containerID="680db6f5dd33c83cab0a5cc73d0eecb3f977af950a4ca4f00e7aded23db88b7d" exitCode=0 Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.930661 4854 generic.go:334] "Generic (PLEG): container finished" podID="9ccb7160-6ff2-43f3-927b-5bf4aced4993" containerID="6dc8baaa306fa92a686458c8f93b90894976bf4702080f18018a4305b17c427d" exitCode=0 Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.930680 4854 generic.go:334] "Generic (PLEG): container finished" podID="9ccb7160-6ff2-43f3-927b-5bf4aced4993" containerID="6c57b33be29ca7545e4fc52914dd5701cf15a096a947d6332438f046450bed12" exitCode=0 Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.930693 4854 generic.go:334] "Generic (PLEG): container finished" podID="9ccb7160-6ff2-43f3-927b-5bf4aced4993" containerID="89728864853aa241a579c1af1bf877a2d1c765c22b0a6e59ebdcd0c5287bf996" exitCode=0 Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.930707 4854 generic.go:334] "Generic (PLEG): container finished" podID="9ccb7160-6ff2-43f3-927b-5bf4aced4993" containerID="e0e44ff6bb2d828e0c7ea0781e5df4b33d788969928f81f8c0cc612506a4ac35" exitCode=0 Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.930719 4854 generic.go:334] "Generic (PLEG): container finished" podID="9ccb7160-6ff2-43f3-927b-5bf4aced4993" containerID="0f2ca50c8f0029b31b8fbe483d0ce72dfc8969ae85381c75e1c379d1f868447f" exitCode=0 Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.930732 4854 generic.go:334] "Generic (PLEG): container finished" podID="9ccb7160-6ff2-43f3-927b-5bf4aced4993" containerID="21c4baccb3eb05829eb125222627fab79bca929659775b3eb7f11e24bc892906" exitCode=143 Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.930747 4854 generic.go:334] "Generic (PLEG): container finished" podID="9ccb7160-6ff2-43f3-927b-5bf4aced4993" containerID="1dc2017bde039f1888538091f42cb032f1216ae9080e639ccff79fde98cc05ac" exitCode=143 Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.930968 
4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.931254 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" event={"ID":"9ccb7160-6ff2-43f3-927b-5bf4aced4993","Type":"ContainerDied","Data":"680db6f5dd33c83cab0a5cc73d0eecb3f977af950a4ca4f00e7aded23db88b7d"} Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.931305 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" event={"ID":"9ccb7160-6ff2-43f3-927b-5bf4aced4993","Type":"ContainerDied","Data":"6dc8baaa306fa92a686458c8f93b90894976bf4702080f18018a4305b17c427d"} Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.931322 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" event={"ID":"9ccb7160-6ff2-43f3-927b-5bf4aced4993","Type":"ContainerDied","Data":"6c57b33be29ca7545e4fc52914dd5701cf15a096a947d6332438f046450bed12"} Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.931334 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" event={"ID":"9ccb7160-6ff2-43f3-927b-5bf4aced4993","Type":"ContainerDied","Data":"89728864853aa241a579c1af1bf877a2d1c765c22b0a6e59ebdcd0c5287bf996"} Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.931350 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" event={"ID":"9ccb7160-6ff2-43f3-927b-5bf4aced4993","Type":"ContainerDied","Data":"e0e44ff6bb2d828e0c7ea0781e5df4b33d788969928f81f8c0cc612506a4ac35"} Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.931367 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" event={"ID":"9ccb7160-6ff2-43f3-927b-5bf4aced4993","Type":"ContainerDied","Data":"0f2ca50c8f0029b31b8fbe483d0ce72dfc8969ae85381c75e1c379d1f868447f"} Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.931385 4854 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"21c4baccb3eb05829eb125222627fab79bca929659775b3eb7f11e24bc892906"} Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.931403 4854 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1dc2017bde039f1888538091f42cb032f1216ae9080e639ccff79fde98cc05ac"} Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.931410 4854 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"572357f77919a83d6f419fffd64a2c085e2744388f232638860c944a7eb76b75"} Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.931420 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" event={"ID":"9ccb7160-6ff2-43f3-927b-5bf4aced4993","Type":"ContainerDied","Data":"21c4baccb3eb05829eb125222627fab79bca929659775b3eb7f11e24bc892906"} Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.931431 4854 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"680db6f5dd33c83cab0a5cc73d0eecb3f977af950a4ca4f00e7aded23db88b7d"} Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.931438 4854 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"6dc8baaa306fa92a686458c8f93b90894976bf4702080f18018a4305b17c427d"} Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.931444 4854 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6c57b33be29ca7545e4fc52914dd5701cf15a096a947d6332438f046450bed12"} Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.931451 4854 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"89728864853aa241a579c1af1bf877a2d1c765c22b0a6e59ebdcd0c5287bf996"} Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.931457 4854 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e0e44ff6bb2d828e0c7ea0781e5df4b33d788969928f81f8c0cc612506a4ac35"} Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.931464 4854 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0f2ca50c8f0029b31b8fbe483d0ce72dfc8969ae85381c75e1c379d1f868447f"} Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.931470 4854 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"21c4baccb3eb05829eb125222627fab79bca929659775b3eb7f11e24bc892906"} Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.931477 4854 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1dc2017bde039f1888538091f42cb032f1216ae9080e639ccff79fde98cc05ac"} Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.931484 4854 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"572357f77919a83d6f419fffd64a2c085e2744388f232638860c944a7eb76b75"} Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.931491 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" event={"ID":"9ccb7160-6ff2-43f3-927b-5bf4aced4993","Type":"ContainerDied","Data":"1dc2017bde039f1888538091f42cb032f1216ae9080e639ccff79fde98cc05ac"} Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.931502 4854 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"680db6f5dd33c83cab0a5cc73d0eecb3f977af950a4ca4f00e7aded23db88b7d"} Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.931508 4854 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6dc8baaa306fa92a686458c8f93b90894976bf4702080f18018a4305b17c427d"} Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.931514 4854 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6c57b33be29ca7545e4fc52914dd5701cf15a096a947d6332438f046450bed12"} Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.931519 4854 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"89728864853aa241a579c1af1bf877a2d1c765c22b0a6e59ebdcd0c5287bf996"} Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.931526 4854 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e0e44ff6bb2d828e0c7ea0781e5df4b33d788969928f81f8c0cc612506a4ac35"} Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.931531 4854 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"0f2ca50c8f0029b31b8fbe483d0ce72dfc8969ae85381c75e1c379d1f868447f"} Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.931536 4854 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"21c4baccb3eb05829eb125222627fab79bca929659775b3eb7f11e24bc892906"} Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.931542 4854 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1dc2017bde039f1888538091f42cb032f1216ae9080e639ccff79fde98cc05ac"} Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.931548 4854 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"572357f77919a83d6f419fffd64a2c085e2744388f232638860c944a7eb76b75"} Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.931555 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m4vxt" event={"ID":"9ccb7160-6ff2-43f3-927b-5bf4aced4993","Type":"ContainerDied","Data":"81a1b553acc5e91bccf70674e862b6a2c4174d17ab70276e16fefe8b99f48902"} Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.931564 4854 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"680db6f5dd33c83cab0a5cc73d0eecb3f977af950a4ca4f00e7aded23db88b7d"} Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.931570 4854 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6dc8baaa306fa92a686458c8f93b90894976bf4702080f18018a4305b17c427d"} Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.931576 4854 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6c57b33be29ca7545e4fc52914dd5701cf15a096a947d6332438f046450bed12"} Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.931582 4854 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"89728864853aa241a579c1af1bf877a2d1c765c22b0a6e59ebdcd0c5287bf996"} Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.931587 4854 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e0e44ff6bb2d828e0c7ea0781e5df4b33d788969928f81f8c0cc612506a4ac35"} Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.931593 4854 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0f2ca50c8f0029b31b8fbe483d0ce72dfc8969ae85381c75e1c379d1f868447f"} Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.931600 4854 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"21c4baccb3eb05829eb125222627fab79bca929659775b3eb7f11e24bc892906"} Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.931606 4854 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1dc2017bde039f1888538091f42cb032f1216ae9080e639ccff79fde98cc05ac"} Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.931612 4854 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"572357f77919a83d6f419fffd64a2c085e2744388f232638860c944a7eb76b75"} Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.931628 4854 scope.go:117] "RemoveContainer" 
containerID="680db6f5dd33c83cab0a5cc73d0eecb3f977af950a4ca4f00e7aded23db88b7d" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.934958 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nkr42_260ab665-6a8a-44ee-9a16-5ff284b35eba/kube-multus/0.log" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.935096 4854 generic.go:334] "Generic (PLEG): container finished" podID="260ab665-6a8a-44ee-9a16-5ff284b35eba" containerID="892e830d128143e462f68fe62bb8ad0c67d8a5f4e51a57e1710e037d34ff9c51" exitCode=2 Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.935163 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nkr42" event={"ID":"260ab665-6a8a-44ee-9a16-5ff284b35eba","Type":"ContainerDied","Data":"892e830d128143e462f68fe62bb8ad0c67d8a5f4e51a57e1710e037d34ff9c51"} Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.935882 4854 scope.go:117] "RemoveContainer" containerID="892e830d128143e462f68fe62bb8ad0c67d8a5f4e51a57e1710e037d34ff9c51" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.942457 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "9ccb7160-6ff2-43f3-927b-5bf4aced4993" (UID: "9ccb7160-6ff2-43f3-927b-5bf4aced4993"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.971078 4854 scope.go:117] "RemoveContainer" containerID="6dc8baaa306fa92a686458c8f93b90894976bf4702080f18018a4305b17c427d" Oct 07 12:34:09 crc kubenswrapper[4854]: I1007 12:34:09.999255 4854 scope.go:117] "RemoveContainer" containerID="6c57b33be29ca7545e4fc52914dd5701cf15a096a947d6332438f046450bed12" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.022430 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8j28\" (UniqueName: \"kubernetes.io/projected/0ae09042-a894-400b-987b-6fca25bbbcb9-kube-api-access-r8j28\") pod \"ovnkube-node-vsbpg\" (UID: \"0ae09042-a894-400b-987b-6fca25bbbcb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.022478 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0ae09042-a894-400b-987b-6fca25bbbcb9-run-ovn\") pod \"ovnkube-node-vsbpg\" (UID: \"0ae09042-a894-400b-987b-6fca25bbbcb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.022502 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0ae09042-a894-400b-987b-6fca25bbbcb9-ovnkube-config\") pod \"ovnkube-node-vsbpg\" (UID: \"0ae09042-a894-400b-987b-6fca25bbbcb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.022521 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0ae09042-a894-400b-987b-6fca25bbbcb9-host-slash\") pod \"ovnkube-node-vsbpg\" (UID: \"0ae09042-a894-400b-987b-6fca25bbbcb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.022559 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/0ae09042-a894-400b-987b-6fca25bbbcb9-node-log\") pod \"ovnkube-node-vsbpg\" (UID: \"0ae09042-a894-400b-987b-6fca25bbbcb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.022583 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0ae09042-a894-400b-987b-6fca25bbbcb9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vsbpg\" (UID: \"0ae09042-a894-400b-987b-6fca25bbbcb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.022602 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0ae09042-a894-400b-987b-6fca25bbbcb9-ovn-node-metrics-cert\") pod \"ovnkube-node-vsbpg\" (UID: \"0ae09042-a894-400b-987b-6fca25bbbcb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.022598 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0ae09042-a894-400b-987b-6fca25bbbcb9-run-ovn\") pod \"ovnkube-node-vsbpg\" (UID: \"0ae09042-a894-400b-987b-6fca25bbbcb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.022644 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0ae09042-a894-400b-987b-6fca25bbbcb9-host-cni-netd\") pod \"ovnkube-node-vsbpg\" (UID: \"0ae09042-a894-400b-987b-6fca25bbbcb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.022621 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0ae09042-a894-400b-987b-6fca25bbbcb9-host-cni-netd\") pod \"ovnkube-node-vsbpg\" (UID: \"0ae09042-a894-400b-987b-6fca25bbbcb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.022674 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0ae09042-a894-400b-987b-6fca25bbbcb9-host-slash\") pod \"ovnkube-node-vsbpg\" (UID: \"0ae09042-a894-400b-987b-6fca25bbbcb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.022704 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0ae09042-a894-400b-987b-6fca25bbbcb9-node-log\") pod \"ovnkube-node-vsbpg\" (UID: \"0ae09042-a894-400b-987b-6fca25bbbcb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.022702 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0ae09042-a894-400b-987b-6fca25bbbcb9-ovnkube-script-lib\") pod \"ovnkube-node-vsbpg\" (UID: \"0ae09042-a894-400b-987b-6fca25bbbcb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.022765 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0ae09042-a894-400b-987b-6fca25bbbcb9-var-lib-openvswitch\") pod 
\"ovnkube-node-vsbpg\" (UID: \"0ae09042-a894-400b-987b-6fca25bbbcb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.022785 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0ae09042-a894-400b-987b-6fca25bbbcb9-run-openvswitch\") pod \"ovnkube-node-vsbpg\" (UID: \"0ae09042-a894-400b-987b-6fca25bbbcb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.022806 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0ae09042-a894-400b-987b-6fca25bbbcb9-env-overrides\") pod \"ovnkube-node-vsbpg\" (UID: \"0ae09042-a894-400b-987b-6fca25bbbcb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.022823 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0ae09042-a894-400b-987b-6fca25bbbcb9-log-socket\") pod \"ovnkube-node-vsbpg\" (UID: \"0ae09042-a894-400b-987b-6fca25bbbcb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.022864 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0ae09042-a894-400b-987b-6fca25bbbcb9-host-kubelet\") pod \"ovnkube-node-vsbpg\" (UID: \"0ae09042-a894-400b-987b-6fca25bbbcb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.022882 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0ae09042-a894-400b-987b-6fca25bbbcb9-host-run-netns\") pod \"ovnkube-node-vsbpg\" (UID: \"0ae09042-a894-400b-987b-6fca25bbbcb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.022897 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0ae09042-a894-400b-987b-6fca25bbbcb9-run-systemd\") pod \"ovnkube-node-vsbpg\" (UID: \"0ae09042-a894-400b-987b-6fca25bbbcb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.022917 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0ae09042-a894-400b-987b-6fca25bbbcb9-systemd-units\") pod \"ovnkube-node-vsbpg\" (UID: \"0ae09042-a894-400b-987b-6fca25bbbcb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.022939 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0ae09042-a894-400b-987b-6fca25bbbcb9-var-lib-openvswitch\") pod \"ovnkube-node-vsbpg\" (UID: \"0ae09042-a894-400b-987b-6fca25bbbcb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.022972 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0ae09042-a894-400b-987b-6fca25bbbcb9-host-run-ovn-kubernetes\") pod \"ovnkube-node-vsbpg\" (UID: \"0ae09042-a894-400b-987b-6fca25bbbcb9\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.022952 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0ae09042-a894-400b-987b-6fca25bbbcb9-host-run-ovn-kubernetes\") pod \"ovnkube-node-vsbpg\" (UID: \"0ae09042-a894-400b-987b-6fca25bbbcb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.023001 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0ae09042-a894-400b-987b-6fca25bbbcb9-run-openvswitch\") pod \"ovnkube-node-vsbpg\" (UID: \"0ae09042-a894-400b-987b-6fca25bbbcb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.023016 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0ae09042-a894-400b-987b-6fca25bbbcb9-host-cni-bin\") pod \"ovnkube-node-vsbpg\" (UID: \"0ae09042-a894-400b-987b-6fca25bbbcb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.023126 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0ae09042-a894-400b-987b-6fca25bbbcb9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vsbpg\" (UID: \"0ae09042-a894-400b-987b-6fca25bbbcb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.023249 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0ae09042-a894-400b-987b-6fca25bbbcb9-etc-openvswitch\") pod \"ovnkube-node-vsbpg\" (UID: \"0ae09042-a894-400b-987b-6fca25bbbcb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.023304 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0ae09042-a894-400b-987b-6fca25bbbcb9-host-cni-bin\") pod \"ovnkube-node-vsbpg\" (UID: \"0ae09042-a894-400b-987b-6fca25bbbcb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.023339 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0ae09042-a894-400b-987b-6fca25bbbcb9-etc-openvswitch\") pod \"ovnkube-node-vsbpg\" (UID: \"0ae09042-a894-400b-987b-6fca25bbbcb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.023371 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0ae09042-a894-400b-987b-6fca25bbbcb9-systemd-units\") pod \"ovnkube-node-vsbpg\" (UID: \"0ae09042-a894-400b-987b-6fca25bbbcb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.023415 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0ae09042-a894-400b-987b-6fca25bbbcb9-log-socket\") pod \"ovnkube-node-vsbpg\" (UID: \"0ae09042-a894-400b-987b-6fca25bbbcb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" Oct 07 12:34:10 crc 
kubenswrapper[4854]: I1007 12:34:10.023441 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0ae09042-a894-400b-987b-6fca25bbbcb9-host-run-netns\") pod \"ovnkube-node-vsbpg\" (UID: \"0ae09042-a894-400b-987b-6fca25bbbcb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.023467 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0ae09042-a894-400b-987b-6fca25bbbcb9-host-kubelet\") pod \"ovnkube-node-vsbpg\" (UID: \"0ae09042-a894-400b-987b-6fca25bbbcb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.023639 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0ae09042-a894-400b-987b-6fca25bbbcb9-run-systemd\") pod \"ovnkube-node-vsbpg\" (UID: \"0ae09042-a894-400b-987b-6fca25bbbcb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.023664 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0ae09042-a894-400b-987b-6fca25bbbcb9-env-overrides\") pod \"ovnkube-node-vsbpg\" (UID: \"0ae09042-a894-400b-987b-6fca25bbbcb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.023675 4854 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-run-systemd\") on node \"crc\" DevicePath \"\"" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.023727 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0ae09042-a894-400b-987b-6fca25bbbcb9-ovnkube-script-lib\") pod \"ovnkube-node-vsbpg\" (UID: \"0ae09042-a894-400b-987b-6fca25bbbcb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.023733 4854 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.023786 4854 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9ccb7160-6ff2-43f3-927b-5bf4aced4993-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.023806 4854 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.023825 4854 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.023843 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7zc7\" (UniqueName: \"kubernetes.io/projected/9ccb7160-6ff2-43f3-927b-5bf4aced4993-kube-api-access-b7zc7\") on node \"crc\" 
DevicePath \"\"" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.023862 4854 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-systemd-units\") on node \"crc\" DevicePath \"\"" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.023879 4854 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.023897 4854 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-host-cni-bin\") on node \"crc\" DevicePath \"\"" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.023917 4854 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-host-slash\") on node \"crc\" DevicePath \"\"" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.023935 4854 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9ccb7160-6ff2-43f3-927b-5bf4aced4993-host-kubelet\") on node \"crc\" DevicePath \"\"" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.023953 4854 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9ccb7160-6ff2-43f3-927b-5bf4aced4993-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.026581 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0ae09042-a894-400b-987b-6fca25bbbcb9-ovn-node-metrics-cert\") pod \"ovnkube-node-vsbpg\" (UID: \"0ae09042-a894-400b-987b-6fca25bbbcb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.026674 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0ae09042-a894-400b-987b-6fca25bbbcb9-ovnkube-config\") pod \"ovnkube-node-vsbpg\" (UID: \"0ae09042-a894-400b-987b-6fca25bbbcb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.035623 4854 scope.go:117] "RemoveContainer" containerID="89728864853aa241a579c1af1bf877a2d1c765c22b0a6e59ebdcd0c5287bf996" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.044999 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8j28\" (UniqueName: \"kubernetes.io/projected/0ae09042-a894-400b-987b-6fca25bbbcb9-kube-api-access-r8j28\") pod \"ovnkube-node-vsbpg\" (UID: \"0ae09042-a894-400b-987b-6fca25bbbcb9\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.074541 4854 scope.go:117] "RemoveContainer" containerID="e0e44ff6bb2d828e0c7ea0781e5df4b33d788969928f81f8c0cc612506a4ac35" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.102526 4854 scope.go:117] "RemoveContainer" containerID="0f2ca50c8f0029b31b8fbe483d0ce72dfc8969ae85381c75e1c379d1f868447f" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.116985 4854 scope.go:117] "RemoveContainer" containerID="21c4baccb3eb05829eb125222627fab79bca929659775b3eb7f11e24bc892906" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.133431 4854 
scope.go:117] "RemoveContainer" containerID="1dc2017bde039f1888538091f42cb032f1216ae9080e639ccff79fde98cc05ac" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.150990 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.152834 4854 scope.go:117] "RemoveContainer" containerID="572357f77919a83d6f419fffd64a2c085e2744388f232638860c944a7eb76b75" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.173556 4854 scope.go:117] "RemoveContainer" containerID="680db6f5dd33c83cab0a5cc73d0eecb3f977af950a4ca4f00e7aded23db88b7d" Oct 07 12:34:10 crc kubenswrapper[4854]: E1007 12:34:10.174517 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"680db6f5dd33c83cab0a5cc73d0eecb3f977af950a4ca4f00e7aded23db88b7d\": container with ID starting with 680db6f5dd33c83cab0a5cc73d0eecb3f977af950a4ca4f00e7aded23db88b7d not found: ID does not exist" containerID="680db6f5dd33c83cab0a5cc73d0eecb3f977af950a4ca4f00e7aded23db88b7d" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.174577 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"680db6f5dd33c83cab0a5cc73d0eecb3f977af950a4ca4f00e7aded23db88b7d"} err="failed to get container status \"680db6f5dd33c83cab0a5cc73d0eecb3f977af950a4ca4f00e7aded23db88b7d\": rpc error: code = NotFound desc = could not find container \"680db6f5dd33c83cab0a5cc73d0eecb3f977af950a4ca4f00e7aded23db88b7d\": container with ID starting with 680db6f5dd33c83cab0a5cc73d0eecb3f977af950a4ca4f00e7aded23db88b7d not found: ID does not exist" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.174621 4854 scope.go:117] "RemoveContainer" containerID="6dc8baaa306fa92a686458c8f93b90894976bf4702080f18018a4305b17c427d" Oct 07 12:34:10 crc kubenswrapper[4854]: E1007 12:34:10.175641 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6dc8baaa306fa92a686458c8f93b90894976bf4702080f18018a4305b17c427d\": container with ID starting with 6dc8baaa306fa92a686458c8f93b90894976bf4702080f18018a4305b17c427d not found: ID does not exist" containerID="6dc8baaa306fa92a686458c8f93b90894976bf4702080f18018a4305b17c427d" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.175682 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6dc8baaa306fa92a686458c8f93b90894976bf4702080f18018a4305b17c427d"} err="failed to get container status \"6dc8baaa306fa92a686458c8f93b90894976bf4702080f18018a4305b17c427d\": rpc error: code = NotFound desc = could not find container \"6dc8baaa306fa92a686458c8f93b90894976bf4702080f18018a4305b17c427d\": container with ID starting with 6dc8baaa306fa92a686458c8f93b90894976bf4702080f18018a4305b17c427d not found: ID does not exist" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.175709 4854 scope.go:117] "RemoveContainer" containerID="6c57b33be29ca7545e4fc52914dd5701cf15a096a947d6332438f046450bed12" Oct 07 12:34:10 crc kubenswrapper[4854]: E1007 12:34:10.176592 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c57b33be29ca7545e4fc52914dd5701cf15a096a947d6332438f046450bed12\": container with ID starting with 6c57b33be29ca7545e4fc52914dd5701cf15a096a947d6332438f046450bed12 not found: ID does not exist" 
containerID="6c57b33be29ca7545e4fc52914dd5701cf15a096a947d6332438f046450bed12" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.176666 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c57b33be29ca7545e4fc52914dd5701cf15a096a947d6332438f046450bed12"} err="failed to get container status \"6c57b33be29ca7545e4fc52914dd5701cf15a096a947d6332438f046450bed12\": rpc error: code = NotFound desc = could not find container \"6c57b33be29ca7545e4fc52914dd5701cf15a096a947d6332438f046450bed12\": container with ID starting with 6c57b33be29ca7545e4fc52914dd5701cf15a096a947d6332438f046450bed12 not found: ID does not exist" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.176727 4854 scope.go:117] "RemoveContainer" containerID="89728864853aa241a579c1af1bf877a2d1c765c22b0a6e59ebdcd0c5287bf996" Oct 07 12:34:10 crc kubenswrapper[4854]: E1007 12:34:10.177460 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89728864853aa241a579c1af1bf877a2d1c765c22b0a6e59ebdcd0c5287bf996\": container with ID starting with 89728864853aa241a579c1af1bf877a2d1c765c22b0a6e59ebdcd0c5287bf996 not found: ID does not exist" containerID="89728864853aa241a579c1af1bf877a2d1c765c22b0a6e59ebdcd0c5287bf996" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.177491 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89728864853aa241a579c1af1bf877a2d1c765c22b0a6e59ebdcd0c5287bf996"} err="failed to get container status \"89728864853aa241a579c1af1bf877a2d1c765c22b0a6e59ebdcd0c5287bf996\": rpc error: code = NotFound desc = could not find container \"89728864853aa241a579c1af1bf877a2d1c765c22b0a6e59ebdcd0c5287bf996\": container with ID starting with 89728864853aa241a579c1af1bf877a2d1c765c22b0a6e59ebdcd0c5287bf996 not found: ID does not exist" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.177539 4854 scope.go:117] "RemoveContainer" containerID="e0e44ff6bb2d828e0c7ea0781e5df4b33d788969928f81f8c0cc612506a4ac35" Oct 07 12:34:10 crc kubenswrapper[4854]: E1007 12:34:10.177864 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0e44ff6bb2d828e0c7ea0781e5df4b33d788969928f81f8c0cc612506a4ac35\": container with ID starting with e0e44ff6bb2d828e0c7ea0781e5df4b33d788969928f81f8c0cc612506a4ac35 not found: ID does not exist" containerID="e0e44ff6bb2d828e0c7ea0781e5df4b33d788969928f81f8c0cc612506a4ac35" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.177910 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0e44ff6bb2d828e0c7ea0781e5df4b33d788969928f81f8c0cc612506a4ac35"} err="failed to get container status \"e0e44ff6bb2d828e0c7ea0781e5df4b33d788969928f81f8c0cc612506a4ac35\": rpc error: code = NotFound desc = could not find container \"e0e44ff6bb2d828e0c7ea0781e5df4b33d788969928f81f8c0cc612506a4ac35\": container with ID starting with e0e44ff6bb2d828e0c7ea0781e5df4b33d788969928f81f8c0cc612506a4ac35 not found: ID does not exist" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.177960 4854 scope.go:117] "RemoveContainer" containerID="0f2ca50c8f0029b31b8fbe483d0ce72dfc8969ae85381c75e1c379d1f868447f" Oct 07 12:34:10 crc kubenswrapper[4854]: E1007 12:34:10.178299 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"0f2ca50c8f0029b31b8fbe483d0ce72dfc8969ae85381c75e1c379d1f868447f\": container with ID starting with 0f2ca50c8f0029b31b8fbe483d0ce72dfc8969ae85381c75e1c379d1f868447f not found: ID does not exist" containerID="0f2ca50c8f0029b31b8fbe483d0ce72dfc8969ae85381c75e1c379d1f868447f" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.178323 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f2ca50c8f0029b31b8fbe483d0ce72dfc8969ae85381c75e1c379d1f868447f"} err="failed to get container status \"0f2ca50c8f0029b31b8fbe483d0ce72dfc8969ae85381c75e1c379d1f868447f\": rpc error: code = NotFound desc = could not find container \"0f2ca50c8f0029b31b8fbe483d0ce72dfc8969ae85381c75e1c379d1f868447f\": container with ID starting with 0f2ca50c8f0029b31b8fbe483d0ce72dfc8969ae85381c75e1c379d1f868447f not found: ID does not exist" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.178337 4854 scope.go:117] "RemoveContainer" containerID="21c4baccb3eb05829eb125222627fab79bca929659775b3eb7f11e24bc892906" Oct 07 12:34:10 crc kubenswrapper[4854]: E1007 12:34:10.178638 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21c4baccb3eb05829eb125222627fab79bca929659775b3eb7f11e24bc892906\": container with ID starting with 21c4baccb3eb05829eb125222627fab79bca929659775b3eb7f11e24bc892906 not found: ID does not exist" containerID="21c4baccb3eb05829eb125222627fab79bca929659775b3eb7f11e24bc892906" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.178698 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21c4baccb3eb05829eb125222627fab79bca929659775b3eb7f11e24bc892906"} err="failed to get container status \"21c4baccb3eb05829eb125222627fab79bca929659775b3eb7f11e24bc892906\": rpc error: code = NotFound desc = could not find container \"21c4baccb3eb05829eb125222627fab79bca929659775b3eb7f11e24bc892906\": container with ID starting with 21c4baccb3eb05829eb125222627fab79bca929659775b3eb7f11e24bc892906 not found: ID does not exist" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.178717 4854 scope.go:117] "RemoveContainer" containerID="1dc2017bde039f1888538091f42cb032f1216ae9080e639ccff79fde98cc05ac" Oct 07 12:34:10 crc kubenswrapper[4854]: E1007 12:34:10.178994 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1dc2017bde039f1888538091f42cb032f1216ae9080e639ccff79fde98cc05ac\": container with ID starting with 1dc2017bde039f1888538091f42cb032f1216ae9080e639ccff79fde98cc05ac not found: ID does not exist" containerID="1dc2017bde039f1888538091f42cb032f1216ae9080e639ccff79fde98cc05ac" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.179059 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1dc2017bde039f1888538091f42cb032f1216ae9080e639ccff79fde98cc05ac"} err="failed to get container status \"1dc2017bde039f1888538091f42cb032f1216ae9080e639ccff79fde98cc05ac\": rpc error: code = NotFound desc = could not find container \"1dc2017bde039f1888538091f42cb032f1216ae9080e639ccff79fde98cc05ac\": container with ID starting with 1dc2017bde039f1888538091f42cb032f1216ae9080e639ccff79fde98cc05ac not found: ID does not exist" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.179078 4854 scope.go:117] "RemoveContainer" containerID="572357f77919a83d6f419fffd64a2c085e2744388f232638860c944a7eb76b75" Oct 07 12:34:10 crc 
kubenswrapper[4854]: E1007 12:34:10.179761 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"572357f77919a83d6f419fffd64a2c085e2744388f232638860c944a7eb76b75\": container with ID starting with 572357f77919a83d6f419fffd64a2c085e2744388f232638860c944a7eb76b75 not found: ID does not exist" containerID="572357f77919a83d6f419fffd64a2c085e2744388f232638860c944a7eb76b75" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.179802 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"572357f77919a83d6f419fffd64a2c085e2744388f232638860c944a7eb76b75"} err="failed to get container status \"572357f77919a83d6f419fffd64a2c085e2744388f232638860c944a7eb76b75\": rpc error: code = NotFound desc = could not find container \"572357f77919a83d6f419fffd64a2c085e2744388f232638860c944a7eb76b75\": container with ID starting with 572357f77919a83d6f419fffd64a2c085e2744388f232638860c944a7eb76b75 not found: ID does not exist" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.179848 4854 scope.go:117] "RemoveContainer" containerID="680db6f5dd33c83cab0a5cc73d0eecb3f977af950a4ca4f00e7aded23db88b7d" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.180242 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"680db6f5dd33c83cab0a5cc73d0eecb3f977af950a4ca4f00e7aded23db88b7d"} err="failed to get container status \"680db6f5dd33c83cab0a5cc73d0eecb3f977af950a4ca4f00e7aded23db88b7d\": rpc error: code = NotFound desc = could not find container \"680db6f5dd33c83cab0a5cc73d0eecb3f977af950a4ca4f00e7aded23db88b7d\": container with ID starting with 680db6f5dd33c83cab0a5cc73d0eecb3f977af950a4ca4f00e7aded23db88b7d not found: ID does not exist" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.180268 4854 scope.go:117] "RemoveContainer" containerID="6dc8baaa306fa92a686458c8f93b90894976bf4702080f18018a4305b17c427d" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.180840 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6dc8baaa306fa92a686458c8f93b90894976bf4702080f18018a4305b17c427d"} err="failed to get container status \"6dc8baaa306fa92a686458c8f93b90894976bf4702080f18018a4305b17c427d\": rpc error: code = NotFound desc = could not find container \"6dc8baaa306fa92a686458c8f93b90894976bf4702080f18018a4305b17c427d\": container with ID starting with 6dc8baaa306fa92a686458c8f93b90894976bf4702080f18018a4305b17c427d not found: ID does not exist" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.180864 4854 scope.go:117] "RemoveContainer" containerID="6c57b33be29ca7545e4fc52914dd5701cf15a096a947d6332438f046450bed12" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.181122 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c57b33be29ca7545e4fc52914dd5701cf15a096a947d6332438f046450bed12"} err="failed to get container status \"6c57b33be29ca7545e4fc52914dd5701cf15a096a947d6332438f046450bed12\": rpc error: code = NotFound desc = could not find container \"6c57b33be29ca7545e4fc52914dd5701cf15a096a947d6332438f046450bed12\": container with ID starting with 6c57b33be29ca7545e4fc52914dd5701cf15a096a947d6332438f046450bed12 not found: ID does not exist" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.181218 4854 scope.go:117] "RemoveContainer" containerID="89728864853aa241a579c1af1bf877a2d1c765c22b0a6e59ebdcd0c5287bf996" Oct 07 12:34:10 crc 
kubenswrapper[4854]: I1007 12:34:10.182530 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89728864853aa241a579c1af1bf877a2d1c765c22b0a6e59ebdcd0c5287bf996"} err="failed to get container status \"89728864853aa241a579c1af1bf877a2d1c765c22b0a6e59ebdcd0c5287bf996\": rpc error: code = NotFound desc = could not find container \"89728864853aa241a579c1af1bf877a2d1c765c22b0a6e59ebdcd0c5287bf996\": container with ID starting with 89728864853aa241a579c1af1bf877a2d1c765c22b0a6e59ebdcd0c5287bf996 not found: ID does not exist" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.182557 4854 scope.go:117] "RemoveContainer" containerID="e0e44ff6bb2d828e0c7ea0781e5df4b33d788969928f81f8c0cc612506a4ac35" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.182845 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0e44ff6bb2d828e0c7ea0781e5df4b33d788969928f81f8c0cc612506a4ac35"} err="failed to get container status \"e0e44ff6bb2d828e0c7ea0781e5df4b33d788969928f81f8c0cc612506a4ac35\": rpc error: code = NotFound desc = could not find container \"e0e44ff6bb2d828e0c7ea0781e5df4b33d788969928f81f8c0cc612506a4ac35\": container with ID starting with e0e44ff6bb2d828e0c7ea0781e5df4b33d788969928f81f8c0cc612506a4ac35 not found: ID does not exist" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.182875 4854 scope.go:117] "RemoveContainer" containerID="0f2ca50c8f0029b31b8fbe483d0ce72dfc8969ae85381c75e1c379d1f868447f" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.183084 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f2ca50c8f0029b31b8fbe483d0ce72dfc8969ae85381c75e1c379d1f868447f"} err="failed to get container status \"0f2ca50c8f0029b31b8fbe483d0ce72dfc8969ae85381c75e1c379d1f868447f\": rpc error: code = NotFound desc = could not find container \"0f2ca50c8f0029b31b8fbe483d0ce72dfc8969ae85381c75e1c379d1f868447f\": container with ID starting with 0f2ca50c8f0029b31b8fbe483d0ce72dfc8969ae85381c75e1c379d1f868447f not found: ID does not exist" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.183107 4854 scope.go:117] "RemoveContainer" containerID="21c4baccb3eb05829eb125222627fab79bca929659775b3eb7f11e24bc892906" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.183335 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21c4baccb3eb05829eb125222627fab79bca929659775b3eb7f11e24bc892906"} err="failed to get container status \"21c4baccb3eb05829eb125222627fab79bca929659775b3eb7f11e24bc892906\": rpc error: code = NotFound desc = could not find container \"21c4baccb3eb05829eb125222627fab79bca929659775b3eb7f11e24bc892906\": container with ID starting with 21c4baccb3eb05829eb125222627fab79bca929659775b3eb7f11e24bc892906 not found: ID does not exist" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.183360 4854 scope.go:117] "RemoveContainer" containerID="1dc2017bde039f1888538091f42cb032f1216ae9080e639ccff79fde98cc05ac" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.183587 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1dc2017bde039f1888538091f42cb032f1216ae9080e639ccff79fde98cc05ac"} err="failed to get container status \"1dc2017bde039f1888538091f42cb032f1216ae9080e639ccff79fde98cc05ac\": rpc error: code = NotFound desc = could not find container \"1dc2017bde039f1888538091f42cb032f1216ae9080e639ccff79fde98cc05ac\": container with ID 
starting with 1dc2017bde039f1888538091f42cb032f1216ae9080e639ccff79fde98cc05ac not found: ID does not exist" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.183614 4854 scope.go:117] "RemoveContainer" containerID="572357f77919a83d6f419fffd64a2c085e2744388f232638860c944a7eb76b75" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.183936 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"572357f77919a83d6f419fffd64a2c085e2744388f232638860c944a7eb76b75"} err="failed to get container status \"572357f77919a83d6f419fffd64a2c085e2744388f232638860c944a7eb76b75\": rpc error: code = NotFound desc = could not find container \"572357f77919a83d6f419fffd64a2c085e2744388f232638860c944a7eb76b75\": container with ID starting with 572357f77919a83d6f419fffd64a2c085e2744388f232638860c944a7eb76b75 not found: ID does not exist" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.183964 4854 scope.go:117] "RemoveContainer" containerID="680db6f5dd33c83cab0a5cc73d0eecb3f977af950a4ca4f00e7aded23db88b7d" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.184207 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"680db6f5dd33c83cab0a5cc73d0eecb3f977af950a4ca4f00e7aded23db88b7d"} err="failed to get container status \"680db6f5dd33c83cab0a5cc73d0eecb3f977af950a4ca4f00e7aded23db88b7d\": rpc error: code = NotFound desc = could not find container \"680db6f5dd33c83cab0a5cc73d0eecb3f977af950a4ca4f00e7aded23db88b7d\": container with ID starting with 680db6f5dd33c83cab0a5cc73d0eecb3f977af950a4ca4f00e7aded23db88b7d not found: ID does not exist" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.184228 4854 scope.go:117] "RemoveContainer" containerID="6dc8baaa306fa92a686458c8f93b90894976bf4702080f18018a4305b17c427d" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.184422 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6dc8baaa306fa92a686458c8f93b90894976bf4702080f18018a4305b17c427d"} err="failed to get container status \"6dc8baaa306fa92a686458c8f93b90894976bf4702080f18018a4305b17c427d\": rpc error: code = NotFound desc = could not find container \"6dc8baaa306fa92a686458c8f93b90894976bf4702080f18018a4305b17c427d\": container with ID starting with 6dc8baaa306fa92a686458c8f93b90894976bf4702080f18018a4305b17c427d not found: ID does not exist" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.184448 4854 scope.go:117] "RemoveContainer" containerID="6c57b33be29ca7545e4fc52914dd5701cf15a096a947d6332438f046450bed12" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.184615 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c57b33be29ca7545e4fc52914dd5701cf15a096a947d6332438f046450bed12"} err="failed to get container status \"6c57b33be29ca7545e4fc52914dd5701cf15a096a947d6332438f046450bed12\": rpc error: code = NotFound desc = could not find container \"6c57b33be29ca7545e4fc52914dd5701cf15a096a947d6332438f046450bed12\": container with ID starting with 6c57b33be29ca7545e4fc52914dd5701cf15a096a947d6332438f046450bed12 not found: ID does not exist" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.184640 4854 scope.go:117] "RemoveContainer" containerID="89728864853aa241a579c1af1bf877a2d1c765c22b0a6e59ebdcd0c5287bf996" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.184838 4854 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"89728864853aa241a579c1af1bf877a2d1c765c22b0a6e59ebdcd0c5287bf996"} err="failed to get container status \"89728864853aa241a579c1af1bf877a2d1c765c22b0a6e59ebdcd0c5287bf996\": rpc error: code = NotFound desc = could not find container \"89728864853aa241a579c1af1bf877a2d1c765c22b0a6e59ebdcd0c5287bf996\": container with ID starting with 89728864853aa241a579c1af1bf877a2d1c765c22b0a6e59ebdcd0c5287bf996 not found: ID does not exist" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.184864 4854 scope.go:117] "RemoveContainer" containerID="e0e44ff6bb2d828e0c7ea0781e5df4b33d788969928f81f8c0cc612506a4ac35" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.185060 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0e44ff6bb2d828e0c7ea0781e5df4b33d788969928f81f8c0cc612506a4ac35"} err="failed to get container status \"e0e44ff6bb2d828e0c7ea0781e5df4b33d788969928f81f8c0cc612506a4ac35\": rpc error: code = NotFound desc = could not find container \"e0e44ff6bb2d828e0c7ea0781e5df4b33d788969928f81f8c0cc612506a4ac35\": container with ID starting with e0e44ff6bb2d828e0c7ea0781e5df4b33d788969928f81f8c0cc612506a4ac35 not found: ID does not exist" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.185086 4854 scope.go:117] "RemoveContainer" containerID="0f2ca50c8f0029b31b8fbe483d0ce72dfc8969ae85381c75e1c379d1f868447f" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.185370 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f2ca50c8f0029b31b8fbe483d0ce72dfc8969ae85381c75e1c379d1f868447f"} err="failed to get container status \"0f2ca50c8f0029b31b8fbe483d0ce72dfc8969ae85381c75e1c379d1f868447f\": rpc error: code = NotFound desc = could not find container \"0f2ca50c8f0029b31b8fbe483d0ce72dfc8969ae85381c75e1c379d1f868447f\": container with ID starting with 0f2ca50c8f0029b31b8fbe483d0ce72dfc8969ae85381c75e1c379d1f868447f not found: ID does not exist" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.185399 4854 scope.go:117] "RemoveContainer" containerID="21c4baccb3eb05829eb125222627fab79bca929659775b3eb7f11e24bc892906" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.185594 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21c4baccb3eb05829eb125222627fab79bca929659775b3eb7f11e24bc892906"} err="failed to get container status \"21c4baccb3eb05829eb125222627fab79bca929659775b3eb7f11e24bc892906\": rpc error: code = NotFound desc = could not find container \"21c4baccb3eb05829eb125222627fab79bca929659775b3eb7f11e24bc892906\": container with ID starting with 21c4baccb3eb05829eb125222627fab79bca929659775b3eb7f11e24bc892906 not found: ID does not exist" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.185614 4854 scope.go:117] "RemoveContainer" containerID="1dc2017bde039f1888538091f42cb032f1216ae9080e639ccff79fde98cc05ac" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.185825 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1dc2017bde039f1888538091f42cb032f1216ae9080e639ccff79fde98cc05ac"} err="failed to get container status \"1dc2017bde039f1888538091f42cb032f1216ae9080e639ccff79fde98cc05ac\": rpc error: code = NotFound desc = could not find container \"1dc2017bde039f1888538091f42cb032f1216ae9080e639ccff79fde98cc05ac\": container with ID starting with 1dc2017bde039f1888538091f42cb032f1216ae9080e639ccff79fde98cc05ac not found: ID does not exist" Oct 
07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.185851 4854 scope.go:117] "RemoveContainer" containerID="572357f77919a83d6f419fffd64a2c085e2744388f232638860c944a7eb76b75" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.186048 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"572357f77919a83d6f419fffd64a2c085e2744388f232638860c944a7eb76b75"} err="failed to get container status \"572357f77919a83d6f419fffd64a2c085e2744388f232638860c944a7eb76b75\": rpc error: code = NotFound desc = could not find container \"572357f77919a83d6f419fffd64a2c085e2744388f232638860c944a7eb76b75\": container with ID starting with 572357f77919a83d6f419fffd64a2c085e2744388f232638860c944a7eb76b75 not found: ID does not exist" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.186070 4854 scope.go:117] "RemoveContainer" containerID="680db6f5dd33c83cab0a5cc73d0eecb3f977af950a4ca4f00e7aded23db88b7d" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.186282 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"680db6f5dd33c83cab0a5cc73d0eecb3f977af950a4ca4f00e7aded23db88b7d"} err="failed to get container status \"680db6f5dd33c83cab0a5cc73d0eecb3f977af950a4ca4f00e7aded23db88b7d\": rpc error: code = NotFound desc = could not find container \"680db6f5dd33c83cab0a5cc73d0eecb3f977af950a4ca4f00e7aded23db88b7d\": container with ID starting with 680db6f5dd33c83cab0a5cc73d0eecb3f977af950a4ca4f00e7aded23db88b7d not found: ID does not exist" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.186303 4854 scope.go:117] "RemoveContainer" containerID="6dc8baaa306fa92a686458c8f93b90894976bf4702080f18018a4305b17c427d" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.186511 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6dc8baaa306fa92a686458c8f93b90894976bf4702080f18018a4305b17c427d"} err="failed to get container status \"6dc8baaa306fa92a686458c8f93b90894976bf4702080f18018a4305b17c427d\": rpc error: code = NotFound desc = could not find container \"6dc8baaa306fa92a686458c8f93b90894976bf4702080f18018a4305b17c427d\": container with ID starting with 6dc8baaa306fa92a686458c8f93b90894976bf4702080f18018a4305b17c427d not found: ID does not exist" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.186537 4854 scope.go:117] "RemoveContainer" containerID="6c57b33be29ca7545e4fc52914dd5701cf15a096a947d6332438f046450bed12" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.186746 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c57b33be29ca7545e4fc52914dd5701cf15a096a947d6332438f046450bed12"} err="failed to get container status \"6c57b33be29ca7545e4fc52914dd5701cf15a096a947d6332438f046450bed12\": rpc error: code = NotFound desc = could not find container \"6c57b33be29ca7545e4fc52914dd5701cf15a096a947d6332438f046450bed12\": container with ID starting with 6c57b33be29ca7545e4fc52914dd5701cf15a096a947d6332438f046450bed12 not found: ID does not exist" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.186770 4854 scope.go:117] "RemoveContainer" containerID="89728864853aa241a579c1af1bf877a2d1c765c22b0a6e59ebdcd0c5287bf996" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.186970 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89728864853aa241a579c1af1bf877a2d1c765c22b0a6e59ebdcd0c5287bf996"} err="failed to get container status 
\"89728864853aa241a579c1af1bf877a2d1c765c22b0a6e59ebdcd0c5287bf996\": rpc error: code = NotFound desc = could not find container \"89728864853aa241a579c1af1bf877a2d1c765c22b0a6e59ebdcd0c5287bf996\": container with ID starting with 89728864853aa241a579c1af1bf877a2d1c765c22b0a6e59ebdcd0c5287bf996 not found: ID does not exist" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.187000 4854 scope.go:117] "RemoveContainer" containerID="e0e44ff6bb2d828e0c7ea0781e5df4b33d788969928f81f8c0cc612506a4ac35" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.187274 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0e44ff6bb2d828e0c7ea0781e5df4b33d788969928f81f8c0cc612506a4ac35"} err="failed to get container status \"e0e44ff6bb2d828e0c7ea0781e5df4b33d788969928f81f8c0cc612506a4ac35\": rpc error: code = NotFound desc = could not find container \"e0e44ff6bb2d828e0c7ea0781e5df4b33d788969928f81f8c0cc612506a4ac35\": container with ID starting with e0e44ff6bb2d828e0c7ea0781e5df4b33d788969928f81f8c0cc612506a4ac35 not found: ID does not exist" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.187304 4854 scope.go:117] "RemoveContainer" containerID="0f2ca50c8f0029b31b8fbe483d0ce72dfc8969ae85381c75e1c379d1f868447f" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.187491 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f2ca50c8f0029b31b8fbe483d0ce72dfc8969ae85381c75e1c379d1f868447f"} err="failed to get container status \"0f2ca50c8f0029b31b8fbe483d0ce72dfc8969ae85381c75e1c379d1f868447f\": rpc error: code = NotFound desc = could not find container \"0f2ca50c8f0029b31b8fbe483d0ce72dfc8969ae85381c75e1c379d1f868447f\": container with ID starting with 0f2ca50c8f0029b31b8fbe483d0ce72dfc8969ae85381c75e1c379d1f868447f not found: ID does not exist" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.187514 4854 scope.go:117] "RemoveContainer" containerID="21c4baccb3eb05829eb125222627fab79bca929659775b3eb7f11e24bc892906" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.187687 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21c4baccb3eb05829eb125222627fab79bca929659775b3eb7f11e24bc892906"} err="failed to get container status \"21c4baccb3eb05829eb125222627fab79bca929659775b3eb7f11e24bc892906\": rpc error: code = NotFound desc = could not find container \"21c4baccb3eb05829eb125222627fab79bca929659775b3eb7f11e24bc892906\": container with ID starting with 21c4baccb3eb05829eb125222627fab79bca929659775b3eb7f11e24bc892906 not found: ID does not exist" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.187705 4854 scope.go:117] "RemoveContainer" containerID="1dc2017bde039f1888538091f42cb032f1216ae9080e639ccff79fde98cc05ac" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.187888 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1dc2017bde039f1888538091f42cb032f1216ae9080e639ccff79fde98cc05ac"} err="failed to get container status \"1dc2017bde039f1888538091f42cb032f1216ae9080e639ccff79fde98cc05ac\": rpc error: code = NotFound desc = could not find container \"1dc2017bde039f1888538091f42cb032f1216ae9080e639ccff79fde98cc05ac\": container with ID starting with 1dc2017bde039f1888538091f42cb032f1216ae9080e639ccff79fde98cc05ac not found: ID does not exist" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.187907 4854 scope.go:117] "RemoveContainer" 
containerID="572357f77919a83d6f419fffd64a2c085e2744388f232638860c944a7eb76b75" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.188074 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"572357f77919a83d6f419fffd64a2c085e2744388f232638860c944a7eb76b75"} err="failed to get container status \"572357f77919a83d6f419fffd64a2c085e2744388f232638860c944a7eb76b75\": rpc error: code = NotFound desc = could not find container \"572357f77919a83d6f419fffd64a2c085e2744388f232638860c944a7eb76b75\": container with ID starting with 572357f77919a83d6f419fffd64a2c085e2744388f232638860c944a7eb76b75 not found: ID does not exist" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.188093 4854 scope.go:117] "RemoveContainer" containerID="680db6f5dd33c83cab0a5cc73d0eecb3f977af950a4ca4f00e7aded23db88b7d" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.188274 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"680db6f5dd33c83cab0a5cc73d0eecb3f977af950a4ca4f00e7aded23db88b7d"} err="failed to get container status \"680db6f5dd33c83cab0a5cc73d0eecb3f977af950a4ca4f00e7aded23db88b7d\": rpc error: code = NotFound desc = could not find container \"680db6f5dd33c83cab0a5cc73d0eecb3f977af950a4ca4f00e7aded23db88b7d\": container with ID starting with 680db6f5dd33c83cab0a5cc73d0eecb3f977af950a4ca4f00e7aded23db88b7d not found: ID does not exist" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.188292 4854 scope.go:117] "RemoveContainer" containerID="6dc8baaa306fa92a686458c8f93b90894976bf4702080f18018a4305b17c427d" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.188535 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6dc8baaa306fa92a686458c8f93b90894976bf4702080f18018a4305b17c427d"} err="failed to get container status \"6dc8baaa306fa92a686458c8f93b90894976bf4702080f18018a4305b17c427d\": rpc error: code = NotFound desc = could not find container \"6dc8baaa306fa92a686458c8f93b90894976bf4702080f18018a4305b17c427d\": container with ID starting with 6dc8baaa306fa92a686458c8f93b90894976bf4702080f18018a4305b17c427d not found: ID does not exist" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.188562 4854 scope.go:117] "RemoveContainer" containerID="6c57b33be29ca7545e4fc52914dd5701cf15a096a947d6332438f046450bed12" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.188748 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c57b33be29ca7545e4fc52914dd5701cf15a096a947d6332438f046450bed12"} err="failed to get container status \"6c57b33be29ca7545e4fc52914dd5701cf15a096a947d6332438f046450bed12\": rpc error: code = NotFound desc = could not find container \"6c57b33be29ca7545e4fc52914dd5701cf15a096a947d6332438f046450bed12\": container with ID starting with 6c57b33be29ca7545e4fc52914dd5701cf15a096a947d6332438f046450bed12 not found: ID does not exist" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.188767 4854 scope.go:117] "RemoveContainer" containerID="89728864853aa241a579c1af1bf877a2d1c765c22b0a6e59ebdcd0c5287bf996" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.188964 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89728864853aa241a579c1af1bf877a2d1c765c22b0a6e59ebdcd0c5287bf996"} err="failed to get container status \"89728864853aa241a579c1af1bf877a2d1c765c22b0a6e59ebdcd0c5287bf996\": rpc error: code = NotFound desc = could not find 
container \"89728864853aa241a579c1af1bf877a2d1c765c22b0a6e59ebdcd0c5287bf996\": container with ID starting with 89728864853aa241a579c1af1bf877a2d1c765c22b0a6e59ebdcd0c5287bf996 not found: ID does not exist" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.188988 4854 scope.go:117] "RemoveContainer" containerID="e0e44ff6bb2d828e0c7ea0781e5df4b33d788969928f81f8c0cc612506a4ac35" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.189358 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0e44ff6bb2d828e0c7ea0781e5df4b33d788969928f81f8c0cc612506a4ac35"} err="failed to get container status \"e0e44ff6bb2d828e0c7ea0781e5df4b33d788969928f81f8c0cc612506a4ac35\": rpc error: code = NotFound desc = could not find container \"e0e44ff6bb2d828e0c7ea0781e5df4b33d788969928f81f8c0cc612506a4ac35\": container with ID starting with e0e44ff6bb2d828e0c7ea0781e5df4b33d788969928f81f8c0cc612506a4ac35 not found: ID does not exist" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.189380 4854 scope.go:117] "RemoveContainer" containerID="0f2ca50c8f0029b31b8fbe483d0ce72dfc8969ae85381c75e1c379d1f868447f" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.189585 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f2ca50c8f0029b31b8fbe483d0ce72dfc8969ae85381c75e1c379d1f868447f"} err="failed to get container status \"0f2ca50c8f0029b31b8fbe483d0ce72dfc8969ae85381c75e1c379d1f868447f\": rpc error: code = NotFound desc = could not find container \"0f2ca50c8f0029b31b8fbe483d0ce72dfc8969ae85381c75e1c379d1f868447f\": container with ID starting with 0f2ca50c8f0029b31b8fbe483d0ce72dfc8969ae85381c75e1c379d1f868447f not found: ID does not exist" Oct 07 12:34:10 crc kubenswrapper[4854]: W1007 12:34:10.193017 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ae09042_a894_400b_987b_6fca25bbbcb9.slice/crio-977dd7d00bc9648314537963c7677fc21af9d3e68eae2ed0d3a8dbcf870eaf87 WatchSource:0}: Error finding container 977dd7d00bc9648314537963c7677fc21af9d3e68eae2ed0d3a8dbcf870eaf87: Status 404 returned error can't find the container with id 977dd7d00bc9648314537963c7677fc21af9d3e68eae2ed0d3a8dbcf870eaf87 Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.279891 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-m4vxt"] Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.285655 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-m4vxt"] Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.717705 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ccb7160-6ff2-43f3-927b-5bf4aced4993" path="/var/lib/kubelet/pods/9ccb7160-6ff2-43f3-927b-5bf4aced4993/volumes" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.807446 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.807515 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.946127 4854 generic.go:334] "Generic (PLEG): container finished" podID="0ae09042-a894-400b-987b-6fca25bbbcb9" containerID="c84ff2515b1a9d8e2b1d2e96d0c668c9400e3be94fc66154dd62aa46ed7f4195" exitCode=0 Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.946212 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" event={"ID":"0ae09042-a894-400b-987b-6fca25bbbcb9","Type":"ContainerDied","Data":"c84ff2515b1a9d8e2b1d2e96d0c668c9400e3be94fc66154dd62aa46ed7f4195"} Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.946322 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" event={"ID":"0ae09042-a894-400b-987b-6fca25bbbcb9","Type":"ContainerStarted","Data":"977dd7d00bc9648314537963c7677fc21af9d3e68eae2ed0d3a8dbcf870eaf87"} Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.951954 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nkr42_260ab665-6a8a-44ee-9a16-5ff284b35eba/kube-multus/0.log" Oct 07 12:34:10 crc kubenswrapper[4854]: I1007 12:34:10.952093 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nkr42" event={"ID":"260ab665-6a8a-44ee-9a16-5ff284b35eba","Type":"ContainerStarted","Data":"6415f8a979e93f9c9d5408e25c1b679f922f80c808aa3f3c598146cd305a526e"} Oct 07 12:34:11 crc kubenswrapper[4854]: I1007 12:34:11.964174 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" event={"ID":"0ae09042-a894-400b-987b-6fca25bbbcb9","Type":"ContainerStarted","Data":"80cf828e4342d3844b0ea6835b05e0340c2092d72ea1ad0437ea2e70c34a4741"} Oct 07 12:34:11 crc kubenswrapper[4854]: I1007 12:34:11.964983 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" event={"ID":"0ae09042-a894-400b-987b-6fca25bbbcb9","Type":"ContainerStarted","Data":"5cf3b0336fec240a9f2de44ec644723ef3abefbe8d04645115b648d7497a45e8"} Oct 07 12:34:11 crc kubenswrapper[4854]: I1007 12:34:11.964997 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" event={"ID":"0ae09042-a894-400b-987b-6fca25bbbcb9","Type":"ContainerStarted","Data":"acdabbf9e557516b07d99996a1e9e7a885a18274608004d1b98bbd49d4181e1e"} Oct 07 12:34:11 crc kubenswrapper[4854]: I1007 12:34:11.965009 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" event={"ID":"0ae09042-a894-400b-987b-6fca25bbbcb9","Type":"ContainerStarted","Data":"12b710e34a586b746d0ff4ba7f5fd77f695e06db5e10ce9f84a8842a153a2434"} Oct 07 12:34:11 crc kubenswrapper[4854]: I1007 12:34:11.965022 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" event={"ID":"0ae09042-a894-400b-987b-6fca25bbbcb9","Type":"ContainerStarted","Data":"0bfa3fff694b3a1502896bb327bbda38353551ace861f5946ce821d8e6c3aa61"} Oct 07 12:34:11 crc kubenswrapper[4854]: I1007 12:34:11.965034 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" event={"ID":"0ae09042-a894-400b-987b-6fca25bbbcb9","Type":"ContainerStarted","Data":"fba2f26c2260d2f8e82a60bfefe0592d2084a5ed527a92c17e97af57973fb2ed"} Oct 07 12:34:14 crc kubenswrapper[4854]: I1007 12:34:14.990087 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" event={"ID":"0ae09042-a894-400b-987b-6fca25bbbcb9","Type":"ContainerStarted","Data":"62641d401dcd316b930cef5e1bbc6f857e7732e6f6c9ce86856699b2d3471aa3"} Oct 07 12:34:16 crc kubenswrapper[4854]: I1007 12:34:16.563041 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-bxz4f"] Oct 07 12:34:16 crc kubenswrapper[4854]: I1007 12:34:16.565428 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-bxz4f" Oct 07 12:34:16 crc kubenswrapper[4854]: I1007 12:34:16.568703 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Oct 07 12:34:16 crc kubenswrapper[4854]: I1007 12:34:16.568773 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Oct 07 12:34:16 crc kubenswrapper[4854]: I1007 12:34:16.568834 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Oct 07 12:34:16 crc kubenswrapper[4854]: I1007 12:34:16.569383 4854 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-glcct" Oct 07 12:34:16 crc kubenswrapper[4854]: I1007 12:34:16.627073 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgk2r\" (UniqueName: \"kubernetes.io/projected/d63f1f8a-2d04-4ef7-9c24-25bb39809250-kube-api-access-qgk2r\") pod \"crc-storage-crc-bxz4f\" (UID: \"d63f1f8a-2d04-4ef7-9c24-25bb39809250\") " pod="crc-storage/crc-storage-crc-bxz4f" Oct 07 12:34:16 crc kubenswrapper[4854]: I1007 12:34:16.627164 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/d63f1f8a-2d04-4ef7-9c24-25bb39809250-crc-storage\") pod \"crc-storage-crc-bxz4f\" (UID: \"d63f1f8a-2d04-4ef7-9c24-25bb39809250\") " pod="crc-storage/crc-storage-crc-bxz4f" Oct 07 12:34:16 crc kubenswrapper[4854]: I1007 12:34:16.627356 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d63f1f8a-2d04-4ef7-9c24-25bb39809250-node-mnt\") pod \"crc-storage-crc-bxz4f\" (UID: \"d63f1f8a-2d04-4ef7-9c24-25bb39809250\") " pod="crc-storage/crc-storage-crc-bxz4f" Oct 07 12:34:16 crc kubenswrapper[4854]: I1007 12:34:16.728952 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgk2r\" (UniqueName: \"kubernetes.io/projected/d63f1f8a-2d04-4ef7-9c24-25bb39809250-kube-api-access-qgk2r\") pod \"crc-storage-crc-bxz4f\" (UID: \"d63f1f8a-2d04-4ef7-9c24-25bb39809250\") " pod="crc-storage/crc-storage-crc-bxz4f" Oct 07 12:34:16 crc kubenswrapper[4854]: I1007 12:34:16.729338 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/d63f1f8a-2d04-4ef7-9c24-25bb39809250-crc-storage\") pod \"crc-storage-crc-bxz4f\" (UID: \"d63f1f8a-2d04-4ef7-9c24-25bb39809250\") " pod="crc-storage/crc-storage-crc-bxz4f" Oct 07 12:34:16 crc kubenswrapper[4854]: I1007 12:34:16.729537 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d63f1f8a-2d04-4ef7-9c24-25bb39809250-node-mnt\") pod \"crc-storage-crc-bxz4f\" (UID: \"d63f1f8a-2d04-4ef7-9c24-25bb39809250\") " pod="crc-storage/crc-storage-crc-bxz4f" Oct 07 12:34:16 crc 
kubenswrapper[4854]: I1007 12:34:16.730129 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d63f1f8a-2d04-4ef7-9c24-25bb39809250-node-mnt\") pod \"crc-storage-crc-bxz4f\" (UID: \"d63f1f8a-2d04-4ef7-9c24-25bb39809250\") " pod="crc-storage/crc-storage-crc-bxz4f" Oct 07 12:34:16 crc kubenswrapper[4854]: I1007 12:34:16.730579 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/d63f1f8a-2d04-4ef7-9c24-25bb39809250-crc-storage\") pod \"crc-storage-crc-bxz4f\" (UID: \"d63f1f8a-2d04-4ef7-9c24-25bb39809250\") " pod="crc-storage/crc-storage-crc-bxz4f" Oct 07 12:34:16 crc kubenswrapper[4854]: I1007 12:34:16.762629 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgk2r\" (UniqueName: \"kubernetes.io/projected/d63f1f8a-2d04-4ef7-9c24-25bb39809250-kube-api-access-qgk2r\") pod \"crc-storage-crc-bxz4f\" (UID: \"d63f1f8a-2d04-4ef7-9c24-25bb39809250\") " pod="crc-storage/crc-storage-crc-bxz4f" Oct 07 12:34:16 crc kubenswrapper[4854]: I1007 12:34:16.891745 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-bxz4f" Oct 07 12:34:16 crc kubenswrapper[4854]: E1007 12:34:16.926942 4854 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-bxz4f_crc-storage_d63f1f8a-2d04-4ef7-9c24-25bb39809250_0(298da478b7321cd92b383b647336d6ae733d12ae68b70e9a281e14da908627d9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 07 12:34:16 crc kubenswrapper[4854]: E1007 12:34:16.927073 4854 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-bxz4f_crc-storage_d63f1f8a-2d04-4ef7-9c24-25bb39809250_0(298da478b7321cd92b383b647336d6ae733d12ae68b70e9a281e14da908627d9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-bxz4f" Oct 07 12:34:16 crc kubenswrapper[4854]: E1007 12:34:16.927115 4854 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-bxz4f_crc-storage_d63f1f8a-2d04-4ef7-9c24-25bb39809250_0(298da478b7321cd92b383b647336d6ae733d12ae68b70e9a281e14da908627d9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-bxz4f" Oct 07 12:34:16 crc kubenswrapper[4854]: E1007 12:34:16.927233 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-bxz4f_crc-storage(d63f1f8a-2d04-4ef7-9c24-25bb39809250)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-bxz4f_crc-storage(d63f1f8a-2d04-4ef7-9c24-25bb39809250)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-bxz4f_crc-storage_d63f1f8a-2d04-4ef7-9c24-25bb39809250_0(298da478b7321cd92b383b647336d6ae733d12ae68b70e9a281e14da908627d9): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="crc-storage/crc-storage-crc-bxz4f" podUID="d63f1f8a-2d04-4ef7-9c24-25bb39809250" Oct 07 12:34:17 crc kubenswrapper[4854]: I1007 12:34:17.014437 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" event={"ID":"0ae09042-a894-400b-987b-6fca25bbbcb9","Type":"ContainerStarted","Data":"bce926ea0065f5a0da7607a1d9eb2eeefdb74e12aa9e1a524d138ecb82f18785"} Oct 07 12:34:18 crc kubenswrapper[4854]: I1007 12:34:18.019499 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" Oct 07 12:34:18 crc kubenswrapper[4854]: I1007 12:34:18.019558 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" Oct 07 12:34:18 crc kubenswrapper[4854]: I1007 12:34:18.061342 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" Oct 07 12:34:18 crc kubenswrapper[4854]: I1007 12:34:18.069883 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" podStartSLOduration=9.069857999 podStartE2EDuration="9.069857999s" podCreationTimestamp="2025-10-07 12:34:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:34:18.062655691 +0000 UTC m=+574.050487946" watchObservedRunningTime="2025-10-07 12:34:18.069857999 +0000 UTC m=+574.057690254" Oct 07 12:34:18 crc kubenswrapper[4854]: I1007 12:34:18.431984 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-bxz4f"] Oct 07 12:34:18 crc kubenswrapper[4854]: I1007 12:34:18.432161 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-bxz4f" Oct 07 12:34:18 crc kubenswrapper[4854]: I1007 12:34:18.432837 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-bxz4f" Oct 07 12:34:18 crc kubenswrapper[4854]: E1007 12:34:18.464150 4854 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-bxz4f_crc-storage_d63f1f8a-2d04-4ef7-9c24-25bb39809250_0(53b058425b669c8df71b1edce12f0abf2d8afefe7925bbae4f5b67b758aa2830): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 07 12:34:18 crc kubenswrapper[4854]: E1007 12:34:18.466334 4854 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-bxz4f_crc-storage_d63f1f8a-2d04-4ef7-9c24-25bb39809250_0(53b058425b669c8df71b1edce12f0abf2d8afefe7925bbae4f5b67b758aa2830): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-bxz4f" Oct 07 12:34:18 crc kubenswrapper[4854]: E1007 12:34:18.466382 4854 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-bxz4f_crc-storage_d63f1f8a-2d04-4ef7-9c24-25bb39809250_0(53b058425b669c8df71b1edce12f0abf2d8afefe7925bbae4f5b67b758aa2830): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-bxz4f" Oct 07 12:34:18 crc kubenswrapper[4854]: E1007 12:34:18.466489 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-bxz4f_crc-storage(d63f1f8a-2d04-4ef7-9c24-25bb39809250)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-bxz4f_crc-storage(d63f1f8a-2d04-4ef7-9c24-25bb39809250)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-bxz4f_crc-storage_d63f1f8a-2d04-4ef7-9c24-25bb39809250_0(53b058425b669c8df71b1edce12f0abf2d8afefe7925bbae4f5b67b758aa2830): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-bxz4f" podUID="d63f1f8a-2d04-4ef7-9c24-25bb39809250" Oct 07 12:34:19 crc kubenswrapper[4854]: I1007 12:34:19.026316 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" Oct 07 12:34:19 crc kubenswrapper[4854]: I1007 12:34:19.061683 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" Oct 07 12:34:21 crc kubenswrapper[4854]: I1007 12:34:21.078925 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vsbpg" Oct 07 12:34:33 crc kubenswrapper[4854]: I1007 12:34:33.702336 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-bxz4f" Oct 07 12:34:33 crc kubenswrapper[4854]: I1007 12:34:33.703792 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-bxz4f" Oct 07 12:34:33 crc kubenswrapper[4854]: I1007 12:34:33.952002 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-bxz4f"] Oct 07 12:34:33 crc kubenswrapper[4854]: I1007 12:34:33.959781 4854 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 12:34:34 crc kubenswrapper[4854]: I1007 12:34:34.130250 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-bxz4f" event={"ID":"d63f1f8a-2d04-4ef7-9c24-25bb39809250","Type":"ContainerStarted","Data":"f7e7882b13a115ebd213a99eb6fb0c7cab0e9bc0ce86e86d221122bba3d35289"} Oct 07 12:34:40 crc kubenswrapper[4854]: I1007 12:34:40.807996 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 12:34:40 crc kubenswrapper[4854]: I1007 12:34:40.808772 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 12:34:41 crc kubenswrapper[4854]: I1007 12:34:41.183633 4854 generic.go:334] "Generic (PLEG): container finished" podID="d63f1f8a-2d04-4ef7-9c24-25bb39809250" containerID="2f297ee6c743aa9873760accb9f8555a341df361ace6131af3a2092c7dc90149" exitCode=0 Oct 07 12:34:41 crc kubenswrapper[4854]: I1007 12:34:41.183690 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-bxz4f" 
event={"ID":"d63f1f8a-2d04-4ef7-9c24-25bb39809250","Type":"ContainerDied","Data":"2f297ee6c743aa9873760accb9f8555a341df361ace6131af3a2092c7dc90149"} Oct 07 12:34:42 crc kubenswrapper[4854]: I1007 12:34:42.430267 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-bxz4f" Oct 07 12:34:42 crc kubenswrapper[4854]: I1007 12:34:42.577003 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d63f1f8a-2d04-4ef7-9c24-25bb39809250-node-mnt\") pod \"d63f1f8a-2d04-4ef7-9c24-25bb39809250\" (UID: \"d63f1f8a-2d04-4ef7-9c24-25bb39809250\") " Oct 07 12:34:42 crc kubenswrapper[4854]: I1007 12:34:42.577198 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d63f1f8a-2d04-4ef7-9c24-25bb39809250-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "d63f1f8a-2d04-4ef7-9c24-25bb39809250" (UID: "d63f1f8a-2d04-4ef7-9c24-25bb39809250"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 12:34:42 crc kubenswrapper[4854]: I1007 12:34:42.577398 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgk2r\" (UniqueName: \"kubernetes.io/projected/d63f1f8a-2d04-4ef7-9c24-25bb39809250-kube-api-access-qgk2r\") pod \"d63f1f8a-2d04-4ef7-9c24-25bb39809250\" (UID: \"d63f1f8a-2d04-4ef7-9c24-25bb39809250\") " Oct 07 12:34:42 crc kubenswrapper[4854]: I1007 12:34:42.577485 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/d63f1f8a-2d04-4ef7-9c24-25bb39809250-crc-storage\") pod \"d63f1f8a-2d04-4ef7-9c24-25bb39809250\" (UID: \"d63f1f8a-2d04-4ef7-9c24-25bb39809250\") " Oct 07 12:34:42 crc kubenswrapper[4854]: I1007 12:34:42.577904 4854 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d63f1f8a-2d04-4ef7-9c24-25bb39809250-node-mnt\") on node \"crc\" DevicePath \"\"" Oct 07 12:34:42 crc kubenswrapper[4854]: I1007 12:34:42.584352 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d63f1f8a-2d04-4ef7-9c24-25bb39809250-kube-api-access-qgk2r" (OuterVolumeSpecName: "kube-api-access-qgk2r") pod "d63f1f8a-2d04-4ef7-9c24-25bb39809250" (UID: "d63f1f8a-2d04-4ef7-9c24-25bb39809250"). InnerVolumeSpecName "kube-api-access-qgk2r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:34:42 crc kubenswrapper[4854]: I1007 12:34:42.602467 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d63f1f8a-2d04-4ef7-9c24-25bb39809250-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "d63f1f8a-2d04-4ef7-9c24-25bb39809250" (UID: "d63f1f8a-2d04-4ef7-9c24-25bb39809250"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:34:42 crc kubenswrapper[4854]: I1007 12:34:42.679308 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgk2r\" (UniqueName: \"kubernetes.io/projected/d63f1f8a-2d04-4ef7-9c24-25bb39809250-kube-api-access-qgk2r\") on node \"crc\" DevicePath \"\"" Oct 07 12:34:42 crc kubenswrapper[4854]: I1007 12:34:42.679374 4854 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/d63f1f8a-2d04-4ef7-9c24-25bb39809250-crc-storage\") on node \"crc\" DevicePath \"\"" Oct 07 12:34:43 crc kubenswrapper[4854]: I1007 12:34:43.201194 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-bxz4f" event={"ID":"d63f1f8a-2d04-4ef7-9c24-25bb39809250","Type":"ContainerDied","Data":"f7e7882b13a115ebd213a99eb6fb0c7cab0e9bc0ce86e86d221122bba3d35289"} Oct 07 12:34:43 crc kubenswrapper[4854]: I1007 12:34:43.201260 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7e7882b13a115ebd213a99eb6fb0c7cab0e9bc0ce86e86d221122bba3d35289" Oct 07 12:34:43 crc kubenswrapper[4854]: I1007 12:34:43.201337 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-bxz4f" Oct 07 12:34:50 crc kubenswrapper[4854]: I1007 12:34:50.563642 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nz6f"] Oct 07 12:34:50 crc kubenswrapper[4854]: E1007 12:34:50.564657 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d63f1f8a-2d04-4ef7-9c24-25bb39809250" containerName="storage" Oct 07 12:34:50 crc kubenswrapper[4854]: I1007 12:34:50.564678 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="d63f1f8a-2d04-4ef7-9c24-25bb39809250" containerName="storage" Oct 07 12:34:50 crc kubenswrapper[4854]: I1007 12:34:50.564810 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="d63f1f8a-2d04-4ef7-9c24-25bb39809250" containerName="storage" Oct 07 12:34:50 crc kubenswrapper[4854]: I1007 12:34:50.565852 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nz6f" Oct 07 12:34:50 crc kubenswrapper[4854]: I1007 12:34:50.569812 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 07 12:34:50 crc kubenswrapper[4854]: I1007 12:34:50.581290 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nz6f"] Oct 07 12:34:50 crc kubenswrapper[4854]: I1007 12:34:50.702924 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gh5dt\" (UniqueName: \"kubernetes.io/projected/6153b740-17ba-4004-8da8-002b8717dcc2-kube-api-access-gh5dt\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nz6f\" (UID: \"6153b740-17ba-4004-8da8-002b8717dcc2\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nz6f" Oct 07 12:34:50 crc kubenswrapper[4854]: I1007 12:34:50.703194 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6153b740-17ba-4004-8da8-002b8717dcc2-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nz6f\" (UID: \"6153b740-17ba-4004-8da8-002b8717dcc2\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nz6f" Oct 07 12:34:50 crc kubenswrapper[4854]: I1007 12:34:50.703290 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6153b740-17ba-4004-8da8-002b8717dcc2-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nz6f\" (UID: \"6153b740-17ba-4004-8da8-002b8717dcc2\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nz6f" Oct 07 12:34:50 crc kubenswrapper[4854]: I1007 12:34:50.804266 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6153b740-17ba-4004-8da8-002b8717dcc2-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nz6f\" (UID: \"6153b740-17ba-4004-8da8-002b8717dcc2\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nz6f" Oct 07 12:34:50 crc kubenswrapper[4854]: I1007 12:34:50.804724 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gh5dt\" (UniqueName: \"kubernetes.io/projected/6153b740-17ba-4004-8da8-002b8717dcc2-kube-api-access-gh5dt\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nz6f\" (UID: \"6153b740-17ba-4004-8da8-002b8717dcc2\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nz6f" Oct 07 12:34:50 crc kubenswrapper[4854]: I1007 12:34:50.804802 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6153b740-17ba-4004-8da8-002b8717dcc2-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nz6f\" (UID: \"6153b740-17ba-4004-8da8-002b8717dcc2\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nz6f" Oct 07 12:34:50 crc kubenswrapper[4854]: I1007 12:34:50.805128 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/6153b740-17ba-4004-8da8-002b8717dcc2-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nz6f\" (UID: \"6153b740-17ba-4004-8da8-002b8717dcc2\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nz6f" Oct 07 12:34:50 crc kubenswrapper[4854]: I1007 12:34:50.805939 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6153b740-17ba-4004-8da8-002b8717dcc2-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nz6f\" (UID: \"6153b740-17ba-4004-8da8-002b8717dcc2\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nz6f" Oct 07 12:34:50 crc kubenswrapper[4854]: I1007 12:34:50.832232 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh5dt\" (UniqueName: \"kubernetes.io/projected/6153b740-17ba-4004-8da8-002b8717dcc2-kube-api-access-gh5dt\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nz6f\" (UID: \"6153b740-17ba-4004-8da8-002b8717dcc2\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nz6f" Oct 07 12:34:50 crc kubenswrapper[4854]: I1007 12:34:50.890430 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nz6f" Oct 07 12:34:51 crc kubenswrapper[4854]: I1007 12:34:51.090903 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nz6f"] Oct 07 12:34:51 crc kubenswrapper[4854]: I1007 12:34:51.258074 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nz6f" event={"ID":"6153b740-17ba-4004-8da8-002b8717dcc2","Type":"ContainerStarted","Data":"c792ece5e5faddf278913fe6e5559932bca2fe98d959b50f3ae73276c4f9dd60"} Oct 07 12:34:52 crc kubenswrapper[4854]: I1007 12:34:52.266649 4854 generic.go:334] "Generic (PLEG): container finished" podID="6153b740-17ba-4004-8da8-002b8717dcc2" containerID="d325341fe4788dc2387e8210df7b9ecc727500b2fa404b0c821d98dcab3c022f" exitCode=0 Oct 07 12:34:52 crc kubenswrapper[4854]: I1007 12:34:52.266767 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nz6f" event={"ID":"6153b740-17ba-4004-8da8-002b8717dcc2","Type":"ContainerDied","Data":"d325341fe4788dc2387e8210df7b9ecc727500b2fa404b0c821d98dcab3c022f"} Oct 07 12:34:54 crc kubenswrapper[4854]: I1007 12:34:54.284069 4854 generic.go:334] "Generic (PLEG): container finished" podID="6153b740-17ba-4004-8da8-002b8717dcc2" containerID="6ca6aa4ab5bc091bd0f4306ef9ec6dfc21215181a2c08ab38f329036dc98fcc3" exitCode=0 Oct 07 12:34:54 crc kubenswrapper[4854]: I1007 12:34:54.284836 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nz6f" event={"ID":"6153b740-17ba-4004-8da8-002b8717dcc2","Type":"ContainerDied","Data":"6ca6aa4ab5bc091bd0f4306ef9ec6dfc21215181a2c08ab38f329036dc98fcc3"} Oct 07 12:34:55 crc kubenswrapper[4854]: I1007 12:34:55.296524 4854 generic.go:334] "Generic (PLEG): container finished" podID="6153b740-17ba-4004-8da8-002b8717dcc2" containerID="f11152bf210b0ef500fd573bb1c01ec6e51d28330f61dd897e90de5bd84efbd9" exitCode=0 Oct 07 12:34:55 crc kubenswrapper[4854]: I1007 
12:34:55.298678 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nz6f" event={"ID":"6153b740-17ba-4004-8da8-002b8717dcc2","Type":"ContainerDied","Data":"f11152bf210b0ef500fd573bb1c01ec6e51d28330f61dd897e90de5bd84efbd9"} Oct 07 12:34:56 crc kubenswrapper[4854]: I1007 12:34:56.576990 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nz6f" Oct 07 12:34:56 crc kubenswrapper[4854]: I1007 12:34:56.594020 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6153b740-17ba-4004-8da8-002b8717dcc2-bundle\") pod \"6153b740-17ba-4004-8da8-002b8717dcc2\" (UID: \"6153b740-17ba-4004-8da8-002b8717dcc2\") " Oct 07 12:34:56 crc kubenswrapper[4854]: I1007 12:34:56.594163 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6153b740-17ba-4004-8da8-002b8717dcc2-util\") pod \"6153b740-17ba-4004-8da8-002b8717dcc2\" (UID: \"6153b740-17ba-4004-8da8-002b8717dcc2\") " Oct 07 12:34:56 crc kubenswrapper[4854]: I1007 12:34:56.594203 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gh5dt\" (UniqueName: \"kubernetes.io/projected/6153b740-17ba-4004-8da8-002b8717dcc2-kube-api-access-gh5dt\") pod \"6153b740-17ba-4004-8da8-002b8717dcc2\" (UID: \"6153b740-17ba-4004-8da8-002b8717dcc2\") " Oct 07 12:34:56 crc kubenswrapper[4854]: I1007 12:34:56.594811 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6153b740-17ba-4004-8da8-002b8717dcc2-bundle" (OuterVolumeSpecName: "bundle") pod "6153b740-17ba-4004-8da8-002b8717dcc2" (UID: "6153b740-17ba-4004-8da8-002b8717dcc2"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:34:56 crc kubenswrapper[4854]: I1007 12:34:56.611581 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6153b740-17ba-4004-8da8-002b8717dcc2-util" (OuterVolumeSpecName: "util") pod "6153b740-17ba-4004-8da8-002b8717dcc2" (UID: "6153b740-17ba-4004-8da8-002b8717dcc2"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:34:56 crc kubenswrapper[4854]: I1007 12:34:56.625607 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6153b740-17ba-4004-8da8-002b8717dcc2-kube-api-access-gh5dt" (OuterVolumeSpecName: "kube-api-access-gh5dt") pod "6153b740-17ba-4004-8da8-002b8717dcc2" (UID: "6153b740-17ba-4004-8da8-002b8717dcc2"). InnerVolumeSpecName "kube-api-access-gh5dt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:34:56 crc kubenswrapper[4854]: I1007 12:34:56.696399 4854 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6153b740-17ba-4004-8da8-002b8717dcc2-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:34:56 crc kubenswrapper[4854]: I1007 12:34:56.696439 4854 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6153b740-17ba-4004-8da8-002b8717dcc2-util\") on node \"crc\" DevicePath \"\"" Oct 07 12:34:56 crc kubenswrapper[4854]: I1007 12:34:56.696449 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gh5dt\" (UniqueName: \"kubernetes.io/projected/6153b740-17ba-4004-8da8-002b8717dcc2-kube-api-access-gh5dt\") on node \"crc\" DevicePath \"\"" Oct 07 12:34:57 crc kubenswrapper[4854]: I1007 12:34:57.310552 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nz6f" event={"ID":"6153b740-17ba-4004-8da8-002b8717dcc2","Type":"ContainerDied","Data":"c792ece5e5faddf278913fe6e5559932bca2fe98d959b50f3ae73276c4f9dd60"} Oct 07 12:34:57 crc kubenswrapper[4854]: I1007 12:34:57.310601 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c792ece5e5faddf278913fe6e5559932bca2fe98d959b50f3ae73276c4f9dd60" Oct 07 12:34:57 crc kubenswrapper[4854]: I1007 12:34:57.310657 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nz6f" Oct 07 12:34:59 crc kubenswrapper[4854]: I1007 12:34:59.204692 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-jp5sx"] Oct 07 12:34:59 crc kubenswrapper[4854]: E1007 12:34:59.205471 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6153b740-17ba-4004-8da8-002b8717dcc2" containerName="pull" Oct 07 12:34:59 crc kubenswrapper[4854]: I1007 12:34:59.205492 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="6153b740-17ba-4004-8da8-002b8717dcc2" containerName="pull" Oct 07 12:34:59 crc kubenswrapper[4854]: E1007 12:34:59.205516 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6153b740-17ba-4004-8da8-002b8717dcc2" containerName="extract" Oct 07 12:34:59 crc kubenswrapper[4854]: I1007 12:34:59.205525 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="6153b740-17ba-4004-8da8-002b8717dcc2" containerName="extract" Oct 07 12:34:59 crc kubenswrapper[4854]: E1007 12:34:59.205549 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6153b740-17ba-4004-8da8-002b8717dcc2" containerName="util" Oct 07 12:34:59 crc kubenswrapper[4854]: I1007 12:34:59.205558 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="6153b740-17ba-4004-8da8-002b8717dcc2" containerName="util" Oct 07 12:34:59 crc kubenswrapper[4854]: I1007 12:34:59.205683 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="6153b740-17ba-4004-8da8-002b8717dcc2" containerName="extract" Oct 07 12:34:59 crc kubenswrapper[4854]: I1007 12:34:59.206299 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-jp5sx" Oct 07 12:34:59 crc kubenswrapper[4854]: I1007 12:34:59.209628 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Oct 07 12:34:59 crc kubenswrapper[4854]: I1007 12:34:59.209747 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-njwwk" Oct 07 12:34:59 crc kubenswrapper[4854]: I1007 12:34:59.210079 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Oct 07 12:34:59 crc kubenswrapper[4854]: I1007 12:34:59.220126 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-jp5sx"] Oct 07 12:34:59 crc kubenswrapper[4854]: I1007 12:34:59.228577 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9r5pb\" (UniqueName: \"kubernetes.io/projected/c77584c6-4cc6-4e7b-89b4-368d9a8775f5-kube-api-access-9r5pb\") pod \"nmstate-operator-858ddd8f98-jp5sx\" (UID: \"c77584c6-4cc6-4e7b-89b4-368d9a8775f5\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-jp5sx" Oct 07 12:34:59 crc kubenswrapper[4854]: I1007 12:34:59.329878 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9r5pb\" (UniqueName: \"kubernetes.io/projected/c77584c6-4cc6-4e7b-89b4-368d9a8775f5-kube-api-access-9r5pb\") pod \"nmstate-operator-858ddd8f98-jp5sx\" (UID: \"c77584c6-4cc6-4e7b-89b4-368d9a8775f5\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-jp5sx" Oct 07 12:34:59 crc kubenswrapper[4854]: I1007 12:34:59.376561 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9r5pb\" (UniqueName: \"kubernetes.io/projected/c77584c6-4cc6-4e7b-89b4-368d9a8775f5-kube-api-access-9r5pb\") pod \"nmstate-operator-858ddd8f98-jp5sx\" (UID: \"c77584c6-4cc6-4e7b-89b4-368d9a8775f5\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-jp5sx" Oct 07 12:34:59 crc kubenswrapper[4854]: I1007 12:34:59.525393 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-jp5sx" Oct 07 12:34:59 crc kubenswrapper[4854]: I1007 12:34:59.750665 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-jp5sx"] Oct 07 12:35:00 crc kubenswrapper[4854]: I1007 12:35:00.329254 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-jp5sx" event={"ID":"c77584c6-4cc6-4e7b-89b4-368d9a8775f5","Type":"ContainerStarted","Data":"82e4940411681df1975805e05a9af2100b3503c6ff2623458b79456a4fdafc62"} Oct 07 12:35:03 crc kubenswrapper[4854]: I1007 12:35:03.352755 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-jp5sx" event={"ID":"c77584c6-4cc6-4e7b-89b4-368d9a8775f5","Type":"ContainerStarted","Data":"4a818771b83e274abdf831b050d08b451311ea54414ac3d25bb22e38e4bc8ddc"} Oct 07 12:35:04 crc kubenswrapper[4854]: I1007 12:35:04.342743 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-858ddd8f98-jp5sx" podStartSLOduration=2.201865947 podStartE2EDuration="5.342714608s" podCreationTimestamp="2025-10-07 12:34:59 +0000 UTC" firstStartedPulling="2025-10-07 12:34:59.776375154 +0000 UTC m=+615.764207419" lastFinishedPulling="2025-10-07 12:35:02.917223795 +0000 UTC m=+618.905056080" observedRunningTime="2025-10-07 12:35:03.381557732 +0000 UTC m=+619.369390027" watchObservedRunningTime="2025-10-07 12:35:04.342714608 +0000 UTC m=+620.330546863" Oct 07 12:35:04 crc kubenswrapper[4854]: I1007 12:35:04.346661 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-gs6dc"] Oct 07 12:35:04 crc kubenswrapper[4854]: I1007 12:35:04.348496 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-gs6dc" Oct 07 12:35:04 crc kubenswrapper[4854]: I1007 12:35:04.351182 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-wfbbr" Oct 07 12:35:04 crc kubenswrapper[4854]: I1007 12:35:04.352026 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-4zcnr"] Oct 07 12:35:04 crc kubenswrapper[4854]: I1007 12:35:04.353087 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-4zcnr" Oct 07 12:35:04 crc kubenswrapper[4854]: I1007 12:35:04.359078 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Oct 07 12:35:04 crc kubenswrapper[4854]: I1007 12:35:04.378124 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-gs6dc"] Oct 07 12:35:04 crc kubenswrapper[4854]: I1007 12:35:04.384051 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-4zcnr"] Oct 07 12:35:04 crc kubenswrapper[4854]: I1007 12:35:04.392886 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-xx54b"] Oct 07 12:35:04 crc kubenswrapper[4854]: I1007 12:35:04.393810 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-xx54b" Oct 07 12:35:04 crc kubenswrapper[4854]: I1007 12:35:04.412259 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85gdj\" (UniqueName: \"kubernetes.io/projected/173b0967-6286-4bff-94b6-b77502d681c3-kube-api-access-85gdj\") pod \"nmstate-handler-xx54b\" (UID: \"173b0967-6286-4bff-94b6-b77502d681c3\") " pod="openshift-nmstate/nmstate-handler-xx54b" Oct 07 12:35:04 crc kubenswrapper[4854]: I1007 12:35:04.412321 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/173b0967-6286-4bff-94b6-b77502d681c3-nmstate-lock\") pod \"nmstate-handler-xx54b\" (UID: \"173b0967-6286-4bff-94b6-b77502d681c3\") " pod="openshift-nmstate/nmstate-handler-xx54b" Oct 07 12:35:04 crc kubenswrapper[4854]: I1007 12:35:04.412363 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/173b0967-6286-4bff-94b6-b77502d681c3-ovs-socket\") pod \"nmstate-handler-xx54b\" (UID: \"173b0967-6286-4bff-94b6-b77502d681c3\") " pod="openshift-nmstate/nmstate-handler-xx54b" Oct 07 12:35:04 crc kubenswrapper[4854]: I1007 12:35:04.412394 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n8lz\" (UniqueName: \"kubernetes.io/projected/2beeb70d-afc7-4e87-b8a3-3d0398207f60-kube-api-access-4n8lz\") pod \"nmstate-metrics-fdff9cb8d-gs6dc\" (UID: \"2beeb70d-afc7-4e87-b8a3-3d0398207f60\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-gs6dc" Oct 07 12:35:04 crc kubenswrapper[4854]: I1007 12:35:04.412415 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/173b0967-6286-4bff-94b6-b77502d681c3-dbus-socket\") pod \"nmstate-handler-xx54b\" (UID: \"173b0967-6286-4bff-94b6-b77502d681c3\") " pod="openshift-nmstate/nmstate-handler-xx54b" Oct 07 12:35:04 crc kubenswrapper[4854]: I1007 12:35:04.412468 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5h7wj\" (UniqueName: \"kubernetes.io/projected/b7917a01-6d98-45d9-b647-605da890ce27-kube-api-access-5h7wj\") pod \"nmstate-webhook-6cdbc54649-4zcnr\" (UID: \"b7917a01-6d98-45d9-b647-605da890ce27\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-4zcnr" Oct 07 12:35:04 crc kubenswrapper[4854]: I1007 12:35:04.412528 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/b7917a01-6d98-45d9-b647-605da890ce27-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-4zcnr\" (UID: \"b7917a01-6d98-45d9-b647-605da890ce27\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-4zcnr" Oct 07 12:35:04 crc kubenswrapper[4854]: I1007 12:35:04.513999 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85gdj\" (UniqueName: \"kubernetes.io/projected/173b0967-6286-4bff-94b6-b77502d681c3-kube-api-access-85gdj\") pod \"nmstate-handler-xx54b\" (UID: \"173b0967-6286-4bff-94b6-b77502d681c3\") " pod="openshift-nmstate/nmstate-handler-xx54b" Oct 07 12:35:04 crc kubenswrapper[4854]: I1007 12:35:04.514084 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: 
\"kubernetes.io/host-path/173b0967-6286-4bff-94b6-b77502d681c3-nmstate-lock\") pod \"nmstate-handler-xx54b\" (UID: \"173b0967-6286-4bff-94b6-b77502d681c3\") " pod="openshift-nmstate/nmstate-handler-xx54b" Oct 07 12:35:04 crc kubenswrapper[4854]: I1007 12:35:04.514117 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/173b0967-6286-4bff-94b6-b77502d681c3-ovs-socket\") pod \"nmstate-handler-xx54b\" (UID: \"173b0967-6286-4bff-94b6-b77502d681c3\") " pod="openshift-nmstate/nmstate-handler-xx54b" Oct 07 12:35:04 crc kubenswrapper[4854]: I1007 12:35:04.514168 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4n8lz\" (UniqueName: \"kubernetes.io/projected/2beeb70d-afc7-4e87-b8a3-3d0398207f60-kube-api-access-4n8lz\") pod \"nmstate-metrics-fdff9cb8d-gs6dc\" (UID: \"2beeb70d-afc7-4e87-b8a3-3d0398207f60\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-gs6dc" Oct 07 12:35:04 crc kubenswrapper[4854]: I1007 12:35:04.514205 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/173b0967-6286-4bff-94b6-b77502d681c3-dbus-socket\") pod \"nmstate-handler-xx54b\" (UID: \"173b0967-6286-4bff-94b6-b77502d681c3\") " pod="openshift-nmstate/nmstate-handler-xx54b" Oct 07 12:35:04 crc kubenswrapper[4854]: I1007 12:35:04.514238 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5h7wj\" (UniqueName: \"kubernetes.io/projected/b7917a01-6d98-45d9-b647-605da890ce27-kube-api-access-5h7wj\") pod \"nmstate-webhook-6cdbc54649-4zcnr\" (UID: \"b7917a01-6d98-45d9-b647-605da890ce27\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-4zcnr" Oct 07 12:35:04 crc kubenswrapper[4854]: I1007 12:35:04.514282 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/b7917a01-6d98-45d9-b647-605da890ce27-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-4zcnr\" (UID: \"b7917a01-6d98-45d9-b647-605da890ce27\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-4zcnr" Oct 07 12:35:04 crc kubenswrapper[4854]: E1007 12:35:04.514476 4854 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Oct 07 12:35:04 crc kubenswrapper[4854]: E1007 12:35:04.514564 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7917a01-6d98-45d9-b647-605da890ce27-tls-key-pair podName:b7917a01-6d98-45d9-b647-605da890ce27 nodeName:}" failed. No retries permitted until 2025-10-07 12:35:05.014532756 +0000 UTC m=+621.002365011 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/b7917a01-6d98-45d9-b647-605da890ce27-tls-key-pair") pod "nmstate-webhook-6cdbc54649-4zcnr" (UID: "b7917a01-6d98-45d9-b647-605da890ce27") : secret "openshift-nmstate-webhook" not found Oct 07 12:35:04 crc kubenswrapper[4854]: I1007 12:35:04.515099 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/173b0967-6286-4bff-94b6-b77502d681c3-nmstate-lock\") pod \"nmstate-handler-xx54b\" (UID: \"173b0967-6286-4bff-94b6-b77502d681c3\") " pod="openshift-nmstate/nmstate-handler-xx54b" Oct 07 12:35:04 crc kubenswrapper[4854]: I1007 12:35:04.515173 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/173b0967-6286-4bff-94b6-b77502d681c3-ovs-socket\") pod \"nmstate-handler-xx54b\" (UID: \"173b0967-6286-4bff-94b6-b77502d681c3\") " pod="openshift-nmstate/nmstate-handler-xx54b" Oct 07 12:35:04 crc kubenswrapper[4854]: I1007 12:35:04.515659 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/173b0967-6286-4bff-94b6-b77502d681c3-dbus-socket\") pod \"nmstate-handler-xx54b\" (UID: \"173b0967-6286-4bff-94b6-b77502d681c3\") " pod="openshift-nmstate/nmstate-handler-xx54b" Oct 07 12:35:04 crc kubenswrapper[4854]: I1007 12:35:04.518862 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-h2m6m"] Oct 07 12:35:04 crc kubenswrapper[4854]: I1007 12:35:04.519648 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-h2m6m" Oct 07 12:35:04 crc kubenswrapper[4854]: I1007 12:35:04.529496 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Oct 07 12:35:04 crc kubenswrapper[4854]: I1007 12:35:04.529792 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Oct 07 12:35:04 crc kubenswrapper[4854]: I1007 12:35:04.529919 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-mvwn5" Oct 07 12:35:04 crc kubenswrapper[4854]: I1007 12:35:04.548603 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n8lz\" (UniqueName: \"kubernetes.io/projected/2beeb70d-afc7-4e87-b8a3-3d0398207f60-kube-api-access-4n8lz\") pod \"nmstate-metrics-fdff9cb8d-gs6dc\" (UID: \"2beeb70d-afc7-4e87-b8a3-3d0398207f60\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-gs6dc" Oct 07 12:35:04 crc kubenswrapper[4854]: I1007 12:35:04.552993 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5h7wj\" (UniqueName: \"kubernetes.io/projected/b7917a01-6d98-45d9-b647-605da890ce27-kube-api-access-5h7wj\") pod \"nmstate-webhook-6cdbc54649-4zcnr\" (UID: \"b7917a01-6d98-45d9-b647-605da890ce27\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-4zcnr" Oct 07 12:35:04 crc kubenswrapper[4854]: I1007 12:35:04.554017 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-h2m6m"] Oct 07 12:35:04 crc kubenswrapper[4854]: I1007 12:35:04.562769 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85gdj\" (UniqueName: \"kubernetes.io/projected/173b0967-6286-4bff-94b6-b77502d681c3-kube-api-access-85gdj\") pod 
\"nmstate-handler-xx54b\" (UID: \"173b0967-6286-4bff-94b6-b77502d681c3\") " pod="openshift-nmstate/nmstate-handler-xx54b" Oct 07 12:35:04 crc kubenswrapper[4854]: I1007 12:35:04.675510 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-gs6dc" Oct 07 12:35:04 crc kubenswrapper[4854]: I1007 12:35:04.714189 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-xx54b" Oct 07 12:35:04 crc kubenswrapper[4854]: I1007 12:35:04.717829 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r87gj\" (UniqueName: \"kubernetes.io/projected/20300568-d59a-4c2e-9311-56d1c20419c2-kube-api-access-r87gj\") pod \"nmstate-console-plugin-6b874cbd85-h2m6m\" (UID: \"20300568-d59a-4c2e-9311-56d1c20419c2\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-h2m6m" Oct 07 12:35:04 crc kubenswrapper[4854]: I1007 12:35:04.717915 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/20300568-d59a-4c2e-9311-56d1c20419c2-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-h2m6m\" (UID: \"20300568-d59a-4c2e-9311-56d1c20419c2\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-h2m6m" Oct 07 12:35:04 crc kubenswrapper[4854]: I1007 12:35:04.717989 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/20300568-d59a-4c2e-9311-56d1c20419c2-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-h2m6m\" (UID: \"20300568-d59a-4c2e-9311-56d1c20419c2\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-h2m6m" Oct 07 12:35:04 crc kubenswrapper[4854]: I1007 12:35:04.782263 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-59b7f8d4bf-cdmw9"] Oct 07 12:35:04 crc kubenswrapper[4854]: I1007 12:35:04.783120 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-59b7f8d4bf-cdmw9" Oct 07 12:35:04 crc kubenswrapper[4854]: I1007 12:35:04.796999 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-59b7f8d4bf-cdmw9"] Oct 07 12:35:04 crc kubenswrapper[4854]: I1007 12:35:04.818932 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r87gj\" (UniqueName: \"kubernetes.io/projected/20300568-d59a-4c2e-9311-56d1c20419c2-kube-api-access-r87gj\") pod \"nmstate-console-plugin-6b874cbd85-h2m6m\" (UID: \"20300568-d59a-4c2e-9311-56d1c20419c2\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-h2m6m" Oct 07 12:35:04 crc kubenswrapper[4854]: I1007 12:35:04.819019 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/20300568-d59a-4c2e-9311-56d1c20419c2-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-h2m6m\" (UID: \"20300568-d59a-4c2e-9311-56d1c20419c2\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-h2m6m" Oct 07 12:35:04 crc kubenswrapper[4854]: I1007 12:35:04.819114 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/20300568-d59a-4c2e-9311-56d1c20419c2-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-h2m6m\" (UID: \"20300568-d59a-4c2e-9311-56d1c20419c2\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-h2m6m" Oct 07 12:35:04 crc kubenswrapper[4854]: I1007 12:35:04.824955 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/20300568-d59a-4c2e-9311-56d1c20419c2-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-h2m6m\" (UID: \"20300568-d59a-4c2e-9311-56d1c20419c2\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-h2m6m" Oct 07 12:35:04 crc kubenswrapper[4854]: I1007 12:35:04.833189 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/20300568-d59a-4c2e-9311-56d1c20419c2-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-h2m6m\" (UID: \"20300568-d59a-4c2e-9311-56d1c20419c2\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-h2m6m" Oct 07 12:35:04 crc kubenswrapper[4854]: I1007 12:35:04.842513 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r87gj\" (UniqueName: \"kubernetes.io/projected/20300568-d59a-4c2e-9311-56d1c20419c2-kube-api-access-r87gj\") pod \"nmstate-console-plugin-6b874cbd85-h2m6m\" (UID: \"20300568-d59a-4c2e-9311-56d1c20419c2\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-h2m6m" Oct 07 12:35:04 crc kubenswrapper[4854]: I1007 12:35:04.844237 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-h2m6m" Oct 07 12:35:04 crc kubenswrapper[4854]: I1007 12:35:04.923881 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/befaa955-2147-4f65-983c-55b28e1d8a11-console-config\") pod \"console-59b7f8d4bf-cdmw9\" (UID: \"befaa955-2147-4f65-983c-55b28e1d8a11\") " pod="openshift-console/console-59b7f8d4bf-cdmw9" Oct 07 12:35:04 crc kubenswrapper[4854]: I1007 12:35:04.924291 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9xgx\" (UniqueName: \"kubernetes.io/projected/befaa955-2147-4f65-983c-55b28e1d8a11-kube-api-access-f9xgx\") pod \"console-59b7f8d4bf-cdmw9\" (UID: \"befaa955-2147-4f65-983c-55b28e1d8a11\") " pod="openshift-console/console-59b7f8d4bf-cdmw9" Oct 07 12:35:04 crc kubenswrapper[4854]: I1007 12:35:04.924914 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/befaa955-2147-4f65-983c-55b28e1d8a11-console-serving-cert\") pod \"console-59b7f8d4bf-cdmw9\" (UID: \"befaa955-2147-4f65-983c-55b28e1d8a11\") " pod="openshift-console/console-59b7f8d4bf-cdmw9" Oct 07 12:35:04 crc kubenswrapper[4854]: I1007 12:35:04.924944 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/befaa955-2147-4f65-983c-55b28e1d8a11-console-oauth-config\") pod \"console-59b7f8d4bf-cdmw9\" (UID: \"befaa955-2147-4f65-983c-55b28e1d8a11\") " pod="openshift-console/console-59b7f8d4bf-cdmw9" Oct 07 12:35:04 crc kubenswrapper[4854]: I1007 12:35:04.925105 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/befaa955-2147-4f65-983c-55b28e1d8a11-service-ca\") pod \"console-59b7f8d4bf-cdmw9\" (UID: \"befaa955-2147-4f65-983c-55b28e1d8a11\") " pod="openshift-console/console-59b7f8d4bf-cdmw9" Oct 07 12:35:04 crc kubenswrapper[4854]: I1007 12:35:04.925279 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/befaa955-2147-4f65-983c-55b28e1d8a11-trusted-ca-bundle\") pod \"console-59b7f8d4bf-cdmw9\" (UID: \"befaa955-2147-4f65-983c-55b28e1d8a11\") " pod="openshift-console/console-59b7f8d4bf-cdmw9" Oct 07 12:35:04 crc kubenswrapper[4854]: I1007 12:35:04.925373 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/befaa955-2147-4f65-983c-55b28e1d8a11-oauth-serving-cert\") pod \"console-59b7f8d4bf-cdmw9\" (UID: \"befaa955-2147-4f65-983c-55b28e1d8a11\") " pod="openshift-console/console-59b7f8d4bf-cdmw9" Oct 07 12:35:04 crc kubenswrapper[4854]: I1007 12:35:04.942848 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-gs6dc"] Oct 07 12:35:05 crc kubenswrapper[4854]: I1007 12:35:05.027138 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9xgx\" (UniqueName: \"kubernetes.io/projected/befaa955-2147-4f65-983c-55b28e1d8a11-kube-api-access-f9xgx\") pod \"console-59b7f8d4bf-cdmw9\" (UID: \"befaa955-2147-4f65-983c-55b28e1d8a11\") " 
pod="openshift-console/console-59b7f8d4bf-cdmw9" Oct 07 12:35:05 crc kubenswrapper[4854]: I1007 12:35:05.028368 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/befaa955-2147-4f65-983c-55b28e1d8a11-console-serving-cert\") pod \"console-59b7f8d4bf-cdmw9\" (UID: \"befaa955-2147-4f65-983c-55b28e1d8a11\") " pod="openshift-console/console-59b7f8d4bf-cdmw9" Oct 07 12:35:05 crc kubenswrapper[4854]: I1007 12:35:05.028387 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/befaa955-2147-4f65-983c-55b28e1d8a11-console-oauth-config\") pod \"console-59b7f8d4bf-cdmw9\" (UID: \"befaa955-2147-4f65-983c-55b28e1d8a11\") " pod="openshift-console/console-59b7f8d4bf-cdmw9" Oct 07 12:35:05 crc kubenswrapper[4854]: I1007 12:35:05.029425 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/befaa955-2147-4f65-983c-55b28e1d8a11-service-ca\") pod \"console-59b7f8d4bf-cdmw9\" (UID: \"befaa955-2147-4f65-983c-55b28e1d8a11\") " pod="openshift-console/console-59b7f8d4bf-cdmw9" Oct 07 12:35:05 crc kubenswrapper[4854]: I1007 12:35:05.029482 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/befaa955-2147-4f65-983c-55b28e1d8a11-trusted-ca-bundle\") pod \"console-59b7f8d4bf-cdmw9\" (UID: \"befaa955-2147-4f65-983c-55b28e1d8a11\") " pod="openshift-console/console-59b7f8d4bf-cdmw9" Oct 07 12:35:05 crc kubenswrapper[4854]: I1007 12:35:05.029522 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/befaa955-2147-4f65-983c-55b28e1d8a11-oauth-serving-cert\") pod \"console-59b7f8d4bf-cdmw9\" (UID: \"befaa955-2147-4f65-983c-55b28e1d8a11\") " pod="openshift-console/console-59b7f8d4bf-cdmw9" Oct 07 12:35:05 crc kubenswrapper[4854]: I1007 12:35:05.029590 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/b7917a01-6d98-45d9-b647-605da890ce27-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-4zcnr\" (UID: \"b7917a01-6d98-45d9-b647-605da890ce27\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-4zcnr" Oct 07 12:35:05 crc kubenswrapper[4854]: I1007 12:35:05.029652 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/befaa955-2147-4f65-983c-55b28e1d8a11-console-config\") pod \"console-59b7f8d4bf-cdmw9\" (UID: \"befaa955-2147-4f65-983c-55b28e1d8a11\") " pod="openshift-console/console-59b7f8d4bf-cdmw9" Oct 07 12:35:05 crc kubenswrapper[4854]: I1007 12:35:05.030550 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/befaa955-2147-4f65-983c-55b28e1d8a11-console-config\") pod \"console-59b7f8d4bf-cdmw9\" (UID: \"befaa955-2147-4f65-983c-55b28e1d8a11\") " pod="openshift-console/console-59b7f8d4bf-cdmw9" Oct 07 12:35:05 crc kubenswrapper[4854]: I1007 12:35:05.031523 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/befaa955-2147-4f65-983c-55b28e1d8a11-oauth-serving-cert\") pod \"console-59b7f8d4bf-cdmw9\" (UID: \"befaa955-2147-4f65-983c-55b28e1d8a11\") " 
pod="openshift-console/console-59b7f8d4bf-cdmw9" Oct 07 12:35:05 crc kubenswrapper[4854]: I1007 12:35:05.032329 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/befaa955-2147-4f65-983c-55b28e1d8a11-service-ca\") pod \"console-59b7f8d4bf-cdmw9\" (UID: \"befaa955-2147-4f65-983c-55b28e1d8a11\") " pod="openshift-console/console-59b7f8d4bf-cdmw9" Oct 07 12:35:05 crc kubenswrapper[4854]: I1007 12:35:05.032766 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/befaa955-2147-4f65-983c-55b28e1d8a11-console-oauth-config\") pod \"console-59b7f8d4bf-cdmw9\" (UID: \"befaa955-2147-4f65-983c-55b28e1d8a11\") " pod="openshift-console/console-59b7f8d4bf-cdmw9" Oct 07 12:35:05 crc kubenswrapper[4854]: I1007 12:35:05.032838 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/befaa955-2147-4f65-983c-55b28e1d8a11-trusted-ca-bundle\") pod \"console-59b7f8d4bf-cdmw9\" (UID: \"befaa955-2147-4f65-983c-55b28e1d8a11\") " pod="openshift-console/console-59b7f8d4bf-cdmw9" Oct 07 12:35:05 crc kubenswrapper[4854]: I1007 12:35:05.037186 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/befaa955-2147-4f65-983c-55b28e1d8a11-console-serving-cert\") pod \"console-59b7f8d4bf-cdmw9\" (UID: \"befaa955-2147-4f65-983c-55b28e1d8a11\") " pod="openshift-console/console-59b7f8d4bf-cdmw9" Oct 07 12:35:05 crc kubenswrapper[4854]: I1007 12:35:05.037774 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/b7917a01-6d98-45d9-b647-605da890ce27-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-4zcnr\" (UID: \"b7917a01-6d98-45d9-b647-605da890ce27\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-4zcnr" Oct 07 12:35:05 crc kubenswrapper[4854]: I1007 12:35:05.049033 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9xgx\" (UniqueName: \"kubernetes.io/projected/befaa955-2147-4f65-983c-55b28e1d8a11-kube-api-access-f9xgx\") pod \"console-59b7f8d4bf-cdmw9\" (UID: \"befaa955-2147-4f65-983c-55b28e1d8a11\") " pod="openshift-console/console-59b7f8d4bf-cdmw9" Oct 07 12:35:05 crc kubenswrapper[4854]: I1007 12:35:05.050779 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-h2m6m"] Oct 07 12:35:05 crc kubenswrapper[4854]: W1007 12:35:05.054400 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20300568_d59a_4c2e_9311_56d1c20419c2.slice/crio-a0f41a81c93b15dbd5c80fbca10bfc6864cb45da02a86b16e92e873511f08827 WatchSource:0}: Error finding container a0f41a81c93b15dbd5c80fbca10bfc6864cb45da02a86b16e92e873511f08827: Status 404 returned error can't find the container with id a0f41a81c93b15dbd5c80fbca10bfc6864cb45da02a86b16e92e873511f08827 Oct 07 12:35:05 crc kubenswrapper[4854]: I1007 12:35:05.104235 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-59b7f8d4bf-cdmw9" Oct 07 12:35:05 crc kubenswrapper[4854]: I1007 12:35:05.286003 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-4zcnr" Oct 07 12:35:05 crc kubenswrapper[4854]: W1007 12:35:05.348612 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbefaa955_2147_4f65_983c_55b28e1d8a11.slice/crio-7a6982c362aae8a1f0e9c6f4199c62fe2dc97f46bb2982f145750e7838df991a WatchSource:0}: Error finding container 7a6982c362aae8a1f0e9c6f4199c62fe2dc97f46bb2982f145750e7838df991a: Status 404 returned error can't find the container with id 7a6982c362aae8a1f0e9c6f4199c62fe2dc97f46bb2982f145750e7838df991a Oct 07 12:35:05 crc kubenswrapper[4854]: I1007 12:35:05.350993 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-59b7f8d4bf-cdmw9"] Oct 07 12:35:05 crc kubenswrapper[4854]: I1007 12:35:05.366089 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-gs6dc" event={"ID":"2beeb70d-afc7-4e87-b8a3-3d0398207f60","Type":"ContainerStarted","Data":"11acdf0fae9f6cd715dd36209b0e4b7de1da8dd0406d5673e1557d435157781e"} Oct 07 12:35:05 crc kubenswrapper[4854]: I1007 12:35:05.367262 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-h2m6m" event={"ID":"20300568-d59a-4c2e-9311-56d1c20419c2","Type":"ContainerStarted","Data":"a0f41a81c93b15dbd5c80fbca10bfc6864cb45da02a86b16e92e873511f08827"} Oct 07 12:35:05 crc kubenswrapper[4854]: I1007 12:35:05.369090 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59b7f8d4bf-cdmw9" event={"ID":"befaa955-2147-4f65-983c-55b28e1d8a11","Type":"ContainerStarted","Data":"7a6982c362aae8a1f0e9c6f4199c62fe2dc97f46bb2982f145750e7838df991a"} Oct 07 12:35:05 crc kubenswrapper[4854]: I1007 12:35:05.369989 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-xx54b" event={"ID":"173b0967-6286-4bff-94b6-b77502d681c3","Type":"ContainerStarted","Data":"35a12a09380fe2d128dfeba018aa02112bc5e9d078458e3a531a2915bbbc8262"} Oct 07 12:35:05 crc kubenswrapper[4854]: I1007 12:35:05.545381 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-4zcnr"] Oct 07 12:35:06 crc kubenswrapper[4854]: I1007 12:35:06.377279 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59b7f8d4bf-cdmw9" event={"ID":"befaa955-2147-4f65-983c-55b28e1d8a11","Type":"ContainerStarted","Data":"c7db589eeb5ffa956e80ce3bf2c15095cbb89363df1211d8949f40c8741fa4c8"} Oct 07 12:35:06 crc kubenswrapper[4854]: I1007 12:35:06.378329 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-4zcnr" event={"ID":"b7917a01-6d98-45d9-b647-605da890ce27","Type":"ContainerStarted","Data":"bb850df7c0e2fc336a740f37ab76f4a3e5408ca1eabe020b5256925b71df0bfb"} Oct 07 12:35:06 crc kubenswrapper[4854]: I1007 12:35:06.403306 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-59b7f8d4bf-cdmw9" podStartSLOduration=2.403284487 podStartE2EDuration="2.403284487s" podCreationTimestamp="2025-10-07 12:35:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:35:06.399217508 +0000 UTC m=+622.387049773" watchObservedRunningTime="2025-10-07 12:35:06.403284487 +0000 UTC m=+622.391116742" Oct 07 12:35:08 crc kubenswrapper[4854]: I1007 12:35:08.393469 4854 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-xx54b" event={"ID":"173b0967-6286-4bff-94b6-b77502d681c3","Type":"ContainerStarted","Data":"04d376d20126beb8e88823a44c1404ae17ca03f37c0118a08c8680513b4d6318"} Oct 07 12:35:08 crc kubenswrapper[4854]: I1007 12:35:08.393989 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-xx54b" Oct 07 12:35:08 crc kubenswrapper[4854]: I1007 12:35:08.395864 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-4zcnr" event={"ID":"b7917a01-6d98-45d9-b647-605da890ce27","Type":"ContainerStarted","Data":"b4ebca8508238240fd1232441663daea8ffb5a15270a8556f1b33f43607f4f11"} Oct 07 12:35:08 crc kubenswrapper[4854]: I1007 12:35:08.396081 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-4zcnr" Oct 07 12:35:08 crc kubenswrapper[4854]: I1007 12:35:08.397474 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-gs6dc" event={"ID":"2beeb70d-afc7-4e87-b8a3-3d0398207f60","Type":"ContainerStarted","Data":"8713c091b9dc8c99475167e92696923dd222ce4b8a89fcae740b202c2570e746"} Oct 07 12:35:08 crc kubenswrapper[4854]: I1007 12:35:08.399654 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-h2m6m" event={"ID":"20300568-d59a-4c2e-9311-56d1c20419c2","Type":"ContainerStarted","Data":"2e2fcb27169aae2e08bdb2a4d209f3b623bc25859f674b6d9b91a4d0a86ebc10"} Oct 07 12:35:08 crc kubenswrapper[4854]: I1007 12:35:08.435126 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-4zcnr" podStartSLOduration=2.145783462 podStartE2EDuration="4.435102083s" podCreationTimestamp="2025-10-07 12:35:04 +0000 UTC" firstStartedPulling="2025-10-07 12:35:05.571716983 +0000 UTC m=+621.559549248" lastFinishedPulling="2025-10-07 12:35:07.861035594 +0000 UTC m=+623.848867869" observedRunningTime="2025-10-07 12:35:08.432958961 +0000 UTC m=+624.420791216" watchObservedRunningTime="2025-10-07 12:35:08.435102083 +0000 UTC m=+624.422934338" Oct 07 12:35:08 crc kubenswrapper[4854]: I1007 12:35:08.438527 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-xx54b" podStartSLOduration=1.331653647 podStartE2EDuration="4.438486622s" podCreationTimestamp="2025-10-07 12:35:04 +0000 UTC" firstStartedPulling="2025-10-07 12:35:04.757341531 +0000 UTC m=+620.745173796" lastFinishedPulling="2025-10-07 12:35:07.864174516 +0000 UTC m=+623.852006771" observedRunningTime="2025-10-07 12:35:08.416685364 +0000 UTC m=+624.404517639" watchObservedRunningTime="2025-10-07 12:35:08.438486622 +0000 UTC m=+624.426318917" Oct 07 12:35:08 crc kubenswrapper[4854]: I1007 12:35:08.457592 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-h2m6m" podStartSLOduration=1.653457504 podStartE2EDuration="4.457568191s" podCreationTimestamp="2025-10-07 12:35:04 +0000 UTC" firstStartedPulling="2025-10-07 12:35:05.056691891 +0000 UTC m=+621.044524146" lastFinishedPulling="2025-10-07 12:35:07.860802558 +0000 UTC m=+623.848634833" observedRunningTime="2025-10-07 12:35:08.454921993 +0000 UTC m=+624.442754268" watchObservedRunningTime="2025-10-07 12:35:08.457568191 +0000 UTC m=+624.445400446" Oct 07 12:35:10 crc kubenswrapper[4854]: I1007 
12:35:10.808519 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 12:35:10 crc kubenswrapper[4854]: I1007 12:35:10.810503 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 12:35:10 crc kubenswrapper[4854]: I1007 12:35:10.810647 4854 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" Oct 07 12:35:11 crc kubenswrapper[4854]: I1007 12:35:11.418879 4854 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f5e36bd11ba8e83568e3659962294b47149b50c3ca170d0c33beb4ab9960604f"} pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 12:35:11 crc kubenswrapper[4854]: I1007 12:35:11.419322 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" containerID="cri-o://f5e36bd11ba8e83568e3659962294b47149b50c3ca170d0c33beb4ab9960604f" gracePeriod=600 Oct 07 12:35:12 crc kubenswrapper[4854]: I1007 12:35:12.428471 4854 generic.go:334] "Generic (PLEG): container finished" podID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerID="f5e36bd11ba8e83568e3659962294b47149b50c3ca170d0c33beb4ab9960604f" exitCode=0 Oct 07 12:35:12 crc kubenswrapper[4854]: I1007 12:35:12.428554 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" event={"ID":"40b8b82d-cfd5-41d7-8673-5774db092c85","Type":"ContainerDied","Data":"f5e36bd11ba8e83568e3659962294b47149b50c3ca170d0c33beb4ab9960604f"} Oct 07 12:35:12 crc kubenswrapper[4854]: I1007 12:35:12.428965 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" event={"ID":"40b8b82d-cfd5-41d7-8673-5774db092c85","Type":"ContainerStarted","Data":"ad58428eb7a4eee4282ef9e4c324813616eaa8e2cac7c60386af842a1cd061ba"} Oct 07 12:35:12 crc kubenswrapper[4854]: I1007 12:35:12.428998 4854 scope.go:117] "RemoveContainer" containerID="2384c1ed3c6c1a4d362b7cd3182c7ae041c6ae68b38e9626a42db83d715f023f" Oct 07 12:35:12 crc kubenswrapper[4854]: I1007 12:35:12.431437 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-gs6dc" event={"ID":"2beeb70d-afc7-4e87-b8a3-3d0398207f60","Type":"ContainerStarted","Data":"2c6ea9878d44ab78c5562c3f293fe47048ae3810fe4cbfa989134e0e8c64d58e"} Oct 07 12:35:12 crc kubenswrapper[4854]: I1007 12:35:12.469276 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-gs6dc" podStartSLOduration=1.618820761 podStartE2EDuration="8.469239655s" podCreationTimestamp="2025-10-07 12:35:04 +0000 UTC" firstStartedPulling="2025-10-07 12:35:04.956350815 +0000 UTC 
m=+620.944183060" lastFinishedPulling="2025-10-07 12:35:11.806769699 +0000 UTC m=+627.794601954" observedRunningTime="2025-10-07 12:35:12.467494084 +0000 UTC m=+628.455326339" watchObservedRunningTime="2025-10-07 12:35:12.469239655 +0000 UTC m=+628.457071940" Oct 07 12:35:14 crc kubenswrapper[4854]: I1007 12:35:14.751833 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-xx54b" Oct 07 12:35:15 crc kubenswrapper[4854]: I1007 12:35:15.104976 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-59b7f8d4bf-cdmw9" Oct 07 12:35:15 crc kubenswrapper[4854]: I1007 12:35:15.105066 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-59b7f8d4bf-cdmw9" Oct 07 12:35:15 crc kubenswrapper[4854]: I1007 12:35:15.114685 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-59b7f8d4bf-cdmw9" Oct 07 12:35:15 crc kubenswrapper[4854]: I1007 12:35:15.463859 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-59b7f8d4bf-cdmw9" Oct 07 12:35:15 crc kubenswrapper[4854]: I1007 12:35:15.545109 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-9cjsc"] Oct 07 12:35:25 crc kubenswrapper[4854]: I1007 12:35:25.294218 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-4zcnr" Oct 07 12:35:27 crc kubenswrapper[4854]: I1007 12:35:27.846290 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6s458"] Oct 07 12:35:27 crc kubenswrapper[4854]: I1007 12:35:27.846981 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-6s458" podUID="753a1abe-8f65-4721-ad3f-b207e3413ffa" containerName="controller-manager" containerID="cri-o://0841c829e516db789b99eb47c1714434662b71c48a0c67251407ed29307c248e" gracePeriod=30 Oct 07 12:35:27 crc kubenswrapper[4854]: I1007 12:35:27.940638 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-972qh"] Oct 07 12:35:27 crc kubenswrapper[4854]: I1007 12:35:27.940983 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-972qh" podUID="6c4c0dff-7632-4208-ad0e-475eb69bbc3b" containerName="route-controller-manager" containerID="cri-o://f93699bad30ab9b4213570ff13f662f7a74fa1a38cf616478d0940f9e5ddc952" gracePeriod=30 Oct 07 12:35:28 crc kubenswrapper[4854]: I1007 12:35:28.265847 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-6s458" Oct 07 12:35:28 crc kubenswrapper[4854]: I1007 12:35:28.398836 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-972qh" Oct 07 12:35:28 crc kubenswrapper[4854]: I1007 12:35:28.418247 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/753a1abe-8f65-4721-ad3f-b207e3413ffa-config\") pod \"753a1abe-8f65-4721-ad3f-b207e3413ffa\" (UID: \"753a1abe-8f65-4721-ad3f-b207e3413ffa\") " Oct 07 12:35:28 crc kubenswrapper[4854]: I1007 12:35:28.418315 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2s5b\" (UniqueName: \"kubernetes.io/projected/753a1abe-8f65-4721-ad3f-b207e3413ffa-kube-api-access-m2s5b\") pod \"753a1abe-8f65-4721-ad3f-b207e3413ffa\" (UID: \"753a1abe-8f65-4721-ad3f-b207e3413ffa\") " Oct 07 12:35:28 crc kubenswrapper[4854]: I1007 12:35:28.418360 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/753a1abe-8f65-4721-ad3f-b207e3413ffa-proxy-ca-bundles\") pod \"753a1abe-8f65-4721-ad3f-b207e3413ffa\" (UID: \"753a1abe-8f65-4721-ad3f-b207e3413ffa\") " Oct 07 12:35:28 crc kubenswrapper[4854]: I1007 12:35:28.418394 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/753a1abe-8f65-4721-ad3f-b207e3413ffa-client-ca\") pod \"753a1abe-8f65-4721-ad3f-b207e3413ffa\" (UID: \"753a1abe-8f65-4721-ad3f-b207e3413ffa\") " Oct 07 12:35:28 crc kubenswrapper[4854]: I1007 12:35:28.418425 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6c4c0dff-7632-4208-ad0e-475eb69bbc3b-serving-cert\") pod \"6c4c0dff-7632-4208-ad0e-475eb69bbc3b\" (UID: \"6c4c0dff-7632-4208-ad0e-475eb69bbc3b\") " Oct 07 12:35:28 crc kubenswrapper[4854]: I1007 12:35:28.418452 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6c4c0dff-7632-4208-ad0e-475eb69bbc3b-client-ca\") pod \"6c4c0dff-7632-4208-ad0e-475eb69bbc3b\" (UID: \"6c4c0dff-7632-4208-ad0e-475eb69bbc3b\") " Oct 07 12:35:28 crc kubenswrapper[4854]: I1007 12:35:28.418485 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4mvc\" (UniqueName: \"kubernetes.io/projected/6c4c0dff-7632-4208-ad0e-475eb69bbc3b-kube-api-access-b4mvc\") pod \"6c4c0dff-7632-4208-ad0e-475eb69bbc3b\" (UID: \"6c4c0dff-7632-4208-ad0e-475eb69bbc3b\") " Oct 07 12:35:28 crc kubenswrapper[4854]: I1007 12:35:28.418520 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c4c0dff-7632-4208-ad0e-475eb69bbc3b-config\") pod \"6c4c0dff-7632-4208-ad0e-475eb69bbc3b\" (UID: \"6c4c0dff-7632-4208-ad0e-475eb69bbc3b\") " Oct 07 12:35:28 crc kubenswrapper[4854]: I1007 12:35:28.418542 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/753a1abe-8f65-4721-ad3f-b207e3413ffa-serving-cert\") pod \"753a1abe-8f65-4721-ad3f-b207e3413ffa\" (UID: \"753a1abe-8f65-4721-ad3f-b207e3413ffa\") " Oct 07 12:35:28 crc kubenswrapper[4854]: I1007 12:35:28.421525 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/753a1abe-8f65-4721-ad3f-b207e3413ffa-config" (OuterVolumeSpecName: "config") pod "753a1abe-8f65-4721-ad3f-b207e3413ffa" (UID: 
"753a1abe-8f65-4721-ad3f-b207e3413ffa"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:35:28 crc kubenswrapper[4854]: I1007 12:35:28.422288 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/753a1abe-8f65-4721-ad3f-b207e3413ffa-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "753a1abe-8f65-4721-ad3f-b207e3413ffa" (UID: "753a1abe-8f65-4721-ad3f-b207e3413ffa"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:35:28 crc kubenswrapper[4854]: I1007 12:35:28.422585 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c4c0dff-7632-4208-ad0e-475eb69bbc3b-config" (OuterVolumeSpecName: "config") pod "6c4c0dff-7632-4208-ad0e-475eb69bbc3b" (UID: "6c4c0dff-7632-4208-ad0e-475eb69bbc3b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:35:28 crc kubenswrapper[4854]: I1007 12:35:28.422647 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c4c0dff-7632-4208-ad0e-475eb69bbc3b-client-ca" (OuterVolumeSpecName: "client-ca") pod "6c4c0dff-7632-4208-ad0e-475eb69bbc3b" (UID: "6c4c0dff-7632-4208-ad0e-475eb69bbc3b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:35:28 crc kubenswrapper[4854]: I1007 12:35:28.422972 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/753a1abe-8f65-4721-ad3f-b207e3413ffa-client-ca" (OuterVolumeSpecName: "client-ca") pod "753a1abe-8f65-4721-ad3f-b207e3413ffa" (UID: "753a1abe-8f65-4721-ad3f-b207e3413ffa"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:35:28 crc kubenswrapper[4854]: I1007 12:35:28.432102 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c4c0dff-7632-4208-ad0e-475eb69bbc3b-kube-api-access-b4mvc" (OuterVolumeSpecName: "kube-api-access-b4mvc") pod "6c4c0dff-7632-4208-ad0e-475eb69bbc3b" (UID: "6c4c0dff-7632-4208-ad0e-475eb69bbc3b"). InnerVolumeSpecName "kube-api-access-b4mvc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:35:28 crc kubenswrapper[4854]: I1007 12:35:28.432249 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/753a1abe-8f65-4721-ad3f-b207e3413ffa-kube-api-access-m2s5b" (OuterVolumeSpecName: "kube-api-access-m2s5b") pod "753a1abe-8f65-4721-ad3f-b207e3413ffa" (UID: "753a1abe-8f65-4721-ad3f-b207e3413ffa"). InnerVolumeSpecName "kube-api-access-m2s5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:35:28 crc kubenswrapper[4854]: I1007 12:35:28.432887 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c4c0dff-7632-4208-ad0e-475eb69bbc3b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6c4c0dff-7632-4208-ad0e-475eb69bbc3b" (UID: "6c4c0dff-7632-4208-ad0e-475eb69bbc3b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:35:28 crc kubenswrapper[4854]: I1007 12:35:28.436567 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/753a1abe-8f65-4721-ad3f-b207e3413ffa-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "753a1abe-8f65-4721-ad3f-b207e3413ffa" (UID: "753a1abe-8f65-4721-ad3f-b207e3413ffa"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:35:28 crc kubenswrapper[4854]: I1007 12:35:28.520462 4854 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/753a1abe-8f65-4721-ad3f-b207e3413ffa-config\") on node \"crc\" DevicePath \"\"" Oct 07 12:35:28 crc kubenswrapper[4854]: I1007 12:35:28.520505 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2s5b\" (UniqueName: \"kubernetes.io/projected/753a1abe-8f65-4721-ad3f-b207e3413ffa-kube-api-access-m2s5b\") on node \"crc\" DevicePath \"\"" Oct 07 12:35:28 crc kubenswrapper[4854]: I1007 12:35:28.520518 4854 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/753a1abe-8f65-4721-ad3f-b207e3413ffa-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 07 12:35:28 crc kubenswrapper[4854]: I1007 12:35:28.520527 4854 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/753a1abe-8f65-4721-ad3f-b207e3413ffa-client-ca\") on node \"crc\" DevicePath \"\"" Oct 07 12:35:28 crc kubenswrapper[4854]: I1007 12:35:28.520538 4854 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6c4c0dff-7632-4208-ad0e-475eb69bbc3b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 12:35:28 crc kubenswrapper[4854]: I1007 12:35:28.520547 4854 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6c4c0dff-7632-4208-ad0e-475eb69bbc3b-client-ca\") on node \"crc\" DevicePath \"\"" Oct 07 12:35:28 crc kubenswrapper[4854]: I1007 12:35:28.520561 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4mvc\" (UniqueName: \"kubernetes.io/projected/6c4c0dff-7632-4208-ad0e-475eb69bbc3b-kube-api-access-b4mvc\") on node \"crc\" DevicePath \"\"" Oct 07 12:35:28 crc kubenswrapper[4854]: I1007 12:35:28.520571 4854 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c4c0dff-7632-4208-ad0e-475eb69bbc3b-config\") on node \"crc\" DevicePath \"\"" Oct 07 12:35:28 crc kubenswrapper[4854]: I1007 12:35:28.520580 4854 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/753a1abe-8f65-4721-ad3f-b207e3413ffa-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 12:35:28 crc kubenswrapper[4854]: I1007 12:35:28.563284 4854 generic.go:334] "Generic (PLEG): container finished" podID="6c4c0dff-7632-4208-ad0e-475eb69bbc3b" containerID="f93699bad30ab9b4213570ff13f662f7a74fa1a38cf616478d0940f9e5ddc952" exitCode=0 Oct 07 12:35:28 crc kubenswrapper[4854]: I1007 12:35:28.563638 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-972qh" event={"ID":"6c4c0dff-7632-4208-ad0e-475eb69bbc3b","Type":"ContainerDied","Data":"f93699bad30ab9b4213570ff13f662f7a74fa1a38cf616478d0940f9e5ddc952"} Oct 07 12:35:28 crc kubenswrapper[4854]: I1007 12:35:28.563794 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-972qh" event={"ID":"6c4c0dff-7632-4208-ad0e-475eb69bbc3b","Type":"ContainerDied","Data":"089f5d1cf5985d174a1b74c24feac01758cd10004d5ef269bc5a5a233eaa8232"} Oct 07 12:35:28 crc kubenswrapper[4854]: I1007 12:35:28.563822 4854 scope.go:117] "RemoveContainer" 
containerID="f93699bad30ab9b4213570ff13f662f7a74fa1a38cf616478d0940f9e5ddc952" Oct 07 12:35:28 crc kubenswrapper[4854]: I1007 12:35:28.563953 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-972qh" Oct 07 12:35:28 crc kubenswrapper[4854]: I1007 12:35:28.569105 4854 generic.go:334] "Generic (PLEG): container finished" podID="753a1abe-8f65-4721-ad3f-b207e3413ffa" containerID="0841c829e516db789b99eb47c1714434662b71c48a0c67251407ed29307c248e" exitCode=0 Oct 07 12:35:28 crc kubenswrapper[4854]: I1007 12:35:28.569230 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-6s458" event={"ID":"753a1abe-8f65-4721-ad3f-b207e3413ffa","Type":"ContainerDied","Data":"0841c829e516db789b99eb47c1714434662b71c48a0c67251407ed29307c248e"} Oct 07 12:35:28 crc kubenswrapper[4854]: I1007 12:35:28.569314 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-6s458" event={"ID":"753a1abe-8f65-4721-ad3f-b207e3413ffa","Type":"ContainerDied","Data":"0cb4a09de0bcf16e9ed65dfb0112334da7e29bb8f18795347b3c98e9471dcfe8"} Oct 07 12:35:28 crc kubenswrapper[4854]: I1007 12:35:28.569512 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-6s458" Oct 07 12:35:28 crc kubenswrapper[4854]: I1007 12:35:28.597841 4854 scope.go:117] "RemoveContainer" containerID="f93699bad30ab9b4213570ff13f662f7a74fa1a38cf616478d0940f9e5ddc952" Oct 07 12:35:28 crc kubenswrapper[4854]: E1007 12:35:28.599983 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f93699bad30ab9b4213570ff13f662f7a74fa1a38cf616478d0940f9e5ddc952\": container with ID starting with f93699bad30ab9b4213570ff13f662f7a74fa1a38cf616478d0940f9e5ddc952 not found: ID does not exist" containerID="f93699bad30ab9b4213570ff13f662f7a74fa1a38cf616478d0940f9e5ddc952" Oct 07 12:35:28 crc kubenswrapper[4854]: I1007 12:35:28.600037 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f93699bad30ab9b4213570ff13f662f7a74fa1a38cf616478d0940f9e5ddc952"} err="failed to get container status \"f93699bad30ab9b4213570ff13f662f7a74fa1a38cf616478d0940f9e5ddc952\": rpc error: code = NotFound desc = could not find container \"f93699bad30ab9b4213570ff13f662f7a74fa1a38cf616478d0940f9e5ddc952\": container with ID starting with f93699bad30ab9b4213570ff13f662f7a74fa1a38cf616478d0940f9e5ddc952 not found: ID does not exist" Oct 07 12:35:28 crc kubenswrapper[4854]: I1007 12:35:28.600241 4854 scope.go:117] "RemoveContainer" containerID="0841c829e516db789b99eb47c1714434662b71c48a0c67251407ed29307c248e" Oct 07 12:35:28 crc kubenswrapper[4854]: I1007 12:35:28.601264 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-972qh"] Oct 07 12:35:28 crc kubenswrapper[4854]: I1007 12:35:28.605105 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-972qh"] Oct 07 12:35:28 crc kubenswrapper[4854]: I1007 12:35:28.612970 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6s458"] Oct 07 12:35:28 crc kubenswrapper[4854]: I1007 12:35:28.619169 4854 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6s458"] Oct 07 12:35:28 crc kubenswrapper[4854]: I1007 12:35:28.627210 4854 scope.go:117] "RemoveContainer" containerID="0841c829e516db789b99eb47c1714434662b71c48a0c67251407ed29307c248e" Oct 07 12:35:28 crc kubenswrapper[4854]: E1007 12:35:28.627905 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0841c829e516db789b99eb47c1714434662b71c48a0c67251407ed29307c248e\": container with ID starting with 0841c829e516db789b99eb47c1714434662b71c48a0c67251407ed29307c248e not found: ID does not exist" containerID="0841c829e516db789b99eb47c1714434662b71c48a0c67251407ed29307c248e" Oct 07 12:35:28 crc kubenswrapper[4854]: I1007 12:35:28.628063 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0841c829e516db789b99eb47c1714434662b71c48a0c67251407ed29307c248e"} err="failed to get container status \"0841c829e516db789b99eb47c1714434662b71c48a0c67251407ed29307c248e\": rpc error: code = NotFound desc = could not find container \"0841c829e516db789b99eb47c1714434662b71c48a0c67251407ed29307c248e\": container with ID starting with 0841c829e516db789b99eb47c1714434662b71c48a0c67251407ed29307c248e not found: ID does not exist" Oct 07 12:35:28 crc kubenswrapper[4854]: I1007 12:35:28.711291 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c4c0dff-7632-4208-ad0e-475eb69bbc3b" path="/var/lib/kubelet/pods/6c4c0dff-7632-4208-ad0e-475eb69bbc3b/volumes" Oct 07 12:35:28 crc kubenswrapper[4854]: I1007 12:35:28.714573 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="753a1abe-8f65-4721-ad3f-b207e3413ffa" path="/var/lib/kubelet/pods/753a1abe-8f65-4721-ad3f-b207e3413ffa/volumes" Oct 07 12:35:29 crc kubenswrapper[4854]: I1007 12:35:29.767808 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-67b4c8fc49-98s5z"] Oct 07 12:35:29 crc kubenswrapper[4854]: E1007 12:35:29.768093 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="753a1abe-8f65-4721-ad3f-b207e3413ffa" containerName="controller-manager" Oct 07 12:35:29 crc kubenswrapper[4854]: I1007 12:35:29.768112 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="753a1abe-8f65-4721-ad3f-b207e3413ffa" containerName="controller-manager" Oct 07 12:35:29 crc kubenswrapper[4854]: E1007 12:35:29.768133 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c4c0dff-7632-4208-ad0e-475eb69bbc3b" containerName="route-controller-manager" Oct 07 12:35:29 crc kubenswrapper[4854]: I1007 12:35:29.768141 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c4c0dff-7632-4208-ad0e-475eb69bbc3b" containerName="route-controller-manager" Oct 07 12:35:29 crc kubenswrapper[4854]: I1007 12:35:29.768338 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="753a1abe-8f65-4721-ad3f-b207e3413ffa" containerName="controller-manager" Oct 07 12:35:29 crc kubenswrapper[4854]: I1007 12:35:29.768366 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c4c0dff-7632-4208-ad0e-475eb69bbc3b" containerName="route-controller-manager" Oct 07 12:35:29 crc kubenswrapper[4854]: I1007 12:35:29.769060 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-67b4c8fc49-98s5z" Oct 07 12:35:29 crc kubenswrapper[4854]: I1007 12:35:29.778804 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66c6744bc-kb8bp"] Oct 07 12:35:29 crc kubenswrapper[4854]: I1007 12:35:29.781900 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-66c6744bc-kb8bp" Oct 07 12:35:29 crc kubenswrapper[4854]: I1007 12:35:29.783009 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 07 12:35:29 crc kubenswrapper[4854]: I1007 12:35:29.783097 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 07 12:35:29 crc kubenswrapper[4854]: I1007 12:35:29.784413 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 07 12:35:29 crc kubenswrapper[4854]: I1007 12:35:29.785049 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 07 12:35:29 crc kubenswrapper[4854]: I1007 12:35:29.785266 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 07 12:35:29 crc kubenswrapper[4854]: I1007 12:35:29.785470 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 07 12:35:29 crc kubenswrapper[4854]: I1007 12:35:29.785547 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 07 12:35:29 crc kubenswrapper[4854]: I1007 12:35:29.785618 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 07 12:35:29 crc kubenswrapper[4854]: I1007 12:35:29.785875 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 07 12:35:29 crc kubenswrapper[4854]: I1007 12:35:29.785965 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 07 12:35:29 crc kubenswrapper[4854]: I1007 12:35:29.787304 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 07 12:35:29 crc kubenswrapper[4854]: I1007 12:35:29.787307 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 07 12:35:29 crc kubenswrapper[4854]: I1007 12:35:29.797106 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 07 12:35:29 crc kubenswrapper[4854]: I1007 12:35:29.797481 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-67b4c8fc49-98s5z"] Oct 07 12:35:29 crc kubenswrapper[4854]: I1007 12:35:29.803843 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66c6744bc-kb8bp"] Oct 07 12:35:29 crc kubenswrapper[4854]: I1007 12:35:29.942190 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7f0a1091-437a-4c0b-b042-3387ed09cdf9-serving-cert\") pod \"route-controller-manager-66c6744bc-kb8bp\" (UID: \"7f0a1091-437a-4c0b-b042-3387ed09cdf9\") " pod="openshift-route-controller-manager/route-controller-manager-66c6744bc-kb8bp" Oct 07 12:35:29 crc kubenswrapper[4854]: I1007 12:35:29.942274 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7f0a1091-437a-4c0b-b042-3387ed09cdf9-client-ca\") pod \"route-controller-manager-66c6744bc-kb8bp\" (UID: \"7f0a1091-437a-4c0b-b042-3387ed09cdf9\") " pod="openshift-route-controller-manager/route-controller-manager-66c6744bc-kb8bp" Oct 07 12:35:29 crc kubenswrapper[4854]: I1007 12:35:29.942316 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f0a1091-437a-4c0b-b042-3387ed09cdf9-config\") pod \"route-controller-manager-66c6744bc-kb8bp\" (UID: \"7f0a1091-437a-4c0b-b042-3387ed09cdf9\") " pod="openshift-route-controller-manager/route-controller-manager-66c6744bc-kb8bp" Oct 07 12:35:29 crc kubenswrapper[4854]: I1007 12:35:29.942340 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7526\" (UniqueName: \"kubernetes.io/projected/673c737b-85f0-4b85-9d9c-3f98b3a40596-kube-api-access-v7526\") pod \"controller-manager-67b4c8fc49-98s5z\" (UID: \"673c737b-85f0-4b85-9d9c-3f98b3a40596\") " pod="openshift-controller-manager/controller-manager-67b4c8fc49-98s5z" Oct 07 12:35:29 crc kubenswrapper[4854]: I1007 12:35:29.942366 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/673c737b-85f0-4b85-9d9c-3f98b3a40596-serving-cert\") pod \"controller-manager-67b4c8fc49-98s5z\" (UID: \"673c737b-85f0-4b85-9d9c-3f98b3a40596\") " pod="openshift-controller-manager/controller-manager-67b4c8fc49-98s5z" Oct 07 12:35:29 crc kubenswrapper[4854]: I1007 12:35:29.942384 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/673c737b-85f0-4b85-9d9c-3f98b3a40596-client-ca\") pod \"controller-manager-67b4c8fc49-98s5z\" (UID: \"673c737b-85f0-4b85-9d9c-3f98b3a40596\") " pod="openshift-controller-manager/controller-manager-67b4c8fc49-98s5z" Oct 07 12:35:29 crc kubenswrapper[4854]: I1007 12:35:29.942452 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bj2kt\" (UniqueName: \"kubernetes.io/projected/7f0a1091-437a-4c0b-b042-3387ed09cdf9-kube-api-access-bj2kt\") pod \"route-controller-manager-66c6744bc-kb8bp\" (UID: \"7f0a1091-437a-4c0b-b042-3387ed09cdf9\") " pod="openshift-route-controller-manager/route-controller-manager-66c6744bc-kb8bp" Oct 07 12:35:29 crc kubenswrapper[4854]: I1007 12:35:29.942548 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/673c737b-85f0-4b85-9d9c-3f98b3a40596-proxy-ca-bundles\") pod \"controller-manager-67b4c8fc49-98s5z\" (UID: \"673c737b-85f0-4b85-9d9c-3f98b3a40596\") " pod="openshift-controller-manager/controller-manager-67b4c8fc49-98s5z" Oct 07 12:35:29 crc kubenswrapper[4854]: I1007 12:35:29.942631 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/673c737b-85f0-4b85-9d9c-3f98b3a40596-config\") pod \"controller-manager-67b4c8fc49-98s5z\" (UID: \"673c737b-85f0-4b85-9d9c-3f98b3a40596\") " pod="openshift-controller-manager/controller-manager-67b4c8fc49-98s5z" Oct 07 12:35:30 crc kubenswrapper[4854]: I1007 12:35:30.043548 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/673c737b-85f0-4b85-9d9c-3f98b3a40596-config\") pod \"controller-manager-67b4c8fc49-98s5z\" (UID: \"673c737b-85f0-4b85-9d9c-3f98b3a40596\") " pod="openshift-controller-manager/controller-manager-67b4c8fc49-98s5z" Oct 07 12:35:30 crc kubenswrapper[4854]: I1007 12:35:30.044060 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f0a1091-437a-4c0b-b042-3387ed09cdf9-serving-cert\") pod \"route-controller-manager-66c6744bc-kb8bp\" (UID: \"7f0a1091-437a-4c0b-b042-3387ed09cdf9\") " pod="openshift-route-controller-manager/route-controller-manager-66c6744bc-kb8bp" Oct 07 12:35:30 crc kubenswrapper[4854]: I1007 12:35:30.044190 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7f0a1091-437a-4c0b-b042-3387ed09cdf9-client-ca\") pod \"route-controller-manager-66c6744bc-kb8bp\" (UID: \"7f0a1091-437a-4c0b-b042-3387ed09cdf9\") " pod="openshift-route-controller-manager/route-controller-manager-66c6744bc-kb8bp" Oct 07 12:35:30 crc kubenswrapper[4854]: I1007 12:35:30.044302 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f0a1091-437a-4c0b-b042-3387ed09cdf9-config\") pod \"route-controller-manager-66c6744bc-kb8bp\" (UID: \"7f0a1091-437a-4c0b-b042-3387ed09cdf9\") " pod="openshift-route-controller-manager/route-controller-manager-66c6744bc-kb8bp" Oct 07 12:35:30 crc kubenswrapper[4854]: I1007 12:35:30.044410 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7526\" (UniqueName: \"kubernetes.io/projected/673c737b-85f0-4b85-9d9c-3f98b3a40596-kube-api-access-v7526\") pod \"controller-manager-67b4c8fc49-98s5z\" (UID: \"673c737b-85f0-4b85-9d9c-3f98b3a40596\") " pod="openshift-controller-manager/controller-manager-67b4c8fc49-98s5z" Oct 07 12:35:30 crc kubenswrapper[4854]: I1007 12:35:30.044530 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/673c737b-85f0-4b85-9d9c-3f98b3a40596-serving-cert\") pod \"controller-manager-67b4c8fc49-98s5z\" (UID: \"673c737b-85f0-4b85-9d9c-3f98b3a40596\") " pod="openshift-controller-manager/controller-manager-67b4c8fc49-98s5z" Oct 07 12:35:30 crc kubenswrapper[4854]: I1007 12:35:30.044618 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/673c737b-85f0-4b85-9d9c-3f98b3a40596-client-ca\") pod \"controller-manager-67b4c8fc49-98s5z\" (UID: \"673c737b-85f0-4b85-9d9c-3f98b3a40596\") " pod="openshift-controller-manager/controller-manager-67b4c8fc49-98s5z" Oct 07 12:35:30 crc kubenswrapper[4854]: I1007 12:35:30.044704 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bj2kt\" (UniqueName: \"kubernetes.io/projected/7f0a1091-437a-4c0b-b042-3387ed09cdf9-kube-api-access-bj2kt\") pod \"route-controller-manager-66c6744bc-kb8bp\" (UID: 
\"7f0a1091-437a-4c0b-b042-3387ed09cdf9\") " pod="openshift-route-controller-manager/route-controller-manager-66c6744bc-kb8bp" Oct 07 12:35:30 crc kubenswrapper[4854]: I1007 12:35:30.044847 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/673c737b-85f0-4b85-9d9c-3f98b3a40596-proxy-ca-bundles\") pod \"controller-manager-67b4c8fc49-98s5z\" (UID: \"673c737b-85f0-4b85-9d9c-3f98b3a40596\") " pod="openshift-controller-manager/controller-manager-67b4c8fc49-98s5z" Oct 07 12:35:30 crc kubenswrapper[4854]: I1007 12:35:30.045469 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/673c737b-85f0-4b85-9d9c-3f98b3a40596-config\") pod \"controller-manager-67b4c8fc49-98s5z\" (UID: \"673c737b-85f0-4b85-9d9c-3f98b3a40596\") " pod="openshift-controller-manager/controller-manager-67b4c8fc49-98s5z" Oct 07 12:35:30 crc kubenswrapper[4854]: I1007 12:35:30.045661 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/673c737b-85f0-4b85-9d9c-3f98b3a40596-client-ca\") pod \"controller-manager-67b4c8fc49-98s5z\" (UID: \"673c737b-85f0-4b85-9d9c-3f98b3a40596\") " pod="openshift-controller-manager/controller-manager-67b4c8fc49-98s5z" Oct 07 12:35:30 crc kubenswrapper[4854]: I1007 12:35:30.045731 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7f0a1091-437a-4c0b-b042-3387ed09cdf9-client-ca\") pod \"route-controller-manager-66c6744bc-kb8bp\" (UID: \"7f0a1091-437a-4c0b-b042-3387ed09cdf9\") " pod="openshift-route-controller-manager/route-controller-manager-66c6744bc-kb8bp" Oct 07 12:35:30 crc kubenswrapper[4854]: I1007 12:35:30.046109 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f0a1091-437a-4c0b-b042-3387ed09cdf9-config\") pod \"route-controller-manager-66c6744bc-kb8bp\" (UID: \"7f0a1091-437a-4c0b-b042-3387ed09cdf9\") " pod="openshift-route-controller-manager/route-controller-manager-66c6744bc-kb8bp" Oct 07 12:35:30 crc kubenswrapper[4854]: I1007 12:35:30.046436 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/673c737b-85f0-4b85-9d9c-3f98b3a40596-proxy-ca-bundles\") pod \"controller-manager-67b4c8fc49-98s5z\" (UID: \"673c737b-85f0-4b85-9d9c-3f98b3a40596\") " pod="openshift-controller-manager/controller-manager-67b4c8fc49-98s5z" Oct 07 12:35:30 crc kubenswrapper[4854]: I1007 12:35:30.052468 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/673c737b-85f0-4b85-9d9c-3f98b3a40596-serving-cert\") pod \"controller-manager-67b4c8fc49-98s5z\" (UID: \"673c737b-85f0-4b85-9d9c-3f98b3a40596\") " pod="openshift-controller-manager/controller-manager-67b4c8fc49-98s5z" Oct 07 12:35:30 crc kubenswrapper[4854]: I1007 12:35:30.059609 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f0a1091-437a-4c0b-b042-3387ed09cdf9-serving-cert\") pod \"route-controller-manager-66c6744bc-kb8bp\" (UID: \"7f0a1091-437a-4c0b-b042-3387ed09cdf9\") " pod="openshift-route-controller-manager/route-controller-manager-66c6744bc-kb8bp" Oct 07 12:35:30 crc kubenswrapper[4854]: I1007 12:35:30.063534 4854 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-bj2kt\" (UniqueName: \"kubernetes.io/projected/7f0a1091-437a-4c0b-b042-3387ed09cdf9-kube-api-access-bj2kt\") pod \"route-controller-manager-66c6744bc-kb8bp\" (UID: \"7f0a1091-437a-4c0b-b042-3387ed09cdf9\") " pod="openshift-route-controller-manager/route-controller-manager-66c6744bc-kb8bp" Oct 07 12:35:30 crc kubenswrapper[4854]: I1007 12:35:30.070326 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7526\" (UniqueName: \"kubernetes.io/projected/673c737b-85f0-4b85-9d9c-3f98b3a40596-kube-api-access-v7526\") pod \"controller-manager-67b4c8fc49-98s5z\" (UID: \"673c737b-85f0-4b85-9d9c-3f98b3a40596\") " pod="openshift-controller-manager/controller-manager-67b4c8fc49-98s5z" Oct 07 12:35:30 crc kubenswrapper[4854]: I1007 12:35:30.102580 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-67b4c8fc49-98s5z" Oct 07 12:35:30 crc kubenswrapper[4854]: I1007 12:35:30.117848 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-66c6744bc-kb8bp" Oct 07 12:35:30 crc kubenswrapper[4854]: I1007 12:35:30.338086 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-67b4c8fc49-98s5z"] Oct 07 12:35:30 crc kubenswrapper[4854]: I1007 12:35:30.373358 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66c6744bc-kb8bp"] Oct 07 12:35:30 crc kubenswrapper[4854]: W1007 12:35:30.379844 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f0a1091_437a_4c0b_b042_3387ed09cdf9.slice/crio-46679ea5e43defc1f50923bcb097ba379c321d77ff45b3944521face474a107e WatchSource:0}: Error finding container 46679ea5e43defc1f50923bcb097ba379c321d77ff45b3944521face474a107e: Status 404 returned error can't find the container with id 46679ea5e43defc1f50923bcb097ba379c321d77ff45b3944521face474a107e Oct 07 12:35:30 crc kubenswrapper[4854]: I1007 12:35:30.590821 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-66c6744bc-kb8bp" event={"ID":"7f0a1091-437a-4c0b-b042-3387ed09cdf9","Type":"ContainerStarted","Data":"be8ba2a9adaef4739d4d1624a02d042371e5865d22e84624a4a795ad89da93a5"} Oct 07 12:35:30 crc kubenswrapper[4854]: I1007 12:35:30.590890 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-66c6744bc-kb8bp" event={"ID":"7f0a1091-437a-4c0b-b042-3387ed09cdf9","Type":"ContainerStarted","Data":"46679ea5e43defc1f50923bcb097ba379c321d77ff45b3944521face474a107e"} Oct 07 12:35:30 crc kubenswrapper[4854]: I1007 12:35:30.592494 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-66c6744bc-kb8bp" Oct 07 12:35:30 crc kubenswrapper[4854]: I1007 12:35:30.597274 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-67b4c8fc49-98s5z" event={"ID":"673c737b-85f0-4b85-9d9c-3f98b3a40596","Type":"ContainerStarted","Data":"f2c397f38cdf9df98b297845085cd26255bea5090702a19965a3f5e3a0a51d3b"} Oct 07 12:35:30 crc kubenswrapper[4854]: I1007 12:35:30.597373 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-67b4c8fc49-98s5z" event={"ID":"673c737b-85f0-4b85-9d9c-3f98b3a40596","Type":"ContainerStarted","Data":"bd3f9674dc86d042d2543612250c0b9049b72637fc15cad0aa757c18e1239237"} Oct 07 12:35:30 crc kubenswrapper[4854]: I1007 12:35:30.597837 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-67b4c8fc49-98s5z" Oct 07 12:35:30 crc kubenswrapper[4854]: I1007 12:35:30.603016 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-67b4c8fc49-98s5z" Oct 07 12:35:30 crc kubenswrapper[4854]: I1007 12:35:30.626304 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-66c6744bc-kb8bp" podStartSLOduration=2.626268556 podStartE2EDuration="2.626268556s" podCreationTimestamp="2025-10-07 12:35:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:35:30.617924832 +0000 UTC m=+646.605757087" watchObservedRunningTime="2025-10-07 12:35:30.626268556 +0000 UTC m=+646.614100821" Oct 07 12:35:30 crc kubenswrapper[4854]: I1007 12:35:30.643377 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-67b4c8fc49-98s5z" podStartSLOduration=3.643345006 podStartE2EDuration="3.643345006s" podCreationTimestamp="2025-10-07 12:35:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:35:30.635091194 +0000 UTC m=+646.622923449" watchObservedRunningTime="2025-10-07 12:35:30.643345006 +0000 UTC m=+646.631177261" Oct 07 12:35:31 crc kubenswrapper[4854]: I1007 12:35:31.301115 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-66c6744bc-kb8bp" Oct 07 12:35:34 crc kubenswrapper[4854]: I1007 12:35:34.318647 4854 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 07 12:35:40 crc kubenswrapper[4854]: I1007 12:35:40.443627 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bnct6"] Oct 07 12:35:40 crc kubenswrapper[4854]: I1007 12:35:40.446246 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bnct6" Oct 07 12:35:40 crc kubenswrapper[4854]: I1007 12:35:40.448483 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 07 12:35:40 crc kubenswrapper[4854]: I1007 12:35:40.467259 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bnct6"] Oct 07 12:35:40 crc kubenswrapper[4854]: I1007 12:35:40.546373 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ee4688b0-4708-49b7-8e15-a18a343a9a98-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bnct6\" (UID: \"ee4688b0-4708-49b7-8e15-a18a343a9a98\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bnct6" Oct 07 12:35:40 crc kubenswrapper[4854]: I1007 12:35:40.546429 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5wcc\" (UniqueName: \"kubernetes.io/projected/ee4688b0-4708-49b7-8e15-a18a343a9a98-kube-api-access-b5wcc\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bnct6\" (UID: \"ee4688b0-4708-49b7-8e15-a18a343a9a98\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bnct6" Oct 07 12:35:40 crc kubenswrapper[4854]: I1007 12:35:40.546666 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ee4688b0-4708-49b7-8e15-a18a343a9a98-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bnct6\" (UID: \"ee4688b0-4708-49b7-8e15-a18a343a9a98\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bnct6" Oct 07 12:35:40 crc kubenswrapper[4854]: I1007 12:35:40.591562 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-9cjsc" podUID="a5796307-74e5-4cb6-99db-b1ba95dacb54" containerName="console" containerID="cri-o://dc06d1a97fc67438f36a0185d1fc220345e6e9084964dad503012af7e5bb3a61" gracePeriod=15 Oct 07 12:35:40 crc kubenswrapper[4854]: I1007 12:35:40.647733 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ee4688b0-4708-49b7-8e15-a18a343a9a98-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bnct6\" (UID: \"ee4688b0-4708-49b7-8e15-a18a343a9a98\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bnct6" Oct 07 12:35:40 crc kubenswrapper[4854]: I1007 12:35:40.648075 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ee4688b0-4708-49b7-8e15-a18a343a9a98-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bnct6\" (UID: \"ee4688b0-4708-49b7-8e15-a18a343a9a98\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bnct6" Oct 07 12:35:40 crc kubenswrapper[4854]: I1007 12:35:40.648184 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5wcc\" (UniqueName: \"kubernetes.io/projected/ee4688b0-4708-49b7-8e15-a18a343a9a98-kube-api-access-b5wcc\") pod 
\"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bnct6\" (UID: \"ee4688b0-4708-49b7-8e15-a18a343a9a98\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bnct6" Oct 07 12:35:40 crc kubenswrapper[4854]: I1007 12:35:40.648516 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ee4688b0-4708-49b7-8e15-a18a343a9a98-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bnct6\" (UID: \"ee4688b0-4708-49b7-8e15-a18a343a9a98\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bnct6" Oct 07 12:35:40 crc kubenswrapper[4854]: I1007 12:35:40.648982 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ee4688b0-4708-49b7-8e15-a18a343a9a98-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bnct6\" (UID: \"ee4688b0-4708-49b7-8e15-a18a343a9a98\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bnct6" Oct 07 12:35:40 crc kubenswrapper[4854]: I1007 12:35:40.679573 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5wcc\" (UniqueName: \"kubernetes.io/projected/ee4688b0-4708-49b7-8e15-a18a343a9a98-kube-api-access-b5wcc\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bnct6\" (UID: \"ee4688b0-4708-49b7-8e15-a18a343a9a98\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bnct6" Oct 07 12:35:40 crc kubenswrapper[4854]: I1007 12:35:40.770274 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bnct6" Oct 07 12:35:40 crc kubenswrapper[4854]: I1007 12:35:40.980030 4854 patch_prober.go:28] interesting pod/console-f9d7485db-9cjsc container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.27:8443/health\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body= Oct 07 12:35:40 crc kubenswrapper[4854]: I1007 12:35:40.980125 4854 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-f9d7485db-9cjsc" podUID="a5796307-74e5-4cb6-99db-b1ba95dacb54" containerName="console" probeResult="failure" output="Get \"https://10.217.0.27:8443/health\": dial tcp 10.217.0.27:8443: connect: connection refused" Oct 07 12:35:41 crc kubenswrapper[4854]: I1007 12:35:41.192721 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bnct6"] Oct 07 12:35:41 crc kubenswrapper[4854]: I1007 12:35:41.229397 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-9cjsc_a5796307-74e5-4cb6-99db-b1ba95dacb54/console/0.log" Oct 07 12:35:41 crc kubenswrapper[4854]: I1007 12:35:41.229845 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-9cjsc" Oct 07 12:35:41 crc kubenswrapper[4854]: I1007 12:35:41.358801 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a5796307-74e5-4cb6-99db-b1ba95dacb54-console-config\") pod \"a5796307-74e5-4cb6-99db-b1ba95dacb54\" (UID: \"a5796307-74e5-4cb6-99db-b1ba95dacb54\") " Oct 07 12:35:41 crc kubenswrapper[4854]: I1007 12:35:41.358924 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtrt7\" (UniqueName: \"kubernetes.io/projected/a5796307-74e5-4cb6-99db-b1ba95dacb54-kube-api-access-xtrt7\") pod \"a5796307-74e5-4cb6-99db-b1ba95dacb54\" (UID: \"a5796307-74e5-4cb6-99db-b1ba95dacb54\") " Oct 07 12:35:41 crc kubenswrapper[4854]: I1007 12:35:41.358991 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a5796307-74e5-4cb6-99db-b1ba95dacb54-console-oauth-config\") pod \"a5796307-74e5-4cb6-99db-b1ba95dacb54\" (UID: \"a5796307-74e5-4cb6-99db-b1ba95dacb54\") " Oct 07 12:35:41 crc kubenswrapper[4854]: I1007 12:35:41.359016 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a5796307-74e5-4cb6-99db-b1ba95dacb54-service-ca\") pod \"a5796307-74e5-4cb6-99db-b1ba95dacb54\" (UID: \"a5796307-74e5-4cb6-99db-b1ba95dacb54\") " Oct 07 12:35:41 crc kubenswrapper[4854]: I1007 12:35:41.359107 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a5796307-74e5-4cb6-99db-b1ba95dacb54-console-serving-cert\") pod \"a5796307-74e5-4cb6-99db-b1ba95dacb54\" (UID: \"a5796307-74e5-4cb6-99db-b1ba95dacb54\") " Oct 07 12:35:41 crc kubenswrapper[4854]: I1007 12:35:41.359132 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a5796307-74e5-4cb6-99db-b1ba95dacb54-trusted-ca-bundle\") pod \"a5796307-74e5-4cb6-99db-b1ba95dacb54\" (UID: \"a5796307-74e5-4cb6-99db-b1ba95dacb54\") " Oct 07 12:35:41 crc kubenswrapper[4854]: I1007 12:35:41.359181 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a5796307-74e5-4cb6-99db-b1ba95dacb54-oauth-serving-cert\") pod \"a5796307-74e5-4cb6-99db-b1ba95dacb54\" (UID: \"a5796307-74e5-4cb6-99db-b1ba95dacb54\") " Oct 07 12:35:41 crc kubenswrapper[4854]: I1007 12:35:41.360478 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5796307-74e5-4cb6-99db-b1ba95dacb54-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "a5796307-74e5-4cb6-99db-b1ba95dacb54" (UID: "a5796307-74e5-4cb6-99db-b1ba95dacb54"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:35:41 crc kubenswrapper[4854]: I1007 12:35:41.360489 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5796307-74e5-4cb6-99db-b1ba95dacb54-service-ca" (OuterVolumeSpecName: "service-ca") pod "a5796307-74e5-4cb6-99db-b1ba95dacb54" (UID: "a5796307-74e5-4cb6-99db-b1ba95dacb54"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:35:41 crc kubenswrapper[4854]: I1007 12:35:41.360815 4854 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a5796307-74e5-4cb6-99db-b1ba95dacb54-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 12:35:41 crc kubenswrapper[4854]: I1007 12:35:41.360834 4854 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a5796307-74e5-4cb6-99db-b1ba95dacb54-service-ca\") on node \"crc\" DevicePath \"\"" Oct 07 12:35:41 crc kubenswrapper[4854]: I1007 12:35:41.361453 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5796307-74e5-4cb6-99db-b1ba95dacb54-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "a5796307-74e5-4cb6-99db-b1ba95dacb54" (UID: "a5796307-74e5-4cb6-99db-b1ba95dacb54"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:35:41 crc kubenswrapper[4854]: I1007 12:35:41.361481 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5796307-74e5-4cb6-99db-b1ba95dacb54-console-config" (OuterVolumeSpecName: "console-config") pod "a5796307-74e5-4cb6-99db-b1ba95dacb54" (UID: "a5796307-74e5-4cb6-99db-b1ba95dacb54"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:35:41 crc kubenswrapper[4854]: I1007 12:35:41.367205 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5796307-74e5-4cb6-99db-b1ba95dacb54-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "a5796307-74e5-4cb6-99db-b1ba95dacb54" (UID: "a5796307-74e5-4cb6-99db-b1ba95dacb54"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:35:41 crc kubenswrapper[4854]: I1007 12:35:41.367470 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5796307-74e5-4cb6-99db-b1ba95dacb54-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "a5796307-74e5-4cb6-99db-b1ba95dacb54" (UID: "a5796307-74e5-4cb6-99db-b1ba95dacb54"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:35:41 crc kubenswrapper[4854]: I1007 12:35:41.368943 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5796307-74e5-4cb6-99db-b1ba95dacb54-kube-api-access-xtrt7" (OuterVolumeSpecName: "kube-api-access-xtrt7") pod "a5796307-74e5-4cb6-99db-b1ba95dacb54" (UID: "a5796307-74e5-4cb6-99db-b1ba95dacb54"). InnerVolumeSpecName "kube-api-access-xtrt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:35:41 crc kubenswrapper[4854]: I1007 12:35:41.461786 4854 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a5796307-74e5-4cb6-99db-b1ba95dacb54-console-config\") on node \"crc\" DevicePath \"\"" Oct 07 12:35:41 crc kubenswrapper[4854]: I1007 12:35:41.461838 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xtrt7\" (UniqueName: \"kubernetes.io/projected/a5796307-74e5-4cb6-99db-b1ba95dacb54-kube-api-access-xtrt7\") on node \"crc\" DevicePath \"\"" Oct 07 12:35:41 crc kubenswrapper[4854]: I1007 12:35:41.461853 4854 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a5796307-74e5-4cb6-99db-b1ba95dacb54-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 07 12:35:41 crc kubenswrapper[4854]: I1007 12:35:41.461868 4854 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a5796307-74e5-4cb6-99db-b1ba95dacb54-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 12:35:41 crc kubenswrapper[4854]: I1007 12:35:41.461878 4854 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a5796307-74e5-4cb6-99db-b1ba95dacb54-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:35:41 crc kubenswrapper[4854]: I1007 12:35:41.687395 4854 generic.go:334] "Generic (PLEG): container finished" podID="ee4688b0-4708-49b7-8e15-a18a343a9a98" containerID="cd1b1f70042c13cf6de77101730d3f26ff7a4dcdf9fb2dfa94050128fc4ae966" exitCode=0 Oct 07 12:35:41 crc kubenswrapper[4854]: I1007 12:35:41.687578 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bnct6" event={"ID":"ee4688b0-4708-49b7-8e15-a18a343a9a98","Type":"ContainerDied","Data":"cd1b1f70042c13cf6de77101730d3f26ff7a4dcdf9fb2dfa94050128fc4ae966"} Oct 07 12:35:41 crc kubenswrapper[4854]: I1007 12:35:41.687629 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bnct6" event={"ID":"ee4688b0-4708-49b7-8e15-a18a343a9a98","Type":"ContainerStarted","Data":"0d8182621451aa854ad9b54d83c5de9e47c248a8b1dfe142c3c7d6f79739c678"} Oct 07 12:35:41 crc kubenswrapper[4854]: I1007 12:35:41.690670 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-9cjsc_a5796307-74e5-4cb6-99db-b1ba95dacb54/console/0.log" Oct 07 12:35:41 crc kubenswrapper[4854]: I1007 12:35:41.690757 4854 generic.go:334] "Generic (PLEG): container finished" podID="a5796307-74e5-4cb6-99db-b1ba95dacb54" containerID="dc06d1a97fc67438f36a0185d1fc220345e6e9084964dad503012af7e5bb3a61" exitCode=2 Oct 07 12:35:41 crc kubenswrapper[4854]: I1007 12:35:41.690845 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-9cjsc" event={"ID":"a5796307-74e5-4cb6-99db-b1ba95dacb54","Type":"ContainerDied","Data":"dc06d1a97fc67438f36a0185d1fc220345e6e9084964dad503012af7e5bb3a61"} Oct 07 12:35:41 crc kubenswrapper[4854]: I1007 12:35:41.690895 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-9cjsc" event={"ID":"a5796307-74e5-4cb6-99db-b1ba95dacb54","Type":"ContainerDied","Data":"a1aae0653c19084f5bed47cfc33a4b3081c9efd4c13c949be506d3b8e14500ed"} Oct 07 12:35:41 crc 
kubenswrapper[4854]: I1007 12:35:41.690926 4854 scope.go:117] "RemoveContainer" containerID="dc06d1a97fc67438f36a0185d1fc220345e6e9084964dad503012af7e5bb3a61" Oct 07 12:35:41 crc kubenswrapper[4854]: I1007 12:35:41.691112 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-9cjsc" Oct 07 12:35:41 crc kubenswrapper[4854]: I1007 12:35:41.721985 4854 scope.go:117] "RemoveContainer" containerID="dc06d1a97fc67438f36a0185d1fc220345e6e9084964dad503012af7e5bb3a61" Oct 07 12:35:41 crc kubenswrapper[4854]: E1007 12:35:41.723255 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc06d1a97fc67438f36a0185d1fc220345e6e9084964dad503012af7e5bb3a61\": container with ID starting with dc06d1a97fc67438f36a0185d1fc220345e6e9084964dad503012af7e5bb3a61 not found: ID does not exist" containerID="dc06d1a97fc67438f36a0185d1fc220345e6e9084964dad503012af7e5bb3a61" Oct 07 12:35:41 crc kubenswrapper[4854]: I1007 12:35:41.723404 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc06d1a97fc67438f36a0185d1fc220345e6e9084964dad503012af7e5bb3a61"} err="failed to get container status \"dc06d1a97fc67438f36a0185d1fc220345e6e9084964dad503012af7e5bb3a61\": rpc error: code = NotFound desc = could not find container \"dc06d1a97fc67438f36a0185d1fc220345e6e9084964dad503012af7e5bb3a61\": container with ID starting with dc06d1a97fc67438f36a0185d1fc220345e6e9084964dad503012af7e5bb3a61 not found: ID does not exist" Oct 07 12:35:41 crc kubenswrapper[4854]: I1007 12:35:41.743312 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-9cjsc"] Oct 07 12:35:41 crc kubenswrapper[4854]: I1007 12:35:41.749232 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-9cjsc"] Oct 07 12:35:42 crc kubenswrapper[4854]: I1007 12:35:42.716704 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5796307-74e5-4cb6-99db-b1ba95dacb54" path="/var/lib/kubelet/pods/a5796307-74e5-4cb6-99db-b1ba95dacb54/volumes" Oct 07 12:35:42 crc kubenswrapper[4854]: I1007 12:35:42.806868 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tvnbf"] Oct 07 12:35:42 crc kubenswrapper[4854]: E1007 12:35:42.807547 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5796307-74e5-4cb6-99db-b1ba95dacb54" containerName="console" Oct 07 12:35:42 crc kubenswrapper[4854]: I1007 12:35:42.807586 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5796307-74e5-4cb6-99db-b1ba95dacb54" containerName="console" Oct 07 12:35:42 crc kubenswrapper[4854]: I1007 12:35:42.807857 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5796307-74e5-4cb6-99db-b1ba95dacb54" containerName="console" Oct 07 12:35:42 crc kubenswrapper[4854]: I1007 12:35:42.809445 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tvnbf" Oct 07 12:35:42 crc kubenswrapper[4854]: I1007 12:35:42.817753 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tvnbf"] Oct 07 12:35:42 crc kubenswrapper[4854]: I1007 12:35:42.997178 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8cc9e0a4-e77e-4160-bf58-09897c8543a3-utilities\") pod \"redhat-operators-tvnbf\" (UID: \"8cc9e0a4-e77e-4160-bf58-09897c8543a3\") " pod="openshift-marketplace/redhat-operators-tvnbf" Oct 07 12:35:42 crc kubenswrapper[4854]: I1007 12:35:42.997262 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8cc9e0a4-e77e-4160-bf58-09897c8543a3-catalog-content\") pod \"redhat-operators-tvnbf\" (UID: \"8cc9e0a4-e77e-4160-bf58-09897c8543a3\") " pod="openshift-marketplace/redhat-operators-tvnbf" Oct 07 12:35:42 crc kubenswrapper[4854]: I1007 12:35:42.997301 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ht8ss\" (UniqueName: \"kubernetes.io/projected/8cc9e0a4-e77e-4160-bf58-09897c8543a3-kube-api-access-ht8ss\") pod \"redhat-operators-tvnbf\" (UID: \"8cc9e0a4-e77e-4160-bf58-09897c8543a3\") " pod="openshift-marketplace/redhat-operators-tvnbf" Oct 07 12:35:43 crc kubenswrapper[4854]: I1007 12:35:43.098211 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ht8ss\" (UniqueName: \"kubernetes.io/projected/8cc9e0a4-e77e-4160-bf58-09897c8543a3-kube-api-access-ht8ss\") pod \"redhat-operators-tvnbf\" (UID: \"8cc9e0a4-e77e-4160-bf58-09897c8543a3\") " pod="openshift-marketplace/redhat-operators-tvnbf" Oct 07 12:35:43 crc kubenswrapper[4854]: I1007 12:35:43.098340 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8cc9e0a4-e77e-4160-bf58-09897c8543a3-utilities\") pod \"redhat-operators-tvnbf\" (UID: \"8cc9e0a4-e77e-4160-bf58-09897c8543a3\") " pod="openshift-marketplace/redhat-operators-tvnbf" Oct 07 12:35:43 crc kubenswrapper[4854]: I1007 12:35:43.098902 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8cc9e0a4-e77e-4160-bf58-09897c8543a3-utilities\") pod \"redhat-operators-tvnbf\" (UID: \"8cc9e0a4-e77e-4160-bf58-09897c8543a3\") " pod="openshift-marketplace/redhat-operators-tvnbf" Oct 07 12:35:43 crc kubenswrapper[4854]: I1007 12:35:43.098975 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8cc9e0a4-e77e-4160-bf58-09897c8543a3-catalog-content\") pod \"redhat-operators-tvnbf\" (UID: \"8cc9e0a4-e77e-4160-bf58-09897c8543a3\") " pod="openshift-marketplace/redhat-operators-tvnbf" Oct 07 12:35:43 crc kubenswrapper[4854]: I1007 12:35:43.099293 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8cc9e0a4-e77e-4160-bf58-09897c8543a3-catalog-content\") pod \"redhat-operators-tvnbf\" (UID: \"8cc9e0a4-e77e-4160-bf58-09897c8543a3\") " pod="openshift-marketplace/redhat-operators-tvnbf" Oct 07 12:35:43 crc kubenswrapper[4854]: I1007 12:35:43.117088 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-ht8ss\" (UniqueName: \"kubernetes.io/projected/8cc9e0a4-e77e-4160-bf58-09897c8543a3-kube-api-access-ht8ss\") pod \"redhat-operators-tvnbf\" (UID: \"8cc9e0a4-e77e-4160-bf58-09897c8543a3\") " pod="openshift-marketplace/redhat-operators-tvnbf" Oct 07 12:35:43 crc kubenswrapper[4854]: I1007 12:35:43.179497 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tvnbf" Oct 07 12:35:43 crc kubenswrapper[4854]: I1007 12:35:43.470033 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tvnbf"] Oct 07 12:35:43 crc kubenswrapper[4854]: W1007 12:35:43.480314 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8cc9e0a4_e77e_4160_bf58_09897c8543a3.slice/crio-8828d7e454bcbc845e107b85c9ef1a1fff58a19dc16deecdaa8b72cb11f42c06 WatchSource:0}: Error finding container 8828d7e454bcbc845e107b85c9ef1a1fff58a19dc16deecdaa8b72cb11f42c06: Status 404 returned error can't find the container with id 8828d7e454bcbc845e107b85c9ef1a1fff58a19dc16deecdaa8b72cb11f42c06 Oct 07 12:35:43 crc kubenswrapper[4854]: I1007 12:35:43.707997 4854 generic.go:334] "Generic (PLEG): container finished" podID="ee4688b0-4708-49b7-8e15-a18a343a9a98" containerID="4a322a8240c0f8e1d6c3504577239195c2f7e7874dc7dd2370a12f1ea6229265" exitCode=0 Oct 07 12:35:43 crc kubenswrapper[4854]: I1007 12:35:43.708101 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bnct6" event={"ID":"ee4688b0-4708-49b7-8e15-a18a343a9a98","Type":"ContainerDied","Data":"4a322a8240c0f8e1d6c3504577239195c2f7e7874dc7dd2370a12f1ea6229265"} Oct 07 12:35:43 crc kubenswrapper[4854]: I1007 12:35:43.710348 4854 generic.go:334] "Generic (PLEG): container finished" podID="8cc9e0a4-e77e-4160-bf58-09897c8543a3" containerID="dc6384b296849bac6716aace0ccdb0f4ef7d1b9e5f1fb2d6b95e30b849e0158c" exitCode=0 Oct 07 12:35:43 crc kubenswrapper[4854]: I1007 12:35:43.710390 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tvnbf" event={"ID":"8cc9e0a4-e77e-4160-bf58-09897c8543a3","Type":"ContainerDied","Data":"dc6384b296849bac6716aace0ccdb0f4ef7d1b9e5f1fb2d6b95e30b849e0158c"} Oct 07 12:35:43 crc kubenswrapper[4854]: I1007 12:35:43.710408 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tvnbf" event={"ID":"8cc9e0a4-e77e-4160-bf58-09897c8543a3","Type":"ContainerStarted","Data":"8828d7e454bcbc845e107b85c9ef1a1fff58a19dc16deecdaa8b72cb11f42c06"} Oct 07 12:35:44 crc kubenswrapper[4854]: I1007 12:35:44.721414 4854 generic.go:334] "Generic (PLEG): container finished" podID="ee4688b0-4708-49b7-8e15-a18a343a9a98" containerID="cbd65a7aefeecf5d7e459e1d34ca01c24e88a1dce22e29e515787481d9c6944e" exitCode=0 Oct 07 12:35:44 crc kubenswrapper[4854]: I1007 12:35:44.721510 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bnct6" event={"ID":"ee4688b0-4708-49b7-8e15-a18a343a9a98","Type":"ContainerDied","Data":"cbd65a7aefeecf5d7e459e1d34ca01c24e88a1dce22e29e515787481d9c6944e"} Oct 07 12:35:46 crc kubenswrapper[4854]: I1007 12:35:46.094126 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bnct6" Oct 07 12:35:46 crc kubenswrapper[4854]: I1007 12:35:46.155607 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ee4688b0-4708-49b7-8e15-a18a343a9a98-util\") pod \"ee4688b0-4708-49b7-8e15-a18a343a9a98\" (UID: \"ee4688b0-4708-49b7-8e15-a18a343a9a98\") " Oct 07 12:35:46 crc kubenswrapper[4854]: I1007 12:35:46.155758 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ee4688b0-4708-49b7-8e15-a18a343a9a98-bundle\") pod \"ee4688b0-4708-49b7-8e15-a18a343a9a98\" (UID: \"ee4688b0-4708-49b7-8e15-a18a343a9a98\") " Oct 07 12:35:46 crc kubenswrapper[4854]: I1007 12:35:46.155789 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5wcc\" (UniqueName: \"kubernetes.io/projected/ee4688b0-4708-49b7-8e15-a18a343a9a98-kube-api-access-b5wcc\") pod \"ee4688b0-4708-49b7-8e15-a18a343a9a98\" (UID: \"ee4688b0-4708-49b7-8e15-a18a343a9a98\") " Oct 07 12:35:46 crc kubenswrapper[4854]: I1007 12:35:46.157321 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee4688b0-4708-49b7-8e15-a18a343a9a98-bundle" (OuterVolumeSpecName: "bundle") pod "ee4688b0-4708-49b7-8e15-a18a343a9a98" (UID: "ee4688b0-4708-49b7-8e15-a18a343a9a98"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:35:46 crc kubenswrapper[4854]: I1007 12:35:46.169127 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee4688b0-4708-49b7-8e15-a18a343a9a98-kube-api-access-b5wcc" (OuterVolumeSpecName: "kube-api-access-b5wcc") pod "ee4688b0-4708-49b7-8e15-a18a343a9a98" (UID: "ee4688b0-4708-49b7-8e15-a18a343a9a98"). InnerVolumeSpecName "kube-api-access-b5wcc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:35:46 crc kubenswrapper[4854]: I1007 12:35:46.172744 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee4688b0-4708-49b7-8e15-a18a343a9a98-util" (OuterVolumeSpecName: "util") pod "ee4688b0-4708-49b7-8e15-a18a343a9a98" (UID: "ee4688b0-4708-49b7-8e15-a18a343a9a98"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:35:46 crc kubenswrapper[4854]: I1007 12:35:46.258731 4854 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ee4688b0-4708-49b7-8e15-a18a343a9a98-util\") on node \"crc\" DevicePath \"\"" Oct 07 12:35:46 crc kubenswrapper[4854]: I1007 12:35:46.258782 4854 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ee4688b0-4708-49b7-8e15-a18a343a9a98-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:35:46 crc kubenswrapper[4854]: I1007 12:35:46.258796 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5wcc\" (UniqueName: \"kubernetes.io/projected/ee4688b0-4708-49b7-8e15-a18a343a9a98-kube-api-access-b5wcc\") on node \"crc\" DevicePath \"\"" Oct 07 12:35:46 crc kubenswrapper[4854]: I1007 12:35:46.742709 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bnct6" event={"ID":"ee4688b0-4708-49b7-8e15-a18a343a9a98","Type":"ContainerDied","Data":"0d8182621451aa854ad9b54d83c5de9e47c248a8b1dfe142c3c7d6f79739c678"} Oct 07 12:35:46 crc kubenswrapper[4854]: I1007 12:35:46.742773 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d8182621451aa854ad9b54d83c5de9e47c248a8b1dfe142c3c7d6f79739c678" Oct 07 12:35:46 crc kubenswrapper[4854]: I1007 12:35:46.742833 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bnct6" Oct 07 12:35:52 crc kubenswrapper[4854]: I1007 12:35:52.793631 4854 generic.go:334] "Generic (PLEG): container finished" podID="8cc9e0a4-e77e-4160-bf58-09897c8543a3" containerID="8386927863c8347ac6695157b52edeea41d72620c07012996e1151886e34a96f" exitCode=0 Oct 07 12:35:52 crc kubenswrapper[4854]: I1007 12:35:52.793711 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tvnbf" event={"ID":"8cc9e0a4-e77e-4160-bf58-09897c8543a3","Type":"ContainerDied","Data":"8386927863c8347ac6695157b52edeea41d72620c07012996e1151886e34a96f"} Oct 07 12:35:53 crc kubenswrapper[4854]: I1007 12:35:53.805063 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tvnbf" event={"ID":"8cc9e0a4-e77e-4160-bf58-09897c8543a3","Type":"ContainerStarted","Data":"9497ec7f562591fc4c3b7572eaed05970c4c715c66f6f8f04f040253aff254d1"} Oct 07 12:35:55 crc kubenswrapper[4854]: I1007 12:35:55.828593 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tvnbf" podStartSLOduration=4.301085637 podStartE2EDuration="13.828561131s" podCreationTimestamp="2025-10-07 12:35:42 +0000 UTC" firstStartedPulling="2025-10-07 12:35:43.711410057 +0000 UTC m=+659.699242302" lastFinishedPulling="2025-10-07 12:35:53.238885541 +0000 UTC m=+669.226717796" observedRunningTime="2025-10-07 12:35:53.831930198 +0000 UTC m=+669.819762483" watchObservedRunningTime="2025-10-07 12:35:55.828561131 +0000 UTC m=+671.816393396" Oct 07 12:35:55 crc kubenswrapper[4854]: I1007 12:35:55.833465 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-86f7bb879c-vrdbp"] Oct 07 12:35:55 crc kubenswrapper[4854]: E1007 12:35:55.833762 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee4688b0-4708-49b7-8e15-a18a343a9a98" 
containerName="util" Oct 07 12:35:55 crc kubenswrapper[4854]: I1007 12:35:55.833783 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee4688b0-4708-49b7-8e15-a18a343a9a98" containerName="util" Oct 07 12:35:55 crc kubenswrapper[4854]: E1007 12:35:55.833797 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee4688b0-4708-49b7-8e15-a18a343a9a98" containerName="extract" Oct 07 12:35:55 crc kubenswrapper[4854]: I1007 12:35:55.833808 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee4688b0-4708-49b7-8e15-a18a343a9a98" containerName="extract" Oct 07 12:35:55 crc kubenswrapper[4854]: E1007 12:35:55.833824 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee4688b0-4708-49b7-8e15-a18a343a9a98" containerName="pull" Oct 07 12:35:55 crc kubenswrapper[4854]: I1007 12:35:55.833834 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee4688b0-4708-49b7-8e15-a18a343a9a98" containerName="pull" Oct 07 12:35:55 crc kubenswrapper[4854]: I1007 12:35:55.833970 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee4688b0-4708-49b7-8e15-a18a343a9a98" containerName="extract" Oct 07 12:35:55 crc kubenswrapper[4854]: I1007 12:35:55.834503 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-86f7bb879c-vrdbp" Oct 07 12:35:55 crc kubenswrapper[4854]: I1007 12:35:55.837762 4854 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Oct 07 12:35:55 crc kubenswrapper[4854]: I1007 12:35:55.837979 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Oct 07 12:35:55 crc kubenswrapper[4854]: I1007 12:35:55.838800 4854 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Oct 07 12:35:55 crc kubenswrapper[4854]: I1007 12:35:55.842456 4854 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-svrqp" Oct 07 12:35:55 crc kubenswrapper[4854]: I1007 12:35:55.842504 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Oct 07 12:35:55 crc kubenswrapper[4854]: I1007 12:35:55.854719 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-86f7bb879c-vrdbp"] Oct 07 12:35:56 crc kubenswrapper[4854]: I1007 12:35:56.001764 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn6v2\" (UniqueName: \"kubernetes.io/projected/182c4b25-b992-4471-9264-fe61313b869d-kube-api-access-bn6v2\") pod \"metallb-operator-controller-manager-86f7bb879c-vrdbp\" (UID: \"182c4b25-b992-4471-9264-fe61313b869d\") " pod="metallb-system/metallb-operator-controller-manager-86f7bb879c-vrdbp" Oct 07 12:35:56 crc kubenswrapper[4854]: I1007 12:35:56.001816 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/182c4b25-b992-4471-9264-fe61313b869d-webhook-cert\") pod \"metallb-operator-controller-manager-86f7bb879c-vrdbp\" (UID: \"182c4b25-b992-4471-9264-fe61313b869d\") " pod="metallb-system/metallb-operator-controller-manager-86f7bb879c-vrdbp" Oct 07 12:35:56 crc kubenswrapper[4854]: I1007 12:35:56.001863 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/182c4b25-b992-4471-9264-fe61313b869d-apiservice-cert\") pod \"metallb-operator-controller-manager-86f7bb879c-vrdbp\" (UID: \"182c4b25-b992-4471-9264-fe61313b869d\") " pod="metallb-system/metallb-operator-controller-manager-86f7bb879c-vrdbp" Oct 07 12:35:56 crc kubenswrapper[4854]: I1007 12:35:56.099838 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-6b7dfdfc77-wf276"] Oct 07 12:35:56 crc kubenswrapper[4854]: I1007 12:35:56.101380 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6b7dfdfc77-wf276" Oct 07 12:35:56 crc kubenswrapper[4854]: I1007 12:35:56.102927 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bn6v2\" (UniqueName: \"kubernetes.io/projected/182c4b25-b992-4471-9264-fe61313b869d-kube-api-access-bn6v2\") pod \"metallb-operator-controller-manager-86f7bb879c-vrdbp\" (UID: \"182c4b25-b992-4471-9264-fe61313b869d\") " pod="metallb-system/metallb-operator-controller-manager-86f7bb879c-vrdbp" Oct 07 12:35:56 crc kubenswrapper[4854]: I1007 12:35:56.102995 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktbb2\" (UniqueName: \"kubernetes.io/projected/1cb7ac39-9e71-4037-990d-1672438371c7-kube-api-access-ktbb2\") pod \"metallb-operator-webhook-server-6b7dfdfc77-wf276\" (UID: \"1cb7ac39-9e71-4037-990d-1672438371c7\") " pod="metallb-system/metallb-operator-webhook-server-6b7dfdfc77-wf276" Oct 07 12:35:56 crc kubenswrapper[4854]: I1007 12:35:56.103043 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/182c4b25-b992-4471-9264-fe61313b869d-webhook-cert\") pod \"metallb-operator-controller-manager-86f7bb879c-vrdbp\" (UID: \"182c4b25-b992-4471-9264-fe61313b869d\") " pod="metallb-system/metallb-operator-controller-manager-86f7bb879c-vrdbp" Oct 07 12:35:56 crc kubenswrapper[4854]: I1007 12:35:56.103103 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/182c4b25-b992-4471-9264-fe61313b869d-apiservice-cert\") pod \"metallb-operator-controller-manager-86f7bb879c-vrdbp\" (UID: \"182c4b25-b992-4471-9264-fe61313b869d\") " pod="metallb-system/metallb-operator-controller-manager-86f7bb879c-vrdbp" Oct 07 12:35:56 crc kubenswrapper[4854]: I1007 12:35:56.103127 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1cb7ac39-9e71-4037-990d-1672438371c7-webhook-cert\") pod \"metallb-operator-webhook-server-6b7dfdfc77-wf276\" (UID: \"1cb7ac39-9e71-4037-990d-1672438371c7\") " pod="metallb-system/metallb-operator-webhook-server-6b7dfdfc77-wf276" Oct 07 12:35:56 crc kubenswrapper[4854]: I1007 12:35:56.103179 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1cb7ac39-9e71-4037-990d-1672438371c7-apiservice-cert\") pod \"metallb-operator-webhook-server-6b7dfdfc77-wf276\" (UID: \"1cb7ac39-9e71-4037-990d-1672438371c7\") " pod="metallb-system/metallb-operator-webhook-server-6b7dfdfc77-wf276" Oct 07 12:35:56 crc kubenswrapper[4854]: I1007 12:35:56.104035 4854 reflector.go:368] Caches populated for *v1.Secret from 
object-"metallb-system"/"metallb-webhook-cert" Oct 07 12:35:56 crc kubenswrapper[4854]: I1007 12:35:56.107858 4854 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Oct 07 12:35:56 crc kubenswrapper[4854]: I1007 12:35:56.112329 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/182c4b25-b992-4471-9264-fe61313b869d-webhook-cert\") pod \"metallb-operator-controller-manager-86f7bb879c-vrdbp\" (UID: \"182c4b25-b992-4471-9264-fe61313b869d\") " pod="metallb-system/metallb-operator-controller-manager-86f7bb879c-vrdbp" Oct 07 12:35:56 crc kubenswrapper[4854]: I1007 12:35:56.116952 4854 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-mbfxl" Oct 07 12:35:56 crc kubenswrapper[4854]: I1007 12:35:56.118860 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/182c4b25-b992-4471-9264-fe61313b869d-apiservice-cert\") pod \"metallb-operator-controller-manager-86f7bb879c-vrdbp\" (UID: \"182c4b25-b992-4471-9264-fe61313b869d\") " pod="metallb-system/metallb-operator-controller-manager-86f7bb879c-vrdbp" Oct 07 12:35:56 crc kubenswrapper[4854]: I1007 12:35:56.128624 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6b7dfdfc77-wf276"] Oct 07 12:35:56 crc kubenswrapper[4854]: I1007 12:35:56.137121 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bn6v2\" (UniqueName: \"kubernetes.io/projected/182c4b25-b992-4471-9264-fe61313b869d-kube-api-access-bn6v2\") pod \"metallb-operator-controller-manager-86f7bb879c-vrdbp\" (UID: \"182c4b25-b992-4471-9264-fe61313b869d\") " pod="metallb-system/metallb-operator-controller-manager-86f7bb879c-vrdbp" Oct 07 12:35:56 crc kubenswrapper[4854]: I1007 12:35:56.155664 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-86f7bb879c-vrdbp" Oct 07 12:35:56 crc kubenswrapper[4854]: I1007 12:35:56.205973 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktbb2\" (UniqueName: \"kubernetes.io/projected/1cb7ac39-9e71-4037-990d-1672438371c7-kube-api-access-ktbb2\") pod \"metallb-operator-webhook-server-6b7dfdfc77-wf276\" (UID: \"1cb7ac39-9e71-4037-990d-1672438371c7\") " pod="metallb-system/metallb-operator-webhook-server-6b7dfdfc77-wf276" Oct 07 12:35:56 crc kubenswrapper[4854]: I1007 12:35:56.206687 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1cb7ac39-9e71-4037-990d-1672438371c7-webhook-cert\") pod \"metallb-operator-webhook-server-6b7dfdfc77-wf276\" (UID: \"1cb7ac39-9e71-4037-990d-1672438371c7\") " pod="metallb-system/metallb-operator-webhook-server-6b7dfdfc77-wf276" Oct 07 12:35:56 crc kubenswrapper[4854]: I1007 12:35:56.206744 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1cb7ac39-9e71-4037-990d-1672438371c7-apiservice-cert\") pod \"metallb-operator-webhook-server-6b7dfdfc77-wf276\" (UID: \"1cb7ac39-9e71-4037-990d-1672438371c7\") " pod="metallb-system/metallb-operator-webhook-server-6b7dfdfc77-wf276" Oct 07 12:35:56 crc kubenswrapper[4854]: I1007 12:35:56.224407 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1cb7ac39-9e71-4037-990d-1672438371c7-apiservice-cert\") pod \"metallb-operator-webhook-server-6b7dfdfc77-wf276\" (UID: \"1cb7ac39-9e71-4037-990d-1672438371c7\") " pod="metallb-system/metallb-operator-webhook-server-6b7dfdfc77-wf276" Oct 07 12:35:56 crc kubenswrapper[4854]: I1007 12:35:56.224538 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1cb7ac39-9e71-4037-990d-1672438371c7-webhook-cert\") pod \"metallb-operator-webhook-server-6b7dfdfc77-wf276\" (UID: \"1cb7ac39-9e71-4037-990d-1672438371c7\") " pod="metallb-system/metallb-operator-webhook-server-6b7dfdfc77-wf276" Oct 07 12:35:56 crc kubenswrapper[4854]: I1007 12:35:56.229049 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktbb2\" (UniqueName: \"kubernetes.io/projected/1cb7ac39-9e71-4037-990d-1672438371c7-kube-api-access-ktbb2\") pod \"metallb-operator-webhook-server-6b7dfdfc77-wf276\" (UID: \"1cb7ac39-9e71-4037-990d-1672438371c7\") " pod="metallb-system/metallb-operator-webhook-server-6b7dfdfc77-wf276" Oct 07 12:35:56 crc kubenswrapper[4854]: I1007 12:35:56.475329 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6b7dfdfc77-wf276" Oct 07 12:35:56 crc kubenswrapper[4854]: I1007 12:35:56.783795 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-86f7bb879c-vrdbp"] Oct 07 12:35:56 crc kubenswrapper[4854]: W1007 12:35:56.787161 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod182c4b25_b992_4471_9264_fe61313b869d.slice/crio-d9c330d3db7142103dd23667927a86b0da9f0b2dc73a45e9ea6204d8f3548b39 WatchSource:0}: Error finding container d9c330d3db7142103dd23667927a86b0da9f0b2dc73a45e9ea6204d8f3548b39: Status 404 returned error can't find the container with id d9c330d3db7142103dd23667927a86b0da9f0b2dc73a45e9ea6204d8f3548b39 Oct 07 12:35:56 crc kubenswrapper[4854]: I1007 12:35:56.825326 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-86f7bb879c-vrdbp" event={"ID":"182c4b25-b992-4471-9264-fe61313b869d","Type":"ContainerStarted","Data":"d9c330d3db7142103dd23667927a86b0da9f0b2dc73a45e9ea6204d8f3548b39"} Oct 07 12:35:56 crc kubenswrapper[4854]: I1007 12:35:56.899666 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6b7dfdfc77-wf276"] Oct 07 12:35:56 crc kubenswrapper[4854]: W1007 12:35:56.907669 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1cb7ac39_9e71_4037_990d_1672438371c7.slice/crio-df86a932bacf0b868ddd89e34b1714693b328965bb3bc55538ecc3d6f4348b31 WatchSource:0}: Error finding container df86a932bacf0b868ddd89e34b1714693b328965bb3bc55538ecc3d6f4348b31: Status 404 returned error can't find the container with id df86a932bacf0b868ddd89e34b1714693b328965bb3bc55538ecc3d6f4348b31 Oct 07 12:35:57 crc kubenswrapper[4854]: I1007 12:35:57.844801 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6b7dfdfc77-wf276" event={"ID":"1cb7ac39-9e71-4037-990d-1672438371c7","Type":"ContainerStarted","Data":"df86a932bacf0b868ddd89e34b1714693b328965bb3bc55538ecc3d6f4348b31"} Oct 07 12:36:03 crc kubenswrapper[4854]: I1007 12:36:03.184967 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tvnbf" Oct 07 12:36:03 crc kubenswrapper[4854]: I1007 12:36:03.185737 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tvnbf" Oct 07 12:36:03 crc kubenswrapper[4854]: I1007 12:36:03.226593 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tvnbf" Oct 07 12:36:03 crc kubenswrapper[4854]: I1007 12:36:03.897918 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-86f7bb879c-vrdbp" event={"ID":"182c4b25-b992-4471-9264-fe61313b869d","Type":"ContainerStarted","Data":"0cfc634e4d15fb813e44ca4b228a21d07b04372a0903d34ed42ca36079a5ad41"} Oct 07 12:36:03 crc kubenswrapper[4854]: I1007 12:36:03.898348 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-86f7bb879c-vrdbp" Oct 07 12:36:03 crc kubenswrapper[4854]: I1007 12:36:03.901052 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6b7dfdfc77-wf276" 
event={"ID":"1cb7ac39-9e71-4037-990d-1672438371c7","Type":"ContainerStarted","Data":"45f1ddd44dd8eca5d1e4a676b20897eaa02c246b6b93b8c020a6251ab330074d"} Oct 07 12:36:03 crc kubenswrapper[4854]: I1007 12:36:03.901082 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-6b7dfdfc77-wf276" Oct 07 12:36:03 crc kubenswrapper[4854]: I1007 12:36:03.921295 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-86f7bb879c-vrdbp" podStartSLOduration=2.156202136 podStartE2EDuration="8.921275773s" podCreationTimestamp="2025-10-07 12:35:55 +0000 UTC" firstStartedPulling="2025-10-07 12:35:56.79087276 +0000 UTC m=+672.778705015" lastFinishedPulling="2025-10-07 12:36:03.555946387 +0000 UTC m=+679.543778652" observedRunningTime="2025-10-07 12:36:03.919115759 +0000 UTC m=+679.906948014" watchObservedRunningTime="2025-10-07 12:36:03.921275773 +0000 UTC m=+679.909108028" Oct 07 12:36:03 crc kubenswrapper[4854]: I1007 12:36:03.963237 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-6b7dfdfc77-wf276" podStartSLOduration=1.286737591 podStartE2EDuration="7.963216885s" podCreationTimestamp="2025-10-07 12:35:56 +0000 UTC" firstStartedPulling="2025-10-07 12:35:56.910907457 +0000 UTC m=+672.898739722" lastFinishedPulling="2025-10-07 12:36:03.587386761 +0000 UTC m=+679.575219016" observedRunningTime="2025-10-07 12:36:03.958034803 +0000 UTC m=+679.945867058" watchObservedRunningTime="2025-10-07 12:36:03.963216885 +0000 UTC m=+679.951049140" Oct 07 12:36:03 crc kubenswrapper[4854]: I1007 12:36:03.973830 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tvnbf" Oct 07 12:36:05 crc kubenswrapper[4854]: I1007 12:36:05.238394 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tvnbf"] Oct 07 12:36:05 crc kubenswrapper[4854]: I1007 12:36:05.592298 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d4k2f"] Oct 07 12:36:05 crc kubenswrapper[4854]: I1007 12:36:05.592686 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-d4k2f" podUID="69a424ee-6bfe-4135-95fa-beb839c92eab" containerName="registry-server" containerID="cri-o://6c8196f19e2a2c277b182e834a5ac2434f12faaab9046e44e3b13b259d22192e" gracePeriod=2 Oct 07 12:36:06 crc kubenswrapper[4854]: I1007 12:36:06.919982 4854 generic.go:334] "Generic (PLEG): container finished" podID="69a424ee-6bfe-4135-95fa-beb839c92eab" containerID="6c8196f19e2a2c277b182e834a5ac2434f12faaab9046e44e3b13b259d22192e" exitCode=0 Oct 07 12:36:06 crc kubenswrapper[4854]: I1007 12:36:06.920185 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d4k2f" event={"ID":"69a424ee-6bfe-4135-95fa-beb839c92eab","Type":"ContainerDied","Data":"6c8196f19e2a2c277b182e834a5ac2434f12faaab9046e44e3b13b259d22192e"} Oct 07 12:36:07 crc kubenswrapper[4854]: I1007 12:36:07.014593 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-d4k2f" Oct 07 12:36:07 crc kubenswrapper[4854]: I1007 12:36:07.146410 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69a424ee-6bfe-4135-95fa-beb839c92eab-catalog-content\") pod \"69a424ee-6bfe-4135-95fa-beb839c92eab\" (UID: \"69a424ee-6bfe-4135-95fa-beb839c92eab\") " Oct 07 12:36:07 crc kubenswrapper[4854]: I1007 12:36:07.146592 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmqfc\" (UniqueName: \"kubernetes.io/projected/69a424ee-6bfe-4135-95fa-beb839c92eab-kube-api-access-rmqfc\") pod \"69a424ee-6bfe-4135-95fa-beb839c92eab\" (UID: \"69a424ee-6bfe-4135-95fa-beb839c92eab\") " Oct 07 12:36:07 crc kubenswrapper[4854]: I1007 12:36:07.146649 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69a424ee-6bfe-4135-95fa-beb839c92eab-utilities\") pod \"69a424ee-6bfe-4135-95fa-beb839c92eab\" (UID: \"69a424ee-6bfe-4135-95fa-beb839c92eab\") " Oct 07 12:36:07 crc kubenswrapper[4854]: I1007 12:36:07.147604 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69a424ee-6bfe-4135-95fa-beb839c92eab-utilities" (OuterVolumeSpecName: "utilities") pod "69a424ee-6bfe-4135-95fa-beb839c92eab" (UID: "69a424ee-6bfe-4135-95fa-beb839c92eab"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:36:07 crc kubenswrapper[4854]: I1007 12:36:07.154796 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69a424ee-6bfe-4135-95fa-beb839c92eab-kube-api-access-rmqfc" (OuterVolumeSpecName: "kube-api-access-rmqfc") pod "69a424ee-6bfe-4135-95fa-beb839c92eab" (UID: "69a424ee-6bfe-4135-95fa-beb839c92eab"). InnerVolumeSpecName "kube-api-access-rmqfc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:36:07 crc kubenswrapper[4854]: I1007 12:36:07.248393 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmqfc\" (UniqueName: \"kubernetes.io/projected/69a424ee-6bfe-4135-95fa-beb839c92eab-kube-api-access-rmqfc\") on node \"crc\" DevicePath \"\"" Oct 07 12:36:07 crc kubenswrapper[4854]: I1007 12:36:07.248434 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69a424ee-6bfe-4135-95fa-beb839c92eab-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 12:36:07 crc kubenswrapper[4854]: I1007 12:36:07.259462 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69a424ee-6bfe-4135-95fa-beb839c92eab-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "69a424ee-6bfe-4135-95fa-beb839c92eab" (UID: "69a424ee-6bfe-4135-95fa-beb839c92eab"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:36:07 crc kubenswrapper[4854]: I1007 12:36:07.349934 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69a424ee-6bfe-4135-95fa-beb839c92eab-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 12:36:07 crc kubenswrapper[4854]: I1007 12:36:07.928837 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d4k2f" event={"ID":"69a424ee-6bfe-4135-95fa-beb839c92eab","Type":"ContainerDied","Data":"e87ceee1739e194c836c798cc3e6398d4ece72ce033693c629e6fb17c2e64e9e"} Oct 07 12:36:07 crc kubenswrapper[4854]: I1007 12:36:07.928921 4854 scope.go:117] "RemoveContainer" containerID="6c8196f19e2a2c277b182e834a5ac2434f12faaab9046e44e3b13b259d22192e" Oct 07 12:36:07 crc kubenswrapper[4854]: I1007 12:36:07.928990 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d4k2f" Oct 07 12:36:07 crc kubenswrapper[4854]: I1007 12:36:07.948298 4854 scope.go:117] "RemoveContainer" containerID="6d81f5b74f5c6c9c6d2c649ce83cb82ba41e4735fb037fc1637c5fa75e94ca3c" Oct 07 12:36:07 crc kubenswrapper[4854]: I1007 12:36:07.959265 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d4k2f"] Oct 07 12:36:07 crc kubenswrapper[4854]: I1007 12:36:07.964121 4854 scope.go:117] "RemoveContainer" containerID="be230a3a61a190bf95dfbb12f22694c8dfa5da3b195f7816283e3a2a173aab98" Oct 07 12:36:07 crc kubenswrapper[4854]: I1007 12:36:07.967913 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-d4k2f"] Oct 07 12:36:08 crc kubenswrapper[4854]: I1007 12:36:08.713108 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69a424ee-6bfe-4135-95fa-beb839c92eab" path="/var/lib/kubelet/pods/69a424ee-6bfe-4135-95fa-beb839c92eab/volumes" Oct 07 12:36:16 crc kubenswrapper[4854]: I1007 12:36:16.479909 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-6b7dfdfc77-wf276" Oct 07 12:36:33 crc kubenswrapper[4854]: I1007 12:36:33.611527 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-crq2z"] Oct 07 12:36:33 crc kubenswrapper[4854]: E1007 12:36:33.612309 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69a424ee-6bfe-4135-95fa-beb839c92eab" containerName="extract-utilities" Oct 07 12:36:33 crc kubenswrapper[4854]: I1007 12:36:33.612323 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="69a424ee-6bfe-4135-95fa-beb839c92eab" containerName="extract-utilities" Oct 07 12:36:33 crc kubenswrapper[4854]: E1007 12:36:33.612340 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69a424ee-6bfe-4135-95fa-beb839c92eab" containerName="extract-content" Oct 07 12:36:33 crc kubenswrapper[4854]: I1007 12:36:33.612346 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="69a424ee-6bfe-4135-95fa-beb839c92eab" containerName="extract-content" Oct 07 12:36:33 crc kubenswrapper[4854]: E1007 12:36:33.612356 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69a424ee-6bfe-4135-95fa-beb839c92eab" containerName="registry-server" Oct 07 12:36:33 crc kubenswrapper[4854]: I1007 12:36:33.612363 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="69a424ee-6bfe-4135-95fa-beb839c92eab" containerName="registry-server" Oct 07 12:36:33 
crc kubenswrapper[4854]: I1007 12:36:33.612499 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="69a424ee-6bfe-4135-95fa-beb839c92eab" containerName="registry-server" Oct 07 12:36:33 crc kubenswrapper[4854]: I1007 12:36:33.613318 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-crq2z" Oct 07 12:36:33 crc kubenswrapper[4854]: I1007 12:36:33.672002 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-crq2z"] Oct 07 12:36:33 crc kubenswrapper[4854]: I1007 12:36:33.768988 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fec307e5-ef1c-4c44-8524-204f8d7ae452-utilities\") pod \"community-operators-crq2z\" (UID: \"fec307e5-ef1c-4c44-8524-204f8d7ae452\") " pod="openshift-marketplace/community-operators-crq2z" Oct 07 12:36:33 crc kubenswrapper[4854]: I1007 12:36:33.769053 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fec307e5-ef1c-4c44-8524-204f8d7ae452-catalog-content\") pod \"community-operators-crq2z\" (UID: \"fec307e5-ef1c-4c44-8524-204f8d7ae452\") " pod="openshift-marketplace/community-operators-crq2z" Oct 07 12:36:33 crc kubenswrapper[4854]: I1007 12:36:33.769125 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zt4f6\" (UniqueName: \"kubernetes.io/projected/fec307e5-ef1c-4c44-8524-204f8d7ae452-kube-api-access-zt4f6\") pod \"community-operators-crq2z\" (UID: \"fec307e5-ef1c-4c44-8524-204f8d7ae452\") " pod="openshift-marketplace/community-operators-crq2z" Oct 07 12:36:33 crc kubenswrapper[4854]: I1007 12:36:33.870997 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fec307e5-ef1c-4c44-8524-204f8d7ae452-utilities\") pod \"community-operators-crq2z\" (UID: \"fec307e5-ef1c-4c44-8524-204f8d7ae452\") " pod="openshift-marketplace/community-operators-crq2z" Oct 07 12:36:33 crc kubenswrapper[4854]: I1007 12:36:33.871065 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fec307e5-ef1c-4c44-8524-204f8d7ae452-catalog-content\") pod \"community-operators-crq2z\" (UID: \"fec307e5-ef1c-4c44-8524-204f8d7ae452\") " pod="openshift-marketplace/community-operators-crq2z" Oct 07 12:36:33 crc kubenswrapper[4854]: I1007 12:36:33.871126 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zt4f6\" (UniqueName: \"kubernetes.io/projected/fec307e5-ef1c-4c44-8524-204f8d7ae452-kube-api-access-zt4f6\") pod \"community-operators-crq2z\" (UID: \"fec307e5-ef1c-4c44-8524-204f8d7ae452\") " pod="openshift-marketplace/community-operators-crq2z" Oct 07 12:36:33 crc kubenswrapper[4854]: I1007 12:36:33.871591 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fec307e5-ef1c-4c44-8524-204f8d7ae452-utilities\") pod \"community-operators-crq2z\" (UID: \"fec307e5-ef1c-4c44-8524-204f8d7ae452\") " pod="openshift-marketplace/community-operators-crq2z" Oct 07 12:36:33 crc kubenswrapper[4854]: I1007 12:36:33.871785 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/fec307e5-ef1c-4c44-8524-204f8d7ae452-catalog-content\") pod \"community-operators-crq2z\" (UID: \"fec307e5-ef1c-4c44-8524-204f8d7ae452\") " pod="openshift-marketplace/community-operators-crq2z" Oct 07 12:36:33 crc kubenswrapper[4854]: I1007 12:36:33.897032 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zt4f6\" (UniqueName: \"kubernetes.io/projected/fec307e5-ef1c-4c44-8524-204f8d7ae452-kube-api-access-zt4f6\") pod \"community-operators-crq2z\" (UID: \"fec307e5-ef1c-4c44-8524-204f8d7ae452\") " pod="openshift-marketplace/community-operators-crq2z" Oct 07 12:36:33 crc kubenswrapper[4854]: I1007 12:36:33.927545 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-crq2z" Oct 07 12:36:34 crc kubenswrapper[4854]: I1007 12:36:34.429917 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-crq2z"] Oct 07 12:36:35 crc kubenswrapper[4854]: I1007 12:36:35.128066 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-crq2z" event={"ID":"fec307e5-ef1c-4c44-8524-204f8d7ae452","Type":"ContainerStarted","Data":"4a49cff7504cbe8129974a055d1dc424a624d0c046c00188a7494bacb38c36b8"} Oct 07 12:36:36 crc kubenswrapper[4854]: I1007 12:36:36.137455 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-crq2z" event={"ID":"fec307e5-ef1c-4c44-8524-204f8d7ae452","Type":"ContainerStarted","Data":"4698f5e210942ce047254ace5c095355049de6ed433b1c96edffd290ce95ce95"} Oct 07 12:36:36 crc kubenswrapper[4854]: I1007 12:36:36.159896 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-86f7bb879c-vrdbp" Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.025326 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-9d94j"] Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.028416 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-9d94j" Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.030739 4854 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-bnr45" Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.032589 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.036522 4854 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.038027 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-dsv2d"] Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.039134 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-dsv2d" Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.043273 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-dsv2d"] Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.047563 4854 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.099889 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-w4gkv"] Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.100921 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-w4gkv" Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.103288 4854 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.103441 4854 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.103561 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.105594 4854 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-rqzt7" Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.116672 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-68d546b9d8-8szdq"] Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.117900 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-68d546b9d8-8szdq" Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.120423 4854 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.130046 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/e36001a5-79ba-4f9b-9e5a-012494f505f7-reloader\") pod \"frr-k8s-9d94j\" (UID: \"e36001a5-79ba-4f9b-9e5a-012494f505f7\") " pod="metallb-system/frr-k8s-9d94j" Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.130096 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/e36001a5-79ba-4f9b-9e5a-012494f505f7-frr-sockets\") pod \"frr-k8s-9d94j\" (UID: \"e36001a5-79ba-4f9b-9e5a-012494f505f7\") " pod="metallb-system/frr-k8s-9d94j" Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.130117 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89pmm\" (UniqueName: \"kubernetes.io/projected/7dd0aa9d-7e88-4764-8cea-332b935e11ea-kube-api-access-89pmm\") pod \"frr-k8s-webhook-server-64bf5d555-dsv2d\" (UID: \"7dd0aa9d-7e88-4764-8cea-332b935e11ea\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-dsv2d" Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.130156 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bf8qr\" (UniqueName: \"kubernetes.io/projected/e36001a5-79ba-4f9b-9e5a-012494f505f7-kube-api-access-bf8qr\") pod \"frr-k8s-9d94j\" (UID: \"e36001a5-79ba-4f9b-9e5a-012494f505f7\") " 
pod="metallb-system/frr-k8s-9d94j" Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.130191 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7dd0aa9d-7e88-4764-8cea-332b935e11ea-cert\") pod \"frr-k8s-webhook-server-64bf5d555-dsv2d\" (UID: \"7dd0aa9d-7e88-4764-8cea-332b935e11ea\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-dsv2d" Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.130215 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/e36001a5-79ba-4f9b-9e5a-012494f505f7-metrics\") pod \"frr-k8s-9d94j\" (UID: \"e36001a5-79ba-4f9b-9e5a-012494f505f7\") " pod="metallb-system/frr-k8s-9d94j" Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.130264 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/e36001a5-79ba-4f9b-9e5a-012494f505f7-frr-conf\") pod \"frr-k8s-9d94j\" (UID: \"e36001a5-79ba-4f9b-9e5a-012494f505f7\") " pod="metallb-system/frr-k8s-9d94j" Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.130282 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e36001a5-79ba-4f9b-9e5a-012494f505f7-metrics-certs\") pod \"frr-k8s-9d94j\" (UID: \"e36001a5-79ba-4f9b-9e5a-012494f505f7\") " pod="metallb-system/frr-k8s-9d94j" Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.130305 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/e36001a5-79ba-4f9b-9e5a-012494f505f7-frr-startup\") pod \"frr-k8s-9d94j\" (UID: \"e36001a5-79ba-4f9b-9e5a-012494f505f7\") " pod="metallb-system/frr-k8s-9d94j" Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.140367 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-8szdq"] Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.146592 4854 generic.go:334] "Generic (PLEG): container finished" podID="fec307e5-ef1c-4c44-8524-204f8d7ae452" containerID="4698f5e210942ce047254ace5c095355049de6ed433b1c96edffd290ce95ce95" exitCode=0 Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.146683 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-crq2z" event={"ID":"fec307e5-ef1c-4c44-8524-204f8d7ae452","Type":"ContainerDied","Data":"4698f5e210942ce047254ace5c095355049de6ed433b1c96edffd290ce95ce95"} Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.231931 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2206fc76-d0a8-4943-9efe-5378e7ee73f6-memberlist\") pod \"speaker-w4gkv\" (UID: \"2206fc76-d0a8-4943-9efe-5378e7ee73f6\") " pod="metallb-system/speaker-w4gkv" Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.232022 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/e36001a5-79ba-4f9b-9e5a-012494f505f7-frr-conf\") pod \"frr-k8s-9d94j\" (UID: \"e36001a5-79ba-4f9b-9e5a-012494f505f7\") " pod="metallb-system/frr-k8s-9d94j" Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.232041 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e36001a5-79ba-4f9b-9e5a-012494f505f7-metrics-certs\") pod \"frr-k8s-9d94j\" (UID: \"e36001a5-79ba-4f9b-9e5a-012494f505f7\") " pod="metallb-system/frr-k8s-9d94j" Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.232061 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/e36001a5-79ba-4f9b-9e5a-012494f505f7-frr-startup\") pod \"frr-k8s-9d94j\" (UID: \"e36001a5-79ba-4f9b-9e5a-012494f505f7\") " pod="metallb-system/frr-k8s-9d94j" Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.232089 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9rwl\" (UniqueName: \"kubernetes.io/projected/660bca3f-29ea-4ead-a16f-6287792ef72b-kube-api-access-r9rwl\") pod \"controller-68d546b9d8-8szdq\" (UID: \"660bca3f-29ea-4ead-a16f-6287792ef72b\") " pod="metallb-system/controller-68d546b9d8-8szdq" Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.232111 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/e36001a5-79ba-4f9b-9e5a-012494f505f7-reloader\") pod \"frr-k8s-9d94j\" (UID: \"e36001a5-79ba-4f9b-9e5a-012494f505f7\") " pod="metallb-system/frr-k8s-9d94j" Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.232128 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/660bca3f-29ea-4ead-a16f-6287792ef72b-metrics-certs\") pod \"controller-68d546b9d8-8szdq\" (UID: \"660bca3f-29ea-4ead-a16f-6287792ef72b\") " pod="metallb-system/controller-68d546b9d8-8szdq" Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.232163 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/2206fc76-d0a8-4943-9efe-5378e7ee73f6-metallb-excludel2\") pod \"speaker-w4gkv\" (UID: \"2206fc76-d0a8-4943-9efe-5378e7ee73f6\") " pod="metallb-system/speaker-w4gkv" Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.232185 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/e36001a5-79ba-4f9b-9e5a-012494f505f7-frr-sockets\") pod \"frr-k8s-9d94j\" (UID: \"e36001a5-79ba-4f9b-9e5a-012494f505f7\") " pod="metallb-system/frr-k8s-9d94j" Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.232211 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89pmm\" (UniqueName: \"kubernetes.io/projected/7dd0aa9d-7e88-4764-8cea-332b935e11ea-kube-api-access-89pmm\") pod \"frr-k8s-webhook-server-64bf5d555-dsv2d\" (UID: \"7dd0aa9d-7e88-4764-8cea-332b935e11ea\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-dsv2d" Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.232238 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bf8qr\" (UniqueName: \"kubernetes.io/projected/e36001a5-79ba-4f9b-9e5a-012494f505f7-kube-api-access-bf8qr\") pod \"frr-k8s-9d94j\" (UID: \"e36001a5-79ba-4f9b-9e5a-012494f505f7\") " pod="metallb-system/frr-k8s-9d94j" Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.232256 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/660bca3f-29ea-4ead-a16f-6287792ef72b-cert\") pod \"controller-68d546b9d8-8szdq\" (UID: \"660bca3f-29ea-4ead-a16f-6287792ef72b\") " pod="metallb-system/controller-68d546b9d8-8szdq" Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.232306 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7dd0aa9d-7e88-4764-8cea-332b935e11ea-cert\") pod \"frr-k8s-webhook-server-64bf5d555-dsv2d\" (UID: \"7dd0aa9d-7e88-4764-8cea-332b935e11ea\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-dsv2d" Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.232330 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/e36001a5-79ba-4f9b-9e5a-012494f505f7-metrics\") pod \"frr-k8s-9d94j\" (UID: \"e36001a5-79ba-4f9b-9e5a-012494f505f7\") " pod="metallb-system/frr-k8s-9d94j" Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.232346 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2206fc76-d0a8-4943-9efe-5378e7ee73f6-metrics-certs\") pod \"speaker-w4gkv\" (UID: \"2206fc76-d0a8-4943-9efe-5378e7ee73f6\") " pod="metallb-system/speaker-w4gkv" Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.232366 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlrlc\" (UniqueName: \"kubernetes.io/projected/2206fc76-d0a8-4943-9efe-5378e7ee73f6-kube-api-access-dlrlc\") pod \"speaker-w4gkv\" (UID: \"2206fc76-d0a8-4943-9efe-5378e7ee73f6\") " pod="metallb-system/speaker-w4gkv" Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.232494 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/e36001a5-79ba-4f9b-9e5a-012494f505f7-frr-conf\") pod \"frr-k8s-9d94j\" (UID: \"e36001a5-79ba-4f9b-9e5a-012494f505f7\") " pod="metallb-system/frr-k8s-9d94j" Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.232650 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/e36001a5-79ba-4f9b-9e5a-012494f505f7-reloader\") pod \"frr-k8s-9d94j\" (UID: \"e36001a5-79ba-4f9b-9e5a-012494f505f7\") " pod="metallb-system/frr-k8s-9d94j" Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.232848 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/e36001a5-79ba-4f9b-9e5a-012494f505f7-frr-sockets\") pod \"frr-k8s-9d94j\" (UID: \"e36001a5-79ba-4f9b-9e5a-012494f505f7\") " pod="metallb-system/frr-k8s-9d94j" Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.233268 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/e36001a5-79ba-4f9b-9e5a-012494f505f7-frr-startup\") pod \"frr-k8s-9d94j\" (UID: \"e36001a5-79ba-4f9b-9e5a-012494f505f7\") " pod="metallb-system/frr-k8s-9d94j" Oct 07 12:36:37 crc kubenswrapper[4854]: E1007 12:36:37.233279 4854 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.233368 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/e36001a5-79ba-4f9b-9e5a-012494f505f7-metrics\") pod 
\"frr-k8s-9d94j\" (UID: \"e36001a5-79ba-4f9b-9e5a-012494f505f7\") " pod="metallb-system/frr-k8s-9d94j" Oct 07 12:36:37 crc kubenswrapper[4854]: E1007 12:36:37.233417 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7dd0aa9d-7e88-4764-8cea-332b935e11ea-cert podName:7dd0aa9d-7e88-4764-8cea-332b935e11ea nodeName:}" failed. No retries permitted until 2025-10-07 12:36:37.733388898 +0000 UTC m=+713.721221143 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7dd0aa9d-7e88-4764-8cea-332b935e11ea-cert") pod "frr-k8s-webhook-server-64bf5d555-dsv2d" (UID: "7dd0aa9d-7e88-4764-8cea-332b935e11ea") : secret "frr-k8s-webhook-server-cert" not found Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.251746 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e36001a5-79ba-4f9b-9e5a-012494f505f7-metrics-certs\") pod \"frr-k8s-9d94j\" (UID: \"e36001a5-79ba-4f9b-9e5a-012494f505f7\") " pod="metallb-system/frr-k8s-9d94j" Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.253973 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89pmm\" (UniqueName: \"kubernetes.io/projected/7dd0aa9d-7e88-4764-8cea-332b935e11ea-kube-api-access-89pmm\") pod \"frr-k8s-webhook-server-64bf5d555-dsv2d\" (UID: \"7dd0aa9d-7e88-4764-8cea-332b935e11ea\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-dsv2d" Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.263049 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bf8qr\" (UniqueName: \"kubernetes.io/projected/e36001a5-79ba-4f9b-9e5a-012494f505f7-kube-api-access-bf8qr\") pod \"frr-k8s-9d94j\" (UID: \"e36001a5-79ba-4f9b-9e5a-012494f505f7\") " pod="metallb-system/frr-k8s-9d94j" Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.333448 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9rwl\" (UniqueName: \"kubernetes.io/projected/660bca3f-29ea-4ead-a16f-6287792ef72b-kube-api-access-r9rwl\") pod \"controller-68d546b9d8-8szdq\" (UID: \"660bca3f-29ea-4ead-a16f-6287792ef72b\") " pod="metallb-system/controller-68d546b9d8-8szdq" Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.333501 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/660bca3f-29ea-4ead-a16f-6287792ef72b-metrics-certs\") pod \"controller-68d546b9d8-8szdq\" (UID: \"660bca3f-29ea-4ead-a16f-6287792ef72b\") " pod="metallb-system/controller-68d546b9d8-8szdq" Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.333526 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/2206fc76-d0a8-4943-9efe-5378e7ee73f6-metallb-excludel2\") pod \"speaker-w4gkv\" (UID: \"2206fc76-d0a8-4943-9efe-5378e7ee73f6\") " pod="metallb-system/speaker-w4gkv" Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.333562 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/660bca3f-29ea-4ead-a16f-6287792ef72b-cert\") pod \"controller-68d546b9d8-8szdq\" (UID: \"660bca3f-29ea-4ead-a16f-6287792ef72b\") " pod="metallb-system/controller-68d546b9d8-8szdq" Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.333618 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2206fc76-d0a8-4943-9efe-5378e7ee73f6-metrics-certs\") pod \"speaker-w4gkv\" (UID: \"2206fc76-d0a8-4943-9efe-5378e7ee73f6\") " pod="metallb-system/speaker-w4gkv" Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.333638 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlrlc\" (UniqueName: \"kubernetes.io/projected/2206fc76-d0a8-4943-9efe-5378e7ee73f6-kube-api-access-dlrlc\") pod \"speaker-w4gkv\" (UID: \"2206fc76-d0a8-4943-9efe-5378e7ee73f6\") " pod="metallb-system/speaker-w4gkv" Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.333680 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2206fc76-d0a8-4943-9efe-5378e7ee73f6-memberlist\") pod \"speaker-w4gkv\" (UID: \"2206fc76-d0a8-4943-9efe-5378e7ee73f6\") " pod="metallb-system/speaker-w4gkv" Oct 07 12:36:37 crc kubenswrapper[4854]: E1007 12:36:37.333829 4854 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 07 12:36:37 crc kubenswrapper[4854]: E1007 12:36:37.333896 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2206fc76-d0a8-4943-9efe-5378e7ee73f6-memberlist podName:2206fc76-d0a8-4943-9efe-5378e7ee73f6 nodeName:}" failed. No retries permitted until 2025-10-07 12:36:37.833875501 +0000 UTC m=+713.821707756 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/2206fc76-d0a8-4943-9efe-5378e7ee73f6-memberlist") pod "speaker-w4gkv" (UID: "2206fc76-d0a8-4943-9efe-5378e7ee73f6") : secret "metallb-memberlist" not found Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.334575 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/2206fc76-d0a8-4943-9efe-5378e7ee73f6-metallb-excludel2\") pod \"speaker-w4gkv\" (UID: \"2206fc76-d0a8-4943-9efe-5378e7ee73f6\") " pod="metallb-system/speaker-w4gkv" Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.337984 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/660bca3f-29ea-4ead-a16f-6287792ef72b-metrics-certs\") pod \"controller-68d546b9d8-8szdq\" (UID: \"660bca3f-29ea-4ead-a16f-6287792ef72b\") " pod="metallb-system/controller-68d546b9d8-8szdq" Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.338074 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2206fc76-d0a8-4943-9efe-5378e7ee73f6-metrics-certs\") pod \"speaker-w4gkv\" (UID: \"2206fc76-d0a8-4943-9efe-5378e7ee73f6\") " pod="metallb-system/speaker-w4gkv" Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.341555 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/660bca3f-29ea-4ead-a16f-6287792ef72b-cert\") pod \"controller-68d546b9d8-8szdq\" (UID: \"660bca3f-29ea-4ead-a16f-6287792ef72b\") " pod="metallb-system/controller-68d546b9d8-8szdq" Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.350531 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlrlc\" (UniqueName: \"kubernetes.io/projected/2206fc76-d0a8-4943-9efe-5378e7ee73f6-kube-api-access-dlrlc\") pod \"speaker-w4gkv\" (UID: \"2206fc76-d0a8-4943-9efe-5378e7ee73f6\") " 
pod="metallb-system/speaker-w4gkv" Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.352174 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9rwl\" (UniqueName: \"kubernetes.io/projected/660bca3f-29ea-4ead-a16f-6287792ef72b-kube-api-access-r9rwl\") pod \"controller-68d546b9d8-8szdq\" (UID: \"660bca3f-29ea-4ead-a16f-6287792ef72b\") " pod="metallb-system/controller-68d546b9d8-8szdq" Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.435284 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-9d94j" Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.505053 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-68d546b9d8-8szdq" Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.741035 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7dd0aa9d-7e88-4764-8cea-332b935e11ea-cert\") pod \"frr-k8s-webhook-server-64bf5d555-dsv2d\" (UID: \"7dd0aa9d-7e88-4764-8cea-332b935e11ea\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-dsv2d" Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.748891 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7dd0aa9d-7e88-4764-8cea-332b935e11ea-cert\") pod \"frr-k8s-webhook-server-64bf5d555-dsv2d\" (UID: \"7dd0aa9d-7e88-4764-8cea-332b935e11ea\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-dsv2d" Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.779323 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-8szdq"] Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.806701 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-dsv2d" Oct 07 12:36:37 crc kubenswrapper[4854]: I1007 12:36:37.843210 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2206fc76-d0a8-4943-9efe-5378e7ee73f6-memberlist\") pod \"speaker-w4gkv\" (UID: \"2206fc76-d0a8-4943-9efe-5378e7ee73f6\") " pod="metallb-system/speaker-w4gkv" Oct 07 12:36:37 crc kubenswrapper[4854]: E1007 12:36:37.844661 4854 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 07 12:36:37 crc kubenswrapper[4854]: E1007 12:36:37.844804 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2206fc76-d0a8-4943-9efe-5378e7ee73f6-memberlist podName:2206fc76-d0a8-4943-9efe-5378e7ee73f6 nodeName:}" failed. No retries permitted until 2025-10-07 12:36:38.844769744 +0000 UTC m=+714.832602069 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/2206fc76-d0a8-4943-9efe-5378e7ee73f6-memberlist") pod "speaker-w4gkv" (UID: "2206fc76-d0a8-4943-9efe-5378e7ee73f6") : secret "metallb-memberlist" not found Oct 07 12:36:38 crc kubenswrapper[4854]: I1007 12:36:38.161203 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9d94j" event={"ID":"e36001a5-79ba-4f9b-9e5a-012494f505f7","Type":"ContainerStarted","Data":"446cdfd9716d11112c77fef29bfe222325b4b7501e8e33b7918b29f5511b42e4"} Oct 07 12:36:38 crc kubenswrapper[4854]: I1007 12:36:38.166363 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-8szdq" event={"ID":"660bca3f-29ea-4ead-a16f-6287792ef72b","Type":"ContainerStarted","Data":"93ad640f227a78095f399a53272e8d78c9dae1338fc4326d0fb92222c4254bf6"} Oct 07 12:36:38 crc kubenswrapper[4854]: I1007 12:36:38.210855 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-dsv2d"] Oct 07 12:36:38 crc kubenswrapper[4854]: W1007 12:36:38.235705 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7dd0aa9d_7e88_4764_8cea_332b935e11ea.slice/crio-7800dbf8d3bae21010b94b6a0599ceefa1d577b0850eb8bf899452d501b1b1c8 WatchSource:0}: Error finding container 7800dbf8d3bae21010b94b6a0599ceefa1d577b0850eb8bf899452d501b1b1c8: Status 404 returned error can't find the container with id 7800dbf8d3bae21010b94b6a0599ceefa1d577b0850eb8bf899452d501b1b1c8 Oct 07 12:36:38 crc kubenswrapper[4854]: I1007 12:36:38.860737 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2206fc76-d0a8-4943-9efe-5378e7ee73f6-memberlist\") pod \"speaker-w4gkv\" (UID: \"2206fc76-d0a8-4943-9efe-5378e7ee73f6\") " pod="metallb-system/speaker-w4gkv" Oct 07 12:36:38 crc kubenswrapper[4854]: I1007 12:36:38.870702 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2206fc76-d0a8-4943-9efe-5378e7ee73f6-memberlist\") pod \"speaker-w4gkv\" (UID: \"2206fc76-d0a8-4943-9efe-5378e7ee73f6\") " pod="metallb-system/speaker-w4gkv" Oct 07 12:36:39 crc kubenswrapper[4854]: I1007 12:36:39.004588 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-w4gkv" Oct 07 12:36:39 crc kubenswrapper[4854]: W1007 12:36:39.041438 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2206fc76_d0a8_4943_9efe_5378e7ee73f6.slice/crio-81a3a61473ea0f78ecfa7d8de83e5b89c4602070f98a6ffbe7fd8e6d5957cf48 WatchSource:0}: Error finding container 81a3a61473ea0f78ecfa7d8de83e5b89c4602070f98a6ffbe7fd8e6d5957cf48: Status 404 returned error can't find the container with id 81a3a61473ea0f78ecfa7d8de83e5b89c4602070f98a6ffbe7fd8e6d5957cf48 Oct 07 12:36:39 crc kubenswrapper[4854]: I1007 12:36:39.175239 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-w4gkv" event={"ID":"2206fc76-d0a8-4943-9efe-5378e7ee73f6","Type":"ContainerStarted","Data":"81a3a61473ea0f78ecfa7d8de83e5b89c4602070f98a6ffbe7fd8e6d5957cf48"} Oct 07 12:36:39 crc kubenswrapper[4854]: I1007 12:36:39.184284 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-8szdq" event={"ID":"660bca3f-29ea-4ead-a16f-6287792ef72b","Type":"ContainerStarted","Data":"f6848074de0a093914c4219da405478aacc4eb000d98376ab61cd57f72490c6d"} Oct 07 12:36:39 crc kubenswrapper[4854]: I1007 12:36:39.184342 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-8szdq" event={"ID":"660bca3f-29ea-4ead-a16f-6287792ef72b","Type":"ContainerStarted","Data":"4ef52398e9676290bde0e4d534adf8c0ef5fda4c08df6c5d11730e73dd250563"} Oct 07 12:36:39 crc kubenswrapper[4854]: I1007 12:36:39.185406 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-68d546b9d8-8szdq" Oct 07 12:36:39 crc kubenswrapper[4854]: I1007 12:36:39.201468 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-crq2z" event={"ID":"fec307e5-ef1c-4c44-8524-204f8d7ae452","Type":"ContainerStarted","Data":"a46b73cff9c40e2ae8d0b44e751cfbc569f563c05d61b0542b4ff169b0364d50"} Oct 07 12:36:39 crc kubenswrapper[4854]: I1007 12:36:39.210453 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-dsv2d" event={"ID":"7dd0aa9d-7e88-4764-8cea-332b935e11ea","Type":"ContainerStarted","Data":"7800dbf8d3bae21010b94b6a0599ceefa1d577b0850eb8bf899452d501b1b1c8"} Oct 07 12:36:39 crc kubenswrapper[4854]: I1007 12:36:39.212317 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-68d546b9d8-8szdq" podStartSLOduration=2.212291859 podStartE2EDuration="2.212291859s" podCreationTimestamp="2025-10-07 12:36:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:36:39.207693353 +0000 UTC m=+715.195525608" watchObservedRunningTime="2025-10-07 12:36:39.212291859 +0000 UTC m=+715.200124114" Oct 07 12:36:40 crc kubenswrapper[4854]: I1007 12:36:40.237791 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-w4gkv" event={"ID":"2206fc76-d0a8-4943-9efe-5378e7ee73f6","Type":"ContainerStarted","Data":"d52b34dfc1d0b68d9de74de33eb0c97d2bfcf9532a97c58a4ec1fefc1e607bed"} Oct 07 12:36:40 crc kubenswrapper[4854]: I1007 12:36:40.238936 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-w4gkv" 
event={"ID":"2206fc76-d0a8-4943-9efe-5378e7ee73f6","Type":"ContainerStarted","Data":"6c946b08b8e2e44284458f0b09a3686c996bea8c7da8181431c8ea922f06edd1"} Oct 07 12:36:40 crc kubenswrapper[4854]: I1007 12:36:40.239069 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-w4gkv" Oct 07 12:36:40 crc kubenswrapper[4854]: I1007 12:36:40.243163 4854 generic.go:334] "Generic (PLEG): container finished" podID="fec307e5-ef1c-4c44-8524-204f8d7ae452" containerID="a46b73cff9c40e2ae8d0b44e751cfbc569f563c05d61b0542b4ff169b0364d50" exitCode=0 Oct 07 12:36:40 crc kubenswrapper[4854]: I1007 12:36:40.243355 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-crq2z" event={"ID":"fec307e5-ef1c-4c44-8524-204f8d7ae452","Type":"ContainerDied","Data":"a46b73cff9c40e2ae8d0b44e751cfbc569f563c05d61b0542b4ff169b0364d50"} Oct 07 12:36:40 crc kubenswrapper[4854]: I1007 12:36:40.282531 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-w4gkv" podStartSLOduration=3.282509872 podStartE2EDuration="3.282509872s" podCreationTimestamp="2025-10-07 12:36:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:36:40.278893355 +0000 UTC m=+716.266725620" watchObservedRunningTime="2025-10-07 12:36:40.282509872 +0000 UTC m=+716.270342127" Oct 07 12:36:41 crc kubenswrapper[4854]: I1007 12:36:41.264300 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-crq2z" event={"ID":"fec307e5-ef1c-4c44-8524-204f8d7ae452","Type":"ContainerStarted","Data":"c8c17347e45ceb9006c38b6bdc630e2b564eeaeeeaf17ccf8e4e39a075168dbe"} Oct 07 12:36:42 crc kubenswrapper[4854]: I1007 12:36:42.294688 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-crq2z" podStartSLOduration=5.5123804320000005 podStartE2EDuration="9.294665239s" podCreationTimestamp="2025-10-07 12:36:33 +0000 UTC" firstStartedPulling="2025-10-07 12:36:37.148062181 +0000 UTC m=+713.135894436" lastFinishedPulling="2025-10-07 12:36:40.930346968 +0000 UTC m=+716.918179243" observedRunningTime="2025-10-07 12:36:42.290696851 +0000 UTC m=+718.278529106" watchObservedRunningTime="2025-10-07 12:36:42.294665239 +0000 UTC m=+718.282497494" Oct 07 12:36:43 crc kubenswrapper[4854]: I1007 12:36:43.928449 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-crq2z" Oct 07 12:36:43 crc kubenswrapper[4854]: I1007 12:36:43.928601 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-crq2z" Oct 07 12:36:43 crc kubenswrapper[4854]: I1007 12:36:43.994938 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-crq2z" Oct 07 12:36:46 crc kubenswrapper[4854]: I1007 12:36:46.610574 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qw7pl"] Oct 07 12:36:46 crc kubenswrapper[4854]: I1007 12:36:46.612623 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qw7pl" Oct 07 12:36:46 crc kubenswrapper[4854]: I1007 12:36:46.680897 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qw7pl"] Oct 07 12:36:46 crc kubenswrapper[4854]: I1007 12:36:46.719509 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b3e8707-9e37-4e38-b5f9-c4dc13ad9c6d-utilities\") pod \"certified-operators-qw7pl\" (UID: \"4b3e8707-9e37-4e38-b5f9-c4dc13ad9c6d\") " pod="openshift-marketplace/certified-operators-qw7pl" Oct 07 12:36:46 crc kubenswrapper[4854]: I1007 12:36:46.719782 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drrbk\" (UniqueName: \"kubernetes.io/projected/4b3e8707-9e37-4e38-b5f9-c4dc13ad9c6d-kube-api-access-drrbk\") pod \"certified-operators-qw7pl\" (UID: \"4b3e8707-9e37-4e38-b5f9-c4dc13ad9c6d\") " pod="openshift-marketplace/certified-operators-qw7pl" Oct 07 12:36:46 crc kubenswrapper[4854]: I1007 12:36:46.719914 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b3e8707-9e37-4e38-b5f9-c4dc13ad9c6d-catalog-content\") pod \"certified-operators-qw7pl\" (UID: \"4b3e8707-9e37-4e38-b5f9-c4dc13ad9c6d\") " pod="openshift-marketplace/certified-operators-qw7pl" Oct 07 12:36:46 crc kubenswrapper[4854]: I1007 12:36:46.820992 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b3e8707-9e37-4e38-b5f9-c4dc13ad9c6d-utilities\") pod \"certified-operators-qw7pl\" (UID: \"4b3e8707-9e37-4e38-b5f9-c4dc13ad9c6d\") " pod="openshift-marketplace/certified-operators-qw7pl" Oct 07 12:36:46 crc kubenswrapper[4854]: I1007 12:36:46.821403 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drrbk\" (UniqueName: \"kubernetes.io/projected/4b3e8707-9e37-4e38-b5f9-c4dc13ad9c6d-kube-api-access-drrbk\") pod \"certified-operators-qw7pl\" (UID: \"4b3e8707-9e37-4e38-b5f9-c4dc13ad9c6d\") " pod="openshift-marketplace/certified-operators-qw7pl" Oct 07 12:36:46 crc kubenswrapper[4854]: I1007 12:36:46.821587 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b3e8707-9e37-4e38-b5f9-c4dc13ad9c6d-catalog-content\") pod \"certified-operators-qw7pl\" (UID: \"4b3e8707-9e37-4e38-b5f9-c4dc13ad9c6d\") " pod="openshift-marketplace/certified-operators-qw7pl" Oct 07 12:36:46 crc kubenswrapper[4854]: I1007 12:36:46.821600 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b3e8707-9e37-4e38-b5f9-c4dc13ad9c6d-utilities\") pod \"certified-operators-qw7pl\" (UID: \"4b3e8707-9e37-4e38-b5f9-c4dc13ad9c6d\") " pod="openshift-marketplace/certified-operators-qw7pl" Oct 07 12:36:46 crc kubenswrapper[4854]: I1007 12:36:46.822775 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b3e8707-9e37-4e38-b5f9-c4dc13ad9c6d-catalog-content\") pod \"certified-operators-qw7pl\" (UID: \"4b3e8707-9e37-4e38-b5f9-c4dc13ad9c6d\") " pod="openshift-marketplace/certified-operators-qw7pl" Oct 07 12:36:46 crc kubenswrapper[4854]: I1007 12:36:46.843873 4854 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-drrbk\" (UniqueName: \"kubernetes.io/projected/4b3e8707-9e37-4e38-b5f9-c4dc13ad9c6d-kube-api-access-drrbk\") pod \"certified-operators-qw7pl\" (UID: \"4b3e8707-9e37-4e38-b5f9-c4dc13ad9c6d\") " pod="openshift-marketplace/certified-operators-qw7pl" Oct 07 12:36:46 crc kubenswrapper[4854]: I1007 12:36:46.931285 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qw7pl" Oct 07 12:36:47 crc kubenswrapper[4854]: I1007 12:36:47.204307 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qw7pl"] Oct 07 12:36:47 crc kubenswrapper[4854]: I1007 12:36:47.315563 4854 generic.go:334] "Generic (PLEG): container finished" podID="e36001a5-79ba-4f9b-9e5a-012494f505f7" containerID="a713d1269711c4f76217eace45a8d873423bd0f6d7ff34c8975b34f6f68a13ae" exitCode=0 Oct 07 12:36:47 crc kubenswrapper[4854]: I1007 12:36:47.315656 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9d94j" event={"ID":"e36001a5-79ba-4f9b-9e5a-012494f505f7","Type":"ContainerDied","Data":"a713d1269711c4f76217eace45a8d873423bd0f6d7ff34c8975b34f6f68a13ae"} Oct 07 12:36:47 crc kubenswrapper[4854]: I1007 12:36:47.317653 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-dsv2d" event={"ID":"7dd0aa9d-7e88-4764-8cea-332b935e11ea","Type":"ContainerStarted","Data":"f233ea968c777eeb183b3592eecf379da7e80b6ed9efe3bede34e16baa9d363a"} Oct 07 12:36:47 crc kubenswrapper[4854]: I1007 12:36:47.317840 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-dsv2d" Oct 07 12:36:47 crc kubenswrapper[4854]: I1007 12:36:47.319231 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qw7pl" event={"ID":"4b3e8707-9e37-4e38-b5f9-c4dc13ad9c6d","Type":"ContainerStarted","Data":"d2628fe5c4f2adc7b228207eb168c0b2c9c6830ab69eff4ecf47ec8019e341bc"} Oct 07 12:36:47 crc kubenswrapper[4854]: I1007 12:36:47.381603 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-dsv2d" podStartSLOduration=2.414771087 podStartE2EDuration="10.381573977s" podCreationTimestamp="2025-10-07 12:36:37 +0000 UTC" firstStartedPulling="2025-10-07 12:36:38.240696089 +0000 UTC m=+714.228528354" lastFinishedPulling="2025-10-07 12:36:46.207498949 +0000 UTC m=+722.195331244" observedRunningTime="2025-10-07 12:36:47.377910438 +0000 UTC m=+723.365742693" watchObservedRunningTime="2025-10-07 12:36:47.381573977 +0000 UTC m=+723.369406232" Oct 07 12:36:48 crc kubenswrapper[4854]: I1007 12:36:48.334403 4854 generic.go:334] "Generic (PLEG): container finished" podID="4b3e8707-9e37-4e38-b5f9-c4dc13ad9c6d" containerID="a020a500bf32113a5c71ca4c8c34caaf69321b4ad6eb126fa41aa1abea68d157" exitCode=0 Oct 07 12:36:48 crc kubenswrapper[4854]: I1007 12:36:48.334529 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qw7pl" event={"ID":"4b3e8707-9e37-4e38-b5f9-c4dc13ad9c6d","Type":"ContainerDied","Data":"a020a500bf32113a5c71ca4c8c34caaf69321b4ad6eb126fa41aa1abea68d157"} Oct 07 12:36:48 crc kubenswrapper[4854]: I1007 12:36:48.342214 4854 generic.go:334] "Generic (PLEG): container finished" podID="e36001a5-79ba-4f9b-9e5a-012494f505f7" containerID="0b5cb37165ef94962c1de2429bd95ecb31de190bf3e1fcd6b97de72220d36037" exitCode=0 Oct 07 
12:36:48 crc kubenswrapper[4854]: I1007 12:36:48.342321 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9d94j" event={"ID":"e36001a5-79ba-4f9b-9e5a-012494f505f7","Type":"ContainerDied","Data":"0b5cb37165ef94962c1de2429bd95ecb31de190bf3e1fcd6b97de72220d36037"} Oct 07 12:36:49 crc kubenswrapper[4854]: I1007 12:36:49.012632 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-w4gkv" Oct 07 12:36:49 crc kubenswrapper[4854]: I1007 12:36:49.350735 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qw7pl" event={"ID":"4b3e8707-9e37-4e38-b5f9-c4dc13ad9c6d","Type":"ContainerStarted","Data":"f176660dde63c5b01ceca2f877a4adc51c69e4112db2a4d298d89d79b091e378"} Oct 07 12:36:49 crc kubenswrapper[4854]: I1007 12:36:49.356640 4854 generic.go:334] "Generic (PLEG): container finished" podID="e36001a5-79ba-4f9b-9e5a-012494f505f7" containerID="55e960bc8023950bd3719aa378c41cfb2d2d35bc83e7b638f86b4b8769c1210d" exitCode=0 Oct 07 12:36:49 crc kubenswrapper[4854]: I1007 12:36:49.356687 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9d94j" event={"ID":"e36001a5-79ba-4f9b-9e5a-012494f505f7","Type":"ContainerDied","Data":"55e960bc8023950bd3719aa378c41cfb2d2d35bc83e7b638f86b4b8769c1210d"} Oct 07 12:36:50 crc kubenswrapper[4854]: I1007 12:36:50.201466 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-x558d"] Oct 07 12:36:50 crc kubenswrapper[4854]: I1007 12:36:50.202934 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x558d" Oct 07 12:36:50 crc kubenswrapper[4854]: I1007 12:36:50.216721 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x558d"] Oct 07 12:36:50 crc kubenswrapper[4854]: I1007 12:36:50.364944 4854 generic.go:334] "Generic (PLEG): container finished" podID="4b3e8707-9e37-4e38-b5f9-c4dc13ad9c6d" containerID="f176660dde63c5b01ceca2f877a4adc51c69e4112db2a4d298d89d79b091e378" exitCode=0 Oct 07 12:36:50 crc kubenswrapper[4854]: I1007 12:36:50.365064 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qw7pl" event={"ID":"4b3e8707-9e37-4e38-b5f9-c4dc13ad9c6d","Type":"ContainerDied","Data":"f176660dde63c5b01ceca2f877a4adc51c69e4112db2a4d298d89d79b091e378"} Oct 07 12:36:50 crc kubenswrapper[4854]: I1007 12:36:50.368558 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9d94j" event={"ID":"e36001a5-79ba-4f9b-9e5a-012494f505f7","Type":"ContainerStarted","Data":"f016dda33173282730ebe2b24dc3a0d7b4b632c600e3f9f0777281ee09606d4b"} Oct 07 12:36:50 crc kubenswrapper[4854]: I1007 12:36:50.381596 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bkbf\" (UniqueName: \"kubernetes.io/projected/5d461d69-005a-4ae7-954d-c39d833d8021-kube-api-access-2bkbf\") pod \"redhat-marketplace-x558d\" (UID: \"5d461d69-005a-4ae7-954d-c39d833d8021\") " pod="openshift-marketplace/redhat-marketplace-x558d" Oct 07 12:36:50 crc kubenswrapper[4854]: I1007 12:36:50.381656 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d461d69-005a-4ae7-954d-c39d833d8021-catalog-content\") pod \"redhat-marketplace-x558d\" (UID: \"5d461d69-005a-4ae7-954d-c39d833d8021\") " 
pod="openshift-marketplace/redhat-marketplace-x558d" Oct 07 12:36:50 crc kubenswrapper[4854]: I1007 12:36:50.381830 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d461d69-005a-4ae7-954d-c39d833d8021-utilities\") pod \"redhat-marketplace-x558d\" (UID: \"5d461d69-005a-4ae7-954d-c39d833d8021\") " pod="openshift-marketplace/redhat-marketplace-x558d" Oct 07 12:36:50 crc kubenswrapper[4854]: I1007 12:36:50.483334 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d461d69-005a-4ae7-954d-c39d833d8021-catalog-content\") pod \"redhat-marketplace-x558d\" (UID: \"5d461d69-005a-4ae7-954d-c39d833d8021\") " pod="openshift-marketplace/redhat-marketplace-x558d" Oct 07 12:36:50 crc kubenswrapper[4854]: I1007 12:36:50.483516 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d461d69-005a-4ae7-954d-c39d833d8021-utilities\") pod \"redhat-marketplace-x558d\" (UID: \"5d461d69-005a-4ae7-954d-c39d833d8021\") " pod="openshift-marketplace/redhat-marketplace-x558d" Oct 07 12:36:50 crc kubenswrapper[4854]: I1007 12:36:50.483578 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bkbf\" (UniqueName: \"kubernetes.io/projected/5d461d69-005a-4ae7-954d-c39d833d8021-kube-api-access-2bkbf\") pod \"redhat-marketplace-x558d\" (UID: \"5d461d69-005a-4ae7-954d-c39d833d8021\") " pod="openshift-marketplace/redhat-marketplace-x558d" Oct 07 12:36:50 crc kubenswrapper[4854]: I1007 12:36:50.483964 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d461d69-005a-4ae7-954d-c39d833d8021-utilities\") pod \"redhat-marketplace-x558d\" (UID: \"5d461d69-005a-4ae7-954d-c39d833d8021\") " pod="openshift-marketplace/redhat-marketplace-x558d" Oct 07 12:36:50 crc kubenswrapper[4854]: I1007 12:36:50.484054 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d461d69-005a-4ae7-954d-c39d833d8021-catalog-content\") pod \"redhat-marketplace-x558d\" (UID: \"5d461d69-005a-4ae7-954d-c39d833d8021\") " pod="openshift-marketplace/redhat-marketplace-x558d" Oct 07 12:36:50 crc kubenswrapper[4854]: I1007 12:36:50.507294 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bkbf\" (UniqueName: \"kubernetes.io/projected/5d461d69-005a-4ae7-954d-c39d833d8021-kube-api-access-2bkbf\") pod \"redhat-marketplace-x558d\" (UID: \"5d461d69-005a-4ae7-954d-c39d833d8021\") " pod="openshift-marketplace/redhat-marketplace-x558d" Oct 07 12:36:50 crc kubenswrapper[4854]: I1007 12:36:50.550849 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x558d" Oct 07 12:36:51 crc kubenswrapper[4854]: I1007 12:36:51.018798 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x558d"] Oct 07 12:36:51 crc kubenswrapper[4854]: I1007 12:36:51.065643 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69kdrmz"] Oct 07 12:36:51 crc kubenswrapper[4854]: I1007 12:36:51.067306 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69kdrmz" Oct 07 12:36:51 crc kubenswrapper[4854]: I1007 12:36:51.069084 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 07 12:36:51 crc kubenswrapper[4854]: I1007 12:36:51.095074 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69kdrmz"] Oct 07 12:36:51 crc kubenswrapper[4854]: I1007 12:36:51.194103 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bz7k\" (UniqueName: \"kubernetes.io/projected/718dcecb-56b4-49cc-8992-d8d4c6447602-kube-api-access-7bz7k\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69kdrmz\" (UID: \"718dcecb-56b4-49cc-8992-d8d4c6447602\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69kdrmz" Oct 07 12:36:51 crc kubenswrapper[4854]: I1007 12:36:51.194471 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/718dcecb-56b4-49cc-8992-d8d4c6447602-util\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69kdrmz\" (UID: \"718dcecb-56b4-49cc-8992-d8d4c6447602\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69kdrmz" Oct 07 12:36:51 crc kubenswrapper[4854]: I1007 12:36:51.194610 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/718dcecb-56b4-49cc-8992-d8d4c6447602-bundle\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69kdrmz\" (UID: \"718dcecb-56b4-49cc-8992-d8d4c6447602\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69kdrmz" Oct 07 12:36:51 crc kubenswrapper[4854]: I1007 12:36:51.296220 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bz7k\" (UniqueName: \"kubernetes.io/projected/718dcecb-56b4-49cc-8992-d8d4c6447602-kube-api-access-7bz7k\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69kdrmz\" (UID: \"718dcecb-56b4-49cc-8992-d8d4c6447602\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69kdrmz" Oct 07 12:36:51 crc kubenswrapper[4854]: I1007 12:36:51.296402 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/718dcecb-56b4-49cc-8992-d8d4c6447602-util\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69kdrmz\" (UID: \"718dcecb-56b4-49cc-8992-d8d4c6447602\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69kdrmz" Oct 07 12:36:51 crc kubenswrapper[4854]: I1007 12:36:51.296534 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/718dcecb-56b4-49cc-8992-d8d4c6447602-bundle\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69kdrmz\" (UID: \"718dcecb-56b4-49cc-8992-d8d4c6447602\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69kdrmz" Oct 07 12:36:51 crc kubenswrapper[4854]: I1007 12:36:51.297213 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/718dcecb-56b4-49cc-8992-d8d4c6447602-util\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69kdrmz\" (UID: \"718dcecb-56b4-49cc-8992-d8d4c6447602\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69kdrmz" Oct 07 12:36:51 crc kubenswrapper[4854]: I1007 12:36:51.297310 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/718dcecb-56b4-49cc-8992-d8d4c6447602-bundle\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69kdrmz\" (UID: \"718dcecb-56b4-49cc-8992-d8d4c6447602\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69kdrmz" Oct 07 12:36:51 crc kubenswrapper[4854]: I1007 12:36:51.336962 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bz7k\" (UniqueName: \"kubernetes.io/projected/718dcecb-56b4-49cc-8992-d8d4c6447602-kube-api-access-7bz7k\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69kdrmz\" (UID: \"718dcecb-56b4-49cc-8992-d8d4c6447602\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69kdrmz" Oct 07 12:36:51 crc kubenswrapper[4854]: I1007 12:36:51.388870 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9d94j" event={"ID":"e36001a5-79ba-4f9b-9e5a-012494f505f7","Type":"ContainerStarted","Data":"7c5f1f21049feb6f4982c1dfdf43397535887a904b59a56ebee61c8fa1a729fd"} Oct 07 12:36:51 crc kubenswrapper[4854]: I1007 12:36:51.388926 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9d94j" event={"ID":"e36001a5-79ba-4f9b-9e5a-012494f505f7","Type":"ContainerStarted","Data":"47e49db159e6fe87c30a8f568fec7f37aa614b415968db9858c22554095d3a43"} Oct 07 12:36:51 crc kubenswrapper[4854]: I1007 12:36:51.388940 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9d94j" event={"ID":"e36001a5-79ba-4f9b-9e5a-012494f505f7","Type":"ContainerStarted","Data":"4dd62e7b4df8c7cc2c1d2ca9703433a8820aad7e60fefb3dba6bf98b46151a69"} Oct 07 12:36:51 crc kubenswrapper[4854]: I1007 12:36:51.390232 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x558d" event={"ID":"5d461d69-005a-4ae7-954d-c39d833d8021","Type":"ContainerStarted","Data":"7399078322f7b3a22f739eff01a5f033faed0d46c4e421e3704eff476f364d8a"} Oct 07 12:36:51 crc kubenswrapper[4854]: I1007 12:36:51.390255 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x558d" event={"ID":"5d461d69-005a-4ae7-954d-c39d833d8021","Type":"ContainerStarted","Data":"17ea7e60e0f5d91818670d3fe1afdb7bc70c19ce1b64616d8bf88409a76d4b83"} Oct 07 12:36:51 crc kubenswrapper[4854]: I1007 12:36:51.423428 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qw7pl" event={"ID":"4b3e8707-9e37-4e38-b5f9-c4dc13ad9c6d","Type":"ContainerStarted","Data":"3fa57c19b9af7019e6d833921236ec0d9f988cf46a6874ec28c6e6935d68a1ab"} Oct 07 12:36:51 crc kubenswrapper[4854]: I1007 12:36:51.607018 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69kdrmz" Oct 07 12:36:52 crc kubenswrapper[4854]: I1007 12:36:52.046661 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qw7pl" podStartSLOduration=3.209926633 podStartE2EDuration="6.046637785s" podCreationTimestamp="2025-10-07 12:36:46 +0000 UTC" firstStartedPulling="2025-10-07 12:36:48.340118003 +0000 UTC m=+724.327950308" lastFinishedPulling="2025-10-07 12:36:51.176829215 +0000 UTC m=+727.164661460" observedRunningTime="2025-10-07 12:36:51.457697039 +0000 UTC m=+727.445529294" watchObservedRunningTime="2025-10-07 12:36:52.046637785 +0000 UTC m=+728.034470040" Oct 07 12:36:52 crc kubenswrapper[4854]: I1007 12:36:52.048468 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69kdrmz"] Oct 07 12:36:52 crc kubenswrapper[4854]: W1007 12:36:52.055879 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod718dcecb_56b4_49cc_8992_d8d4c6447602.slice/crio-c910cdb398765477917c1a3f04fe23c66066bb9281f9a69b6da7d3118b6a094f WatchSource:0}: Error finding container c910cdb398765477917c1a3f04fe23c66066bb9281f9a69b6da7d3118b6a094f: Status 404 returned error can't find the container with id c910cdb398765477917c1a3f04fe23c66066bb9281f9a69b6da7d3118b6a094f Oct 07 12:36:52 crc kubenswrapper[4854]: I1007 12:36:52.431938 4854 generic.go:334] "Generic (PLEG): container finished" podID="718dcecb-56b4-49cc-8992-d8d4c6447602" containerID="584ec5b906dadcd9defe6041d1dd65cea27e7b38413b5ce3eedf3f6fcdd6d896" exitCode=0 Oct 07 12:36:52 crc kubenswrapper[4854]: I1007 12:36:52.432007 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69kdrmz" event={"ID":"718dcecb-56b4-49cc-8992-d8d4c6447602","Type":"ContainerDied","Data":"584ec5b906dadcd9defe6041d1dd65cea27e7b38413b5ce3eedf3f6fcdd6d896"} Oct 07 12:36:52 crc kubenswrapper[4854]: I1007 12:36:52.432039 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69kdrmz" event={"ID":"718dcecb-56b4-49cc-8992-d8d4c6447602","Type":"ContainerStarted","Data":"c910cdb398765477917c1a3f04fe23c66066bb9281f9a69b6da7d3118b6a094f"} Oct 07 12:36:52 crc kubenswrapper[4854]: I1007 12:36:52.443877 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9d94j" event={"ID":"e36001a5-79ba-4f9b-9e5a-012494f505f7","Type":"ContainerStarted","Data":"b961c84f8df13c09dab9da96db4d33c225e1edd88c63d9b3a49281dc64e9c2d3"} Oct 07 12:36:52 crc kubenswrapper[4854]: I1007 12:36:52.443920 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9d94j" event={"ID":"e36001a5-79ba-4f9b-9e5a-012494f505f7","Type":"ContainerStarted","Data":"7994012868d7dbce35813144e7d506a9db135961ca438eb6e627ff75eefbfb46"} Oct 07 12:36:52 crc kubenswrapper[4854]: I1007 12:36:52.444438 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-9d94j" Oct 07 12:36:52 crc kubenswrapper[4854]: I1007 12:36:52.447837 4854 generic.go:334] "Generic (PLEG): container finished" podID="5d461d69-005a-4ae7-954d-c39d833d8021" containerID="7399078322f7b3a22f739eff01a5f033faed0d46c4e421e3704eff476f364d8a" exitCode=0 Oct 07 12:36:52 crc kubenswrapper[4854]: I1007 
12:36:52.448042 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x558d" event={"ID":"5d461d69-005a-4ae7-954d-c39d833d8021","Type":"ContainerDied","Data":"7399078322f7b3a22f739eff01a5f033faed0d46c4e421e3704eff476f364d8a"} Oct 07 12:36:52 crc kubenswrapper[4854]: I1007 12:36:52.487759 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-9d94j" podStartSLOduration=7.910615526 podStartE2EDuration="16.487731969s" podCreationTimestamp="2025-10-07 12:36:36 +0000 UTC" firstStartedPulling="2025-10-07 12:36:37.591083479 +0000 UTC m=+713.578915734" lastFinishedPulling="2025-10-07 12:36:46.168199882 +0000 UTC m=+722.156032177" observedRunningTime="2025-10-07 12:36:52.484802402 +0000 UTC m=+728.472634667" watchObservedRunningTime="2025-10-07 12:36:52.487731969 +0000 UTC m=+728.475564234" Oct 07 12:36:53 crc kubenswrapper[4854]: I1007 12:36:53.458864 4854 generic.go:334] "Generic (PLEG): container finished" podID="5d461d69-005a-4ae7-954d-c39d833d8021" containerID="a562bf9dafac0a4d801a1facbc15c2e9e3e5861bdda4e3492f8631fc2b6410d1" exitCode=0 Oct 07 12:36:53 crc kubenswrapper[4854]: I1007 12:36:53.459028 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x558d" event={"ID":"5d461d69-005a-4ae7-954d-c39d833d8021","Type":"ContainerDied","Data":"a562bf9dafac0a4d801a1facbc15c2e9e3e5861bdda4e3492f8631fc2b6410d1"} Oct 07 12:36:54 crc kubenswrapper[4854]: I1007 12:36:54.089272 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-crq2z" Oct 07 12:36:56 crc kubenswrapper[4854]: I1007 12:36:56.932162 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qw7pl" Oct 07 12:36:56 crc kubenswrapper[4854]: I1007 12:36:56.932589 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qw7pl" Oct 07 12:36:56 crc kubenswrapper[4854]: I1007 12:36:56.983730 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qw7pl" Oct 07 12:36:57 crc kubenswrapper[4854]: I1007 12:36:57.435723 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-9d94j" Oct 07 12:36:57 crc kubenswrapper[4854]: I1007 12:36:57.496864 4854 generic.go:334] "Generic (PLEG): container finished" podID="718dcecb-56b4-49cc-8992-d8d4c6447602" containerID="55c211c22cd98654b28cf8f290bd36ef1a9954475b4df3fd5f5a17edb02ef558" exitCode=0 Oct 07 12:36:57 crc kubenswrapper[4854]: I1007 12:36:57.496990 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69kdrmz" event={"ID":"718dcecb-56b4-49cc-8992-d8d4c6447602","Type":"ContainerDied","Data":"55c211c22cd98654b28cf8f290bd36ef1a9954475b4df3fd5f5a17edb02ef558"} Oct 07 12:36:57 crc kubenswrapper[4854]: I1007 12:36:57.500121 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x558d" event={"ID":"5d461d69-005a-4ae7-954d-c39d833d8021","Type":"ContainerStarted","Data":"2e179bff55df8268c4a5717ccd5c6759abad8348c12a85e53c18a035e0867dd8"} Oct 07 12:36:57 crc kubenswrapper[4854]: I1007 12:36:57.512084 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-68d546b9d8-8szdq" Oct 07 12:36:57 crc 
kubenswrapper[4854]: I1007 12:36:57.514164 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-9d94j" Oct 07 12:36:57 crc kubenswrapper[4854]: I1007 12:36:57.558977 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qw7pl" Oct 07 12:36:57 crc kubenswrapper[4854]: I1007 12:36:57.575309 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-x558d" podStartSLOduration=2.276976797 podStartE2EDuration="7.575280278s" podCreationTimestamp="2025-10-07 12:36:50 +0000 UTC" firstStartedPulling="2025-10-07 12:36:51.396348136 +0000 UTC m=+727.384180391" lastFinishedPulling="2025-10-07 12:36:56.694651607 +0000 UTC m=+732.682483872" observedRunningTime="2025-10-07 12:36:57.574033501 +0000 UTC m=+733.561865796" watchObservedRunningTime="2025-10-07 12:36:57.575280278 +0000 UTC m=+733.563112533" Oct 07 12:36:57 crc kubenswrapper[4854]: I1007 12:36:57.815059 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-dsv2d" Oct 07 12:36:59 crc kubenswrapper[4854]: I1007 12:36:59.520109 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69kdrmz" event={"ID":"718dcecb-56b4-49cc-8992-d8d4c6447602","Type":"ContainerStarted","Data":"1a7e1271f4a657972c15173d8952a3987cfbdaa323eeee40624967ec6c6892eb"} Oct 07 12:36:59 crc kubenswrapper[4854]: I1007 12:36:59.551280 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69kdrmz" podStartSLOduration=4.290333347 podStartE2EDuration="8.551247969s" podCreationTimestamp="2025-10-07 12:36:51 +0000 UTC" firstStartedPulling="2025-10-07 12:36:52.433861318 +0000 UTC m=+728.421693573" lastFinishedPulling="2025-10-07 12:36:56.69477594 +0000 UTC m=+732.682608195" observedRunningTime="2025-10-07 12:36:59.546974422 +0000 UTC m=+735.534806687" watchObservedRunningTime="2025-10-07 12:36:59.551247969 +0000 UTC m=+735.539080224" Oct 07 12:37:00 crc kubenswrapper[4854]: I1007 12:37:00.196131 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qw7pl"] Oct 07 12:37:00 crc kubenswrapper[4854]: I1007 12:37:00.197672 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qw7pl" podUID="4b3e8707-9e37-4e38-b5f9-c4dc13ad9c6d" containerName="registry-server" containerID="cri-o://3fa57c19b9af7019e6d833921236ec0d9f988cf46a6874ec28c6e6935d68a1ab" gracePeriod=2 Oct 07 12:37:00 crc kubenswrapper[4854]: I1007 12:37:00.530595 4854 generic.go:334] "Generic (PLEG): container finished" podID="718dcecb-56b4-49cc-8992-d8d4c6447602" containerID="1a7e1271f4a657972c15173d8952a3987cfbdaa323eeee40624967ec6c6892eb" exitCode=0 Oct 07 12:37:00 crc kubenswrapper[4854]: I1007 12:37:00.530658 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69kdrmz" event={"ID":"718dcecb-56b4-49cc-8992-d8d4c6447602","Type":"ContainerDied","Data":"1a7e1271f4a657972c15173d8952a3987cfbdaa323eeee40624967ec6c6892eb"} Oct 07 12:37:00 crc kubenswrapper[4854]: I1007 12:37:00.551520 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-x558d" Oct 07 
12:37:00 crc kubenswrapper[4854]: I1007 12:37:00.551781 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-x558d" Oct 07 12:37:00 crc kubenswrapper[4854]: I1007 12:37:00.592695 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-crq2z"] Oct 07 12:37:00 crc kubenswrapper[4854]: I1007 12:37:00.593128 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-crq2z" podUID="fec307e5-ef1c-4c44-8524-204f8d7ae452" containerName="registry-server" containerID="cri-o://c8c17347e45ceb9006c38b6bdc630e2b564eeaeeeaf17ccf8e4e39a075168dbe" gracePeriod=2 Oct 07 12:37:00 crc kubenswrapper[4854]: I1007 12:37:00.612070 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-x558d" Oct 07 12:37:01 crc kubenswrapper[4854]: I1007 12:37:01.927840 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69kdrmz" Oct 07 12:37:01 crc kubenswrapper[4854]: I1007 12:37:01.977445 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bz7k\" (UniqueName: \"kubernetes.io/projected/718dcecb-56b4-49cc-8992-d8d4c6447602-kube-api-access-7bz7k\") pod \"718dcecb-56b4-49cc-8992-d8d4c6447602\" (UID: \"718dcecb-56b4-49cc-8992-d8d4c6447602\") " Oct 07 12:37:01 crc kubenswrapper[4854]: I1007 12:37:01.977959 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/718dcecb-56b4-49cc-8992-d8d4c6447602-bundle\") pod \"718dcecb-56b4-49cc-8992-d8d4c6447602\" (UID: \"718dcecb-56b4-49cc-8992-d8d4c6447602\") " Oct 07 12:37:01 crc kubenswrapper[4854]: I1007 12:37:01.978084 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/718dcecb-56b4-49cc-8992-d8d4c6447602-util\") pod \"718dcecb-56b4-49cc-8992-d8d4c6447602\" (UID: \"718dcecb-56b4-49cc-8992-d8d4c6447602\") " Oct 07 12:37:01 crc kubenswrapper[4854]: I1007 12:37:01.982956 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/718dcecb-56b4-49cc-8992-d8d4c6447602-bundle" (OuterVolumeSpecName: "bundle") pod "718dcecb-56b4-49cc-8992-d8d4c6447602" (UID: "718dcecb-56b4-49cc-8992-d8d4c6447602"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:37:01 crc kubenswrapper[4854]: I1007 12:37:01.997330 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/718dcecb-56b4-49cc-8992-d8d4c6447602-util" (OuterVolumeSpecName: "util") pod "718dcecb-56b4-49cc-8992-d8d4c6447602" (UID: "718dcecb-56b4-49cc-8992-d8d4c6447602"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:37:02 crc kubenswrapper[4854]: I1007 12:37:02.001479 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/718dcecb-56b4-49cc-8992-d8d4c6447602-kube-api-access-7bz7k" (OuterVolumeSpecName: "kube-api-access-7bz7k") pod "718dcecb-56b4-49cc-8992-d8d4c6447602" (UID: "718dcecb-56b4-49cc-8992-d8d4c6447602"). InnerVolumeSpecName "kube-api-access-7bz7k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:37:02 crc kubenswrapper[4854]: I1007 12:37:02.081916 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bz7k\" (UniqueName: \"kubernetes.io/projected/718dcecb-56b4-49cc-8992-d8d4c6447602-kube-api-access-7bz7k\") on node \"crc\" DevicePath \"\"" Oct 07 12:37:02 crc kubenswrapper[4854]: I1007 12:37:02.081968 4854 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/718dcecb-56b4-49cc-8992-d8d4c6447602-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:37:02 crc kubenswrapper[4854]: I1007 12:37:02.081980 4854 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/718dcecb-56b4-49cc-8992-d8d4c6447602-util\") on node \"crc\" DevicePath \"\"" Oct 07 12:37:02 crc kubenswrapper[4854]: I1007 12:37:02.549306 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69kdrmz" Oct 07 12:37:02 crc kubenswrapper[4854]: I1007 12:37:02.549180 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69kdrmz" event={"ID":"718dcecb-56b4-49cc-8992-d8d4c6447602","Type":"ContainerDied","Data":"c910cdb398765477917c1a3f04fe23c66066bb9281f9a69b6da7d3118b6a094f"} Oct 07 12:37:02 crc kubenswrapper[4854]: I1007 12:37:02.549452 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c910cdb398765477917c1a3f04fe23c66066bb9281f9a69b6da7d3118b6a094f" Oct 07 12:37:02 crc kubenswrapper[4854]: I1007 12:37:02.551494 4854 generic.go:334] "Generic (PLEG): container finished" podID="fec307e5-ef1c-4c44-8524-204f8d7ae452" containerID="c8c17347e45ceb9006c38b6bdc630e2b564eeaeeeaf17ccf8e4e39a075168dbe" exitCode=0 Oct 07 12:37:02 crc kubenswrapper[4854]: I1007 12:37:02.551552 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-crq2z" event={"ID":"fec307e5-ef1c-4c44-8524-204f8d7ae452","Type":"ContainerDied","Data":"c8c17347e45ceb9006c38b6bdc630e2b564eeaeeeaf17ccf8e4e39a075168dbe"} Oct 07 12:37:02 crc kubenswrapper[4854]: I1007 12:37:02.554012 4854 generic.go:334] "Generic (PLEG): container finished" podID="4b3e8707-9e37-4e38-b5f9-c4dc13ad9c6d" containerID="3fa57c19b9af7019e6d833921236ec0d9f988cf46a6874ec28c6e6935d68a1ab" exitCode=0 Oct 07 12:37:02 crc kubenswrapper[4854]: I1007 12:37:02.554100 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qw7pl" event={"ID":"4b3e8707-9e37-4e38-b5f9-c4dc13ad9c6d","Type":"ContainerDied","Data":"3fa57c19b9af7019e6d833921236ec0d9f988cf46a6874ec28c6e6935d68a1ab"} Oct 07 12:37:02 crc kubenswrapper[4854]: I1007 12:37:02.611219 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-x558d" Oct 07 12:37:03 crc kubenswrapper[4854]: I1007 12:37:03.369090 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qw7pl" Oct 07 12:37:03 crc kubenswrapper[4854]: I1007 12:37:03.376549 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-crq2z" Oct 07 12:37:03 crc kubenswrapper[4854]: I1007 12:37:03.406524 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zt4f6\" (UniqueName: \"kubernetes.io/projected/fec307e5-ef1c-4c44-8524-204f8d7ae452-kube-api-access-zt4f6\") pod \"fec307e5-ef1c-4c44-8524-204f8d7ae452\" (UID: \"fec307e5-ef1c-4c44-8524-204f8d7ae452\") " Oct 07 12:37:03 crc kubenswrapper[4854]: I1007 12:37:03.406698 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b3e8707-9e37-4e38-b5f9-c4dc13ad9c6d-catalog-content\") pod \"4b3e8707-9e37-4e38-b5f9-c4dc13ad9c6d\" (UID: \"4b3e8707-9e37-4e38-b5f9-c4dc13ad9c6d\") " Oct 07 12:37:03 crc kubenswrapper[4854]: I1007 12:37:03.406809 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b3e8707-9e37-4e38-b5f9-c4dc13ad9c6d-utilities\") pod \"4b3e8707-9e37-4e38-b5f9-c4dc13ad9c6d\" (UID: \"4b3e8707-9e37-4e38-b5f9-c4dc13ad9c6d\") " Oct 07 12:37:03 crc kubenswrapper[4854]: I1007 12:37:03.406888 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fec307e5-ef1c-4c44-8524-204f8d7ae452-utilities\") pod \"fec307e5-ef1c-4c44-8524-204f8d7ae452\" (UID: \"fec307e5-ef1c-4c44-8524-204f8d7ae452\") " Oct 07 12:37:03 crc kubenswrapper[4854]: I1007 12:37:03.406940 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fec307e5-ef1c-4c44-8524-204f8d7ae452-catalog-content\") pod \"fec307e5-ef1c-4c44-8524-204f8d7ae452\" (UID: \"fec307e5-ef1c-4c44-8524-204f8d7ae452\") " Oct 07 12:37:03 crc kubenswrapper[4854]: I1007 12:37:03.406972 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drrbk\" (UniqueName: \"kubernetes.io/projected/4b3e8707-9e37-4e38-b5f9-c4dc13ad9c6d-kube-api-access-drrbk\") pod \"4b3e8707-9e37-4e38-b5f9-c4dc13ad9c6d\" (UID: \"4b3e8707-9e37-4e38-b5f9-c4dc13ad9c6d\") " Oct 07 12:37:03 crc kubenswrapper[4854]: I1007 12:37:03.408500 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fec307e5-ef1c-4c44-8524-204f8d7ae452-utilities" (OuterVolumeSpecName: "utilities") pod "fec307e5-ef1c-4c44-8524-204f8d7ae452" (UID: "fec307e5-ef1c-4c44-8524-204f8d7ae452"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:37:03 crc kubenswrapper[4854]: I1007 12:37:03.408615 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b3e8707-9e37-4e38-b5f9-c4dc13ad9c6d-utilities" (OuterVolumeSpecName: "utilities") pod "4b3e8707-9e37-4e38-b5f9-c4dc13ad9c6d" (UID: "4b3e8707-9e37-4e38-b5f9-c4dc13ad9c6d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:37:03 crc kubenswrapper[4854]: I1007 12:37:03.418470 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fec307e5-ef1c-4c44-8524-204f8d7ae452-kube-api-access-zt4f6" (OuterVolumeSpecName: "kube-api-access-zt4f6") pod "fec307e5-ef1c-4c44-8524-204f8d7ae452" (UID: "fec307e5-ef1c-4c44-8524-204f8d7ae452"). InnerVolumeSpecName "kube-api-access-zt4f6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:37:03 crc kubenswrapper[4854]: I1007 12:37:03.421362 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b3e8707-9e37-4e38-b5f9-c4dc13ad9c6d-kube-api-access-drrbk" (OuterVolumeSpecName: "kube-api-access-drrbk") pod "4b3e8707-9e37-4e38-b5f9-c4dc13ad9c6d" (UID: "4b3e8707-9e37-4e38-b5f9-c4dc13ad9c6d"). InnerVolumeSpecName "kube-api-access-drrbk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:37:03 crc kubenswrapper[4854]: I1007 12:37:03.486506 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b3e8707-9e37-4e38-b5f9-c4dc13ad9c6d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4b3e8707-9e37-4e38-b5f9-c4dc13ad9c6d" (UID: "4b3e8707-9e37-4e38-b5f9-c4dc13ad9c6d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:37:03 crc kubenswrapper[4854]: I1007 12:37:03.494738 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fec307e5-ef1c-4c44-8524-204f8d7ae452-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fec307e5-ef1c-4c44-8524-204f8d7ae452" (UID: "fec307e5-ef1c-4c44-8524-204f8d7ae452"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:37:03 crc kubenswrapper[4854]: I1007 12:37:03.508873 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zt4f6\" (UniqueName: \"kubernetes.io/projected/fec307e5-ef1c-4c44-8524-204f8d7ae452-kube-api-access-zt4f6\") on node \"crc\" DevicePath \"\"" Oct 07 12:37:03 crc kubenswrapper[4854]: I1007 12:37:03.508904 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b3e8707-9e37-4e38-b5f9-c4dc13ad9c6d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 12:37:03 crc kubenswrapper[4854]: I1007 12:37:03.508919 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b3e8707-9e37-4e38-b5f9-c4dc13ad9c6d-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 12:37:03 crc kubenswrapper[4854]: I1007 12:37:03.508931 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fec307e5-ef1c-4c44-8524-204f8d7ae452-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 12:37:03 crc kubenswrapper[4854]: I1007 12:37:03.508947 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fec307e5-ef1c-4c44-8524-204f8d7ae452-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 12:37:03 crc kubenswrapper[4854]: I1007 12:37:03.508961 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drrbk\" (UniqueName: \"kubernetes.io/projected/4b3e8707-9e37-4e38-b5f9-c4dc13ad9c6d-kube-api-access-drrbk\") on node \"crc\" DevicePath \"\"" Oct 07 12:37:03 crc kubenswrapper[4854]: I1007 12:37:03.565256 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-crq2z" event={"ID":"fec307e5-ef1c-4c44-8524-204f8d7ae452","Type":"ContainerDied","Data":"4a49cff7504cbe8129974a055d1dc424a624d0c046c00188a7494bacb38c36b8"} Oct 07 12:37:03 crc kubenswrapper[4854]: I1007 12:37:03.565352 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-crq2z" Oct 07 12:37:03 crc kubenswrapper[4854]: I1007 12:37:03.565808 4854 scope.go:117] "RemoveContainer" containerID="c8c17347e45ceb9006c38b6bdc630e2b564eeaeeeaf17ccf8e4e39a075168dbe" Oct 07 12:37:03 crc kubenswrapper[4854]: I1007 12:37:03.570379 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qw7pl" Oct 07 12:37:03 crc kubenswrapper[4854]: I1007 12:37:03.571400 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qw7pl" event={"ID":"4b3e8707-9e37-4e38-b5f9-c4dc13ad9c6d","Type":"ContainerDied","Data":"d2628fe5c4f2adc7b228207eb168c0b2c9c6830ab69eff4ecf47ec8019e341bc"} Oct 07 12:37:03 crc kubenswrapper[4854]: I1007 12:37:03.602662 4854 scope.go:117] "RemoveContainer" containerID="a46b73cff9c40e2ae8d0b44e751cfbc569f563c05d61b0542b4ff169b0364d50" Oct 07 12:37:03 crc kubenswrapper[4854]: I1007 12:37:03.608808 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qw7pl"] Oct 07 12:37:03 crc kubenswrapper[4854]: I1007 12:37:03.632900 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qw7pl"] Oct 07 12:37:03 crc kubenswrapper[4854]: I1007 12:37:03.640835 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-crq2z"] Oct 07 12:37:03 crc kubenswrapper[4854]: I1007 12:37:03.644686 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-crq2z"] Oct 07 12:37:03 crc kubenswrapper[4854]: I1007 12:37:03.648401 4854 scope.go:117] "RemoveContainer" containerID="4698f5e210942ce047254ace5c095355049de6ed433b1c96edffd290ce95ce95" Oct 07 12:37:03 crc kubenswrapper[4854]: I1007 12:37:03.670976 4854 scope.go:117] "RemoveContainer" containerID="3fa57c19b9af7019e6d833921236ec0d9f988cf46a6874ec28c6e6935d68a1ab" Oct 07 12:37:03 crc kubenswrapper[4854]: I1007 12:37:03.690903 4854 scope.go:117] "RemoveContainer" containerID="f176660dde63c5b01ceca2f877a4adc51c69e4112db2a4d298d89d79b091e378" Oct 07 12:37:03 crc kubenswrapper[4854]: I1007 12:37:03.714826 4854 scope.go:117] "RemoveContainer" containerID="a020a500bf32113a5c71ca4c8c34caaf69321b4ad6eb126fa41aa1abea68d157" Oct 07 12:37:03 crc kubenswrapper[4854]: I1007 12:37:03.792494 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x558d"] Oct 07 12:37:04 crc kubenswrapper[4854]: I1007 12:37:04.581998 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-x558d" podUID="5d461d69-005a-4ae7-954d-c39d833d8021" containerName="registry-server" containerID="cri-o://2e179bff55df8268c4a5717ccd5c6759abad8348c12a85e53c18a035e0867dd8" gracePeriod=2 Oct 07 12:37:04 crc kubenswrapper[4854]: I1007 12:37:04.711572 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b3e8707-9e37-4e38-b5f9-c4dc13ad9c6d" path="/var/lib/kubelet/pods/4b3e8707-9e37-4e38-b5f9-c4dc13ad9c6d/volumes" Oct 07 12:37:04 crc kubenswrapper[4854]: I1007 12:37:04.712293 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fec307e5-ef1c-4c44-8524-204f8d7ae452" path="/var/lib/kubelet/pods/fec307e5-ef1c-4c44-8524-204f8d7ae452/volumes" Oct 07 12:37:06 crc kubenswrapper[4854]: I1007 12:37:06.598324 4854 generic.go:334] "Generic (PLEG): container finished" 
podID="5d461d69-005a-4ae7-954d-c39d833d8021" containerID="2e179bff55df8268c4a5717ccd5c6759abad8348c12a85e53c18a035e0867dd8" exitCode=0 Oct 07 12:37:06 crc kubenswrapper[4854]: I1007 12:37:06.598420 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x558d" event={"ID":"5d461d69-005a-4ae7-954d-c39d833d8021","Type":"ContainerDied","Data":"2e179bff55df8268c4a5717ccd5c6759abad8348c12a85e53c18a035e0867dd8"} Oct 07 12:37:07 crc kubenswrapper[4854]: I1007 12:37:07.141336 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x558d" Oct 07 12:37:07 crc kubenswrapper[4854]: I1007 12:37:07.182264 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bkbf\" (UniqueName: \"kubernetes.io/projected/5d461d69-005a-4ae7-954d-c39d833d8021-kube-api-access-2bkbf\") pod \"5d461d69-005a-4ae7-954d-c39d833d8021\" (UID: \"5d461d69-005a-4ae7-954d-c39d833d8021\") " Oct 07 12:37:07 crc kubenswrapper[4854]: I1007 12:37:07.182343 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d461d69-005a-4ae7-954d-c39d833d8021-catalog-content\") pod \"5d461d69-005a-4ae7-954d-c39d833d8021\" (UID: \"5d461d69-005a-4ae7-954d-c39d833d8021\") " Oct 07 12:37:07 crc kubenswrapper[4854]: I1007 12:37:07.182420 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d461d69-005a-4ae7-954d-c39d833d8021-utilities\") pod \"5d461d69-005a-4ae7-954d-c39d833d8021\" (UID: \"5d461d69-005a-4ae7-954d-c39d833d8021\") " Oct 07 12:37:07 crc kubenswrapper[4854]: I1007 12:37:07.183652 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d461d69-005a-4ae7-954d-c39d833d8021-utilities" (OuterVolumeSpecName: "utilities") pod "5d461d69-005a-4ae7-954d-c39d833d8021" (UID: "5d461d69-005a-4ae7-954d-c39d833d8021"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:37:07 crc kubenswrapper[4854]: I1007 12:37:07.189730 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d461d69-005a-4ae7-954d-c39d833d8021-kube-api-access-2bkbf" (OuterVolumeSpecName: "kube-api-access-2bkbf") pod "5d461d69-005a-4ae7-954d-c39d833d8021" (UID: "5d461d69-005a-4ae7-954d-c39d833d8021"). InnerVolumeSpecName "kube-api-access-2bkbf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:37:07 crc kubenswrapper[4854]: I1007 12:37:07.206546 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d461d69-005a-4ae7-954d-c39d833d8021-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5d461d69-005a-4ae7-954d-c39d833d8021" (UID: "5d461d69-005a-4ae7-954d-c39d833d8021"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:37:07 crc kubenswrapper[4854]: I1007 12:37:07.284464 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d461d69-005a-4ae7-954d-c39d833d8021-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 12:37:07 crc kubenswrapper[4854]: I1007 12:37:07.284507 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bkbf\" (UniqueName: \"kubernetes.io/projected/5d461d69-005a-4ae7-954d-c39d833d8021-kube-api-access-2bkbf\") on node \"crc\" DevicePath \"\"" Oct 07 12:37:07 crc kubenswrapper[4854]: I1007 12:37:07.284520 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d461d69-005a-4ae7-954d-c39d833d8021-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 12:37:07 crc kubenswrapper[4854]: I1007 12:37:07.440474 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-9d94j" Oct 07 12:37:07 crc kubenswrapper[4854]: I1007 12:37:07.609521 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x558d" event={"ID":"5d461d69-005a-4ae7-954d-c39d833d8021","Type":"ContainerDied","Data":"17ea7e60e0f5d91818670d3fe1afdb7bc70c19ce1b64616d8bf88409a76d4b83"} Oct 07 12:37:07 crc kubenswrapper[4854]: I1007 12:37:07.609825 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x558d" Oct 07 12:37:07 crc kubenswrapper[4854]: I1007 12:37:07.609834 4854 scope.go:117] "RemoveContainer" containerID="2e179bff55df8268c4a5717ccd5c6759abad8348c12a85e53c18a035e0867dd8" Oct 07 12:37:07 crc kubenswrapper[4854]: I1007 12:37:07.636419 4854 scope.go:117] "RemoveContainer" containerID="a562bf9dafac0a4d801a1facbc15c2e9e3e5861bdda4e3492f8631fc2b6410d1" Oct 07 12:37:07 crc kubenswrapper[4854]: I1007 12:37:07.652671 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x558d"] Oct 07 12:37:07 crc kubenswrapper[4854]: I1007 12:37:07.652734 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-x558d"] Oct 07 12:37:07 crc kubenswrapper[4854]: I1007 12:37:07.678840 4854 scope.go:117] "RemoveContainer" containerID="7399078322f7b3a22f739eff01a5f033faed0d46c4e421e3704eff476f364d8a" Oct 07 12:37:08 crc kubenswrapper[4854]: I1007 12:37:08.295491 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-hrslp"] Oct 07 12:37:08 crc kubenswrapper[4854]: E1007 12:37:08.295834 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b3e8707-9e37-4e38-b5f9-c4dc13ad9c6d" containerName="extract-content" Oct 07 12:37:08 crc kubenswrapper[4854]: I1007 12:37:08.295854 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b3e8707-9e37-4e38-b5f9-c4dc13ad9c6d" containerName="extract-content" Oct 07 12:37:08 crc kubenswrapper[4854]: E1007 12:37:08.295870 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="718dcecb-56b4-49cc-8992-d8d4c6447602" containerName="pull" Oct 07 12:37:08 crc kubenswrapper[4854]: I1007 12:37:08.295879 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="718dcecb-56b4-49cc-8992-d8d4c6447602" containerName="pull" Oct 07 12:37:08 crc kubenswrapper[4854]: E1007 12:37:08.295894 4854 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="fec307e5-ef1c-4c44-8524-204f8d7ae452" containerName="extract-content" Oct 07 12:37:08 crc kubenswrapper[4854]: I1007 12:37:08.295902 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="fec307e5-ef1c-4c44-8524-204f8d7ae452" containerName="extract-content" Oct 07 12:37:08 crc kubenswrapper[4854]: E1007 12:37:08.295914 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d461d69-005a-4ae7-954d-c39d833d8021" containerName="extract-utilities" Oct 07 12:37:08 crc kubenswrapper[4854]: I1007 12:37:08.295922 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d461d69-005a-4ae7-954d-c39d833d8021" containerName="extract-utilities" Oct 07 12:37:08 crc kubenswrapper[4854]: E1007 12:37:08.295933 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="718dcecb-56b4-49cc-8992-d8d4c6447602" containerName="util" Oct 07 12:37:08 crc kubenswrapper[4854]: I1007 12:37:08.295942 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="718dcecb-56b4-49cc-8992-d8d4c6447602" containerName="util" Oct 07 12:37:08 crc kubenswrapper[4854]: E1007 12:37:08.295959 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d461d69-005a-4ae7-954d-c39d833d8021" containerName="extract-content" Oct 07 12:37:08 crc kubenswrapper[4854]: I1007 12:37:08.295968 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d461d69-005a-4ae7-954d-c39d833d8021" containerName="extract-content" Oct 07 12:37:08 crc kubenswrapper[4854]: E1007 12:37:08.295977 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b3e8707-9e37-4e38-b5f9-c4dc13ad9c6d" containerName="extract-utilities" Oct 07 12:37:08 crc kubenswrapper[4854]: I1007 12:37:08.295986 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b3e8707-9e37-4e38-b5f9-c4dc13ad9c6d" containerName="extract-utilities" Oct 07 12:37:08 crc kubenswrapper[4854]: E1007 12:37:08.295994 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d461d69-005a-4ae7-954d-c39d833d8021" containerName="registry-server" Oct 07 12:37:08 crc kubenswrapper[4854]: I1007 12:37:08.296001 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d461d69-005a-4ae7-954d-c39d833d8021" containerName="registry-server" Oct 07 12:37:08 crc kubenswrapper[4854]: E1007 12:37:08.296014 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fec307e5-ef1c-4c44-8524-204f8d7ae452" containerName="extract-utilities" Oct 07 12:37:08 crc kubenswrapper[4854]: I1007 12:37:08.296021 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="fec307e5-ef1c-4c44-8524-204f8d7ae452" containerName="extract-utilities" Oct 07 12:37:08 crc kubenswrapper[4854]: E1007 12:37:08.296031 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="718dcecb-56b4-49cc-8992-d8d4c6447602" containerName="extract" Oct 07 12:37:08 crc kubenswrapper[4854]: I1007 12:37:08.296038 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="718dcecb-56b4-49cc-8992-d8d4c6447602" containerName="extract" Oct 07 12:37:08 crc kubenswrapper[4854]: E1007 12:37:08.296055 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fec307e5-ef1c-4c44-8524-204f8d7ae452" containerName="registry-server" Oct 07 12:37:08 crc kubenswrapper[4854]: I1007 12:37:08.296061 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="fec307e5-ef1c-4c44-8524-204f8d7ae452" containerName="registry-server" Oct 07 12:37:08 crc kubenswrapper[4854]: E1007 12:37:08.296072 4854 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4b3e8707-9e37-4e38-b5f9-c4dc13ad9c6d" containerName="registry-server" Oct 07 12:37:08 crc kubenswrapper[4854]: I1007 12:37:08.296079 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b3e8707-9e37-4e38-b5f9-c4dc13ad9c6d" containerName="registry-server" Oct 07 12:37:08 crc kubenswrapper[4854]: I1007 12:37:08.296220 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b3e8707-9e37-4e38-b5f9-c4dc13ad9c6d" containerName="registry-server" Oct 07 12:37:08 crc kubenswrapper[4854]: I1007 12:37:08.296235 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d461d69-005a-4ae7-954d-c39d833d8021" containerName="registry-server" Oct 07 12:37:08 crc kubenswrapper[4854]: I1007 12:37:08.296246 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="718dcecb-56b4-49cc-8992-d8d4c6447602" containerName="extract" Oct 07 12:37:08 crc kubenswrapper[4854]: I1007 12:37:08.296261 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="fec307e5-ef1c-4c44-8524-204f8d7ae452" containerName="registry-server" Oct 07 12:37:08 crc kubenswrapper[4854]: I1007 12:37:08.296911 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-hrslp" Oct 07 12:37:08 crc kubenswrapper[4854]: I1007 12:37:08.298881 4854 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-wlqpm" Oct 07 12:37:08 crc kubenswrapper[4854]: I1007 12:37:08.299379 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Oct 07 12:37:08 crc kubenswrapper[4854]: I1007 12:37:08.300656 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Oct 07 12:37:08 crc kubenswrapper[4854]: I1007 12:37:08.320396 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-hrslp"] Oct 07 12:37:08 crc kubenswrapper[4854]: I1007 12:37:08.399396 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkttp\" (UniqueName: \"kubernetes.io/projected/7d5da3ab-9353-4ada-8ca8-593d32bc9223-kube-api-access-nkttp\") pod \"cert-manager-operator-controller-manager-57cd46d6d-hrslp\" (UID: \"7d5da3ab-9353-4ada-8ca8-593d32bc9223\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-hrslp" Oct 07 12:37:08 crc kubenswrapper[4854]: I1007 12:37:08.500490 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkttp\" (UniqueName: \"kubernetes.io/projected/7d5da3ab-9353-4ada-8ca8-593d32bc9223-kube-api-access-nkttp\") pod \"cert-manager-operator-controller-manager-57cd46d6d-hrslp\" (UID: \"7d5da3ab-9353-4ada-8ca8-593d32bc9223\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-hrslp" Oct 07 12:37:08 crc kubenswrapper[4854]: I1007 12:37:08.541402 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkttp\" (UniqueName: \"kubernetes.io/projected/7d5da3ab-9353-4ada-8ca8-593d32bc9223-kube-api-access-nkttp\") pod \"cert-manager-operator-controller-manager-57cd46d6d-hrslp\" (UID: \"7d5da3ab-9353-4ada-8ca8-593d32bc9223\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-hrslp" Oct 07 12:37:08 crc kubenswrapper[4854]: I1007 
12:37:08.611568 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-hrslp" Oct 07 12:37:08 crc kubenswrapper[4854]: I1007 12:37:08.721520 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d461d69-005a-4ae7-954d-c39d833d8021" path="/var/lib/kubelet/pods/5d461d69-005a-4ae7-954d-c39d833d8021/volumes" Oct 07 12:37:09 crc kubenswrapper[4854]: I1007 12:37:09.068414 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-hrslp"] Oct 07 12:37:09 crc kubenswrapper[4854]: W1007 12:37:09.072129 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d5da3ab_9353_4ada_8ca8_593d32bc9223.slice/crio-0029363b3f3b03f9cbcc8c99888a2ac786e8abdb031052feadde4a8ce606e990 WatchSource:0}: Error finding container 0029363b3f3b03f9cbcc8c99888a2ac786e8abdb031052feadde4a8ce606e990: Status 404 returned error can't find the container with id 0029363b3f3b03f9cbcc8c99888a2ac786e8abdb031052feadde4a8ce606e990 Oct 07 12:37:09 crc kubenswrapper[4854]: I1007 12:37:09.628989 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-hrslp" event={"ID":"7d5da3ab-9353-4ada-8ca8-593d32bc9223","Type":"ContainerStarted","Data":"0029363b3f3b03f9cbcc8c99888a2ac786e8abdb031052feadde4a8ce606e990"} Oct 07 12:37:18 crc kubenswrapper[4854]: I1007 12:37:18.742706 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-hrslp" event={"ID":"7d5da3ab-9353-4ada-8ca8-593d32bc9223","Type":"ContainerStarted","Data":"984b2c3fd8d0c5db8a274e8c1bfba13bddfe35d263ddb2a807bdb109b80ff353"} Oct 07 12:37:18 crc kubenswrapper[4854]: I1007 12:37:18.777482 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-hrslp" podStartSLOduration=1.638148256 podStartE2EDuration="10.777453542s" podCreationTimestamp="2025-10-07 12:37:08 +0000 UTC" firstStartedPulling="2025-10-07 12:37:09.075620434 +0000 UTC m=+745.063452689" lastFinishedPulling="2025-10-07 12:37:18.21492572 +0000 UTC m=+754.202757975" observedRunningTime="2025-10-07 12:37:18.773742171 +0000 UTC m=+754.761574436" watchObservedRunningTime="2025-10-07 12:37:18.777453542 +0000 UTC m=+754.765285797" Oct 07 12:37:23 crc kubenswrapper[4854]: I1007 12:37:23.450557 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-d969966f-qgclp"] Oct 07 12:37:23 crc kubenswrapper[4854]: I1007 12:37:23.452341 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-d969966f-qgclp" Oct 07 12:37:23 crc kubenswrapper[4854]: I1007 12:37:23.454968 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Oct 07 12:37:23 crc kubenswrapper[4854]: I1007 12:37:23.455391 4854 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-jwvgv" Oct 07 12:37:23 crc kubenswrapper[4854]: I1007 12:37:23.461779 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-d969966f-qgclp"] Oct 07 12:37:23 crc kubenswrapper[4854]: I1007 12:37:23.462627 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Oct 07 12:37:23 crc kubenswrapper[4854]: I1007 12:37:23.546967 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/33fd4b49-dcb7-4ebd-a5ca-15ba83c90e3a-bound-sa-token\") pod \"cert-manager-webhook-d969966f-qgclp\" (UID: \"33fd4b49-dcb7-4ebd-a5ca-15ba83c90e3a\") " pod="cert-manager/cert-manager-webhook-d969966f-qgclp" Oct 07 12:37:23 crc kubenswrapper[4854]: I1007 12:37:23.547090 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn9gf\" (UniqueName: \"kubernetes.io/projected/33fd4b49-dcb7-4ebd-a5ca-15ba83c90e3a-kube-api-access-bn9gf\") pod \"cert-manager-webhook-d969966f-qgclp\" (UID: \"33fd4b49-dcb7-4ebd-a5ca-15ba83c90e3a\") " pod="cert-manager/cert-manager-webhook-d969966f-qgclp" Oct 07 12:37:23 crc kubenswrapper[4854]: I1007 12:37:23.647783 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bn9gf\" (UniqueName: \"kubernetes.io/projected/33fd4b49-dcb7-4ebd-a5ca-15ba83c90e3a-kube-api-access-bn9gf\") pod \"cert-manager-webhook-d969966f-qgclp\" (UID: \"33fd4b49-dcb7-4ebd-a5ca-15ba83c90e3a\") " pod="cert-manager/cert-manager-webhook-d969966f-qgclp" Oct 07 12:37:23 crc kubenswrapper[4854]: I1007 12:37:23.647833 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/33fd4b49-dcb7-4ebd-a5ca-15ba83c90e3a-bound-sa-token\") pod \"cert-manager-webhook-d969966f-qgclp\" (UID: \"33fd4b49-dcb7-4ebd-a5ca-15ba83c90e3a\") " pod="cert-manager/cert-manager-webhook-d969966f-qgclp" Oct 07 12:37:23 crc kubenswrapper[4854]: I1007 12:37:23.670293 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/33fd4b49-dcb7-4ebd-a5ca-15ba83c90e3a-bound-sa-token\") pod \"cert-manager-webhook-d969966f-qgclp\" (UID: \"33fd4b49-dcb7-4ebd-a5ca-15ba83c90e3a\") " pod="cert-manager/cert-manager-webhook-d969966f-qgclp" Oct 07 12:37:23 crc kubenswrapper[4854]: I1007 12:37:23.680197 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bn9gf\" (UniqueName: \"kubernetes.io/projected/33fd4b49-dcb7-4ebd-a5ca-15ba83c90e3a-kube-api-access-bn9gf\") pod \"cert-manager-webhook-d969966f-qgclp\" (UID: \"33fd4b49-dcb7-4ebd-a5ca-15ba83c90e3a\") " pod="cert-manager/cert-manager-webhook-d969966f-qgclp" Oct 07 12:37:23 crc kubenswrapper[4854]: I1007 12:37:23.769346 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-d969966f-qgclp" Oct 07 12:37:24 crc kubenswrapper[4854]: I1007 12:37:24.215600 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-d969966f-qgclp"] Oct 07 12:37:24 crc kubenswrapper[4854]: I1007 12:37:24.228821 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7d9f95dbf-8gdkc"] Oct 07 12:37:24 crc kubenswrapper[4854]: I1007 12:37:24.230296 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-8gdkc" Oct 07 12:37:24 crc kubenswrapper[4854]: I1007 12:37:24.237526 4854 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-vlc7q" Oct 07 12:37:24 crc kubenswrapper[4854]: I1007 12:37:24.242732 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7d9f95dbf-8gdkc"] Oct 07 12:37:24 crc kubenswrapper[4854]: I1007 12:37:24.359279 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f476n\" (UniqueName: \"kubernetes.io/projected/17874387-19ac-41f2-b359-1407fb8f09dc-kube-api-access-f476n\") pod \"cert-manager-cainjector-7d9f95dbf-8gdkc\" (UID: \"17874387-19ac-41f2-b359-1407fb8f09dc\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-8gdkc" Oct 07 12:37:24 crc kubenswrapper[4854]: I1007 12:37:24.359464 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/17874387-19ac-41f2-b359-1407fb8f09dc-bound-sa-token\") pod \"cert-manager-cainjector-7d9f95dbf-8gdkc\" (UID: \"17874387-19ac-41f2-b359-1407fb8f09dc\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-8gdkc" Oct 07 12:37:24 crc kubenswrapper[4854]: I1007 12:37:24.461698 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f476n\" (UniqueName: \"kubernetes.io/projected/17874387-19ac-41f2-b359-1407fb8f09dc-kube-api-access-f476n\") pod \"cert-manager-cainjector-7d9f95dbf-8gdkc\" (UID: \"17874387-19ac-41f2-b359-1407fb8f09dc\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-8gdkc" Oct 07 12:37:24 crc kubenswrapper[4854]: I1007 12:37:24.461833 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/17874387-19ac-41f2-b359-1407fb8f09dc-bound-sa-token\") pod \"cert-manager-cainjector-7d9f95dbf-8gdkc\" (UID: \"17874387-19ac-41f2-b359-1407fb8f09dc\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-8gdkc" Oct 07 12:37:24 crc kubenswrapper[4854]: I1007 12:37:24.483105 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f476n\" (UniqueName: \"kubernetes.io/projected/17874387-19ac-41f2-b359-1407fb8f09dc-kube-api-access-f476n\") pod \"cert-manager-cainjector-7d9f95dbf-8gdkc\" (UID: \"17874387-19ac-41f2-b359-1407fb8f09dc\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-8gdkc" Oct 07 12:37:24 crc kubenswrapper[4854]: I1007 12:37:24.483203 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/17874387-19ac-41f2-b359-1407fb8f09dc-bound-sa-token\") pod \"cert-manager-cainjector-7d9f95dbf-8gdkc\" (UID: \"17874387-19ac-41f2-b359-1407fb8f09dc\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-8gdkc" Oct 07 
12:37:24 crc kubenswrapper[4854]: I1007 12:37:24.587333 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-8gdkc" Oct 07 12:37:24 crc kubenswrapper[4854]: I1007 12:37:24.802497 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-d969966f-qgclp" event={"ID":"33fd4b49-dcb7-4ebd-a5ca-15ba83c90e3a","Type":"ContainerStarted","Data":"60eac38f1c756db3277db85ca6465912b316b7831123c19758371de2d205a25f"} Oct 07 12:37:25 crc kubenswrapper[4854]: I1007 12:37:25.068010 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7d9f95dbf-8gdkc"] Oct 07 12:37:25 crc kubenswrapper[4854]: I1007 12:37:25.812282 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-8gdkc" event={"ID":"17874387-19ac-41f2-b359-1407fb8f09dc","Type":"ContainerStarted","Data":"886ebfc56152d247e0e45860680aa1f797160e0cac69d27ee18d3c243998d75c"} Oct 07 12:37:29 crc kubenswrapper[4854]: I1007 12:37:29.863559 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-8gdkc" event={"ID":"17874387-19ac-41f2-b359-1407fb8f09dc","Type":"ContainerStarted","Data":"5308a105e336e4e93c295851611d6f82801edd934390fcfdeb77370bb7ca551a"} Oct 07 12:37:29 crc kubenswrapper[4854]: I1007 12:37:29.866433 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-d969966f-qgclp" event={"ID":"33fd4b49-dcb7-4ebd-a5ca-15ba83c90e3a","Type":"ContainerStarted","Data":"547305a97ad5fd9a0e076ab9697059967d815d451dbb0dac61e7461eb7e7e1c6"} Oct 07 12:37:29 crc kubenswrapper[4854]: I1007 12:37:29.866709 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-d969966f-qgclp" Oct 07 12:37:29 crc kubenswrapper[4854]: I1007 12:37:29.888293 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-8gdkc" podStartSLOduration=2.217976501 podStartE2EDuration="5.888266547s" podCreationTimestamp="2025-10-07 12:37:24 +0000 UTC" firstStartedPulling="2025-10-07 12:37:25.078685035 +0000 UTC m=+761.066517310" lastFinishedPulling="2025-10-07 12:37:28.748975101 +0000 UTC m=+764.736807356" observedRunningTime="2025-10-07 12:37:29.88331316 +0000 UTC m=+765.871145415" watchObservedRunningTime="2025-10-07 12:37:29.888266547 +0000 UTC m=+765.876098822" Oct 07 12:37:29 crc kubenswrapper[4854]: I1007 12:37:29.908885 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-d969966f-qgclp" podStartSLOduration=2.413597676 podStartE2EDuration="6.908859869s" podCreationTimestamp="2025-10-07 12:37:23 +0000 UTC" firstStartedPulling="2025-10-07 12:37:24.230995463 +0000 UTC m=+760.218827738" lastFinishedPulling="2025-10-07 12:37:28.726257676 +0000 UTC m=+764.714089931" observedRunningTime="2025-10-07 12:37:29.908764996 +0000 UTC m=+765.896597261" watchObservedRunningTime="2025-10-07 12:37:29.908859869 +0000 UTC m=+765.896692124" Oct 07 12:37:32 crc kubenswrapper[4854]: I1007 12:37:32.911510 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-7d4cc89fcb-fvqd8"] Oct 07 12:37:32 crc kubenswrapper[4854]: I1007 12:37:32.913011 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-7d4cc89fcb-fvqd8" Oct 07 12:37:32 crc kubenswrapper[4854]: I1007 12:37:32.919250 4854 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-ks97r" Oct 07 12:37:32 crc kubenswrapper[4854]: I1007 12:37:32.923042 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-7d4cc89fcb-fvqd8"] Oct 07 12:37:32 crc kubenswrapper[4854]: I1007 12:37:32.998374 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkl7l\" (UniqueName: \"kubernetes.io/projected/3d3237da-c856-4707-956e-2a25e381506e-kube-api-access-xkl7l\") pod \"cert-manager-7d4cc89fcb-fvqd8\" (UID: \"3d3237da-c856-4707-956e-2a25e381506e\") " pod="cert-manager/cert-manager-7d4cc89fcb-fvqd8" Oct 07 12:37:32 crc kubenswrapper[4854]: I1007 12:37:32.998441 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3d3237da-c856-4707-956e-2a25e381506e-bound-sa-token\") pod \"cert-manager-7d4cc89fcb-fvqd8\" (UID: \"3d3237da-c856-4707-956e-2a25e381506e\") " pod="cert-manager/cert-manager-7d4cc89fcb-fvqd8" Oct 07 12:37:33 crc kubenswrapper[4854]: I1007 12:37:33.099988 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkl7l\" (UniqueName: \"kubernetes.io/projected/3d3237da-c856-4707-956e-2a25e381506e-kube-api-access-xkl7l\") pod \"cert-manager-7d4cc89fcb-fvqd8\" (UID: \"3d3237da-c856-4707-956e-2a25e381506e\") " pod="cert-manager/cert-manager-7d4cc89fcb-fvqd8" Oct 07 12:37:33 crc kubenswrapper[4854]: I1007 12:37:33.100342 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3d3237da-c856-4707-956e-2a25e381506e-bound-sa-token\") pod \"cert-manager-7d4cc89fcb-fvqd8\" (UID: \"3d3237da-c856-4707-956e-2a25e381506e\") " pod="cert-manager/cert-manager-7d4cc89fcb-fvqd8" Oct 07 12:37:33 crc kubenswrapper[4854]: I1007 12:37:33.129594 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3d3237da-c856-4707-956e-2a25e381506e-bound-sa-token\") pod \"cert-manager-7d4cc89fcb-fvqd8\" (UID: \"3d3237da-c856-4707-956e-2a25e381506e\") " pod="cert-manager/cert-manager-7d4cc89fcb-fvqd8" Oct 07 12:37:33 crc kubenswrapper[4854]: I1007 12:37:33.131889 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkl7l\" (UniqueName: \"kubernetes.io/projected/3d3237da-c856-4707-956e-2a25e381506e-kube-api-access-xkl7l\") pod \"cert-manager-7d4cc89fcb-fvqd8\" (UID: \"3d3237da-c856-4707-956e-2a25e381506e\") " pod="cert-manager/cert-manager-7d4cc89fcb-fvqd8" Oct 07 12:37:33 crc kubenswrapper[4854]: I1007 12:37:33.241193 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-7d4cc89fcb-fvqd8" Oct 07 12:37:33 crc kubenswrapper[4854]: I1007 12:37:33.721477 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-7d4cc89fcb-fvqd8"] Oct 07 12:37:33 crc kubenswrapper[4854]: I1007 12:37:33.772341 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-d969966f-qgclp" Oct 07 12:37:33 crc kubenswrapper[4854]: I1007 12:37:33.901483 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-7d4cc89fcb-fvqd8" event={"ID":"3d3237da-c856-4707-956e-2a25e381506e","Type":"ContainerStarted","Data":"1d8e671a57aa9da870433fb770a1d57d1b3dfa6ee217a64940054c54d07b055c"} Oct 07 12:37:34 crc kubenswrapper[4854]: I1007 12:37:34.910797 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-7d4cc89fcb-fvqd8" event={"ID":"3d3237da-c856-4707-956e-2a25e381506e","Type":"ContainerStarted","Data":"4200674011c54cee54a3c85d9a70625c9b1e6c135e511e62119d547263e03a69"} Oct 07 12:37:34 crc kubenswrapper[4854]: I1007 12:37:34.934944 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-7d4cc89fcb-fvqd8" podStartSLOduration=2.93490757 podStartE2EDuration="2.93490757s" podCreationTimestamp="2025-10-07 12:37:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:37:34.928440638 +0000 UTC m=+770.916272903" watchObservedRunningTime="2025-10-07 12:37:34.93490757 +0000 UTC m=+770.922739835" Oct 07 12:37:38 crc kubenswrapper[4854]: I1007 12:37:38.163727 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-frqgk"] Oct 07 12:37:38 crc kubenswrapper[4854]: I1007 12:37:38.165066 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-frqgk" Oct 07 12:37:38 crc kubenswrapper[4854]: I1007 12:37:38.167029 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Oct 07 12:37:38 crc kubenswrapper[4854]: I1007 12:37:38.168970 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-d6lsr" Oct 07 12:37:38 crc kubenswrapper[4854]: I1007 12:37:38.177838 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-frqgk"] Oct 07 12:37:38 crc kubenswrapper[4854]: I1007 12:37:38.178350 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Oct 07 12:37:38 crc kubenswrapper[4854]: I1007 12:37:38.297077 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jxzx\" (UniqueName: \"kubernetes.io/projected/c3539a6f-6e3c-4c11-8fe3-6eba89768ae0-kube-api-access-8jxzx\") pod \"openstack-operator-index-frqgk\" (UID: \"c3539a6f-6e3c-4c11-8fe3-6eba89768ae0\") " pod="openstack-operators/openstack-operator-index-frqgk" Oct 07 12:37:38 crc kubenswrapper[4854]: I1007 12:37:38.399275 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jxzx\" (UniqueName: \"kubernetes.io/projected/c3539a6f-6e3c-4c11-8fe3-6eba89768ae0-kube-api-access-8jxzx\") pod \"openstack-operator-index-frqgk\" (UID: \"c3539a6f-6e3c-4c11-8fe3-6eba89768ae0\") " pod="openstack-operators/openstack-operator-index-frqgk" Oct 07 12:37:38 crc kubenswrapper[4854]: I1007 12:37:38.428643 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jxzx\" (UniqueName: \"kubernetes.io/projected/c3539a6f-6e3c-4c11-8fe3-6eba89768ae0-kube-api-access-8jxzx\") pod \"openstack-operator-index-frqgk\" (UID: \"c3539a6f-6e3c-4c11-8fe3-6eba89768ae0\") " pod="openstack-operators/openstack-operator-index-frqgk" Oct 07 12:37:38 crc kubenswrapper[4854]: I1007 12:37:38.496941 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-frqgk" Oct 07 12:37:38 crc kubenswrapper[4854]: I1007 12:37:38.738345 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-frqgk"] Oct 07 12:37:38 crc kubenswrapper[4854]: W1007 12:37:38.754403 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3539a6f_6e3c_4c11_8fe3_6eba89768ae0.slice/crio-f9b17fc0fdd00a9de35a3fc1e5feed58a7e7892f7a285caebbf3080bfa43064b WatchSource:0}: Error finding container f9b17fc0fdd00a9de35a3fc1e5feed58a7e7892f7a285caebbf3080bfa43064b: Status 404 returned error can't find the container with id f9b17fc0fdd00a9de35a3fc1e5feed58a7e7892f7a285caebbf3080bfa43064b Oct 07 12:37:38 crc kubenswrapper[4854]: I1007 12:37:38.937748 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-frqgk" event={"ID":"c3539a6f-6e3c-4c11-8fe3-6eba89768ae0","Type":"ContainerStarted","Data":"f9b17fc0fdd00a9de35a3fc1e5feed58a7e7892f7a285caebbf3080bfa43064b"} Oct 07 12:37:40 crc kubenswrapper[4854]: I1007 12:37:40.807892 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 12:37:40 crc kubenswrapper[4854]: I1007 12:37:40.808356 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 12:37:41 crc kubenswrapper[4854]: I1007 12:37:41.542958 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-frqgk"] Oct 07 12:37:42 crc kubenswrapper[4854]: I1007 12:37:42.155003 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-tlx58"] Oct 07 12:37:42 crc kubenswrapper[4854]: I1007 12:37:42.156173 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-tlx58" Oct 07 12:37:42 crc kubenswrapper[4854]: I1007 12:37:42.166635 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-tlx58"] Oct 07 12:37:42 crc kubenswrapper[4854]: I1007 12:37:42.271383 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-764wc\" (UniqueName: \"kubernetes.io/projected/b71069c1-735b-4896-94f3-82913ac9dbf0-kube-api-access-764wc\") pod \"openstack-operator-index-tlx58\" (UID: \"b71069c1-735b-4896-94f3-82913ac9dbf0\") " pod="openstack-operators/openstack-operator-index-tlx58" Oct 07 12:37:42 crc kubenswrapper[4854]: I1007 12:37:42.373907 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-764wc\" (UniqueName: \"kubernetes.io/projected/b71069c1-735b-4896-94f3-82913ac9dbf0-kube-api-access-764wc\") pod \"openstack-operator-index-tlx58\" (UID: \"b71069c1-735b-4896-94f3-82913ac9dbf0\") " pod="openstack-operators/openstack-operator-index-tlx58" Oct 07 12:37:42 crc kubenswrapper[4854]: I1007 12:37:42.401260 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-764wc\" (UniqueName: \"kubernetes.io/projected/b71069c1-735b-4896-94f3-82913ac9dbf0-kube-api-access-764wc\") pod \"openstack-operator-index-tlx58\" (UID: \"b71069c1-735b-4896-94f3-82913ac9dbf0\") " pod="openstack-operators/openstack-operator-index-tlx58" Oct 07 12:37:42 crc kubenswrapper[4854]: I1007 12:37:42.492822 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-tlx58" Oct 07 12:37:42 crc kubenswrapper[4854]: I1007 12:37:42.712636 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-tlx58"] Oct 07 12:37:42 crc kubenswrapper[4854]: W1007 12:37:42.735042 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb71069c1_735b_4896_94f3_82913ac9dbf0.slice/crio-be1c1ea4d3feb1d2795de13c5c82c65aed4cc5a17df41c18aa1f15dabb149291 WatchSource:0}: Error finding container be1c1ea4d3feb1d2795de13c5c82c65aed4cc5a17df41c18aa1f15dabb149291: Status 404 returned error can't find the container with id be1c1ea4d3feb1d2795de13c5c82c65aed4cc5a17df41c18aa1f15dabb149291 Oct 07 12:37:42 crc kubenswrapper[4854]: I1007 12:37:42.966379 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-tlx58" event={"ID":"b71069c1-735b-4896-94f3-82913ac9dbf0","Type":"ContainerStarted","Data":"31dda12edcaf161b0d66311afb6b4fb7ccc9476b6e78ac4c887d691aa18cf1d9"} Oct 07 12:37:42 crc kubenswrapper[4854]: I1007 12:37:42.966820 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-tlx58" event={"ID":"b71069c1-735b-4896-94f3-82913ac9dbf0","Type":"ContainerStarted","Data":"be1c1ea4d3feb1d2795de13c5c82c65aed4cc5a17df41c18aa1f15dabb149291"} Oct 07 12:37:42 crc kubenswrapper[4854]: I1007 12:37:42.968992 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-frqgk" event={"ID":"c3539a6f-6e3c-4c11-8fe3-6eba89768ae0","Type":"ContainerStarted","Data":"0c9f0f5cd08d03d6eac2ef96bbf3709f817c69fe2eeb41b8a85744942de05cec"} Oct 07 12:37:42 crc kubenswrapper[4854]: I1007 12:37:42.969186 4854 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack-operators/openstack-operator-index-frqgk" podUID="c3539a6f-6e3c-4c11-8fe3-6eba89768ae0" containerName="registry-server" containerID="cri-o://0c9f0f5cd08d03d6eac2ef96bbf3709f817c69fe2eeb41b8a85744942de05cec" gracePeriod=2 Oct 07 12:37:43 crc kubenswrapper[4854]: I1007 12:37:43.004082 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-tlx58" podStartSLOduration=0.940422153 podStartE2EDuration="1.004058674s" podCreationTimestamp="2025-10-07 12:37:42 +0000 UTC" firstStartedPulling="2025-10-07 12:37:42.738065442 +0000 UTC m=+778.725897697" lastFinishedPulling="2025-10-07 12:37:42.801701963 +0000 UTC m=+778.789534218" observedRunningTime="2025-10-07 12:37:42.997592052 +0000 UTC m=+778.985424327" watchObservedRunningTime="2025-10-07 12:37:43.004058674 +0000 UTC m=+778.991890949" Oct 07 12:37:43 crc kubenswrapper[4854]: I1007 12:37:43.021941 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-frqgk" podStartSLOduration=1.552659322 podStartE2EDuration="5.021914075s" podCreationTimestamp="2025-10-07 12:37:38 +0000 UTC" firstStartedPulling="2025-10-07 12:37:38.755972074 +0000 UTC m=+774.743804329" lastFinishedPulling="2025-10-07 12:37:42.225226827 +0000 UTC m=+778.213059082" observedRunningTime="2025-10-07 12:37:43.020889675 +0000 UTC m=+779.008721930" watchObservedRunningTime="2025-10-07 12:37:43.021914075 +0000 UTC m=+779.009746330" Oct 07 12:37:43 crc kubenswrapper[4854]: I1007 12:37:43.261797 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-frqgk" Oct 07 12:37:43 crc kubenswrapper[4854]: I1007 12:37:43.302444 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jxzx\" (UniqueName: \"kubernetes.io/projected/c3539a6f-6e3c-4c11-8fe3-6eba89768ae0-kube-api-access-8jxzx\") pod \"c3539a6f-6e3c-4c11-8fe3-6eba89768ae0\" (UID: \"c3539a6f-6e3c-4c11-8fe3-6eba89768ae0\") " Oct 07 12:37:43 crc kubenswrapper[4854]: I1007 12:37:43.308076 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3539a6f-6e3c-4c11-8fe3-6eba89768ae0-kube-api-access-8jxzx" (OuterVolumeSpecName: "kube-api-access-8jxzx") pod "c3539a6f-6e3c-4c11-8fe3-6eba89768ae0" (UID: "c3539a6f-6e3c-4c11-8fe3-6eba89768ae0"). InnerVolumeSpecName "kube-api-access-8jxzx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:37:43 crc kubenswrapper[4854]: I1007 12:37:43.403782 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jxzx\" (UniqueName: \"kubernetes.io/projected/c3539a6f-6e3c-4c11-8fe3-6eba89768ae0-kube-api-access-8jxzx\") on node \"crc\" DevicePath \"\"" Oct 07 12:37:43 crc kubenswrapper[4854]: I1007 12:37:43.980919 4854 generic.go:334] "Generic (PLEG): container finished" podID="c3539a6f-6e3c-4c11-8fe3-6eba89768ae0" containerID="0c9f0f5cd08d03d6eac2ef96bbf3709f817c69fe2eeb41b8a85744942de05cec" exitCode=0 Oct 07 12:37:43 crc kubenswrapper[4854]: I1007 12:37:43.981034 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-frqgk" event={"ID":"c3539a6f-6e3c-4c11-8fe3-6eba89768ae0","Type":"ContainerDied","Data":"0c9f0f5cd08d03d6eac2ef96bbf3709f817c69fe2eeb41b8a85744942de05cec"} Oct 07 12:37:43 crc kubenswrapper[4854]: I1007 12:37:43.981067 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-frqgk" Oct 07 12:37:43 crc kubenswrapper[4854]: I1007 12:37:43.981085 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-frqgk" event={"ID":"c3539a6f-6e3c-4c11-8fe3-6eba89768ae0","Type":"ContainerDied","Data":"f9b17fc0fdd00a9de35a3fc1e5feed58a7e7892f7a285caebbf3080bfa43064b"} Oct 07 12:37:43 crc kubenswrapper[4854]: I1007 12:37:43.981102 4854 scope.go:117] "RemoveContainer" containerID="0c9f0f5cd08d03d6eac2ef96bbf3709f817c69fe2eeb41b8a85744942de05cec" Oct 07 12:37:44 crc kubenswrapper[4854]: I1007 12:37:44.018067 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-frqgk"] Oct 07 12:37:44 crc kubenswrapper[4854]: I1007 12:37:44.018106 4854 scope.go:117] "RemoveContainer" containerID="0c9f0f5cd08d03d6eac2ef96bbf3709f817c69fe2eeb41b8a85744942de05cec" Oct 07 12:37:44 crc kubenswrapper[4854]: E1007 12:37:44.018916 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c9f0f5cd08d03d6eac2ef96bbf3709f817c69fe2eeb41b8a85744942de05cec\": container with ID starting with 0c9f0f5cd08d03d6eac2ef96bbf3709f817c69fe2eeb41b8a85744942de05cec not found: ID does not exist" containerID="0c9f0f5cd08d03d6eac2ef96bbf3709f817c69fe2eeb41b8a85744942de05cec" Oct 07 12:37:44 crc kubenswrapper[4854]: I1007 12:37:44.018977 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c9f0f5cd08d03d6eac2ef96bbf3709f817c69fe2eeb41b8a85744942de05cec"} err="failed to get container status \"0c9f0f5cd08d03d6eac2ef96bbf3709f817c69fe2eeb41b8a85744942de05cec\": rpc error: code = NotFound desc = could not find container \"0c9f0f5cd08d03d6eac2ef96bbf3709f817c69fe2eeb41b8a85744942de05cec\": container with ID starting with 0c9f0f5cd08d03d6eac2ef96bbf3709f817c69fe2eeb41b8a85744942de05cec not found: ID does not exist" Oct 07 12:37:44 crc kubenswrapper[4854]: I1007 12:37:44.023832 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-frqgk"] Oct 07 12:37:44 crc kubenswrapper[4854]: I1007 12:37:44.710112 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3539a6f-6e3c-4c11-8fe3-6eba89768ae0" path="/var/lib/kubelet/pods/c3539a6f-6e3c-4c11-8fe3-6eba89768ae0/volumes" Oct 07 12:37:52 crc kubenswrapper[4854]: I1007 12:37:52.493544 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-tlx58" Oct 07 12:37:52 crc kubenswrapper[4854]: I1007 12:37:52.494345 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-tlx58" Oct 07 12:37:52 crc kubenswrapper[4854]: I1007 12:37:52.542741 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-tlx58" Oct 07 12:37:53 crc kubenswrapper[4854]: I1007 12:37:53.084183 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-tlx58" Oct 07 12:38:01 crc kubenswrapper[4854]: I1007 12:38:01.015813 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/03190830a20b4f4d382164e60d93db76834db5b34babf3517d759ffbdb8qswb"] Oct 07 12:38:01 crc kubenswrapper[4854]: E1007 12:38:01.017100 4854 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c3539a6f-6e3c-4c11-8fe3-6eba89768ae0" containerName="registry-server" Oct 07 12:38:01 crc kubenswrapper[4854]: I1007 12:38:01.017123 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3539a6f-6e3c-4c11-8fe3-6eba89768ae0" containerName="registry-server" Oct 07 12:38:01 crc kubenswrapper[4854]: I1007 12:38:01.017285 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3539a6f-6e3c-4c11-8fe3-6eba89768ae0" containerName="registry-server" Oct 07 12:38:01 crc kubenswrapper[4854]: I1007 12:38:01.018490 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/03190830a20b4f4d382164e60d93db76834db5b34babf3517d759ffbdb8qswb" Oct 07 12:38:01 crc kubenswrapper[4854]: I1007 12:38:01.021571 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-tp4lb" Oct 07 12:38:01 crc kubenswrapper[4854]: I1007 12:38:01.034900 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/03190830a20b4f4d382164e60d93db76834db5b34babf3517d759ffbdb8qswb"] Oct 07 12:38:01 crc kubenswrapper[4854]: I1007 12:38:01.214796 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0ba19497-e9e9-4587-b726-cadfb140df77-util\") pod \"03190830a20b4f4d382164e60d93db76834db5b34babf3517d759ffbdb8qswb\" (UID: \"0ba19497-e9e9-4587-b726-cadfb140df77\") " pod="openstack-operators/03190830a20b4f4d382164e60d93db76834db5b34babf3517d759ffbdb8qswb" Oct 07 12:38:01 crc kubenswrapper[4854]: I1007 12:38:01.214885 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0ba19497-e9e9-4587-b726-cadfb140df77-bundle\") pod \"03190830a20b4f4d382164e60d93db76834db5b34babf3517d759ffbdb8qswb\" (UID: \"0ba19497-e9e9-4587-b726-cadfb140df77\") " pod="openstack-operators/03190830a20b4f4d382164e60d93db76834db5b34babf3517d759ffbdb8qswb" Oct 07 12:38:01 crc kubenswrapper[4854]: I1007 12:38:01.215475 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hj8sj\" (UniqueName: \"kubernetes.io/projected/0ba19497-e9e9-4587-b726-cadfb140df77-kube-api-access-hj8sj\") pod \"03190830a20b4f4d382164e60d93db76834db5b34babf3517d759ffbdb8qswb\" (UID: \"0ba19497-e9e9-4587-b726-cadfb140df77\") " pod="openstack-operators/03190830a20b4f4d382164e60d93db76834db5b34babf3517d759ffbdb8qswb" Oct 07 12:38:01 crc kubenswrapper[4854]: I1007 12:38:01.317413 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0ba19497-e9e9-4587-b726-cadfb140df77-util\") pod \"03190830a20b4f4d382164e60d93db76834db5b34babf3517d759ffbdb8qswb\" (UID: \"0ba19497-e9e9-4587-b726-cadfb140df77\") " pod="openstack-operators/03190830a20b4f4d382164e60d93db76834db5b34babf3517d759ffbdb8qswb" Oct 07 12:38:01 crc kubenswrapper[4854]: I1007 12:38:01.317488 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0ba19497-e9e9-4587-b726-cadfb140df77-bundle\") pod \"03190830a20b4f4d382164e60d93db76834db5b34babf3517d759ffbdb8qswb\" (UID: \"0ba19497-e9e9-4587-b726-cadfb140df77\") " pod="openstack-operators/03190830a20b4f4d382164e60d93db76834db5b34babf3517d759ffbdb8qswb" Oct 07 12:38:01 crc kubenswrapper[4854]: I1007 12:38:01.317537 4854 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-hj8sj\" (UniqueName: \"kubernetes.io/projected/0ba19497-e9e9-4587-b726-cadfb140df77-kube-api-access-hj8sj\") pod \"03190830a20b4f4d382164e60d93db76834db5b34babf3517d759ffbdb8qswb\" (UID: \"0ba19497-e9e9-4587-b726-cadfb140df77\") " pod="openstack-operators/03190830a20b4f4d382164e60d93db76834db5b34babf3517d759ffbdb8qswb" Oct 07 12:38:01 crc kubenswrapper[4854]: I1007 12:38:01.318561 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0ba19497-e9e9-4587-b726-cadfb140df77-util\") pod \"03190830a20b4f4d382164e60d93db76834db5b34babf3517d759ffbdb8qswb\" (UID: \"0ba19497-e9e9-4587-b726-cadfb140df77\") " pod="openstack-operators/03190830a20b4f4d382164e60d93db76834db5b34babf3517d759ffbdb8qswb" Oct 07 12:38:01 crc kubenswrapper[4854]: I1007 12:38:01.318683 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0ba19497-e9e9-4587-b726-cadfb140df77-bundle\") pod \"03190830a20b4f4d382164e60d93db76834db5b34babf3517d759ffbdb8qswb\" (UID: \"0ba19497-e9e9-4587-b726-cadfb140df77\") " pod="openstack-operators/03190830a20b4f4d382164e60d93db76834db5b34babf3517d759ffbdb8qswb" Oct 07 12:38:01 crc kubenswrapper[4854]: I1007 12:38:01.340132 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hj8sj\" (UniqueName: \"kubernetes.io/projected/0ba19497-e9e9-4587-b726-cadfb140df77-kube-api-access-hj8sj\") pod \"03190830a20b4f4d382164e60d93db76834db5b34babf3517d759ffbdb8qswb\" (UID: \"0ba19497-e9e9-4587-b726-cadfb140df77\") " pod="openstack-operators/03190830a20b4f4d382164e60d93db76834db5b34babf3517d759ffbdb8qswb" Oct 07 12:38:01 crc kubenswrapper[4854]: I1007 12:38:01.363839 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/03190830a20b4f4d382164e60d93db76834db5b34babf3517d759ffbdb8qswb" Oct 07 12:38:01 crc kubenswrapper[4854]: I1007 12:38:01.587051 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/03190830a20b4f4d382164e60d93db76834db5b34babf3517d759ffbdb8qswb"] Oct 07 12:38:02 crc kubenswrapper[4854]: I1007 12:38:02.123333 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/03190830a20b4f4d382164e60d93db76834db5b34babf3517d759ffbdb8qswb" event={"ID":"0ba19497-e9e9-4587-b726-cadfb140df77","Type":"ContainerStarted","Data":"3e3e6d82ed6d07eabf0c5cad7d6afa553301e5e46635ff614f4bd9b2e67f3c2a"} Oct 07 12:38:04 crc kubenswrapper[4854]: I1007 12:38:04.143773 4854 generic.go:334] "Generic (PLEG): container finished" podID="0ba19497-e9e9-4587-b726-cadfb140df77" containerID="c0c82256503be3e678bf87a0455bb4fdb3b38e959c220545cdc3389411f87936" exitCode=0 Oct 07 12:38:04 crc kubenswrapper[4854]: I1007 12:38:04.143852 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/03190830a20b4f4d382164e60d93db76834db5b34babf3517d759ffbdb8qswb" event={"ID":"0ba19497-e9e9-4587-b726-cadfb140df77","Type":"ContainerDied","Data":"c0c82256503be3e678bf87a0455bb4fdb3b38e959c220545cdc3389411f87936"} Oct 07 12:38:09 crc kubenswrapper[4854]: I1007 12:38:09.197529 4854 generic.go:334] "Generic (PLEG): container finished" podID="0ba19497-e9e9-4587-b726-cadfb140df77" containerID="0412555004a2afc40f29a207bb474e8cb94db3d0b816dd9b32df2f988f9840d7" exitCode=0 Oct 07 12:38:09 crc kubenswrapper[4854]: I1007 12:38:09.197645 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/03190830a20b4f4d382164e60d93db76834db5b34babf3517d759ffbdb8qswb" event={"ID":"0ba19497-e9e9-4587-b726-cadfb140df77","Type":"ContainerDied","Data":"0412555004a2afc40f29a207bb474e8cb94db3d0b816dd9b32df2f988f9840d7"} Oct 07 12:38:10 crc kubenswrapper[4854]: I1007 12:38:10.208805 4854 generic.go:334] "Generic (PLEG): container finished" podID="0ba19497-e9e9-4587-b726-cadfb140df77" containerID="03df1244349acfe07eec38d1a84d1d25db543122a507601f411cc1aa62b1e649" exitCode=0 Oct 07 12:38:10 crc kubenswrapper[4854]: I1007 12:38:10.208890 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/03190830a20b4f4d382164e60d93db76834db5b34babf3517d759ffbdb8qswb" event={"ID":"0ba19497-e9e9-4587-b726-cadfb140df77","Type":"ContainerDied","Data":"03df1244349acfe07eec38d1a84d1d25db543122a507601f411cc1aa62b1e649"} Oct 07 12:38:10 crc kubenswrapper[4854]: I1007 12:38:10.807801 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 12:38:10 crc kubenswrapper[4854]: I1007 12:38:10.807901 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 12:38:11 crc kubenswrapper[4854]: I1007 12:38:11.537108 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/03190830a20b4f4d382164e60d93db76834db5b34babf3517d759ffbdb8qswb" Oct 07 12:38:11 crc kubenswrapper[4854]: I1007 12:38:11.586402 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0ba19497-e9e9-4587-b726-cadfb140df77-util\") pod \"0ba19497-e9e9-4587-b726-cadfb140df77\" (UID: \"0ba19497-e9e9-4587-b726-cadfb140df77\") " Oct 07 12:38:11 crc kubenswrapper[4854]: I1007 12:38:11.586477 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hj8sj\" (UniqueName: \"kubernetes.io/projected/0ba19497-e9e9-4587-b726-cadfb140df77-kube-api-access-hj8sj\") pod \"0ba19497-e9e9-4587-b726-cadfb140df77\" (UID: \"0ba19497-e9e9-4587-b726-cadfb140df77\") " Oct 07 12:38:11 crc kubenswrapper[4854]: I1007 12:38:11.586607 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0ba19497-e9e9-4587-b726-cadfb140df77-bundle\") pod \"0ba19497-e9e9-4587-b726-cadfb140df77\" (UID: \"0ba19497-e9e9-4587-b726-cadfb140df77\") " Oct 07 12:38:11 crc kubenswrapper[4854]: I1007 12:38:11.588003 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ba19497-e9e9-4587-b726-cadfb140df77-bundle" (OuterVolumeSpecName: "bundle") pod "0ba19497-e9e9-4587-b726-cadfb140df77" (UID: "0ba19497-e9e9-4587-b726-cadfb140df77"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:38:11 crc kubenswrapper[4854]: I1007 12:38:11.597137 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ba19497-e9e9-4587-b726-cadfb140df77-kube-api-access-hj8sj" (OuterVolumeSpecName: "kube-api-access-hj8sj") pod "0ba19497-e9e9-4587-b726-cadfb140df77" (UID: "0ba19497-e9e9-4587-b726-cadfb140df77"). InnerVolumeSpecName "kube-api-access-hj8sj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:38:11 crc kubenswrapper[4854]: I1007 12:38:11.602079 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ba19497-e9e9-4587-b726-cadfb140df77-util" (OuterVolumeSpecName: "util") pod "0ba19497-e9e9-4587-b726-cadfb140df77" (UID: "0ba19497-e9e9-4587-b726-cadfb140df77"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:38:11 crc kubenswrapper[4854]: I1007 12:38:11.689124 4854 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0ba19497-e9e9-4587-b726-cadfb140df77-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:38:11 crc kubenswrapper[4854]: I1007 12:38:11.689203 4854 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0ba19497-e9e9-4587-b726-cadfb140df77-util\") on node \"crc\" DevicePath \"\"" Oct 07 12:38:11 crc kubenswrapper[4854]: I1007 12:38:11.689224 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hj8sj\" (UniqueName: \"kubernetes.io/projected/0ba19497-e9e9-4587-b726-cadfb140df77-kube-api-access-hj8sj\") on node \"crc\" DevicePath \"\"" Oct 07 12:38:12 crc kubenswrapper[4854]: I1007 12:38:12.230366 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/03190830a20b4f4d382164e60d93db76834db5b34babf3517d759ffbdb8qswb" event={"ID":"0ba19497-e9e9-4587-b726-cadfb140df77","Type":"ContainerDied","Data":"3e3e6d82ed6d07eabf0c5cad7d6afa553301e5e46635ff614f4bd9b2e67f3c2a"} Oct 07 12:38:12 crc kubenswrapper[4854]: I1007 12:38:12.230450 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e3e6d82ed6d07eabf0c5cad7d6afa553301e5e46635ff614f4bd9b2e67f3c2a" Oct 07 12:38:12 crc kubenswrapper[4854]: I1007 12:38:12.230511 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/03190830a20b4f4d382164e60d93db76834db5b34babf3517d759ffbdb8qswb" Oct 07 12:38:23 crc kubenswrapper[4854]: I1007 12:38:23.846610 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-746fd59886-q46bc"] Oct 07 12:38:23 crc kubenswrapper[4854]: E1007 12:38:23.847733 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ba19497-e9e9-4587-b726-cadfb140df77" containerName="util" Oct 07 12:38:23 crc kubenswrapper[4854]: I1007 12:38:23.847753 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ba19497-e9e9-4587-b726-cadfb140df77" containerName="util" Oct 07 12:38:23 crc kubenswrapper[4854]: E1007 12:38:23.847773 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ba19497-e9e9-4587-b726-cadfb140df77" containerName="extract" Oct 07 12:38:23 crc kubenswrapper[4854]: I1007 12:38:23.847784 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ba19497-e9e9-4587-b726-cadfb140df77" containerName="extract" Oct 07 12:38:23 crc kubenswrapper[4854]: E1007 12:38:23.847802 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ba19497-e9e9-4587-b726-cadfb140df77" containerName="pull" Oct 07 12:38:23 crc kubenswrapper[4854]: I1007 12:38:23.847813 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ba19497-e9e9-4587-b726-cadfb140df77" containerName="pull" Oct 07 12:38:23 crc kubenswrapper[4854]: I1007 12:38:23.847983 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ba19497-e9e9-4587-b726-cadfb140df77" containerName="extract" Oct 07 12:38:23 crc kubenswrapper[4854]: I1007 12:38:23.849053 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-746fd59886-q46bc" Oct 07 12:38:23 crc kubenswrapper[4854]: I1007 12:38:23.851140 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-v9dg6" Oct 07 12:38:23 crc kubenswrapper[4854]: I1007 12:38:23.876953 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-746fd59886-q46bc"] Oct 07 12:38:24 crc kubenswrapper[4854]: I1007 12:38:24.012090 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2f2t\" (UniqueName: \"kubernetes.io/projected/07a9d82f-5beb-479b-8d8f-9650a3cb1a4e-kube-api-access-p2f2t\") pod \"openstack-operator-controller-operator-746fd59886-q46bc\" (UID: \"07a9d82f-5beb-479b-8d8f-9650a3cb1a4e\") " pod="openstack-operators/openstack-operator-controller-operator-746fd59886-q46bc" Oct 07 12:38:24 crc kubenswrapper[4854]: I1007 12:38:24.113682 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2f2t\" (UniqueName: \"kubernetes.io/projected/07a9d82f-5beb-479b-8d8f-9650a3cb1a4e-kube-api-access-p2f2t\") pod \"openstack-operator-controller-operator-746fd59886-q46bc\" (UID: \"07a9d82f-5beb-479b-8d8f-9650a3cb1a4e\") " pod="openstack-operators/openstack-operator-controller-operator-746fd59886-q46bc" Oct 07 12:38:24 crc kubenswrapper[4854]: I1007 12:38:24.149986 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2f2t\" (UniqueName: \"kubernetes.io/projected/07a9d82f-5beb-479b-8d8f-9650a3cb1a4e-kube-api-access-p2f2t\") pod \"openstack-operator-controller-operator-746fd59886-q46bc\" (UID: \"07a9d82f-5beb-479b-8d8f-9650a3cb1a4e\") " pod="openstack-operators/openstack-operator-controller-operator-746fd59886-q46bc" Oct 07 12:38:24 crc kubenswrapper[4854]: I1007 12:38:24.171111 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-746fd59886-q46bc" Oct 07 12:38:24 crc kubenswrapper[4854]: I1007 12:38:24.521795 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-746fd59886-q46bc"] Oct 07 12:38:25 crc kubenswrapper[4854]: I1007 12:38:25.365714 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-746fd59886-q46bc" event={"ID":"07a9d82f-5beb-479b-8d8f-9650a3cb1a4e","Type":"ContainerStarted","Data":"cb09c15e156e38ded1da56efe65323820a2a20cd5693fc8e5aa86d7940c3de8e"} Oct 07 12:38:29 crc kubenswrapper[4854]: I1007 12:38:29.435887 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-746fd59886-q46bc" event={"ID":"07a9d82f-5beb-479b-8d8f-9650a3cb1a4e","Type":"ContainerStarted","Data":"60bef1ace7a3a513637e308d604014f5582e01b9163dc7ca6e7276ab8bedd974"} Oct 07 12:38:31 crc kubenswrapper[4854]: I1007 12:38:31.462298 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-746fd59886-q46bc" event={"ID":"07a9d82f-5beb-479b-8d8f-9650a3cb1a4e","Type":"ContainerStarted","Data":"3c60bd7efc9557219892a958f5466eb864267ddd8039d7fe768c81ff410ae856"} Oct 07 12:38:31 crc kubenswrapper[4854]: I1007 12:38:31.462741 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-746fd59886-q46bc" Oct 07 12:38:31 crc kubenswrapper[4854]: I1007 12:38:31.499092 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-746fd59886-q46bc" podStartSLOduration=1.818035074 podStartE2EDuration="8.499068921s" podCreationTimestamp="2025-10-07 12:38:23 +0000 UTC" firstStartedPulling="2025-10-07 12:38:24.541931152 +0000 UTC m=+820.529763407" lastFinishedPulling="2025-10-07 12:38:31.222964999 +0000 UTC m=+827.210797254" observedRunningTime="2025-10-07 12:38:31.491724478 +0000 UTC m=+827.479556753" watchObservedRunningTime="2025-10-07 12:38:31.499068921 +0000 UTC m=+827.486901176" Oct 07 12:38:34 crc kubenswrapper[4854]: I1007 12:38:34.174403 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-746fd59886-q46bc" Oct 07 12:38:40 crc kubenswrapper[4854]: I1007 12:38:40.807730 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 12:38:40 crc kubenswrapper[4854]: I1007 12:38:40.808474 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 12:38:40 crc kubenswrapper[4854]: I1007 12:38:40.808555 4854 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" Oct 07 12:38:40 crc kubenswrapper[4854]: I1007 12:38:40.809489 4854 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ad58428eb7a4eee4282ef9e4c324813616eaa8e2cac7c60386af842a1cd061ba"} pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 12:38:40 crc kubenswrapper[4854]: I1007 12:38:40.809567 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" containerID="cri-o://ad58428eb7a4eee4282ef9e4c324813616eaa8e2cac7c60386af842a1cd061ba" gracePeriod=600 Oct 07 12:38:41 crc kubenswrapper[4854]: I1007 12:38:41.540652 4854 generic.go:334] "Generic (PLEG): container finished" podID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerID="ad58428eb7a4eee4282ef9e4c324813616eaa8e2cac7c60386af842a1cd061ba" exitCode=0 Oct 07 12:38:41 crc kubenswrapper[4854]: I1007 12:38:41.540780 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" event={"ID":"40b8b82d-cfd5-41d7-8673-5774db092c85","Type":"ContainerDied","Data":"ad58428eb7a4eee4282ef9e4c324813616eaa8e2cac7c60386af842a1cd061ba"} Oct 07 12:38:41 crc kubenswrapper[4854]: I1007 12:38:41.541235 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" event={"ID":"40b8b82d-cfd5-41d7-8673-5774db092c85","Type":"ContainerStarted","Data":"ce4378e583bfb2c39ca2c2683e3f5d09095f8b3086e8373f80ece7dbb8bdaa5a"} Oct 07 12:38:41 crc kubenswrapper[4854]: I1007 12:38:41.541273 4854 scope.go:117] "RemoveContainer" containerID="f5e36bd11ba8e83568e3659962294b47149b50c3ca170d0c33beb4ab9960604f" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.449385 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-58c4cd55f4-bstfk"] Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.451366 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-bstfk" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.453684 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-4fjsx" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.455464 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-7d4d4f8d-6tg4x"] Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.456951 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-6tg4x" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.458985 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-psnrv" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.478686 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-58c4cd55f4-bstfk"] Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.484318 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-7d4d4f8d-6tg4x"] Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.496093 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-5dc44df7d5-ml6xv"] Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.497506 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-ml6xv" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.502958 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-flbvw" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.503890 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-75dfd9b554-jw8gn"] Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.507126 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-jw8gn" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.511383 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-8mksr" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.525891 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5dc44df7d5-ml6xv"] Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.543050 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-54b4974c45-hqdgf"] Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.544545 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-hqdgf" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.551647 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-6vbtb" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.553225 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-75dfd9b554-jw8gn"] Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.587623 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-54b4974c45-hqdgf"] Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.595726 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-76d5b87f47-lknk5"] Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.597185 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-lknk5" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.603166 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-59qd7" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.614839 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-658588b8c9-5n2wq"] Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.616579 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-5n2wq" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.621694 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-c5tgl" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.621995 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.623380 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xh5p\" (UniqueName: \"kubernetes.io/projected/52f4d849-2233-4984-8afb-4ccf15d94914-kube-api-access-8xh5p\") pod \"designate-operator-controller-manager-75dfd9b554-jw8gn\" (UID: \"52f4d849-2233-4984-8afb-4ccf15d94914\") " pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-jw8gn" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.623481 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wn4cc\" (UniqueName: \"kubernetes.io/projected/6e979fa2-9b10-4bd5-9adc-8d9e116da401-kube-api-access-wn4cc\") pod \"glance-operator-controller-manager-5dc44df7d5-ml6xv\" (UID: \"6e979fa2-9b10-4bd5-9adc-8d9e116da401\") " pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-ml6xv" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.623515 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zg6j7\" (UniqueName: \"kubernetes.io/projected/b81a8a7a-ce97-4020-9cad-038a81ea3f79-kube-api-access-zg6j7\") pod \"cinder-operator-controller-manager-7d4d4f8d-6tg4x\" (UID: \"b81a8a7a-ce97-4020-9cad-038a81ea3f79\") " pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-6tg4x" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.623552 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2bx6\" (UniqueName: \"kubernetes.io/projected/c345db1e-94bb-4650-80af-e0c3dac97dbe-kube-api-access-l2bx6\") pod \"barbican-operator-controller-manager-58c4cd55f4-bstfk\" (UID: \"c345db1e-94bb-4650-80af-e0c3dac97dbe\") " pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-bstfk" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.626210 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-76d5b87f47-lknk5"] Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.670011 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-649675d675-h4lkl"] Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.673761 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-649675d675-h4lkl" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.676858 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-5gs97" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.689260 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-658588b8c9-5n2wq"] Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.732085 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4sbn\" (UniqueName: \"kubernetes.io/projected/3ddf497a-0424-4168-ae55-47a94d4d5124-kube-api-access-t4sbn\") pod \"heat-operator-controller-manager-54b4974c45-hqdgf\" (UID: \"3ddf497a-0424-4168-ae55-47a94d4d5124\") " pod="openstack-operators/heat-operator-controller-manager-54b4974c45-hqdgf" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.732258 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wn4cc\" (UniqueName: \"kubernetes.io/projected/6e979fa2-9b10-4bd5-9adc-8d9e116da401-kube-api-access-wn4cc\") pod \"glance-operator-controller-manager-5dc44df7d5-ml6xv\" (UID: \"6e979fa2-9b10-4bd5-9adc-8d9e116da401\") " pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-ml6xv" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.732284 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zg6j7\" (UniqueName: \"kubernetes.io/projected/b81a8a7a-ce97-4020-9cad-038a81ea3f79-kube-api-access-zg6j7\") pod \"cinder-operator-controller-manager-7d4d4f8d-6tg4x\" (UID: \"b81a8a7a-ce97-4020-9cad-038a81ea3f79\") " pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-6tg4x" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.732310 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2bx6\" (UniqueName: \"kubernetes.io/projected/c345db1e-94bb-4650-80af-e0c3dac97dbe-kube-api-access-l2bx6\") pod \"barbican-operator-controller-manager-58c4cd55f4-bstfk\" (UID: \"c345db1e-94bb-4650-80af-e0c3dac97dbe\") " pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-bstfk" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.732328 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9f33763b-0578-4fa9-8d46-51202dfb0b12-cert\") pod \"infra-operator-controller-manager-658588b8c9-5n2wq\" (UID: \"9f33763b-0578-4fa9-8d46-51202dfb0b12\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-5n2wq" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.732355 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czksx\" (UniqueName: \"kubernetes.io/projected/d09c6570-0b86-42a3-aa24-cd139b85c0fb-kube-api-access-czksx\") pod \"horizon-operator-controller-manager-76d5b87f47-lknk5\" (UID: \"d09c6570-0b86-42a3-aa24-cd139b85c0fb\") " pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-lknk5" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.732394 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsfcp\" (UniqueName: 
\"kubernetes.io/projected/9f33763b-0578-4fa9-8d46-51202dfb0b12-kube-api-access-dsfcp\") pod \"infra-operator-controller-manager-658588b8c9-5n2wq\" (UID: \"9f33763b-0578-4fa9-8d46-51202dfb0b12\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-5n2wq" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.732412 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xh5p\" (UniqueName: \"kubernetes.io/projected/52f4d849-2233-4984-8afb-4ccf15d94914-kube-api-access-8xh5p\") pod \"designate-operator-controller-manager-75dfd9b554-jw8gn\" (UID: \"52f4d849-2233-4984-8afb-4ccf15d94914\") " pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-jw8gn" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.740442 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-649675d675-h4lkl"] Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.746238 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-pnm28"] Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.747531 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-pnm28" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.753608 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-smcmc" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.760294 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-pnm28"] Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.786535 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zg6j7\" (UniqueName: \"kubernetes.io/projected/b81a8a7a-ce97-4020-9cad-038a81ea3f79-kube-api-access-zg6j7\") pod \"cinder-operator-controller-manager-7d4d4f8d-6tg4x\" (UID: \"b81a8a7a-ce97-4020-9cad-038a81ea3f79\") " pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-6tg4x" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.786738 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-65d89cfd9f-ct8rs"] Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.786910 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-6tg4x" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.788057 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-ct8rs" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.793976 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-nvh92"] Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.796584 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-nvh92" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.810641 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-65d89cfd9f-ct8rs"] Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.812118 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2bx6\" (UniqueName: \"kubernetes.io/projected/c345db1e-94bb-4650-80af-e0c3dac97dbe-kube-api-access-l2bx6\") pod \"barbican-operator-controller-manager-58c4cd55f4-bstfk\" (UID: \"c345db1e-94bb-4650-80af-e0c3dac97dbe\") " pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-bstfk" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.812186 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-v6gt2" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.812426 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wn4cc\" (UniqueName: \"kubernetes.io/projected/6e979fa2-9b10-4bd5-9adc-8d9e116da401-kube-api-access-wn4cc\") pod \"glance-operator-controller-manager-5dc44df7d5-ml6xv\" (UID: \"6e979fa2-9b10-4bd5-9adc-8d9e116da401\") " pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-ml6xv" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.812463 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-62fz5" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.812486 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xh5p\" (UniqueName: \"kubernetes.io/projected/52f4d849-2233-4984-8afb-4ccf15d94914-kube-api-access-8xh5p\") pod \"designate-operator-controller-manager-75dfd9b554-jw8gn\" (UID: \"52f4d849-2233-4984-8afb-4ccf15d94914\") " pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-jw8gn" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.821709 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-ml6xv" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.834714 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9f33763b-0578-4fa9-8d46-51202dfb0b12-cert\") pod \"infra-operator-controller-manager-658588b8c9-5n2wq\" (UID: \"9f33763b-0578-4fa9-8d46-51202dfb0b12\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-5n2wq" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.834788 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czksx\" (UniqueName: \"kubernetes.io/projected/d09c6570-0b86-42a3-aa24-cd139b85c0fb-kube-api-access-czksx\") pod \"horizon-operator-controller-manager-76d5b87f47-lknk5\" (UID: \"d09c6570-0b86-42a3-aa24-cd139b85c0fb\") " pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-lknk5" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.834843 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsfcp\" (UniqueName: \"kubernetes.io/projected/9f33763b-0578-4fa9-8d46-51202dfb0b12-kube-api-access-dsfcp\") pod \"infra-operator-controller-manager-658588b8c9-5n2wq\" (UID: \"9f33763b-0578-4fa9-8d46-51202dfb0b12\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-5n2wq" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.834879 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrq4x\" (UniqueName: \"kubernetes.io/projected/80562dd3-4b06-4e44-88e5-50febc56fa3d-kube-api-access-lrq4x\") pod \"ironic-operator-controller-manager-649675d675-h4lkl\" (UID: \"80562dd3-4b06-4e44-88e5-50febc56fa3d\") " pod="openstack-operators/ironic-operator-controller-manager-649675d675-h4lkl" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.834925 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4sbn\" (UniqueName: \"kubernetes.io/projected/3ddf497a-0424-4168-ae55-47a94d4d5124-kube-api-access-t4sbn\") pod \"heat-operator-controller-manager-54b4974c45-hqdgf\" (UID: \"3ddf497a-0424-4168-ae55-47a94d4d5124\") " pod="openstack-operators/heat-operator-controller-manager-54b4974c45-hqdgf" Oct 07 12:38:55 crc kubenswrapper[4854]: E1007 12:38:55.835681 4854 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Oct 07 12:38:55 crc kubenswrapper[4854]: E1007 12:38:55.835733 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f33763b-0578-4fa9-8d46-51202dfb0b12-cert podName:9f33763b-0578-4fa9-8d46-51202dfb0b12 nodeName:}" failed. No retries permitted until 2025-10-07 12:38:56.335713177 +0000 UTC m=+852.323545432 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9f33763b-0578-4fa9-8d46-51202dfb0b12-cert") pod "infra-operator-controller-manager-658588b8c9-5n2wq" (UID: "9f33763b-0578-4fa9-8d46-51202dfb0b12") : secret "infra-operator-webhook-server-cert" not found Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.840750 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-nvh92"] Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.845403 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-8d984cc4d-4774x"] Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.846648 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-4774x" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.849885 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-czx9c" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.855290 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-jw8gn" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.865632 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsfcp\" (UniqueName: \"kubernetes.io/projected/9f33763b-0578-4fa9-8d46-51202dfb0b12-kube-api-access-dsfcp\") pod \"infra-operator-controller-manager-658588b8c9-5n2wq\" (UID: \"9f33763b-0578-4fa9-8d46-51202dfb0b12\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-5n2wq" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.876855 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4sbn\" (UniqueName: \"kubernetes.io/projected/3ddf497a-0424-4168-ae55-47a94d4d5124-kube-api-access-t4sbn\") pod \"heat-operator-controller-manager-54b4974c45-hqdgf\" (UID: \"3ddf497a-0424-4168-ae55-47a94d4d5124\") " pod="openstack-operators/heat-operator-controller-manager-54b4974c45-hqdgf" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.880856 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czksx\" (UniqueName: \"kubernetes.io/projected/d09c6570-0b86-42a3-aa24-cd139b85c0fb-kube-api-access-czksx\") pod \"horizon-operator-controller-manager-76d5b87f47-lknk5\" (UID: \"d09c6570-0b86-42a3-aa24-cd139b85c0fb\") " pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-lknk5" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.884735 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-8d984cc4d-4774x"] Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.894282 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7468f855d8-kcmxb"] Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.896493 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-kcmxb" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.904228 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-7c7fc454ff-xdhf9"] Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.905376 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cf9n6c"] Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.906203 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cf9n6c" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.906448 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-xdhf9" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.906633 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-fgghj" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.910753 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7468f855d8-kcmxb"] Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.917666 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-dpcs4" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.917885 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.918029 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-4p5px" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.918055 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-7c7fc454ff-xdhf9"] Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.918408 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-lknk5" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.923301 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-sc7j5"] Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.934680 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-54689d9f88-m6qn8"] Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.942304 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-sc7j5" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.943272 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfvp5\" (UniqueName: \"kubernetes.io/projected/c7263a64-2cf7-404c-b690-d2042b34d0cd-kube-api-access-kfvp5\") pod \"manila-operator-controller-manager-65d89cfd9f-ct8rs\" (UID: \"c7263a64-2cf7-404c-b690-d2042b34d0cd\") " pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-ct8rs" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.943367 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrq4x\" (UniqueName: \"kubernetes.io/projected/80562dd3-4b06-4e44-88e5-50febc56fa3d-kube-api-access-lrq4x\") pod \"ironic-operator-controller-manager-649675d675-h4lkl\" (UID: \"80562dd3-4b06-4e44-88e5-50febc56fa3d\") " pod="openstack-operators/ironic-operator-controller-manager-649675d675-h4lkl" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.943402 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvdbk\" (UniqueName: \"kubernetes.io/projected/47a57505-60e3-4139-95d9-426eb48e4e56-kube-api-access-mvdbk\") pod \"mariadb-operator-controller-manager-6cd6d7bdf5-nvh92\" (UID: \"47a57505-60e3-4139-95d9-426eb48e4e56\") " pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-nvh92" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.943422 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6mqx\" (UniqueName: \"kubernetes.io/projected/1f529c42-e29a-4dfb-972b-5e143c2589b7-kube-api-access-l6mqx\") pod \"neutron-operator-controller-manager-8d984cc4d-4774x\" (UID: \"1f529c42-e29a-4dfb-972b-5e143c2589b7\") " pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-4774x" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.943522 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lj8p\" (UniqueName: \"kubernetes.io/projected/b4d43b9e-cdd5-4513-aa57-002a29687247-kube-api-access-2lj8p\") pod \"keystone-operator-controller-manager-7b5ccf6d9c-pnm28\" (UID: \"b4d43b9e-cdd5-4513-aa57-002a29687247\") " pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-pnm28" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.945210 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-xrcjd" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.949669 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-sc7j5"] Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.949715 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-54689d9f88-m6qn8"] Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.949825 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-m6qn8" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.952655 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-wmvmr" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.953901 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-6859f9b676-x5n7x"] Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.955091 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-x5n7x" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.960470 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrq4x\" (UniqueName: \"kubernetes.io/projected/80562dd3-4b06-4e44-88e5-50febc56fa3d-kube-api-access-lrq4x\") pod \"ironic-operator-controller-manager-649675d675-h4lkl\" (UID: \"80562dd3-4b06-4e44-88e5-50febc56fa3d\") " pod="openstack-operators/ironic-operator-controller-manager-649675d675-h4lkl" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.960810 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cf9n6c"] Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.961226 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-456rh" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.972637 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-6859f9b676-x5n7x"] Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.976510 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-d4s4h"] Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.978041 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-d4s4h" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.981379 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-bs7gq" Oct 07 12:38:55 crc kubenswrapper[4854]: I1007 12:38:55.990297 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-d4s4h"] Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.012576 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd5cb47d7-vvcpc"] Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.016211 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-vvcpc" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.017430 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-649675d675-h4lkl" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.021530 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-n27xb" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.044870 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6mqx\" (UniqueName: \"kubernetes.io/projected/1f529c42-e29a-4dfb-972b-5e143c2589b7-kube-api-access-l6mqx\") pod \"neutron-operator-controller-manager-8d984cc4d-4774x\" (UID: \"1f529c42-e29a-4dfb-972b-5e143c2589b7\") " pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-4774x" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.044928 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwl4h\" (UniqueName: \"kubernetes.io/projected/602ad7aa-3b22-476b-849c-c1138b5e6223-kube-api-access-qwl4h\") pod \"swift-operator-controller-manager-6859f9b676-x5n7x\" (UID: \"602ad7aa-3b22-476b-849c-c1138b5e6223\") " pod="openstack-operators/swift-operator-controller-manager-6859f9b676-x5n7x" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.044954 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk82v\" (UniqueName: \"kubernetes.io/projected/dee05e6c-eda8-4059-a9b0-88b3ab2eb219-kube-api-access-zk82v\") pod \"nova-operator-controller-manager-7c7fc454ff-xdhf9\" (UID: \"dee05e6c-eda8-4059-a9b0-88b3ab2eb219\") " pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-xdhf9" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.044979 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f66ed5a7-8fda-4f43-bf10-c1709f30a858-cert\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665cf9n6c\" (UID: \"f66ed5a7-8fda-4f43-bf10-c1709f30a858\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cf9n6c" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.045005 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdd7j\" (UniqueName: \"kubernetes.io/projected/d3b199b3-50a7-4e60-b377-101e8c0d1882-kube-api-access-xdd7j\") pod \"placement-operator-controller-manager-54689d9f88-m6qn8\" (UID: \"d3b199b3-50a7-4e60-b377-101e8c0d1882\") " pod="openstack-operators/placement-operator-controller-manager-54689d9f88-m6qn8" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.045028 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lj8p\" (UniqueName: \"kubernetes.io/projected/b4d43b9e-cdd5-4513-aa57-002a29687247-kube-api-access-2lj8p\") pod \"keystone-operator-controller-manager-7b5ccf6d9c-pnm28\" (UID: \"b4d43b9e-cdd5-4513-aa57-002a29687247\") " pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-pnm28" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.045070 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xbtw\" (UniqueName: \"kubernetes.io/projected/101bbc87-6377-436d-b179-e0368ce39e68-kube-api-access-6xbtw\") pod \"octavia-operator-controller-manager-7468f855d8-kcmxb\" (UID: \"101bbc87-6377-436d-b179-e0368ce39e68\") " 
pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-kcmxb" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.045092 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8t95\" (UniqueName: \"kubernetes.io/projected/f66ed5a7-8fda-4f43-bf10-c1709f30a858-kube-api-access-j8t95\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665cf9n6c\" (UID: \"f66ed5a7-8fda-4f43-bf10-c1709f30a858\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cf9n6c" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.045120 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfvp5\" (UniqueName: \"kubernetes.io/projected/c7263a64-2cf7-404c-b690-d2042b34d0cd-kube-api-access-kfvp5\") pod \"manila-operator-controller-manager-65d89cfd9f-ct8rs\" (UID: \"c7263a64-2cf7-404c-b690-d2042b34d0cd\") " pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-ct8rs" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.045161 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrr8x\" (UniqueName: \"kubernetes.io/projected/35f2082e-767b-4de0-9c71-76d1d1cb020a-kube-api-access-mrr8x\") pod \"ovn-operator-controller-manager-6d8b6f9b9-sc7j5\" (UID: \"35f2082e-767b-4de0-9c71-76d1d1cb020a\") " pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-sc7j5" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.045183 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvdbk\" (UniqueName: \"kubernetes.io/projected/47a57505-60e3-4139-95d9-426eb48e4e56-kube-api-access-mvdbk\") pod \"mariadb-operator-controller-manager-6cd6d7bdf5-nvh92\" (UID: \"47a57505-60e3-4139-95d9-426eb48e4e56\") " pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-nvh92" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.066772 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd5cb47d7-vvcpc"] Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.077601 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfvp5\" (UniqueName: \"kubernetes.io/projected/c7263a64-2cf7-404c-b690-d2042b34d0cd-kube-api-access-kfvp5\") pod \"manila-operator-controller-manager-65d89cfd9f-ct8rs\" (UID: \"c7263a64-2cf7-404c-b690-d2042b34d0cd\") " pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-ct8rs" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.080024 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-bstfk" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.083301 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvdbk\" (UniqueName: \"kubernetes.io/projected/47a57505-60e3-4139-95d9-426eb48e4e56-kube-api-access-mvdbk\") pod \"mariadb-operator-controller-manager-6cd6d7bdf5-nvh92\" (UID: \"47a57505-60e3-4139-95d9-426eb48e4e56\") " pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-nvh92" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.083407 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lj8p\" (UniqueName: \"kubernetes.io/projected/b4d43b9e-cdd5-4513-aa57-002a29687247-kube-api-access-2lj8p\") pod \"keystone-operator-controller-manager-7b5ccf6d9c-pnm28\" (UID: \"b4d43b9e-cdd5-4513-aa57-002a29687247\") " pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-pnm28" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.114547 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6mqx\" (UniqueName: \"kubernetes.io/projected/1f529c42-e29a-4dfb-972b-5e143c2589b7-kube-api-access-l6mqx\") pod \"neutron-operator-controller-manager-8d984cc4d-4774x\" (UID: \"1f529c42-e29a-4dfb-972b-5e143c2589b7\") " pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-4774x" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.136969 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6cbc6dd547-dkw22"] Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.152312 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwl4h\" (UniqueName: \"kubernetes.io/projected/602ad7aa-3b22-476b-849c-c1138b5e6223-kube-api-access-qwl4h\") pod \"swift-operator-controller-manager-6859f9b676-x5n7x\" (UID: \"602ad7aa-3b22-476b-849c-c1138b5e6223\") " pod="openstack-operators/swift-operator-controller-manager-6859f9b676-x5n7x" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.152362 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zk82v\" (UniqueName: \"kubernetes.io/projected/dee05e6c-eda8-4059-a9b0-88b3ab2eb219-kube-api-access-zk82v\") pod \"nova-operator-controller-manager-7c7fc454ff-xdhf9\" (UID: \"dee05e6c-eda8-4059-a9b0-88b3ab2eb219\") " pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-xdhf9" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.152396 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f66ed5a7-8fda-4f43-bf10-c1709f30a858-cert\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665cf9n6c\" (UID: \"f66ed5a7-8fda-4f43-bf10-c1709f30a858\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cf9n6c" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.152429 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdd7j\" (UniqueName: \"kubernetes.io/projected/d3b199b3-50a7-4e60-b377-101e8c0d1882-kube-api-access-xdd7j\") pod \"placement-operator-controller-manager-54689d9f88-m6qn8\" (UID: \"d3b199b3-50a7-4e60-b377-101e8c0d1882\") " pod="openstack-operators/placement-operator-controller-manager-54689d9f88-m6qn8" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.152492 4854 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xbtw\" (UniqueName: \"kubernetes.io/projected/101bbc87-6377-436d-b179-e0368ce39e68-kube-api-access-6xbtw\") pod \"octavia-operator-controller-manager-7468f855d8-kcmxb\" (UID: \"101bbc87-6377-436d-b179-e0368ce39e68\") " pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-kcmxb" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.152521 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8t95\" (UniqueName: \"kubernetes.io/projected/f66ed5a7-8fda-4f43-bf10-c1709f30a858-kube-api-access-j8t95\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665cf9n6c\" (UID: \"f66ed5a7-8fda-4f43-bf10-c1709f30a858\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cf9n6c" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.152565 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lr2rg\" (UniqueName: \"kubernetes.io/projected/bae9e78e-d7e2-4b0b-851b-0705610a640b-kube-api-access-lr2rg\") pod \"telemetry-operator-controller-manager-5d4d74dd89-d4s4h\" (UID: \"bae9e78e-d7e2-4b0b-851b-0705610a640b\") " pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-d4s4h" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.152601 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrr8x\" (UniqueName: \"kubernetes.io/projected/35f2082e-767b-4de0-9c71-76d1d1cb020a-kube-api-access-mrr8x\") pod \"ovn-operator-controller-manager-6d8b6f9b9-sc7j5\" (UID: \"35f2082e-767b-4de0-9c71-76d1d1cb020a\") " pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-sc7j5" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.152626 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dt9jc\" (UniqueName: \"kubernetes.io/projected/3cf5fba8-8e19-4c34-bea9-5b910302d68f-kube-api-access-dt9jc\") pod \"test-operator-controller-manager-5cd5cb47d7-vvcpc\" (UID: \"3cf5fba8-8e19-4c34-bea9-5b910302d68f\") " pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-vvcpc" Oct 07 12:38:56 crc kubenswrapper[4854]: E1007 12:38:56.155815 4854 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 07 12:38:56 crc kubenswrapper[4854]: E1007 12:38:56.155901 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f66ed5a7-8fda-4f43-bf10-c1709f30a858-cert podName:f66ed5a7-8fda-4f43-bf10-c1709f30a858 nodeName:}" failed. No retries permitted until 2025-10-07 12:38:56.655861091 +0000 UTC m=+852.643693346 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f66ed5a7-8fda-4f43-bf10-c1709f30a858-cert") pod "openstack-baremetal-operator-controller-manager-5dfbbd665cf9n6c" (UID: "f66ed5a7-8fda-4f43-bf10-c1709f30a858") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.173494 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-hqdgf" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.185753 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xbtw\" (UniqueName: \"kubernetes.io/projected/101bbc87-6377-436d-b179-e0368ce39e68-kube-api-access-6xbtw\") pod \"octavia-operator-controller-manager-7468f855d8-kcmxb\" (UID: \"101bbc87-6377-436d-b179-e0368ce39e68\") " pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-kcmxb" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.200927 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-dkw22" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.207604 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zk82v\" (UniqueName: \"kubernetes.io/projected/dee05e6c-eda8-4059-a9b0-88b3ab2eb219-kube-api-access-zk82v\") pod \"nova-operator-controller-manager-7c7fc454ff-xdhf9\" (UID: \"dee05e6c-eda8-4059-a9b0-88b3ab2eb219\") " pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-xdhf9" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.209288 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-6hdhn" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.212301 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8t95\" (UniqueName: \"kubernetes.io/projected/f66ed5a7-8fda-4f43-bf10-c1709f30a858-kube-api-access-j8t95\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665cf9n6c\" (UID: \"f66ed5a7-8fda-4f43-bf10-c1709f30a858\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cf9n6c" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.212792 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwl4h\" (UniqueName: \"kubernetes.io/projected/602ad7aa-3b22-476b-849c-c1138b5e6223-kube-api-access-qwl4h\") pod \"swift-operator-controller-manager-6859f9b676-x5n7x\" (UID: \"602ad7aa-3b22-476b-849c-c1138b5e6223\") " pod="openstack-operators/swift-operator-controller-manager-6859f9b676-x5n7x" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.231138 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6cbc6dd547-dkw22"] Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.238689 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrr8x\" (UniqueName: \"kubernetes.io/projected/35f2082e-767b-4de0-9c71-76d1d1cb020a-kube-api-access-mrr8x\") pod \"ovn-operator-controller-manager-6d8b6f9b9-sc7j5\" (UID: \"35f2082e-767b-4de0-9c71-76d1d1cb020a\") " pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-sc7j5" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.239317 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdd7j\" (UniqueName: \"kubernetes.io/projected/d3b199b3-50a7-4e60-b377-101e8c0d1882-kube-api-access-xdd7j\") pod \"placement-operator-controller-manager-54689d9f88-m6qn8\" (UID: \"d3b199b3-50a7-4e60-b377-101e8c0d1882\") " pod="openstack-operators/placement-operator-controller-manager-54689d9f88-m6qn8" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.250630 4854 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-pnm28" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.259350 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dt9jc\" (UniqueName: \"kubernetes.io/projected/3cf5fba8-8e19-4c34-bea9-5b910302d68f-kube-api-access-dt9jc\") pod \"test-operator-controller-manager-5cd5cb47d7-vvcpc\" (UID: \"3cf5fba8-8e19-4c34-bea9-5b910302d68f\") " pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-vvcpc" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.259513 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lr2rg\" (UniqueName: \"kubernetes.io/projected/bae9e78e-d7e2-4b0b-851b-0705610a640b-kube-api-access-lr2rg\") pod \"telemetry-operator-controller-manager-5d4d74dd89-d4s4h\" (UID: \"bae9e78e-d7e2-4b0b-851b-0705610a640b\") " pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-d4s4h" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.296211 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-sc7j5" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.302109 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lr2rg\" (UniqueName: \"kubernetes.io/projected/bae9e78e-d7e2-4b0b-851b-0705610a640b-kube-api-access-lr2rg\") pod \"telemetry-operator-controller-manager-5d4d74dd89-d4s4h\" (UID: \"bae9e78e-d7e2-4b0b-851b-0705610a640b\") " pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-d4s4h" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.304160 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-ct8rs" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.312884 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-nvh92" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.313576 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dt9jc\" (UniqueName: \"kubernetes.io/projected/3cf5fba8-8e19-4c34-bea9-5b910302d68f-kube-api-access-dt9jc\") pod \"test-operator-controller-manager-5cd5cb47d7-vvcpc\" (UID: \"3cf5fba8-8e19-4c34-bea9-5b910302d68f\") " pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-vvcpc" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.325462 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-m6qn8" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.328667 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-4774x" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.341642 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6559cd6d74-82w2g"] Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.351019 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6559cd6d74-82w2g" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.356093 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.361017 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-t2tbw" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.362651 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9f33763b-0578-4fa9-8d46-51202dfb0b12-cert\") pod \"infra-operator-controller-manager-658588b8c9-5n2wq\" (UID: \"9f33763b-0578-4fa9-8d46-51202dfb0b12\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-5n2wq" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.362778 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sph9t\" (UniqueName: \"kubernetes.io/projected/68f7e193-d05e-4ae9-a3b8-c075b608dfa9-kube-api-access-sph9t\") pod \"watcher-operator-controller-manager-6cbc6dd547-dkw22\" (UID: \"68f7e193-d05e-4ae9-a3b8-c075b608dfa9\") " pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-dkw22" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.373654 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9f33763b-0578-4fa9-8d46-51202dfb0b12-cert\") pod \"infra-operator-controller-manager-658588b8c9-5n2wq\" (UID: \"9f33763b-0578-4fa9-8d46-51202dfb0b12\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-5n2wq" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.379035 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-x5n7x" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.380289 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6559cd6d74-82w2g"] Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.408229 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-7nqjj"] Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.409450 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-7nqjj" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.413321 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-kgqtz" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.418024 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-7nqjj"] Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.423117 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-kcmxb" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.427623 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-7d4d4f8d-6tg4x"] Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.442892 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-d4s4h" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.448870 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-vvcpc" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.465054 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sph9t\" (UniqueName: \"kubernetes.io/projected/68f7e193-d05e-4ae9-a3b8-c075b608dfa9-kube-api-access-sph9t\") pod \"watcher-operator-controller-manager-6cbc6dd547-dkw22\" (UID: \"68f7e193-d05e-4ae9-a3b8-c075b608dfa9\") " pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-dkw22" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.465256 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3b140fde-b7d7-4a32-b80e-0cfa788c09b5-cert\") pod \"openstack-operator-controller-manager-6559cd6d74-82w2g\" (UID: \"3b140fde-b7d7-4a32-b80e-0cfa788c09b5\") " pod="openstack-operators/openstack-operator-controller-manager-6559cd6d74-82w2g" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.465292 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94znf\" (UniqueName: \"kubernetes.io/projected/e7938e36-fecf-4df1-9e1d-886f84e4c597-kube-api-access-94znf\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-7nqjj\" (UID: \"e7938e36-fecf-4df1-9e1d-886f84e4c597\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-7nqjj" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.465325 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4j72\" (UniqueName: \"kubernetes.io/projected/3b140fde-b7d7-4a32-b80e-0cfa788c09b5-kube-api-access-w4j72\") pod \"openstack-operator-controller-manager-6559cd6d74-82w2g\" (UID: \"3b140fde-b7d7-4a32-b80e-0cfa788c09b5\") " pod="openstack-operators/openstack-operator-controller-manager-6559cd6d74-82w2g" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.497550 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-xdhf9" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.518832 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sph9t\" (UniqueName: \"kubernetes.io/projected/68f7e193-d05e-4ae9-a3b8-c075b608dfa9-kube-api-access-sph9t\") pod \"watcher-operator-controller-manager-6cbc6dd547-dkw22\" (UID: \"68f7e193-d05e-4ae9-a3b8-c075b608dfa9\") " pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-dkw22" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.566176 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3b140fde-b7d7-4a32-b80e-0cfa788c09b5-cert\") pod \"openstack-operator-controller-manager-6559cd6d74-82w2g\" (UID: \"3b140fde-b7d7-4a32-b80e-0cfa788c09b5\") " pod="openstack-operators/openstack-operator-controller-manager-6559cd6d74-82w2g" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.566234 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94znf\" (UniqueName: \"kubernetes.io/projected/e7938e36-fecf-4df1-9e1d-886f84e4c597-kube-api-access-94znf\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-7nqjj\" (UID: \"e7938e36-fecf-4df1-9e1d-886f84e4c597\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-7nqjj" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.566273 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4j72\" (UniqueName: \"kubernetes.io/projected/3b140fde-b7d7-4a32-b80e-0cfa788c09b5-kube-api-access-w4j72\") pod \"openstack-operator-controller-manager-6559cd6d74-82w2g\" (UID: \"3b140fde-b7d7-4a32-b80e-0cfa788c09b5\") " pod="openstack-operators/openstack-operator-controller-manager-6559cd6d74-82w2g" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.566856 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-5n2wq" Oct 07 12:38:56 crc kubenswrapper[4854]: E1007 12:38:56.573305 4854 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Oct 07 12:38:56 crc kubenswrapper[4854]: E1007 12:38:56.573371 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3b140fde-b7d7-4a32-b80e-0cfa788c09b5-cert podName:3b140fde-b7d7-4a32-b80e-0cfa788c09b5 nodeName:}" failed. No retries permitted until 2025-10-07 12:38:57.073351458 +0000 UTC m=+853.061183713 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3b140fde-b7d7-4a32-b80e-0cfa788c09b5-cert") pod "openstack-operator-controller-manager-6559cd6d74-82w2g" (UID: "3b140fde-b7d7-4a32-b80e-0cfa788c09b5") : secret "webhook-server-cert" not found Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.617136 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94znf\" (UniqueName: \"kubernetes.io/projected/e7938e36-fecf-4df1-9e1d-886f84e4c597-kube-api-access-94znf\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-7nqjj\" (UID: \"e7938e36-fecf-4df1-9e1d-886f84e4c597\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-7nqjj" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.628937 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4j72\" (UniqueName: \"kubernetes.io/projected/3b140fde-b7d7-4a32-b80e-0cfa788c09b5-kube-api-access-w4j72\") pod \"openstack-operator-controller-manager-6559cd6d74-82w2g\" (UID: \"3b140fde-b7d7-4a32-b80e-0cfa788c09b5\") " pod="openstack-operators/openstack-operator-controller-manager-6559cd6d74-82w2g" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.661219 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-6tg4x" event={"ID":"b81a8a7a-ce97-4020-9cad-038a81ea3f79","Type":"ContainerStarted","Data":"6f10f36243c98758291b5a43c8bc2373c7ebd5876cf2e2e5006f72ec31318aba"} Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.662122 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5dc44df7d5-ml6xv"] Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.668950 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f66ed5a7-8fda-4f43-bf10-c1709f30a858-cert\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665cf9n6c\" (UID: \"f66ed5a7-8fda-4f43-bf10-c1709f30a858\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cf9n6c" Oct 07 12:38:56 crc kubenswrapper[4854]: E1007 12:38:56.669112 4854 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 07 12:38:56 crc kubenswrapper[4854]: E1007 12:38:56.673690 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f66ed5a7-8fda-4f43-bf10-c1709f30a858-cert podName:f66ed5a7-8fda-4f43-bf10-c1709f30a858 nodeName:}" failed. No retries permitted until 2025-10-07 12:38:57.673653508 +0000 UTC m=+853.661485763 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f66ed5a7-8fda-4f43-bf10-c1709f30a858-cert") pod "openstack-baremetal-operator-controller-manager-5dfbbd665cf9n6c" (UID: "f66ed5a7-8fda-4f43-bf10-c1709f30a858") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.738251 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-75dfd9b554-jw8gn"] Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.769790 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-dkw22" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.864952 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-7nqjj" Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.912502 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-76d5b87f47-lknk5"] Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.938026 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-649675d675-h4lkl"] Oct 07 12:38:56 crc kubenswrapper[4854]: I1007 12:38:56.972748 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-58c4cd55f4-bstfk"] Oct 07 12:38:56 crc kubenswrapper[4854]: W1007 12:38:56.990188 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd09c6570_0b86_42a3_aa24_cd139b85c0fb.slice/crio-47d77982e769303683f60e7dc9ab703c53708aa4865009ef54338bb451e1c494 WatchSource:0}: Error finding container 47d77982e769303683f60e7dc9ab703c53708aa4865009ef54338bb451e1c494: Status 404 returned error can't find the container with id 47d77982e769303683f60e7dc9ab703c53708aa4865009ef54338bb451e1c494 Oct 07 12:38:57 crc kubenswrapper[4854]: I1007 12:38:57.085003 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3b140fde-b7d7-4a32-b80e-0cfa788c09b5-cert\") pod \"openstack-operator-controller-manager-6559cd6d74-82w2g\" (UID: \"3b140fde-b7d7-4a32-b80e-0cfa788c09b5\") " pod="openstack-operators/openstack-operator-controller-manager-6559cd6d74-82w2g" Oct 07 12:38:57 crc kubenswrapper[4854]: I1007 12:38:57.094228 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3b140fde-b7d7-4a32-b80e-0cfa788c09b5-cert\") pod \"openstack-operator-controller-manager-6559cd6d74-82w2g\" (UID: \"3b140fde-b7d7-4a32-b80e-0cfa788c09b5\") " pod="openstack-operators/openstack-operator-controller-manager-6559cd6d74-82w2g" Oct 07 12:38:57 crc kubenswrapper[4854]: I1007 12:38:57.114610 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6559cd6d74-82w2g" Oct 07 12:38:57 crc kubenswrapper[4854]: I1007 12:38:57.177186 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-pnm28"] Oct 07 12:38:57 crc kubenswrapper[4854]: W1007 12:38:57.201886 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4d43b9e_cdd5_4513_aa57_002a29687247.slice/crio-278a20883afa3a83d2cc83a12bab2931e73632a07d2715a07f0ccdaf59f06ada WatchSource:0}: Error finding container 278a20883afa3a83d2cc83a12bab2931e73632a07d2715a07f0ccdaf59f06ada: Status 404 returned error can't find the container with id 278a20883afa3a83d2cc83a12bab2931e73632a07d2715a07f0ccdaf59f06ada Oct 07 12:38:57 crc kubenswrapper[4854]: I1007 12:38:57.440055 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-6859f9b676-x5n7x"] Oct 07 12:38:57 crc kubenswrapper[4854]: W1007 12:38:57.448942 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod602ad7aa_3b22_476b_849c_c1138b5e6223.slice/crio-8b0376c7ed4b6c28f2189e753452f932cdd4f68dfe7d90dbb0edaa942b3c9e4e WatchSource:0}: Error finding container 8b0376c7ed4b6c28f2189e753452f932cdd4f68dfe7d90dbb0edaa942b3c9e4e: Status 404 returned error can't find the container with id 8b0376c7ed4b6c28f2189e753452f932cdd4f68dfe7d90dbb0edaa942b3c9e4e Oct 07 12:38:57 crc kubenswrapper[4854]: I1007 12:38:57.618676 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-sc7j5"] Oct 07 12:38:57 crc kubenswrapper[4854]: I1007 12:38:57.627225 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-54689d9f88-m6qn8"] Oct 07 12:38:57 crc kubenswrapper[4854]: I1007 12:38:57.634140 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-54b4974c45-hqdgf"] Oct 07 12:38:57 crc kubenswrapper[4854]: W1007 12:38:57.638861 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47a57505_60e3_4139_95d9_426eb48e4e56.slice/crio-bee77fb4495625c20f7c5110f90a87301b1ddf103a8ed449130a580431b344cd WatchSource:0}: Error finding container bee77fb4495625c20f7c5110f90a87301b1ddf103a8ed449130a580431b344cd: Status 404 returned error can't find the container with id bee77fb4495625c20f7c5110f90a87301b1ddf103a8ed449130a580431b344cd Oct 07 12:38:57 crc kubenswrapper[4854]: I1007 12:38:57.643477 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-nvh92"] Oct 07 12:38:57 crc kubenswrapper[4854]: I1007 12:38:57.660723 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7468f855d8-kcmxb"] Oct 07 12:38:57 crc kubenswrapper[4854]: I1007 12:38:57.700130 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f66ed5a7-8fda-4f43-bf10-c1709f30a858-cert\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665cf9n6c\" (UID: \"f66ed5a7-8fda-4f43-bf10-c1709f30a858\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cf9n6c" Oct 07 
12:38:57 crc kubenswrapper[4854]: I1007 12:38:57.703899 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-pnm28" event={"ID":"b4d43b9e-cdd5-4513-aa57-002a29687247","Type":"ContainerStarted","Data":"278a20883afa3a83d2cc83a12bab2931e73632a07d2715a07f0ccdaf59f06ada"} Oct 07 12:38:57 crc kubenswrapper[4854]: I1007 12:38:57.706921 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-d4s4h"] Oct 07 12:38:57 crc kubenswrapper[4854]: I1007 12:38:57.708211 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f66ed5a7-8fda-4f43-bf10-c1709f30a858-cert\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665cf9n6c\" (UID: \"f66ed5a7-8fda-4f43-bf10-c1709f30a858\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cf9n6c" Oct 07 12:38:57 crc kubenswrapper[4854]: I1007 12:38:57.716031 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-bstfk" event={"ID":"c345db1e-94bb-4650-80af-e0c3dac97dbe","Type":"ContainerStarted","Data":"84d1da54e2f2f30a5bfba660d36832c6b7865c43111ff8e667f8b017756ca0f8"} Oct 07 12:38:57 crc kubenswrapper[4854]: I1007 12:38:57.718784 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-lknk5" event={"ID":"d09c6570-0b86-42a3-aa24-cd139b85c0fb","Type":"ContainerStarted","Data":"47d77982e769303683f60e7dc9ab703c53708aa4865009ef54338bb451e1c494"} Oct 07 12:38:57 crc kubenswrapper[4854]: I1007 12:38:57.722406 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-7c7fc454ff-xdhf9"] Oct 07 12:38:57 crc kubenswrapper[4854]: I1007 12:38:57.722823 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-ml6xv" event={"ID":"6e979fa2-9b10-4bd5-9adc-8d9e116da401","Type":"ContainerStarted","Data":"10801654f282449f9955a54a37585ddd91fb557b2fb60ca3630feff85b5d1ff0"} Oct 07 12:38:57 crc kubenswrapper[4854]: I1007 12:38:57.728531 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-658588b8c9-5n2wq"] Oct 07 12:38:57 crc kubenswrapper[4854]: I1007 12:38:57.732808 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-x5n7x" event={"ID":"602ad7aa-3b22-476b-849c-c1138b5e6223","Type":"ContainerStarted","Data":"8b0376c7ed4b6c28f2189e753452f932cdd4f68dfe7d90dbb0edaa942b3c9e4e"} Oct 07 12:38:57 crc kubenswrapper[4854]: E1007 12:38:57.733254 4854 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:dfd044635f9df9ed1d249387fa622177db35cdc72475e1c570617b8d17c64862,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} 
BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-l6mqx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-8d984cc4d-4774x_openstack-operators(1f529c42-e29a-4dfb-972b-5e143c2589b7): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 07 12:38:57 crc kubenswrapper[4854]: E1007 12:38:57.733447 4854 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:0daf76cc40ab619ae266b11defcc1b65beb22d859369e7b1b04de9169089a4cb,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dt9jc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5cd5cb47d7-vvcpc_openstack-operators(3cf5fba8-8e19-4c34-bea9-5b910302d68f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 07 12:38:57 crc kubenswrapper[4854]: I1007 12:38:57.734834 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-nvh92" event={"ID":"47a57505-60e3-4139-95d9-426eb48e4e56","Type":"ContainerStarted","Data":"bee77fb4495625c20f7c5110f90a87301b1ddf103a8ed449130a580431b344cd"} Oct 07 12:38:57 crc kubenswrapper[4854]: E1007 12:38:57.734922 4854 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:64f57b2b59dea2bd9fae91490c5bec2687131884a049e6579819d9f951b877c6,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sph9t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6cbc6dd547-dkw22_openstack-operators(68f7e193-d05e-4ae9-a3b8-c075b608dfa9): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 07 12:38:57 crc kubenswrapper[4854]: I1007 12:38:57.736433 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd5cb47d7-vvcpc"] Oct 07 12:38:57 crc kubenswrapper[4854]: I1007 12:38:57.740804 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-m6qn8" event={"ID":"d3b199b3-50a7-4e60-b377-101e8c0d1882","Type":"ContainerStarted","Data":"64365806854bceecfe187bc7aa210a94938ac1b4a09c643c0786c0609bb3e892"} Oct 07 12:38:57 crc kubenswrapper[4854]: I1007 12:38:57.743816 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-65d89cfd9f-ct8rs"] Oct 07 12:38:57 crc kubenswrapper[4854]: I1007 12:38:57.743866 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-649675d675-h4lkl" event={"ID":"80562dd3-4b06-4e44-88e5-50febc56fa3d","Type":"ContainerStarted","Data":"dc0f09f6701188d6d6092628028d5df9862e7f3868b92b5ae8b872686b65face"} Oct 07 12:38:57 crc kubenswrapper[4854]: I1007 12:38:57.745440 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-jw8gn" event={"ID":"52f4d849-2233-4984-8afb-4ccf15d94914","Type":"ContainerStarted","Data":"b096e4a2390a568113702e001d79943599a41eed4fc1e9f34990d27204257ac2"} Oct 07 12:38:57 crc kubenswrapper[4854]: I1007 12:38:57.756916 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-8d984cc4d-4774x"] Oct 07 12:38:57 crc kubenswrapper[4854]: I1007 12:38:57.764696 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6cbc6dd547-dkw22"] Oct 07 12:38:57 crc kubenswrapper[4854]: W1007 12:38:57.768352 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddee05e6c_eda8_4059_a9b0_88b3ab2eb219.slice/crio-bcbc0cea3694fd2d94018bf74092ba189546a9f8d7677304b3c86293476fd9e5 WatchSource:0}: Error finding container bcbc0cea3694fd2d94018bf74092ba189546a9f8d7677304b3c86293476fd9e5: Status 404 returned error can't find the container with id bcbc0cea3694fd2d94018bf74092ba189546a9f8d7677304b3c86293476fd9e5 Oct 07 12:38:57 crc kubenswrapper[4854]: E1007 12:38:57.806394 4854 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:063aae1458289d1090a77c74c2b978b9eb978b0e4062c399f0cb5434a8dd2757,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kfvp5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-65d89cfd9f-ct8rs_openstack-operators(c7263a64-2cf7-404c-b690-d2042b34d0cd): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 07 12:38:57 crc kubenswrapper[4854]: I1007 12:38:57.832204 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6559cd6d74-82w2g"] Oct 07 12:38:57 crc kubenswrapper[4854]: I1007 12:38:57.841990 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-7nqjj"] Oct 07 12:38:57 crc kubenswrapper[4854]: W1007 12:38:57.867506 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b140fde_b7d7_4a32_b80e_0cfa788c09b5.slice/crio-fec7483d56092852532c629fd64d62f7134edaa39b6ef857c5ada7af6a312c43 WatchSource:0}: Error finding container fec7483d56092852532c629fd64d62f7134edaa39b6ef857c5ada7af6a312c43: Status 404 returned error can't find the container with id fec7483d56092852532c629fd64d62f7134edaa39b6ef857c5ada7af6a312c43 Oct 07 12:38:57 crc kubenswrapper[4854]: W1007 12:38:57.867946 4854 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7938e36_fecf_4df1_9e1d_886f84e4c597.slice/crio-a11da88d5c98c1648f25427ba0fa4aab21c11e05d9e17ef21c245aed1e7a2f41 WatchSource:0}: Error finding container a11da88d5c98c1648f25427ba0fa4aab21c11e05d9e17ef21c245aed1e7a2f41: Status 404 returned error can't find the container with id a11da88d5c98c1648f25427ba0fa4aab21c11e05d9e17ef21c245aed1e7a2f41 Oct 07 12:38:57 crc kubenswrapper[4854]: E1007 12:38:57.880697 4854 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-94znf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-5f97d8c699-7nqjj_openstack-operators(e7938e36-fecf-4df1-9e1d-886f84e4c597): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 07 12:38:57 crc kubenswrapper[4854]: E1007 12:38:57.882414 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-7nqjj" podUID="e7938e36-fecf-4df1-9e1d-886f84e4c597" Oct 07 12:38:57 crc kubenswrapper[4854]: I1007 12:38:57.993583 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cf9n6c" Oct 07 12:38:58 crc kubenswrapper[4854]: E1007 12:38:58.075027 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-vvcpc" podUID="3cf5fba8-8e19-4c34-bea9-5b910302d68f" Oct 07 12:38:58 crc kubenswrapper[4854]: E1007 12:38:58.105200 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-4774x" podUID="1f529c42-e29a-4dfb-972b-5e143c2589b7" Oct 07 12:38:58 crc kubenswrapper[4854]: E1007 12:38:58.131193 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-dkw22" podUID="68f7e193-d05e-4ae9-a3b8-c075b608dfa9" Oct 07 12:38:58 crc kubenswrapper[4854]: E1007 12:38:58.139212 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-ct8rs" podUID="c7263a64-2cf7-404c-b690-d2042b34d0cd" Oct 07 12:38:58 crc kubenswrapper[4854]: I1007 12:38:58.504602 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cf9n6c"] Oct 07 12:38:58 crc kubenswrapper[4854]: I1007 12:38:58.772904 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-d4s4h" event={"ID":"bae9e78e-d7e2-4b0b-851b-0705610a640b","Type":"ContainerStarted","Data":"2ee9db57b2508b7d978110db93ae27e24b5d62bfa62d38b398cf4b1e5792c7a0"} Oct 07 12:38:58 crc kubenswrapper[4854]: I1007 12:38:58.775030 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-sc7j5" event={"ID":"35f2082e-767b-4de0-9c71-76d1d1cb020a","Type":"ContainerStarted","Data":"10a53ab01a19fe1b3d69215f97d5976f7a3373798bde62f44a5925c5578f47d0"} Oct 07 12:38:58 crc kubenswrapper[4854]: I1007 12:38:58.776078 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-5n2wq" event={"ID":"9f33763b-0578-4fa9-8d46-51202dfb0b12","Type":"ContainerStarted","Data":"3885ec8a66fc6a59da67bce99233f5e761d6c9da6a0e89d77f46d921e48b246d"} Oct 07 12:38:58 crc kubenswrapper[4854]: I1007 12:38:58.778102 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-dkw22" event={"ID":"68f7e193-d05e-4ae9-a3b8-c075b608dfa9","Type":"ContainerStarted","Data":"b17e8fe0a4e5c08a58e7ce407126690a600ec1cb38ae565f72c6c9f583f4a001"} Oct 07 12:38:58 crc kubenswrapper[4854]: I1007 12:38:58.778122 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-dkw22" event={"ID":"68f7e193-d05e-4ae9-a3b8-c075b608dfa9","Type":"ContainerStarted","Data":"0adb4a45df09584f2f1a984f8b6e914f61bf3cd6d4b41d1184b201bd63b53ffd"} Oct 07 12:38:58 crc kubenswrapper[4854]: E1007 12:38:58.782959 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:64f57b2b59dea2bd9fae91490c5bec2687131884a049e6579819d9f951b877c6\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-dkw22" podUID="68f7e193-d05e-4ae9-a3b8-c075b608dfa9" Oct 07 12:38:58 crc kubenswrapper[4854]: I1007 12:38:58.785331 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-ct8rs" event={"ID":"c7263a64-2cf7-404c-b690-d2042b34d0cd","Type":"ContainerStarted","Data":"e32b1612cf4920cceb0c64c59ccef78b62d8a1920c2acb7e1710fe462490832a"} Oct 07 12:38:58 crc kubenswrapper[4854]: I1007 12:38:58.785449 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-ct8rs" event={"ID":"c7263a64-2cf7-404c-b690-d2042b34d0cd","Type":"ContainerStarted","Data":"77c2dc377fea35534e60ae738cb4cf9b709b1f8003014c6f738a67b7fbc40a4d"} Oct 07 12:38:58 crc kubenswrapper[4854]: E1007 12:38:58.792598 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:063aae1458289d1090a77c74c2b978b9eb978b0e4062c399f0cb5434a8dd2757\\\"\"" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-ct8rs" podUID="c7263a64-2cf7-404c-b690-d2042b34d0cd" Oct 07 12:38:58 crc kubenswrapper[4854]: I1007 12:38:58.802378 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cf9n6c" event={"ID":"f66ed5a7-8fda-4f43-bf10-c1709f30a858","Type":"ContainerStarted","Data":"359f016c6dc2c40bf046e5bb45a648b677fd1fa2b4ed2428a4f5aac4525ca67b"} Oct 07 12:38:58 crc kubenswrapper[4854]: I1007 12:38:58.816409 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-hqdgf" event={"ID":"3ddf497a-0424-4168-ae55-47a94d4d5124","Type":"ContainerStarted","Data":"82212b0311556aa2e4ffbcec0aec7206d1b4ee8d2716ec274857a4094fa8ace8"} Oct 07 12:38:58 crc kubenswrapper[4854]: I1007 12:38:58.846501 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-7nqjj" event={"ID":"e7938e36-fecf-4df1-9e1d-886f84e4c597","Type":"ContainerStarted","Data":"a11da88d5c98c1648f25427ba0fa4aab21c11e05d9e17ef21c245aed1e7a2f41"} Oct 07 12:38:58 crc kubenswrapper[4854]: I1007 12:38:58.859122 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-kcmxb" event={"ID":"101bbc87-6377-436d-b179-e0368ce39e68","Type":"ContainerStarted","Data":"f5f474cbd163f5df4eebe6227fe6f19a9c1a2bb39921c1b73eba64bf87cb3103"} Oct 07 12:38:58 crc kubenswrapper[4854]: E1007 12:38:58.861713 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-7nqjj" podUID="e7938e36-fecf-4df1-9e1d-886f84e4c597" Oct 07 12:38:58 crc kubenswrapper[4854]: I1007 12:38:58.868534 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-4774x" event={"ID":"1f529c42-e29a-4dfb-972b-5e143c2589b7","Type":"ContainerStarted","Data":"368ce4d11711cd859fdae3b54864a0ee7be83bc38de6c652c22b255661c52324"} Oct 07 12:38:58 crc kubenswrapper[4854]: I1007 12:38:58.868650 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-4774x" event={"ID":"1f529c42-e29a-4dfb-972b-5e143c2589b7","Type":"ContainerStarted","Data":"53301c3b1ebed4abe39c6d0fc9e708c41ab56708ae5aa6425f3012e6569d2f5f"} Oct 07 12:38:58 crc kubenswrapper[4854]: E1007 12:38:58.874877 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:dfd044635f9df9ed1d249387fa622177db35cdc72475e1c570617b8d17c64862\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-4774x" podUID="1f529c42-e29a-4dfb-972b-5e143c2589b7" Oct 07 12:38:58 crc kubenswrapper[4854]: I1007 12:38:58.903507 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-vvcpc" event={"ID":"3cf5fba8-8e19-4c34-bea9-5b910302d68f","Type":"ContainerStarted","Data":"9b91d5ce19a968941f6511d0034ee34c76a55770f6beb029dacde77237b184df"} Oct 07 12:38:58 crc kubenswrapper[4854]: I1007 12:38:58.903584 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-vvcpc" event={"ID":"3cf5fba8-8e19-4c34-bea9-5b910302d68f","Type":"ContainerStarted","Data":"00f60d98c1d53d7b8b911c90b13cbdafe99ce278625a6ffd73e30959f657af02"} Oct 07 12:38:58 crc kubenswrapper[4854]: E1007 12:38:58.905713 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:0daf76cc40ab619ae266b11defcc1b65beb22d859369e7b1b04de9169089a4cb\\\"\"" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-vvcpc" podUID="3cf5fba8-8e19-4c34-bea9-5b910302d68f" Oct 07 12:38:58 crc kubenswrapper[4854]: I1007 12:38:58.908828 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6559cd6d74-82w2g" event={"ID":"3b140fde-b7d7-4a32-b80e-0cfa788c09b5","Type":"ContainerStarted","Data":"87a5b73beeaf9ccd55ad035b573ee12e96bb4af25db4b7b2d26c887abd23a671"} Oct 07 12:38:58 crc kubenswrapper[4854]: I1007 12:38:58.908931 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6559cd6d74-82w2g" event={"ID":"3b140fde-b7d7-4a32-b80e-0cfa788c09b5","Type":"ContainerStarted","Data":"5fa6461397eaa68d59df1a360f9ab4c956ec0079643556f6096f4afb1f1b4720"} Oct 07 12:38:58 crc kubenswrapper[4854]: I1007 12:38:58.908953 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6559cd6d74-82w2g" event={"ID":"3b140fde-b7d7-4a32-b80e-0cfa788c09b5","Type":"ContainerStarted","Data":"fec7483d56092852532c629fd64d62f7134edaa39b6ef857c5ada7af6a312c43"} Oct 07 12:38:58 crc kubenswrapper[4854]: I1007 12:38:58.909020 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-6559cd6d74-82w2g" Oct 07 12:38:58 crc kubenswrapper[4854]: I1007 12:38:58.912492 4854 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-xdhf9" event={"ID":"dee05e6c-eda8-4059-a9b0-88b3ab2eb219","Type":"ContainerStarted","Data":"bcbc0cea3694fd2d94018bf74092ba189546a9f8d7677304b3c86293476fd9e5"} Oct 07 12:38:58 crc kubenswrapper[4854]: I1007 12:38:58.984843 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-6559cd6d74-82w2g" podStartSLOduration=2.9848131 podStartE2EDuration="2.9848131s" podCreationTimestamp="2025-10-07 12:38:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:38:58.972674191 +0000 UTC m=+854.960506466" watchObservedRunningTime="2025-10-07 12:38:58.9848131 +0000 UTC m=+854.972645365" Oct 07 12:38:59 crc kubenswrapper[4854]: E1007 12:38:59.924281 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:64f57b2b59dea2bd9fae91490c5bec2687131884a049e6579819d9f951b877c6\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-dkw22" podUID="68f7e193-d05e-4ae9-a3b8-c075b608dfa9" Oct 07 12:38:59 crc kubenswrapper[4854]: E1007 12:38:59.925478 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:dfd044635f9df9ed1d249387fa622177db35cdc72475e1c570617b8d17c64862\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-4774x" podUID="1f529c42-e29a-4dfb-972b-5e143c2589b7" Oct 07 12:38:59 crc kubenswrapper[4854]: E1007 12:38:59.925648 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-7nqjj" podUID="e7938e36-fecf-4df1-9e1d-886f84e4c597" Oct 07 12:38:59 crc kubenswrapper[4854]: E1007 12:38:59.925748 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:0daf76cc40ab619ae266b11defcc1b65beb22d859369e7b1b04de9169089a4cb\\\"\"" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-vvcpc" podUID="3cf5fba8-8e19-4c34-bea9-5b910302d68f" Oct 07 12:38:59 crc kubenswrapper[4854]: E1007 12:38:59.929287 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:063aae1458289d1090a77c74c2b978b9eb978b0e4062c399f0cb5434a8dd2757\\\"\"" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-ct8rs" podUID="c7263a64-2cf7-404c-b690-d2042b34d0cd" Oct 07 12:39:07 crc kubenswrapper[4854]: I1007 12:39:07.122795 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-6559cd6d74-82w2g" Oct 07 12:39:11 crc kubenswrapper[4854]: I1007 12:39:11.026131 4854 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-x5n7x" event={"ID":"602ad7aa-3b22-476b-849c-c1138b5e6223","Type":"ContainerStarted","Data":"a15e6496a02b214549518d84a9520cabe7a83540162059967731b388544649ab"} Oct 07 12:39:11 crc kubenswrapper[4854]: I1007 12:39:11.035133 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-nvh92" event={"ID":"47a57505-60e3-4139-95d9-426eb48e4e56","Type":"ContainerStarted","Data":"285046c822a8201dc8c6620f4df37a454220c87b949f8ffa418081301bb8b720"} Oct 07 12:39:11 crc kubenswrapper[4854]: I1007 12:39:11.088271 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-hqdgf" event={"ID":"3ddf497a-0424-4168-ae55-47a94d4d5124","Type":"ContainerStarted","Data":"8a88cc66923ae766e5b8f75252b050ca9f89bd27774e62e74e3b6b9ce5b500d1"} Oct 07 12:39:11 crc kubenswrapper[4854]: I1007 12:39:11.091728 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-kcmxb" event={"ID":"101bbc87-6377-436d-b179-e0368ce39e68","Type":"ContainerStarted","Data":"541d5938abd8b0e81462a47db3bcfe6c41dc8a914b4cc3e610327ebbcfee1f51"} Oct 07 12:39:11 crc kubenswrapper[4854]: I1007 12:39:11.101208 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-m6qn8" event={"ID":"d3b199b3-50a7-4e60-b377-101e8c0d1882","Type":"ContainerStarted","Data":"2cde7d4d459cc8c75a18ae4d1db1e96386b48e7829bdd2bdfd9ca3d520aa504f"} Oct 07 12:39:11 crc kubenswrapper[4854]: I1007 12:39:11.105096 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-6tg4x" event={"ID":"b81a8a7a-ce97-4020-9cad-038a81ea3f79","Type":"ContainerStarted","Data":"fa762fdece0fe628e6f6a7532178361cd00f9f8177d220d9414b27b0d1df3628"} Oct 07 12:39:11 crc kubenswrapper[4854]: I1007 12:39:11.113203 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-d4s4h" event={"ID":"bae9e78e-d7e2-4b0b-851b-0705610a640b","Type":"ContainerStarted","Data":"073bf4c79e0b3b7693af583ea0277d88ea1c407ad616b0c461e3a2a4794daf97"} Oct 07 12:39:11 crc kubenswrapper[4854]: I1007 12:39:11.115065 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-bstfk" event={"ID":"c345db1e-94bb-4650-80af-e0c3dac97dbe","Type":"ContainerStarted","Data":"d5091e65165b596a41a34ed4b43cbbe4b5e9f6bf508ab7196bde1a8f14becf4e"} Oct 07 12:39:11 crc kubenswrapper[4854]: I1007 12:39:11.116325 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-lknk5" event={"ID":"d09c6570-0b86-42a3-aa24-cd139b85c0fb","Type":"ContainerStarted","Data":"83474910defe275589abacf9ce822de111bd718204dbfa4d9157bbf372d33167"} Oct 07 12:39:11 crc kubenswrapper[4854]: I1007 12:39:11.119256 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-649675d675-h4lkl" event={"ID":"80562dd3-4b06-4e44-88e5-50febc56fa3d","Type":"ContainerStarted","Data":"8e6d5f97a7a66fdfdedb58226a8e05b45b610e90c5cdc9c799520c2185a7fb80"} Oct 07 12:39:11 crc kubenswrapper[4854]: I1007 12:39:11.126705 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-xdhf9" event={"ID":"dee05e6c-eda8-4059-a9b0-88b3ab2eb219","Type":"ContainerStarted","Data":"aad8d78e102c43b04e453f468c7562ccb3ac818267e9e76fda85d13f39f2d81d"} Oct 07 12:39:11 crc kubenswrapper[4854]: I1007 12:39:11.130327 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-jw8gn" event={"ID":"52f4d849-2233-4984-8afb-4ccf15d94914","Type":"ContainerStarted","Data":"cd473ea021ecbb6569b6041eb3ab49c6890467bebfa16f156dbeeec643a2a394"} Oct 07 12:39:11 crc kubenswrapper[4854]: I1007 12:39:11.132993 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-ml6xv" event={"ID":"6e979fa2-9b10-4bd5-9adc-8d9e116da401","Type":"ContainerStarted","Data":"4ab716476a51efdc1680aa46e4424e6a01d722d732c010605844e5a7e69ba606"} Oct 07 12:39:11 crc kubenswrapper[4854]: I1007 12:39:11.134090 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-ml6xv" Oct 07 12:39:11 crc kubenswrapper[4854]: I1007 12:39:11.137225 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-sc7j5" event={"ID":"35f2082e-767b-4de0-9c71-76d1d1cb020a","Type":"ContainerStarted","Data":"4e5898f4fe6c3f2f18cdad3ea6374dcdd5d8dab6d5d146b58fcb7018778185a5"} Oct 07 12:39:11 crc kubenswrapper[4854]: I1007 12:39:11.141180 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-5n2wq" event={"ID":"9f33763b-0578-4fa9-8d46-51202dfb0b12","Type":"ContainerStarted","Data":"068347194df14732407d64c40e99366401e3de8cda3cccbdc15b45d057927786"} Oct 07 12:39:11 crc kubenswrapper[4854]: I1007 12:39:11.146413 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-pnm28" event={"ID":"b4d43b9e-cdd5-4513-aa57-002a29687247","Type":"ContainerStarted","Data":"85372f5cdf1c8ba89ee771c9bdbade934bfa38aec90dfb6856576c2df26c92a8"} Oct 07 12:39:11 crc kubenswrapper[4854]: I1007 12:39:11.168940 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-ml6xv" podStartSLOduration=3.133838725 podStartE2EDuration="16.168914365s" podCreationTimestamp="2025-10-07 12:38:55 +0000 UTC" firstStartedPulling="2025-10-07 12:38:56.736201114 +0000 UTC m=+852.724033369" lastFinishedPulling="2025-10-07 12:39:09.771276744 +0000 UTC m=+865.759109009" observedRunningTime="2025-10-07 12:39:11.165108844 +0000 UTC m=+867.152941099" watchObservedRunningTime="2025-10-07 12:39:11.168914365 +0000 UTC m=+867.156746620" Oct 07 12:39:12 crc kubenswrapper[4854]: I1007 12:39:12.161552 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-ml6xv" event={"ID":"6e979fa2-9b10-4bd5-9adc-8d9e116da401","Type":"ContainerStarted","Data":"34bb24f405d7cbfe100907fd1f057f9f6266344aec12420b2ee1aaeba4b32ed3"} Oct 07 12:39:12 crc kubenswrapper[4854]: I1007 12:39:12.163201 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cf9n6c" event={"ID":"f66ed5a7-8fda-4f43-bf10-c1709f30a858","Type":"ContainerStarted","Data":"1385619ce0842714a7481349edba1e1b2682218e87bef44c429d887fbdf60e1d"} Oct 07 
12:39:12 crc kubenswrapper[4854]: I1007 12:39:12.166100 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-sc7j5" event={"ID":"35f2082e-767b-4de0-9c71-76d1d1cb020a","Type":"ContainerStarted","Data":"59e48d36f799f897e0a82798da8642448223b28d21982434ea7af2d73beb6bc1"} Oct 07 12:39:13 crc kubenswrapper[4854]: I1007 12:39:13.174480 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-sc7j5" Oct 07 12:39:13 crc kubenswrapper[4854]: I1007 12:39:13.197276 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-sc7j5" podStartSLOduration=6.109253728 podStartE2EDuration="18.197253114s" podCreationTimestamp="2025-10-07 12:38:55 +0000 UTC" firstStartedPulling="2025-10-07 12:38:57.700751708 +0000 UTC m=+853.688583963" lastFinishedPulling="2025-10-07 12:39:09.788751074 +0000 UTC m=+865.776583349" observedRunningTime="2025-10-07 12:39:13.193198858 +0000 UTC m=+869.181031123" watchObservedRunningTime="2025-10-07 12:39:13.197253114 +0000 UTC m=+869.185085369" Oct 07 12:39:15 crc kubenswrapper[4854]: I1007 12:39:15.825928 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-ml6xv" Oct 07 12:39:16 crc kubenswrapper[4854]: I1007 12:39:16.300804 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-sc7j5" Oct 07 12:39:17 crc kubenswrapper[4854]: I1007 12:39:17.205309 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-nvh92" event={"ID":"47a57505-60e3-4139-95d9-426eb48e4e56","Type":"ContainerStarted","Data":"2ceecf85228713cf834ea6fa424f5275117c2529304bd93442d1d96266d21564"} Oct 07 12:39:17 crc kubenswrapper[4854]: I1007 12:39:17.207592 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-kcmxb" event={"ID":"101bbc87-6377-436d-b179-e0368ce39e68","Type":"ContainerStarted","Data":"7188ff8c5bf28dd22e88eb5610e9226db0c649a1d0390a2dedc4f3adc93448e1"} Oct 07 12:39:18 crc kubenswrapper[4854]: I1007 12:39:18.216177 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-xdhf9" event={"ID":"dee05e6c-eda8-4059-a9b0-88b3ab2eb219","Type":"ContainerStarted","Data":"9919da0a781ea140d2338dc74c3270ec4f4828d4d43e67abb855e07d35e792c5"} Oct 07 12:39:18 crc kubenswrapper[4854]: I1007 12:39:18.216627 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-xdhf9" Oct 07 12:39:18 crc kubenswrapper[4854]: I1007 12:39:18.219115 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-pnm28" event={"ID":"b4d43b9e-cdd5-4513-aa57-002a29687247","Type":"ContainerStarted","Data":"c44b1aafb991fb372478b785a43669c914ccde7b1882fbd6880517876be0a00b"} Oct 07 12:39:18 crc kubenswrapper[4854]: I1007 12:39:18.219527 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-pnm28" Oct 07 12:39:18 crc kubenswrapper[4854]: I1007 12:39:18.219854 4854 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-xdhf9" Oct 07 12:39:18 crc kubenswrapper[4854]: I1007 12:39:18.221298 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-bstfk" event={"ID":"c345db1e-94bb-4650-80af-e0c3dac97dbe","Type":"ContainerStarted","Data":"30a6bc1ec78195d56286632adf9135792621c893f99628a7660fbe73a9276a60"} Oct 07 12:39:18 crc kubenswrapper[4854]: I1007 12:39:18.221716 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-bstfk" Oct 07 12:39:18 crc kubenswrapper[4854]: I1007 12:39:18.222598 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-pnm28" Oct 07 12:39:18 crc kubenswrapper[4854]: I1007 12:39:18.223244 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-bstfk" Oct 07 12:39:18 crc kubenswrapper[4854]: I1007 12:39:18.223714 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-x5n7x" event={"ID":"602ad7aa-3b22-476b-849c-c1138b5e6223","Type":"ContainerStarted","Data":"dfcfeec232c0cae7fd5574ac5a0eda8ef1f5366d1b2162dfed33b21751e6944f"} Oct 07 12:39:18 crc kubenswrapper[4854]: I1007 12:39:18.224132 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-x5n7x" Oct 07 12:39:18 crc kubenswrapper[4854]: I1007 12:39:18.225429 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-x5n7x" Oct 07 12:39:18 crc kubenswrapper[4854]: I1007 12:39:18.226041 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-649675d675-h4lkl" event={"ID":"80562dd3-4b06-4e44-88e5-50febc56fa3d","Type":"ContainerStarted","Data":"f66b72e0f99e4de2fb9b95e1e772012f61f8fe4a5a1b10d1a6aafb356f42f520"} Oct 07 12:39:18 crc kubenswrapper[4854]: I1007 12:39:18.226481 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-649675d675-h4lkl" Oct 07 12:39:18 crc kubenswrapper[4854]: I1007 12:39:18.227606 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-6tg4x" event={"ID":"b81a8a7a-ce97-4020-9cad-038a81ea3f79","Type":"ContainerStarted","Data":"d6c980933495f8e464d13ac9a47c2e729fb600f8481ac840d66b19737d3a177b"} Oct 07 12:39:18 crc kubenswrapper[4854]: I1007 12:39:18.228120 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-6tg4x" Oct 07 12:39:18 crc kubenswrapper[4854]: I1007 12:39:18.229012 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-649675d675-h4lkl" Oct 07 12:39:18 crc kubenswrapper[4854]: I1007 12:39:18.229729 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-6tg4x" Oct 07 12:39:18 crc kubenswrapper[4854]: I1007 12:39:18.230051 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cf9n6c" event={"ID":"f66ed5a7-8fda-4f43-bf10-c1709f30a858","Type":"ContainerStarted","Data":"f588f8ee3d3551f60a79a491d0ed40c3d4aa35e296388fec3acbb04d08ed4b87"} Oct 07 12:39:18 crc kubenswrapper[4854]: I1007 12:39:18.230181 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cf9n6c" Oct 07 12:39:18 crc kubenswrapper[4854]: I1007 12:39:18.231738 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-hqdgf" event={"ID":"3ddf497a-0424-4168-ae55-47a94d4d5124","Type":"ContainerStarted","Data":"09eff746eec5e0e49a0a0d34e2fee0d894a6c52faf632da14c1482cdc0087549"} Oct 07 12:39:18 crc kubenswrapper[4854]: I1007 12:39:18.232200 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-hqdgf" Oct 07 12:39:18 crc kubenswrapper[4854]: I1007 12:39:18.233760 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-hqdgf" Oct 07 12:39:18 crc kubenswrapper[4854]: I1007 12:39:18.234581 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cf9n6c" Oct 07 12:39:18 crc kubenswrapper[4854]: I1007 12:39:18.239838 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-lknk5" event={"ID":"d09c6570-0b86-42a3-aa24-cd139b85c0fb","Type":"ContainerStarted","Data":"25dd999ac9a05f5421330b4c2a8de5091b3e57b1a1c5806714bd5bc3be93def4"} Oct 07 12:39:18 crc kubenswrapper[4854]: I1007 12:39:18.241158 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-lknk5" Oct 07 12:39:18 crc kubenswrapper[4854]: I1007 12:39:18.244418 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-xdhf9" podStartSLOduration=11.256772133 podStartE2EDuration="23.244391817s" podCreationTimestamp="2025-10-07 12:38:55 +0000 UTC" firstStartedPulling="2025-10-07 12:38:57.774308954 +0000 UTC m=+853.762141209" lastFinishedPulling="2025-10-07 12:39:09.761928638 +0000 UTC m=+865.749760893" observedRunningTime="2025-10-07 12:39:18.242398635 +0000 UTC m=+874.230230910" watchObservedRunningTime="2025-10-07 12:39:18.244391817 +0000 UTC m=+874.232224092" Oct 07 12:39:18 crc kubenswrapper[4854]: I1007 12:39:18.247359 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-lknk5" Oct 07 12:39:18 crc kubenswrapper[4854]: I1007 12:39:18.248179 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-5n2wq" event={"ID":"9f33763b-0578-4fa9-8d46-51202dfb0b12","Type":"ContainerStarted","Data":"049302e0b3f863cb27699dac0a3e2cea449f957c1f932fe5e928e3f335e30052"} Oct 07 12:39:18 crc kubenswrapper[4854]: I1007 12:39:18.248258 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-5n2wq" Oct 07 12:39:18 crc kubenswrapper[4854]: I1007 12:39:18.250624 4854 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-d4s4h" event={"ID":"bae9e78e-d7e2-4b0b-851b-0705610a640b","Type":"ContainerStarted","Data":"58118b76831444490f62bbd3f7236fa852ff122fbb01d877e44a396fe25a647d"} Oct 07 12:39:18 crc kubenswrapper[4854]: I1007 12:39:18.250945 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-d4s4h" Oct 07 12:39:18 crc kubenswrapper[4854]: I1007 12:39:18.253482 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-d4s4h" Oct 07 12:39:18 crc kubenswrapper[4854]: I1007 12:39:18.253515 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-m6qn8" event={"ID":"d3b199b3-50a7-4e60-b377-101e8c0d1882","Type":"ContainerStarted","Data":"fe686d75e9354784e0dce77d05c76aa0dfc1a497adf06d0481844b5011cd6412"} Oct 07 12:39:18 crc kubenswrapper[4854]: I1007 12:39:18.254640 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-m6qn8" Oct 07 12:39:18 crc kubenswrapper[4854]: I1007 12:39:18.255515 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-5n2wq" Oct 07 12:39:18 crc kubenswrapper[4854]: I1007 12:39:18.259877 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-m6qn8" Oct 07 12:39:18 crc kubenswrapper[4854]: I1007 12:39:18.261224 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-jw8gn" event={"ID":"52f4d849-2233-4984-8afb-4ccf15d94914","Type":"ContainerStarted","Data":"72611e0bc5616924b6d85511f1dddc229b7836351527400dababf38a7f1f7d87"} Oct 07 12:39:18 crc kubenswrapper[4854]: I1007 12:39:18.261301 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-nvh92" Oct 07 12:39:18 crc kubenswrapper[4854]: I1007 12:39:18.261514 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-jw8gn" Oct 07 12:39:18 crc kubenswrapper[4854]: I1007 12:39:18.261953 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-kcmxb" Oct 07 12:39:18 crc kubenswrapper[4854]: I1007 12:39:18.263454 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-jw8gn" Oct 07 12:39:18 crc kubenswrapper[4854]: I1007 12:39:18.263585 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-kcmxb" Oct 07 12:39:18 crc kubenswrapper[4854]: I1007 12:39:18.265277 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-nvh92" Oct 07 12:39:18 crc kubenswrapper[4854]: I1007 12:39:18.274241 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-bstfk" podStartSLOduration=10.604906818 
podStartE2EDuration="23.274212552s" podCreationTimestamp="2025-10-07 12:38:55 +0000 UTC" firstStartedPulling="2025-10-07 12:38:57.060780836 +0000 UTC m=+853.048613091" lastFinishedPulling="2025-10-07 12:39:09.73008653 +0000 UTC m=+865.717918825" observedRunningTime="2025-10-07 12:39:18.268465801 +0000 UTC m=+874.256298076" watchObservedRunningTime="2025-10-07 12:39:18.274212552 +0000 UTC m=+874.262044807" Oct 07 12:39:18 crc kubenswrapper[4854]: I1007 12:39:18.359761 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-hqdgf" podStartSLOduration=11.224692269 podStartE2EDuration="23.359735283s" podCreationTimestamp="2025-10-07 12:38:55 +0000 UTC" firstStartedPulling="2025-10-07 12:38:57.704413904 +0000 UTC m=+853.692246169" lastFinishedPulling="2025-10-07 12:39:09.839456928 +0000 UTC m=+865.827289183" observedRunningTime="2025-10-07 12:39:18.357612577 +0000 UTC m=+874.345444832" watchObservedRunningTime="2025-10-07 12:39:18.359735283 +0000 UTC m=+874.347567538" Oct 07 12:39:18 crc kubenswrapper[4854]: I1007 12:39:18.360066 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-pnm28" podStartSLOduration=10.727845053 podStartE2EDuration="23.360058401s" podCreationTimestamp="2025-10-07 12:38:55 +0000 UTC" firstStartedPulling="2025-10-07 12:38:57.207255381 +0000 UTC m=+853.195087636" lastFinishedPulling="2025-10-07 12:39:09.839468729 +0000 UTC m=+865.827300984" observedRunningTime="2025-10-07 12:39:18.293679254 +0000 UTC m=+874.281511509" watchObservedRunningTime="2025-10-07 12:39:18.360058401 +0000 UTC m=+874.347890656" Oct 07 12:39:18 crc kubenswrapper[4854]: I1007 12:39:18.377995 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-649675d675-h4lkl" podStartSLOduration=10.621383622 podStartE2EDuration="23.377971423s" podCreationTimestamp="2025-10-07 12:38:55 +0000 UTC" firstStartedPulling="2025-10-07 12:38:57.005271505 +0000 UTC m=+852.993103760" lastFinishedPulling="2025-10-07 12:39:09.761859276 +0000 UTC m=+865.749691561" observedRunningTime="2025-10-07 12:39:18.377668985 +0000 UTC m=+874.365501240" watchObservedRunningTime="2025-10-07 12:39:18.377971423 +0000 UTC m=+874.365803678" Oct 07 12:39:18 crc kubenswrapper[4854]: I1007 12:39:18.405674 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-lknk5" podStartSLOduration=10.572039603 podStartE2EDuration="23.405642571s" podCreationTimestamp="2025-10-07 12:38:55 +0000 UTC" firstStartedPulling="2025-10-07 12:38:57.008314975 +0000 UTC m=+852.996147230" lastFinishedPulling="2025-10-07 12:39:09.841917933 +0000 UTC m=+865.829750198" observedRunningTime="2025-10-07 12:39:18.401100291 +0000 UTC m=+874.388932566" watchObservedRunningTime="2025-10-07 12:39:18.405642571 +0000 UTC m=+874.393474826" Oct 07 12:39:18 crc kubenswrapper[4854]: I1007 12:39:18.426301 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-6tg4x" podStartSLOduration=9.998714215 podStartE2EDuration="23.426273734s" podCreationTimestamp="2025-10-07 12:38:55 +0000 UTC" firstStartedPulling="2025-10-07 12:38:56.410714138 +0000 UTC m=+852.398546393" lastFinishedPulling="2025-10-07 12:39:09.838273657 +0000 UTC m=+865.826105912" 
observedRunningTime="2025-10-07 12:39:18.422689819 +0000 UTC m=+874.410522074" watchObservedRunningTime="2025-10-07 12:39:18.426273734 +0000 UTC m=+874.414105989" Oct 07 12:39:18 crc kubenswrapper[4854]: I1007 12:39:18.473784 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-x5n7x" podStartSLOduration=11.192638835 podStartE2EDuration="23.473128197s" podCreationTimestamp="2025-10-07 12:38:55 +0000 UTC" firstStartedPulling="2025-10-07 12:38:57.462928799 +0000 UTC m=+853.450761054" lastFinishedPulling="2025-10-07 12:39:09.743418121 +0000 UTC m=+865.731250416" observedRunningTime="2025-10-07 12:39:18.468109705 +0000 UTC m=+874.455941960" watchObservedRunningTime="2025-10-07 12:39:18.473128197 +0000 UTC m=+874.460960452" Oct 07 12:39:18 crc kubenswrapper[4854]: I1007 12:39:18.509541 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cf9n6c" podStartSLOduration=12.110579603 podStartE2EDuration="23.509518925s" podCreationTimestamp="2025-10-07 12:38:55 +0000 UTC" firstStartedPulling="2025-10-07 12:38:58.51742455 +0000 UTC m=+854.505256805" lastFinishedPulling="2025-10-07 12:39:09.916363872 +0000 UTC m=+865.904196127" observedRunningTime="2025-10-07 12:39:18.502221962 +0000 UTC m=+874.490054217" watchObservedRunningTime="2025-10-07 12:39:18.509518925 +0000 UTC m=+874.497351180" Oct 07 12:39:18 crc kubenswrapper[4854]: I1007 12:39:18.550859 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-kcmxb" podStartSLOduration=11.486879609 podStartE2EDuration="23.550831662s" podCreationTimestamp="2025-10-07 12:38:55 +0000 UTC" firstStartedPulling="2025-10-07 12:38:57.697545374 +0000 UTC m=+853.685377629" lastFinishedPulling="2025-10-07 12:39:09.761497427 +0000 UTC m=+865.749329682" observedRunningTime="2025-10-07 12:39:18.522032014 +0000 UTC m=+874.509864269" watchObservedRunningTime="2025-10-07 12:39:18.550831662 +0000 UTC m=+874.538663917" Oct 07 12:39:18 crc kubenswrapper[4854]: I1007 12:39:18.587214 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-nvh92" podStartSLOduration=11.512086083 podStartE2EDuration="23.587188289s" podCreationTimestamp="2025-10-07 12:38:55 +0000 UTC" firstStartedPulling="2025-10-07 12:38:57.697213835 +0000 UTC m=+853.685046090" lastFinishedPulling="2025-10-07 12:39:09.772316011 +0000 UTC m=+865.760148296" observedRunningTime="2025-10-07 12:39:18.579129466 +0000 UTC m=+874.566961721" watchObservedRunningTime="2025-10-07 12:39:18.587188289 +0000 UTC m=+874.575020544" Oct 07 12:39:18 crc kubenswrapper[4854]: I1007 12:39:18.599181 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-m6qn8" podStartSLOduration=11.449916987 podStartE2EDuration="23.599156714s" podCreationTimestamp="2025-10-07 12:38:55 +0000 UTC" firstStartedPulling="2025-10-07 12:38:57.691394182 +0000 UTC m=+853.679226437" lastFinishedPulling="2025-10-07 12:39:09.840633889 +0000 UTC m=+865.828466164" observedRunningTime="2025-10-07 12:39:18.598442445 +0000 UTC m=+874.586274700" watchObservedRunningTime="2025-10-07 12:39:18.599156714 +0000 UTC m=+874.586988969" Oct 07 12:39:18 crc kubenswrapper[4854]: I1007 12:39:18.626024 4854 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-5n2wq" podStartSLOduration=11.500195949 podStartE2EDuration="23.62599624s" podCreationTimestamp="2025-10-07 12:38:55 +0000 UTC" firstStartedPulling="2025-10-07 12:38:57.716231565 +0000 UTC m=+853.704063820" lastFinishedPulling="2025-10-07 12:39:09.842031856 +0000 UTC m=+865.829864111" observedRunningTime="2025-10-07 12:39:18.619252882 +0000 UTC m=+874.607085137" watchObservedRunningTime="2025-10-07 12:39:18.62599624 +0000 UTC m=+874.613828495" Oct 07 12:39:18 crc kubenswrapper[4854]: I1007 12:39:18.640434 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-d4s4h" podStartSLOduration=11.515474181 podStartE2EDuration="23.640409429s" podCreationTimestamp="2025-10-07 12:38:55 +0000 UTC" firstStartedPulling="2025-10-07 12:38:57.716282387 +0000 UTC m=+853.704114642" lastFinishedPulling="2025-10-07 12:39:09.841217635 +0000 UTC m=+865.829049890" observedRunningTime="2025-10-07 12:39:18.639760742 +0000 UTC m=+874.627593007" watchObservedRunningTime="2025-10-07 12:39:18.640409429 +0000 UTC m=+874.628241684" Oct 07 12:39:18 crc kubenswrapper[4854]: I1007 12:39:18.660597 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-jw8gn" podStartSLOduration=10.55371633 podStartE2EDuration="23.660523358s" podCreationTimestamp="2025-10-07 12:38:55 +0000 UTC" firstStartedPulling="2025-10-07 12:38:56.735520786 +0000 UTC m=+852.723353051" lastFinishedPulling="2025-10-07 12:39:09.842327804 +0000 UTC m=+865.830160079" observedRunningTime="2025-10-07 12:39:18.658859225 +0000 UTC m=+874.646691490" watchObservedRunningTime="2025-10-07 12:39:18.660523358 +0000 UTC m=+874.648355613" Oct 07 12:39:22 crc kubenswrapper[4854]: I1007 12:39:22.300956 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-dkw22" event={"ID":"68f7e193-d05e-4ae9-a3b8-c075b608dfa9","Type":"ContainerStarted","Data":"c130010fb651ee274938fa2a6f4525ba322ba3ca4d3927d96d65e1ae07275377"} Oct 07 12:39:22 crc kubenswrapper[4854]: I1007 12:39:22.301955 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-dkw22" Oct 07 12:39:22 crc kubenswrapper[4854]: I1007 12:39:22.303264 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-4774x" event={"ID":"1f529c42-e29a-4dfb-972b-5e143c2589b7","Type":"ContainerStarted","Data":"992842a1ed089226d929ccbffb5317fa3cd716dc9b6e0e7398c1dc00eb785395"} Oct 07 12:39:22 crc kubenswrapper[4854]: I1007 12:39:22.303924 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-4774x" Oct 07 12:39:22 crc kubenswrapper[4854]: I1007 12:39:22.305182 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-ct8rs" event={"ID":"c7263a64-2cf7-404c-b690-d2042b34d0cd","Type":"ContainerStarted","Data":"12160e62993f5e3df1df69cce7c19a37abdc3109ba3b63fe681048017349f6ce"} Oct 07 12:39:22 crc kubenswrapper[4854]: I1007 12:39:22.305522 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-ct8rs" Oct 07 12:39:22 crc kubenswrapper[4854]: I1007 12:39:22.307239 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-vvcpc" event={"ID":"3cf5fba8-8e19-4c34-bea9-5b910302d68f","Type":"ContainerStarted","Data":"2bf40ea5bf7ca96e0d6db3822857e03fcae519967545b04852f24131b16b7371"} Oct 07 12:39:22 crc kubenswrapper[4854]: I1007 12:39:22.307779 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-vvcpc" Oct 07 12:39:22 crc kubenswrapper[4854]: I1007 12:39:22.309963 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-7nqjj" event={"ID":"e7938e36-fecf-4df1-9e1d-886f84e4c597","Type":"ContainerStarted","Data":"2471e6cef571f7e9cf23bb333a14c36725a30077c30710f7e3250283a488747b"} Oct 07 12:39:22 crc kubenswrapper[4854]: I1007 12:39:22.326408 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-dkw22" podStartSLOduration=2.354985586 podStartE2EDuration="26.326387902s" podCreationTimestamp="2025-10-07 12:38:56 +0000 UTC" firstStartedPulling="2025-10-07 12:38:57.73464374 +0000 UTC m=+853.722475995" lastFinishedPulling="2025-10-07 12:39:21.706046056 +0000 UTC m=+877.693878311" observedRunningTime="2025-10-07 12:39:22.321289268 +0000 UTC m=+878.309121523" watchObservedRunningTime="2025-10-07 12:39:22.326387902 +0000 UTC m=+878.314220157" Oct 07 12:39:22 crc kubenswrapper[4854]: I1007 12:39:22.342130 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-7nqjj" podStartSLOduration=2.515989783 podStartE2EDuration="26.342095225s" podCreationTimestamp="2025-10-07 12:38:56 +0000 UTC" firstStartedPulling="2025-10-07 12:38:57.880508949 +0000 UTC m=+853.868341204" lastFinishedPulling="2025-10-07 12:39:21.706614391 +0000 UTC m=+877.694446646" observedRunningTime="2025-10-07 12:39:22.336778665 +0000 UTC m=+878.324610920" watchObservedRunningTime="2025-10-07 12:39:22.342095225 +0000 UTC m=+878.329927480" Oct 07 12:39:22 crc kubenswrapper[4854]: I1007 12:39:22.361440 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-ct8rs" podStartSLOduration=3.461427316 podStartE2EDuration="27.361413084s" podCreationTimestamp="2025-10-07 12:38:55 +0000 UTC" firstStartedPulling="2025-10-07 12:38:57.80609956 +0000 UTC m=+853.793931815" lastFinishedPulling="2025-10-07 12:39:21.706085318 +0000 UTC m=+877.693917583" observedRunningTime="2025-10-07 12:39:22.357216323 +0000 UTC m=+878.345048598" watchObservedRunningTime="2025-10-07 12:39:22.361413084 +0000 UTC m=+878.349245339" Oct 07 12:39:22 crc kubenswrapper[4854]: I1007 12:39:22.394630 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-4774x" podStartSLOduration=3.412187501 podStartE2EDuration="27.394601787s" podCreationTimestamp="2025-10-07 12:38:55 +0000 UTC" firstStartedPulling="2025-10-07 12:38:57.733037808 +0000 UTC m=+853.720870063" lastFinishedPulling="2025-10-07 12:39:21.715452094 +0000 UTC m=+877.703284349" observedRunningTime="2025-10-07 12:39:22.391103525 +0000 UTC m=+878.378935790" watchObservedRunningTime="2025-10-07 
12:39:22.394601787 +0000 UTC m=+878.382434052" Oct 07 12:39:22 crc kubenswrapper[4854]: I1007 12:39:22.414293 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-vvcpc" podStartSLOduration=3.42922787 podStartE2EDuration="27.414260955s" podCreationTimestamp="2025-10-07 12:38:55 +0000 UTC" firstStartedPulling="2025-10-07 12:38:57.733319655 +0000 UTC m=+853.721151910" lastFinishedPulling="2025-10-07 12:39:21.71835274 +0000 UTC m=+877.706184995" observedRunningTime="2025-10-07 12:39:22.408703938 +0000 UTC m=+878.396536213" watchObservedRunningTime="2025-10-07 12:39:22.414260955 +0000 UTC m=+878.402093220" Oct 07 12:39:26 crc kubenswrapper[4854]: I1007 12:39:26.306606 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-ct8rs" Oct 07 12:39:26 crc kubenswrapper[4854]: I1007 12:39:26.333071 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-4774x" Oct 07 12:39:26 crc kubenswrapper[4854]: I1007 12:39:26.453050 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-vvcpc" Oct 07 12:39:26 crc kubenswrapper[4854]: I1007 12:39:26.772802 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-dkw22" Oct 07 12:39:41 crc kubenswrapper[4854]: I1007 12:39:41.651108 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-9nxdf"] Oct 07 12:39:41 crc kubenswrapper[4854]: I1007 12:39:41.653470 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-9nxdf" Oct 07 12:39:41 crc kubenswrapper[4854]: I1007 12:39:41.659733 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 07 12:39:41 crc kubenswrapper[4854]: I1007 12:39:41.659777 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Oct 07 12:39:41 crc kubenswrapper[4854]: I1007 12:39:41.660800 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-s42mz" Oct 07 12:39:41 crc kubenswrapper[4854]: I1007 12:39:41.665581 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Oct 07 12:39:41 crc kubenswrapper[4854]: I1007 12:39:41.678090 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-9nxdf"] Oct 07 12:39:41 crc kubenswrapper[4854]: I1007 12:39:41.752196 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-5nwlr"] Oct 07 12:39:41 crc kubenswrapper[4854]: I1007 12:39:41.753794 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-5nwlr" Oct 07 12:39:41 crc kubenswrapper[4854]: I1007 12:39:41.757321 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 07 12:39:41 crc kubenswrapper[4854]: I1007 12:39:41.766060 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zgz4\" (UniqueName: \"kubernetes.io/projected/14e4f50b-e577-497e-9ffb-24e3fe0e93c5-kube-api-access-4zgz4\") pod \"dnsmasq-dns-675f4bcbfc-9nxdf\" (UID: \"14e4f50b-e577-497e-9ffb-24e3fe0e93c5\") " pod="openstack/dnsmasq-dns-675f4bcbfc-9nxdf" Oct 07 12:39:41 crc kubenswrapper[4854]: I1007 12:39:41.766163 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14e4f50b-e577-497e-9ffb-24e3fe0e93c5-config\") pod \"dnsmasq-dns-675f4bcbfc-9nxdf\" (UID: \"14e4f50b-e577-497e-9ffb-24e3fe0e93c5\") " pod="openstack/dnsmasq-dns-675f4bcbfc-9nxdf" Oct 07 12:39:41 crc kubenswrapper[4854]: I1007 12:39:41.782976 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-5nwlr"] Oct 07 12:39:41 crc kubenswrapper[4854]: I1007 12:39:41.868052 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14e4f50b-e577-497e-9ffb-24e3fe0e93c5-config\") pod \"dnsmasq-dns-675f4bcbfc-9nxdf\" (UID: \"14e4f50b-e577-497e-9ffb-24e3fe0e93c5\") " pod="openstack/dnsmasq-dns-675f4bcbfc-9nxdf" Oct 07 12:39:41 crc kubenswrapper[4854]: I1007 12:39:41.868235 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0c261705-13dc-4a3b-b7ad-6e31f546cb50-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-5nwlr\" (UID: \"0c261705-13dc-4a3b-b7ad-6e31f546cb50\") " pod="openstack/dnsmasq-dns-78dd6ddcc-5nwlr" Oct 07 12:39:41 crc kubenswrapper[4854]: I1007 12:39:41.868310 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zgz4\" (UniqueName: \"kubernetes.io/projected/14e4f50b-e577-497e-9ffb-24e3fe0e93c5-kube-api-access-4zgz4\") pod \"dnsmasq-dns-675f4bcbfc-9nxdf\" (UID: \"14e4f50b-e577-497e-9ffb-24e3fe0e93c5\") " pod="openstack/dnsmasq-dns-675f4bcbfc-9nxdf" Oct 07 12:39:41 crc kubenswrapper[4854]: I1007 12:39:41.868338 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9fgb\" (UniqueName: \"kubernetes.io/projected/0c261705-13dc-4a3b-b7ad-6e31f546cb50-kube-api-access-l9fgb\") pod \"dnsmasq-dns-78dd6ddcc-5nwlr\" (UID: \"0c261705-13dc-4a3b-b7ad-6e31f546cb50\") " pod="openstack/dnsmasq-dns-78dd6ddcc-5nwlr" Oct 07 12:39:41 crc kubenswrapper[4854]: I1007 12:39:41.868375 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c261705-13dc-4a3b-b7ad-6e31f546cb50-config\") pod \"dnsmasq-dns-78dd6ddcc-5nwlr\" (UID: \"0c261705-13dc-4a3b-b7ad-6e31f546cb50\") " pod="openstack/dnsmasq-dns-78dd6ddcc-5nwlr" Oct 07 12:39:41 crc kubenswrapper[4854]: I1007 12:39:41.869610 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14e4f50b-e577-497e-9ffb-24e3fe0e93c5-config\") pod \"dnsmasq-dns-675f4bcbfc-9nxdf\" (UID: \"14e4f50b-e577-497e-9ffb-24e3fe0e93c5\") " pod="openstack/dnsmasq-dns-675f4bcbfc-9nxdf" Oct 07 
12:39:41 crc kubenswrapper[4854]: I1007 12:39:41.891351 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zgz4\" (UniqueName: \"kubernetes.io/projected/14e4f50b-e577-497e-9ffb-24e3fe0e93c5-kube-api-access-4zgz4\") pod \"dnsmasq-dns-675f4bcbfc-9nxdf\" (UID: \"14e4f50b-e577-497e-9ffb-24e3fe0e93c5\") " pod="openstack/dnsmasq-dns-675f4bcbfc-9nxdf" Oct 07 12:39:41 crc kubenswrapper[4854]: I1007 12:39:41.970234 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9fgb\" (UniqueName: \"kubernetes.io/projected/0c261705-13dc-4a3b-b7ad-6e31f546cb50-kube-api-access-l9fgb\") pod \"dnsmasq-dns-78dd6ddcc-5nwlr\" (UID: \"0c261705-13dc-4a3b-b7ad-6e31f546cb50\") " pod="openstack/dnsmasq-dns-78dd6ddcc-5nwlr" Oct 07 12:39:41 crc kubenswrapper[4854]: I1007 12:39:41.970602 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c261705-13dc-4a3b-b7ad-6e31f546cb50-config\") pod \"dnsmasq-dns-78dd6ddcc-5nwlr\" (UID: \"0c261705-13dc-4a3b-b7ad-6e31f546cb50\") " pod="openstack/dnsmasq-dns-78dd6ddcc-5nwlr" Oct 07 12:39:41 crc kubenswrapper[4854]: I1007 12:39:41.970677 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0c261705-13dc-4a3b-b7ad-6e31f546cb50-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-5nwlr\" (UID: \"0c261705-13dc-4a3b-b7ad-6e31f546cb50\") " pod="openstack/dnsmasq-dns-78dd6ddcc-5nwlr" Oct 07 12:39:41 crc kubenswrapper[4854]: I1007 12:39:41.971614 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0c261705-13dc-4a3b-b7ad-6e31f546cb50-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-5nwlr\" (UID: \"0c261705-13dc-4a3b-b7ad-6e31f546cb50\") " pod="openstack/dnsmasq-dns-78dd6ddcc-5nwlr" Oct 07 12:39:41 crc kubenswrapper[4854]: I1007 12:39:41.971835 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c261705-13dc-4a3b-b7ad-6e31f546cb50-config\") pod \"dnsmasq-dns-78dd6ddcc-5nwlr\" (UID: \"0c261705-13dc-4a3b-b7ad-6e31f546cb50\") " pod="openstack/dnsmasq-dns-78dd6ddcc-5nwlr" Oct 07 12:39:41 crc kubenswrapper[4854]: I1007 12:39:41.979154 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-9nxdf" Oct 07 12:39:41 crc kubenswrapper[4854]: I1007 12:39:41.988134 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9fgb\" (UniqueName: \"kubernetes.io/projected/0c261705-13dc-4a3b-b7ad-6e31f546cb50-kube-api-access-l9fgb\") pod \"dnsmasq-dns-78dd6ddcc-5nwlr\" (UID: \"0c261705-13dc-4a3b-b7ad-6e31f546cb50\") " pod="openstack/dnsmasq-dns-78dd6ddcc-5nwlr" Oct 07 12:39:42 crc kubenswrapper[4854]: I1007 12:39:42.069984 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-5nwlr" Oct 07 12:39:42 crc kubenswrapper[4854]: I1007 12:39:42.268490 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-9nxdf"] Oct 07 12:39:42 crc kubenswrapper[4854]: W1007 12:39:42.276433 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14e4f50b_e577_497e_9ffb_24e3fe0e93c5.slice/crio-266c107e72b470a1facfd51636f3326837a55bcd855426e38e557c5f87976189 WatchSource:0}: Error finding container 266c107e72b470a1facfd51636f3326837a55bcd855426e38e557c5f87976189: Status 404 returned error can't find the container with id 266c107e72b470a1facfd51636f3326837a55bcd855426e38e557c5f87976189 Oct 07 12:39:42 crc kubenswrapper[4854]: I1007 12:39:42.279263 4854 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 12:39:42 crc kubenswrapper[4854]: I1007 12:39:42.498959 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-9nxdf" event={"ID":"14e4f50b-e577-497e-9ffb-24e3fe0e93c5","Type":"ContainerStarted","Data":"266c107e72b470a1facfd51636f3326837a55bcd855426e38e557c5f87976189"} Oct 07 12:39:42 crc kubenswrapper[4854]: I1007 12:39:42.555728 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-5nwlr"] Oct 07 12:39:42 crc kubenswrapper[4854]: W1007 12:39:42.566251 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0c261705_13dc_4a3b_b7ad_6e31f546cb50.slice/crio-9de9c1927ef5ffb63b991d2008f60e6f61d9178c40dd2ce5b980833e17202448 WatchSource:0}: Error finding container 9de9c1927ef5ffb63b991d2008f60e6f61d9178c40dd2ce5b980833e17202448: Status 404 returned error can't find the container with id 9de9c1927ef5ffb63b991d2008f60e6f61d9178c40dd2ce5b980833e17202448 Oct 07 12:39:43 crc kubenswrapper[4854]: I1007 12:39:43.519847 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-5nwlr" event={"ID":"0c261705-13dc-4a3b-b7ad-6e31f546cb50","Type":"ContainerStarted","Data":"9de9c1927ef5ffb63b991d2008f60e6f61d9178c40dd2ce5b980833e17202448"} Oct 07 12:39:44 crc kubenswrapper[4854]: I1007 12:39:44.677974 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-9nxdf"] Oct 07 12:39:44 crc kubenswrapper[4854]: I1007 12:39:44.729602 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-cs286"] Oct 07 12:39:44 crc kubenswrapper[4854]: I1007 12:39:44.731327 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-cs286" Oct 07 12:39:44 crc kubenswrapper[4854]: I1007 12:39:44.737891 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-cs286"] Oct 07 12:39:44 crc kubenswrapper[4854]: I1007 12:39:44.818037 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db03aba0-4761-426e-ac26-50f599cc0ad9-dns-svc\") pod \"dnsmasq-dns-666b6646f7-cs286\" (UID: \"db03aba0-4761-426e-ac26-50f599cc0ad9\") " pod="openstack/dnsmasq-dns-666b6646f7-cs286" Oct 07 12:39:44 crc kubenswrapper[4854]: I1007 12:39:44.818119 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db03aba0-4761-426e-ac26-50f599cc0ad9-config\") pod \"dnsmasq-dns-666b6646f7-cs286\" (UID: \"db03aba0-4761-426e-ac26-50f599cc0ad9\") " pod="openstack/dnsmasq-dns-666b6646f7-cs286" Oct 07 12:39:44 crc kubenswrapper[4854]: I1007 12:39:44.818237 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vvtx\" (UniqueName: \"kubernetes.io/projected/db03aba0-4761-426e-ac26-50f599cc0ad9-kube-api-access-5vvtx\") pod \"dnsmasq-dns-666b6646f7-cs286\" (UID: \"db03aba0-4761-426e-ac26-50f599cc0ad9\") " pod="openstack/dnsmasq-dns-666b6646f7-cs286" Oct 07 12:39:44 crc kubenswrapper[4854]: I1007 12:39:44.919786 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db03aba0-4761-426e-ac26-50f599cc0ad9-dns-svc\") pod \"dnsmasq-dns-666b6646f7-cs286\" (UID: \"db03aba0-4761-426e-ac26-50f599cc0ad9\") " pod="openstack/dnsmasq-dns-666b6646f7-cs286" Oct 07 12:39:44 crc kubenswrapper[4854]: I1007 12:39:44.920179 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db03aba0-4761-426e-ac26-50f599cc0ad9-config\") pod \"dnsmasq-dns-666b6646f7-cs286\" (UID: \"db03aba0-4761-426e-ac26-50f599cc0ad9\") " pod="openstack/dnsmasq-dns-666b6646f7-cs286" Oct 07 12:39:44 crc kubenswrapper[4854]: I1007 12:39:44.920359 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vvtx\" (UniqueName: \"kubernetes.io/projected/db03aba0-4761-426e-ac26-50f599cc0ad9-kube-api-access-5vvtx\") pod \"dnsmasq-dns-666b6646f7-cs286\" (UID: \"db03aba0-4761-426e-ac26-50f599cc0ad9\") " pod="openstack/dnsmasq-dns-666b6646f7-cs286" Oct 07 12:39:44 crc kubenswrapper[4854]: I1007 12:39:44.920998 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db03aba0-4761-426e-ac26-50f599cc0ad9-dns-svc\") pod \"dnsmasq-dns-666b6646f7-cs286\" (UID: \"db03aba0-4761-426e-ac26-50f599cc0ad9\") " pod="openstack/dnsmasq-dns-666b6646f7-cs286" Oct 07 12:39:44 crc kubenswrapper[4854]: I1007 12:39:44.921936 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db03aba0-4761-426e-ac26-50f599cc0ad9-config\") pod \"dnsmasq-dns-666b6646f7-cs286\" (UID: \"db03aba0-4761-426e-ac26-50f599cc0ad9\") " pod="openstack/dnsmasq-dns-666b6646f7-cs286" Oct 07 12:39:44 crc kubenswrapper[4854]: I1007 12:39:44.965711 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vvtx\" (UniqueName: 
\"kubernetes.io/projected/db03aba0-4761-426e-ac26-50f599cc0ad9-kube-api-access-5vvtx\") pod \"dnsmasq-dns-666b6646f7-cs286\" (UID: \"db03aba0-4761-426e-ac26-50f599cc0ad9\") " pod="openstack/dnsmasq-dns-666b6646f7-cs286" Oct 07 12:39:45 crc kubenswrapper[4854]: I1007 12:39:45.011019 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-5nwlr"] Oct 07 12:39:45 crc kubenswrapper[4854]: I1007 12:39:45.032659 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-ct6nt"] Oct 07 12:39:45 crc kubenswrapper[4854]: I1007 12:39:45.034319 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-ct6nt" Oct 07 12:39:45 crc kubenswrapper[4854]: I1007 12:39:45.049737 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-ct6nt"] Oct 07 12:39:45 crc kubenswrapper[4854]: I1007 12:39:45.061757 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-cs286" Oct 07 12:39:45 crc kubenswrapper[4854]: I1007 12:39:45.122422 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5aa9a983-c51a-4a92-9b76-cd1432f92f57-config\") pod \"dnsmasq-dns-57d769cc4f-ct6nt\" (UID: \"5aa9a983-c51a-4a92-9b76-cd1432f92f57\") " pod="openstack/dnsmasq-dns-57d769cc4f-ct6nt" Oct 07 12:39:45 crc kubenswrapper[4854]: I1007 12:39:45.122985 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhhv9\" (UniqueName: \"kubernetes.io/projected/5aa9a983-c51a-4a92-9b76-cd1432f92f57-kube-api-access-bhhv9\") pod \"dnsmasq-dns-57d769cc4f-ct6nt\" (UID: \"5aa9a983-c51a-4a92-9b76-cd1432f92f57\") " pod="openstack/dnsmasq-dns-57d769cc4f-ct6nt" Oct 07 12:39:45 crc kubenswrapper[4854]: I1007 12:39:45.123072 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5aa9a983-c51a-4a92-9b76-cd1432f92f57-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-ct6nt\" (UID: \"5aa9a983-c51a-4a92-9b76-cd1432f92f57\") " pod="openstack/dnsmasq-dns-57d769cc4f-ct6nt" Oct 07 12:39:45 crc kubenswrapper[4854]: I1007 12:39:45.226954 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhhv9\" (UniqueName: \"kubernetes.io/projected/5aa9a983-c51a-4a92-9b76-cd1432f92f57-kube-api-access-bhhv9\") pod \"dnsmasq-dns-57d769cc4f-ct6nt\" (UID: \"5aa9a983-c51a-4a92-9b76-cd1432f92f57\") " pod="openstack/dnsmasq-dns-57d769cc4f-ct6nt" Oct 07 12:39:45 crc kubenswrapper[4854]: I1007 12:39:45.227063 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5aa9a983-c51a-4a92-9b76-cd1432f92f57-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-ct6nt\" (UID: \"5aa9a983-c51a-4a92-9b76-cd1432f92f57\") " pod="openstack/dnsmasq-dns-57d769cc4f-ct6nt" Oct 07 12:39:45 crc kubenswrapper[4854]: I1007 12:39:45.227118 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5aa9a983-c51a-4a92-9b76-cd1432f92f57-config\") pod \"dnsmasq-dns-57d769cc4f-ct6nt\" (UID: \"5aa9a983-c51a-4a92-9b76-cd1432f92f57\") " pod="openstack/dnsmasq-dns-57d769cc4f-ct6nt" Oct 07 12:39:45 crc kubenswrapper[4854]: I1007 12:39:45.228446 4854 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5aa9a983-c51a-4a92-9b76-cd1432f92f57-config\") pod \"dnsmasq-dns-57d769cc4f-ct6nt\" (UID: \"5aa9a983-c51a-4a92-9b76-cd1432f92f57\") " pod="openstack/dnsmasq-dns-57d769cc4f-ct6nt" Oct 07 12:39:45 crc kubenswrapper[4854]: I1007 12:39:45.228600 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5aa9a983-c51a-4a92-9b76-cd1432f92f57-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-ct6nt\" (UID: \"5aa9a983-c51a-4a92-9b76-cd1432f92f57\") " pod="openstack/dnsmasq-dns-57d769cc4f-ct6nt" Oct 07 12:39:45 crc kubenswrapper[4854]: I1007 12:39:45.250450 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhhv9\" (UniqueName: \"kubernetes.io/projected/5aa9a983-c51a-4a92-9b76-cd1432f92f57-kube-api-access-bhhv9\") pod \"dnsmasq-dns-57d769cc4f-ct6nt\" (UID: \"5aa9a983-c51a-4a92-9b76-cd1432f92f57\") " pod="openstack/dnsmasq-dns-57d769cc4f-ct6nt" Oct 07 12:39:45 crc kubenswrapper[4854]: I1007 12:39:45.351838 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-ct6nt" Oct 07 12:39:45 crc kubenswrapper[4854]: I1007 12:39:45.569728 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-cs286"] Oct 07 12:39:45 crc kubenswrapper[4854]: W1007 12:39:45.587355 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb03aba0_4761_426e_ac26_50f599cc0ad9.slice/crio-e833ee49d0aba6c3b607ece3d1174e1345a873f2e18e6fd930661f6824d230b6 WatchSource:0}: Error finding container e833ee49d0aba6c3b607ece3d1174e1345a873f2e18e6fd930661f6824d230b6: Status 404 returned error can't find the container with id e833ee49d0aba6c3b607ece3d1174e1345a873f2e18e6fd930661f6824d230b6 Oct 07 12:39:45 crc kubenswrapper[4854]: I1007 12:39:45.822934 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-ct6nt"] Oct 07 12:39:45 crc kubenswrapper[4854]: W1007 12:39:45.831634 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5aa9a983_c51a_4a92_9b76_cd1432f92f57.slice/crio-57c0f6cf9b0cdc5e7790bf7f8f4eb70e7ddf9ee0077afa132a1586cd918c8bf8 WatchSource:0}: Error finding container 57c0f6cf9b0cdc5e7790bf7f8f4eb70e7ddf9ee0077afa132a1586cd918c8bf8: Status 404 returned error can't find the container with id 57c0f6cf9b0cdc5e7790bf7f8f4eb70e7ddf9ee0077afa132a1586cd918c8bf8 Oct 07 12:39:45 crc kubenswrapper[4854]: I1007 12:39:45.875897 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 07 12:39:45 crc kubenswrapper[4854]: I1007 12:39:45.877530 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 07 12:39:45 crc kubenswrapper[4854]: I1007 12:39:45.882941 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 07 12:39:45 crc kubenswrapper[4854]: I1007 12:39:45.883853 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 07 12:39:45 crc kubenswrapper[4854]: I1007 12:39:45.884091 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 07 12:39:45 crc kubenswrapper[4854]: I1007 12:39:45.884435 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 07 12:39:45 crc kubenswrapper[4854]: I1007 12:39:45.884597 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 07 12:39:45 crc kubenswrapper[4854]: I1007 12:39:45.884804 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 07 12:39:45 crc kubenswrapper[4854]: I1007 12:39:45.884999 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-wrd72" Oct 07 12:39:45 crc kubenswrapper[4854]: I1007 12:39:45.896830 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 07 12:39:45 crc kubenswrapper[4854]: I1007 12:39:45.940042 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4c293f13-b2a5-4d4b-9f69-fd118e34eab2\") " pod="openstack/rabbitmq-server-0" Oct 07 12:39:45 crc kubenswrapper[4854]: I1007 12:39:45.940116 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"4c293f13-b2a5-4d4b-9f69-fd118e34eab2\") " pod="openstack/rabbitmq-server-0" Oct 07 12:39:45 crc kubenswrapper[4854]: I1007 12:39:45.940187 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4c293f13-b2a5-4d4b-9f69-fd118e34eab2\") " pod="openstack/rabbitmq-server-0" Oct 07 12:39:45 crc kubenswrapper[4854]: I1007 12:39:45.940210 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4c293f13-b2a5-4d4b-9f69-fd118e34eab2\") " pod="openstack/rabbitmq-server-0" Oct 07 12:39:45 crc kubenswrapper[4854]: I1007 12:39:45.940260 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4c293f13-b2a5-4d4b-9f69-fd118e34eab2\") " pod="openstack/rabbitmq-server-0" Oct 07 12:39:45 crc kubenswrapper[4854]: I1007 12:39:45.940290 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4c293f13-b2a5-4d4b-9f69-fd118e34eab2\") " pod="openstack/rabbitmq-server-0" Oct 07 12:39:45 crc kubenswrapper[4854]: I1007 12:39:45.940328 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6h59t\" (UniqueName: \"kubernetes.io/projected/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-kube-api-access-6h59t\") pod \"rabbitmq-server-0\" (UID: \"4c293f13-b2a5-4d4b-9f69-fd118e34eab2\") " pod="openstack/rabbitmq-server-0" Oct 07 12:39:45 crc kubenswrapper[4854]: I1007 12:39:45.940400 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4c293f13-b2a5-4d4b-9f69-fd118e34eab2\") " pod="openstack/rabbitmq-server-0" Oct 07 12:39:45 crc kubenswrapper[4854]: I1007 12:39:45.940437 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-config-data\") pod \"rabbitmq-server-0\" (UID: \"4c293f13-b2a5-4d4b-9f69-fd118e34eab2\") " pod="openstack/rabbitmq-server-0" Oct 07 12:39:45 crc kubenswrapper[4854]: I1007 12:39:45.940462 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4c293f13-b2a5-4d4b-9f69-fd118e34eab2\") " pod="openstack/rabbitmq-server-0" Oct 07 12:39:45 crc kubenswrapper[4854]: I1007 12:39:45.940496 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"4c293f13-b2a5-4d4b-9f69-fd118e34eab2\") " pod="openstack/rabbitmq-server-0" Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.041831 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6h59t\" (UniqueName: \"kubernetes.io/projected/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-kube-api-access-6h59t\") pod \"rabbitmq-server-0\" (UID: \"4c293f13-b2a5-4d4b-9f69-fd118e34eab2\") " pod="openstack/rabbitmq-server-0" Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.041876 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4c293f13-b2a5-4d4b-9f69-fd118e34eab2\") " pod="openstack/rabbitmq-server-0" Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.041902 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-config-data\") pod \"rabbitmq-server-0\" (UID: \"4c293f13-b2a5-4d4b-9f69-fd118e34eab2\") " pod="openstack/rabbitmq-server-0" Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.041920 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4c293f13-b2a5-4d4b-9f69-fd118e34eab2\") " 
pod="openstack/rabbitmq-server-0" Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.041945 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"4c293f13-b2a5-4d4b-9f69-fd118e34eab2\") " pod="openstack/rabbitmq-server-0" Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.041972 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4c293f13-b2a5-4d4b-9f69-fd118e34eab2\") " pod="openstack/rabbitmq-server-0" Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.041998 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"4c293f13-b2a5-4d4b-9f69-fd118e34eab2\") " pod="openstack/rabbitmq-server-0" Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.042017 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4c293f13-b2a5-4d4b-9f69-fd118e34eab2\") " pod="openstack/rabbitmq-server-0" Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.042034 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4c293f13-b2a5-4d4b-9f69-fd118e34eab2\") " pod="openstack/rabbitmq-server-0" Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.042063 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4c293f13-b2a5-4d4b-9f69-fd118e34eab2\") " pod="openstack/rabbitmq-server-0" Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.042097 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4c293f13-b2a5-4d4b-9f69-fd118e34eab2\") " pod="openstack/rabbitmq-server-0" Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.043981 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4c293f13-b2a5-4d4b-9f69-fd118e34eab2\") " pod="openstack/rabbitmq-server-0" Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.044217 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4c293f13-b2a5-4d4b-9f69-fd118e34eab2\") " pod="openstack/rabbitmq-server-0" Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.044648 4854 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"4c293f13-b2a5-4d4b-9f69-fd118e34eab2\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-server-0" Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.044679 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4c293f13-b2a5-4d4b-9f69-fd118e34eab2\") " pod="openstack/rabbitmq-server-0" Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.045345 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-config-data\") pod \"rabbitmq-server-0\" (UID: \"4c293f13-b2a5-4d4b-9f69-fd118e34eab2\") " pod="openstack/rabbitmq-server-0" Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.046583 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4c293f13-b2a5-4d4b-9f69-fd118e34eab2\") " pod="openstack/rabbitmq-server-0" Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.048256 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4c293f13-b2a5-4d4b-9f69-fd118e34eab2\") " pod="openstack/rabbitmq-server-0" Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.048511 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4c293f13-b2a5-4d4b-9f69-fd118e34eab2\") " pod="openstack/rabbitmq-server-0" Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.048802 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"4c293f13-b2a5-4d4b-9f69-fd118e34eab2\") " pod="openstack/rabbitmq-server-0" Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.049242 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4c293f13-b2a5-4d4b-9f69-fd118e34eab2\") " pod="openstack/rabbitmq-server-0" Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.064475 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6h59t\" (UniqueName: \"kubernetes.io/projected/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-kube-api-access-6h59t\") pod \"rabbitmq-server-0\" (UID: \"4c293f13-b2a5-4d4b-9f69-fd118e34eab2\") " pod="openstack/rabbitmq-server-0" Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.070704 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"4c293f13-b2a5-4d4b-9f69-fd118e34eab2\") " pod="openstack/rabbitmq-server-0" Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.218177 4854 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/rabbitmq-cell1-server-0"] Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.221767 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.226483 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.226513 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.226862 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.226964 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-kkdtc" Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.227004 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.227038 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.227229 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.227232 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.232894 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.245214 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/79513100-48d2-4e7b-ae14-888322cab8f3-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"79513100-48d2-4e7b-ae14-888322cab8f3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.245279 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/79513100-48d2-4e7b-ae14-888322cab8f3-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"79513100-48d2-4e7b-ae14-888322cab8f3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.245311 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/79513100-48d2-4e7b-ae14-888322cab8f3-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"79513100-48d2-4e7b-ae14-888322cab8f3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.245356 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/79513100-48d2-4e7b-ae14-888322cab8f3-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"79513100-48d2-4e7b-ae14-888322cab8f3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.245378 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"server-conf\" (UniqueName: \"kubernetes.io/configmap/79513100-48d2-4e7b-ae14-888322cab8f3-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"79513100-48d2-4e7b-ae14-888322cab8f3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.245432 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/79513100-48d2-4e7b-ae14-888322cab8f3-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"79513100-48d2-4e7b-ae14-888322cab8f3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.245461 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"79513100-48d2-4e7b-ae14-888322cab8f3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.245483 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2l9x\" (UniqueName: \"kubernetes.io/projected/79513100-48d2-4e7b-ae14-888322cab8f3-kube-api-access-q2l9x\") pod \"rabbitmq-cell1-server-0\" (UID: \"79513100-48d2-4e7b-ae14-888322cab8f3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.245507 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/79513100-48d2-4e7b-ae14-888322cab8f3-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"79513100-48d2-4e7b-ae14-888322cab8f3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.245540 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/79513100-48d2-4e7b-ae14-888322cab8f3-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"79513100-48d2-4e7b-ae14-888322cab8f3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.245580 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/79513100-48d2-4e7b-ae14-888322cab8f3-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"79513100-48d2-4e7b-ae14-888322cab8f3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.348929 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"79513100-48d2-4e7b-ae14-888322cab8f3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.348994 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2l9x\" (UniqueName: \"kubernetes.io/projected/79513100-48d2-4e7b-ae14-888322cab8f3-kube-api-access-q2l9x\") pod \"rabbitmq-cell1-server-0\" (UID: \"79513100-48d2-4e7b-ae14-888322cab8f3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.349048 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/79513100-48d2-4e7b-ae14-888322cab8f3-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"79513100-48d2-4e7b-ae14-888322cab8f3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.349116 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/79513100-48d2-4e7b-ae14-888322cab8f3-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"79513100-48d2-4e7b-ae14-888322cab8f3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.349165 4854 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"79513100-48d2-4e7b-ae14-888322cab8f3\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-cell1-server-0" Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.349187 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/79513100-48d2-4e7b-ae14-888322cab8f3-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"79513100-48d2-4e7b-ae14-888322cab8f3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.349440 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/79513100-48d2-4e7b-ae14-888322cab8f3-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"79513100-48d2-4e7b-ae14-888322cab8f3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.349495 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/79513100-48d2-4e7b-ae14-888322cab8f3-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"79513100-48d2-4e7b-ae14-888322cab8f3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.349557 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/79513100-48d2-4e7b-ae14-888322cab8f3-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"79513100-48d2-4e7b-ae14-888322cab8f3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.349850 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/79513100-48d2-4e7b-ae14-888322cab8f3-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"79513100-48d2-4e7b-ae14-888322cab8f3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.349888 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/79513100-48d2-4e7b-ae14-888322cab8f3-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"79513100-48d2-4e7b-ae14-888322cab8f3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.350105 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/79513100-48d2-4e7b-ae14-888322cab8f3-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"79513100-48d2-4e7b-ae14-888322cab8f3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.350213 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/79513100-48d2-4e7b-ae14-888322cab8f3-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"79513100-48d2-4e7b-ae14-888322cab8f3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.350526 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/79513100-48d2-4e7b-ae14-888322cab8f3-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"79513100-48d2-4e7b-ae14-888322cab8f3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.350883 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/79513100-48d2-4e7b-ae14-888322cab8f3-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"79513100-48d2-4e7b-ae14-888322cab8f3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.353484 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/79513100-48d2-4e7b-ae14-888322cab8f3-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"79513100-48d2-4e7b-ae14-888322cab8f3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.353963 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/79513100-48d2-4e7b-ae14-888322cab8f3-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"79513100-48d2-4e7b-ae14-888322cab8f3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.362700 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/79513100-48d2-4e7b-ae14-888322cab8f3-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"79513100-48d2-4e7b-ae14-888322cab8f3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.363591 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/79513100-48d2-4e7b-ae14-888322cab8f3-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"79513100-48d2-4e7b-ae14-888322cab8f3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.376365 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/79513100-48d2-4e7b-ae14-888322cab8f3-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"79513100-48d2-4e7b-ae14-888322cab8f3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.376971 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/79513100-48d2-4e7b-ae14-888322cab8f3-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"79513100-48d2-4e7b-ae14-888322cab8f3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.391894 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-q2l9x\" (UniqueName: \"kubernetes.io/projected/79513100-48d2-4e7b-ae14-888322cab8f3-kube-api-access-q2l9x\") pod \"rabbitmq-cell1-server-0\" (UID: \"79513100-48d2-4e7b-ae14-888322cab8f3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.396300 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"79513100-48d2-4e7b-ae14-888322cab8f3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.552972 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-ct6nt" event={"ID":"5aa9a983-c51a-4a92-9b76-cd1432f92f57","Type":"ContainerStarted","Data":"57c0f6cf9b0cdc5e7790bf7f8f4eb70e7ddf9ee0077afa132a1586cd918c8bf8"} Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.556640 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-cs286" event={"ID":"db03aba0-4761-426e-ac26-50f599cc0ad9","Type":"ContainerStarted","Data":"e833ee49d0aba6c3b607ece3d1174e1345a873f2e18e6fd930661f6824d230b6"} Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.557558 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.725450 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 07 12:39:46 crc kubenswrapper[4854]: W1007 12:39:46.731083 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c293f13_b2a5_4d4b_9f69_fd118e34eab2.slice/crio-7c0980786f4f741e8ff31c00d722199c81181f404d93d164f100dc2152c83e6c WatchSource:0}: Error finding container 7c0980786f4f741e8ff31c00d722199c81181f404d93d164f100dc2152c83e6c: Status 404 returned error can't find the container with id 7c0980786f4f741e8ff31c00d722199c81181f404d93d164f100dc2152c83e6c Oct 07 12:39:46 crc kubenswrapper[4854]: I1007 12:39:46.993819 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 07 12:39:47 crc kubenswrapper[4854]: I1007 12:39:47.491391 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Oct 07 12:39:47 crc kubenswrapper[4854]: I1007 12:39:47.504213 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 07 12:39:47 crc kubenswrapper[4854]: I1007 12:39:47.507518 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-x6thd" Oct 07 12:39:47 crc kubenswrapper[4854]: I1007 12:39:47.511591 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Oct 07 12:39:47 crc kubenswrapper[4854]: I1007 12:39:47.511602 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Oct 07 12:39:47 crc kubenswrapper[4854]: I1007 12:39:47.513302 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Oct 07 12:39:47 crc kubenswrapper[4854]: I1007 12:39:47.514652 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Oct 07 12:39:47 crc kubenswrapper[4854]: I1007 12:39:47.518757 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Oct 07 12:39:47 crc kubenswrapper[4854]: I1007 12:39:47.523288 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 07 12:39:47 crc kubenswrapper[4854]: I1007 12:39:47.569216 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4c293f13-b2a5-4d4b-9f69-fd118e34eab2","Type":"ContainerStarted","Data":"7c0980786f4f741e8ff31c00d722199c81181f404d93d164f100dc2152c83e6c"} Oct 07 12:39:47 crc kubenswrapper[4854]: I1007 12:39:47.570915 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"79513100-48d2-4e7b-ae14-888322cab8f3","Type":"ContainerStarted","Data":"90a6236e4e1a2098e0ad321b015dc2006dca7a42f2f4eb0e997754f1087e31d6"} Oct 07 12:39:47 crc kubenswrapper[4854]: I1007 12:39:47.676484 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"e599e18f-63c0-4756-845c-973257921fd0\") " pod="openstack/openstack-galera-0" Oct 07 12:39:47 crc kubenswrapper[4854]: I1007 12:39:47.676545 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e599e18f-63c0-4756-845c-973257921fd0-config-data-default\") pod \"openstack-galera-0\" (UID: \"e599e18f-63c0-4756-845c-973257921fd0\") " pod="openstack/openstack-galera-0" Oct 07 12:39:47 crc kubenswrapper[4854]: I1007 12:39:47.676585 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e599e18f-63c0-4756-845c-973257921fd0-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"e599e18f-63c0-4756-845c-973257921fd0\") " pod="openstack/openstack-galera-0" Oct 07 12:39:47 crc kubenswrapper[4854]: I1007 12:39:47.676616 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e599e18f-63c0-4756-845c-973257921fd0-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"e599e18f-63c0-4756-845c-973257921fd0\") " pod="openstack/openstack-galera-0" Oct 07 12:39:47 crc kubenswrapper[4854]: I1007 12:39:47.676651 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/e599e18f-63c0-4756-845c-973257921fd0-operator-scripts\") pod \"openstack-galera-0\" (UID: \"e599e18f-63c0-4756-845c-973257921fd0\") " pod="openstack/openstack-galera-0" Oct 07 12:39:47 crc kubenswrapper[4854]: I1007 12:39:47.676673 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e599e18f-63c0-4756-845c-973257921fd0-kolla-config\") pod \"openstack-galera-0\" (UID: \"e599e18f-63c0-4756-845c-973257921fd0\") " pod="openstack/openstack-galera-0" Oct 07 12:39:47 crc kubenswrapper[4854]: I1007 12:39:47.676753 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e599e18f-63c0-4756-845c-973257921fd0-config-data-generated\") pod \"openstack-galera-0\" (UID: \"e599e18f-63c0-4756-845c-973257921fd0\") " pod="openstack/openstack-galera-0" Oct 07 12:39:47 crc kubenswrapper[4854]: I1007 12:39:47.676782 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6rdc\" (UniqueName: \"kubernetes.io/projected/e599e18f-63c0-4756-845c-973257921fd0-kube-api-access-z6rdc\") pod \"openstack-galera-0\" (UID: \"e599e18f-63c0-4756-845c-973257921fd0\") " pod="openstack/openstack-galera-0" Oct 07 12:39:47 crc kubenswrapper[4854]: I1007 12:39:47.676849 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/e599e18f-63c0-4756-845c-973257921fd0-secrets\") pod \"openstack-galera-0\" (UID: \"e599e18f-63c0-4756-845c-973257921fd0\") " pod="openstack/openstack-galera-0" Oct 07 12:39:47 crc kubenswrapper[4854]: I1007 12:39:47.778707 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e599e18f-63c0-4756-845c-973257921fd0-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"e599e18f-63c0-4756-845c-973257921fd0\") " pod="openstack/openstack-galera-0" Oct 07 12:39:47 crc kubenswrapper[4854]: I1007 12:39:47.778777 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e599e18f-63c0-4756-845c-973257921fd0-operator-scripts\") pod \"openstack-galera-0\" (UID: \"e599e18f-63c0-4756-845c-973257921fd0\") " pod="openstack/openstack-galera-0" Oct 07 12:39:47 crc kubenswrapper[4854]: I1007 12:39:47.778806 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e599e18f-63c0-4756-845c-973257921fd0-kolla-config\") pod \"openstack-galera-0\" (UID: \"e599e18f-63c0-4756-845c-973257921fd0\") " pod="openstack/openstack-galera-0" Oct 07 12:39:47 crc kubenswrapper[4854]: I1007 12:39:47.778872 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e599e18f-63c0-4756-845c-973257921fd0-config-data-generated\") pod \"openstack-galera-0\" (UID: \"e599e18f-63c0-4756-845c-973257921fd0\") " pod="openstack/openstack-galera-0" Oct 07 12:39:47 crc kubenswrapper[4854]: I1007 12:39:47.778892 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6rdc\" (UniqueName: \"kubernetes.io/projected/e599e18f-63c0-4756-845c-973257921fd0-kube-api-access-z6rdc\") pod 
\"openstack-galera-0\" (UID: \"e599e18f-63c0-4756-845c-973257921fd0\") " pod="openstack/openstack-galera-0" Oct 07 12:39:47 crc kubenswrapper[4854]: I1007 12:39:47.778941 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/e599e18f-63c0-4756-845c-973257921fd0-secrets\") pod \"openstack-galera-0\" (UID: \"e599e18f-63c0-4756-845c-973257921fd0\") " pod="openstack/openstack-galera-0" Oct 07 12:39:47 crc kubenswrapper[4854]: I1007 12:39:47.778965 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"e599e18f-63c0-4756-845c-973257921fd0\") " pod="openstack/openstack-galera-0" Oct 07 12:39:47 crc kubenswrapper[4854]: I1007 12:39:47.778984 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e599e18f-63c0-4756-845c-973257921fd0-config-data-default\") pod \"openstack-galera-0\" (UID: \"e599e18f-63c0-4756-845c-973257921fd0\") " pod="openstack/openstack-galera-0" Oct 07 12:39:47 crc kubenswrapper[4854]: I1007 12:39:47.779004 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e599e18f-63c0-4756-845c-973257921fd0-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"e599e18f-63c0-4756-845c-973257921fd0\") " pod="openstack/openstack-galera-0" Oct 07 12:39:47 crc kubenswrapper[4854]: I1007 12:39:47.779815 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e599e18f-63c0-4756-845c-973257921fd0-config-data-generated\") pod \"openstack-galera-0\" (UID: \"e599e18f-63c0-4756-845c-973257921fd0\") " pod="openstack/openstack-galera-0" Oct 07 12:39:47 crc kubenswrapper[4854]: I1007 12:39:47.780114 4854 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"e599e18f-63c0-4756-845c-973257921fd0\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/openstack-galera-0" Oct 07 12:39:47 crc kubenswrapper[4854]: I1007 12:39:47.780879 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e599e18f-63c0-4756-845c-973257921fd0-kolla-config\") pod \"openstack-galera-0\" (UID: \"e599e18f-63c0-4756-845c-973257921fd0\") " pod="openstack/openstack-galera-0" Oct 07 12:39:47 crc kubenswrapper[4854]: I1007 12:39:47.782549 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e599e18f-63c0-4756-845c-973257921fd0-operator-scripts\") pod \"openstack-galera-0\" (UID: \"e599e18f-63c0-4756-845c-973257921fd0\") " pod="openstack/openstack-galera-0" Oct 07 12:39:47 crc kubenswrapper[4854]: I1007 12:39:47.782636 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e599e18f-63c0-4756-845c-973257921fd0-config-data-default\") pod \"openstack-galera-0\" (UID: \"e599e18f-63c0-4756-845c-973257921fd0\") " pod="openstack/openstack-galera-0" Oct 07 12:39:47 crc kubenswrapper[4854]: I1007 12:39:47.787687 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" 
(UniqueName: \"kubernetes.io/secret/e599e18f-63c0-4756-845c-973257921fd0-secrets\") pod \"openstack-galera-0\" (UID: \"e599e18f-63c0-4756-845c-973257921fd0\") " pod="openstack/openstack-galera-0" Oct 07 12:39:47 crc kubenswrapper[4854]: I1007 12:39:47.788788 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e599e18f-63c0-4756-845c-973257921fd0-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"e599e18f-63c0-4756-845c-973257921fd0\") " pod="openstack/openstack-galera-0" Oct 07 12:39:47 crc kubenswrapper[4854]: I1007 12:39:47.792474 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e599e18f-63c0-4756-845c-973257921fd0-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"e599e18f-63c0-4756-845c-973257921fd0\") " pod="openstack/openstack-galera-0" Oct 07 12:39:47 crc kubenswrapper[4854]: I1007 12:39:47.803898 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"e599e18f-63c0-4756-845c-973257921fd0\") " pod="openstack/openstack-galera-0" Oct 07 12:39:47 crc kubenswrapper[4854]: I1007 12:39:47.807446 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6rdc\" (UniqueName: \"kubernetes.io/projected/e599e18f-63c0-4756-845c-973257921fd0-kube-api-access-z6rdc\") pod \"openstack-galera-0\" (UID: \"e599e18f-63c0-4756-845c-973257921fd0\") " pod="openstack/openstack-galera-0" Oct 07 12:39:47 crc kubenswrapper[4854]: I1007 12:39:47.852822 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 07 12:39:48 crc kubenswrapper[4854]: I1007 12:39:48.364905 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 07 12:39:48 crc kubenswrapper[4854]: I1007 12:39:48.583532 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e599e18f-63c0-4756-845c-973257921fd0","Type":"ContainerStarted","Data":"5cf807c10f39826889aef602ca0e5723b38bc982f61ccda6d7a95eb756b531b0"} Oct 07 12:39:48 crc kubenswrapper[4854]: I1007 12:39:48.748605 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 07 12:39:48 crc kubenswrapper[4854]: I1007 12:39:48.752261 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 07 12:39:48 crc kubenswrapper[4854]: I1007 12:39:48.758333 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Oct 07 12:39:48 crc kubenswrapper[4854]: I1007 12:39:48.760175 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-6vr6p" Oct 07 12:39:48 crc kubenswrapper[4854]: I1007 12:39:48.760276 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Oct 07 12:39:48 crc kubenswrapper[4854]: I1007 12:39:48.760388 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Oct 07 12:39:48 crc kubenswrapper[4854]: I1007 12:39:48.769378 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 07 12:39:48 crc kubenswrapper[4854]: I1007 12:39:48.902774 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f725ba88-4d40-4eab-890d-e114448fabe9-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"f725ba88-4d40-4eab-890d-e114448fabe9\") " pod="openstack/openstack-cell1-galera-0" Oct 07 12:39:48 crc kubenswrapper[4854]: I1007 12:39:48.902929 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f725ba88-4d40-4eab-890d-e114448fabe9-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"f725ba88-4d40-4eab-890d-e114448fabe9\") " pod="openstack/openstack-cell1-galera-0" Oct 07 12:39:48 crc kubenswrapper[4854]: I1007 12:39:48.903103 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f725ba88-4d40-4eab-890d-e114448fabe9-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f725ba88-4d40-4eab-890d-e114448fabe9\") " pod="openstack/openstack-cell1-galera-0" Oct 07 12:39:48 crc kubenswrapper[4854]: I1007 12:39:48.903133 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f725ba88-4d40-4eab-890d-e114448fabe9-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f725ba88-4d40-4eab-890d-e114448fabe9\") " pod="openstack/openstack-cell1-galera-0" Oct 07 12:39:48 crc kubenswrapper[4854]: I1007 12:39:48.903540 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f725ba88-4d40-4eab-890d-e114448fabe9-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"f725ba88-4d40-4eab-890d-e114448fabe9\") " pod="openstack/openstack-cell1-galera-0" Oct 07 12:39:48 crc kubenswrapper[4854]: I1007 12:39:48.903774 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bfsl\" (UniqueName: \"kubernetes.io/projected/f725ba88-4d40-4eab-890d-e114448fabe9-kube-api-access-6bfsl\") pod \"openstack-cell1-galera-0\" (UID: \"f725ba88-4d40-4eab-890d-e114448fabe9\") " pod="openstack/openstack-cell1-galera-0" Oct 07 12:39:48 crc kubenswrapper[4854]: I1007 12:39:48.911769 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" 
(UniqueName: \"kubernetes.io/secret/f725ba88-4d40-4eab-890d-e114448fabe9-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"f725ba88-4d40-4eab-890d-e114448fabe9\") " pod="openstack/openstack-cell1-galera-0" Oct 07 12:39:48 crc kubenswrapper[4854]: I1007 12:39:48.911913 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f725ba88-4d40-4eab-890d-e114448fabe9-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"f725ba88-4d40-4eab-890d-e114448fabe9\") " pod="openstack/openstack-cell1-galera-0" Oct 07 12:39:48 crc kubenswrapper[4854]: I1007 12:39:48.911937 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f725ba88-4d40-4eab-890d-e114448fabe9\") " pod="openstack/openstack-cell1-galera-0" Oct 07 12:39:49 crc kubenswrapper[4854]: I1007 12:39:49.013398 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f725ba88-4d40-4eab-890d-e114448fabe9-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"f725ba88-4d40-4eab-890d-e114448fabe9\") " pod="openstack/openstack-cell1-galera-0" Oct 07 12:39:49 crc kubenswrapper[4854]: I1007 12:39:49.013454 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f725ba88-4d40-4eab-890d-e114448fabe9-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"f725ba88-4d40-4eab-890d-e114448fabe9\") " pod="openstack/openstack-cell1-galera-0" Oct 07 12:39:49 crc kubenswrapper[4854]: I1007 12:39:49.014694 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f725ba88-4d40-4eab-890d-e114448fabe9-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f725ba88-4d40-4eab-890d-e114448fabe9\") " pod="openstack/openstack-cell1-galera-0" Oct 07 12:39:49 crc kubenswrapper[4854]: I1007 12:39:49.014728 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f725ba88-4d40-4eab-890d-e114448fabe9-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f725ba88-4d40-4eab-890d-e114448fabe9\") " pod="openstack/openstack-cell1-galera-0" Oct 07 12:39:49 crc kubenswrapper[4854]: I1007 12:39:49.014774 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f725ba88-4d40-4eab-890d-e114448fabe9-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"f725ba88-4d40-4eab-890d-e114448fabe9\") " pod="openstack/openstack-cell1-galera-0" Oct 07 12:39:49 crc kubenswrapper[4854]: I1007 12:39:49.014861 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bfsl\" (UniqueName: \"kubernetes.io/projected/f725ba88-4d40-4eab-890d-e114448fabe9-kube-api-access-6bfsl\") pod \"openstack-cell1-galera-0\" (UID: \"f725ba88-4d40-4eab-890d-e114448fabe9\") " pod="openstack/openstack-cell1-galera-0" Oct 07 12:39:49 crc kubenswrapper[4854]: I1007 12:39:49.014900 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/f725ba88-4d40-4eab-890d-e114448fabe9-secrets\") 
pod \"openstack-cell1-galera-0\" (UID: \"f725ba88-4d40-4eab-890d-e114448fabe9\") " pod="openstack/openstack-cell1-galera-0" Oct 07 12:39:49 crc kubenswrapper[4854]: I1007 12:39:49.014992 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f725ba88-4d40-4eab-890d-e114448fabe9-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"f725ba88-4d40-4eab-890d-e114448fabe9\") " pod="openstack/openstack-cell1-galera-0" Oct 07 12:39:49 crc kubenswrapper[4854]: I1007 12:39:49.015025 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f725ba88-4d40-4eab-890d-e114448fabe9\") " pod="openstack/openstack-cell1-galera-0" Oct 07 12:39:49 crc kubenswrapper[4854]: I1007 12:39:49.016714 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f725ba88-4d40-4eab-890d-e114448fabe9-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"f725ba88-4d40-4eab-890d-e114448fabe9\") " pod="openstack/openstack-cell1-galera-0" Oct 07 12:39:49 crc kubenswrapper[4854]: I1007 12:39:49.016840 4854 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f725ba88-4d40-4eab-890d-e114448fabe9\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/openstack-cell1-galera-0" Oct 07 12:39:49 crc kubenswrapper[4854]: I1007 12:39:49.017518 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f725ba88-4d40-4eab-890d-e114448fabe9-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"f725ba88-4d40-4eab-890d-e114448fabe9\") " pod="openstack/openstack-cell1-galera-0" Oct 07 12:39:49 crc kubenswrapper[4854]: I1007 12:39:49.020501 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f725ba88-4d40-4eab-890d-e114448fabe9-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f725ba88-4d40-4eab-890d-e114448fabe9\") " pod="openstack/openstack-cell1-galera-0" Oct 07 12:39:49 crc kubenswrapper[4854]: I1007 12:39:49.028459 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f725ba88-4d40-4eab-890d-e114448fabe9-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"f725ba88-4d40-4eab-890d-e114448fabe9\") " pod="openstack/openstack-cell1-galera-0" Oct 07 12:39:49 crc kubenswrapper[4854]: I1007 12:39:49.032835 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f725ba88-4d40-4eab-890d-e114448fabe9-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"f725ba88-4d40-4eab-890d-e114448fabe9\") " pod="openstack/openstack-cell1-galera-0" Oct 07 12:39:49 crc kubenswrapper[4854]: I1007 12:39:49.035227 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f725ba88-4d40-4eab-890d-e114448fabe9-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f725ba88-4d40-4eab-890d-e114448fabe9\") " pod="openstack/openstack-cell1-galera-0" Oct 07 12:39:49 crc 
kubenswrapper[4854]: I1007 12:39:49.047569 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/f725ba88-4d40-4eab-890d-e114448fabe9-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"f725ba88-4d40-4eab-890d-e114448fabe9\") " pod="openstack/openstack-cell1-galera-0" Oct 07 12:39:49 crc kubenswrapper[4854]: I1007 12:39:49.050634 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bfsl\" (UniqueName: \"kubernetes.io/projected/f725ba88-4d40-4eab-890d-e114448fabe9-kube-api-access-6bfsl\") pod \"openstack-cell1-galera-0\" (UID: \"f725ba88-4d40-4eab-890d-e114448fabe9\") " pod="openstack/openstack-cell1-galera-0" Oct 07 12:39:49 crc kubenswrapper[4854]: I1007 12:39:49.074048 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f725ba88-4d40-4eab-890d-e114448fabe9\") " pod="openstack/openstack-cell1-galera-0" Oct 07 12:39:49 crc kubenswrapper[4854]: I1007 12:39:49.089496 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 07 12:39:49 crc kubenswrapper[4854]: I1007 12:39:49.141319 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Oct 07 12:39:49 crc kubenswrapper[4854]: I1007 12:39:49.142552 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 07 12:39:49 crc kubenswrapper[4854]: I1007 12:39:49.149123 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Oct 07 12:39:49 crc kubenswrapper[4854]: I1007 12:39:49.149201 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Oct 07 12:39:49 crc kubenswrapper[4854]: I1007 12:39:49.149517 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-267tz" Oct 07 12:39:49 crc kubenswrapper[4854]: I1007 12:39:49.154515 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 07 12:39:49 crc kubenswrapper[4854]: I1007 12:39:49.325921 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nqgj\" (UniqueName: \"kubernetes.io/projected/c2cebadb-2142-477a-85b3-53e7c73fa6cc-kube-api-access-9nqgj\") pod \"memcached-0\" (UID: \"c2cebadb-2142-477a-85b3-53e7c73fa6cc\") " pod="openstack/memcached-0" Oct 07 12:39:49 crc kubenswrapper[4854]: I1007 12:39:49.326002 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c2cebadb-2142-477a-85b3-53e7c73fa6cc-kolla-config\") pod \"memcached-0\" (UID: \"c2cebadb-2142-477a-85b3-53e7c73fa6cc\") " pod="openstack/memcached-0" Oct 07 12:39:49 crc kubenswrapper[4854]: I1007 12:39:49.326080 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2cebadb-2142-477a-85b3-53e7c73fa6cc-memcached-tls-certs\") pod \"memcached-0\" (UID: \"c2cebadb-2142-477a-85b3-53e7c73fa6cc\") " pod="openstack/memcached-0" Oct 07 12:39:49 crc kubenswrapper[4854]: I1007 12:39:49.326099 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/c2cebadb-2142-477a-85b3-53e7c73fa6cc-combined-ca-bundle\") pod \"memcached-0\" (UID: \"c2cebadb-2142-477a-85b3-53e7c73fa6cc\") " pod="openstack/memcached-0" Oct 07 12:39:49 crc kubenswrapper[4854]: I1007 12:39:49.326179 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c2cebadb-2142-477a-85b3-53e7c73fa6cc-config-data\") pod \"memcached-0\" (UID: \"c2cebadb-2142-477a-85b3-53e7c73fa6cc\") " pod="openstack/memcached-0" Oct 07 12:39:49 crc kubenswrapper[4854]: I1007 12:39:49.427909 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2cebadb-2142-477a-85b3-53e7c73fa6cc-memcached-tls-certs\") pod \"memcached-0\" (UID: \"c2cebadb-2142-477a-85b3-53e7c73fa6cc\") " pod="openstack/memcached-0" Oct 07 12:39:49 crc kubenswrapper[4854]: I1007 12:39:49.428986 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2cebadb-2142-477a-85b3-53e7c73fa6cc-combined-ca-bundle\") pod \"memcached-0\" (UID: \"c2cebadb-2142-477a-85b3-53e7c73fa6cc\") " pod="openstack/memcached-0" Oct 07 12:39:49 crc kubenswrapper[4854]: I1007 12:39:49.429085 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c2cebadb-2142-477a-85b3-53e7c73fa6cc-config-data\") pod \"memcached-0\" (UID: \"c2cebadb-2142-477a-85b3-53e7c73fa6cc\") " pod="openstack/memcached-0" Oct 07 12:39:49 crc kubenswrapper[4854]: I1007 12:39:49.429170 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nqgj\" (UniqueName: \"kubernetes.io/projected/c2cebadb-2142-477a-85b3-53e7c73fa6cc-kube-api-access-9nqgj\") pod \"memcached-0\" (UID: \"c2cebadb-2142-477a-85b3-53e7c73fa6cc\") " pod="openstack/memcached-0" Oct 07 12:39:49 crc kubenswrapper[4854]: I1007 12:39:49.429230 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c2cebadb-2142-477a-85b3-53e7c73fa6cc-kolla-config\") pod \"memcached-0\" (UID: \"c2cebadb-2142-477a-85b3-53e7c73fa6cc\") " pod="openstack/memcached-0" Oct 07 12:39:49 crc kubenswrapper[4854]: I1007 12:39:49.430101 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c2cebadb-2142-477a-85b3-53e7c73fa6cc-kolla-config\") pod \"memcached-0\" (UID: \"c2cebadb-2142-477a-85b3-53e7c73fa6cc\") " pod="openstack/memcached-0" Oct 07 12:39:49 crc kubenswrapper[4854]: I1007 12:39:49.431068 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c2cebadb-2142-477a-85b3-53e7c73fa6cc-config-data\") pod \"memcached-0\" (UID: \"c2cebadb-2142-477a-85b3-53e7c73fa6cc\") " pod="openstack/memcached-0" Oct 07 12:39:49 crc kubenswrapper[4854]: I1007 12:39:49.435758 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2cebadb-2142-477a-85b3-53e7c73fa6cc-memcached-tls-certs\") pod \"memcached-0\" (UID: \"c2cebadb-2142-477a-85b3-53e7c73fa6cc\") " pod="openstack/memcached-0" Oct 07 12:39:49 crc kubenswrapper[4854]: I1007 12:39:49.437746 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/c2cebadb-2142-477a-85b3-53e7c73fa6cc-combined-ca-bundle\") pod \"memcached-0\" (UID: \"c2cebadb-2142-477a-85b3-53e7c73fa6cc\") " pod="openstack/memcached-0" Oct 07 12:39:49 crc kubenswrapper[4854]: I1007 12:39:49.450561 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nqgj\" (UniqueName: \"kubernetes.io/projected/c2cebadb-2142-477a-85b3-53e7c73fa6cc-kube-api-access-9nqgj\") pod \"memcached-0\" (UID: \"c2cebadb-2142-477a-85b3-53e7c73fa6cc\") " pod="openstack/memcached-0" Oct 07 12:39:49 crc kubenswrapper[4854]: I1007 12:39:49.477343 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 07 12:39:49 crc kubenswrapper[4854]: I1007 12:39:49.748585 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 07 12:39:50 crc kubenswrapper[4854]: I1007 12:39:50.005926 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 07 12:39:50 crc kubenswrapper[4854]: W1007 12:39:50.023902 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc2cebadb_2142_477a_85b3_53e7c73fa6cc.slice/crio-e8eeecea5a138704147e9142fc2e5ccab80504d6993b3a52faa6fd5a7ac302c7 WatchSource:0}: Error finding container e8eeecea5a138704147e9142fc2e5ccab80504d6993b3a52faa6fd5a7ac302c7: Status 404 returned error can't find the container with id e8eeecea5a138704147e9142fc2e5ccab80504d6993b3a52faa6fd5a7ac302c7 Oct 07 12:39:50 crc kubenswrapper[4854]: I1007 12:39:50.610358 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"c2cebadb-2142-477a-85b3-53e7c73fa6cc","Type":"ContainerStarted","Data":"e8eeecea5a138704147e9142fc2e5ccab80504d6993b3a52faa6fd5a7ac302c7"} Oct 07 12:39:50 crc kubenswrapper[4854]: I1007 12:39:50.615084 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f725ba88-4d40-4eab-890d-e114448fabe9","Type":"ContainerStarted","Data":"5edd9c8fe76bd7a06cde58a29ab4392ccc9451823d6be61f1cd64ada2c9e90b0"} Oct 07 12:39:51 crc kubenswrapper[4854]: I1007 12:39:51.116529 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 07 12:39:51 crc kubenswrapper[4854]: I1007 12:39:51.117704 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 07 12:39:51 crc kubenswrapper[4854]: I1007 12:39:51.122208 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-z298z" Oct 07 12:39:51 crc kubenswrapper[4854]: I1007 12:39:51.132881 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 07 12:39:51 crc kubenswrapper[4854]: I1007 12:39:51.271290 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2vhn\" (UniqueName: \"kubernetes.io/projected/6c132b4c-1591-4194-8912-637f54cea863-kube-api-access-d2vhn\") pod \"kube-state-metrics-0\" (UID: \"6c132b4c-1591-4194-8912-637f54cea863\") " pod="openstack/kube-state-metrics-0" Oct 07 12:39:51 crc kubenswrapper[4854]: I1007 12:39:51.373606 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2vhn\" (UniqueName: \"kubernetes.io/projected/6c132b4c-1591-4194-8912-637f54cea863-kube-api-access-d2vhn\") pod \"kube-state-metrics-0\" (UID: \"6c132b4c-1591-4194-8912-637f54cea863\") " pod="openstack/kube-state-metrics-0" Oct 07 12:39:51 crc kubenswrapper[4854]: I1007 12:39:51.394052 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2vhn\" (UniqueName: \"kubernetes.io/projected/6c132b4c-1591-4194-8912-637f54cea863-kube-api-access-d2vhn\") pod \"kube-state-metrics-0\" (UID: \"6c132b4c-1591-4194-8912-637f54cea863\") " pod="openstack/kube-state-metrics-0" Oct 07 12:39:51 crc kubenswrapper[4854]: I1007 12:39:51.476696 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 07 12:39:54 crc kubenswrapper[4854]: I1007 12:39:54.381632 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-hllqq"] Oct 07 12:39:54 crc kubenswrapper[4854]: I1007 12:39:54.383251 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hllqq" Oct 07 12:39:54 crc kubenswrapper[4854]: I1007 12:39:54.385254 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Oct 07 12:39:54 crc kubenswrapper[4854]: I1007 12:39:54.385515 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Oct 07 12:39:54 crc kubenswrapper[4854]: I1007 12:39:54.390723 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-lcv92" Oct 07 12:39:54 crc kubenswrapper[4854]: I1007 12:39:54.392634 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-j5h2b"] Oct 07 12:39:54 crc kubenswrapper[4854]: I1007 12:39:54.396271 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-j5h2b" Oct 07 12:39:54 crc kubenswrapper[4854]: I1007 12:39:54.428911 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-j5h2b"] Oct 07 12:39:54 crc kubenswrapper[4854]: I1007 12:39:54.483188 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-hllqq"] Oct 07 12:39:54 crc kubenswrapper[4854]: I1007 12:39:54.558743 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/847eb385-fc80-4568-813d-638dac11d81a-var-run\") pod \"ovn-controller-ovs-j5h2b\" (UID: \"847eb385-fc80-4568-813d-638dac11d81a\") " pod="openstack/ovn-controller-ovs-j5h2b" Oct 07 12:39:54 crc kubenswrapper[4854]: I1007 12:39:54.558794 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e6702f3-b113-49f9-b85f-a2d294bac6dc-ovn-controller-tls-certs\") pod \"ovn-controller-hllqq\" (UID: \"6e6702f3-b113-49f9-b85f-a2d294bac6dc\") " pod="openstack/ovn-controller-hllqq" Oct 07 12:39:54 crc kubenswrapper[4854]: I1007 12:39:54.558823 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6e6702f3-b113-49f9-b85f-a2d294bac6dc-var-log-ovn\") pod \"ovn-controller-hllqq\" (UID: \"6e6702f3-b113-49f9-b85f-a2d294bac6dc\") " pod="openstack/ovn-controller-hllqq" Oct 07 12:39:54 crc kubenswrapper[4854]: I1007 12:39:54.558850 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/847eb385-fc80-4568-813d-638dac11d81a-var-lib\") pod \"ovn-controller-ovs-j5h2b\" (UID: \"847eb385-fc80-4568-813d-638dac11d81a\") " pod="openstack/ovn-controller-ovs-j5h2b" Oct 07 12:39:54 crc kubenswrapper[4854]: I1007 12:39:54.558891 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jxd7\" (UniqueName: \"kubernetes.io/projected/847eb385-fc80-4568-813d-638dac11d81a-kube-api-access-4jxd7\") pod \"ovn-controller-ovs-j5h2b\" (UID: \"847eb385-fc80-4568-813d-638dac11d81a\") " pod="openstack/ovn-controller-ovs-j5h2b" Oct 07 12:39:54 crc kubenswrapper[4854]: I1007 12:39:54.558925 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6e6702f3-b113-49f9-b85f-a2d294bac6dc-scripts\") pod \"ovn-controller-hllqq\" (UID: \"6e6702f3-b113-49f9-b85f-a2d294bac6dc\") " pod="openstack/ovn-controller-hllqq" Oct 07 12:39:54 crc kubenswrapper[4854]: I1007 12:39:54.558940 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e6702f3-b113-49f9-b85f-a2d294bac6dc-combined-ca-bundle\") pod \"ovn-controller-hllqq\" (UID: \"6e6702f3-b113-49f9-b85f-a2d294bac6dc\") " pod="openstack/ovn-controller-hllqq" Oct 07 12:39:54 crc kubenswrapper[4854]: I1007 12:39:54.559174 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6e6702f3-b113-49f9-b85f-a2d294bac6dc-var-run-ovn\") pod \"ovn-controller-hllqq\" (UID: \"6e6702f3-b113-49f9-b85f-a2d294bac6dc\") " pod="openstack/ovn-controller-hllqq" Oct 07 12:39:54 crc 
kubenswrapper[4854]: I1007 12:39:54.559309 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7h566\" (UniqueName: \"kubernetes.io/projected/6e6702f3-b113-49f9-b85f-a2d294bac6dc-kube-api-access-7h566\") pod \"ovn-controller-hllqq\" (UID: \"6e6702f3-b113-49f9-b85f-a2d294bac6dc\") " pod="openstack/ovn-controller-hllqq" Oct 07 12:39:54 crc kubenswrapper[4854]: I1007 12:39:54.559392 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/847eb385-fc80-4568-813d-638dac11d81a-scripts\") pod \"ovn-controller-ovs-j5h2b\" (UID: \"847eb385-fc80-4568-813d-638dac11d81a\") " pod="openstack/ovn-controller-ovs-j5h2b" Oct 07 12:39:54 crc kubenswrapper[4854]: I1007 12:39:54.559462 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/847eb385-fc80-4568-813d-638dac11d81a-var-log\") pod \"ovn-controller-ovs-j5h2b\" (UID: \"847eb385-fc80-4568-813d-638dac11d81a\") " pod="openstack/ovn-controller-ovs-j5h2b" Oct 07 12:39:54 crc kubenswrapper[4854]: I1007 12:39:54.559510 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/847eb385-fc80-4568-813d-638dac11d81a-etc-ovs\") pod \"ovn-controller-ovs-j5h2b\" (UID: \"847eb385-fc80-4568-813d-638dac11d81a\") " pod="openstack/ovn-controller-ovs-j5h2b" Oct 07 12:39:54 crc kubenswrapper[4854]: I1007 12:39:54.559543 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6e6702f3-b113-49f9-b85f-a2d294bac6dc-var-run\") pod \"ovn-controller-hllqq\" (UID: \"6e6702f3-b113-49f9-b85f-a2d294bac6dc\") " pod="openstack/ovn-controller-hllqq" Oct 07 12:39:54 crc kubenswrapper[4854]: I1007 12:39:54.661117 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/847eb385-fc80-4568-813d-638dac11d81a-etc-ovs\") pod \"ovn-controller-ovs-j5h2b\" (UID: \"847eb385-fc80-4568-813d-638dac11d81a\") " pod="openstack/ovn-controller-ovs-j5h2b" Oct 07 12:39:54 crc kubenswrapper[4854]: I1007 12:39:54.661210 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6e6702f3-b113-49f9-b85f-a2d294bac6dc-var-run\") pod \"ovn-controller-hllqq\" (UID: \"6e6702f3-b113-49f9-b85f-a2d294bac6dc\") " pod="openstack/ovn-controller-hllqq" Oct 07 12:39:54 crc kubenswrapper[4854]: I1007 12:39:54.661270 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/847eb385-fc80-4568-813d-638dac11d81a-var-run\") pod \"ovn-controller-ovs-j5h2b\" (UID: \"847eb385-fc80-4568-813d-638dac11d81a\") " pod="openstack/ovn-controller-ovs-j5h2b" Oct 07 12:39:54 crc kubenswrapper[4854]: I1007 12:39:54.661302 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e6702f3-b113-49f9-b85f-a2d294bac6dc-ovn-controller-tls-certs\") pod \"ovn-controller-hllqq\" (UID: \"6e6702f3-b113-49f9-b85f-a2d294bac6dc\") " pod="openstack/ovn-controller-hllqq" Oct 07 12:39:54 crc kubenswrapper[4854]: I1007 12:39:54.661333 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6e6702f3-b113-49f9-b85f-a2d294bac6dc-var-log-ovn\") pod \"ovn-controller-hllqq\" (UID: \"6e6702f3-b113-49f9-b85f-a2d294bac6dc\") " pod="openstack/ovn-controller-hllqq" Oct 07 12:39:54 crc kubenswrapper[4854]: I1007 12:39:54.661366 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/847eb385-fc80-4568-813d-638dac11d81a-var-lib\") pod \"ovn-controller-ovs-j5h2b\" (UID: \"847eb385-fc80-4568-813d-638dac11d81a\") " pod="openstack/ovn-controller-ovs-j5h2b" Oct 07 12:39:54 crc kubenswrapper[4854]: I1007 12:39:54.661405 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jxd7\" (UniqueName: \"kubernetes.io/projected/847eb385-fc80-4568-813d-638dac11d81a-kube-api-access-4jxd7\") pod \"ovn-controller-ovs-j5h2b\" (UID: \"847eb385-fc80-4568-813d-638dac11d81a\") " pod="openstack/ovn-controller-ovs-j5h2b" Oct 07 12:39:54 crc kubenswrapper[4854]: I1007 12:39:54.661436 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e6702f3-b113-49f9-b85f-a2d294bac6dc-combined-ca-bundle\") pod \"ovn-controller-hllqq\" (UID: \"6e6702f3-b113-49f9-b85f-a2d294bac6dc\") " pod="openstack/ovn-controller-hllqq" Oct 07 12:39:54 crc kubenswrapper[4854]: I1007 12:39:54.661458 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6e6702f3-b113-49f9-b85f-a2d294bac6dc-scripts\") pod \"ovn-controller-hllqq\" (UID: \"6e6702f3-b113-49f9-b85f-a2d294bac6dc\") " pod="openstack/ovn-controller-hllqq" Oct 07 12:39:54 crc kubenswrapper[4854]: I1007 12:39:54.661510 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6e6702f3-b113-49f9-b85f-a2d294bac6dc-var-run-ovn\") pod \"ovn-controller-hllqq\" (UID: \"6e6702f3-b113-49f9-b85f-a2d294bac6dc\") " pod="openstack/ovn-controller-hllqq" Oct 07 12:39:54 crc kubenswrapper[4854]: I1007 12:39:54.661561 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7h566\" (UniqueName: \"kubernetes.io/projected/6e6702f3-b113-49f9-b85f-a2d294bac6dc-kube-api-access-7h566\") pod \"ovn-controller-hllqq\" (UID: \"6e6702f3-b113-49f9-b85f-a2d294bac6dc\") " pod="openstack/ovn-controller-hllqq" Oct 07 12:39:54 crc kubenswrapper[4854]: I1007 12:39:54.661585 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/847eb385-fc80-4568-813d-638dac11d81a-scripts\") pod \"ovn-controller-ovs-j5h2b\" (UID: \"847eb385-fc80-4568-813d-638dac11d81a\") " pod="openstack/ovn-controller-ovs-j5h2b" Oct 07 12:39:54 crc kubenswrapper[4854]: I1007 12:39:54.661610 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/847eb385-fc80-4568-813d-638dac11d81a-var-log\") pod \"ovn-controller-ovs-j5h2b\" (UID: \"847eb385-fc80-4568-813d-638dac11d81a\") " pod="openstack/ovn-controller-ovs-j5h2b" Oct 07 12:39:54 crc kubenswrapper[4854]: I1007 12:39:54.661864 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/847eb385-fc80-4568-813d-638dac11d81a-var-run\") pod \"ovn-controller-ovs-j5h2b\" (UID: \"847eb385-fc80-4568-813d-638dac11d81a\") " 
pod="openstack/ovn-controller-ovs-j5h2b" Oct 07 12:39:54 crc kubenswrapper[4854]: I1007 12:39:54.661906 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/847eb385-fc80-4568-813d-638dac11d81a-etc-ovs\") pod \"ovn-controller-ovs-j5h2b\" (UID: \"847eb385-fc80-4568-813d-638dac11d81a\") " pod="openstack/ovn-controller-ovs-j5h2b" Oct 07 12:39:54 crc kubenswrapper[4854]: I1007 12:39:54.662012 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/847eb385-fc80-4568-813d-638dac11d81a-var-log\") pod \"ovn-controller-ovs-j5h2b\" (UID: \"847eb385-fc80-4568-813d-638dac11d81a\") " pod="openstack/ovn-controller-ovs-j5h2b" Oct 07 12:39:54 crc kubenswrapper[4854]: I1007 12:39:54.662109 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6e6702f3-b113-49f9-b85f-a2d294bac6dc-var-log-ovn\") pod \"ovn-controller-hllqq\" (UID: \"6e6702f3-b113-49f9-b85f-a2d294bac6dc\") " pod="openstack/ovn-controller-hllqq" Oct 07 12:39:54 crc kubenswrapper[4854]: I1007 12:39:54.662444 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6e6702f3-b113-49f9-b85f-a2d294bac6dc-var-run-ovn\") pod \"ovn-controller-hllqq\" (UID: \"6e6702f3-b113-49f9-b85f-a2d294bac6dc\") " pod="openstack/ovn-controller-hllqq" Oct 07 12:39:54 crc kubenswrapper[4854]: I1007 12:39:54.662611 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/847eb385-fc80-4568-813d-638dac11d81a-var-lib\") pod \"ovn-controller-ovs-j5h2b\" (UID: \"847eb385-fc80-4568-813d-638dac11d81a\") " pod="openstack/ovn-controller-ovs-j5h2b" Oct 07 12:39:54 crc kubenswrapper[4854]: I1007 12:39:54.662720 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6e6702f3-b113-49f9-b85f-a2d294bac6dc-var-run\") pod \"ovn-controller-hllqq\" (UID: \"6e6702f3-b113-49f9-b85f-a2d294bac6dc\") " pod="openstack/ovn-controller-hllqq" Oct 07 12:39:54 crc kubenswrapper[4854]: I1007 12:39:54.664442 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6e6702f3-b113-49f9-b85f-a2d294bac6dc-scripts\") pod \"ovn-controller-hllqq\" (UID: \"6e6702f3-b113-49f9-b85f-a2d294bac6dc\") " pod="openstack/ovn-controller-hllqq" Oct 07 12:39:54 crc kubenswrapper[4854]: I1007 12:39:54.668958 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e6702f3-b113-49f9-b85f-a2d294bac6dc-combined-ca-bundle\") pod \"ovn-controller-hllqq\" (UID: \"6e6702f3-b113-49f9-b85f-a2d294bac6dc\") " pod="openstack/ovn-controller-hllqq" Oct 07 12:39:54 crc kubenswrapper[4854]: I1007 12:39:54.669579 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e6702f3-b113-49f9-b85f-a2d294bac6dc-ovn-controller-tls-certs\") pod \"ovn-controller-hllqq\" (UID: \"6e6702f3-b113-49f9-b85f-a2d294bac6dc\") " pod="openstack/ovn-controller-hllqq" Oct 07 12:39:54 crc kubenswrapper[4854]: I1007 12:39:54.686864 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jxd7\" (UniqueName: \"kubernetes.io/projected/847eb385-fc80-4568-813d-638dac11d81a-kube-api-access-4jxd7\") 
pod \"ovn-controller-ovs-j5h2b\" (UID: \"847eb385-fc80-4568-813d-638dac11d81a\") " pod="openstack/ovn-controller-ovs-j5h2b" Oct 07 12:39:54 crc kubenswrapper[4854]: I1007 12:39:54.686936 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7h566\" (UniqueName: \"kubernetes.io/projected/6e6702f3-b113-49f9-b85f-a2d294bac6dc-kube-api-access-7h566\") pod \"ovn-controller-hllqq\" (UID: \"6e6702f3-b113-49f9-b85f-a2d294bac6dc\") " pod="openstack/ovn-controller-hllqq" Oct 07 12:39:54 crc kubenswrapper[4854]: I1007 12:39:54.708750 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hllqq" Oct 07 12:39:54 crc kubenswrapper[4854]: I1007 12:39:54.759867 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/847eb385-fc80-4568-813d-638dac11d81a-scripts\") pod \"ovn-controller-ovs-j5h2b\" (UID: \"847eb385-fc80-4568-813d-638dac11d81a\") " pod="openstack/ovn-controller-ovs-j5h2b" Oct 07 12:39:55 crc kubenswrapper[4854]: I1007 12:39:55.018973 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-j5h2b" Oct 07 12:39:55 crc kubenswrapper[4854]: I1007 12:39:55.800192 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 07 12:39:55 crc kubenswrapper[4854]: I1007 12:39:55.802851 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 07 12:39:55 crc kubenswrapper[4854]: I1007 12:39:55.806200 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Oct 07 12:39:55 crc kubenswrapper[4854]: I1007 12:39:55.806657 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-hxd49" Oct 07 12:39:55 crc kubenswrapper[4854]: I1007 12:39:55.808480 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Oct 07 12:39:55 crc kubenswrapper[4854]: I1007 12:39:55.808606 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Oct 07 12:39:55 crc kubenswrapper[4854]: I1007 12:39:55.809006 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Oct 07 12:39:55 crc kubenswrapper[4854]: I1007 12:39:55.811111 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 07 12:39:55 crc kubenswrapper[4854]: I1007 12:39:55.999853 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"dbbd823b-2e1d-4901-855a-72cd9a13a6fd\") " pod="openstack/ovsdbserver-nb-0" Oct 07 12:39:55 crc kubenswrapper[4854]: I1007 12:39:55.999910 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbbd823b-2e1d-4901-855a-72cd9a13a6fd-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"dbbd823b-2e1d-4901-855a-72cd9a13a6fd\") " pod="openstack/ovsdbserver-nb-0" Oct 07 12:39:56 crc kubenswrapper[4854]: I1007 12:39:55.999933 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7v84\" (UniqueName: 
\"kubernetes.io/projected/dbbd823b-2e1d-4901-855a-72cd9a13a6fd-kube-api-access-x7v84\") pod \"ovsdbserver-nb-0\" (UID: \"dbbd823b-2e1d-4901-855a-72cd9a13a6fd\") " pod="openstack/ovsdbserver-nb-0" Oct 07 12:39:56 crc kubenswrapper[4854]: I1007 12:39:56.000000 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dbbd823b-2e1d-4901-855a-72cd9a13a6fd-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"dbbd823b-2e1d-4901-855a-72cd9a13a6fd\") " pod="openstack/ovsdbserver-nb-0" Oct 07 12:39:56 crc kubenswrapper[4854]: I1007 12:39:56.000036 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbbd823b-2e1d-4901-855a-72cd9a13a6fd-config\") pod \"ovsdbserver-nb-0\" (UID: \"dbbd823b-2e1d-4901-855a-72cd9a13a6fd\") " pod="openstack/ovsdbserver-nb-0" Oct 07 12:39:56 crc kubenswrapper[4854]: I1007 12:39:56.000075 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbbd823b-2e1d-4901-855a-72cd9a13a6fd-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"dbbd823b-2e1d-4901-855a-72cd9a13a6fd\") " pod="openstack/ovsdbserver-nb-0" Oct 07 12:39:56 crc kubenswrapper[4854]: I1007 12:39:56.000093 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbbd823b-2e1d-4901-855a-72cd9a13a6fd-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"dbbd823b-2e1d-4901-855a-72cd9a13a6fd\") " pod="openstack/ovsdbserver-nb-0" Oct 07 12:39:56 crc kubenswrapper[4854]: I1007 12:39:56.000109 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dbbd823b-2e1d-4901-855a-72cd9a13a6fd-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"dbbd823b-2e1d-4901-855a-72cd9a13a6fd\") " pod="openstack/ovsdbserver-nb-0" Oct 07 12:39:56 crc kubenswrapper[4854]: I1007 12:39:56.101347 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dbbd823b-2e1d-4901-855a-72cd9a13a6fd-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"dbbd823b-2e1d-4901-855a-72cd9a13a6fd\") " pod="openstack/ovsdbserver-nb-0" Oct 07 12:39:56 crc kubenswrapper[4854]: I1007 12:39:56.101430 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbbd823b-2e1d-4901-855a-72cd9a13a6fd-config\") pod \"ovsdbserver-nb-0\" (UID: \"dbbd823b-2e1d-4901-855a-72cd9a13a6fd\") " pod="openstack/ovsdbserver-nb-0" Oct 07 12:39:56 crc kubenswrapper[4854]: I1007 12:39:56.101496 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbbd823b-2e1d-4901-855a-72cd9a13a6fd-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"dbbd823b-2e1d-4901-855a-72cd9a13a6fd\") " pod="openstack/ovsdbserver-nb-0" Oct 07 12:39:56 crc kubenswrapper[4854]: I1007 12:39:56.101516 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbbd823b-2e1d-4901-855a-72cd9a13a6fd-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"dbbd823b-2e1d-4901-855a-72cd9a13a6fd\") " 
pod="openstack/ovsdbserver-nb-0" Oct 07 12:39:56 crc kubenswrapper[4854]: I1007 12:39:56.101536 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dbbd823b-2e1d-4901-855a-72cd9a13a6fd-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"dbbd823b-2e1d-4901-855a-72cd9a13a6fd\") " pod="openstack/ovsdbserver-nb-0" Oct 07 12:39:56 crc kubenswrapper[4854]: I1007 12:39:56.101590 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"dbbd823b-2e1d-4901-855a-72cd9a13a6fd\") " pod="openstack/ovsdbserver-nb-0" Oct 07 12:39:56 crc kubenswrapper[4854]: I1007 12:39:56.101610 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbbd823b-2e1d-4901-855a-72cd9a13a6fd-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"dbbd823b-2e1d-4901-855a-72cd9a13a6fd\") " pod="openstack/ovsdbserver-nb-0" Oct 07 12:39:56 crc kubenswrapper[4854]: I1007 12:39:56.101650 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7v84\" (UniqueName: \"kubernetes.io/projected/dbbd823b-2e1d-4901-855a-72cd9a13a6fd-kube-api-access-x7v84\") pod \"ovsdbserver-nb-0\" (UID: \"dbbd823b-2e1d-4901-855a-72cd9a13a6fd\") " pod="openstack/ovsdbserver-nb-0" Oct 07 12:39:56 crc kubenswrapper[4854]: I1007 12:39:56.102966 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dbbd823b-2e1d-4901-855a-72cd9a13a6fd-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"dbbd823b-2e1d-4901-855a-72cd9a13a6fd\") " pod="openstack/ovsdbserver-nb-0" Oct 07 12:39:56 crc kubenswrapper[4854]: I1007 12:39:56.103966 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbbd823b-2e1d-4901-855a-72cd9a13a6fd-config\") pod \"ovsdbserver-nb-0\" (UID: \"dbbd823b-2e1d-4901-855a-72cd9a13a6fd\") " pod="openstack/ovsdbserver-nb-0" Oct 07 12:39:56 crc kubenswrapper[4854]: I1007 12:39:56.104258 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dbbd823b-2e1d-4901-855a-72cd9a13a6fd-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"dbbd823b-2e1d-4901-855a-72cd9a13a6fd\") " pod="openstack/ovsdbserver-nb-0" Oct 07 12:39:56 crc kubenswrapper[4854]: I1007 12:39:56.104270 4854 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"dbbd823b-2e1d-4901-855a-72cd9a13a6fd\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/ovsdbserver-nb-0" Oct 07 12:39:56 crc kubenswrapper[4854]: I1007 12:39:56.111036 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbbd823b-2e1d-4901-855a-72cd9a13a6fd-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"dbbd823b-2e1d-4901-855a-72cd9a13a6fd\") " pod="openstack/ovsdbserver-nb-0" Oct 07 12:39:56 crc kubenswrapper[4854]: I1007 12:39:56.114514 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbbd823b-2e1d-4901-855a-72cd9a13a6fd-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: 
\"dbbd823b-2e1d-4901-855a-72cd9a13a6fd\") " pod="openstack/ovsdbserver-nb-0" Oct 07 12:39:56 crc kubenswrapper[4854]: I1007 12:39:56.122241 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7v84\" (UniqueName: \"kubernetes.io/projected/dbbd823b-2e1d-4901-855a-72cd9a13a6fd-kube-api-access-x7v84\") pod \"ovsdbserver-nb-0\" (UID: \"dbbd823b-2e1d-4901-855a-72cd9a13a6fd\") " pod="openstack/ovsdbserver-nb-0" Oct 07 12:39:56 crc kubenswrapper[4854]: I1007 12:39:56.128455 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbbd823b-2e1d-4901-855a-72cd9a13a6fd-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"dbbd823b-2e1d-4901-855a-72cd9a13a6fd\") " pod="openstack/ovsdbserver-nb-0" Oct 07 12:39:56 crc kubenswrapper[4854]: I1007 12:39:56.154165 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"dbbd823b-2e1d-4901-855a-72cd9a13a6fd\") " pod="openstack/ovsdbserver-nb-0" Oct 07 12:39:56 crc kubenswrapper[4854]: I1007 12:39:56.430392 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 07 12:39:58 crc kubenswrapper[4854]: I1007 12:39:58.568161 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 07 12:39:58 crc kubenswrapper[4854]: I1007 12:39:58.570234 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 07 12:39:58 crc kubenswrapper[4854]: I1007 12:39:58.575356 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 07 12:39:58 crc kubenswrapper[4854]: I1007 12:39:58.576967 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Oct 07 12:39:58 crc kubenswrapper[4854]: I1007 12:39:58.577325 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Oct 07 12:39:58 crc kubenswrapper[4854]: I1007 12:39:58.577513 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-w8wf5" Oct 07 12:39:58 crc kubenswrapper[4854]: I1007 12:39:58.577679 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Oct 07 12:39:58 crc kubenswrapper[4854]: I1007 12:39:58.755835 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f63842c-f85f-4e07-8221-4ce96b22bf44-config\") pod \"ovsdbserver-sb-0\" (UID: \"6f63842c-f85f-4e07-8221-4ce96b22bf44\") " pod="openstack/ovsdbserver-sb-0" Oct 07 12:39:58 crc kubenswrapper[4854]: I1007 12:39:58.755899 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6f63842c-f85f-4e07-8221-4ce96b22bf44-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"6f63842c-f85f-4e07-8221-4ce96b22bf44\") " pod="openstack/ovsdbserver-sb-0" Oct 07 12:39:58 crc kubenswrapper[4854]: I1007 12:39:58.755946 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f63842c-f85f-4e07-8221-4ce96b22bf44-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" 
(UID: \"6f63842c-f85f-4e07-8221-4ce96b22bf44\") " pod="openstack/ovsdbserver-sb-0" Oct 07 12:39:58 crc kubenswrapper[4854]: I1007 12:39:58.755977 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"6f63842c-f85f-4e07-8221-4ce96b22bf44\") " pod="openstack/ovsdbserver-sb-0" Oct 07 12:39:58 crc kubenswrapper[4854]: I1007 12:39:58.756025 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6f63842c-f85f-4e07-8221-4ce96b22bf44-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"6f63842c-f85f-4e07-8221-4ce96b22bf44\") " pod="openstack/ovsdbserver-sb-0" Oct 07 12:39:58 crc kubenswrapper[4854]: I1007 12:39:58.756287 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f63842c-f85f-4e07-8221-4ce96b22bf44-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"6f63842c-f85f-4e07-8221-4ce96b22bf44\") " pod="openstack/ovsdbserver-sb-0" Oct 07 12:39:58 crc kubenswrapper[4854]: I1007 12:39:58.756365 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f63842c-f85f-4e07-8221-4ce96b22bf44-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"6f63842c-f85f-4e07-8221-4ce96b22bf44\") " pod="openstack/ovsdbserver-sb-0" Oct 07 12:39:58 crc kubenswrapper[4854]: I1007 12:39:58.756494 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j59bh\" (UniqueName: \"kubernetes.io/projected/6f63842c-f85f-4e07-8221-4ce96b22bf44-kube-api-access-j59bh\") pod \"ovsdbserver-sb-0\" (UID: \"6f63842c-f85f-4e07-8221-4ce96b22bf44\") " pod="openstack/ovsdbserver-sb-0" Oct 07 12:39:58 crc kubenswrapper[4854]: I1007 12:39:58.858846 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j59bh\" (UniqueName: \"kubernetes.io/projected/6f63842c-f85f-4e07-8221-4ce96b22bf44-kube-api-access-j59bh\") pod \"ovsdbserver-sb-0\" (UID: \"6f63842c-f85f-4e07-8221-4ce96b22bf44\") " pod="openstack/ovsdbserver-sb-0" Oct 07 12:39:58 crc kubenswrapper[4854]: I1007 12:39:58.859249 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f63842c-f85f-4e07-8221-4ce96b22bf44-config\") pod \"ovsdbserver-sb-0\" (UID: \"6f63842c-f85f-4e07-8221-4ce96b22bf44\") " pod="openstack/ovsdbserver-sb-0" Oct 07 12:39:58 crc kubenswrapper[4854]: I1007 12:39:58.859354 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6f63842c-f85f-4e07-8221-4ce96b22bf44-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"6f63842c-f85f-4e07-8221-4ce96b22bf44\") " pod="openstack/ovsdbserver-sb-0" Oct 07 12:39:58 crc kubenswrapper[4854]: I1007 12:39:58.859469 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f63842c-f85f-4e07-8221-4ce96b22bf44-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"6f63842c-f85f-4e07-8221-4ce96b22bf44\") " pod="openstack/ovsdbserver-sb-0" Oct 07 12:39:58 crc kubenswrapper[4854]: I1007 12:39:58.859554 4854 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"6f63842c-f85f-4e07-8221-4ce96b22bf44\") " pod="openstack/ovsdbserver-sb-0" Oct 07 12:39:58 crc kubenswrapper[4854]: I1007 12:39:58.859777 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6f63842c-f85f-4e07-8221-4ce96b22bf44-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"6f63842c-f85f-4e07-8221-4ce96b22bf44\") " pod="openstack/ovsdbserver-sb-0" Oct 07 12:39:58 crc kubenswrapper[4854]: I1007 12:39:58.859875 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f63842c-f85f-4e07-8221-4ce96b22bf44-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"6f63842c-f85f-4e07-8221-4ce96b22bf44\") " pod="openstack/ovsdbserver-sb-0" Oct 07 12:39:58 crc kubenswrapper[4854]: I1007 12:39:58.859977 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f63842c-f85f-4e07-8221-4ce96b22bf44-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"6f63842c-f85f-4e07-8221-4ce96b22bf44\") " pod="openstack/ovsdbserver-sb-0" Oct 07 12:39:58 crc kubenswrapper[4854]: I1007 12:39:58.860250 4854 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"6f63842c-f85f-4e07-8221-4ce96b22bf44\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/ovsdbserver-sb-0" Oct 07 12:39:58 crc kubenswrapper[4854]: I1007 12:39:58.861188 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6f63842c-f85f-4e07-8221-4ce96b22bf44-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"6f63842c-f85f-4e07-8221-4ce96b22bf44\") " pod="openstack/ovsdbserver-sb-0" Oct 07 12:39:58 crc kubenswrapper[4854]: I1007 12:39:58.861418 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f63842c-f85f-4e07-8221-4ce96b22bf44-config\") pod \"ovsdbserver-sb-0\" (UID: \"6f63842c-f85f-4e07-8221-4ce96b22bf44\") " pod="openstack/ovsdbserver-sb-0" Oct 07 12:39:58 crc kubenswrapper[4854]: I1007 12:39:58.861696 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6f63842c-f85f-4e07-8221-4ce96b22bf44-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"6f63842c-f85f-4e07-8221-4ce96b22bf44\") " pod="openstack/ovsdbserver-sb-0" Oct 07 12:39:58 crc kubenswrapper[4854]: I1007 12:39:58.866986 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f63842c-f85f-4e07-8221-4ce96b22bf44-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"6f63842c-f85f-4e07-8221-4ce96b22bf44\") " pod="openstack/ovsdbserver-sb-0" Oct 07 12:39:58 crc kubenswrapper[4854]: I1007 12:39:58.875890 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f63842c-f85f-4e07-8221-4ce96b22bf44-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"6f63842c-f85f-4e07-8221-4ce96b22bf44\") " pod="openstack/ovsdbserver-sb-0" Oct 07 12:39:58 
crc kubenswrapper[4854]: I1007 12:39:58.885289 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f63842c-f85f-4e07-8221-4ce96b22bf44-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"6f63842c-f85f-4e07-8221-4ce96b22bf44\") " pod="openstack/ovsdbserver-sb-0" Oct 07 12:39:58 crc kubenswrapper[4854]: I1007 12:39:58.885317 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j59bh\" (UniqueName: \"kubernetes.io/projected/6f63842c-f85f-4e07-8221-4ce96b22bf44-kube-api-access-j59bh\") pod \"ovsdbserver-sb-0\" (UID: \"6f63842c-f85f-4e07-8221-4ce96b22bf44\") " pod="openstack/ovsdbserver-sb-0" Oct 07 12:39:58 crc kubenswrapper[4854]: I1007 12:39:58.887430 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"6f63842c-f85f-4e07-8221-4ce96b22bf44\") " pod="openstack/ovsdbserver-sb-0" Oct 07 12:39:58 crc kubenswrapper[4854]: I1007 12:39:58.907834 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 07 12:40:07 crc kubenswrapper[4854]: E1007 12:40:07.160564 4854 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Oct 07 12:40:07 crc kubenswrapper[4854]: E1007 12:40:07.161655 4854 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q2l9x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(79513100-48d2-4e7b-ae14-888322cab8f3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 07 12:40:07 crc kubenswrapper[4854]: E1007 12:40:07.162930 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="79513100-48d2-4e7b-ae14-888322cab8f3" Oct 07 12:40:07 crc kubenswrapper[4854]: E1007 12:40:07.854104 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="79513100-48d2-4e7b-ae14-888322cab8f3" Oct 07 12:40:18 crc kubenswrapper[4854]: E1007 12:40:18.187605 4854 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Oct 07 12:40:18 crc kubenswrapper[4854]: E1007 12:40:18.188410 4854 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash 
/var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:DB_ROOT_PASSWORD,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:osp-secret,},Key:DbRootPassword,Optional:nil,},},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:secrets,ReadOnly:true,MountPath:/var/lib/secrets,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6bfsl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-cell1-galera-0_openstack(f725ba88-4d40-4eab-890d-e114448fabe9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 07 12:40:18 crc kubenswrapper[4854]: E1007 12:40:18.189654 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-cell1-galera-0" podUID="f725ba88-4d40-4eab-890d-e114448fabe9" Oct 07 12:40:18 crc kubenswrapper[4854]: E1007 12:40:18.663462 4854 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Oct 07 12:40:18 crc kubenswrapper[4854]: E1007 12:40:18.663729 4854 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins 
/operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6h59t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(4c293f13-b2a5-4d4b-9f69-fd118e34eab2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 07 12:40:18 crc kubenswrapper[4854]: E1007 12:40:18.664945 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="4c293f13-b2a5-4d4b-9f69-fd118e34eab2" Oct 07 12:40:18 crc kubenswrapper[4854]: E1007 12:40:18.944363 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-server-0" podUID="4c293f13-b2a5-4d4b-9f69-fd118e34eab2" Oct 07 12:40:19 crc kubenswrapper[4854]: E1007 12:40:19.438863 4854 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-memcached:current-podified" Oct 07 12:40:19 crc kubenswrapper[4854]: E1007 
12:40:19.439081 4854 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:memcached,Image:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,Command:[/usr/bin/dumb-init -- /usr/local/bin/kolla_start],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:memcached,HostPort:0,ContainerPort:11211,Protocol:TCP,HostIP:,},ContainerPort{Name:memcached-tls,HostPort:0,ContainerPort:11212,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:POD_IPS,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIPs,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:CONFIG_HASH,Value:n97h56ch678h5dbhddh5cch674h54dh5d9h66fh547hd8h545h99hfh576hfdh5b5hd6h688h684h66h66bh649h5ch5c8h577h5cbh76h68bhdhddq,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/src,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/certs/memcached.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/private/memcached.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9nqgj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42457,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42457,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod memcached-0_openstack(c2cebadb-2142-477a-85b3-53e7c73fa6cc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 07 12:40:19 crc kubenswrapper[4854]: E1007 12:40:19.440208 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ErrImagePull: \"rpc error: code 
= Canceled desc = copying config: context canceled\"" pod="openstack/memcached-0" podUID="c2cebadb-2142-477a-85b3-53e7c73fa6cc" Oct 07 12:40:19 crc kubenswrapper[4854]: E1007 12:40:19.952061 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-memcached:current-podified\\\"\"" pod="openstack/memcached-0" podUID="c2cebadb-2142-477a-85b3-53e7c73fa6cc" Oct 07 12:40:20 crc kubenswrapper[4854]: E1007 12:40:20.314671 4854 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 07 12:40:20 crc kubenswrapper[4854]: E1007 12:40:20.314955 4854 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bhhv9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-ct6nt_openstack(5aa9a983-c51a-4a92-9b76-cd1432f92f57): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 07 12:40:20 crc kubenswrapper[4854]: E1007 12:40:20.316405 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-ct6nt" podUID="5aa9a983-c51a-4a92-9b76-cd1432f92f57" Oct 07 12:40:20 
crc kubenswrapper[4854]: E1007 12:40:20.318776 4854 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 07 12:40:20 crc kubenswrapper[4854]: E1007 12:40:20.318930 4854 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l9fgb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-5nwlr_openstack(0c261705-13dc-4a3b-b7ad-6e31f546cb50): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 07 12:40:20 crc kubenswrapper[4854]: E1007 12:40:20.320175 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-5nwlr" podUID="0c261705-13dc-4a3b-b7ad-6e31f546cb50" Oct 07 12:40:20 crc kubenswrapper[4854]: E1007 12:40:20.340731 4854 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 07 12:40:20 crc kubenswrapper[4854]: E1007 12:40:20.340858 4854 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5vvtx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-cs286_openstack(db03aba0-4761-426e-ac26-50f599cc0ad9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 07 12:40:20 crc kubenswrapper[4854]: E1007 12:40:20.342245 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-cs286" podUID="db03aba0-4761-426e-ac26-50f599cc0ad9" Oct 07 12:40:20 crc kubenswrapper[4854]: E1007 12:40:20.536888 4854 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 07 12:40:20 crc kubenswrapper[4854]: E1007 12:40:20.537391 4854 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4zgz4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-9nxdf_openstack(14e4f50b-e577-497e-9ffb-24e3fe0e93c5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 07 12:40:20 crc kubenswrapper[4854]: E1007 12:40:20.539234 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-9nxdf" podUID="14e4f50b-e577-497e-9ffb-24e3fe0e93c5" Oct 07 12:40:20 crc kubenswrapper[4854]: I1007 12:40:20.777100 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 07 12:40:20 crc kubenswrapper[4854]: W1007 12:40:20.780842 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c132b4c_1591_4194_8912_637f54cea863.slice/crio-556ab3cd7fa1d85d049f918caeb3a5b0e00dd0377241a23fc44567af89a5a0f2 WatchSource:0}: Error finding container 556ab3cd7fa1d85d049f918caeb3a5b0e00dd0377241a23fc44567af89a5a0f2: Status 404 returned error can't find the container with id 556ab3cd7fa1d85d049f918caeb3a5b0e00dd0377241a23fc44567af89a5a0f2 Oct 07 12:40:20 crc kubenswrapper[4854]: I1007 12:40:20.873083 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-hllqq"] Oct 07 12:40:20 crc kubenswrapper[4854]: W1007 12:40:20.890717 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e6702f3_b113_49f9_b85f_a2d294bac6dc.slice/crio-02e5362acc7369cd09a76a5490e1b8a1968c6ee1ced875ad0db909a6d49b1a47 WatchSource:0}: Error finding container 02e5362acc7369cd09a76a5490e1b8a1968c6ee1ced875ad0db909a6d49b1a47: Status 404 returned error can't find the container with id 02e5362acc7369cd09a76a5490e1b8a1968c6ee1ced875ad0db909a6d49b1a47 Oct 07 12:40:20 crc kubenswrapper[4854]: I1007 12:40:20.957973 4854 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"6c132b4c-1591-4194-8912-637f54cea863","Type":"ContainerStarted","Data":"556ab3cd7fa1d85d049f918caeb3a5b0e00dd0377241a23fc44567af89a5a0f2"} Oct 07 12:40:20 crc kubenswrapper[4854]: I1007 12:40:20.959232 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hllqq" event={"ID":"6e6702f3-b113-49f9-b85f-a2d294bac6dc","Type":"ContainerStarted","Data":"02e5362acc7369cd09a76a5490e1b8a1968c6ee1ced875ad0db909a6d49b1a47"} Oct 07 12:40:20 crc kubenswrapper[4854]: I1007 12:40:20.962877 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f725ba88-4d40-4eab-890d-e114448fabe9","Type":"ContainerStarted","Data":"17b9a36db2be4e7a076fcf2c8b6464f79d7b5f4ecc9432981400b860803f26bc"} Oct 07 12:40:20 crc kubenswrapper[4854]: I1007 12:40:20.966925 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e599e18f-63c0-4756-845c-973257921fd0","Type":"ContainerStarted","Data":"9c4dd23c8a3639b3bf49f7fd9dbee5833829885f4ce50007c496ab2ed016afff"} Oct 07 12:40:20 crc kubenswrapper[4854]: I1007 12:40:20.966969 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 07 12:40:20 crc kubenswrapper[4854]: E1007 12:40:20.968467 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-cs286" podUID="db03aba0-4761-426e-ac26-50f599cc0ad9" Oct 07 12:40:20 crc kubenswrapper[4854]: E1007 12:40:20.969984 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-ct6nt" podUID="5aa9a983-c51a-4a92-9b76-cd1432f92f57" Oct 07 12:40:20 crc kubenswrapper[4854]: W1007 12:40:20.970473 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddbbd823b_2e1d_4901_855a_72cd9a13a6fd.slice/crio-bb54c736c2d80891bb8b405ef6a3e05368ba386d2418cd4bc7daf465ad50f464 WatchSource:0}: Error finding container bb54c736c2d80891bb8b405ef6a3e05368ba386d2418cd4bc7daf465ad50f464: Status 404 returned error can't find the container with id bb54c736c2d80891bb8b405ef6a3e05368ba386d2418cd4bc7daf465ad50f464 Oct 07 12:40:21 crc kubenswrapper[4854]: I1007 12:40:21.392448 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-5nwlr" Oct 07 12:40:21 crc kubenswrapper[4854]: I1007 12:40:21.398703 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-9nxdf" Oct 07 12:40:21 crc kubenswrapper[4854]: I1007 12:40:21.539509 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0c261705-13dc-4a3b-b7ad-6e31f546cb50-dns-svc\") pod \"0c261705-13dc-4a3b-b7ad-6e31f546cb50\" (UID: \"0c261705-13dc-4a3b-b7ad-6e31f546cb50\") " Oct 07 12:40:21 crc kubenswrapper[4854]: I1007 12:40:21.539566 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c261705-13dc-4a3b-b7ad-6e31f546cb50-config\") pod \"0c261705-13dc-4a3b-b7ad-6e31f546cb50\" (UID: \"0c261705-13dc-4a3b-b7ad-6e31f546cb50\") " Oct 07 12:40:21 crc kubenswrapper[4854]: I1007 12:40:21.539624 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14e4f50b-e577-497e-9ffb-24e3fe0e93c5-config\") pod \"14e4f50b-e577-497e-9ffb-24e3fe0e93c5\" (UID: \"14e4f50b-e577-497e-9ffb-24e3fe0e93c5\") " Oct 07 12:40:21 crc kubenswrapper[4854]: I1007 12:40:21.539745 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zgz4\" (UniqueName: \"kubernetes.io/projected/14e4f50b-e577-497e-9ffb-24e3fe0e93c5-kube-api-access-4zgz4\") pod \"14e4f50b-e577-497e-9ffb-24e3fe0e93c5\" (UID: \"14e4f50b-e577-497e-9ffb-24e3fe0e93c5\") " Oct 07 12:40:21 crc kubenswrapper[4854]: I1007 12:40:21.539784 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9fgb\" (UniqueName: \"kubernetes.io/projected/0c261705-13dc-4a3b-b7ad-6e31f546cb50-kube-api-access-l9fgb\") pod \"0c261705-13dc-4a3b-b7ad-6e31f546cb50\" (UID: \"0c261705-13dc-4a3b-b7ad-6e31f546cb50\") " Oct 07 12:40:21 crc kubenswrapper[4854]: I1007 12:40:21.540209 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c261705-13dc-4a3b-b7ad-6e31f546cb50-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0c261705-13dc-4a3b-b7ad-6e31f546cb50" (UID: "0c261705-13dc-4a3b-b7ad-6e31f546cb50"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:40:21 crc kubenswrapper[4854]: I1007 12:40:21.540234 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c261705-13dc-4a3b-b7ad-6e31f546cb50-config" (OuterVolumeSpecName: "config") pod "0c261705-13dc-4a3b-b7ad-6e31f546cb50" (UID: "0c261705-13dc-4a3b-b7ad-6e31f546cb50"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:40:21 crc kubenswrapper[4854]: I1007 12:40:21.540245 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14e4f50b-e577-497e-9ffb-24e3fe0e93c5-config" (OuterVolumeSpecName: "config") pod "14e4f50b-e577-497e-9ffb-24e3fe0e93c5" (UID: "14e4f50b-e577-497e-9ffb-24e3fe0e93c5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:40:21 crc kubenswrapper[4854]: I1007 12:40:21.545118 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c261705-13dc-4a3b-b7ad-6e31f546cb50-kube-api-access-l9fgb" (OuterVolumeSpecName: "kube-api-access-l9fgb") pod "0c261705-13dc-4a3b-b7ad-6e31f546cb50" (UID: "0c261705-13dc-4a3b-b7ad-6e31f546cb50"). InnerVolumeSpecName "kube-api-access-l9fgb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:40:21 crc kubenswrapper[4854]: I1007 12:40:21.546556 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14e4f50b-e577-497e-9ffb-24e3fe0e93c5-kube-api-access-4zgz4" (OuterVolumeSpecName: "kube-api-access-4zgz4") pod "14e4f50b-e577-497e-9ffb-24e3fe0e93c5" (UID: "14e4f50b-e577-497e-9ffb-24e3fe0e93c5"). InnerVolumeSpecName "kube-api-access-4zgz4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:40:21 crc kubenswrapper[4854]: I1007 12:40:21.641551 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zgz4\" (UniqueName: \"kubernetes.io/projected/14e4f50b-e577-497e-9ffb-24e3fe0e93c5-kube-api-access-4zgz4\") on node \"crc\" DevicePath \"\"" Oct 07 12:40:21 crc kubenswrapper[4854]: I1007 12:40:21.641587 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9fgb\" (UniqueName: \"kubernetes.io/projected/0c261705-13dc-4a3b-b7ad-6e31f546cb50-kube-api-access-l9fgb\") on node \"crc\" DevicePath \"\"" Oct 07 12:40:21 crc kubenswrapper[4854]: I1007 12:40:21.641598 4854 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0c261705-13dc-4a3b-b7ad-6e31f546cb50-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 12:40:21 crc kubenswrapper[4854]: I1007 12:40:21.641607 4854 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c261705-13dc-4a3b-b7ad-6e31f546cb50-config\") on node \"crc\" DevicePath \"\"" Oct 07 12:40:21 crc kubenswrapper[4854]: I1007 12:40:21.641617 4854 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14e4f50b-e577-497e-9ffb-24e3fe0e93c5-config\") on node \"crc\" DevicePath \"\"" Oct 07 12:40:21 crc kubenswrapper[4854]: I1007 12:40:21.711573 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 07 12:40:21 crc kubenswrapper[4854]: I1007 12:40:21.979691 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"dbbd823b-2e1d-4901-855a-72cd9a13a6fd","Type":"ContainerStarted","Data":"bb54c736c2d80891bb8b405ef6a3e05368ba386d2418cd4bc7daf465ad50f464"} Oct 07 12:40:21 crc kubenswrapper[4854]: I1007 12:40:21.981886 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-9nxdf" event={"ID":"14e4f50b-e577-497e-9ffb-24e3fe0e93c5","Type":"ContainerDied","Data":"266c107e72b470a1facfd51636f3326837a55bcd855426e38e557c5f87976189"} Oct 07 12:40:21 crc kubenswrapper[4854]: I1007 12:40:21.981943 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-9nxdf" Oct 07 12:40:21 crc kubenswrapper[4854]: I1007 12:40:21.985725 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-5nwlr" event={"ID":"0c261705-13dc-4a3b-b7ad-6e31f546cb50","Type":"ContainerDied","Data":"9de9c1927ef5ffb63b991d2008f60e6f61d9178c40dd2ce5b980833e17202448"} Oct 07 12:40:21 crc kubenswrapper[4854]: I1007 12:40:21.985768 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-5nwlr" Oct 07 12:40:21 crc kubenswrapper[4854]: W1007 12:40:21.991777 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f63842c_f85f_4e07_8221_4ce96b22bf44.slice/crio-783c7da94febb249bab5774efd43f0f500c8148eb2f5c48b8262c6b322e93806 WatchSource:0}: Error finding container 783c7da94febb249bab5774efd43f0f500c8148eb2f5c48b8262c6b322e93806: Status 404 returned error can't find the container with id 783c7da94febb249bab5774efd43f0f500c8148eb2f5c48b8262c6b322e93806 Oct 07 12:40:22 crc kubenswrapper[4854]: I1007 12:40:22.012287 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-j5h2b"] Oct 07 12:40:22 crc kubenswrapper[4854]: I1007 12:40:22.058244 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-9nxdf"] Oct 07 12:40:22 crc kubenswrapper[4854]: I1007 12:40:22.062707 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-9nxdf"] Oct 07 12:40:22 crc kubenswrapper[4854]: I1007 12:40:22.072429 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-5nwlr"] Oct 07 12:40:22 crc kubenswrapper[4854]: I1007 12:40:22.076929 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-5nwlr"] Oct 07 12:40:22 crc kubenswrapper[4854]: W1007 12:40:22.311216 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod847eb385_fc80_4568_813d_638dac11d81a.slice/crio-8d8bfd0c8e1c01256c77de2d1efaa967dc06074cc98619a0d133d5d16b4687b4 WatchSource:0}: Error finding container 8d8bfd0c8e1c01256c77de2d1efaa967dc06074cc98619a0d133d5d16b4687b4: Status 404 returned error can't find the container with id 8d8bfd0c8e1c01256c77de2d1efaa967dc06074cc98619a0d133d5d16b4687b4 Oct 07 12:40:22 crc kubenswrapper[4854]: I1007 12:40:22.713783 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c261705-13dc-4a3b-b7ad-6e31f546cb50" path="/var/lib/kubelet/pods/0c261705-13dc-4a3b-b7ad-6e31f546cb50/volumes" Oct 07 12:40:22 crc kubenswrapper[4854]: I1007 12:40:22.714315 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14e4f50b-e577-497e-9ffb-24e3fe0e93c5" path="/var/lib/kubelet/pods/14e4f50b-e577-497e-9ffb-24e3fe0e93c5/volumes" Oct 07 12:40:22 crc kubenswrapper[4854]: I1007 12:40:22.994622 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-j5h2b" event={"ID":"847eb385-fc80-4568-813d-638dac11d81a","Type":"ContainerStarted","Data":"8d8bfd0c8e1c01256c77de2d1efaa967dc06074cc98619a0d133d5d16b4687b4"} Oct 07 12:40:22 crc kubenswrapper[4854]: I1007 12:40:22.995909 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"6f63842c-f85f-4e07-8221-4ce96b22bf44","Type":"ContainerStarted","Data":"783c7da94febb249bab5774efd43f0f500c8148eb2f5c48b8262c6b322e93806"} Oct 07 12:40:25 crc kubenswrapper[4854]: I1007 12:40:25.009578 4854 generic.go:334] "Generic (PLEG): container finished" podID="e599e18f-63c0-4756-845c-973257921fd0" containerID="9c4dd23c8a3639b3bf49f7fd9dbee5833829885f4ce50007c496ab2ed016afff" exitCode=0 Oct 07 12:40:25 crc kubenswrapper[4854]: I1007 12:40:25.009653 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"e599e18f-63c0-4756-845c-973257921fd0","Type":"ContainerDied","Data":"9c4dd23c8a3639b3bf49f7fd9dbee5833829885f4ce50007c496ab2ed016afff"} Oct 07 12:40:25 crc kubenswrapper[4854]: I1007 12:40:25.013104 4854 generic.go:334] "Generic (PLEG): container finished" podID="f725ba88-4d40-4eab-890d-e114448fabe9" containerID="17b9a36db2be4e7a076fcf2c8b6464f79d7b5f4ecc9432981400b860803f26bc" exitCode=0 Oct 07 12:40:25 crc kubenswrapper[4854]: I1007 12:40:25.013127 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f725ba88-4d40-4eab-890d-e114448fabe9","Type":"ContainerDied","Data":"17b9a36db2be4e7a076fcf2c8b6464f79d7b5f4ecc9432981400b860803f26bc"} Oct 07 12:40:28 crc kubenswrapper[4854]: I1007 12:40:28.045178 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"6f63842c-f85f-4e07-8221-4ce96b22bf44","Type":"ContainerStarted","Data":"746867422f38d038a9f9c789e768db3e7fd483a9e53e306db5812c58e0fc6129"} Oct 07 12:40:28 crc kubenswrapper[4854]: I1007 12:40:28.049675 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f725ba88-4d40-4eab-890d-e114448fabe9","Type":"ContainerStarted","Data":"11b5b9529231881fc76b4bbf3d8f51ee97af2d3afad7b47b620ba4ffbcd4d4ec"} Oct 07 12:40:28 crc kubenswrapper[4854]: I1007 12:40:28.056808 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e599e18f-63c0-4756-845c-973257921fd0","Type":"ContainerStarted","Data":"00cf8ccd1bcd63c009deb5be5b2933469ec3d1b0a13db00b6361bb703ec2d7fc"} Oct 07 12:40:28 crc kubenswrapper[4854]: I1007 12:40:28.059560 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"dbbd823b-2e1d-4901-855a-72cd9a13a6fd","Type":"ContainerStarted","Data":"f92c3a568292b0a3d969cd5adf77c55dbce4f1e154b300face2ca6a1e33c82df"} Oct 07 12:40:28 crc kubenswrapper[4854]: I1007 12:40:28.062783 4854 generic.go:334] "Generic (PLEG): container finished" podID="847eb385-fc80-4568-813d-638dac11d81a" containerID="f75d1ca815bc77a4c9197c11735a40bdd70e1b2d81354ef594fe6844420ca14c" exitCode=0 Oct 07 12:40:28 crc kubenswrapper[4854]: I1007 12:40:28.062852 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-j5h2b" event={"ID":"847eb385-fc80-4568-813d-638dac11d81a","Type":"ContainerDied","Data":"f75d1ca815bc77a4c9197c11735a40bdd70e1b2d81354ef594fe6844420ca14c"} Oct 07 12:40:28 crc kubenswrapper[4854]: I1007 12:40:28.065902 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"6c132b4c-1591-4194-8912-637f54cea863","Type":"ContainerStarted","Data":"a2b65fae5dce349adef0285c33c91c4d9dbae7be68e750c07159d89c13fba861"} Oct 07 12:40:28 crc kubenswrapper[4854]: I1007 12:40:28.069218 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hllqq" event={"ID":"6e6702f3-b113-49f9-b85f-a2d294bac6dc","Type":"ContainerStarted","Data":"26e52344aa056f49042bb8a78b4be03e2be3272b704fe00a7d0d27c06de21f1c"} Oct 07 12:40:28 crc kubenswrapper[4854]: I1007 12:40:28.069479 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-hllqq" Oct 07 12:40:28 crc kubenswrapper[4854]: I1007 12:40:28.080649 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=-9223371995.774149 podStartE2EDuration="41.080626196s" 
podCreationTimestamp="2025-10-07 12:39:47 +0000 UTC" firstStartedPulling="2025-10-07 12:39:49.767971853 +0000 UTC m=+905.755804108" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:40:28.070444957 +0000 UTC m=+944.058277232" watchObservedRunningTime="2025-10-07 12:40:28.080626196 +0000 UTC m=+944.068458461" Oct 07 12:40:28 crc kubenswrapper[4854]: I1007 12:40:28.091381 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=31.04637831 podStartE2EDuration="37.091356662s" podCreationTimestamp="2025-10-07 12:39:51 +0000 UTC" firstStartedPulling="2025-10-07 12:40:20.783461258 +0000 UTC m=+936.771293503" lastFinishedPulling="2025-10-07 12:40:26.82843959 +0000 UTC m=+942.816271855" observedRunningTime="2025-10-07 12:40:28.090417225 +0000 UTC m=+944.078249490" watchObservedRunningTime="2025-10-07 12:40:28.091356662 +0000 UTC m=+944.079188917" Oct 07 12:40:28 crc kubenswrapper[4854]: I1007 12:40:28.136108 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-hllqq" podStartSLOduration=28.201307033 podStartE2EDuration="34.13608613s" podCreationTimestamp="2025-10-07 12:39:54 +0000 UTC" firstStartedPulling="2025-10-07 12:40:20.893359874 +0000 UTC m=+936.881192129" lastFinishedPulling="2025-10-07 12:40:26.828138971 +0000 UTC m=+942.815971226" observedRunningTime="2025-10-07 12:40:28.133841934 +0000 UTC m=+944.121674219" watchObservedRunningTime="2025-10-07 12:40:28.13608613 +0000 UTC m=+944.123918385" Oct 07 12:40:28 crc kubenswrapper[4854]: I1007 12:40:28.157886 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=10.317955978 podStartE2EDuration="42.157866501s" podCreationTimestamp="2025-10-07 12:39:46 +0000 UTC" firstStartedPulling="2025-10-07 12:39:48.391881198 +0000 UTC m=+904.379713443" lastFinishedPulling="2025-10-07 12:40:20.231791721 +0000 UTC m=+936.219623966" observedRunningTime="2025-10-07 12:40:28.149884146 +0000 UTC m=+944.137716401" watchObservedRunningTime="2025-10-07 12:40:28.157866501 +0000 UTC m=+944.145698756" Oct 07 12:40:29 crc kubenswrapper[4854]: I1007 12:40:29.083456 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-j5h2b" event={"ID":"847eb385-fc80-4568-813d-638dac11d81a","Type":"ContainerStarted","Data":"1c6a7b5092dfdf765b8d78edcb5848f5ab195e1b5fbac03359ca96b418f42d38"} Oct 07 12:40:29 crc kubenswrapper[4854]: I1007 12:40:29.083800 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-j5h2b" event={"ID":"847eb385-fc80-4568-813d-638dac11d81a","Type":"ContainerStarted","Data":"49664c4eacd10194cd4eaaf0aca67e1328ae47efa8cae6560f85b2f783b3a2b6"} Oct 07 12:40:29 crc kubenswrapper[4854]: I1007 12:40:29.083817 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-j5h2b" Oct 07 12:40:29 crc kubenswrapper[4854]: I1007 12:40:29.083829 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-j5h2b" Oct 07 12:40:29 crc kubenswrapper[4854]: I1007 12:40:29.088285 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"79513100-48d2-4e7b-ae14-888322cab8f3","Type":"ContainerStarted","Data":"9c7ddd3a4c8f213d021b724e370c377203bc8a7c36b48c8171f8d9f35b3f1843"} Oct 07 12:40:29 crc kubenswrapper[4854]: I1007 12:40:29.089043 4854 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 07 12:40:29 crc kubenswrapper[4854]: I1007 12:40:29.089994 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Oct 07 12:40:29 crc kubenswrapper[4854]: I1007 12:40:29.090043 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Oct 07 12:40:29 crc kubenswrapper[4854]: I1007 12:40:29.112338 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-j5h2b" podStartSLOduration=30.595072428 podStartE2EDuration="35.112314989s" podCreationTimestamp="2025-10-07 12:39:54 +0000 UTC" firstStartedPulling="2025-10-07 12:40:22.314137885 +0000 UTC m=+938.301970140" lastFinishedPulling="2025-10-07 12:40:26.831352775 +0000 UTC m=+942.819212701" observedRunningTime="2025-10-07 12:40:29.103532711 +0000 UTC m=+945.091364966" watchObservedRunningTime="2025-10-07 12:40:29.112314989 +0000 UTC m=+945.100147244" Oct 07 12:40:31 crc kubenswrapper[4854]: I1007 12:40:31.104929 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"6f63842c-f85f-4e07-8221-4ce96b22bf44","Type":"ContainerStarted","Data":"3b5428679fa70871d5e7daa3b281bec044977353bc58c91be36e9d4e54d19bb6"} Oct 07 12:40:31 crc kubenswrapper[4854]: I1007 12:40:31.117931 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"dbbd823b-2e1d-4901-855a-72cd9a13a6fd","Type":"ContainerStarted","Data":"cabf2a84de378bf0db16b49e75a56dc1796963bdeaadcbaca2e8f231777ff7d6"} Oct 07 12:40:31 crc kubenswrapper[4854]: I1007 12:40:31.129831 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=25.618120767 podStartE2EDuration="34.129809383s" podCreationTimestamp="2025-10-07 12:39:57 +0000 UTC" firstStartedPulling="2025-10-07 12:40:21.995642036 +0000 UTC m=+937.983474291" lastFinishedPulling="2025-10-07 12:40:30.507330652 +0000 UTC m=+946.495162907" observedRunningTime="2025-10-07 12:40:31.128809804 +0000 UTC m=+947.116642069" watchObservedRunningTime="2025-10-07 12:40:31.129809383 +0000 UTC m=+947.117641638" Oct 07 12:40:31 crc kubenswrapper[4854]: I1007 12:40:31.431415 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Oct 07 12:40:31 crc kubenswrapper[4854]: I1007 12:40:31.908587 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Oct 07 12:40:31 crc kubenswrapper[4854]: I1007 12:40:31.952273 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Oct 07 12:40:31 crc kubenswrapper[4854]: I1007 12:40:31.974820 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=28.436034466 podStartE2EDuration="37.974794238s" podCreationTimestamp="2025-10-07 12:39:54 +0000 UTC" firstStartedPulling="2025-10-07 12:40:20.972495625 +0000 UTC m=+936.960327880" lastFinishedPulling="2025-10-07 12:40:30.511255387 +0000 UTC m=+946.499087652" observedRunningTime="2025-10-07 12:40:31.157058016 +0000 UTC m=+947.144890281" watchObservedRunningTime="2025-10-07 12:40:31.974794238 +0000 UTC m=+947.962626503" Oct 07 12:40:32 crc kubenswrapper[4854]: I1007 12:40:32.127597 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/ovsdbserver-sb-0" Oct 07 12:40:32 crc kubenswrapper[4854]: I1007 12:40:32.177858 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Oct 07 12:40:32 crc kubenswrapper[4854]: I1007 12:40:32.429274 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-ct6nt"] Oct 07 12:40:32 crc kubenswrapper[4854]: I1007 12:40:32.431196 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Oct 07 12:40:32 crc kubenswrapper[4854]: I1007 12:40:32.479495 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-v2g7t"] Oct 07 12:40:32 crc kubenswrapper[4854]: I1007 12:40:32.480787 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-v2g7t" Oct 07 12:40:32 crc kubenswrapper[4854]: I1007 12:40:32.483400 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 07 12:40:32 crc kubenswrapper[4854]: I1007 12:40:32.495643 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-d4cpx"] Oct 07 12:40:32 crc kubenswrapper[4854]: I1007 12:40:32.496805 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-d4cpx" Oct 07 12:40:32 crc kubenswrapper[4854]: I1007 12:40:32.499638 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-v2g7t"] Oct 07 12:40:32 crc kubenswrapper[4854]: I1007 12:40:32.501929 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Oct 07 12:40:32 crc kubenswrapper[4854]: I1007 12:40:32.507921 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Oct 07 12:40:32 crc kubenswrapper[4854]: I1007 12:40:32.525273 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-d4cpx"] Oct 07 12:40:32 crc kubenswrapper[4854]: I1007 12:40:32.541429 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a5e39c6f-6431-4bf5-a6a9-4f0d1f88e904-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-v2g7t\" (UID: \"a5e39c6f-6431-4bf5-a6a9-4f0d1f88e904\") " pod="openstack/dnsmasq-dns-7f896c8c65-v2g7t" Oct 07 12:40:32 crc kubenswrapper[4854]: I1007 12:40:32.541501 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/47f8159b-e07f-47bd-92e8-a57f3e0c545d-ovn-rundir\") pod \"ovn-controller-metrics-d4cpx\" (UID: \"47f8159b-e07f-47bd-92e8-a57f3e0c545d\") " pod="openstack/ovn-controller-metrics-d4cpx" Oct 07 12:40:32 crc kubenswrapper[4854]: I1007 12:40:32.541571 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p28ml\" (UniqueName: \"kubernetes.io/projected/47f8159b-e07f-47bd-92e8-a57f3e0c545d-kube-api-access-p28ml\") pod \"ovn-controller-metrics-d4cpx\" (UID: \"47f8159b-e07f-47bd-92e8-a57f3e0c545d\") " pod="openstack/ovn-controller-metrics-d4cpx" Oct 07 12:40:32 crc kubenswrapper[4854]: I1007 12:40:32.541611 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47f8159b-e07f-47bd-92e8-a57f3e0c545d-config\") 
pod \"ovn-controller-metrics-d4cpx\" (UID: \"47f8159b-e07f-47bd-92e8-a57f3e0c545d\") " pod="openstack/ovn-controller-metrics-d4cpx" Oct 07 12:40:32 crc kubenswrapper[4854]: I1007 12:40:32.541635 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47f8159b-e07f-47bd-92e8-a57f3e0c545d-combined-ca-bundle\") pod \"ovn-controller-metrics-d4cpx\" (UID: \"47f8159b-e07f-47bd-92e8-a57f3e0c545d\") " pod="openstack/ovn-controller-metrics-d4cpx" Oct 07 12:40:32 crc kubenswrapper[4854]: I1007 12:40:32.541661 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/47f8159b-e07f-47bd-92e8-a57f3e0c545d-ovs-rundir\") pod \"ovn-controller-metrics-d4cpx\" (UID: \"47f8159b-e07f-47bd-92e8-a57f3e0c545d\") " pod="openstack/ovn-controller-metrics-d4cpx" Oct 07 12:40:32 crc kubenswrapper[4854]: I1007 12:40:32.541688 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/47f8159b-e07f-47bd-92e8-a57f3e0c545d-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-d4cpx\" (UID: \"47f8159b-e07f-47bd-92e8-a57f3e0c545d\") " pod="openstack/ovn-controller-metrics-d4cpx" Oct 07 12:40:32 crc kubenswrapper[4854]: I1007 12:40:32.541736 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzrnr\" (UniqueName: \"kubernetes.io/projected/a5e39c6f-6431-4bf5-a6a9-4f0d1f88e904-kube-api-access-nzrnr\") pod \"dnsmasq-dns-7f896c8c65-v2g7t\" (UID: \"a5e39c6f-6431-4bf5-a6a9-4f0d1f88e904\") " pod="openstack/dnsmasq-dns-7f896c8c65-v2g7t" Oct 07 12:40:32 crc kubenswrapper[4854]: I1007 12:40:32.541777 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a5e39c6f-6431-4bf5-a6a9-4f0d1f88e904-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-v2g7t\" (UID: \"a5e39c6f-6431-4bf5-a6a9-4f0d1f88e904\") " pod="openstack/dnsmasq-dns-7f896c8c65-v2g7t" Oct 07 12:40:32 crc kubenswrapper[4854]: I1007 12:40:32.541807 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5e39c6f-6431-4bf5-a6a9-4f0d1f88e904-config\") pod \"dnsmasq-dns-7f896c8c65-v2g7t\" (UID: \"a5e39c6f-6431-4bf5-a6a9-4f0d1f88e904\") " pod="openstack/dnsmasq-dns-7f896c8c65-v2g7t" Oct 07 12:40:32 crc kubenswrapper[4854]: I1007 12:40:32.643609 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/47f8159b-e07f-47bd-92e8-a57f3e0c545d-ovn-rundir\") pod \"ovn-controller-metrics-d4cpx\" (UID: \"47f8159b-e07f-47bd-92e8-a57f3e0c545d\") " pod="openstack/ovn-controller-metrics-d4cpx" Oct 07 12:40:32 crc kubenswrapper[4854]: I1007 12:40:32.643756 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p28ml\" (UniqueName: \"kubernetes.io/projected/47f8159b-e07f-47bd-92e8-a57f3e0c545d-kube-api-access-p28ml\") pod \"ovn-controller-metrics-d4cpx\" (UID: \"47f8159b-e07f-47bd-92e8-a57f3e0c545d\") " pod="openstack/ovn-controller-metrics-d4cpx" Oct 07 12:40:32 crc kubenswrapper[4854]: I1007 12:40:32.643821 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/47f8159b-e07f-47bd-92e8-a57f3e0c545d-config\") pod \"ovn-controller-metrics-d4cpx\" (UID: \"47f8159b-e07f-47bd-92e8-a57f3e0c545d\") " pod="openstack/ovn-controller-metrics-d4cpx" Oct 07 12:40:32 crc kubenswrapper[4854]: I1007 12:40:32.643843 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47f8159b-e07f-47bd-92e8-a57f3e0c545d-combined-ca-bundle\") pod \"ovn-controller-metrics-d4cpx\" (UID: \"47f8159b-e07f-47bd-92e8-a57f3e0c545d\") " pod="openstack/ovn-controller-metrics-d4cpx" Oct 07 12:40:32 crc kubenswrapper[4854]: I1007 12:40:32.643872 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/47f8159b-e07f-47bd-92e8-a57f3e0c545d-ovs-rundir\") pod \"ovn-controller-metrics-d4cpx\" (UID: \"47f8159b-e07f-47bd-92e8-a57f3e0c545d\") " pod="openstack/ovn-controller-metrics-d4cpx" Oct 07 12:40:32 crc kubenswrapper[4854]: I1007 12:40:32.643915 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/47f8159b-e07f-47bd-92e8-a57f3e0c545d-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-d4cpx\" (UID: \"47f8159b-e07f-47bd-92e8-a57f3e0c545d\") " pod="openstack/ovn-controller-metrics-d4cpx" Oct 07 12:40:32 crc kubenswrapper[4854]: I1007 12:40:32.643988 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzrnr\" (UniqueName: \"kubernetes.io/projected/a5e39c6f-6431-4bf5-a6a9-4f0d1f88e904-kube-api-access-nzrnr\") pod \"dnsmasq-dns-7f896c8c65-v2g7t\" (UID: \"a5e39c6f-6431-4bf5-a6a9-4f0d1f88e904\") " pod="openstack/dnsmasq-dns-7f896c8c65-v2g7t" Oct 07 12:40:32 crc kubenswrapper[4854]: I1007 12:40:32.644018 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a5e39c6f-6431-4bf5-a6a9-4f0d1f88e904-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-v2g7t\" (UID: \"a5e39c6f-6431-4bf5-a6a9-4f0d1f88e904\") " pod="openstack/dnsmasq-dns-7f896c8c65-v2g7t" Oct 07 12:40:32 crc kubenswrapper[4854]: I1007 12:40:32.644066 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5e39c6f-6431-4bf5-a6a9-4f0d1f88e904-config\") pod \"dnsmasq-dns-7f896c8c65-v2g7t\" (UID: \"a5e39c6f-6431-4bf5-a6a9-4f0d1f88e904\") " pod="openstack/dnsmasq-dns-7f896c8c65-v2g7t" Oct 07 12:40:32 crc kubenswrapper[4854]: I1007 12:40:32.644102 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a5e39c6f-6431-4bf5-a6a9-4f0d1f88e904-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-v2g7t\" (UID: \"a5e39c6f-6431-4bf5-a6a9-4f0d1f88e904\") " pod="openstack/dnsmasq-dns-7f896c8c65-v2g7t" Oct 07 12:40:32 crc kubenswrapper[4854]: I1007 12:40:32.644167 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/47f8159b-e07f-47bd-92e8-a57f3e0c545d-ovn-rundir\") pod \"ovn-controller-metrics-d4cpx\" (UID: \"47f8159b-e07f-47bd-92e8-a57f3e0c545d\") " pod="openstack/ovn-controller-metrics-d4cpx" Oct 07 12:40:32 crc kubenswrapper[4854]: I1007 12:40:32.644281 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/47f8159b-e07f-47bd-92e8-a57f3e0c545d-ovs-rundir\") pod 
\"ovn-controller-metrics-d4cpx\" (UID: \"47f8159b-e07f-47bd-92e8-a57f3e0c545d\") " pod="openstack/ovn-controller-metrics-d4cpx" Oct 07 12:40:32 crc kubenswrapper[4854]: I1007 12:40:32.644683 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47f8159b-e07f-47bd-92e8-a57f3e0c545d-config\") pod \"ovn-controller-metrics-d4cpx\" (UID: \"47f8159b-e07f-47bd-92e8-a57f3e0c545d\") " pod="openstack/ovn-controller-metrics-d4cpx" Oct 07 12:40:32 crc kubenswrapper[4854]: I1007 12:40:32.645235 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5e39c6f-6431-4bf5-a6a9-4f0d1f88e904-config\") pod \"dnsmasq-dns-7f896c8c65-v2g7t\" (UID: \"a5e39c6f-6431-4bf5-a6a9-4f0d1f88e904\") " pod="openstack/dnsmasq-dns-7f896c8c65-v2g7t" Oct 07 12:40:32 crc kubenswrapper[4854]: I1007 12:40:32.645389 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a5e39c6f-6431-4bf5-a6a9-4f0d1f88e904-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-v2g7t\" (UID: \"a5e39c6f-6431-4bf5-a6a9-4f0d1f88e904\") " pod="openstack/dnsmasq-dns-7f896c8c65-v2g7t" Oct 07 12:40:32 crc kubenswrapper[4854]: I1007 12:40:32.646007 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a5e39c6f-6431-4bf5-a6a9-4f0d1f88e904-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-v2g7t\" (UID: \"a5e39c6f-6431-4bf5-a6a9-4f0d1f88e904\") " pod="openstack/dnsmasq-dns-7f896c8c65-v2g7t" Oct 07 12:40:32 crc kubenswrapper[4854]: I1007 12:40:32.651489 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/47f8159b-e07f-47bd-92e8-a57f3e0c545d-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-d4cpx\" (UID: \"47f8159b-e07f-47bd-92e8-a57f3e0c545d\") " pod="openstack/ovn-controller-metrics-d4cpx" Oct 07 12:40:32 crc kubenswrapper[4854]: I1007 12:40:32.670587 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47f8159b-e07f-47bd-92e8-a57f3e0c545d-combined-ca-bundle\") pod \"ovn-controller-metrics-d4cpx\" (UID: \"47f8159b-e07f-47bd-92e8-a57f3e0c545d\") " pod="openstack/ovn-controller-metrics-d4cpx" Oct 07 12:40:32 crc kubenswrapper[4854]: I1007 12:40:32.670812 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p28ml\" (UniqueName: \"kubernetes.io/projected/47f8159b-e07f-47bd-92e8-a57f3e0c545d-kube-api-access-p28ml\") pod \"ovn-controller-metrics-d4cpx\" (UID: \"47f8159b-e07f-47bd-92e8-a57f3e0c545d\") " pod="openstack/ovn-controller-metrics-d4cpx" Oct 07 12:40:32 crc kubenswrapper[4854]: I1007 12:40:32.680091 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzrnr\" (UniqueName: \"kubernetes.io/projected/a5e39c6f-6431-4bf5-a6a9-4f0d1f88e904-kube-api-access-nzrnr\") pod \"dnsmasq-dns-7f896c8c65-v2g7t\" (UID: \"a5e39c6f-6431-4bf5-a6a9-4f0d1f88e904\") " pod="openstack/dnsmasq-dns-7f896c8c65-v2g7t" Oct 07 12:40:32 crc kubenswrapper[4854]: I1007 12:40:32.796015 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-cs286"] Oct 07 12:40:32 crc kubenswrapper[4854]: I1007 12:40:32.797922 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-v2g7t" Oct 07 12:40:32 crc kubenswrapper[4854]: I1007 12:40:32.814908 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-d4cpx" Oct 07 12:40:32 crc kubenswrapper[4854]: I1007 12:40:32.903211 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-pwbvk"] Oct 07 12:40:32 crc kubenswrapper[4854]: I1007 12:40:32.904675 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-pwbvk" Oct 07 12:40:32 crc kubenswrapper[4854]: I1007 12:40:32.912054 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 07 12:40:32 crc kubenswrapper[4854]: I1007 12:40:32.916922 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-pwbvk"] Oct 07 12:40:32 crc kubenswrapper[4854]: I1007 12:40:32.981498 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clnl9\" (UniqueName: \"kubernetes.io/projected/0979bfc9-c30c-4e15-bf34-2ad6ce892212-kube-api-access-clnl9\") pod \"dnsmasq-dns-86db49b7ff-pwbvk\" (UID: \"0979bfc9-c30c-4e15-bf34-2ad6ce892212\") " pod="openstack/dnsmasq-dns-86db49b7ff-pwbvk" Oct 07 12:40:32 crc kubenswrapper[4854]: I1007 12:40:32.981551 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0979bfc9-c30c-4e15-bf34-2ad6ce892212-config\") pod \"dnsmasq-dns-86db49b7ff-pwbvk\" (UID: \"0979bfc9-c30c-4e15-bf34-2ad6ce892212\") " pod="openstack/dnsmasq-dns-86db49b7ff-pwbvk" Oct 07 12:40:32 crc kubenswrapper[4854]: I1007 12:40:32.981581 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0979bfc9-c30c-4e15-bf34-2ad6ce892212-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-pwbvk\" (UID: \"0979bfc9-c30c-4e15-bf34-2ad6ce892212\") " pod="openstack/dnsmasq-dns-86db49b7ff-pwbvk" Oct 07 12:40:32 crc kubenswrapper[4854]: I1007 12:40:32.981602 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0979bfc9-c30c-4e15-bf34-2ad6ce892212-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-pwbvk\" (UID: \"0979bfc9-c30c-4e15-bf34-2ad6ce892212\") " pod="openstack/dnsmasq-dns-86db49b7ff-pwbvk" Oct 07 12:40:32 crc kubenswrapper[4854]: I1007 12:40:32.981649 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0979bfc9-c30c-4e15-bf34-2ad6ce892212-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-pwbvk\" (UID: \"0979bfc9-c30c-4e15-bf34-2ad6ce892212\") " pod="openstack/dnsmasq-dns-86db49b7ff-pwbvk" Oct 07 12:40:32 crc kubenswrapper[4854]: I1007 12:40:32.981795 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-ct6nt" Oct 07 12:40:33 crc kubenswrapper[4854]: I1007 12:40:33.086920 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5aa9a983-c51a-4a92-9b76-cd1432f92f57-config\") pod \"5aa9a983-c51a-4a92-9b76-cd1432f92f57\" (UID: \"5aa9a983-c51a-4a92-9b76-cd1432f92f57\") " Oct 07 12:40:33 crc kubenswrapper[4854]: I1007 12:40:33.087396 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhhv9\" (UniqueName: \"kubernetes.io/projected/5aa9a983-c51a-4a92-9b76-cd1432f92f57-kube-api-access-bhhv9\") pod \"5aa9a983-c51a-4a92-9b76-cd1432f92f57\" (UID: \"5aa9a983-c51a-4a92-9b76-cd1432f92f57\") " Oct 07 12:40:33 crc kubenswrapper[4854]: I1007 12:40:33.087417 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5aa9a983-c51a-4a92-9b76-cd1432f92f57-dns-svc\") pod \"5aa9a983-c51a-4a92-9b76-cd1432f92f57\" (UID: \"5aa9a983-c51a-4a92-9b76-cd1432f92f57\") " Oct 07 12:40:33 crc kubenswrapper[4854]: I1007 12:40:33.087643 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0979bfc9-c30c-4e15-bf34-2ad6ce892212-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-pwbvk\" (UID: \"0979bfc9-c30c-4e15-bf34-2ad6ce892212\") " pod="openstack/dnsmasq-dns-86db49b7ff-pwbvk" Oct 07 12:40:33 crc kubenswrapper[4854]: I1007 12:40:33.087720 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clnl9\" (UniqueName: \"kubernetes.io/projected/0979bfc9-c30c-4e15-bf34-2ad6ce892212-kube-api-access-clnl9\") pod \"dnsmasq-dns-86db49b7ff-pwbvk\" (UID: \"0979bfc9-c30c-4e15-bf34-2ad6ce892212\") " pod="openstack/dnsmasq-dns-86db49b7ff-pwbvk" Oct 07 12:40:33 crc kubenswrapper[4854]: I1007 12:40:33.087743 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0979bfc9-c30c-4e15-bf34-2ad6ce892212-config\") pod \"dnsmasq-dns-86db49b7ff-pwbvk\" (UID: \"0979bfc9-c30c-4e15-bf34-2ad6ce892212\") " pod="openstack/dnsmasq-dns-86db49b7ff-pwbvk" Oct 07 12:40:33 crc kubenswrapper[4854]: I1007 12:40:33.087775 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0979bfc9-c30c-4e15-bf34-2ad6ce892212-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-pwbvk\" (UID: \"0979bfc9-c30c-4e15-bf34-2ad6ce892212\") " pod="openstack/dnsmasq-dns-86db49b7ff-pwbvk" Oct 07 12:40:33 crc kubenswrapper[4854]: I1007 12:40:33.087800 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0979bfc9-c30c-4e15-bf34-2ad6ce892212-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-pwbvk\" (UID: \"0979bfc9-c30c-4e15-bf34-2ad6ce892212\") " pod="openstack/dnsmasq-dns-86db49b7ff-pwbvk" Oct 07 12:40:33 crc kubenswrapper[4854]: I1007 12:40:33.088669 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0979bfc9-c30c-4e15-bf34-2ad6ce892212-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-pwbvk\" (UID: \"0979bfc9-c30c-4e15-bf34-2ad6ce892212\") " pod="openstack/dnsmasq-dns-86db49b7ff-pwbvk" Oct 07 12:40:33 crc kubenswrapper[4854]: I1007 12:40:33.088763 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0979bfc9-c30c-4e15-bf34-2ad6ce892212-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-pwbvk\" (UID: \"0979bfc9-c30c-4e15-bf34-2ad6ce892212\") " pod="openstack/dnsmasq-dns-86db49b7ff-pwbvk" Oct 07 12:40:33 crc kubenswrapper[4854]: I1007 12:40:33.089221 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5aa9a983-c51a-4a92-9b76-cd1432f92f57-config" (OuterVolumeSpecName: "config") pod "5aa9a983-c51a-4a92-9b76-cd1432f92f57" (UID: "5aa9a983-c51a-4a92-9b76-cd1432f92f57"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:40:33 crc kubenswrapper[4854]: I1007 12:40:33.089476 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0979bfc9-c30c-4e15-bf34-2ad6ce892212-config\") pod \"dnsmasq-dns-86db49b7ff-pwbvk\" (UID: \"0979bfc9-c30c-4e15-bf34-2ad6ce892212\") " pod="openstack/dnsmasq-dns-86db49b7ff-pwbvk" Oct 07 12:40:33 crc kubenswrapper[4854]: I1007 12:40:33.089970 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5aa9a983-c51a-4a92-9b76-cd1432f92f57-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5aa9a983-c51a-4a92-9b76-cd1432f92f57" (UID: "5aa9a983-c51a-4a92-9b76-cd1432f92f57"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:40:33 crc kubenswrapper[4854]: I1007 12:40:33.089981 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0979bfc9-c30c-4e15-bf34-2ad6ce892212-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-pwbvk\" (UID: \"0979bfc9-c30c-4e15-bf34-2ad6ce892212\") " pod="openstack/dnsmasq-dns-86db49b7ff-pwbvk" Oct 07 12:40:33 crc kubenswrapper[4854]: I1007 12:40:33.093524 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5aa9a983-c51a-4a92-9b76-cd1432f92f57-kube-api-access-bhhv9" (OuterVolumeSpecName: "kube-api-access-bhhv9") pod "5aa9a983-c51a-4a92-9b76-cd1432f92f57" (UID: "5aa9a983-c51a-4a92-9b76-cd1432f92f57"). InnerVolumeSpecName "kube-api-access-bhhv9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:40:33 crc kubenswrapper[4854]: I1007 12:40:33.116771 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clnl9\" (UniqueName: \"kubernetes.io/projected/0979bfc9-c30c-4e15-bf34-2ad6ce892212-kube-api-access-clnl9\") pod \"dnsmasq-dns-86db49b7ff-pwbvk\" (UID: \"0979bfc9-c30c-4e15-bf34-2ad6ce892212\") " pod="openstack/dnsmasq-dns-86db49b7ff-pwbvk" Oct 07 12:40:33 crc kubenswrapper[4854]: I1007 12:40:33.135549 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4c293f13-b2a5-4d4b-9f69-fd118e34eab2","Type":"ContainerStarted","Data":"e014726a2ceb29d439118ca6c1761fd0aaf31c3c217a1281253c7e737429cda0"} Oct 07 12:40:33 crc kubenswrapper[4854]: I1007 12:40:33.138948 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-ct6nt" Oct 07 12:40:33 crc kubenswrapper[4854]: I1007 12:40:33.139622 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-ct6nt" event={"ID":"5aa9a983-c51a-4a92-9b76-cd1432f92f57","Type":"ContainerDied","Data":"57c0f6cf9b0cdc5e7790bf7f8f4eb70e7ddf9ee0077afa132a1586cd918c8bf8"} Oct 07 12:40:33 crc kubenswrapper[4854]: I1007 12:40:33.189570 4854 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5aa9a983-c51a-4a92-9b76-cd1432f92f57-config\") on node \"crc\" DevicePath \"\"" Oct 07 12:40:33 crc kubenswrapper[4854]: I1007 12:40:33.189603 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhhv9\" (UniqueName: \"kubernetes.io/projected/5aa9a983-c51a-4a92-9b76-cd1432f92f57-kube-api-access-bhhv9\") on node \"crc\" DevicePath \"\"" Oct 07 12:40:33 crc kubenswrapper[4854]: I1007 12:40:33.189612 4854 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5aa9a983-c51a-4a92-9b76-cd1432f92f57-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 12:40:33 crc kubenswrapper[4854]: I1007 12:40:33.213947 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Oct 07 12:40:33 crc kubenswrapper[4854]: I1007 12:40:33.238632 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-ct6nt"] Oct 07 12:40:33 crc kubenswrapper[4854]: I1007 12:40:33.251982 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-ct6nt"] Oct 07 12:40:33 crc kubenswrapper[4854]: I1007 12:40:33.318331 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-pwbvk" Oct 07 12:40:33 crc kubenswrapper[4854]: I1007 12:40:33.405454 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-v2g7t"] Oct 07 12:40:33 crc kubenswrapper[4854]: I1007 12:40:33.533131 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-d4cpx"] Oct 07 12:40:33 crc kubenswrapper[4854]: I1007 12:40:33.540043 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Oct 07 12:40:33 crc kubenswrapper[4854]: I1007 12:40:33.542045 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Oct 07 12:40:33 crc kubenswrapper[4854]: I1007 12:40:33.547384 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-8hj7z" Oct 07 12:40:33 crc kubenswrapper[4854]: I1007 12:40:33.547436 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Oct 07 12:40:33 crc kubenswrapper[4854]: I1007 12:40:33.547689 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Oct 07 12:40:33 crc kubenswrapper[4854]: I1007 12:40:33.556366 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Oct 07 12:40:33 crc kubenswrapper[4854]: I1007 12:40:33.566434 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 07 12:40:33 crc kubenswrapper[4854]: I1007 12:40:33.595931 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/698aae03-92da-4cc2-a9d2-ecdb5f143439-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"698aae03-92da-4cc2-a9d2-ecdb5f143439\") " pod="openstack/ovn-northd-0" Oct 07 12:40:33 crc kubenswrapper[4854]: I1007 12:40:33.596016 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/698aae03-92da-4cc2-a9d2-ecdb5f143439-scripts\") pod \"ovn-northd-0\" (UID: \"698aae03-92da-4cc2-a9d2-ecdb5f143439\") " pod="openstack/ovn-northd-0" Oct 07 12:40:33 crc kubenswrapper[4854]: I1007 12:40:33.596039 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmz8x\" (UniqueName: \"kubernetes.io/projected/698aae03-92da-4cc2-a9d2-ecdb5f143439-kube-api-access-bmz8x\") pod \"ovn-northd-0\" (UID: \"698aae03-92da-4cc2-a9d2-ecdb5f143439\") " pod="openstack/ovn-northd-0" Oct 07 12:40:33 crc kubenswrapper[4854]: I1007 12:40:33.596062 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/698aae03-92da-4cc2-a9d2-ecdb5f143439-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"698aae03-92da-4cc2-a9d2-ecdb5f143439\") " pod="openstack/ovn-northd-0" Oct 07 12:40:33 crc kubenswrapper[4854]: I1007 12:40:33.596188 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/698aae03-92da-4cc2-a9d2-ecdb5f143439-config\") pod \"ovn-northd-0\" (UID: \"698aae03-92da-4cc2-a9d2-ecdb5f143439\") " pod="openstack/ovn-northd-0" Oct 07 12:40:33 crc kubenswrapper[4854]: I1007 12:40:33.596272 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/698aae03-92da-4cc2-a9d2-ecdb5f143439-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"698aae03-92da-4cc2-a9d2-ecdb5f143439\") " pod="openstack/ovn-northd-0" Oct 07 12:40:33 crc kubenswrapper[4854]: I1007 12:40:33.596401 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/698aae03-92da-4cc2-a9d2-ecdb5f143439-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"698aae03-92da-4cc2-a9d2-ecdb5f143439\") " pod="openstack/ovn-northd-0" Oct 07 12:40:33 crc kubenswrapper[4854]: 
I1007 12:40:33.611776 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-pwbvk"] Oct 07 12:40:33 crc kubenswrapper[4854]: I1007 12:40:33.698030 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/698aae03-92da-4cc2-a9d2-ecdb5f143439-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"698aae03-92da-4cc2-a9d2-ecdb5f143439\") " pod="openstack/ovn-northd-0" Oct 07 12:40:33 crc kubenswrapper[4854]: I1007 12:40:33.698124 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/698aae03-92da-4cc2-a9d2-ecdb5f143439-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"698aae03-92da-4cc2-a9d2-ecdb5f143439\") " pod="openstack/ovn-northd-0" Oct 07 12:40:33 crc kubenswrapper[4854]: I1007 12:40:33.698256 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/698aae03-92da-4cc2-a9d2-ecdb5f143439-scripts\") pod \"ovn-northd-0\" (UID: \"698aae03-92da-4cc2-a9d2-ecdb5f143439\") " pod="openstack/ovn-northd-0" Oct 07 12:40:33 crc kubenswrapper[4854]: I1007 12:40:33.698296 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmz8x\" (UniqueName: \"kubernetes.io/projected/698aae03-92da-4cc2-a9d2-ecdb5f143439-kube-api-access-bmz8x\") pod \"ovn-northd-0\" (UID: \"698aae03-92da-4cc2-a9d2-ecdb5f143439\") " pod="openstack/ovn-northd-0" Oct 07 12:40:33 crc kubenswrapper[4854]: I1007 12:40:33.698327 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/698aae03-92da-4cc2-a9d2-ecdb5f143439-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"698aae03-92da-4cc2-a9d2-ecdb5f143439\") " pod="openstack/ovn-northd-0" Oct 07 12:40:33 crc kubenswrapper[4854]: I1007 12:40:33.698359 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/698aae03-92da-4cc2-a9d2-ecdb5f143439-config\") pod \"ovn-northd-0\" (UID: \"698aae03-92da-4cc2-a9d2-ecdb5f143439\") " pod="openstack/ovn-northd-0" Oct 07 12:40:33 crc kubenswrapper[4854]: I1007 12:40:33.698386 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/698aae03-92da-4cc2-a9d2-ecdb5f143439-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"698aae03-92da-4cc2-a9d2-ecdb5f143439\") " pod="openstack/ovn-northd-0" Oct 07 12:40:33 crc kubenswrapper[4854]: I1007 12:40:33.704088 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/698aae03-92da-4cc2-a9d2-ecdb5f143439-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"698aae03-92da-4cc2-a9d2-ecdb5f143439\") " pod="openstack/ovn-northd-0" Oct 07 12:40:33 crc kubenswrapper[4854]: I1007 12:40:33.704480 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/698aae03-92da-4cc2-a9d2-ecdb5f143439-scripts\") pod \"ovn-northd-0\" (UID: \"698aae03-92da-4cc2-a9d2-ecdb5f143439\") " pod="openstack/ovn-northd-0" Oct 07 12:40:33 crc kubenswrapper[4854]: I1007 12:40:33.704543 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/698aae03-92da-4cc2-a9d2-ecdb5f143439-config\") pod 
\"ovn-northd-0\" (UID: \"698aae03-92da-4cc2-a9d2-ecdb5f143439\") " pod="openstack/ovn-northd-0" Oct 07 12:40:33 crc kubenswrapper[4854]: I1007 12:40:33.704863 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/698aae03-92da-4cc2-a9d2-ecdb5f143439-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"698aae03-92da-4cc2-a9d2-ecdb5f143439\") " pod="openstack/ovn-northd-0" Oct 07 12:40:33 crc kubenswrapper[4854]: I1007 12:40:33.715335 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/698aae03-92da-4cc2-a9d2-ecdb5f143439-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"698aae03-92da-4cc2-a9d2-ecdb5f143439\") " pod="openstack/ovn-northd-0" Oct 07 12:40:33 crc kubenswrapper[4854]: I1007 12:40:33.715433 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/698aae03-92da-4cc2-a9d2-ecdb5f143439-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"698aae03-92da-4cc2-a9d2-ecdb5f143439\") " pod="openstack/ovn-northd-0" Oct 07 12:40:33 crc kubenswrapper[4854]: I1007 12:40:33.721896 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmz8x\" (UniqueName: \"kubernetes.io/projected/698aae03-92da-4cc2-a9d2-ecdb5f143439-kube-api-access-bmz8x\") pod \"ovn-northd-0\" (UID: \"698aae03-92da-4cc2-a9d2-ecdb5f143439\") " pod="openstack/ovn-northd-0" Oct 07 12:40:33 crc kubenswrapper[4854]: I1007 12:40:33.865340 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 07 12:40:33 crc kubenswrapper[4854]: W1007 12:40:33.931725 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda5e39c6f_6431_4bf5_a6a9_4f0d1f88e904.slice/crio-5fb8652c76503130b60fbb5b7b9db536de8ce87df4de9aa2f6fe424814bd6a42 WatchSource:0}: Error finding container 5fb8652c76503130b60fbb5b7b9db536de8ce87df4de9aa2f6fe424814bd6a42: Status 404 returned error can't find the container with id 5fb8652c76503130b60fbb5b7b9db536de8ce87df4de9aa2f6fe424814bd6a42 Oct 07 12:40:33 crc kubenswrapper[4854]: W1007 12:40:33.946761 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0979bfc9_c30c_4e15_bf34_2ad6ce892212.slice/crio-a84721e217372c2459d84361cb6466b4222cc0d76f6078d142b44c63b82eee10 WatchSource:0}: Error finding container a84721e217372c2459d84361cb6466b4222cc0d76f6078d142b44c63b82eee10: Status 404 returned error can't find the container with id a84721e217372c2459d84361cb6466b4222cc0d76f6078d142b44c63b82eee10 Oct 07 12:40:34 crc kubenswrapper[4854]: I1007 12:40:34.162815 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-v2g7t" event={"ID":"a5e39c6f-6431-4bf5-a6a9-4f0d1f88e904","Type":"ContainerStarted","Data":"5fb8652c76503130b60fbb5b7b9db536de8ce87df4de9aa2f6fe424814bd6a42"} Oct 07 12:40:34 crc kubenswrapper[4854]: I1007 12:40:34.166099 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-d4cpx" event={"ID":"47f8159b-e07f-47bd-92e8-a57f3e0c545d","Type":"ContainerStarted","Data":"70cd323ec7510604eb6a540c69ab38d09f868b8b9b462b34ef8aa05cc1da4a28"} Oct 07 12:40:34 crc kubenswrapper[4854]: I1007 12:40:34.168258 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-86db49b7ff-pwbvk" event={"ID":"0979bfc9-c30c-4e15-bf34-2ad6ce892212","Type":"ContainerStarted","Data":"a84721e217372c2459d84361cb6466b4222cc0d76f6078d142b44c63b82eee10"} Oct 07 12:40:34 crc kubenswrapper[4854]: I1007 12:40:34.550489 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 07 12:40:34 crc kubenswrapper[4854]: W1007 12:40:34.563279 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod698aae03_92da_4cc2_a9d2_ecdb5f143439.slice/crio-147c22642e55d4e465f5e81792155fda570f16630ec6fc54a0deefbceed4dcaf WatchSource:0}: Error finding container 147c22642e55d4e465f5e81792155fda570f16630ec6fc54a0deefbceed4dcaf: Status 404 returned error can't find the container with id 147c22642e55d4e465f5e81792155fda570f16630ec6fc54a0deefbceed4dcaf Oct 07 12:40:34 crc kubenswrapper[4854]: I1007 12:40:34.714579 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5aa9a983-c51a-4a92-9b76-cd1432f92f57" path="/var/lib/kubelet/pods/5aa9a983-c51a-4a92-9b76-cd1432f92f57/volumes" Oct 07 12:40:35 crc kubenswrapper[4854]: I1007 12:40:35.171576 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Oct 07 12:40:35 crc kubenswrapper[4854]: I1007 12:40:35.176413 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-d4cpx" event={"ID":"47f8159b-e07f-47bd-92e8-a57f3e0c545d","Type":"ContainerStarted","Data":"3e6647d1edb8724d040f38fe0892860ca4c3bb6eab536177c27f549d3f099144"} Oct 07 12:40:35 crc kubenswrapper[4854]: I1007 12:40:35.181359 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"698aae03-92da-4cc2-a9d2-ecdb5f143439","Type":"ContainerStarted","Data":"147c22642e55d4e465f5e81792155fda570f16630ec6fc54a0deefbceed4dcaf"} Oct 07 12:40:35 crc kubenswrapper[4854]: I1007 12:40:35.184254 4854 generic.go:334] "Generic (PLEG): container finished" podID="0979bfc9-c30c-4e15-bf34-2ad6ce892212" containerID="1b89918dd94baf47b1069784c6b60227f3f8a4acf69a2fd159423cad935abdc6" exitCode=0 Oct 07 12:40:35 crc kubenswrapper[4854]: I1007 12:40:35.184279 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-pwbvk" event={"ID":"0979bfc9-c30c-4e15-bf34-2ad6ce892212","Type":"ContainerDied","Data":"1b89918dd94baf47b1069784c6b60227f3f8a4acf69a2fd159423cad935abdc6"} Oct 07 12:40:35 crc kubenswrapper[4854]: I1007 12:40:35.186690 4854 generic.go:334] "Generic (PLEG): container finished" podID="a5e39c6f-6431-4bf5-a6a9-4f0d1f88e904" containerID="5c641405336bede383dae45b938cbbc46902d6595f07d91d152999d6d9282284" exitCode=0 Oct 07 12:40:35 crc kubenswrapper[4854]: I1007 12:40:35.186740 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-v2g7t" event={"ID":"a5e39c6f-6431-4bf5-a6a9-4f0d1f88e904","Type":"ContainerDied","Data":"5c641405336bede383dae45b938cbbc46902d6595f07d91d152999d6d9282284"} Oct 07 12:40:35 crc kubenswrapper[4854]: I1007 12:40:35.190659 4854 generic.go:334] "Generic (PLEG): container finished" podID="db03aba0-4761-426e-ac26-50f599cc0ad9" containerID="78cb0739a6298ba156174ab454fddfed41a731b72758b5c3941e6ca4f5414615" exitCode=0 Oct 07 12:40:35 crc kubenswrapper[4854]: I1007 12:40:35.190705 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-cs286" 
event={"ID":"db03aba0-4761-426e-ac26-50f599cc0ad9","Type":"ContainerDied","Data":"78cb0739a6298ba156174ab454fddfed41a731b72758b5c3941e6ca4f5414615"} Oct 07 12:40:35 crc kubenswrapper[4854]: I1007 12:40:35.205722 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"c2cebadb-2142-477a-85b3-53e7c73fa6cc","Type":"ContainerStarted","Data":"452b12795119ff3315cae84f6ce86ad79fa535175852c78bfc7a106938624b75"} Oct 07 12:40:35 crc kubenswrapper[4854]: I1007 12:40:35.206419 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Oct 07 12:40:35 crc kubenswrapper[4854]: I1007 12:40:35.223748 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Oct 07 12:40:35 crc kubenswrapper[4854]: I1007 12:40:35.236348 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-d4cpx" podStartSLOduration=3.236324009 podStartE2EDuration="3.236324009s" podCreationTimestamp="2025-10-07 12:40:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:40:35.229059825 +0000 UTC m=+951.216892080" watchObservedRunningTime="2025-10-07 12:40:35.236324009 +0000 UTC m=+951.224156274" Oct 07 12:40:35 crc kubenswrapper[4854]: I1007 12:40:35.358521 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.347248703 podStartE2EDuration="46.358500927s" podCreationTimestamp="2025-10-07 12:39:49 +0000 UTC" firstStartedPulling="2025-10-07 12:39:50.035546063 +0000 UTC m=+906.023378318" lastFinishedPulling="2025-10-07 12:40:34.046798277 +0000 UTC m=+950.034630542" observedRunningTime="2025-10-07 12:40:35.340164357 +0000 UTC m=+951.327996602" watchObservedRunningTime="2025-10-07 12:40:35.358500927 +0000 UTC m=+951.346333182" Oct 07 12:40:35 crc kubenswrapper[4854]: I1007 12:40:35.506502 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-cs286" Oct 07 12:40:35 crc kubenswrapper[4854]: I1007 12:40:35.632160 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db03aba0-4761-426e-ac26-50f599cc0ad9-dns-svc\") pod \"db03aba0-4761-426e-ac26-50f599cc0ad9\" (UID: \"db03aba0-4761-426e-ac26-50f599cc0ad9\") " Oct 07 12:40:35 crc kubenswrapper[4854]: I1007 12:40:35.632610 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vvtx\" (UniqueName: \"kubernetes.io/projected/db03aba0-4761-426e-ac26-50f599cc0ad9-kube-api-access-5vvtx\") pod \"db03aba0-4761-426e-ac26-50f599cc0ad9\" (UID: \"db03aba0-4761-426e-ac26-50f599cc0ad9\") " Oct 07 12:40:35 crc kubenswrapper[4854]: I1007 12:40:35.632721 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db03aba0-4761-426e-ac26-50f599cc0ad9-config\") pod \"db03aba0-4761-426e-ac26-50f599cc0ad9\" (UID: \"db03aba0-4761-426e-ac26-50f599cc0ad9\") " Oct 07 12:40:35 crc kubenswrapper[4854]: I1007 12:40:35.635823 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db03aba0-4761-426e-ac26-50f599cc0ad9-kube-api-access-5vvtx" (OuterVolumeSpecName: "kube-api-access-5vvtx") pod "db03aba0-4761-426e-ac26-50f599cc0ad9" (UID: "db03aba0-4761-426e-ac26-50f599cc0ad9"). InnerVolumeSpecName "kube-api-access-5vvtx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:40:35 crc kubenswrapper[4854]: I1007 12:40:35.662596 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db03aba0-4761-426e-ac26-50f599cc0ad9-config" (OuterVolumeSpecName: "config") pod "db03aba0-4761-426e-ac26-50f599cc0ad9" (UID: "db03aba0-4761-426e-ac26-50f599cc0ad9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:40:35 crc kubenswrapper[4854]: E1007 12:40:35.662837 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/db03aba0-4761-426e-ac26-50f599cc0ad9-dns-svc podName:db03aba0-4761-426e-ac26-50f599cc0ad9 nodeName:}" failed. No retries permitted until 2025-10-07 12:40:36.162816289 +0000 UTC m=+952.150648544 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "dns-svc" (UniqueName: "kubernetes.io/configmap/db03aba0-4761-426e-ac26-50f599cc0ad9-dns-svc") pod "db03aba0-4761-426e-ac26-50f599cc0ad9" (UID: "db03aba0-4761-426e-ac26-50f599cc0ad9") : error deleting /var/lib/kubelet/pods/db03aba0-4761-426e-ac26-50f599cc0ad9/volume-subpaths: remove /var/lib/kubelet/pods/db03aba0-4761-426e-ac26-50f599cc0ad9/volume-subpaths: no such file or directory Oct 07 12:40:35 crc kubenswrapper[4854]: I1007 12:40:35.734592 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vvtx\" (UniqueName: \"kubernetes.io/projected/db03aba0-4761-426e-ac26-50f599cc0ad9-kube-api-access-5vvtx\") on node \"crc\" DevicePath \"\"" Oct 07 12:40:35 crc kubenswrapper[4854]: I1007 12:40:35.734631 4854 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db03aba0-4761-426e-ac26-50f599cc0ad9-config\") on node \"crc\" DevicePath \"\"" Oct 07 12:40:36 crc kubenswrapper[4854]: I1007 12:40:36.215494 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-v2g7t" event={"ID":"a5e39c6f-6431-4bf5-a6a9-4f0d1f88e904","Type":"ContainerStarted","Data":"bbccf7f2e650655c5c8a0080d9a400830560bbd4d6ac677df091b8c297ed97c0"} Oct 07 12:40:36 crc kubenswrapper[4854]: I1007 12:40:36.215574 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7f896c8c65-v2g7t" Oct 07 12:40:36 crc kubenswrapper[4854]: I1007 12:40:36.218693 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-cs286" Oct 07 12:40:36 crc kubenswrapper[4854]: I1007 12:40:36.218709 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-cs286" event={"ID":"db03aba0-4761-426e-ac26-50f599cc0ad9","Type":"ContainerDied","Data":"e833ee49d0aba6c3b607ece3d1174e1345a873f2e18e6fd930661f6824d230b6"} Oct 07 12:40:36 crc kubenswrapper[4854]: I1007 12:40:36.218757 4854 scope.go:117] "RemoveContainer" containerID="78cb0739a6298ba156174ab454fddfed41a731b72758b5c3941e6ca4f5414615" Oct 07 12:40:36 crc kubenswrapper[4854]: I1007 12:40:36.221234 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-pwbvk" event={"ID":"0979bfc9-c30c-4e15-bf34-2ad6ce892212","Type":"ContainerStarted","Data":"ea7a33cbb5dfdc8268be7d4b4fefe87d5f67f78082d47cf3fe1e4af92cc4870b"} Oct 07 12:40:36 crc kubenswrapper[4854]: I1007 12:40:36.222744 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-pwbvk" Oct 07 12:40:36 crc kubenswrapper[4854]: I1007 12:40:36.235537 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7f896c8c65-v2g7t" podStartSLOduration=3.807317934 podStartE2EDuration="4.235510344s" podCreationTimestamp="2025-10-07 12:40:32 +0000 UTC" firstStartedPulling="2025-10-07 12:40:33.943430073 +0000 UTC m=+949.931262348" lastFinishedPulling="2025-10-07 12:40:34.371622503 +0000 UTC m=+950.359454758" observedRunningTime="2025-10-07 12:40:36.232954779 +0000 UTC m=+952.220787044" watchObservedRunningTime="2025-10-07 12:40:36.235510344 +0000 UTC m=+952.223342619" Oct 07 12:40:36 crc kubenswrapper[4854]: I1007 12:40:36.242598 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db03aba0-4761-426e-ac26-50f599cc0ad9-dns-svc\") pod \"db03aba0-4761-426e-ac26-50f599cc0ad9\" (UID: 
\"db03aba0-4761-426e-ac26-50f599cc0ad9\") " Oct 07 12:40:36 crc kubenswrapper[4854]: I1007 12:40:36.243313 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db03aba0-4761-426e-ac26-50f599cc0ad9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "db03aba0-4761-426e-ac26-50f599cc0ad9" (UID: "db03aba0-4761-426e-ac26-50f599cc0ad9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:40:36 crc kubenswrapper[4854]: I1007 12:40:36.346024 4854 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db03aba0-4761-426e-ac26-50f599cc0ad9-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 12:40:36 crc kubenswrapper[4854]: I1007 12:40:36.551122 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-pwbvk" podStartSLOduration=4.551102168 podStartE2EDuration="4.551102168s" podCreationTimestamp="2025-10-07 12:40:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:40:36.261570762 +0000 UTC m=+952.249403027" watchObservedRunningTime="2025-10-07 12:40:36.551102168 +0000 UTC m=+952.538934443" Oct 07 12:40:36 crc kubenswrapper[4854]: I1007 12:40:36.578669 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-cs286"] Oct 07 12:40:36 crc kubenswrapper[4854]: I1007 12:40:36.586008 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-cs286"] Oct 07 12:40:36 crc kubenswrapper[4854]: I1007 12:40:36.716115 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db03aba0-4761-426e-ac26-50f599cc0ad9" path="/var/lib/kubelet/pods/db03aba0-4761-426e-ac26-50f599cc0ad9/volumes" Oct 07 12:40:37 crc kubenswrapper[4854]: I1007 12:40:37.238890 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"698aae03-92da-4cc2-a9d2-ecdb5f143439","Type":"ContainerStarted","Data":"b1471691e8926652d5ee413690eb324d89b03c02d93cd735cdce65f268baf71f"} Oct 07 12:40:37 crc kubenswrapper[4854]: I1007 12:40:37.238940 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"698aae03-92da-4cc2-a9d2-ecdb5f143439","Type":"ContainerStarted","Data":"89f0eb49b67715625593c9e8be3b27b5e68c8ad630cb00e0f86cd890329ac748"} Oct 07 12:40:37 crc kubenswrapper[4854]: I1007 12:40:37.276367 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.827675864 podStartE2EDuration="4.276334486s" podCreationTimestamp="2025-10-07 12:40:33 +0000 UTC" firstStartedPulling="2025-10-07 12:40:34.571224292 +0000 UTC m=+950.559056557" lastFinishedPulling="2025-10-07 12:40:36.019882924 +0000 UTC m=+952.007715179" observedRunningTime="2025-10-07 12:40:37.264287521 +0000 UTC m=+953.252119806" watchObservedRunningTime="2025-10-07 12:40:37.276334486 +0000 UTC m=+953.264166771" Oct 07 12:40:37 crc kubenswrapper[4854]: I1007 12:40:37.853175 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Oct 07 12:40:37 crc kubenswrapper[4854]: I1007 12:40:37.853524 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Oct 07 12:40:37 crc kubenswrapper[4854]: I1007 12:40:37.904382 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/openstack-galera-0" Oct 07 12:40:38 crc kubenswrapper[4854]: I1007 12:40:38.248898 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Oct 07 12:40:38 crc kubenswrapper[4854]: I1007 12:40:38.325349 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Oct 07 12:40:39 crc kubenswrapper[4854]: I1007 12:40:39.171191 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-h4dv5"] Oct 07 12:40:39 crc kubenswrapper[4854]: E1007 12:40:39.171595 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db03aba0-4761-426e-ac26-50f599cc0ad9" containerName="init" Oct 07 12:40:39 crc kubenswrapper[4854]: I1007 12:40:39.171618 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="db03aba0-4761-426e-ac26-50f599cc0ad9" containerName="init" Oct 07 12:40:39 crc kubenswrapper[4854]: I1007 12:40:39.171819 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="db03aba0-4761-426e-ac26-50f599cc0ad9" containerName="init" Oct 07 12:40:39 crc kubenswrapper[4854]: I1007 12:40:39.172484 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-h4dv5" Oct 07 12:40:39 crc kubenswrapper[4854]: I1007 12:40:39.184408 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-h4dv5"] Oct 07 12:40:39 crc kubenswrapper[4854]: I1007 12:40:39.310220 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8gxj\" (UniqueName: \"kubernetes.io/projected/7b897234-a3ea-40e0-a94c-0f501794c5d4-kube-api-access-t8gxj\") pod \"keystone-db-create-h4dv5\" (UID: \"7b897234-a3ea-40e0-a94c-0f501794c5d4\") " pod="openstack/keystone-db-create-h4dv5" Oct 07 12:40:39 crc kubenswrapper[4854]: I1007 12:40:39.364848 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-m7b9h"] Oct 07 12:40:39 crc kubenswrapper[4854]: I1007 12:40:39.366559 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-m7b9h" Oct 07 12:40:39 crc kubenswrapper[4854]: I1007 12:40:39.374207 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-m7b9h"] Oct 07 12:40:39 crc kubenswrapper[4854]: I1007 12:40:39.412021 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8gxj\" (UniqueName: \"kubernetes.io/projected/7b897234-a3ea-40e0-a94c-0f501794c5d4-kube-api-access-t8gxj\") pod \"keystone-db-create-h4dv5\" (UID: \"7b897234-a3ea-40e0-a94c-0f501794c5d4\") " pod="openstack/keystone-db-create-h4dv5" Oct 07 12:40:39 crc kubenswrapper[4854]: I1007 12:40:39.433043 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8gxj\" (UniqueName: \"kubernetes.io/projected/7b897234-a3ea-40e0-a94c-0f501794c5d4-kube-api-access-t8gxj\") pod \"keystone-db-create-h4dv5\" (UID: \"7b897234-a3ea-40e0-a94c-0f501794c5d4\") " pod="openstack/keystone-db-create-h4dv5" Oct 07 12:40:39 crc kubenswrapper[4854]: I1007 12:40:39.479387 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Oct 07 12:40:39 crc kubenswrapper[4854]: I1007 12:40:39.501437 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-h4dv5" Oct 07 12:40:39 crc kubenswrapper[4854]: I1007 12:40:39.513937 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6g9jm\" (UniqueName: \"kubernetes.io/projected/7e143666-1ff7-48c5-b28b-42fc66cd56af-kube-api-access-6g9jm\") pod \"placement-db-create-m7b9h\" (UID: \"7e143666-1ff7-48c5-b28b-42fc66cd56af\") " pod="openstack/placement-db-create-m7b9h" Oct 07 12:40:39 crc kubenswrapper[4854]: I1007 12:40:39.618510 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6g9jm\" (UniqueName: \"kubernetes.io/projected/7e143666-1ff7-48c5-b28b-42fc66cd56af-kube-api-access-6g9jm\") pod \"placement-db-create-m7b9h\" (UID: \"7e143666-1ff7-48c5-b28b-42fc66cd56af\") " pod="openstack/placement-db-create-m7b9h" Oct 07 12:40:39 crc kubenswrapper[4854]: I1007 12:40:39.636074 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6g9jm\" (UniqueName: \"kubernetes.io/projected/7e143666-1ff7-48c5-b28b-42fc66cd56af-kube-api-access-6g9jm\") pod \"placement-db-create-m7b9h\" (UID: \"7e143666-1ff7-48c5-b28b-42fc66cd56af\") " pod="openstack/placement-db-create-m7b9h" Oct 07 12:40:39 crc kubenswrapper[4854]: I1007 12:40:39.682237 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-m7b9h" Oct 07 12:40:39 crc kubenswrapper[4854]: I1007 12:40:39.696214 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-4q2wn"] Oct 07 12:40:39 crc kubenswrapper[4854]: I1007 12:40:39.697289 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-4q2wn" Oct 07 12:40:39 crc kubenswrapper[4854]: I1007 12:40:39.701987 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-4q2wn"] Oct 07 12:40:39 crc kubenswrapper[4854]: I1007 12:40:39.822724 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pr27x\" (UniqueName: \"kubernetes.io/projected/278d1b2b-3c52-4899-9432-1c1ca85a0f6d-kube-api-access-pr27x\") pod \"glance-db-create-4q2wn\" (UID: \"278d1b2b-3c52-4899-9432-1c1ca85a0f6d\") " pod="openstack/glance-db-create-4q2wn" Oct 07 12:40:39 crc kubenswrapper[4854]: I1007 12:40:39.924883 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pr27x\" (UniqueName: \"kubernetes.io/projected/278d1b2b-3c52-4899-9432-1c1ca85a0f6d-kube-api-access-pr27x\") pod \"glance-db-create-4q2wn\" (UID: \"278d1b2b-3c52-4899-9432-1c1ca85a0f6d\") " pod="openstack/glance-db-create-4q2wn" Oct 07 12:40:39 crc kubenswrapper[4854]: I1007 12:40:39.944918 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pr27x\" (UniqueName: \"kubernetes.io/projected/278d1b2b-3c52-4899-9432-1c1ca85a0f6d-kube-api-access-pr27x\") pod \"glance-db-create-4q2wn\" (UID: \"278d1b2b-3c52-4899-9432-1c1ca85a0f6d\") " pod="openstack/glance-db-create-4q2wn" Oct 07 12:40:40 crc kubenswrapper[4854]: I1007 12:40:40.021737 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-h4dv5"] Oct 07 12:40:40 crc kubenswrapper[4854]: W1007 12:40:40.025039 4854 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b897234_a3ea_40e0_a94c_0f501794c5d4.slice/crio-92e49287032c73bed576ea64656071629f8f69df9b923c5eadc09fde0cf5e2f8 WatchSource:0}: Error finding container 92e49287032c73bed576ea64656071629f8f69df9b923c5eadc09fde0cf5e2f8: Status 404 returned error can't find the container with id 92e49287032c73bed576ea64656071629f8f69df9b923c5eadc09fde0cf5e2f8 Oct 07 12:40:40 crc kubenswrapper[4854]: I1007 12:40:40.055540 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-4q2wn" Oct 07 12:40:40 crc kubenswrapper[4854]: I1007 12:40:40.187966 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-m7b9h"] Oct 07 12:40:40 crc kubenswrapper[4854]: I1007 12:40:40.283508 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-h4dv5" event={"ID":"7b897234-a3ea-40e0-a94c-0f501794c5d4","Type":"ContainerStarted","Data":"92e49287032c73bed576ea64656071629f8f69df9b923c5eadc09fde0cf5e2f8"} Oct 07 12:40:40 crc kubenswrapper[4854]: I1007 12:40:40.321440 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-m7b9h" event={"ID":"7e143666-1ff7-48c5-b28b-42fc66cd56af","Type":"ContainerStarted","Data":"ec6b80b6699b2330758827605d30e605f0f6c87ed71f0daf07cf15eac142d148"} Oct 07 12:40:40 crc kubenswrapper[4854]: I1007 12:40:40.532421 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-4q2wn"] Oct 07 12:40:40 crc kubenswrapper[4854]: W1007 12:40:40.540460 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod278d1b2b_3c52_4899_9432_1c1ca85a0f6d.slice/crio-0c482fe207b4ddfd69413e2a309d5aebd0cd9fdb9e27ab6dbe4a00b847fa1c4c WatchSource:0}: Error finding container 0c482fe207b4ddfd69413e2a309d5aebd0cd9fdb9e27ab6dbe4a00b847fa1c4c: Status 404 returned error can't find the container with id 0c482fe207b4ddfd69413e2a309d5aebd0cd9fdb9e27ab6dbe4a00b847fa1c4c Oct 07 12:40:41 crc kubenswrapper[4854]: I1007 12:40:41.332197 4854 generic.go:334] "Generic (PLEG): container finished" podID="7b897234-a3ea-40e0-a94c-0f501794c5d4" containerID="15908e800d395af5a8d7f5f01dfdbf58a442b4cdc1794a444f36b3b3f543c0f8" exitCode=0 Oct 07 12:40:41 crc kubenswrapper[4854]: I1007 12:40:41.332588 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-h4dv5" event={"ID":"7b897234-a3ea-40e0-a94c-0f501794c5d4","Type":"ContainerDied","Data":"15908e800d395af5a8d7f5f01dfdbf58a442b4cdc1794a444f36b3b3f543c0f8"} Oct 07 12:40:41 crc kubenswrapper[4854]: I1007 12:40:41.337907 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-4q2wn" event={"ID":"278d1b2b-3c52-4899-9432-1c1ca85a0f6d","Type":"ContainerStarted","Data":"1b0f1975974dee7a4bec144321d0e281dc8c0efd5c6aa89d6f4fd870a7a51e3a"} Oct 07 12:40:41 crc kubenswrapper[4854]: I1007 12:40:41.337944 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-4q2wn" event={"ID":"278d1b2b-3c52-4899-9432-1c1ca85a0f6d","Type":"ContainerStarted","Data":"0c482fe207b4ddfd69413e2a309d5aebd0cd9fdb9e27ab6dbe4a00b847fa1c4c"} Oct 07 12:40:41 crc kubenswrapper[4854]: I1007 12:40:41.339669 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-m7b9h" 
event={"ID":"7e143666-1ff7-48c5-b28b-42fc66cd56af","Type":"ContainerStarted","Data":"b8ca0301667963e3ec2366a136d0517b008b1b5e00eb6795c50b7ce516616afb"} Oct 07 12:40:41 crc kubenswrapper[4854]: I1007 12:40:41.376428 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-4q2wn" podStartSLOduration=2.376410081 podStartE2EDuration="2.376410081s" podCreationTimestamp="2025-10-07 12:40:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:40:41.373529546 +0000 UTC m=+957.361361801" watchObservedRunningTime="2025-10-07 12:40:41.376410081 +0000 UTC m=+957.364242336" Oct 07 12:40:41 crc kubenswrapper[4854]: I1007 12:40:41.393463 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-m7b9h" podStartSLOduration=2.393430773 podStartE2EDuration="2.393430773s" podCreationTimestamp="2025-10-07 12:40:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:40:41.389305861 +0000 UTC m=+957.377138156" watchObservedRunningTime="2025-10-07 12:40:41.393430773 +0000 UTC m=+957.381263068" Oct 07 12:40:41 crc kubenswrapper[4854]: I1007 12:40:41.489494 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 07 12:40:41 crc kubenswrapper[4854]: I1007 12:40:41.557442 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-v2g7t"] Oct 07 12:40:41 crc kubenswrapper[4854]: I1007 12:40:41.557660 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7f896c8c65-v2g7t" podUID="a5e39c6f-6431-4bf5-a6a9-4f0d1f88e904" containerName="dnsmasq-dns" containerID="cri-o://bbccf7f2e650655c5c8a0080d9a400830560bbd4d6ac677df091b8c297ed97c0" gracePeriod=10 Oct 07 12:40:41 crc kubenswrapper[4854]: I1007 12:40:41.559635 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7f896c8c65-v2g7t" Oct 07 12:40:41 crc kubenswrapper[4854]: I1007 12:40:41.600271 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-g5xmp"] Oct 07 12:40:41 crc kubenswrapper[4854]: I1007 12:40:41.601805 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-g5xmp" Oct 07 12:40:41 crc kubenswrapper[4854]: I1007 12:40:41.624875 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-g5xmp"] Oct 07 12:40:41 crc kubenswrapper[4854]: I1007 12:40:41.758076 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a039d6d1-21b4-480b-b8ed-c50693487fba-dns-svc\") pod \"dnsmasq-dns-698758b865-g5xmp\" (UID: \"a039d6d1-21b4-480b-b8ed-c50693487fba\") " pod="openstack/dnsmasq-dns-698758b865-g5xmp" Oct 07 12:40:41 crc kubenswrapper[4854]: I1007 12:40:41.758159 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a039d6d1-21b4-480b-b8ed-c50693487fba-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-g5xmp\" (UID: \"a039d6d1-21b4-480b-b8ed-c50693487fba\") " pod="openstack/dnsmasq-dns-698758b865-g5xmp" Oct 07 12:40:41 crc kubenswrapper[4854]: I1007 12:40:41.758188 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a039d6d1-21b4-480b-b8ed-c50693487fba-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-g5xmp\" (UID: \"a039d6d1-21b4-480b-b8ed-c50693487fba\") " pod="openstack/dnsmasq-dns-698758b865-g5xmp" Oct 07 12:40:41 crc kubenswrapper[4854]: I1007 12:40:41.758205 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmml6\" (UniqueName: \"kubernetes.io/projected/a039d6d1-21b4-480b-b8ed-c50693487fba-kube-api-access-fmml6\") pod \"dnsmasq-dns-698758b865-g5xmp\" (UID: \"a039d6d1-21b4-480b-b8ed-c50693487fba\") " pod="openstack/dnsmasq-dns-698758b865-g5xmp" Oct 07 12:40:41 crc kubenswrapper[4854]: I1007 12:40:41.758223 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a039d6d1-21b4-480b-b8ed-c50693487fba-config\") pod \"dnsmasq-dns-698758b865-g5xmp\" (UID: \"a039d6d1-21b4-480b-b8ed-c50693487fba\") " pod="openstack/dnsmasq-dns-698758b865-g5xmp" Oct 07 12:40:41 crc kubenswrapper[4854]: I1007 12:40:41.860272 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a039d6d1-21b4-480b-b8ed-c50693487fba-dns-svc\") pod \"dnsmasq-dns-698758b865-g5xmp\" (UID: \"a039d6d1-21b4-480b-b8ed-c50693487fba\") " pod="openstack/dnsmasq-dns-698758b865-g5xmp" Oct 07 12:40:41 crc kubenswrapper[4854]: I1007 12:40:41.860350 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a039d6d1-21b4-480b-b8ed-c50693487fba-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-g5xmp\" (UID: \"a039d6d1-21b4-480b-b8ed-c50693487fba\") " pod="openstack/dnsmasq-dns-698758b865-g5xmp" Oct 07 12:40:41 crc kubenswrapper[4854]: I1007 12:40:41.860378 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a039d6d1-21b4-480b-b8ed-c50693487fba-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-g5xmp\" (UID: \"a039d6d1-21b4-480b-b8ed-c50693487fba\") " pod="openstack/dnsmasq-dns-698758b865-g5xmp" Oct 07 12:40:41 crc kubenswrapper[4854]: I1007 12:40:41.860399 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-fmml6\" (UniqueName: \"kubernetes.io/projected/a039d6d1-21b4-480b-b8ed-c50693487fba-kube-api-access-fmml6\") pod \"dnsmasq-dns-698758b865-g5xmp\" (UID: \"a039d6d1-21b4-480b-b8ed-c50693487fba\") " pod="openstack/dnsmasq-dns-698758b865-g5xmp" Oct 07 12:40:41 crc kubenswrapper[4854]: I1007 12:40:41.860416 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a039d6d1-21b4-480b-b8ed-c50693487fba-config\") pod \"dnsmasq-dns-698758b865-g5xmp\" (UID: \"a039d6d1-21b4-480b-b8ed-c50693487fba\") " pod="openstack/dnsmasq-dns-698758b865-g5xmp" Oct 07 12:40:41 crc kubenswrapper[4854]: I1007 12:40:41.861408 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a039d6d1-21b4-480b-b8ed-c50693487fba-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-g5xmp\" (UID: \"a039d6d1-21b4-480b-b8ed-c50693487fba\") " pod="openstack/dnsmasq-dns-698758b865-g5xmp" Oct 07 12:40:41 crc kubenswrapper[4854]: I1007 12:40:41.861420 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a039d6d1-21b4-480b-b8ed-c50693487fba-dns-svc\") pod \"dnsmasq-dns-698758b865-g5xmp\" (UID: \"a039d6d1-21b4-480b-b8ed-c50693487fba\") " pod="openstack/dnsmasq-dns-698758b865-g5xmp" Oct 07 12:40:41 crc kubenswrapper[4854]: I1007 12:40:41.862021 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a039d6d1-21b4-480b-b8ed-c50693487fba-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-g5xmp\" (UID: \"a039d6d1-21b4-480b-b8ed-c50693487fba\") " pod="openstack/dnsmasq-dns-698758b865-g5xmp" Oct 07 12:40:41 crc kubenswrapper[4854]: I1007 12:40:41.862365 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a039d6d1-21b4-480b-b8ed-c50693487fba-config\") pod \"dnsmasq-dns-698758b865-g5xmp\" (UID: \"a039d6d1-21b4-480b-b8ed-c50693487fba\") " pod="openstack/dnsmasq-dns-698758b865-g5xmp" Oct 07 12:40:41 crc kubenswrapper[4854]: I1007 12:40:41.897968 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmml6\" (UniqueName: \"kubernetes.io/projected/a039d6d1-21b4-480b-b8ed-c50693487fba-kube-api-access-fmml6\") pod \"dnsmasq-dns-698758b865-g5xmp\" (UID: \"a039d6d1-21b4-480b-b8ed-c50693487fba\") " pod="openstack/dnsmasq-dns-698758b865-g5xmp" Oct 07 12:40:42 crc kubenswrapper[4854]: I1007 12:40:42.081689 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-g5xmp" Oct 07 12:40:42 crc kubenswrapper[4854]: I1007 12:40:42.144762 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-v2g7t" Oct 07 12:40:42 crc kubenswrapper[4854]: I1007 12:40:42.283261 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a5e39c6f-6431-4bf5-a6a9-4f0d1f88e904-ovsdbserver-sb\") pod \"a5e39c6f-6431-4bf5-a6a9-4f0d1f88e904\" (UID: \"a5e39c6f-6431-4bf5-a6a9-4f0d1f88e904\") " Oct 07 12:40:42 crc kubenswrapper[4854]: I1007 12:40:42.283430 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzrnr\" (UniqueName: \"kubernetes.io/projected/a5e39c6f-6431-4bf5-a6a9-4f0d1f88e904-kube-api-access-nzrnr\") pod \"a5e39c6f-6431-4bf5-a6a9-4f0d1f88e904\" (UID: \"a5e39c6f-6431-4bf5-a6a9-4f0d1f88e904\") " Oct 07 12:40:42 crc kubenswrapper[4854]: I1007 12:40:42.283464 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a5e39c6f-6431-4bf5-a6a9-4f0d1f88e904-dns-svc\") pod \"a5e39c6f-6431-4bf5-a6a9-4f0d1f88e904\" (UID: \"a5e39c6f-6431-4bf5-a6a9-4f0d1f88e904\") " Oct 07 12:40:42 crc kubenswrapper[4854]: I1007 12:40:42.283514 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5e39c6f-6431-4bf5-a6a9-4f0d1f88e904-config\") pod \"a5e39c6f-6431-4bf5-a6a9-4f0d1f88e904\" (UID: \"a5e39c6f-6431-4bf5-a6a9-4f0d1f88e904\") " Oct 07 12:40:42 crc kubenswrapper[4854]: I1007 12:40:42.287568 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5e39c6f-6431-4bf5-a6a9-4f0d1f88e904-kube-api-access-nzrnr" (OuterVolumeSpecName: "kube-api-access-nzrnr") pod "a5e39c6f-6431-4bf5-a6a9-4f0d1f88e904" (UID: "a5e39c6f-6431-4bf5-a6a9-4f0d1f88e904"). InnerVolumeSpecName "kube-api-access-nzrnr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:40:42 crc kubenswrapper[4854]: I1007 12:40:42.329043 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5e39c6f-6431-4bf5-a6a9-4f0d1f88e904-config" (OuterVolumeSpecName: "config") pod "a5e39c6f-6431-4bf5-a6a9-4f0d1f88e904" (UID: "a5e39c6f-6431-4bf5-a6a9-4f0d1f88e904"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:40:42 crc kubenswrapper[4854]: I1007 12:40:42.332705 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5e39c6f-6431-4bf5-a6a9-4f0d1f88e904-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a5e39c6f-6431-4bf5-a6a9-4f0d1f88e904" (UID: "a5e39c6f-6431-4bf5-a6a9-4f0d1f88e904"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:40:42 crc kubenswrapper[4854]: I1007 12:40:42.351821 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5e39c6f-6431-4bf5-a6a9-4f0d1f88e904-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a5e39c6f-6431-4bf5-a6a9-4f0d1f88e904" (UID: "a5e39c6f-6431-4bf5-a6a9-4f0d1f88e904"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:40:42 crc kubenswrapper[4854]: I1007 12:40:42.353799 4854 generic.go:334] "Generic (PLEG): container finished" podID="7e143666-1ff7-48c5-b28b-42fc66cd56af" containerID="b8ca0301667963e3ec2366a136d0517b008b1b5e00eb6795c50b7ce516616afb" exitCode=0 Oct 07 12:40:42 crc kubenswrapper[4854]: I1007 12:40:42.353893 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-m7b9h" event={"ID":"7e143666-1ff7-48c5-b28b-42fc66cd56af","Type":"ContainerDied","Data":"b8ca0301667963e3ec2366a136d0517b008b1b5e00eb6795c50b7ce516616afb"} Oct 07 12:40:42 crc kubenswrapper[4854]: I1007 12:40:42.356031 4854 generic.go:334] "Generic (PLEG): container finished" podID="a5e39c6f-6431-4bf5-a6a9-4f0d1f88e904" containerID="bbccf7f2e650655c5c8a0080d9a400830560bbd4d6ac677df091b8c297ed97c0" exitCode=0 Oct 07 12:40:42 crc kubenswrapper[4854]: I1007 12:40:42.356093 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-v2g7t" event={"ID":"a5e39c6f-6431-4bf5-a6a9-4f0d1f88e904","Type":"ContainerDied","Data":"bbccf7f2e650655c5c8a0080d9a400830560bbd4d6ac677df091b8c297ed97c0"} Oct 07 12:40:42 crc kubenswrapper[4854]: I1007 12:40:42.356114 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-v2g7t" Oct 07 12:40:42 crc kubenswrapper[4854]: I1007 12:40:42.356134 4854 scope.go:117] "RemoveContainer" containerID="bbccf7f2e650655c5c8a0080d9a400830560bbd4d6ac677df091b8c297ed97c0" Oct 07 12:40:42 crc kubenswrapper[4854]: I1007 12:40:42.356121 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-v2g7t" event={"ID":"a5e39c6f-6431-4bf5-a6a9-4f0d1f88e904","Type":"ContainerDied","Data":"5fb8652c76503130b60fbb5b7b9db536de8ce87df4de9aa2f6fe424814bd6a42"} Oct 07 12:40:42 crc kubenswrapper[4854]: I1007 12:40:42.358064 4854 generic.go:334] "Generic (PLEG): container finished" podID="278d1b2b-3c52-4899-9432-1c1ca85a0f6d" containerID="1b0f1975974dee7a4bec144321d0e281dc8c0efd5c6aa89d6f4fd870a7a51e3a" exitCode=0 Oct 07 12:40:42 crc kubenswrapper[4854]: I1007 12:40:42.358143 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-4q2wn" event={"ID":"278d1b2b-3c52-4899-9432-1c1ca85a0f6d","Type":"ContainerDied","Data":"1b0f1975974dee7a4bec144321d0e281dc8c0efd5c6aa89d6f4fd870a7a51e3a"} Oct 07 12:40:42 crc kubenswrapper[4854]: I1007 12:40:42.394450 4854 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5e39c6f-6431-4bf5-a6a9-4f0d1f88e904-config\") on node \"crc\" DevicePath \"\"" Oct 07 12:40:42 crc kubenswrapper[4854]: I1007 12:40:42.394514 4854 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a5e39c6f-6431-4bf5-a6a9-4f0d1f88e904-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 12:40:42 crc kubenswrapper[4854]: I1007 12:40:42.394531 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzrnr\" (UniqueName: \"kubernetes.io/projected/a5e39c6f-6431-4bf5-a6a9-4f0d1f88e904-kube-api-access-nzrnr\") on node \"crc\" DevicePath \"\"" Oct 07 12:40:42 crc kubenswrapper[4854]: I1007 12:40:42.394546 4854 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a5e39c6f-6431-4bf5-a6a9-4f0d1f88e904-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 12:40:42 crc kubenswrapper[4854]: I1007 12:40:42.419397 
4854 scope.go:117] "RemoveContainer" containerID="5c641405336bede383dae45b938cbbc46902d6595f07d91d152999d6d9282284" Oct 07 12:40:42 crc kubenswrapper[4854]: I1007 12:40:42.432794 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-v2g7t"] Oct 07 12:40:42 crc kubenswrapper[4854]: I1007 12:40:42.438503 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-v2g7t"] Oct 07 12:40:42 crc kubenswrapper[4854]: I1007 12:40:42.444537 4854 scope.go:117] "RemoveContainer" containerID="bbccf7f2e650655c5c8a0080d9a400830560bbd4d6ac677df091b8c297ed97c0" Oct 07 12:40:42 crc kubenswrapper[4854]: E1007 12:40:42.444993 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbccf7f2e650655c5c8a0080d9a400830560bbd4d6ac677df091b8c297ed97c0\": container with ID starting with bbccf7f2e650655c5c8a0080d9a400830560bbd4d6ac677df091b8c297ed97c0 not found: ID does not exist" containerID="bbccf7f2e650655c5c8a0080d9a400830560bbd4d6ac677df091b8c297ed97c0" Oct 07 12:40:42 crc kubenswrapper[4854]: I1007 12:40:42.445052 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbccf7f2e650655c5c8a0080d9a400830560bbd4d6ac677df091b8c297ed97c0"} err="failed to get container status \"bbccf7f2e650655c5c8a0080d9a400830560bbd4d6ac677df091b8c297ed97c0\": rpc error: code = NotFound desc = could not find container \"bbccf7f2e650655c5c8a0080d9a400830560bbd4d6ac677df091b8c297ed97c0\": container with ID starting with bbccf7f2e650655c5c8a0080d9a400830560bbd4d6ac677df091b8c297ed97c0 not found: ID does not exist" Oct 07 12:40:42 crc kubenswrapper[4854]: I1007 12:40:42.445086 4854 scope.go:117] "RemoveContainer" containerID="5c641405336bede383dae45b938cbbc46902d6595f07d91d152999d6d9282284" Oct 07 12:40:42 crc kubenswrapper[4854]: E1007 12:40:42.445450 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c641405336bede383dae45b938cbbc46902d6595f07d91d152999d6d9282284\": container with ID starting with 5c641405336bede383dae45b938cbbc46902d6595f07d91d152999d6d9282284 not found: ID does not exist" containerID="5c641405336bede383dae45b938cbbc46902d6595f07d91d152999d6d9282284" Oct 07 12:40:42 crc kubenswrapper[4854]: I1007 12:40:42.445499 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c641405336bede383dae45b938cbbc46902d6595f07d91d152999d6d9282284"} err="failed to get container status \"5c641405336bede383dae45b938cbbc46902d6595f07d91d152999d6d9282284\": rpc error: code = NotFound desc = could not find container \"5c641405336bede383dae45b938cbbc46902d6595f07d91d152999d6d9282284\": container with ID starting with 5c641405336bede383dae45b938cbbc46902d6595f07d91d152999d6d9282284 not found: ID does not exist" Oct 07 12:40:42 crc kubenswrapper[4854]: I1007 12:40:42.534788 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-g5xmp"] Oct 07 12:40:42 crc kubenswrapper[4854]: I1007 12:40:42.725374 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5e39c6f-6431-4bf5-a6a9-4f0d1f88e904" path="/var/lib/kubelet/pods/a5e39c6f-6431-4bf5-a6a9-4f0d1f88e904/volumes" Oct 07 12:40:42 crc kubenswrapper[4854]: I1007 12:40:42.728535 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Oct 07 12:40:42 crc kubenswrapper[4854]: E1007 12:40:42.728868 4854 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5e39c6f-6431-4bf5-a6a9-4f0d1f88e904" containerName="init" Oct 07 12:40:42 crc kubenswrapper[4854]: I1007 12:40:42.728885 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5e39c6f-6431-4bf5-a6a9-4f0d1f88e904" containerName="init" Oct 07 12:40:42 crc kubenswrapper[4854]: E1007 12:40:42.728902 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5e39c6f-6431-4bf5-a6a9-4f0d1f88e904" containerName="dnsmasq-dns" Oct 07 12:40:42 crc kubenswrapper[4854]: I1007 12:40:42.728909 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5e39c6f-6431-4bf5-a6a9-4f0d1f88e904" containerName="dnsmasq-dns" Oct 07 12:40:42 crc kubenswrapper[4854]: I1007 12:40:42.729086 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5e39c6f-6431-4bf5-a6a9-4f0d1f88e904" containerName="dnsmasq-dns" Oct 07 12:40:42 crc kubenswrapper[4854]: I1007 12:40:42.737047 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 07 12:40:42 crc kubenswrapper[4854]: I1007 12:40:42.739004 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Oct 07 12:40:42 crc kubenswrapper[4854]: I1007 12:40:42.739384 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Oct 07 12:40:42 crc kubenswrapper[4854]: I1007 12:40:42.739516 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-vnspg" Oct 07 12:40:42 crc kubenswrapper[4854]: I1007 12:40:42.739900 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Oct 07 12:40:42 crc kubenswrapper[4854]: I1007 12:40:42.750832 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 07 12:40:42 crc kubenswrapper[4854]: I1007 12:40:42.833093 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-h4dv5" Oct 07 12:40:42 crc kubenswrapper[4854]: I1007 12:40:42.907859 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6f9410d0-f08a-4288-901b-8c28b54f6d53-cache\") pod \"swift-storage-0\" (UID: \"6f9410d0-f08a-4288-901b-8c28b54f6d53\") " pod="openstack/swift-storage-0" Oct 07 12:40:42 crc kubenswrapper[4854]: I1007 12:40:42.908186 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6f9410d0-f08a-4288-901b-8c28b54f6d53-lock\") pod \"swift-storage-0\" (UID: \"6f9410d0-f08a-4288-901b-8c28b54f6d53\") " pod="openstack/swift-storage-0" Oct 07 12:40:42 crc kubenswrapper[4854]: I1007 12:40:42.908248 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6f9410d0-f08a-4288-901b-8c28b54f6d53-etc-swift\") pod \"swift-storage-0\" (UID: \"6f9410d0-f08a-4288-901b-8c28b54f6d53\") " pod="openstack/swift-storage-0" Oct 07 12:40:42 crc kubenswrapper[4854]: I1007 12:40:42.908331 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j85zd\" (UniqueName: \"kubernetes.io/projected/6f9410d0-f08a-4288-901b-8c28b54f6d53-kube-api-access-j85zd\") pod \"swift-storage-0\" (UID: \"6f9410d0-f08a-4288-901b-8c28b54f6d53\") " pod="openstack/swift-storage-0" Oct 07 12:40:42 crc kubenswrapper[4854]: I1007 12:40:42.909105 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"6f9410d0-f08a-4288-901b-8c28b54f6d53\") " pod="openstack/swift-storage-0" Oct 07 12:40:43 crc kubenswrapper[4854]: I1007 12:40:43.011370 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8gxj\" (UniqueName: \"kubernetes.io/projected/7b897234-a3ea-40e0-a94c-0f501794c5d4-kube-api-access-t8gxj\") pod \"7b897234-a3ea-40e0-a94c-0f501794c5d4\" (UID: \"7b897234-a3ea-40e0-a94c-0f501794c5d4\") " Oct 07 12:40:43 crc kubenswrapper[4854]: I1007 12:40:43.012048 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6f9410d0-f08a-4288-901b-8c28b54f6d53-lock\") pod \"swift-storage-0\" (UID: \"6f9410d0-f08a-4288-901b-8c28b54f6d53\") " pod="openstack/swift-storage-0" Oct 07 12:40:43 crc kubenswrapper[4854]: I1007 12:40:43.012248 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6f9410d0-f08a-4288-901b-8c28b54f6d53-etc-swift\") pod \"swift-storage-0\" (UID: \"6f9410d0-f08a-4288-901b-8c28b54f6d53\") " pod="openstack/swift-storage-0" Oct 07 12:40:43 crc kubenswrapper[4854]: I1007 12:40:43.012453 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j85zd\" (UniqueName: \"kubernetes.io/projected/6f9410d0-f08a-4288-901b-8c28b54f6d53-kube-api-access-j85zd\") pod \"swift-storage-0\" (UID: \"6f9410d0-f08a-4288-901b-8c28b54f6d53\") " pod="openstack/swift-storage-0" Oct 07 12:40:43 crc kubenswrapper[4854]: I1007 12:40:43.012604 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: 
\"kubernetes.io/empty-dir/6f9410d0-f08a-4288-901b-8c28b54f6d53-lock\") pod \"swift-storage-0\" (UID: \"6f9410d0-f08a-4288-901b-8c28b54f6d53\") " pod="openstack/swift-storage-0" Oct 07 12:40:43 crc kubenswrapper[4854]: I1007 12:40:43.012615 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"6f9410d0-f08a-4288-901b-8c28b54f6d53\") " pod="openstack/swift-storage-0" Oct 07 12:40:43 crc kubenswrapper[4854]: E1007 12:40:43.012455 4854 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 07 12:40:43 crc kubenswrapper[4854]: E1007 12:40:43.012816 4854 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 07 12:40:43 crc kubenswrapper[4854]: I1007 12:40:43.012861 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6f9410d0-f08a-4288-901b-8c28b54f6d53-cache\") pod \"swift-storage-0\" (UID: \"6f9410d0-f08a-4288-901b-8c28b54f6d53\") " pod="openstack/swift-storage-0" Oct 07 12:40:43 crc kubenswrapper[4854]: E1007 12:40:43.012877 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6f9410d0-f08a-4288-901b-8c28b54f6d53-etc-swift podName:6f9410d0-f08a-4288-901b-8c28b54f6d53 nodeName:}" failed. No retries permitted until 2025-10-07 12:40:43.512854934 +0000 UTC m=+959.500687289 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6f9410d0-f08a-4288-901b-8c28b54f6d53-etc-swift") pod "swift-storage-0" (UID: "6f9410d0-f08a-4288-901b-8c28b54f6d53") : configmap "swift-ring-files" not found Oct 07 12:40:43 crc kubenswrapper[4854]: I1007 12:40:43.013214 4854 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"6f9410d0-f08a-4288-901b-8c28b54f6d53\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/swift-storage-0" Oct 07 12:40:43 crc kubenswrapper[4854]: I1007 12:40:43.013262 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6f9410d0-f08a-4288-901b-8c28b54f6d53-cache\") pod \"swift-storage-0\" (UID: \"6f9410d0-f08a-4288-901b-8c28b54f6d53\") " pod="openstack/swift-storage-0" Oct 07 12:40:43 crc kubenswrapper[4854]: I1007 12:40:43.017900 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b897234-a3ea-40e0-a94c-0f501794c5d4-kube-api-access-t8gxj" (OuterVolumeSpecName: "kube-api-access-t8gxj") pod "7b897234-a3ea-40e0-a94c-0f501794c5d4" (UID: "7b897234-a3ea-40e0-a94c-0f501794c5d4"). InnerVolumeSpecName "kube-api-access-t8gxj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:40:43 crc kubenswrapper[4854]: I1007 12:40:43.033251 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j85zd\" (UniqueName: \"kubernetes.io/projected/6f9410d0-f08a-4288-901b-8c28b54f6d53-kube-api-access-j85zd\") pod \"swift-storage-0\" (UID: \"6f9410d0-f08a-4288-901b-8c28b54f6d53\") " pod="openstack/swift-storage-0" Oct 07 12:40:43 crc kubenswrapper[4854]: I1007 12:40:43.049402 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"6f9410d0-f08a-4288-901b-8c28b54f6d53\") " pod="openstack/swift-storage-0" Oct 07 12:40:43 crc kubenswrapper[4854]: I1007 12:40:43.070418 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-vsdfv"] Oct 07 12:40:43 crc kubenswrapper[4854]: E1007 12:40:43.070833 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b897234-a3ea-40e0-a94c-0f501794c5d4" containerName="mariadb-database-create" Oct 07 12:40:43 crc kubenswrapper[4854]: I1007 12:40:43.070856 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b897234-a3ea-40e0-a94c-0f501794c5d4" containerName="mariadb-database-create" Oct 07 12:40:43 crc kubenswrapper[4854]: I1007 12:40:43.071059 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b897234-a3ea-40e0-a94c-0f501794c5d4" containerName="mariadb-database-create" Oct 07 12:40:43 crc kubenswrapper[4854]: I1007 12:40:43.071757 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-vsdfv" Oct 07 12:40:43 crc kubenswrapper[4854]: I1007 12:40:43.073867 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Oct 07 12:40:43 crc kubenswrapper[4854]: I1007 12:40:43.073889 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Oct 07 12:40:43 crc kubenswrapper[4854]: I1007 12:40:43.075544 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 07 12:40:43 crc kubenswrapper[4854]: I1007 12:40:43.085733 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-vsdfv"] Oct 07 12:40:43 crc kubenswrapper[4854]: I1007 12:40:43.114906 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8gxj\" (UniqueName: \"kubernetes.io/projected/7b897234-a3ea-40e0-a94c-0f501794c5d4-kube-api-access-t8gxj\") on node \"crc\" DevicePath \"\"" Oct 07 12:40:43 crc kubenswrapper[4854]: I1007 12:40:43.216189 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/933b6600-078b-4555-b765-7a22cf257e7c-ring-data-devices\") pod \"swift-ring-rebalance-vsdfv\" (UID: \"933b6600-078b-4555-b765-7a22cf257e7c\") " pod="openstack/swift-ring-rebalance-vsdfv" Oct 07 12:40:43 crc kubenswrapper[4854]: I1007 12:40:43.216271 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/933b6600-078b-4555-b765-7a22cf257e7c-scripts\") pod \"swift-ring-rebalance-vsdfv\" (UID: \"933b6600-078b-4555-b765-7a22cf257e7c\") " pod="openstack/swift-ring-rebalance-vsdfv" Oct 07 12:40:43 crc kubenswrapper[4854]: I1007 12:40:43.216362 4854 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ml9p7\" (UniqueName: \"kubernetes.io/projected/933b6600-078b-4555-b765-7a22cf257e7c-kube-api-access-ml9p7\") pod \"swift-ring-rebalance-vsdfv\" (UID: \"933b6600-078b-4555-b765-7a22cf257e7c\") " pod="openstack/swift-ring-rebalance-vsdfv" Oct 07 12:40:43 crc kubenswrapper[4854]: I1007 12:40:43.216466 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/933b6600-078b-4555-b765-7a22cf257e7c-combined-ca-bundle\") pod \"swift-ring-rebalance-vsdfv\" (UID: \"933b6600-078b-4555-b765-7a22cf257e7c\") " pod="openstack/swift-ring-rebalance-vsdfv" Oct 07 12:40:43 crc kubenswrapper[4854]: I1007 12:40:43.216546 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/933b6600-078b-4555-b765-7a22cf257e7c-swiftconf\") pod \"swift-ring-rebalance-vsdfv\" (UID: \"933b6600-078b-4555-b765-7a22cf257e7c\") " pod="openstack/swift-ring-rebalance-vsdfv" Oct 07 12:40:43 crc kubenswrapper[4854]: I1007 12:40:43.216593 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/933b6600-078b-4555-b765-7a22cf257e7c-dispersionconf\") pod \"swift-ring-rebalance-vsdfv\" (UID: \"933b6600-078b-4555-b765-7a22cf257e7c\") " pod="openstack/swift-ring-rebalance-vsdfv" Oct 07 12:40:43 crc kubenswrapper[4854]: I1007 12:40:43.216778 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/933b6600-078b-4555-b765-7a22cf257e7c-etc-swift\") pod \"swift-ring-rebalance-vsdfv\" (UID: \"933b6600-078b-4555-b765-7a22cf257e7c\") " pod="openstack/swift-ring-rebalance-vsdfv" Oct 07 12:40:43 crc kubenswrapper[4854]: I1007 12:40:43.317894 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/933b6600-078b-4555-b765-7a22cf257e7c-swiftconf\") pod \"swift-ring-rebalance-vsdfv\" (UID: \"933b6600-078b-4555-b765-7a22cf257e7c\") " pod="openstack/swift-ring-rebalance-vsdfv" Oct 07 12:40:43 crc kubenswrapper[4854]: I1007 12:40:43.317944 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/933b6600-078b-4555-b765-7a22cf257e7c-dispersionconf\") pod \"swift-ring-rebalance-vsdfv\" (UID: \"933b6600-078b-4555-b765-7a22cf257e7c\") " pod="openstack/swift-ring-rebalance-vsdfv" Oct 07 12:40:43 crc kubenswrapper[4854]: I1007 12:40:43.317975 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/933b6600-078b-4555-b765-7a22cf257e7c-etc-swift\") pod \"swift-ring-rebalance-vsdfv\" (UID: \"933b6600-078b-4555-b765-7a22cf257e7c\") " pod="openstack/swift-ring-rebalance-vsdfv" Oct 07 12:40:43 crc kubenswrapper[4854]: I1007 12:40:43.318018 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/933b6600-078b-4555-b765-7a22cf257e7c-ring-data-devices\") pod \"swift-ring-rebalance-vsdfv\" (UID: \"933b6600-078b-4555-b765-7a22cf257e7c\") " pod="openstack/swift-ring-rebalance-vsdfv" Oct 07 12:40:43 crc kubenswrapper[4854]: I1007 12:40:43.318046 4854 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/933b6600-078b-4555-b765-7a22cf257e7c-scripts\") pod \"swift-ring-rebalance-vsdfv\" (UID: \"933b6600-078b-4555-b765-7a22cf257e7c\") " pod="openstack/swift-ring-rebalance-vsdfv" Oct 07 12:40:43 crc kubenswrapper[4854]: I1007 12:40:43.318087 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ml9p7\" (UniqueName: \"kubernetes.io/projected/933b6600-078b-4555-b765-7a22cf257e7c-kube-api-access-ml9p7\") pod \"swift-ring-rebalance-vsdfv\" (UID: \"933b6600-078b-4555-b765-7a22cf257e7c\") " pod="openstack/swift-ring-rebalance-vsdfv" Oct 07 12:40:43 crc kubenswrapper[4854]: I1007 12:40:43.318119 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/933b6600-078b-4555-b765-7a22cf257e7c-combined-ca-bundle\") pod \"swift-ring-rebalance-vsdfv\" (UID: \"933b6600-078b-4555-b765-7a22cf257e7c\") " pod="openstack/swift-ring-rebalance-vsdfv" Oct 07 12:40:43 crc kubenswrapper[4854]: I1007 12:40:43.318969 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/933b6600-078b-4555-b765-7a22cf257e7c-etc-swift\") pod \"swift-ring-rebalance-vsdfv\" (UID: \"933b6600-078b-4555-b765-7a22cf257e7c\") " pod="openstack/swift-ring-rebalance-vsdfv" Oct 07 12:40:43 crc kubenswrapper[4854]: I1007 12:40:43.319159 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/933b6600-078b-4555-b765-7a22cf257e7c-ring-data-devices\") pod \"swift-ring-rebalance-vsdfv\" (UID: \"933b6600-078b-4555-b765-7a22cf257e7c\") " pod="openstack/swift-ring-rebalance-vsdfv" Oct 07 12:40:43 crc kubenswrapper[4854]: I1007 12:40:43.319188 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/933b6600-078b-4555-b765-7a22cf257e7c-scripts\") pod \"swift-ring-rebalance-vsdfv\" (UID: \"933b6600-078b-4555-b765-7a22cf257e7c\") " pod="openstack/swift-ring-rebalance-vsdfv" Oct 07 12:40:43 crc kubenswrapper[4854]: I1007 12:40:43.319191 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-pwbvk" Oct 07 12:40:43 crc kubenswrapper[4854]: I1007 12:40:43.321250 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/933b6600-078b-4555-b765-7a22cf257e7c-swiftconf\") pod \"swift-ring-rebalance-vsdfv\" (UID: \"933b6600-078b-4555-b765-7a22cf257e7c\") " pod="openstack/swift-ring-rebalance-vsdfv" Oct 07 12:40:43 crc kubenswrapper[4854]: I1007 12:40:43.322283 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/933b6600-078b-4555-b765-7a22cf257e7c-dispersionconf\") pod \"swift-ring-rebalance-vsdfv\" (UID: \"933b6600-078b-4555-b765-7a22cf257e7c\") " pod="openstack/swift-ring-rebalance-vsdfv" Oct 07 12:40:43 crc kubenswrapper[4854]: I1007 12:40:43.322703 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/933b6600-078b-4555-b765-7a22cf257e7c-combined-ca-bundle\") pod \"swift-ring-rebalance-vsdfv\" (UID: \"933b6600-078b-4555-b765-7a22cf257e7c\") " pod="openstack/swift-ring-rebalance-vsdfv" Oct 07 12:40:43 crc kubenswrapper[4854]: I1007 
12:40:43.337741 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ml9p7\" (UniqueName: \"kubernetes.io/projected/933b6600-078b-4555-b765-7a22cf257e7c-kube-api-access-ml9p7\") pod \"swift-ring-rebalance-vsdfv\" (UID: \"933b6600-078b-4555-b765-7a22cf257e7c\") " pod="openstack/swift-ring-rebalance-vsdfv" Oct 07 12:40:43 crc kubenswrapper[4854]: I1007 12:40:43.370421 4854 generic.go:334] "Generic (PLEG): container finished" podID="a039d6d1-21b4-480b-b8ed-c50693487fba" containerID="83587e7eabfb609f59037051b59e8c8d9fc3ca00cdcbc65b9336aaea9034b914" exitCode=0 Oct 07 12:40:43 crc kubenswrapper[4854]: I1007 12:40:43.370495 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-g5xmp" event={"ID":"a039d6d1-21b4-480b-b8ed-c50693487fba","Type":"ContainerDied","Data":"83587e7eabfb609f59037051b59e8c8d9fc3ca00cdcbc65b9336aaea9034b914"} Oct 07 12:40:43 crc kubenswrapper[4854]: I1007 12:40:43.370518 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-g5xmp" event={"ID":"a039d6d1-21b4-480b-b8ed-c50693487fba","Type":"ContainerStarted","Data":"d139652ed59936e19247e2048e5ebbc0241fd32a04ffab193c1963c3e5f1fbd5"} Oct 07 12:40:43 crc kubenswrapper[4854]: I1007 12:40:43.372126 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-h4dv5" event={"ID":"7b897234-a3ea-40e0-a94c-0f501794c5d4","Type":"ContainerDied","Data":"92e49287032c73bed576ea64656071629f8f69df9b923c5eadc09fde0cf5e2f8"} Oct 07 12:40:43 crc kubenswrapper[4854]: I1007 12:40:43.372233 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92e49287032c73bed576ea64656071629f8f69df9b923c5eadc09fde0cf5e2f8" Oct 07 12:40:43 crc kubenswrapper[4854]: I1007 12:40:43.372200 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-h4dv5" Oct 07 12:40:43 crc kubenswrapper[4854]: I1007 12:40:43.419231 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-vsdfv" Oct 07 12:40:43 crc kubenswrapper[4854]: I1007 12:40:43.521186 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6f9410d0-f08a-4288-901b-8c28b54f6d53-etc-swift\") pod \"swift-storage-0\" (UID: \"6f9410d0-f08a-4288-901b-8c28b54f6d53\") " pod="openstack/swift-storage-0" Oct 07 12:40:43 crc kubenswrapper[4854]: E1007 12:40:43.521938 4854 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 07 12:40:43 crc kubenswrapper[4854]: E1007 12:40:43.521959 4854 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 07 12:40:43 crc kubenswrapper[4854]: E1007 12:40:43.521995 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6f9410d0-f08a-4288-901b-8c28b54f6d53-etc-swift podName:6f9410d0-f08a-4288-901b-8c28b54f6d53 nodeName:}" failed. No retries permitted until 2025-10-07 12:40:44.521982758 +0000 UTC m=+960.509815013 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6f9410d0-f08a-4288-901b-8c28b54f6d53-etc-swift") pod "swift-storage-0" (UID: "6f9410d0-f08a-4288-901b-8c28b54f6d53") : configmap "swift-ring-files" not found Oct 07 12:40:43 crc kubenswrapper[4854]: I1007 12:40:43.767613 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-m7b9h" Oct 07 12:40:43 crc kubenswrapper[4854]: I1007 12:40:43.844382 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-4q2wn" Oct 07 12:40:43 crc kubenswrapper[4854]: I1007 12:40:43.929396 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g9jm\" (UniqueName: \"kubernetes.io/projected/7e143666-1ff7-48c5-b28b-42fc66cd56af-kube-api-access-6g9jm\") pod \"7e143666-1ff7-48c5-b28b-42fc66cd56af\" (UID: \"7e143666-1ff7-48c5-b28b-42fc66cd56af\") " Oct 07 12:40:43 crc kubenswrapper[4854]: I1007 12:40:43.945219 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e143666-1ff7-48c5-b28b-42fc66cd56af-kube-api-access-6g9jm" (OuterVolumeSpecName: "kube-api-access-6g9jm") pod "7e143666-1ff7-48c5-b28b-42fc66cd56af" (UID: "7e143666-1ff7-48c5-b28b-42fc66cd56af"). InnerVolumeSpecName "kube-api-access-6g9jm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:40:43 crc kubenswrapper[4854]: I1007 12:40:43.962212 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-vsdfv"] Oct 07 12:40:43 crc kubenswrapper[4854]: W1007 12:40:43.964870 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod933b6600_078b_4555_b765_7a22cf257e7c.slice/crio-2c8e83dae9ead1da6d14e058dc73d40f5cf786a2d38842459c7d9de1d0f5fcf3 WatchSource:0}: Error finding container 2c8e83dae9ead1da6d14e058dc73d40f5cf786a2d38842459c7d9de1d0f5fcf3: Status 404 returned error can't find the container with id 2c8e83dae9ead1da6d14e058dc73d40f5cf786a2d38842459c7d9de1d0f5fcf3 Oct 07 12:40:44 crc kubenswrapper[4854]: I1007 12:40:44.031696 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pr27x\" (UniqueName: \"kubernetes.io/projected/278d1b2b-3c52-4899-9432-1c1ca85a0f6d-kube-api-access-pr27x\") pod \"278d1b2b-3c52-4899-9432-1c1ca85a0f6d\" (UID: \"278d1b2b-3c52-4899-9432-1c1ca85a0f6d\") " Oct 07 12:40:44 crc kubenswrapper[4854]: I1007 12:40:44.032139 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g9jm\" (UniqueName: \"kubernetes.io/projected/7e143666-1ff7-48c5-b28b-42fc66cd56af-kube-api-access-6g9jm\") on node \"crc\" DevicePath \"\"" Oct 07 12:40:44 crc kubenswrapper[4854]: I1007 12:40:44.034827 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/278d1b2b-3c52-4899-9432-1c1ca85a0f6d-kube-api-access-pr27x" (OuterVolumeSpecName: "kube-api-access-pr27x") pod "278d1b2b-3c52-4899-9432-1c1ca85a0f6d" (UID: "278d1b2b-3c52-4899-9432-1c1ca85a0f6d"). InnerVolumeSpecName "kube-api-access-pr27x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:40:44 crc kubenswrapper[4854]: I1007 12:40:44.133800 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pr27x\" (UniqueName: \"kubernetes.io/projected/278d1b2b-3c52-4899-9432-1c1ca85a0f6d-kube-api-access-pr27x\") on node \"crc\" DevicePath \"\"" Oct 07 12:40:44 crc kubenswrapper[4854]: I1007 12:40:44.383685 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-m7b9h" event={"ID":"7e143666-1ff7-48c5-b28b-42fc66cd56af","Type":"ContainerDied","Data":"ec6b80b6699b2330758827605d30e605f0f6c87ed71f0daf07cf15eac142d148"} Oct 07 12:40:44 crc kubenswrapper[4854]: I1007 12:40:44.384507 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec6b80b6699b2330758827605d30e605f0f6c87ed71f0daf07cf15eac142d148" Oct 07 12:40:44 crc kubenswrapper[4854]: I1007 12:40:44.384593 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-m7b9h" Oct 07 12:40:44 crc kubenswrapper[4854]: I1007 12:40:44.385911 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-g5xmp" event={"ID":"a039d6d1-21b4-480b-b8ed-c50693487fba","Type":"ContainerStarted","Data":"59e333b744eb6889433b1f20e4487a8bd3bebd36f78607ef3df410ee233120f6"} Oct 07 12:40:44 crc kubenswrapper[4854]: I1007 12:40:44.386031 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-g5xmp" Oct 07 12:40:44 crc kubenswrapper[4854]: I1007 12:40:44.387024 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-vsdfv" event={"ID":"933b6600-078b-4555-b765-7a22cf257e7c","Type":"ContainerStarted","Data":"2c8e83dae9ead1da6d14e058dc73d40f5cf786a2d38842459c7d9de1d0f5fcf3"} Oct 07 12:40:44 crc kubenswrapper[4854]: I1007 12:40:44.388265 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-4q2wn" event={"ID":"278d1b2b-3c52-4899-9432-1c1ca85a0f6d","Type":"ContainerDied","Data":"0c482fe207b4ddfd69413e2a309d5aebd0cd9fdb9e27ab6dbe4a00b847fa1c4c"} Oct 07 12:40:44 crc kubenswrapper[4854]: I1007 12:40:44.388298 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c482fe207b4ddfd69413e2a309d5aebd0cd9fdb9e27ab6dbe4a00b847fa1c4c" Oct 07 12:40:44 crc kubenswrapper[4854]: I1007 12:40:44.388321 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-4q2wn" Oct 07 12:40:44 crc kubenswrapper[4854]: I1007 12:40:44.439701 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-g5xmp" podStartSLOduration=3.439017894 podStartE2EDuration="3.439017894s" podCreationTimestamp="2025-10-07 12:40:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:40:44.409312009 +0000 UTC m=+960.397144274" watchObservedRunningTime="2025-10-07 12:40:44.439017894 +0000 UTC m=+960.426850149" Oct 07 12:40:44 crc kubenswrapper[4854]: I1007 12:40:44.542043 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6f9410d0-f08a-4288-901b-8c28b54f6d53-etc-swift\") pod \"swift-storage-0\" (UID: \"6f9410d0-f08a-4288-901b-8c28b54f6d53\") " pod="openstack/swift-storage-0" Oct 07 12:40:44 crc kubenswrapper[4854]: E1007 12:40:44.542260 4854 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 07 12:40:44 crc kubenswrapper[4854]: E1007 12:40:44.542432 4854 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 07 12:40:44 crc kubenswrapper[4854]: E1007 12:40:44.542484 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6f9410d0-f08a-4288-901b-8c28b54f6d53-etc-swift podName:6f9410d0-f08a-4288-901b-8c28b54f6d53 nodeName:}" failed. No retries permitted until 2025-10-07 12:40:46.54246629 +0000 UTC m=+962.530298545 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6f9410d0-f08a-4288-901b-8c28b54f6d53-etc-swift") pod "swift-storage-0" (UID: "6f9410d0-f08a-4288-901b-8c28b54f6d53") : configmap "swift-ring-files" not found Oct 07 12:40:46 crc kubenswrapper[4854]: I1007 12:40:46.578313 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6f9410d0-f08a-4288-901b-8c28b54f6d53-etc-swift\") pod \"swift-storage-0\" (UID: \"6f9410d0-f08a-4288-901b-8c28b54f6d53\") " pod="openstack/swift-storage-0" Oct 07 12:40:46 crc kubenswrapper[4854]: E1007 12:40:46.578531 4854 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 07 12:40:46 crc kubenswrapper[4854]: E1007 12:40:46.578766 4854 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 07 12:40:46 crc kubenswrapper[4854]: E1007 12:40:46.578835 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6f9410d0-f08a-4288-901b-8c28b54f6d53-etc-swift podName:6f9410d0-f08a-4288-901b-8c28b54f6d53 nodeName:}" failed. No retries permitted until 2025-10-07 12:40:50.57881365 +0000 UTC m=+966.566645915 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6f9410d0-f08a-4288-901b-8c28b54f6d53-etc-swift") pod "swift-storage-0" (UID: "6f9410d0-f08a-4288-901b-8c28b54f6d53") : configmap "swift-ring-files" not found Oct 07 12:40:47 crc kubenswrapper[4854]: I1007 12:40:47.412703 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-vsdfv" event={"ID":"933b6600-078b-4555-b765-7a22cf257e7c","Type":"ContainerStarted","Data":"692cc3bb8cbbd18ce414ae4744d581b16427eea3290c632474a8a767fdbd9811"} Oct 07 12:40:48 crc kubenswrapper[4854]: I1007 12:40:48.964116 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Oct 07 12:40:48 crc kubenswrapper[4854]: I1007 12:40:48.988076 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-vsdfv" podStartSLOduration=2.948320411 podStartE2EDuration="5.988052331s" podCreationTimestamp="2025-10-07 12:40:43 +0000 UTC" firstStartedPulling="2025-10-07 12:40:43.96690965 +0000 UTC m=+959.954741905" lastFinishedPulling="2025-10-07 12:40:47.00664157 +0000 UTC m=+962.994473825" observedRunningTime="2025-10-07 12:40:47.438540878 +0000 UTC m=+963.426373213" watchObservedRunningTime="2025-10-07 12:40:48.988052331 +0000 UTC m=+964.975884586" Oct 07 12:40:49 crc kubenswrapper[4854]: I1007 12:40:49.196867 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-242e-account-create-7jrdt"] Oct 07 12:40:49 crc kubenswrapper[4854]: E1007 12:40:49.197253 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e143666-1ff7-48c5-b28b-42fc66cd56af" containerName="mariadb-database-create" Oct 07 12:40:49 crc kubenswrapper[4854]: I1007 12:40:49.197273 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e143666-1ff7-48c5-b28b-42fc66cd56af" containerName="mariadb-database-create" Oct 07 12:40:49 crc kubenswrapper[4854]: E1007 12:40:49.197287 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="278d1b2b-3c52-4899-9432-1c1ca85a0f6d" containerName="mariadb-database-create" Oct 07 12:40:49 crc kubenswrapper[4854]: I1007 12:40:49.197294 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="278d1b2b-3c52-4899-9432-1c1ca85a0f6d" containerName="mariadb-database-create" Oct 07 12:40:49 crc kubenswrapper[4854]: I1007 12:40:49.197459 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e143666-1ff7-48c5-b28b-42fc66cd56af" containerName="mariadb-database-create" Oct 07 12:40:49 crc kubenswrapper[4854]: I1007 12:40:49.197475 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="278d1b2b-3c52-4899-9432-1c1ca85a0f6d" containerName="mariadb-database-create" Oct 07 12:40:49 crc kubenswrapper[4854]: I1007 12:40:49.197971 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-242e-account-create-7jrdt" Oct 07 12:40:49 crc kubenswrapper[4854]: I1007 12:40:49.205922 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Oct 07 12:40:49 crc kubenswrapper[4854]: I1007 12:40:49.206343 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-242e-account-create-7jrdt"] Oct 07 12:40:49 crc kubenswrapper[4854]: I1007 12:40:49.347836 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8c2dt\" (UniqueName: \"kubernetes.io/projected/db6903f7-f211-44ad-a2d6-cc7b92c1c477-kube-api-access-8c2dt\") pod \"keystone-242e-account-create-7jrdt\" (UID: \"db6903f7-f211-44ad-a2d6-cc7b92c1c477\") " pod="openstack/keystone-242e-account-create-7jrdt" Oct 07 12:40:49 crc kubenswrapper[4854]: I1007 12:40:49.449771 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8c2dt\" (UniqueName: \"kubernetes.io/projected/db6903f7-f211-44ad-a2d6-cc7b92c1c477-kube-api-access-8c2dt\") pod \"keystone-242e-account-create-7jrdt\" (UID: \"db6903f7-f211-44ad-a2d6-cc7b92c1c477\") " pod="openstack/keystone-242e-account-create-7jrdt" Oct 07 12:40:49 crc kubenswrapper[4854]: I1007 12:40:49.480857 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8c2dt\" (UniqueName: \"kubernetes.io/projected/db6903f7-f211-44ad-a2d6-cc7b92c1c477-kube-api-access-8c2dt\") pod \"keystone-242e-account-create-7jrdt\" (UID: \"db6903f7-f211-44ad-a2d6-cc7b92c1c477\") " pod="openstack/keystone-242e-account-create-7jrdt" Oct 07 12:40:49 crc kubenswrapper[4854]: I1007 12:40:49.517474 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-242e-account-create-7jrdt" Oct 07 12:40:49 crc kubenswrapper[4854]: I1007 12:40:49.800655 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-3267-account-create-lnz96"] Oct 07 12:40:49 crc kubenswrapper[4854]: I1007 12:40:49.801818 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-3267-account-create-lnz96" Oct 07 12:40:49 crc kubenswrapper[4854]: I1007 12:40:49.805646 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Oct 07 12:40:49 crc kubenswrapper[4854]: I1007 12:40:49.816103 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-3267-account-create-lnz96"] Oct 07 12:40:49 crc kubenswrapper[4854]: I1007 12:40:49.959375 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sngc8\" (UniqueName: \"kubernetes.io/projected/d71ee2f2-10ff-48f3-8eed-e8eab883ad22-kube-api-access-sngc8\") pod \"glance-3267-account-create-lnz96\" (UID: \"d71ee2f2-10ff-48f3-8eed-e8eab883ad22\") " pod="openstack/glance-3267-account-create-lnz96" Oct 07 12:40:49 crc kubenswrapper[4854]: I1007 12:40:49.993456 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-242e-account-create-7jrdt"] Oct 07 12:40:50 crc kubenswrapper[4854]: I1007 12:40:50.061180 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sngc8\" (UniqueName: \"kubernetes.io/projected/d71ee2f2-10ff-48f3-8eed-e8eab883ad22-kube-api-access-sngc8\") pod \"glance-3267-account-create-lnz96\" (UID: \"d71ee2f2-10ff-48f3-8eed-e8eab883ad22\") " pod="openstack/glance-3267-account-create-lnz96" Oct 07 12:40:50 crc kubenswrapper[4854]: I1007 12:40:50.079821 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sngc8\" (UniqueName: \"kubernetes.io/projected/d71ee2f2-10ff-48f3-8eed-e8eab883ad22-kube-api-access-sngc8\") pod \"glance-3267-account-create-lnz96\" (UID: \"d71ee2f2-10ff-48f3-8eed-e8eab883ad22\") " pod="openstack/glance-3267-account-create-lnz96" Oct 07 12:40:50 crc kubenswrapper[4854]: I1007 12:40:50.130758 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-3267-account-create-lnz96" Oct 07 12:40:50 crc kubenswrapper[4854]: I1007 12:40:50.452743 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-242e-account-create-7jrdt" event={"ID":"db6903f7-f211-44ad-a2d6-cc7b92c1c477","Type":"ContainerStarted","Data":"f86bf46a9d38e3d5f5310e3c1c0c4e4d1de7f562c6847e508b6bccd1073c8209"} Oct 07 12:40:50 crc kubenswrapper[4854]: I1007 12:40:50.453009 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-242e-account-create-7jrdt" event={"ID":"db6903f7-f211-44ad-a2d6-cc7b92c1c477","Type":"ContainerStarted","Data":"ab254487269d5c721e991c433412b96dabb023dd5649366c28e968dc23fb5525"} Oct 07 12:40:50 crc kubenswrapper[4854]: I1007 12:40:50.470763 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-242e-account-create-7jrdt" podStartSLOduration=1.470744506 podStartE2EDuration="1.470744506s" podCreationTimestamp="2025-10-07 12:40:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:40:50.467455619 +0000 UTC m=+966.455287874" watchObservedRunningTime="2025-10-07 12:40:50.470744506 +0000 UTC m=+966.458576761" Oct 07 12:40:50 crc kubenswrapper[4854]: I1007 12:40:50.655471 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-3267-account-create-lnz96"] Oct 07 12:40:50 crc kubenswrapper[4854]: W1007 12:40:50.661196 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd71ee2f2_10ff_48f3_8eed_e8eab883ad22.slice/crio-01c43889445cfda0d8dcf30dcdf55bcf4cc48264c0bf2bf174410517b948d661 WatchSource:0}: Error finding container 01c43889445cfda0d8dcf30dcdf55bcf4cc48264c0bf2bf174410517b948d661: Status 404 returned error can't find the container with id 01c43889445cfda0d8dcf30dcdf55bcf4cc48264c0bf2bf174410517b948d661 Oct 07 12:40:50 crc kubenswrapper[4854]: I1007 12:40:50.673794 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6f9410d0-f08a-4288-901b-8c28b54f6d53-etc-swift\") pod \"swift-storage-0\" (UID: \"6f9410d0-f08a-4288-901b-8c28b54f6d53\") " pod="openstack/swift-storage-0" Oct 07 12:40:50 crc kubenswrapper[4854]: E1007 12:40:50.674055 4854 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 07 12:40:50 crc kubenswrapper[4854]: E1007 12:40:50.674098 4854 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 07 12:40:50 crc kubenswrapper[4854]: E1007 12:40:50.674197 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6f9410d0-f08a-4288-901b-8c28b54f6d53-etc-swift podName:6f9410d0-f08a-4288-901b-8c28b54f6d53 nodeName:}" failed. No retries permitted until 2025-10-07 12:40:58.674170827 +0000 UTC m=+974.662003112 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6f9410d0-f08a-4288-901b-8c28b54f6d53-etc-swift") pod "swift-storage-0" (UID: "6f9410d0-f08a-4288-901b-8c28b54f6d53") : configmap "swift-ring-files" not found Oct 07 12:40:51 crc kubenswrapper[4854]: I1007 12:40:51.466492 4854 generic.go:334] "Generic (PLEG): container finished" podID="db6903f7-f211-44ad-a2d6-cc7b92c1c477" containerID="f86bf46a9d38e3d5f5310e3c1c0c4e4d1de7f562c6847e508b6bccd1073c8209" exitCode=0 Oct 07 12:40:51 crc kubenswrapper[4854]: I1007 12:40:51.466587 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-242e-account-create-7jrdt" event={"ID":"db6903f7-f211-44ad-a2d6-cc7b92c1c477","Type":"ContainerDied","Data":"f86bf46a9d38e3d5f5310e3c1c0c4e4d1de7f562c6847e508b6bccd1073c8209"} Oct 07 12:40:51 crc kubenswrapper[4854]: I1007 12:40:51.468677 4854 generic.go:334] "Generic (PLEG): container finished" podID="d71ee2f2-10ff-48f3-8eed-e8eab883ad22" containerID="b178e7943cef5d513b5b68323ca5b565273085d4c85f96bb739f2a36aa5dbd2d" exitCode=0 Oct 07 12:40:51 crc kubenswrapper[4854]: I1007 12:40:51.468715 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3267-account-create-lnz96" event={"ID":"d71ee2f2-10ff-48f3-8eed-e8eab883ad22","Type":"ContainerDied","Data":"b178e7943cef5d513b5b68323ca5b565273085d4c85f96bb739f2a36aa5dbd2d"} Oct 07 12:40:51 crc kubenswrapper[4854]: I1007 12:40:51.468757 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3267-account-create-lnz96" event={"ID":"d71ee2f2-10ff-48f3-8eed-e8eab883ad22","Type":"ContainerStarted","Data":"01c43889445cfda0d8dcf30dcdf55bcf4cc48264c0bf2bf174410517b948d661"} Oct 07 12:40:52 crc kubenswrapper[4854]: I1007 12:40:52.084407 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-g5xmp" Oct 07 12:40:52 crc kubenswrapper[4854]: I1007 12:40:52.171620 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-pwbvk"] Oct 07 12:40:52 crc kubenswrapper[4854]: I1007 12:40:52.172040 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-pwbvk" podUID="0979bfc9-c30c-4e15-bf34-2ad6ce892212" containerName="dnsmasq-dns" containerID="cri-o://ea7a33cbb5dfdc8268be7d4b4fefe87d5f67f78082d47cf3fe1e4af92cc4870b" gracePeriod=10 Oct 07 12:40:52 crc kubenswrapper[4854]: I1007 12:40:52.480914 4854 generic.go:334] "Generic (PLEG): container finished" podID="0979bfc9-c30c-4e15-bf34-2ad6ce892212" containerID="ea7a33cbb5dfdc8268be7d4b4fefe87d5f67f78082d47cf3fe1e4af92cc4870b" exitCode=0 Oct 07 12:40:52 crc kubenswrapper[4854]: I1007 12:40:52.481181 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-pwbvk" event={"ID":"0979bfc9-c30c-4e15-bf34-2ad6ce892212","Type":"ContainerDied","Data":"ea7a33cbb5dfdc8268be7d4b4fefe87d5f67f78082d47cf3fe1e4af92cc4870b"} Oct 07 12:40:52 crc kubenswrapper[4854]: I1007 12:40:52.675314 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-pwbvk" Oct 07 12:40:52 crc kubenswrapper[4854]: I1007 12:40:52.818702 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clnl9\" (UniqueName: \"kubernetes.io/projected/0979bfc9-c30c-4e15-bf34-2ad6ce892212-kube-api-access-clnl9\") pod \"0979bfc9-c30c-4e15-bf34-2ad6ce892212\" (UID: \"0979bfc9-c30c-4e15-bf34-2ad6ce892212\") " Oct 07 12:40:52 crc kubenswrapper[4854]: I1007 12:40:52.818805 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0979bfc9-c30c-4e15-bf34-2ad6ce892212-ovsdbserver-nb\") pod \"0979bfc9-c30c-4e15-bf34-2ad6ce892212\" (UID: \"0979bfc9-c30c-4e15-bf34-2ad6ce892212\") " Oct 07 12:40:52 crc kubenswrapper[4854]: I1007 12:40:52.818916 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0979bfc9-c30c-4e15-bf34-2ad6ce892212-config\") pod \"0979bfc9-c30c-4e15-bf34-2ad6ce892212\" (UID: \"0979bfc9-c30c-4e15-bf34-2ad6ce892212\") " Oct 07 12:40:52 crc kubenswrapper[4854]: I1007 12:40:52.819069 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0979bfc9-c30c-4e15-bf34-2ad6ce892212-dns-svc\") pod \"0979bfc9-c30c-4e15-bf34-2ad6ce892212\" (UID: \"0979bfc9-c30c-4e15-bf34-2ad6ce892212\") " Oct 07 12:40:52 crc kubenswrapper[4854]: I1007 12:40:52.819101 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0979bfc9-c30c-4e15-bf34-2ad6ce892212-ovsdbserver-sb\") pod \"0979bfc9-c30c-4e15-bf34-2ad6ce892212\" (UID: \"0979bfc9-c30c-4e15-bf34-2ad6ce892212\") " Oct 07 12:40:52 crc kubenswrapper[4854]: I1007 12:40:52.828588 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0979bfc9-c30c-4e15-bf34-2ad6ce892212-kube-api-access-clnl9" (OuterVolumeSpecName: "kube-api-access-clnl9") pod "0979bfc9-c30c-4e15-bf34-2ad6ce892212" (UID: "0979bfc9-c30c-4e15-bf34-2ad6ce892212"). InnerVolumeSpecName "kube-api-access-clnl9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:40:52 crc kubenswrapper[4854]: I1007 12:40:52.861450 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0979bfc9-c30c-4e15-bf34-2ad6ce892212-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0979bfc9-c30c-4e15-bf34-2ad6ce892212" (UID: "0979bfc9-c30c-4e15-bf34-2ad6ce892212"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:40:52 crc kubenswrapper[4854]: I1007 12:40:52.873835 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0979bfc9-c30c-4e15-bf34-2ad6ce892212-config" (OuterVolumeSpecName: "config") pod "0979bfc9-c30c-4e15-bf34-2ad6ce892212" (UID: "0979bfc9-c30c-4e15-bf34-2ad6ce892212"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:40:52 crc kubenswrapper[4854]: I1007 12:40:52.878047 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0979bfc9-c30c-4e15-bf34-2ad6ce892212-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0979bfc9-c30c-4e15-bf34-2ad6ce892212" (UID: "0979bfc9-c30c-4e15-bf34-2ad6ce892212"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:40:52 crc kubenswrapper[4854]: I1007 12:40:52.888083 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0979bfc9-c30c-4e15-bf34-2ad6ce892212-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0979bfc9-c30c-4e15-bf34-2ad6ce892212" (UID: "0979bfc9-c30c-4e15-bf34-2ad6ce892212"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:40:52 crc kubenswrapper[4854]: I1007 12:40:52.904009 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3267-account-create-lnz96" Oct 07 12:40:52 crc kubenswrapper[4854]: I1007 12:40:52.907408 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-242e-account-create-7jrdt" Oct 07 12:40:52 crc kubenswrapper[4854]: I1007 12:40:52.928648 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clnl9\" (UniqueName: \"kubernetes.io/projected/0979bfc9-c30c-4e15-bf34-2ad6ce892212-kube-api-access-clnl9\") on node \"crc\" DevicePath \"\"" Oct 07 12:40:52 crc kubenswrapper[4854]: I1007 12:40:52.928685 4854 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0979bfc9-c30c-4e15-bf34-2ad6ce892212-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 12:40:52 crc kubenswrapper[4854]: I1007 12:40:52.928697 4854 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0979bfc9-c30c-4e15-bf34-2ad6ce892212-config\") on node \"crc\" DevicePath \"\"" Oct 07 12:40:52 crc kubenswrapper[4854]: I1007 12:40:52.928706 4854 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0979bfc9-c30c-4e15-bf34-2ad6ce892212-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 12:40:52 crc kubenswrapper[4854]: I1007 12:40:52.928715 4854 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0979bfc9-c30c-4e15-bf34-2ad6ce892212-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 12:40:53 crc kubenswrapper[4854]: I1007 12:40:53.029607 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8c2dt\" (UniqueName: \"kubernetes.io/projected/db6903f7-f211-44ad-a2d6-cc7b92c1c477-kube-api-access-8c2dt\") pod \"db6903f7-f211-44ad-a2d6-cc7b92c1c477\" (UID: \"db6903f7-f211-44ad-a2d6-cc7b92c1c477\") " Oct 07 12:40:53 crc kubenswrapper[4854]: I1007 12:40:53.029763 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sngc8\" (UniqueName: \"kubernetes.io/projected/d71ee2f2-10ff-48f3-8eed-e8eab883ad22-kube-api-access-sngc8\") pod \"d71ee2f2-10ff-48f3-8eed-e8eab883ad22\" (UID: \"d71ee2f2-10ff-48f3-8eed-e8eab883ad22\") " Oct 07 12:40:53 crc kubenswrapper[4854]: I1007 12:40:53.033232 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d71ee2f2-10ff-48f3-8eed-e8eab883ad22-kube-api-access-sngc8" (OuterVolumeSpecName: "kube-api-access-sngc8") pod "d71ee2f2-10ff-48f3-8eed-e8eab883ad22" (UID: "d71ee2f2-10ff-48f3-8eed-e8eab883ad22"). InnerVolumeSpecName "kube-api-access-sngc8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:40:53 crc kubenswrapper[4854]: I1007 12:40:53.034794 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db6903f7-f211-44ad-a2d6-cc7b92c1c477-kube-api-access-8c2dt" (OuterVolumeSpecName: "kube-api-access-8c2dt") pod "db6903f7-f211-44ad-a2d6-cc7b92c1c477" (UID: "db6903f7-f211-44ad-a2d6-cc7b92c1c477"). InnerVolumeSpecName "kube-api-access-8c2dt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:40:53 crc kubenswrapper[4854]: I1007 12:40:53.132367 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8c2dt\" (UniqueName: \"kubernetes.io/projected/db6903f7-f211-44ad-a2d6-cc7b92c1c477-kube-api-access-8c2dt\") on node \"crc\" DevicePath \"\"" Oct 07 12:40:53 crc kubenswrapper[4854]: I1007 12:40:53.132424 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sngc8\" (UniqueName: \"kubernetes.io/projected/d71ee2f2-10ff-48f3-8eed-e8eab883ad22-kube-api-access-sngc8\") on node \"crc\" DevicePath \"\"" Oct 07 12:40:53 crc kubenswrapper[4854]: I1007 12:40:53.509791 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3267-account-create-lnz96" event={"ID":"d71ee2f2-10ff-48f3-8eed-e8eab883ad22","Type":"ContainerDied","Data":"01c43889445cfda0d8dcf30dcdf55bcf4cc48264c0bf2bf174410517b948d661"} Oct 07 12:40:53 crc kubenswrapper[4854]: I1007 12:40:53.509873 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01c43889445cfda0d8dcf30dcdf55bcf4cc48264c0bf2bf174410517b948d661" Oct 07 12:40:53 crc kubenswrapper[4854]: I1007 12:40:53.509989 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3267-account-create-lnz96" Oct 07 12:40:53 crc kubenswrapper[4854]: I1007 12:40:53.514887 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-pwbvk" event={"ID":"0979bfc9-c30c-4e15-bf34-2ad6ce892212","Type":"ContainerDied","Data":"a84721e217372c2459d84361cb6466b4222cc0d76f6078d142b44c63b82eee10"} Oct 07 12:40:53 crc kubenswrapper[4854]: I1007 12:40:53.514970 4854 scope.go:117] "RemoveContainer" containerID="ea7a33cbb5dfdc8268be7d4b4fefe87d5f67f78082d47cf3fe1e4af92cc4870b" Oct 07 12:40:53 crc kubenswrapper[4854]: I1007 12:40:53.515208 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-pwbvk" Oct 07 12:40:53 crc kubenswrapper[4854]: I1007 12:40:53.520366 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-242e-account-create-7jrdt" event={"ID":"db6903f7-f211-44ad-a2d6-cc7b92c1c477","Type":"ContainerDied","Data":"ab254487269d5c721e991c433412b96dabb023dd5649366c28e968dc23fb5525"} Oct 07 12:40:53 crc kubenswrapper[4854]: I1007 12:40:53.520431 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab254487269d5c721e991c433412b96dabb023dd5649366c28e968dc23fb5525" Oct 07 12:40:53 crc kubenswrapper[4854]: I1007 12:40:53.520468 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-242e-account-create-7jrdt" Oct 07 12:40:53 crc kubenswrapper[4854]: I1007 12:40:53.563909 4854 scope.go:117] "RemoveContainer" containerID="1b89918dd94baf47b1069784c6b60227f3f8a4acf69a2fd159423cad935abdc6" Oct 07 12:40:53 crc kubenswrapper[4854]: I1007 12:40:53.583431 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-pwbvk"] Oct 07 12:40:53 crc kubenswrapper[4854]: I1007 12:40:53.592085 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-pwbvk"] Oct 07 12:40:54 crc kubenswrapper[4854]: I1007 12:40:54.537785 4854 generic.go:334] "Generic (PLEG): container finished" podID="933b6600-078b-4555-b765-7a22cf257e7c" containerID="692cc3bb8cbbd18ce414ae4744d581b16427eea3290c632474a8a767fdbd9811" exitCode=0 Oct 07 12:40:54 crc kubenswrapper[4854]: I1007 12:40:54.537838 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-vsdfv" event={"ID":"933b6600-078b-4555-b765-7a22cf257e7c","Type":"ContainerDied","Data":"692cc3bb8cbbd18ce414ae4744d581b16427eea3290c632474a8a767fdbd9811"} Oct 07 12:40:54 crc kubenswrapper[4854]: I1007 12:40:54.716628 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0979bfc9-c30c-4e15-bf34-2ad6ce892212" path="/var/lib/kubelet/pods/0979bfc9-c30c-4e15-bf34-2ad6ce892212/volumes" Oct 07 12:40:54 crc kubenswrapper[4854]: I1007 12:40:54.931480 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-4jdkf"] Oct 07 12:40:54 crc kubenswrapper[4854]: E1007 12:40:54.931959 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0979bfc9-c30c-4e15-bf34-2ad6ce892212" containerName="init" Oct 07 12:40:54 crc kubenswrapper[4854]: I1007 12:40:54.931985 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="0979bfc9-c30c-4e15-bf34-2ad6ce892212" containerName="init" Oct 07 12:40:54 crc kubenswrapper[4854]: E1007 12:40:54.932021 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db6903f7-f211-44ad-a2d6-cc7b92c1c477" containerName="mariadb-account-create" Oct 07 12:40:54 crc kubenswrapper[4854]: I1007 12:40:54.932032 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="db6903f7-f211-44ad-a2d6-cc7b92c1c477" containerName="mariadb-account-create" Oct 07 12:40:54 crc kubenswrapper[4854]: E1007 12:40:54.932051 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d71ee2f2-10ff-48f3-8eed-e8eab883ad22" containerName="mariadb-account-create" Oct 07 12:40:54 crc kubenswrapper[4854]: I1007 12:40:54.932061 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="d71ee2f2-10ff-48f3-8eed-e8eab883ad22" containerName="mariadb-account-create" Oct 07 12:40:54 crc kubenswrapper[4854]: E1007 12:40:54.932103 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0979bfc9-c30c-4e15-bf34-2ad6ce892212" containerName="dnsmasq-dns" Oct 07 12:40:54 crc kubenswrapper[4854]: I1007 12:40:54.932115 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="0979bfc9-c30c-4e15-bf34-2ad6ce892212" containerName="dnsmasq-dns" Oct 07 12:40:54 crc kubenswrapper[4854]: I1007 12:40:54.932384 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="db6903f7-f211-44ad-a2d6-cc7b92c1c477" containerName="mariadb-account-create" Oct 07 12:40:54 crc kubenswrapper[4854]: I1007 12:40:54.932411 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="d71ee2f2-10ff-48f3-8eed-e8eab883ad22" 
containerName="mariadb-account-create" Oct 07 12:40:54 crc kubenswrapper[4854]: I1007 12:40:54.932428 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="0979bfc9-c30c-4e15-bf34-2ad6ce892212" containerName="dnsmasq-dns" Oct 07 12:40:54 crc kubenswrapper[4854]: I1007 12:40:54.933334 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-4jdkf" Oct 07 12:40:54 crc kubenswrapper[4854]: I1007 12:40:54.937820 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Oct 07 12:40:54 crc kubenswrapper[4854]: I1007 12:40:54.938188 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-rplkj" Oct 07 12:40:54 crc kubenswrapper[4854]: I1007 12:40:54.941676 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-4jdkf"] Oct 07 12:40:54 crc kubenswrapper[4854]: I1007 12:40:54.971922 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zkcz\" (UniqueName: \"kubernetes.io/projected/245510a3-00ce-4faa-9a0e-4e06482e8a0e-kube-api-access-4zkcz\") pod \"glance-db-sync-4jdkf\" (UID: \"245510a3-00ce-4faa-9a0e-4e06482e8a0e\") " pod="openstack/glance-db-sync-4jdkf" Oct 07 12:40:54 crc kubenswrapper[4854]: I1007 12:40:54.972004 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/245510a3-00ce-4faa-9a0e-4e06482e8a0e-config-data\") pod \"glance-db-sync-4jdkf\" (UID: \"245510a3-00ce-4faa-9a0e-4e06482e8a0e\") " pod="openstack/glance-db-sync-4jdkf" Oct 07 12:40:54 crc kubenswrapper[4854]: I1007 12:40:54.972389 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/245510a3-00ce-4faa-9a0e-4e06482e8a0e-db-sync-config-data\") pod \"glance-db-sync-4jdkf\" (UID: \"245510a3-00ce-4faa-9a0e-4e06482e8a0e\") " pod="openstack/glance-db-sync-4jdkf" Oct 07 12:40:54 crc kubenswrapper[4854]: I1007 12:40:54.972531 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/245510a3-00ce-4faa-9a0e-4e06482e8a0e-combined-ca-bundle\") pod \"glance-db-sync-4jdkf\" (UID: \"245510a3-00ce-4faa-9a0e-4e06482e8a0e\") " pod="openstack/glance-db-sync-4jdkf" Oct 07 12:40:55 crc kubenswrapper[4854]: I1007 12:40:55.073655 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/245510a3-00ce-4faa-9a0e-4e06482e8a0e-db-sync-config-data\") pod \"glance-db-sync-4jdkf\" (UID: \"245510a3-00ce-4faa-9a0e-4e06482e8a0e\") " pod="openstack/glance-db-sync-4jdkf" Oct 07 12:40:55 crc kubenswrapper[4854]: I1007 12:40:55.073759 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/245510a3-00ce-4faa-9a0e-4e06482e8a0e-combined-ca-bundle\") pod \"glance-db-sync-4jdkf\" (UID: \"245510a3-00ce-4faa-9a0e-4e06482e8a0e\") " pod="openstack/glance-db-sync-4jdkf" Oct 07 12:40:55 crc kubenswrapper[4854]: I1007 12:40:55.073862 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zkcz\" (UniqueName: \"kubernetes.io/projected/245510a3-00ce-4faa-9a0e-4e06482e8a0e-kube-api-access-4zkcz\") pod \"glance-db-sync-4jdkf\" 
(UID: \"245510a3-00ce-4faa-9a0e-4e06482e8a0e\") " pod="openstack/glance-db-sync-4jdkf" Oct 07 12:40:55 crc kubenswrapper[4854]: I1007 12:40:55.073906 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/245510a3-00ce-4faa-9a0e-4e06482e8a0e-config-data\") pod \"glance-db-sync-4jdkf\" (UID: \"245510a3-00ce-4faa-9a0e-4e06482e8a0e\") " pod="openstack/glance-db-sync-4jdkf" Oct 07 12:40:55 crc kubenswrapper[4854]: I1007 12:40:55.079187 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/245510a3-00ce-4faa-9a0e-4e06482e8a0e-db-sync-config-data\") pod \"glance-db-sync-4jdkf\" (UID: \"245510a3-00ce-4faa-9a0e-4e06482e8a0e\") " pod="openstack/glance-db-sync-4jdkf" Oct 07 12:40:55 crc kubenswrapper[4854]: I1007 12:40:55.079303 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/245510a3-00ce-4faa-9a0e-4e06482e8a0e-combined-ca-bundle\") pod \"glance-db-sync-4jdkf\" (UID: \"245510a3-00ce-4faa-9a0e-4e06482e8a0e\") " pod="openstack/glance-db-sync-4jdkf" Oct 07 12:40:55 crc kubenswrapper[4854]: I1007 12:40:55.093204 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/245510a3-00ce-4faa-9a0e-4e06482e8a0e-config-data\") pod \"glance-db-sync-4jdkf\" (UID: \"245510a3-00ce-4faa-9a0e-4e06482e8a0e\") " pod="openstack/glance-db-sync-4jdkf" Oct 07 12:40:55 crc kubenswrapper[4854]: I1007 12:40:55.098270 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zkcz\" (UniqueName: \"kubernetes.io/projected/245510a3-00ce-4faa-9a0e-4e06482e8a0e-kube-api-access-4zkcz\") pod \"glance-db-sync-4jdkf\" (UID: \"245510a3-00ce-4faa-9a0e-4e06482e8a0e\") " pod="openstack/glance-db-sync-4jdkf" Oct 07 12:40:55 crc kubenswrapper[4854]: I1007 12:40:55.250751 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-4jdkf" Oct 07 12:40:55 crc kubenswrapper[4854]: I1007 12:40:55.796189 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-4jdkf"] Oct 07 12:40:55 crc kubenswrapper[4854]: W1007 12:40:55.799667 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod245510a3_00ce_4faa_9a0e_4e06482e8a0e.slice/crio-dc6d5360754a6227b2280de78bf2815026f2f5c533a62e585c0e350666bfd1de WatchSource:0}: Error finding container dc6d5360754a6227b2280de78bf2815026f2f5c533a62e585c0e350666bfd1de: Status 404 returned error can't find the container with id dc6d5360754a6227b2280de78bf2815026f2f5c533a62e585c0e350666bfd1de Oct 07 12:40:55 crc kubenswrapper[4854]: I1007 12:40:55.858831 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-vsdfv" Oct 07 12:40:55 crc kubenswrapper[4854]: I1007 12:40:55.988118 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/933b6600-078b-4555-b765-7a22cf257e7c-combined-ca-bundle\") pod \"933b6600-078b-4555-b765-7a22cf257e7c\" (UID: \"933b6600-078b-4555-b765-7a22cf257e7c\") " Oct 07 12:40:55 crc kubenswrapper[4854]: I1007 12:40:55.988644 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/933b6600-078b-4555-b765-7a22cf257e7c-ring-data-devices\") pod \"933b6600-078b-4555-b765-7a22cf257e7c\" (UID: \"933b6600-078b-4555-b765-7a22cf257e7c\") " Oct 07 12:40:55 crc kubenswrapper[4854]: I1007 12:40:55.988750 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/933b6600-078b-4555-b765-7a22cf257e7c-swiftconf\") pod \"933b6600-078b-4555-b765-7a22cf257e7c\" (UID: \"933b6600-078b-4555-b765-7a22cf257e7c\") " Oct 07 12:40:55 crc kubenswrapper[4854]: I1007 12:40:55.988837 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/933b6600-078b-4555-b765-7a22cf257e7c-etc-swift\") pod \"933b6600-078b-4555-b765-7a22cf257e7c\" (UID: \"933b6600-078b-4555-b765-7a22cf257e7c\") " Oct 07 12:40:55 crc kubenswrapper[4854]: I1007 12:40:55.988884 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/933b6600-078b-4555-b765-7a22cf257e7c-scripts\") pod \"933b6600-078b-4555-b765-7a22cf257e7c\" (UID: \"933b6600-078b-4555-b765-7a22cf257e7c\") " Oct 07 12:40:55 crc kubenswrapper[4854]: I1007 12:40:55.989053 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ml9p7\" (UniqueName: \"kubernetes.io/projected/933b6600-078b-4555-b765-7a22cf257e7c-kube-api-access-ml9p7\") pod \"933b6600-078b-4555-b765-7a22cf257e7c\" (UID: \"933b6600-078b-4555-b765-7a22cf257e7c\") " Oct 07 12:40:55 crc kubenswrapper[4854]: I1007 12:40:55.989216 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/933b6600-078b-4555-b765-7a22cf257e7c-dispersionconf\") pod \"933b6600-078b-4555-b765-7a22cf257e7c\" (UID: \"933b6600-078b-4555-b765-7a22cf257e7c\") " Oct 07 12:40:55 crc kubenswrapper[4854]: I1007 12:40:55.990974 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/933b6600-078b-4555-b765-7a22cf257e7c-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "933b6600-078b-4555-b765-7a22cf257e7c" (UID: "933b6600-078b-4555-b765-7a22cf257e7c"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:40:55 crc kubenswrapper[4854]: I1007 12:40:55.991430 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/933b6600-078b-4555-b765-7a22cf257e7c-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "933b6600-078b-4555-b765-7a22cf257e7c" (UID: "933b6600-078b-4555-b765-7a22cf257e7c"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:40:55 crc kubenswrapper[4854]: I1007 12:40:55.993343 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/933b6600-078b-4555-b765-7a22cf257e7c-kube-api-access-ml9p7" (OuterVolumeSpecName: "kube-api-access-ml9p7") pod "933b6600-078b-4555-b765-7a22cf257e7c" (UID: "933b6600-078b-4555-b765-7a22cf257e7c"). InnerVolumeSpecName "kube-api-access-ml9p7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:40:55 crc kubenswrapper[4854]: I1007 12:40:55.995721 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/933b6600-078b-4555-b765-7a22cf257e7c-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "933b6600-078b-4555-b765-7a22cf257e7c" (UID: "933b6600-078b-4555-b765-7a22cf257e7c"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:40:56 crc kubenswrapper[4854]: I1007 12:40:56.015002 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/933b6600-078b-4555-b765-7a22cf257e7c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "933b6600-078b-4555-b765-7a22cf257e7c" (UID: "933b6600-078b-4555-b765-7a22cf257e7c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:40:56 crc kubenswrapper[4854]: I1007 12:40:56.016535 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/933b6600-078b-4555-b765-7a22cf257e7c-scripts" (OuterVolumeSpecName: "scripts") pod "933b6600-078b-4555-b765-7a22cf257e7c" (UID: "933b6600-078b-4555-b765-7a22cf257e7c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:40:56 crc kubenswrapper[4854]: I1007 12:40:56.019469 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/933b6600-078b-4555-b765-7a22cf257e7c-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "933b6600-078b-4555-b765-7a22cf257e7c" (UID: "933b6600-078b-4555-b765-7a22cf257e7c"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:40:56 crc kubenswrapper[4854]: I1007 12:40:56.091944 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/933b6600-078b-4555-b765-7a22cf257e7c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:40:56 crc kubenswrapper[4854]: I1007 12:40:56.091975 4854 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/933b6600-078b-4555-b765-7a22cf257e7c-ring-data-devices\") on node \"crc\" DevicePath \"\"" Oct 07 12:40:56 crc kubenswrapper[4854]: I1007 12:40:56.091987 4854 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/933b6600-078b-4555-b765-7a22cf257e7c-swiftconf\") on node \"crc\" DevicePath \"\"" Oct 07 12:40:56 crc kubenswrapper[4854]: I1007 12:40:56.091995 4854 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/933b6600-078b-4555-b765-7a22cf257e7c-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 07 12:40:56 crc kubenswrapper[4854]: I1007 12:40:56.092005 4854 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/933b6600-078b-4555-b765-7a22cf257e7c-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 12:40:56 crc kubenswrapper[4854]: I1007 12:40:56.092015 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ml9p7\" (UniqueName: \"kubernetes.io/projected/933b6600-078b-4555-b765-7a22cf257e7c-kube-api-access-ml9p7\") on node \"crc\" DevicePath \"\"" Oct 07 12:40:56 crc kubenswrapper[4854]: I1007 12:40:56.092024 4854 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/933b6600-078b-4555-b765-7a22cf257e7c-dispersionconf\") on node \"crc\" DevicePath \"\"" Oct 07 12:40:56 crc kubenswrapper[4854]: I1007 12:40:56.569039 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-4jdkf" event={"ID":"245510a3-00ce-4faa-9a0e-4e06482e8a0e","Type":"ContainerStarted","Data":"dc6d5360754a6227b2280de78bf2815026f2f5c533a62e585c0e350666bfd1de"} Oct 07 12:40:56 crc kubenswrapper[4854]: I1007 12:40:56.571847 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-vsdfv" event={"ID":"933b6600-078b-4555-b765-7a22cf257e7c","Type":"ContainerDied","Data":"2c8e83dae9ead1da6d14e058dc73d40f5cf786a2d38842459c7d9de1d0f5fcf3"} Oct 07 12:40:56 crc kubenswrapper[4854]: I1007 12:40:56.571930 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c8e83dae9ead1da6d14e058dc73d40f5cf786a2d38842459c7d9de1d0f5fcf3" Oct 07 12:40:56 crc kubenswrapper[4854]: I1007 12:40:56.571989 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-vsdfv" Oct 07 12:40:58 crc kubenswrapper[4854]: I1007 12:40:58.744054 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6f9410d0-f08a-4288-901b-8c28b54f6d53-etc-swift\") pod \"swift-storage-0\" (UID: \"6f9410d0-f08a-4288-901b-8c28b54f6d53\") " pod="openstack/swift-storage-0" Oct 07 12:40:58 crc kubenswrapper[4854]: I1007 12:40:58.758972 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6f9410d0-f08a-4288-901b-8c28b54f6d53-etc-swift\") pod \"swift-storage-0\" (UID: \"6f9410d0-f08a-4288-901b-8c28b54f6d53\") " pod="openstack/swift-storage-0" Oct 07 12:40:59 crc kubenswrapper[4854]: I1007 12:40:59.031751 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 07 12:40:59 crc kubenswrapper[4854]: I1007 12:40:59.494826 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-d4d8-account-create-rl9sz"] Oct 07 12:40:59 crc kubenswrapper[4854]: E1007 12:40:59.495424 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="933b6600-078b-4555-b765-7a22cf257e7c" containerName="swift-ring-rebalance" Oct 07 12:40:59 crc kubenswrapper[4854]: I1007 12:40:59.495441 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="933b6600-078b-4555-b765-7a22cf257e7c" containerName="swift-ring-rebalance" Oct 07 12:40:59 crc kubenswrapper[4854]: I1007 12:40:59.495581 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="933b6600-078b-4555-b765-7a22cf257e7c" containerName="swift-ring-rebalance" Oct 07 12:40:59 crc kubenswrapper[4854]: I1007 12:40:59.496070 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-d4d8-account-create-rl9sz" Oct 07 12:40:59 crc kubenswrapper[4854]: I1007 12:40:59.498564 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Oct 07 12:40:59 crc kubenswrapper[4854]: I1007 12:40:59.509534 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d4d8-account-create-rl9sz"] Oct 07 12:40:59 crc kubenswrapper[4854]: I1007 12:40:59.572473 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 07 12:40:59 crc kubenswrapper[4854]: W1007 12:40:59.577033 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f9410d0_f08a_4288_901b_8c28b54f6d53.slice/crio-dd938a5843b7a1cab14cbbfebb68babf32b91355005f96e744472dfda46a4b27 WatchSource:0}: Error finding container dd938a5843b7a1cab14cbbfebb68babf32b91355005f96e744472dfda46a4b27: Status 404 returned error can't find the container with id dd938a5843b7a1cab14cbbfebb68babf32b91355005f96e744472dfda46a4b27 Oct 07 12:40:59 crc kubenswrapper[4854]: I1007 12:40:59.603068 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6f9410d0-f08a-4288-901b-8c28b54f6d53","Type":"ContainerStarted","Data":"dd938a5843b7a1cab14cbbfebb68babf32b91355005f96e744472dfda46a4b27"} Oct 07 12:40:59 crc kubenswrapper[4854]: I1007 12:40:59.674489 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vht6d\" (UniqueName: \"kubernetes.io/projected/40931f51-fd4e-40fd-82fc-fd8b92bf09e5-kube-api-access-vht6d\") pod \"placement-d4d8-account-create-rl9sz\" (UID: \"40931f51-fd4e-40fd-82fc-fd8b92bf09e5\") " pod="openstack/placement-d4d8-account-create-rl9sz" Oct 07 12:40:59 crc kubenswrapper[4854]: I1007 12:40:59.748831 4854 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-hllqq" podUID="6e6702f3-b113-49f9-b85f-a2d294bac6dc" containerName="ovn-controller" probeResult="failure" output=< Oct 07 12:40:59 crc kubenswrapper[4854]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 07 12:40:59 crc kubenswrapper[4854]: > Oct 07 12:40:59 crc kubenswrapper[4854]: I1007 12:40:59.775683 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vht6d\" (UniqueName: \"kubernetes.io/projected/40931f51-fd4e-40fd-82fc-fd8b92bf09e5-kube-api-access-vht6d\") pod \"placement-d4d8-account-create-rl9sz\" (UID: \"40931f51-fd4e-40fd-82fc-fd8b92bf09e5\") " pod="openstack/placement-d4d8-account-create-rl9sz" Oct 07 12:40:59 crc kubenswrapper[4854]: I1007 12:40:59.797893 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vht6d\" (UniqueName: \"kubernetes.io/projected/40931f51-fd4e-40fd-82fc-fd8b92bf09e5-kube-api-access-vht6d\") pod \"placement-d4d8-account-create-rl9sz\" (UID: \"40931f51-fd4e-40fd-82fc-fd8b92bf09e5\") " pod="openstack/placement-d4d8-account-create-rl9sz" Oct 07 12:40:59 crc kubenswrapper[4854]: I1007 12:40:59.815045 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-d4d8-account-create-rl9sz" Oct 07 12:41:00 crc kubenswrapper[4854]: I1007 12:41:00.070272 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-j5h2b" Oct 07 12:41:00 crc kubenswrapper[4854]: I1007 12:41:00.075125 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-j5h2b" Oct 07 12:41:00 crc kubenswrapper[4854]: I1007 12:41:00.290144 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-hllqq-config-xdd4r"] Oct 07 12:41:00 crc kubenswrapper[4854]: I1007 12:41:00.292699 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hllqq-config-xdd4r" Oct 07 12:41:00 crc kubenswrapper[4854]: I1007 12:41:00.298970 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 07 12:41:00 crc kubenswrapper[4854]: I1007 12:41:00.313189 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-hllqq-config-xdd4r"] Oct 07 12:41:00 crc kubenswrapper[4854]: I1007 12:41:00.330724 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d4d8-account-create-rl9sz"] Oct 07 12:41:00 crc kubenswrapper[4854]: I1007 12:41:00.385427 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/843b3ab8-f64d-4f8a-956d-3abbdabd0bae-additional-scripts\") pod \"ovn-controller-hllqq-config-xdd4r\" (UID: \"843b3ab8-f64d-4f8a-956d-3abbdabd0bae\") " pod="openstack/ovn-controller-hllqq-config-xdd4r" Oct 07 12:41:00 crc kubenswrapper[4854]: I1007 12:41:00.385662 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6jwq\" (UniqueName: \"kubernetes.io/projected/843b3ab8-f64d-4f8a-956d-3abbdabd0bae-kube-api-access-w6jwq\") pod \"ovn-controller-hllqq-config-xdd4r\" (UID: \"843b3ab8-f64d-4f8a-956d-3abbdabd0bae\") " pod="openstack/ovn-controller-hllqq-config-xdd4r" Oct 07 12:41:00 crc kubenswrapper[4854]: I1007 12:41:00.385773 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/843b3ab8-f64d-4f8a-956d-3abbdabd0bae-scripts\") pod \"ovn-controller-hllqq-config-xdd4r\" (UID: \"843b3ab8-f64d-4f8a-956d-3abbdabd0bae\") " pod="openstack/ovn-controller-hllqq-config-xdd4r" Oct 07 12:41:00 crc kubenswrapper[4854]: I1007 12:41:00.385849 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/843b3ab8-f64d-4f8a-956d-3abbdabd0bae-var-log-ovn\") pod \"ovn-controller-hllqq-config-xdd4r\" (UID: \"843b3ab8-f64d-4f8a-956d-3abbdabd0bae\") " pod="openstack/ovn-controller-hllqq-config-xdd4r" Oct 07 12:41:00 crc kubenswrapper[4854]: I1007 12:41:00.385993 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/843b3ab8-f64d-4f8a-956d-3abbdabd0bae-var-run\") pod \"ovn-controller-hllqq-config-xdd4r\" (UID: \"843b3ab8-f64d-4f8a-956d-3abbdabd0bae\") " pod="openstack/ovn-controller-hllqq-config-xdd4r" Oct 07 12:41:00 crc kubenswrapper[4854]: I1007 12:41:00.386083 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/843b3ab8-f64d-4f8a-956d-3abbdabd0bae-var-run-ovn\") pod \"ovn-controller-hllqq-config-xdd4r\" (UID: \"843b3ab8-f64d-4f8a-956d-3abbdabd0bae\") " pod="openstack/ovn-controller-hllqq-config-xdd4r" Oct 07 12:41:00 crc kubenswrapper[4854]: I1007 12:41:00.488297 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/843b3ab8-f64d-4f8a-956d-3abbdabd0bae-additional-scripts\") pod \"ovn-controller-hllqq-config-xdd4r\" (UID: \"843b3ab8-f64d-4f8a-956d-3abbdabd0bae\") " pod="openstack/ovn-controller-hllqq-config-xdd4r" Oct 07 12:41:00 crc kubenswrapper[4854]: I1007 12:41:00.488359 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6jwq\" (UniqueName: \"kubernetes.io/projected/843b3ab8-f64d-4f8a-956d-3abbdabd0bae-kube-api-access-w6jwq\") pod \"ovn-controller-hllqq-config-xdd4r\" (UID: \"843b3ab8-f64d-4f8a-956d-3abbdabd0bae\") " pod="openstack/ovn-controller-hllqq-config-xdd4r" Oct 07 12:41:00 crc kubenswrapper[4854]: I1007 12:41:00.488386 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/843b3ab8-f64d-4f8a-956d-3abbdabd0bae-scripts\") pod \"ovn-controller-hllqq-config-xdd4r\" (UID: \"843b3ab8-f64d-4f8a-956d-3abbdabd0bae\") " pod="openstack/ovn-controller-hllqq-config-xdd4r" Oct 07 12:41:00 crc kubenswrapper[4854]: I1007 12:41:00.488403 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/843b3ab8-f64d-4f8a-956d-3abbdabd0bae-var-log-ovn\") pod \"ovn-controller-hllqq-config-xdd4r\" (UID: \"843b3ab8-f64d-4f8a-956d-3abbdabd0bae\") " pod="openstack/ovn-controller-hllqq-config-xdd4r" Oct 07 12:41:00 crc kubenswrapper[4854]: I1007 12:41:00.488434 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/843b3ab8-f64d-4f8a-956d-3abbdabd0bae-var-run\") pod \"ovn-controller-hllqq-config-xdd4r\" (UID: \"843b3ab8-f64d-4f8a-956d-3abbdabd0bae\") " pod="openstack/ovn-controller-hllqq-config-xdd4r" Oct 07 12:41:00 crc kubenswrapper[4854]: I1007 12:41:00.488459 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/843b3ab8-f64d-4f8a-956d-3abbdabd0bae-var-run-ovn\") pod \"ovn-controller-hllqq-config-xdd4r\" (UID: \"843b3ab8-f64d-4f8a-956d-3abbdabd0bae\") " pod="openstack/ovn-controller-hllqq-config-xdd4r" Oct 07 12:41:00 crc kubenswrapper[4854]: I1007 12:41:00.488770 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/843b3ab8-f64d-4f8a-956d-3abbdabd0bae-var-run-ovn\") pod \"ovn-controller-hllqq-config-xdd4r\" (UID: \"843b3ab8-f64d-4f8a-956d-3abbdabd0bae\") " pod="openstack/ovn-controller-hllqq-config-xdd4r" Oct 07 12:41:00 crc kubenswrapper[4854]: I1007 12:41:00.489927 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/843b3ab8-f64d-4f8a-956d-3abbdabd0bae-additional-scripts\") pod \"ovn-controller-hllqq-config-xdd4r\" (UID: \"843b3ab8-f64d-4f8a-956d-3abbdabd0bae\") " pod="openstack/ovn-controller-hllqq-config-xdd4r" Oct 07 12:41:00 crc kubenswrapper[4854]: I1007 12:41:00.491890 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/843b3ab8-f64d-4f8a-956d-3abbdabd0bae-scripts\") pod \"ovn-controller-hllqq-config-xdd4r\" (UID: \"843b3ab8-f64d-4f8a-956d-3abbdabd0bae\") " pod="openstack/ovn-controller-hllqq-config-xdd4r" Oct 07 12:41:00 crc kubenswrapper[4854]: I1007 12:41:00.491951 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/843b3ab8-f64d-4f8a-956d-3abbdabd0bae-var-log-ovn\") pod \"ovn-controller-hllqq-config-xdd4r\" (UID: \"843b3ab8-f64d-4f8a-956d-3abbdabd0bae\") " pod="openstack/ovn-controller-hllqq-config-xdd4r" Oct 07 12:41:00 crc kubenswrapper[4854]: I1007 12:41:00.491988 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/843b3ab8-f64d-4f8a-956d-3abbdabd0bae-var-run\") pod \"ovn-controller-hllqq-config-xdd4r\" (UID: \"843b3ab8-f64d-4f8a-956d-3abbdabd0bae\") " pod="openstack/ovn-controller-hllqq-config-xdd4r" Oct 07 12:41:00 crc kubenswrapper[4854]: I1007 12:41:00.523141 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6jwq\" (UniqueName: \"kubernetes.io/projected/843b3ab8-f64d-4f8a-956d-3abbdabd0bae-kube-api-access-w6jwq\") pod \"ovn-controller-hllqq-config-xdd4r\" (UID: \"843b3ab8-f64d-4f8a-956d-3abbdabd0bae\") " pod="openstack/ovn-controller-hllqq-config-xdd4r" Oct 07 12:41:00 crc kubenswrapper[4854]: I1007 12:41:00.612310 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hllqq-config-xdd4r" Oct 07 12:41:00 crc kubenswrapper[4854]: I1007 12:41:00.617038 4854 generic.go:334] "Generic (PLEG): container finished" podID="79513100-48d2-4e7b-ae14-888322cab8f3" containerID="9c7ddd3a4c8f213d021b724e370c377203bc8a7c36b48c8171f8d9f35b3f1843" exitCode=0 Oct 07 12:41:00 crc kubenswrapper[4854]: I1007 12:41:00.617129 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"79513100-48d2-4e7b-ae14-888322cab8f3","Type":"ContainerDied","Data":"9c7ddd3a4c8f213d021b724e370c377203bc8a7c36b48c8171f8d9f35b3f1843"} Oct 07 12:41:04 crc kubenswrapper[4854]: I1007 12:41:04.650194 4854 generic.go:334] "Generic (PLEG): container finished" podID="4c293f13-b2a5-4d4b-9f69-fd118e34eab2" containerID="e014726a2ceb29d439118ca6c1761fd0aaf31c3c217a1281253c7e737429cda0" exitCode=0 Oct 07 12:41:04 crc kubenswrapper[4854]: I1007 12:41:04.650342 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4c293f13-b2a5-4d4b-9f69-fd118e34eab2","Type":"ContainerDied","Data":"e014726a2ceb29d439118ca6c1761fd0aaf31c3c217a1281253c7e737429cda0"} Oct 07 12:41:04 crc kubenswrapper[4854]: I1007 12:41:04.769413 4854 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-hllqq" podUID="6e6702f3-b113-49f9-b85f-a2d294bac6dc" containerName="ovn-controller" probeResult="failure" output=< Oct 07 12:41:04 crc kubenswrapper[4854]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 07 12:41:04 crc kubenswrapper[4854]: > Oct 07 12:41:06 crc kubenswrapper[4854]: I1007 12:41:06.668096 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d4d8-account-create-rl9sz" event={"ID":"40931f51-fd4e-40fd-82fc-fd8b92bf09e5","Type":"ContainerStarted","Data":"40d035cc84611c6ad2749c73c1fd39d9c1387aecdebcfa48d705553c0482956f"} Oct 07 12:41:07 crc kubenswrapper[4854]: I1007 12:41:07.396993 4854 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-hllqq-config-xdd4r"] Oct 07 12:41:07 crc kubenswrapper[4854]: W1007 12:41:07.421911 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod843b3ab8_f64d_4f8a_956d_3abbdabd0bae.slice/crio-cf96d881742ee8d817044888a88eed8ae96d5af56c10c43bdd8699c8d01dc427 WatchSource:0}: Error finding container cf96d881742ee8d817044888a88eed8ae96d5af56c10c43bdd8699c8d01dc427: Status 404 returned error can't find the container with id cf96d881742ee8d817044888a88eed8ae96d5af56c10c43bdd8699c8d01dc427 Oct 07 12:41:07 crc kubenswrapper[4854]: I1007 12:41:07.678410 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6f9410d0-f08a-4288-901b-8c28b54f6d53","Type":"ContainerStarted","Data":"46e4da89bda9a00a4ff3b8768a33badf8cfee5c8e4687bcdeb7761033e1a22b5"} Oct 07 12:41:07 crc kubenswrapper[4854]: I1007 12:41:07.678455 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6f9410d0-f08a-4288-901b-8c28b54f6d53","Type":"ContainerStarted","Data":"64082d363d907b3ab83993dcfe80f76c9b9736fb8b76e7c87b840d99d91af5c8"} Oct 07 12:41:07 crc kubenswrapper[4854]: I1007 12:41:07.678464 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6f9410d0-f08a-4288-901b-8c28b54f6d53","Type":"ContainerStarted","Data":"e7d1be76b597fbb83db1e8cb27aca94cb03bb1ca865e7042ef7e99b025c26823"} Oct 07 12:41:07 crc kubenswrapper[4854]: I1007 12:41:07.681106 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4c293f13-b2a5-4d4b-9f69-fd118e34eab2","Type":"ContainerStarted","Data":"773b06224b9794e065a5371a7a6969873348f5679927494931f1681bc7245e22"} Oct 07 12:41:07 crc kubenswrapper[4854]: I1007 12:41:07.681343 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 07 12:41:07 crc kubenswrapper[4854]: I1007 12:41:07.687074 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hllqq-config-xdd4r" event={"ID":"843b3ab8-f64d-4f8a-956d-3abbdabd0bae","Type":"ContainerStarted","Data":"cf96d881742ee8d817044888a88eed8ae96d5af56c10c43bdd8699c8d01dc427"} Oct 07 12:41:07 crc kubenswrapper[4854]: I1007 12:41:07.689657 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-4jdkf" event={"ID":"245510a3-00ce-4faa-9a0e-4e06482e8a0e","Type":"ContainerStarted","Data":"60bb9fc490e0c40f6772256cf4a2a1ca8e51250ec86c89bd133d4b4d9de69302"} Oct 07 12:41:07 crc kubenswrapper[4854]: I1007 12:41:07.694566 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"79513100-48d2-4e7b-ae14-888322cab8f3","Type":"ContainerStarted","Data":"6b0f9e765f42e0528d289da2b61440b957c5df88bdf61759d5eef1939df5e18b"} Oct 07 12:41:07 crc kubenswrapper[4854]: I1007 12:41:07.694773 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 07 12:41:07 crc kubenswrapper[4854]: I1007 12:41:07.697393 4854 generic.go:334] "Generic (PLEG): container finished" podID="40931f51-fd4e-40fd-82fc-fd8b92bf09e5" containerID="3f494071d1ef0d6df80043a7409a0e9096a6b62c0faf1b32f7a34dccf7655466" exitCode=0 Oct 07 12:41:07 crc kubenswrapper[4854]: I1007 12:41:07.697424 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d4d8-account-create-rl9sz" 
event={"ID":"40931f51-fd4e-40fd-82fc-fd8b92bf09e5","Type":"ContainerDied","Data":"3f494071d1ef0d6df80043a7409a0e9096a6b62c0faf1b32f7a34dccf7655466"} Oct 07 12:41:07 crc kubenswrapper[4854]: I1007 12:41:07.708506 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=-9223371953.146292 podStartE2EDuration="1m23.708483438s" podCreationTimestamp="2025-10-07 12:39:44 +0000 UTC" firstStartedPulling="2025-10-07 12:39:46.741448674 +0000 UTC m=+902.729280929" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:41:07.706567202 +0000 UTC m=+983.694399487" watchObservedRunningTime="2025-10-07 12:41:07.708483438 +0000 UTC m=+983.696315693" Oct 07 12:41:07 crc kubenswrapper[4854]: I1007 12:41:07.750990 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-4jdkf" podStartSLOduration=2.563778783 podStartE2EDuration="13.750969824s" podCreationTimestamp="2025-10-07 12:40:54 +0000 UTC" firstStartedPulling="2025-10-07 12:40:55.801690919 +0000 UTC m=+971.789523185" lastFinishedPulling="2025-10-07 12:41:06.988881971 +0000 UTC m=+982.976714226" observedRunningTime="2025-10-07 12:41:07.744844306 +0000 UTC m=+983.732676561" watchObservedRunningTime="2025-10-07 12:41:07.750969824 +0000 UTC m=+983.738802079" Oct 07 12:41:07 crc kubenswrapper[4854]: I1007 12:41:07.779619 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=42.960471417 podStartE2EDuration="1m22.779602867s" podCreationTimestamp="2025-10-07 12:39:45 +0000 UTC" firstStartedPulling="2025-10-07 12:39:47.007438584 +0000 UTC m=+902.995270839" lastFinishedPulling="2025-10-07 12:40:26.826570024 +0000 UTC m=+942.814402289" observedRunningTime="2025-10-07 12:41:07.774184069 +0000 UTC m=+983.762016334" watchObservedRunningTime="2025-10-07 12:41:07.779602867 +0000 UTC m=+983.767435122" Oct 07 12:41:08 crc kubenswrapper[4854]: I1007 12:41:08.714828 4854 generic.go:334] "Generic (PLEG): container finished" podID="843b3ab8-f64d-4f8a-956d-3abbdabd0bae" containerID="13cd0930c993fb29506907fbe02b5640f437f2eb9dee0a69b79aa644c500010d" exitCode=0 Oct 07 12:41:08 crc kubenswrapper[4854]: I1007 12:41:08.727221 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hllqq-config-xdd4r" event={"ID":"843b3ab8-f64d-4f8a-956d-3abbdabd0bae","Type":"ContainerDied","Data":"13cd0930c993fb29506907fbe02b5640f437f2eb9dee0a69b79aa644c500010d"} Oct 07 12:41:08 crc kubenswrapper[4854]: I1007 12:41:08.727275 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6f9410d0-f08a-4288-901b-8c28b54f6d53","Type":"ContainerStarted","Data":"7ec2bd59e5244b757b1dc78381bc28563a41c3f4cb1938a20f0abef176871dca"} Oct 07 12:41:09 crc kubenswrapper[4854]: I1007 12:41:09.088392 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-d4d8-account-create-rl9sz" Oct 07 12:41:09 crc kubenswrapper[4854]: I1007 12:41:09.181068 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vht6d\" (UniqueName: \"kubernetes.io/projected/40931f51-fd4e-40fd-82fc-fd8b92bf09e5-kube-api-access-vht6d\") pod \"40931f51-fd4e-40fd-82fc-fd8b92bf09e5\" (UID: \"40931f51-fd4e-40fd-82fc-fd8b92bf09e5\") " Oct 07 12:41:09 crc kubenswrapper[4854]: I1007 12:41:09.187086 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40931f51-fd4e-40fd-82fc-fd8b92bf09e5-kube-api-access-vht6d" (OuterVolumeSpecName: "kube-api-access-vht6d") pod "40931f51-fd4e-40fd-82fc-fd8b92bf09e5" (UID: "40931f51-fd4e-40fd-82fc-fd8b92bf09e5"). InnerVolumeSpecName "kube-api-access-vht6d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:41:09 crc kubenswrapper[4854]: I1007 12:41:09.283920 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vht6d\" (UniqueName: \"kubernetes.io/projected/40931f51-fd4e-40fd-82fc-fd8b92bf09e5-kube-api-access-vht6d\") on node \"crc\" DevicePath \"\"" Oct 07 12:41:09 crc kubenswrapper[4854]: I1007 12:41:09.735358 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d4d8-account-create-rl9sz" Oct 07 12:41:09 crc kubenswrapper[4854]: I1007 12:41:09.735429 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d4d8-account-create-rl9sz" event={"ID":"40931f51-fd4e-40fd-82fc-fd8b92bf09e5","Type":"ContainerDied","Data":"40d035cc84611c6ad2749c73c1fd39d9c1387aecdebcfa48d705553c0482956f"} Oct 07 12:41:09 crc kubenswrapper[4854]: I1007 12:41:09.735469 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40d035cc84611c6ad2749c73c1fd39d9c1387aecdebcfa48d705553c0482956f" Oct 07 12:41:09 crc kubenswrapper[4854]: I1007 12:41:09.760120 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-hllqq" Oct 07 12:41:10 crc kubenswrapper[4854]: I1007 12:41:10.808772 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 12:41:10 crc kubenswrapper[4854]: I1007 12:41:10.809214 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 12:41:11 crc kubenswrapper[4854]: I1007 12:41:11.090329 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-hllqq-config-xdd4r" Oct 07 12:41:11 crc kubenswrapper[4854]: I1007 12:41:11.216976 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/843b3ab8-f64d-4f8a-956d-3abbdabd0bae-scripts\") pod \"843b3ab8-f64d-4f8a-956d-3abbdabd0bae\" (UID: \"843b3ab8-f64d-4f8a-956d-3abbdabd0bae\") " Oct 07 12:41:11 crc kubenswrapper[4854]: I1007 12:41:11.217102 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/843b3ab8-f64d-4f8a-956d-3abbdabd0bae-var-log-ovn\") pod \"843b3ab8-f64d-4f8a-956d-3abbdabd0bae\" (UID: \"843b3ab8-f64d-4f8a-956d-3abbdabd0bae\") " Oct 07 12:41:11 crc kubenswrapper[4854]: I1007 12:41:11.217226 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/843b3ab8-f64d-4f8a-956d-3abbdabd0bae-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "843b3ab8-f64d-4f8a-956d-3abbdabd0bae" (UID: "843b3ab8-f64d-4f8a-956d-3abbdabd0bae"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 12:41:11 crc kubenswrapper[4854]: I1007 12:41:11.217300 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/843b3ab8-f64d-4f8a-956d-3abbdabd0bae-additional-scripts\") pod \"843b3ab8-f64d-4f8a-956d-3abbdabd0bae\" (UID: \"843b3ab8-f64d-4f8a-956d-3abbdabd0bae\") " Oct 07 12:41:11 crc kubenswrapper[4854]: I1007 12:41:11.217468 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/843b3ab8-f64d-4f8a-956d-3abbdabd0bae-var-run-ovn\") pod \"843b3ab8-f64d-4f8a-956d-3abbdabd0bae\" (UID: \"843b3ab8-f64d-4f8a-956d-3abbdabd0bae\") " Oct 07 12:41:11 crc kubenswrapper[4854]: I1007 12:41:11.217610 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/843b3ab8-f64d-4f8a-956d-3abbdabd0bae-var-run\") pod \"843b3ab8-f64d-4f8a-956d-3abbdabd0bae\" (UID: \"843b3ab8-f64d-4f8a-956d-3abbdabd0bae\") " Oct 07 12:41:11 crc kubenswrapper[4854]: I1007 12:41:11.217542 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/843b3ab8-f64d-4f8a-956d-3abbdabd0bae-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "843b3ab8-f64d-4f8a-956d-3abbdabd0bae" (UID: "843b3ab8-f64d-4f8a-956d-3abbdabd0bae"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 12:41:11 crc kubenswrapper[4854]: I1007 12:41:11.217669 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/843b3ab8-f64d-4f8a-956d-3abbdabd0bae-var-run" (OuterVolumeSpecName: "var-run") pod "843b3ab8-f64d-4f8a-956d-3abbdabd0bae" (UID: "843b3ab8-f64d-4f8a-956d-3abbdabd0bae"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 12:41:11 crc kubenswrapper[4854]: I1007 12:41:11.217834 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6jwq\" (UniqueName: \"kubernetes.io/projected/843b3ab8-f64d-4f8a-956d-3abbdabd0bae-kube-api-access-w6jwq\") pod \"843b3ab8-f64d-4f8a-956d-3abbdabd0bae\" (UID: \"843b3ab8-f64d-4f8a-956d-3abbdabd0bae\") " Oct 07 12:41:11 crc kubenswrapper[4854]: I1007 12:41:11.217902 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/843b3ab8-f64d-4f8a-956d-3abbdabd0bae-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "843b3ab8-f64d-4f8a-956d-3abbdabd0bae" (UID: "843b3ab8-f64d-4f8a-956d-3abbdabd0bae"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:41:11 crc kubenswrapper[4854]: I1007 12:41:11.218004 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/843b3ab8-f64d-4f8a-956d-3abbdabd0bae-scripts" (OuterVolumeSpecName: "scripts") pod "843b3ab8-f64d-4f8a-956d-3abbdabd0bae" (UID: "843b3ab8-f64d-4f8a-956d-3abbdabd0bae"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:41:11 crc kubenswrapper[4854]: I1007 12:41:11.219037 4854 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/843b3ab8-f64d-4f8a-956d-3abbdabd0bae-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 07 12:41:11 crc kubenswrapper[4854]: I1007 12:41:11.219074 4854 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/843b3ab8-f64d-4f8a-956d-3abbdabd0bae-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 12:41:11 crc kubenswrapper[4854]: I1007 12:41:11.219092 4854 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/843b3ab8-f64d-4f8a-956d-3abbdabd0bae-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 07 12:41:11 crc kubenswrapper[4854]: I1007 12:41:11.219106 4854 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/843b3ab8-f64d-4f8a-956d-3abbdabd0bae-var-run\") on node \"crc\" DevicePath \"\"" Oct 07 12:41:11 crc kubenswrapper[4854]: I1007 12:41:11.219121 4854 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/843b3ab8-f64d-4f8a-956d-3abbdabd0bae-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 12:41:11 crc kubenswrapper[4854]: I1007 12:41:11.236068 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/843b3ab8-f64d-4f8a-956d-3abbdabd0bae-kube-api-access-w6jwq" (OuterVolumeSpecName: "kube-api-access-w6jwq") pod "843b3ab8-f64d-4f8a-956d-3abbdabd0bae" (UID: "843b3ab8-f64d-4f8a-956d-3abbdabd0bae"). InnerVolumeSpecName "kube-api-access-w6jwq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:41:11 crc kubenswrapper[4854]: I1007 12:41:11.321068 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6jwq\" (UniqueName: \"kubernetes.io/projected/843b3ab8-f64d-4f8a-956d-3abbdabd0bae-kube-api-access-w6jwq\") on node \"crc\" DevicePath \"\"" Oct 07 12:41:11 crc kubenswrapper[4854]: I1007 12:41:11.752857 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hllqq-config-xdd4r" event={"ID":"843b3ab8-f64d-4f8a-956d-3abbdabd0bae","Type":"ContainerDied","Data":"cf96d881742ee8d817044888a88eed8ae96d5af56c10c43bdd8699c8d01dc427"} Oct 07 12:41:11 crc kubenswrapper[4854]: I1007 12:41:11.752927 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf96d881742ee8d817044888a88eed8ae96d5af56c10c43bdd8699c8d01dc427" Oct 07 12:41:11 crc kubenswrapper[4854]: I1007 12:41:11.752957 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hllqq-config-xdd4r" Oct 07 12:41:12 crc kubenswrapper[4854]: I1007 12:41:12.221186 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-hllqq-config-xdd4r"] Oct 07 12:41:12 crc kubenswrapper[4854]: I1007 12:41:12.227321 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-hllqq-config-xdd4r"] Oct 07 12:41:12 crc kubenswrapper[4854]: I1007 12:41:12.719088 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="843b3ab8-f64d-4f8a-956d-3abbdabd0bae" path="/var/lib/kubelet/pods/843b3ab8-f64d-4f8a-956d-3abbdabd0bae/volumes" Oct 07 12:41:14 crc kubenswrapper[4854]: I1007 12:41:14.779632 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6f9410d0-f08a-4288-901b-8c28b54f6d53","Type":"ContainerStarted","Data":"c1daeb00bdab450dbfd0e8a2fc796a0cac58f94569069238d66bafbd697e21fe"} Oct 07 12:41:14 crc kubenswrapper[4854]: I1007 12:41:14.780044 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6f9410d0-f08a-4288-901b-8c28b54f6d53","Type":"ContainerStarted","Data":"04a36a15910bb4bfdcb57384d35dbb7269dfced501b76e98351c42047a3a007d"} Oct 07 12:41:14 crc kubenswrapper[4854]: I1007 12:41:14.780055 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6f9410d0-f08a-4288-901b-8c28b54f6d53","Type":"ContainerStarted","Data":"bf5285e76e2b7c58fdaeb7235c95ba9d504989e3ace943cdfc50b7e56f2b19f6"} Oct 07 12:41:14 crc kubenswrapper[4854]: I1007 12:41:14.780066 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6f9410d0-f08a-4288-901b-8c28b54f6d53","Type":"ContainerStarted","Data":"6c57c9247bb283b1cdb97c6d4a8501239230072515b3dfc540b0bb0d946a0d67"} Oct 07 12:41:16 crc kubenswrapper[4854]: I1007 12:41:16.806003 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6f9410d0-f08a-4288-901b-8c28b54f6d53","Type":"ContainerStarted","Data":"16029fd8b03b23770b20c5cb19ee5eac37c556a4d1e730daaeee2955d74d1015"} Oct 07 12:41:16 crc kubenswrapper[4854]: I1007 12:41:16.806500 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6f9410d0-f08a-4288-901b-8c28b54f6d53","Type":"ContainerStarted","Data":"d139c560fe8a80f0502b14596ea2825172ee972c7730deb571bbc27b9d43e3a4"} Oct 07 12:41:16 crc kubenswrapper[4854]: I1007 12:41:16.806514 4854 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6f9410d0-f08a-4288-901b-8c28b54f6d53","Type":"ContainerStarted","Data":"ef4e6844e5b6720c32f74f7e722421b7675513872680d1a7f5bcc693b7983ed2"} Oct 07 12:41:16 crc kubenswrapper[4854]: I1007 12:41:16.806526 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6f9410d0-f08a-4288-901b-8c28b54f6d53","Type":"ContainerStarted","Data":"41fbd8df61c4cf14f1e960d26b75dfc2bc04a3598dc2d46c94fb7688efb55eb9"} Oct 07 12:41:17 crc kubenswrapper[4854]: I1007 12:41:17.823741 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6f9410d0-f08a-4288-901b-8c28b54f6d53","Type":"ContainerStarted","Data":"bd423040637c8b41729c179d3e2612729ced5992aafd5990c3c9b5c7ca5727c0"} Oct 07 12:41:17 crc kubenswrapper[4854]: I1007 12:41:17.823782 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6f9410d0-f08a-4288-901b-8c28b54f6d53","Type":"ContainerStarted","Data":"6e9be7ed32f4b1f0286eb9c12fca3356d25542eaefb68cc3134f42aca219812e"} Oct 07 12:41:17 crc kubenswrapper[4854]: I1007 12:41:17.823791 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6f9410d0-f08a-4288-901b-8c28b54f6d53","Type":"ContainerStarted","Data":"3ead17d08219606cf249dcd61e3873b2c9b921a7f531b3b5ff7b4f38fcea5a39"} Oct 07 12:41:17 crc kubenswrapper[4854]: I1007 12:41:17.874566 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=20.61999239 podStartE2EDuration="36.874540888s" podCreationTimestamp="2025-10-07 12:40:41 +0000 UTC" firstStartedPulling="2025-10-07 12:40:59.579492243 +0000 UTC m=+975.567324498" lastFinishedPulling="2025-10-07 12:41:15.834040751 +0000 UTC m=+991.821872996" observedRunningTime="2025-10-07 12:41:17.860701625 +0000 UTC m=+993.848533930" watchObservedRunningTime="2025-10-07 12:41:17.874540888 +0000 UTC m=+993.862373143" Oct 07 12:41:18 crc kubenswrapper[4854]: I1007 12:41:18.171975 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-2ww9k"] Oct 07 12:41:18 crc kubenswrapper[4854]: E1007 12:41:18.172430 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40931f51-fd4e-40fd-82fc-fd8b92bf09e5" containerName="mariadb-account-create" Oct 07 12:41:18 crc kubenswrapper[4854]: I1007 12:41:18.172450 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="40931f51-fd4e-40fd-82fc-fd8b92bf09e5" containerName="mariadb-account-create" Oct 07 12:41:18 crc kubenswrapper[4854]: E1007 12:41:18.172489 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="843b3ab8-f64d-4f8a-956d-3abbdabd0bae" containerName="ovn-config" Oct 07 12:41:18 crc kubenswrapper[4854]: I1007 12:41:18.172497 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="843b3ab8-f64d-4f8a-956d-3abbdabd0bae" containerName="ovn-config" Oct 07 12:41:18 crc kubenswrapper[4854]: I1007 12:41:18.172708 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="40931f51-fd4e-40fd-82fc-fd8b92bf09e5" containerName="mariadb-account-create" Oct 07 12:41:18 crc kubenswrapper[4854]: I1007 12:41:18.172730 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="843b3ab8-f64d-4f8a-956d-3abbdabd0bae" containerName="ovn-config" Oct 07 12:41:18 crc kubenswrapper[4854]: I1007 12:41:18.173889 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-2ww9k" Oct 07 12:41:18 crc kubenswrapper[4854]: I1007 12:41:18.175935 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Oct 07 12:41:18 crc kubenswrapper[4854]: I1007 12:41:18.187548 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-2ww9k"] Oct 07 12:41:18 crc kubenswrapper[4854]: I1007 12:41:18.233839 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0d48fc86-023d-4d59-9d7c-6ef250b11609-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-2ww9k\" (UID: \"0d48fc86-023d-4d59-9d7c-6ef250b11609\") " pod="openstack/dnsmasq-dns-764c5664d7-2ww9k" Oct 07 12:41:18 crc kubenswrapper[4854]: I1007 12:41:18.233887 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d48fc86-023d-4d59-9d7c-6ef250b11609-config\") pod \"dnsmasq-dns-764c5664d7-2ww9k\" (UID: \"0d48fc86-023d-4d59-9d7c-6ef250b11609\") " pod="openstack/dnsmasq-dns-764c5664d7-2ww9k" Oct 07 12:41:18 crc kubenswrapper[4854]: I1007 12:41:18.233979 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zzsv\" (UniqueName: \"kubernetes.io/projected/0d48fc86-023d-4d59-9d7c-6ef250b11609-kube-api-access-4zzsv\") pod \"dnsmasq-dns-764c5664d7-2ww9k\" (UID: \"0d48fc86-023d-4d59-9d7c-6ef250b11609\") " pod="openstack/dnsmasq-dns-764c5664d7-2ww9k" Oct 07 12:41:18 crc kubenswrapper[4854]: I1007 12:41:18.234004 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0d48fc86-023d-4d59-9d7c-6ef250b11609-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-2ww9k\" (UID: \"0d48fc86-023d-4d59-9d7c-6ef250b11609\") " pod="openstack/dnsmasq-dns-764c5664d7-2ww9k" Oct 07 12:41:18 crc kubenswrapper[4854]: I1007 12:41:18.234035 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d48fc86-023d-4d59-9d7c-6ef250b11609-dns-svc\") pod \"dnsmasq-dns-764c5664d7-2ww9k\" (UID: \"0d48fc86-023d-4d59-9d7c-6ef250b11609\") " pod="openstack/dnsmasq-dns-764c5664d7-2ww9k" Oct 07 12:41:18 crc kubenswrapper[4854]: I1007 12:41:18.234072 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0d48fc86-023d-4d59-9d7c-6ef250b11609-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-2ww9k\" (UID: \"0d48fc86-023d-4d59-9d7c-6ef250b11609\") " pod="openstack/dnsmasq-dns-764c5664d7-2ww9k" Oct 07 12:41:18 crc kubenswrapper[4854]: I1007 12:41:18.335302 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zzsv\" (UniqueName: \"kubernetes.io/projected/0d48fc86-023d-4d59-9d7c-6ef250b11609-kube-api-access-4zzsv\") pod \"dnsmasq-dns-764c5664d7-2ww9k\" (UID: \"0d48fc86-023d-4d59-9d7c-6ef250b11609\") " pod="openstack/dnsmasq-dns-764c5664d7-2ww9k" Oct 07 12:41:18 crc kubenswrapper[4854]: I1007 12:41:18.335345 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0d48fc86-023d-4d59-9d7c-6ef250b11609-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-2ww9k\" (UID: 
\"0d48fc86-023d-4d59-9d7c-6ef250b11609\") " pod="openstack/dnsmasq-dns-764c5664d7-2ww9k" Oct 07 12:41:18 crc kubenswrapper[4854]: I1007 12:41:18.335375 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d48fc86-023d-4d59-9d7c-6ef250b11609-dns-svc\") pod \"dnsmasq-dns-764c5664d7-2ww9k\" (UID: \"0d48fc86-023d-4d59-9d7c-6ef250b11609\") " pod="openstack/dnsmasq-dns-764c5664d7-2ww9k" Oct 07 12:41:18 crc kubenswrapper[4854]: I1007 12:41:18.335413 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0d48fc86-023d-4d59-9d7c-6ef250b11609-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-2ww9k\" (UID: \"0d48fc86-023d-4d59-9d7c-6ef250b11609\") " pod="openstack/dnsmasq-dns-764c5664d7-2ww9k" Oct 07 12:41:18 crc kubenswrapper[4854]: I1007 12:41:18.335453 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0d48fc86-023d-4d59-9d7c-6ef250b11609-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-2ww9k\" (UID: \"0d48fc86-023d-4d59-9d7c-6ef250b11609\") " pod="openstack/dnsmasq-dns-764c5664d7-2ww9k" Oct 07 12:41:18 crc kubenswrapper[4854]: I1007 12:41:18.335470 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d48fc86-023d-4d59-9d7c-6ef250b11609-config\") pod \"dnsmasq-dns-764c5664d7-2ww9k\" (UID: \"0d48fc86-023d-4d59-9d7c-6ef250b11609\") " pod="openstack/dnsmasq-dns-764c5664d7-2ww9k" Oct 07 12:41:18 crc kubenswrapper[4854]: I1007 12:41:18.336280 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d48fc86-023d-4d59-9d7c-6ef250b11609-config\") pod \"dnsmasq-dns-764c5664d7-2ww9k\" (UID: \"0d48fc86-023d-4d59-9d7c-6ef250b11609\") " pod="openstack/dnsmasq-dns-764c5664d7-2ww9k" Oct 07 12:41:18 crc kubenswrapper[4854]: I1007 12:41:18.336306 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d48fc86-023d-4d59-9d7c-6ef250b11609-dns-svc\") pod \"dnsmasq-dns-764c5664d7-2ww9k\" (UID: \"0d48fc86-023d-4d59-9d7c-6ef250b11609\") " pod="openstack/dnsmasq-dns-764c5664d7-2ww9k" Oct 07 12:41:18 crc kubenswrapper[4854]: I1007 12:41:18.336443 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0d48fc86-023d-4d59-9d7c-6ef250b11609-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-2ww9k\" (UID: \"0d48fc86-023d-4d59-9d7c-6ef250b11609\") " pod="openstack/dnsmasq-dns-764c5664d7-2ww9k" Oct 07 12:41:18 crc kubenswrapper[4854]: I1007 12:41:18.336537 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0d48fc86-023d-4d59-9d7c-6ef250b11609-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-2ww9k\" (UID: \"0d48fc86-023d-4d59-9d7c-6ef250b11609\") " pod="openstack/dnsmasq-dns-764c5664d7-2ww9k" Oct 07 12:41:18 crc kubenswrapper[4854]: I1007 12:41:18.336610 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0d48fc86-023d-4d59-9d7c-6ef250b11609-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-2ww9k\" (UID: \"0d48fc86-023d-4d59-9d7c-6ef250b11609\") " pod="openstack/dnsmasq-dns-764c5664d7-2ww9k" Oct 07 12:41:18 crc kubenswrapper[4854]: 
I1007 12:41:18.368658 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zzsv\" (UniqueName: \"kubernetes.io/projected/0d48fc86-023d-4d59-9d7c-6ef250b11609-kube-api-access-4zzsv\") pod \"dnsmasq-dns-764c5664d7-2ww9k\" (UID: \"0d48fc86-023d-4d59-9d7c-6ef250b11609\") " pod="openstack/dnsmasq-dns-764c5664d7-2ww9k" Oct 07 12:41:18 crc kubenswrapper[4854]: I1007 12:41:18.491495 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-2ww9k" Oct 07 12:41:19 crc kubenswrapper[4854]: I1007 12:41:19.556412 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-2ww9k"] Oct 07 12:41:19 crc kubenswrapper[4854]: W1007 12:41:19.557761 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d48fc86_023d_4d59_9d7c_6ef250b11609.slice/crio-3052e7dee08bd0a1e6d22a6cfa3e2b256d9918d7b5159687e8332776afad286a WatchSource:0}: Error finding container 3052e7dee08bd0a1e6d22a6cfa3e2b256d9918d7b5159687e8332776afad286a: Status 404 returned error can't find the container with id 3052e7dee08bd0a1e6d22a6cfa3e2b256d9918d7b5159687e8332776afad286a Oct 07 12:41:19 crc kubenswrapper[4854]: I1007 12:41:19.866390 4854 generic.go:334] "Generic (PLEG): container finished" podID="245510a3-00ce-4faa-9a0e-4e06482e8a0e" containerID="60bb9fc490e0c40f6772256cf4a2a1ca8e51250ec86c89bd133d4b4d9de69302" exitCode=0 Oct 07 12:41:19 crc kubenswrapper[4854]: I1007 12:41:19.866499 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-4jdkf" event={"ID":"245510a3-00ce-4faa-9a0e-4e06482e8a0e","Type":"ContainerDied","Data":"60bb9fc490e0c40f6772256cf4a2a1ca8e51250ec86c89bd133d4b4d9de69302"} Oct 07 12:41:19 crc kubenswrapper[4854]: I1007 12:41:19.870416 4854 generic.go:334] "Generic (PLEG): container finished" podID="0d48fc86-023d-4d59-9d7c-6ef250b11609" containerID="0d49070f8fa61123631ed4df9d4d8a048b827eeaf435f873fe6b5ddf4ff6cd0d" exitCode=0 Oct 07 12:41:19 crc kubenswrapper[4854]: I1007 12:41:19.870461 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-2ww9k" event={"ID":"0d48fc86-023d-4d59-9d7c-6ef250b11609","Type":"ContainerDied","Data":"0d49070f8fa61123631ed4df9d4d8a048b827eeaf435f873fe6b5ddf4ff6cd0d"} Oct 07 12:41:19 crc kubenswrapper[4854]: I1007 12:41:19.870492 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-2ww9k" event={"ID":"0d48fc86-023d-4d59-9d7c-6ef250b11609","Type":"ContainerStarted","Data":"3052e7dee08bd0a1e6d22a6cfa3e2b256d9918d7b5159687e8332776afad286a"} Oct 07 12:41:20 crc kubenswrapper[4854]: I1007 12:41:20.879947 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-2ww9k" event={"ID":"0d48fc86-023d-4d59-9d7c-6ef250b11609","Type":"ContainerStarted","Data":"62c9f89380455de2e9b73a0ae0e82321affae32a398760874a56d9f3fa68d9ac"} Oct 07 12:41:20 crc kubenswrapper[4854]: I1007 12:41:20.908732 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-764c5664d7-2ww9k" podStartSLOduration=2.908710354 podStartE2EDuration="2.908710354s" podCreationTimestamp="2025-10-07 12:41:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:41:20.904402868 +0000 UTC m=+996.892235163" watchObservedRunningTime="2025-10-07 12:41:20.908710354 +0000 UTC 
m=+996.896542609" Oct 07 12:41:21 crc kubenswrapper[4854]: I1007 12:41:21.313453 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-4jdkf" Oct 07 12:41:21 crc kubenswrapper[4854]: I1007 12:41:21.493957 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/245510a3-00ce-4faa-9a0e-4e06482e8a0e-config-data\") pod \"245510a3-00ce-4faa-9a0e-4e06482e8a0e\" (UID: \"245510a3-00ce-4faa-9a0e-4e06482e8a0e\") " Oct 07 12:41:21 crc kubenswrapper[4854]: I1007 12:41:21.494231 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zkcz\" (UniqueName: \"kubernetes.io/projected/245510a3-00ce-4faa-9a0e-4e06482e8a0e-kube-api-access-4zkcz\") pod \"245510a3-00ce-4faa-9a0e-4e06482e8a0e\" (UID: \"245510a3-00ce-4faa-9a0e-4e06482e8a0e\") " Oct 07 12:41:21 crc kubenswrapper[4854]: I1007 12:41:21.494572 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/245510a3-00ce-4faa-9a0e-4e06482e8a0e-db-sync-config-data\") pod \"245510a3-00ce-4faa-9a0e-4e06482e8a0e\" (UID: \"245510a3-00ce-4faa-9a0e-4e06482e8a0e\") " Oct 07 12:41:21 crc kubenswrapper[4854]: I1007 12:41:21.494683 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/245510a3-00ce-4faa-9a0e-4e06482e8a0e-combined-ca-bundle\") pod \"245510a3-00ce-4faa-9a0e-4e06482e8a0e\" (UID: \"245510a3-00ce-4faa-9a0e-4e06482e8a0e\") " Oct 07 12:41:21 crc kubenswrapper[4854]: I1007 12:41:21.501217 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/245510a3-00ce-4faa-9a0e-4e06482e8a0e-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "245510a3-00ce-4faa-9a0e-4e06482e8a0e" (UID: "245510a3-00ce-4faa-9a0e-4e06482e8a0e"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:41:21 crc kubenswrapper[4854]: I1007 12:41:21.502319 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/245510a3-00ce-4faa-9a0e-4e06482e8a0e-kube-api-access-4zkcz" (OuterVolumeSpecName: "kube-api-access-4zkcz") pod "245510a3-00ce-4faa-9a0e-4e06482e8a0e" (UID: "245510a3-00ce-4faa-9a0e-4e06482e8a0e"). InnerVolumeSpecName "kube-api-access-4zkcz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:41:21 crc kubenswrapper[4854]: I1007 12:41:21.528063 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/245510a3-00ce-4faa-9a0e-4e06482e8a0e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "245510a3-00ce-4faa-9a0e-4e06482e8a0e" (UID: "245510a3-00ce-4faa-9a0e-4e06482e8a0e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:41:21 crc kubenswrapper[4854]: I1007 12:41:21.561891 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/245510a3-00ce-4faa-9a0e-4e06482e8a0e-config-data" (OuterVolumeSpecName: "config-data") pod "245510a3-00ce-4faa-9a0e-4e06482e8a0e" (UID: "245510a3-00ce-4faa-9a0e-4e06482e8a0e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:41:21 crc kubenswrapper[4854]: I1007 12:41:21.596083 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/245510a3-00ce-4faa-9a0e-4e06482e8a0e-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 12:41:21 crc kubenswrapper[4854]: I1007 12:41:21.596117 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zkcz\" (UniqueName: \"kubernetes.io/projected/245510a3-00ce-4faa-9a0e-4e06482e8a0e-kube-api-access-4zkcz\") on node \"crc\" DevicePath \"\"" Oct 07 12:41:21 crc kubenswrapper[4854]: I1007 12:41:21.596128 4854 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/245510a3-00ce-4faa-9a0e-4e06482e8a0e-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 12:41:21 crc kubenswrapper[4854]: I1007 12:41:21.596137 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/245510a3-00ce-4faa-9a0e-4e06482e8a0e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:41:21 crc kubenswrapper[4854]: I1007 12:41:21.893396 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-4jdkf" Oct 07 12:41:21 crc kubenswrapper[4854]: I1007 12:41:21.893410 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-4jdkf" event={"ID":"245510a3-00ce-4faa-9a0e-4e06482e8a0e","Type":"ContainerDied","Data":"dc6d5360754a6227b2280de78bf2815026f2f5c533a62e585c0e350666bfd1de"} Oct 07 12:41:21 crc kubenswrapper[4854]: I1007 12:41:21.893488 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc6d5360754a6227b2280de78bf2815026f2f5c533a62e585c0e350666bfd1de" Oct 07 12:41:21 crc kubenswrapper[4854]: I1007 12:41:21.893538 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-764c5664d7-2ww9k" Oct 07 12:41:22 crc kubenswrapper[4854]: I1007 12:41:22.286533 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-2ww9k"] Oct 07 12:41:22 crc kubenswrapper[4854]: I1007 12:41:22.324904 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-gzl8s"] Oct 07 12:41:22 crc kubenswrapper[4854]: E1007 12:41:22.325798 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="245510a3-00ce-4faa-9a0e-4e06482e8a0e" containerName="glance-db-sync" Oct 07 12:41:22 crc kubenswrapper[4854]: I1007 12:41:22.325819 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="245510a3-00ce-4faa-9a0e-4e06482e8a0e" containerName="glance-db-sync" Oct 07 12:41:22 crc kubenswrapper[4854]: I1007 12:41:22.326052 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="245510a3-00ce-4faa-9a0e-4e06482e8a0e" containerName="glance-db-sync" Oct 07 12:41:22 crc kubenswrapper[4854]: I1007 12:41:22.327036 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-gzl8s" Oct 07 12:41:22 crc kubenswrapper[4854]: I1007 12:41:22.361556 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-gzl8s"] Oct 07 12:41:22 crc kubenswrapper[4854]: I1007 12:41:22.417935 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45b87c2d-9143-42f2-84b8-14db6534b894-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-gzl8s\" (UID: \"45b87c2d-9143-42f2-84b8-14db6534b894\") " pod="openstack/dnsmasq-dns-74f6bcbc87-gzl8s" Oct 07 12:41:22 crc kubenswrapper[4854]: I1007 12:41:22.418019 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45b87c2d-9143-42f2-84b8-14db6534b894-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-gzl8s\" (UID: \"45b87c2d-9143-42f2-84b8-14db6534b894\") " pod="openstack/dnsmasq-dns-74f6bcbc87-gzl8s" Oct 07 12:41:22 crc kubenswrapper[4854]: I1007 12:41:22.418053 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbckt\" (UniqueName: \"kubernetes.io/projected/45b87c2d-9143-42f2-84b8-14db6534b894-kube-api-access-xbckt\") pod \"dnsmasq-dns-74f6bcbc87-gzl8s\" (UID: \"45b87c2d-9143-42f2-84b8-14db6534b894\") " pod="openstack/dnsmasq-dns-74f6bcbc87-gzl8s" Oct 07 12:41:22 crc kubenswrapper[4854]: I1007 12:41:22.418085 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45b87c2d-9143-42f2-84b8-14db6534b894-config\") pod \"dnsmasq-dns-74f6bcbc87-gzl8s\" (UID: \"45b87c2d-9143-42f2-84b8-14db6534b894\") " pod="openstack/dnsmasq-dns-74f6bcbc87-gzl8s" Oct 07 12:41:22 crc kubenswrapper[4854]: I1007 12:41:22.418141 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45b87c2d-9143-42f2-84b8-14db6534b894-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-gzl8s\" (UID: \"45b87c2d-9143-42f2-84b8-14db6534b894\") " pod="openstack/dnsmasq-dns-74f6bcbc87-gzl8s" Oct 07 12:41:22 crc kubenswrapper[4854]: I1007 12:41:22.418214 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/45b87c2d-9143-42f2-84b8-14db6534b894-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-gzl8s\" (UID: \"45b87c2d-9143-42f2-84b8-14db6534b894\") " pod="openstack/dnsmasq-dns-74f6bcbc87-gzl8s" Oct 07 12:41:22 crc kubenswrapper[4854]: I1007 12:41:22.519211 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45b87c2d-9143-42f2-84b8-14db6534b894-config\") pod \"dnsmasq-dns-74f6bcbc87-gzl8s\" (UID: \"45b87c2d-9143-42f2-84b8-14db6534b894\") " pod="openstack/dnsmasq-dns-74f6bcbc87-gzl8s" Oct 07 12:41:22 crc kubenswrapper[4854]: I1007 12:41:22.519287 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45b87c2d-9143-42f2-84b8-14db6534b894-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-gzl8s\" (UID: \"45b87c2d-9143-42f2-84b8-14db6534b894\") " pod="openstack/dnsmasq-dns-74f6bcbc87-gzl8s" Oct 07 12:41:22 crc kubenswrapper[4854]: I1007 12:41:22.519384 4854 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/45b87c2d-9143-42f2-84b8-14db6534b894-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-gzl8s\" (UID: \"45b87c2d-9143-42f2-84b8-14db6534b894\") " pod="openstack/dnsmasq-dns-74f6bcbc87-gzl8s" Oct 07 12:41:22 crc kubenswrapper[4854]: I1007 12:41:22.519414 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45b87c2d-9143-42f2-84b8-14db6534b894-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-gzl8s\" (UID: \"45b87c2d-9143-42f2-84b8-14db6534b894\") " pod="openstack/dnsmasq-dns-74f6bcbc87-gzl8s" Oct 07 12:41:22 crc kubenswrapper[4854]: I1007 12:41:22.519448 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45b87c2d-9143-42f2-84b8-14db6534b894-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-gzl8s\" (UID: \"45b87c2d-9143-42f2-84b8-14db6534b894\") " pod="openstack/dnsmasq-dns-74f6bcbc87-gzl8s" Oct 07 12:41:22 crc kubenswrapper[4854]: I1007 12:41:22.519476 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbckt\" (UniqueName: \"kubernetes.io/projected/45b87c2d-9143-42f2-84b8-14db6534b894-kube-api-access-xbckt\") pod \"dnsmasq-dns-74f6bcbc87-gzl8s\" (UID: \"45b87c2d-9143-42f2-84b8-14db6534b894\") " pod="openstack/dnsmasq-dns-74f6bcbc87-gzl8s" Oct 07 12:41:22 crc kubenswrapper[4854]: I1007 12:41:22.520355 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45b87c2d-9143-42f2-84b8-14db6534b894-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-gzl8s\" (UID: \"45b87c2d-9143-42f2-84b8-14db6534b894\") " pod="openstack/dnsmasq-dns-74f6bcbc87-gzl8s" Oct 07 12:41:22 crc kubenswrapper[4854]: I1007 12:41:22.520367 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45b87c2d-9143-42f2-84b8-14db6534b894-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-gzl8s\" (UID: \"45b87c2d-9143-42f2-84b8-14db6534b894\") " pod="openstack/dnsmasq-dns-74f6bcbc87-gzl8s" Oct 07 12:41:22 crc kubenswrapper[4854]: I1007 12:41:22.520421 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45b87c2d-9143-42f2-84b8-14db6534b894-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-gzl8s\" (UID: \"45b87c2d-9143-42f2-84b8-14db6534b894\") " pod="openstack/dnsmasq-dns-74f6bcbc87-gzl8s" Oct 07 12:41:22 crc kubenswrapper[4854]: I1007 12:41:22.520620 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/45b87c2d-9143-42f2-84b8-14db6534b894-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-gzl8s\" (UID: \"45b87c2d-9143-42f2-84b8-14db6534b894\") " pod="openstack/dnsmasq-dns-74f6bcbc87-gzl8s" Oct 07 12:41:22 crc kubenswrapper[4854]: I1007 12:41:22.521083 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45b87c2d-9143-42f2-84b8-14db6534b894-config\") pod \"dnsmasq-dns-74f6bcbc87-gzl8s\" (UID: \"45b87c2d-9143-42f2-84b8-14db6534b894\") " pod="openstack/dnsmasq-dns-74f6bcbc87-gzl8s" Oct 07 12:41:22 crc kubenswrapper[4854]: I1007 12:41:22.537003 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbckt\" (UniqueName: 
\"kubernetes.io/projected/45b87c2d-9143-42f2-84b8-14db6534b894-kube-api-access-xbckt\") pod \"dnsmasq-dns-74f6bcbc87-gzl8s\" (UID: \"45b87c2d-9143-42f2-84b8-14db6534b894\") " pod="openstack/dnsmasq-dns-74f6bcbc87-gzl8s" Oct 07 12:41:22 crc kubenswrapper[4854]: I1007 12:41:22.643500 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-gzl8s" Oct 07 12:41:23 crc kubenswrapper[4854]: I1007 12:41:23.106286 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-gzl8s"] Oct 07 12:41:23 crc kubenswrapper[4854]: I1007 12:41:23.919518 4854 generic.go:334] "Generic (PLEG): container finished" podID="45b87c2d-9143-42f2-84b8-14db6534b894" containerID="6cfb1b6f86e8f5068f1ccdd5f7c19800635f6d93230702e09b129ba4a20e4333" exitCode=0 Oct 07 12:41:23 crc kubenswrapper[4854]: I1007 12:41:23.919758 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-gzl8s" event={"ID":"45b87c2d-9143-42f2-84b8-14db6534b894","Type":"ContainerDied","Data":"6cfb1b6f86e8f5068f1ccdd5f7c19800635f6d93230702e09b129ba4a20e4333"} Oct 07 12:41:23 crc kubenswrapper[4854]: I1007 12:41:23.920042 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-gzl8s" event={"ID":"45b87c2d-9143-42f2-84b8-14db6534b894","Type":"ContainerStarted","Data":"3ebbfc7d3ca75d5cc2b878d51648535518c5b2aae128a7fc6f858d32f848d511"} Oct 07 12:41:23 crc kubenswrapper[4854]: I1007 12:41:23.920063 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-764c5664d7-2ww9k" podUID="0d48fc86-023d-4d59-9d7c-6ef250b11609" containerName="dnsmasq-dns" containerID="cri-o://62c9f89380455de2e9b73a0ae0e82321affae32a398760874a56d9f3fa68d9ac" gracePeriod=10 Oct 07 12:41:24 crc kubenswrapper[4854]: I1007 12:41:24.312449 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-2ww9k" Oct 07 12:41:24 crc kubenswrapper[4854]: I1007 12:41:24.347332 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0d48fc86-023d-4d59-9d7c-6ef250b11609-ovsdbserver-sb\") pod \"0d48fc86-023d-4d59-9d7c-6ef250b11609\" (UID: \"0d48fc86-023d-4d59-9d7c-6ef250b11609\") " Oct 07 12:41:24 crc kubenswrapper[4854]: I1007 12:41:24.347418 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d48fc86-023d-4d59-9d7c-6ef250b11609-dns-svc\") pod \"0d48fc86-023d-4d59-9d7c-6ef250b11609\" (UID: \"0d48fc86-023d-4d59-9d7c-6ef250b11609\") " Oct 07 12:41:24 crc kubenswrapper[4854]: I1007 12:41:24.347485 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d48fc86-023d-4d59-9d7c-6ef250b11609-config\") pod \"0d48fc86-023d-4d59-9d7c-6ef250b11609\" (UID: \"0d48fc86-023d-4d59-9d7c-6ef250b11609\") " Oct 07 12:41:24 crc kubenswrapper[4854]: I1007 12:41:24.347507 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0d48fc86-023d-4d59-9d7c-6ef250b11609-ovsdbserver-nb\") pod \"0d48fc86-023d-4d59-9d7c-6ef250b11609\" (UID: \"0d48fc86-023d-4d59-9d7c-6ef250b11609\") " Oct 07 12:41:24 crc kubenswrapper[4854]: I1007 12:41:24.347584 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0d48fc86-023d-4d59-9d7c-6ef250b11609-dns-swift-storage-0\") pod \"0d48fc86-023d-4d59-9d7c-6ef250b11609\" (UID: \"0d48fc86-023d-4d59-9d7c-6ef250b11609\") " Oct 07 12:41:24 crc kubenswrapper[4854]: I1007 12:41:24.347620 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zzsv\" (UniqueName: \"kubernetes.io/projected/0d48fc86-023d-4d59-9d7c-6ef250b11609-kube-api-access-4zzsv\") pod \"0d48fc86-023d-4d59-9d7c-6ef250b11609\" (UID: \"0d48fc86-023d-4d59-9d7c-6ef250b11609\") " Oct 07 12:41:24 crc kubenswrapper[4854]: I1007 12:41:24.351781 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d48fc86-023d-4d59-9d7c-6ef250b11609-kube-api-access-4zzsv" (OuterVolumeSpecName: "kube-api-access-4zzsv") pod "0d48fc86-023d-4d59-9d7c-6ef250b11609" (UID: "0d48fc86-023d-4d59-9d7c-6ef250b11609"). InnerVolumeSpecName "kube-api-access-4zzsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:41:24 crc kubenswrapper[4854]: I1007 12:41:24.404238 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d48fc86-023d-4d59-9d7c-6ef250b11609-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0d48fc86-023d-4d59-9d7c-6ef250b11609" (UID: "0d48fc86-023d-4d59-9d7c-6ef250b11609"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:41:24 crc kubenswrapper[4854]: I1007 12:41:24.404776 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d48fc86-023d-4d59-9d7c-6ef250b11609-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0d48fc86-023d-4d59-9d7c-6ef250b11609" (UID: "0d48fc86-023d-4d59-9d7c-6ef250b11609"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:41:24 crc kubenswrapper[4854]: I1007 12:41:24.407466 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d48fc86-023d-4d59-9d7c-6ef250b11609-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0d48fc86-023d-4d59-9d7c-6ef250b11609" (UID: "0d48fc86-023d-4d59-9d7c-6ef250b11609"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:41:24 crc kubenswrapper[4854]: I1007 12:41:24.418614 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d48fc86-023d-4d59-9d7c-6ef250b11609-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0d48fc86-023d-4d59-9d7c-6ef250b11609" (UID: "0d48fc86-023d-4d59-9d7c-6ef250b11609"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:41:24 crc kubenswrapper[4854]: I1007 12:41:24.424961 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d48fc86-023d-4d59-9d7c-6ef250b11609-config" (OuterVolumeSpecName: "config") pod "0d48fc86-023d-4d59-9d7c-6ef250b11609" (UID: "0d48fc86-023d-4d59-9d7c-6ef250b11609"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:41:24 crc kubenswrapper[4854]: I1007 12:41:24.449364 4854 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0d48fc86-023d-4d59-9d7c-6ef250b11609-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 12:41:24 crc kubenswrapper[4854]: I1007 12:41:24.449396 4854 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d48fc86-023d-4d59-9d7c-6ef250b11609-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 12:41:24 crc kubenswrapper[4854]: I1007 12:41:24.449405 4854 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d48fc86-023d-4d59-9d7c-6ef250b11609-config\") on node \"crc\" DevicePath \"\"" Oct 07 12:41:24 crc kubenswrapper[4854]: I1007 12:41:24.449413 4854 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0d48fc86-023d-4d59-9d7c-6ef250b11609-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 12:41:24 crc kubenswrapper[4854]: I1007 12:41:24.449424 4854 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0d48fc86-023d-4d59-9d7c-6ef250b11609-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 07 12:41:24 crc kubenswrapper[4854]: I1007 12:41:24.449433 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zzsv\" (UniqueName: \"kubernetes.io/projected/0d48fc86-023d-4d59-9d7c-6ef250b11609-kube-api-access-4zzsv\") on node \"crc\" DevicePath \"\"" Oct 07 12:41:24 crc kubenswrapper[4854]: I1007 12:41:24.932512 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-gzl8s" event={"ID":"45b87c2d-9143-42f2-84b8-14db6534b894","Type":"ContainerStarted","Data":"a67ba1c39b438c39f6e876b841981461fe372cdfdbd8f11d401bf07296dce3ea"} Oct 07 12:41:24 crc kubenswrapper[4854]: I1007 12:41:24.932705 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74f6bcbc87-gzl8s" Oct 07 12:41:24 crc kubenswrapper[4854]: I1007 12:41:24.934128 4854 generic.go:334] "Generic (PLEG): container finished" 
podID="0d48fc86-023d-4d59-9d7c-6ef250b11609" containerID="62c9f89380455de2e9b73a0ae0e82321affae32a398760874a56d9f3fa68d9ac" exitCode=0 Oct 07 12:41:24 crc kubenswrapper[4854]: I1007 12:41:24.934174 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-2ww9k" event={"ID":"0d48fc86-023d-4d59-9d7c-6ef250b11609","Type":"ContainerDied","Data":"62c9f89380455de2e9b73a0ae0e82321affae32a398760874a56d9f3fa68d9ac"} Oct 07 12:41:24 crc kubenswrapper[4854]: I1007 12:41:24.934197 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-2ww9k" event={"ID":"0d48fc86-023d-4d59-9d7c-6ef250b11609","Type":"ContainerDied","Data":"3052e7dee08bd0a1e6d22a6cfa3e2b256d9918d7b5159687e8332776afad286a"} Oct 07 12:41:24 crc kubenswrapper[4854]: I1007 12:41:24.934218 4854 scope.go:117] "RemoveContainer" containerID="62c9f89380455de2e9b73a0ae0e82321affae32a398760874a56d9f3fa68d9ac" Oct 07 12:41:24 crc kubenswrapper[4854]: I1007 12:41:24.934254 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-2ww9k" Oct 07 12:41:24 crc kubenswrapper[4854]: I1007 12:41:24.960701 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74f6bcbc87-gzl8s" podStartSLOduration=2.960682171 podStartE2EDuration="2.960682171s" podCreationTimestamp="2025-10-07 12:41:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:41:24.953874683 +0000 UTC m=+1000.941706958" watchObservedRunningTime="2025-10-07 12:41:24.960682171 +0000 UTC m=+1000.948514426" Oct 07 12:41:24 crc kubenswrapper[4854]: I1007 12:41:24.995758 4854 scope.go:117] "RemoveContainer" containerID="0d49070f8fa61123631ed4df9d4d8a048b827eeaf435f873fe6b5ddf4ff6cd0d" Oct 07 12:41:25 crc kubenswrapper[4854]: I1007 12:41:25.007197 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-2ww9k"] Oct 07 12:41:25 crc kubenswrapper[4854]: I1007 12:41:25.010739 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-2ww9k"] Oct 07 12:41:25 crc kubenswrapper[4854]: I1007 12:41:25.016577 4854 scope.go:117] "RemoveContainer" containerID="62c9f89380455de2e9b73a0ae0e82321affae32a398760874a56d9f3fa68d9ac" Oct 07 12:41:25 crc kubenswrapper[4854]: E1007 12:41:25.018518 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62c9f89380455de2e9b73a0ae0e82321affae32a398760874a56d9f3fa68d9ac\": container with ID starting with 62c9f89380455de2e9b73a0ae0e82321affae32a398760874a56d9f3fa68d9ac not found: ID does not exist" containerID="62c9f89380455de2e9b73a0ae0e82321affae32a398760874a56d9f3fa68d9ac" Oct 07 12:41:25 crc kubenswrapper[4854]: I1007 12:41:25.018625 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62c9f89380455de2e9b73a0ae0e82321affae32a398760874a56d9f3fa68d9ac"} err="failed to get container status \"62c9f89380455de2e9b73a0ae0e82321affae32a398760874a56d9f3fa68d9ac\": rpc error: code = NotFound desc = could not find container \"62c9f89380455de2e9b73a0ae0e82321affae32a398760874a56d9f3fa68d9ac\": container with ID starting with 62c9f89380455de2e9b73a0ae0e82321affae32a398760874a56d9f3fa68d9ac not found: ID does not exist" Oct 07 12:41:25 crc kubenswrapper[4854]: I1007 12:41:25.018673 4854 scope.go:117] "RemoveContainer" 
containerID="0d49070f8fa61123631ed4df9d4d8a048b827eeaf435f873fe6b5ddf4ff6cd0d" Oct 07 12:41:25 crc kubenswrapper[4854]: E1007 12:41:25.019062 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d49070f8fa61123631ed4df9d4d8a048b827eeaf435f873fe6b5ddf4ff6cd0d\": container with ID starting with 0d49070f8fa61123631ed4df9d4d8a048b827eeaf435f873fe6b5ddf4ff6cd0d not found: ID does not exist" containerID="0d49070f8fa61123631ed4df9d4d8a048b827eeaf435f873fe6b5ddf4ff6cd0d" Oct 07 12:41:25 crc kubenswrapper[4854]: I1007 12:41:25.019099 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d49070f8fa61123631ed4df9d4d8a048b827eeaf435f873fe6b5ddf4ff6cd0d"} err="failed to get container status \"0d49070f8fa61123631ed4df9d4d8a048b827eeaf435f873fe6b5ddf4ff6cd0d\": rpc error: code = NotFound desc = could not find container \"0d49070f8fa61123631ed4df9d4d8a048b827eeaf435f873fe6b5ddf4ff6cd0d\": container with ID starting with 0d49070f8fa61123631ed4df9d4d8a048b827eeaf435f873fe6b5ddf4ff6cd0d not found: ID does not exist" Oct 07 12:41:26 crc kubenswrapper[4854]: I1007 12:41:26.230503 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 07 12:41:26 crc kubenswrapper[4854]: I1007 12:41:26.561418 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 07 12:41:26 crc kubenswrapper[4854]: I1007 12:41:26.619424 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-276gh"] Oct 07 12:41:26 crc kubenswrapper[4854]: E1007 12:41:26.620089 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d48fc86-023d-4d59-9d7c-6ef250b11609" containerName="init" Oct 07 12:41:26 crc kubenswrapper[4854]: I1007 12:41:26.620107 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d48fc86-023d-4d59-9d7c-6ef250b11609" containerName="init" Oct 07 12:41:26 crc kubenswrapper[4854]: E1007 12:41:26.620130 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d48fc86-023d-4d59-9d7c-6ef250b11609" containerName="dnsmasq-dns" Oct 07 12:41:26 crc kubenswrapper[4854]: I1007 12:41:26.620138 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d48fc86-023d-4d59-9d7c-6ef250b11609" containerName="dnsmasq-dns" Oct 07 12:41:26 crc kubenswrapper[4854]: I1007 12:41:26.620608 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d48fc86-023d-4d59-9d7c-6ef250b11609" containerName="dnsmasq-dns" Oct 07 12:41:26 crc kubenswrapper[4854]: I1007 12:41:26.621579 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-276gh" Oct 07 12:41:26 crc kubenswrapper[4854]: I1007 12:41:26.644022 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-276gh"] Oct 07 12:41:26 crc kubenswrapper[4854]: I1007 12:41:26.714713 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d48fc86-023d-4d59-9d7c-6ef250b11609" path="/var/lib/kubelet/pods/0d48fc86-023d-4d59-9d7c-6ef250b11609/volumes" Oct 07 12:41:26 crc kubenswrapper[4854]: I1007 12:41:26.731006 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-hfzk2"] Oct 07 12:41:26 crc kubenswrapper[4854]: I1007 12:41:26.732115 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-hfzk2" Oct 07 12:41:26 crc kubenswrapper[4854]: I1007 12:41:26.743245 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-hfzk2"] Oct 07 12:41:26 crc kubenswrapper[4854]: I1007 12:41:26.796713 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8wvz\" (UniqueName: \"kubernetes.io/projected/96132415-18c4-42a4-bb2c-cec92945602e-kube-api-access-d8wvz\") pod \"barbican-db-create-276gh\" (UID: \"96132415-18c4-42a4-bb2c-cec92945602e\") " pod="openstack/barbican-db-create-276gh" Oct 07 12:41:26 crc kubenswrapper[4854]: I1007 12:41:26.819965 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-7plvr"] Oct 07 12:41:26 crc kubenswrapper[4854]: I1007 12:41:26.821747 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-7plvr" Oct 07 12:41:26 crc kubenswrapper[4854]: I1007 12:41:26.833460 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-7plvr"] Oct 07 12:41:26 crc kubenswrapper[4854]: I1007 12:41:26.874606 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-f7kxk"] Oct 07 12:41:26 crc kubenswrapper[4854]: I1007 12:41:26.876054 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-f7kxk" Oct 07 12:41:26 crc kubenswrapper[4854]: I1007 12:41:26.878652 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 07 12:41:26 crc kubenswrapper[4854]: I1007 12:41:26.878808 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-m9mdj" Oct 07 12:41:26 crc kubenswrapper[4854]: I1007 12:41:26.878890 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 07 12:41:26 crc kubenswrapper[4854]: I1007 12:41:26.879495 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 07 12:41:26 crc kubenswrapper[4854]: I1007 12:41:26.886109 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-f7kxk"] Oct 07 12:41:26 crc kubenswrapper[4854]: I1007 12:41:26.898667 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8wvz\" (UniqueName: \"kubernetes.io/projected/96132415-18c4-42a4-bb2c-cec92945602e-kube-api-access-d8wvz\") pod \"barbican-db-create-276gh\" (UID: \"96132415-18c4-42a4-bb2c-cec92945602e\") " pod="openstack/barbican-db-create-276gh" Oct 07 12:41:26 crc kubenswrapper[4854]: I1007 12:41:26.898792 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7d2c\" (UniqueName: \"kubernetes.io/projected/95bc842d-1bac-4c0e-b2c8-4f0838a630ad-kube-api-access-m7d2c\") pod \"cinder-db-create-hfzk2\" (UID: \"95bc842d-1bac-4c0e-b2c8-4f0838a630ad\") " pod="openstack/cinder-db-create-hfzk2" Oct 07 12:41:26 crc kubenswrapper[4854]: I1007 12:41:26.931501 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8wvz\" (UniqueName: \"kubernetes.io/projected/96132415-18c4-42a4-bb2c-cec92945602e-kube-api-access-d8wvz\") pod \"barbican-db-create-276gh\" (UID: \"96132415-18c4-42a4-bb2c-cec92945602e\") " pod="openstack/barbican-db-create-276gh" Oct 07 12:41:26 crc kubenswrapper[4854]: I1007 12:41:26.946323 4854 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-276gh" Oct 07 12:41:27 crc kubenswrapper[4854]: I1007 12:41:27.000876 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5683d2a3-82a3-4e7b-b5eb-9a2e46c10fc9-combined-ca-bundle\") pod \"keystone-db-sync-f7kxk\" (UID: \"5683d2a3-82a3-4e7b-b5eb-9a2e46c10fc9\") " pod="openstack/keystone-db-sync-f7kxk" Oct 07 12:41:27 crc kubenswrapper[4854]: I1007 12:41:27.000916 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swdj6\" (UniqueName: \"kubernetes.io/projected/5683d2a3-82a3-4e7b-b5eb-9a2e46c10fc9-kube-api-access-swdj6\") pod \"keystone-db-sync-f7kxk\" (UID: \"5683d2a3-82a3-4e7b-b5eb-9a2e46c10fc9\") " pod="openstack/keystone-db-sync-f7kxk" Oct 07 12:41:27 crc kubenswrapper[4854]: I1007 12:41:27.000946 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzzdv\" (UniqueName: \"kubernetes.io/projected/e2a2edca-4c99-45d7-b47e-12d31a7e649f-kube-api-access-pzzdv\") pod \"neutron-db-create-7plvr\" (UID: \"e2a2edca-4c99-45d7-b47e-12d31a7e649f\") " pod="openstack/neutron-db-create-7plvr" Oct 07 12:41:27 crc kubenswrapper[4854]: I1007 12:41:27.001137 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7d2c\" (UniqueName: \"kubernetes.io/projected/95bc842d-1bac-4c0e-b2c8-4f0838a630ad-kube-api-access-m7d2c\") pod \"cinder-db-create-hfzk2\" (UID: \"95bc842d-1bac-4c0e-b2c8-4f0838a630ad\") " pod="openstack/cinder-db-create-hfzk2" Oct 07 12:41:27 crc kubenswrapper[4854]: I1007 12:41:27.001274 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5683d2a3-82a3-4e7b-b5eb-9a2e46c10fc9-config-data\") pod \"keystone-db-sync-f7kxk\" (UID: \"5683d2a3-82a3-4e7b-b5eb-9a2e46c10fc9\") " pod="openstack/keystone-db-sync-f7kxk" Oct 07 12:41:27 crc kubenswrapper[4854]: I1007 12:41:27.025095 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7d2c\" (UniqueName: \"kubernetes.io/projected/95bc842d-1bac-4c0e-b2c8-4f0838a630ad-kube-api-access-m7d2c\") pod \"cinder-db-create-hfzk2\" (UID: \"95bc842d-1bac-4c0e-b2c8-4f0838a630ad\") " pod="openstack/cinder-db-create-hfzk2" Oct 07 12:41:27 crc kubenswrapper[4854]: I1007 12:41:27.056987 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-hfzk2" Oct 07 12:41:27 crc kubenswrapper[4854]: I1007 12:41:27.104323 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5683d2a3-82a3-4e7b-b5eb-9a2e46c10fc9-config-data\") pod \"keystone-db-sync-f7kxk\" (UID: \"5683d2a3-82a3-4e7b-b5eb-9a2e46c10fc9\") " pod="openstack/keystone-db-sync-f7kxk" Oct 07 12:41:27 crc kubenswrapper[4854]: I1007 12:41:27.104655 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5683d2a3-82a3-4e7b-b5eb-9a2e46c10fc9-combined-ca-bundle\") pod \"keystone-db-sync-f7kxk\" (UID: \"5683d2a3-82a3-4e7b-b5eb-9a2e46c10fc9\") " pod="openstack/keystone-db-sync-f7kxk" Oct 07 12:41:27 crc kubenswrapper[4854]: I1007 12:41:27.104676 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swdj6\" (UniqueName: \"kubernetes.io/projected/5683d2a3-82a3-4e7b-b5eb-9a2e46c10fc9-kube-api-access-swdj6\") pod \"keystone-db-sync-f7kxk\" (UID: \"5683d2a3-82a3-4e7b-b5eb-9a2e46c10fc9\") " pod="openstack/keystone-db-sync-f7kxk" Oct 07 12:41:27 crc kubenswrapper[4854]: I1007 12:41:27.104703 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzzdv\" (UniqueName: \"kubernetes.io/projected/e2a2edca-4c99-45d7-b47e-12d31a7e649f-kube-api-access-pzzdv\") pod \"neutron-db-create-7plvr\" (UID: \"e2a2edca-4c99-45d7-b47e-12d31a7e649f\") " pod="openstack/neutron-db-create-7plvr" Oct 07 12:41:27 crc kubenswrapper[4854]: I1007 12:41:27.109532 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5683d2a3-82a3-4e7b-b5eb-9a2e46c10fc9-config-data\") pod \"keystone-db-sync-f7kxk\" (UID: \"5683d2a3-82a3-4e7b-b5eb-9a2e46c10fc9\") " pod="openstack/keystone-db-sync-f7kxk" Oct 07 12:41:27 crc kubenswrapper[4854]: I1007 12:41:27.112301 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5683d2a3-82a3-4e7b-b5eb-9a2e46c10fc9-combined-ca-bundle\") pod \"keystone-db-sync-f7kxk\" (UID: \"5683d2a3-82a3-4e7b-b5eb-9a2e46c10fc9\") " pod="openstack/keystone-db-sync-f7kxk" Oct 07 12:41:27 crc kubenswrapper[4854]: I1007 12:41:27.120426 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swdj6\" (UniqueName: \"kubernetes.io/projected/5683d2a3-82a3-4e7b-b5eb-9a2e46c10fc9-kube-api-access-swdj6\") pod \"keystone-db-sync-f7kxk\" (UID: \"5683d2a3-82a3-4e7b-b5eb-9a2e46c10fc9\") " pod="openstack/keystone-db-sync-f7kxk" Oct 07 12:41:27 crc kubenswrapper[4854]: I1007 12:41:27.129302 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzzdv\" (UniqueName: \"kubernetes.io/projected/e2a2edca-4c99-45d7-b47e-12d31a7e649f-kube-api-access-pzzdv\") pod \"neutron-db-create-7plvr\" (UID: \"e2a2edca-4c99-45d7-b47e-12d31a7e649f\") " pod="openstack/neutron-db-create-7plvr" Oct 07 12:41:27 crc kubenswrapper[4854]: I1007 12:41:27.140087 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-7plvr" Oct 07 12:41:27 crc kubenswrapper[4854]: I1007 12:41:27.195490 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-f7kxk" Oct 07 12:41:27 crc kubenswrapper[4854]: I1007 12:41:27.391865 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-276gh"] Oct 07 12:41:27 crc kubenswrapper[4854]: W1007 12:41:27.396074 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96132415_18c4_42a4_bb2c_cec92945602e.slice/crio-682c6f226606471b06b1f7f85efa7621143fa0f00d70701ba8afaac29cde7076 WatchSource:0}: Error finding container 682c6f226606471b06b1f7f85efa7621143fa0f00d70701ba8afaac29cde7076: Status 404 returned error can't find the container with id 682c6f226606471b06b1f7f85efa7621143fa0f00d70701ba8afaac29cde7076 Oct 07 12:41:27 crc kubenswrapper[4854]: I1007 12:41:27.537557 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-hfzk2"] Oct 07 12:41:27 crc kubenswrapper[4854]: W1007 12:41:27.548126 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95bc842d_1bac_4c0e_b2c8_4f0838a630ad.slice/crio-5b856f729453608068fe8561c2ebc1da61ecf28ec2a3011a821967dfc4b60b8d WatchSource:0}: Error finding container 5b856f729453608068fe8561c2ebc1da61ecf28ec2a3011a821967dfc4b60b8d: Status 404 returned error can't find the container with id 5b856f729453608068fe8561c2ebc1da61ecf28ec2a3011a821967dfc4b60b8d Oct 07 12:41:27 crc kubenswrapper[4854]: I1007 12:41:27.680091 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-f7kxk"] Oct 07 12:41:27 crc kubenswrapper[4854]: W1007 12:41:27.685106 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5683d2a3_82a3_4e7b_b5eb_9a2e46c10fc9.slice/crio-fcabc9730eb13681654f46f7ae5f6b8f435e5ee9ce3f188acf54bfa14ea79ec9 WatchSource:0}: Error finding container fcabc9730eb13681654f46f7ae5f6b8f435e5ee9ce3f188acf54bfa14ea79ec9: Status 404 returned error can't find the container with id fcabc9730eb13681654f46f7ae5f6b8f435e5ee9ce3f188acf54bfa14ea79ec9 Oct 07 12:41:27 crc kubenswrapper[4854]: W1007 12:41:27.745649 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2a2edca_4c99_45d7_b47e_12d31a7e649f.slice/crio-d136c21804dd295f9f1b7eea4ec8b0df46fa34fab1b3a3f46203bb0174aaaecb WatchSource:0}: Error finding container d136c21804dd295f9f1b7eea4ec8b0df46fa34fab1b3a3f46203bb0174aaaecb: Status 404 returned error can't find the container with id d136c21804dd295f9f1b7eea4ec8b0df46fa34fab1b3a3f46203bb0174aaaecb Oct 07 12:41:27 crc kubenswrapper[4854]: I1007 12:41:27.746307 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-7plvr"] Oct 07 12:41:27 crc kubenswrapper[4854]: I1007 12:41:27.963830 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-f7kxk" event={"ID":"5683d2a3-82a3-4e7b-b5eb-9a2e46c10fc9","Type":"ContainerStarted","Data":"fcabc9730eb13681654f46f7ae5f6b8f435e5ee9ce3f188acf54bfa14ea79ec9"} Oct 07 12:41:27 crc kubenswrapper[4854]: I1007 12:41:27.966561 4854 generic.go:334] "Generic (PLEG): container finished" podID="e2a2edca-4c99-45d7-b47e-12d31a7e649f" containerID="d6e466dc031d969ff433a05de16ba57f46bc96a928c3c245b5265fd6a8ec3281" exitCode=0 Oct 07 12:41:27 crc kubenswrapper[4854]: I1007 12:41:27.966664 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-db-create-7plvr" event={"ID":"e2a2edca-4c99-45d7-b47e-12d31a7e649f","Type":"ContainerDied","Data":"d6e466dc031d969ff433a05de16ba57f46bc96a928c3c245b5265fd6a8ec3281"} Oct 07 12:41:27 crc kubenswrapper[4854]: I1007 12:41:27.966729 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-7plvr" event={"ID":"e2a2edca-4c99-45d7-b47e-12d31a7e649f","Type":"ContainerStarted","Data":"d136c21804dd295f9f1b7eea4ec8b0df46fa34fab1b3a3f46203bb0174aaaecb"} Oct 07 12:41:27 crc kubenswrapper[4854]: I1007 12:41:27.968751 4854 generic.go:334] "Generic (PLEG): container finished" podID="95bc842d-1bac-4c0e-b2c8-4f0838a630ad" containerID="1a711adb612268414ef00836cdfdcd54f8b6b0c7ae72e867963de67d1ce4ec42" exitCode=0 Oct 07 12:41:27 crc kubenswrapper[4854]: I1007 12:41:27.968810 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-hfzk2" event={"ID":"95bc842d-1bac-4c0e-b2c8-4f0838a630ad","Type":"ContainerDied","Data":"1a711adb612268414ef00836cdfdcd54f8b6b0c7ae72e867963de67d1ce4ec42"} Oct 07 12:41:27 crc kubenswrapper[4854]: I1007 12:41:27.968828 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-hfzk2" event={"ID":"95bc842d-1bac-4c0e-b2c8-4f0838a630ad","Type":"ContainerStarted","Data":"5b856f729453608068fe8561c2ebc1da61ecf28ec2a3011a821967dfc4b60b8d"} Oct 07 12:41:27 crc kubenswrapper[4854]: I1007 12:41:27.970296 4854 generic.go:334] "Generic (PLEG): container finished" podID="96132415-18c4-42a4-bb2c-cec92945602e" containerID="9efa40bc71134d86579d5973dade48957f94cbd186d67ecf45f9895e68a1183d" exitCode=0 Oct 07 12:41:27 crc kubenswrapper[4854]: I1007 12:41:27.970373 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-276gh" event={"ID":"96132415-18c4-42a4-bb2c-cec92945602e","Type":"ContainerDied","Data":"9efa40bc71134d86579d5973dade48957f94cbd186d67ecf45f9895e68a1183d"} Oct 07 12:41:27 crc kubenswrapper[4854]: I1007 12:41:27.970736 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-276gh" event={"ID":"96132415-18c4-42a4-bb2c-cec92945602e","Type":"ContainerStarted","Data":"682c6f226606471b06b1f7f85efa7621143fa0f00d70701ba8afaac29cde7076"} Oct 07 12:41:29 crc kubenswrapper[4854]: I1007 12:41:29.374192 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-7plvr" Oct 07 12:41:29 crc kubenswrapper[4854]: I1007 12:41:29.445354 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzzdv\" (UniqueName: \"kubernetes.io/projected/e2a2edca-4c99-45d7-b47e-12d31a7e649f-kube-api-access-pzzdv\") pod \"e2a2edca-4c99-45d7-b47e-12d31a7e649f\" (UID: \"e2a2edca-4c99-45d7-b47e-12d31a7e649f\") " Oct 07 12:41:29 crc kubenswrapper[4854]: I1007 12:41:29.466462 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2a2edca-4c99-45d7-b47e-12d31a7e649f-kube-api-access-pzzdv" (OuterVolumeSpecName: "kube-api-access-pzzdv") pod "e2a2edca-4c99-45d7-b47e-12d31a7e649f" (UID: "e2a2edca-4c99-45d7-b47e-12d31a7e649f"). InnerVolumeSpecName "kube-api-access-pzzdv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:41:29 crc kubenswrapper[4854]: I1007 12:41:29.517039 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-hfzk2" Oct 07 12:41:29 crc kubenswrapper[4854]: I1007 12:41:29.524811 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-276gh" Oct 07 12:41:29 crc kubenswrapper[4854]: I1007 12:41:29.550848 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzzdv\" (UniqueName: \"kubernetes.io/projected/e2a2edca-4c99-45d7-b47e-12d31a7e649f-kube-api-access-pzzdv\") on node \"crc\" DevicePath \"\"" Oct 07 12:41:29 crc kubenswrapper[4854]: I1007 12:41:29.651619 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8wvz\" (UniqueName: \"kubernetes.io/projected/96132415-18c4-42a4-bb2c-cec92945602e-kube-api-access-d8wvz\") pod \"96132415-18c4-42a4-bb2c-cec92945602e\" (UID: \"96132415-18c4-42a4-bb2c-cec92945602e\") " Oct 07 12:41:29 crc kubenswrapper[4854]: I1007 12:41:29.651732 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7d2c\" (UniqueName: \"kubernetes.io/projected/95bc842d-1bac-4c0e-b2c8-4f0838a630ad-kube-api-access-m7d2c\") pod \"95bc842d-1bac-4c0e-b2c8-4f0838a630ad\" (UID: \"95bc842d-1bac-4c0e-b2c8-4f0838a630ad\") " Oct 07 12:41:29 crc kubenswrapper[4854]: I1007 12:41:29.656274 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96132415-18c4-42a4-bb2c-cec92945602e-kube-api-access-d8wvz" (OuterVolumeSpecName: "kube-api-access-d8wvz") pod "96132415-18c4-42a4-bb2c-cec92945602e" (UID: "96132415-18c4-42a4-bb2c-cec92945602e"). InnerVolumeSpecName "kube-api-access-d8wvz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:41:29 crc kubenswrapper[4854]: I1007 12:41:29.657972 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95bc842d-1bac-4c0e-b2c8-4f0838a630ad-kube-api-access-m7d2c" (OuterVolumeSpecName: "kube-api-access-m7d2c") pod "95bc842d-1bac-4c0e-b2c8-4f0838a630ad" (UID: "95bc842d-1bac-4c0e-b2c8-4f0838a630ad"). InnerVolumeSpecName "kube-api-access-m7d2c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:41:29 crc kubenswrapper[4854]: I1007 12:41:29.753588 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7d2c\" (UniqueName: \"kubernetes.io/projected/95bc842d-1bac-4c0e-b2c8-4f0838a630ad-kube-api-access-m7d2c\") on node \"crc\" DevicePath \"\"" Oct 07 12:41:29 crc kubenswrapper[4854]: I1007 12:41:29.753858 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8wvz\" (UniqueName: \"kubernetes.io/projected/96132415-18c4-42a4-bb2c-cec92945602e-kube-api-access-d8wvz\") on node \"crc\" DevicePath \"\"" Oct 07 12:41:29 crc kubenswrapper[4854]: I1007 12:41:29.986524 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-hfzk2" event={"ID":"95bc842d-1bac-4c0e-b2c8-4f0838a630ad","Type":"ContainerDied","Data":"5b856f729453608068fe8561c2ebc1da61ecf28ec2a3011a821967dfc4b60b8d"} Oct 07 12:41:29 crc kubenswrapper[4854]: I1007 12:41:29.986567 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b856f729453608068fe8561c2ebc1da61ecf28ec2a3011a821967dfc4b60b8d" Oct 07 12:41:29 crc kubenswrapper[4854]: I1007 12:41:29.986629 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-hfzk2" Oct 07 12:41:29 crc kubenswrapper[4854]: I1007 12:41:29.988042 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-276gh" event={"ID":"96132415-18c4-42a4-bb2c-cec92945602e","Type":"ContainerDied","Data":"682c6f226606471b06b1f7f85efa7621143fa0f00d70701ba8afaac29cde7076"} Oct 07 12:41:29 crc kubenswrapper[4854]: I1007 12:41:29.988083 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="682c6f226606471b06b1f7f85efa7621143fa0f00d70701ba8afaac29cde7076" Oct 07 12:41:29 crc kubenswrapper[4854]: I1007 12:41:29.988687 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-276gh" Oct 07 12:41:29 crc kubenswrapper[4854]: I1007 12:41:29.990459 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-7plvr" event={"ID":"e2a2edca-4c99-45d7-b47e-12d31a7e649f","Type":"ContainerDied","Data":"d136c21804dd295f9f1b7eea4ec8b0df46fa34fab1b3a3f46203bb0174aaaecb"} Oct 07 12:41:29 crc kubenswrapper[4854]: I1007 12:41:29.990494 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d136c21804dd295f9f1b7eea4ec8b0df46fa34fab1b3a3f46203bb0174aaaecb" Oct 07 12:41:29 crc kubenswrapper[4854]: I1007 12:41:29.990553 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-7plvr" Oct 07 12:41:32 crc kubenswrapper[4854]: I1007 12:41:32.645002 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-74f6bcbc87-gzl8s" Oct 07 12:41:32 crc kubenswrapper[4854]: I1007 12:41:32.733981 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-g5xmp"] Oct 07 12:41:32 crc kubenswrapper[4854]: I1007 12:41:32.734222 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-g5xmp" podUID="a039d6d1-21b4-480b-b8ed-c50693487fba" containerName="dnsmasq-dns" containerID="cri-o://59e333b744eb6889433b1f20e4487a8bd3bebd36f78607ef3df410ee233120f6" gracePeriod=10 Oct 07 12:41:33 crc kubenswrapper[4854]: I1007 12:41:33.032725 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-f7kxk" event={"ID":"5683d2a3-82a3-4e7b-b5eb-9a2e46c10fc9","Type":"ContainerStarted","Data":"6f2b3d88892a064ad90556a7c9f95263f2d028c6a003fde0f818fd8da3809ec3"} Oct 07 12:41:33 crc kubenswrapper[4854]: I1007 12:41:33.036097 4854 generic.go:334] "Generic (PLEG): container finished" podID="a039d6d1-21b4-480b-b8ed-c50693487fba" containerID="59e333b744eb6889433b1f20e4487a8bd3bebd36f78607ef3df410ee233120f6" exitCode=0 Oct 07 12:41:33 crc kubenswrapper[4854]: I1007 12:41:33.036157 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-g5xmp" event={"ID":"a039d6d1-21b4-480b-b8ed-c50693487fba","Type":"ContainerDied","Data":"59e333b744eb6889433b1f20e4487a8bd3bebd36f78607ef3df410ee233120f6"} Oct 07 12:41:33 crc kubenswrapper[4854]: I1007 12:41:33.226899 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-g5xmp" Oct 07 12:41:33 crc kubenswrapper[4854]: I1007 12:41:33.250412 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-f7kxk" podStartSLOduration=2.569178256 podStartE2EDuration="7.250386421s" podCreationTimestamp="2025-10-07 12:41:26 +0000 UTC" firstStartedPulling="2025-10-07 12:41:27.688263647 +0000 UTC m=+1003.676095912" lastFinishedPulling="2025-10-07 12:41:32.369471802 +0000 UTC m=+1008.357304077" observedRunningTime="2025-10-07 12:41:33.052987978 +0000 UTC m=+1009.040820243" watchObservedRunningTime="2025-10-07 12:41:33.250386421 +0000 UTC m=+1009.238218686" Oct 07 12:41:33 crc kubenswrapper[4854]: I1007 12:41:33.330092 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a039d6d1-21b4-480b-b8ed-c50693487fba-ovsdbserver-sb\") pod \"a039d6d1-21b4-480b-b8ed-c50693487fba\" (UID: \"a039d6d1-21b4-480b-b8ed-c50693487fba\") " Oct 07 12:41:33 crc kubenswrapper[4854]: I1007 12:41:33.330310 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a039d6d1-21b4-480b-b8ed-c50693487fba-config\") pod \"a039d6d1-21b4-480b-b8ed-c50693487fba\" (UID: \"a039d6d1-21b4-480b-b8ed-c50693487fba\") " Oct 07 12:41:33 crc kubenswrapper[4854]: I1007 12:41:33.330399 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmml6\" (UniqueName: \"kubernetes.io/projected/a039d6d1-21b4-480b-b8ed-c50693487fba-kube-api-access-fmml6\") pod \"a039d6d1-21b4-480b-b8ed-c50693487fba\" (UID: \"a039d6d1-21b4-480b-b8ed-c50693487fba\") " Oct 07 12:41:33 crc kubenswrapper[4854]: I1007 12:41:33.330443 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a039d6d1-21b4-480b-b8ed-c50693487fba-dns-svc\") pod \"a039d6d1-21b4-480b-b8ed-c50693487fba\" (UID: \"a039d6d1-21b4-480b-b8ed-c50693487fba\") " Oct 07 12:41:33 crc kubenswrapper[4854]: I1007 12:41:33.330484 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a039d6d1-21b4-480b-b8ed-c50693487fba-ovsdbserver-nb\") pod \"a039d6d1-21b4-480b-b8ed-c50693487fba\" (UID: \"a039d6d1-21b4-480b-b8ed-c50693487fba\") " Oct 07 12:41:33 crc kubenswrapper[4854]: I1007 12:41:33.335421 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a039d6d1-21b4-480b-b8ed-c50693487fba-kube-api-access-fmml6" (OuterVolumeSpecName: "kube-api-access-fmml6") pod "a039d6d1-21b4-480b-b8ed-c50693487fba" (UID: "a039d6d1-21b4-480b-b8ed-c50693487fba"). InnerVolumeSpecName "kube-api-access-fmml6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:41:33 crc kubenswrapper[4854]: I1007 12:41:33.371185 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a039d6d1-21b4-480b-b8ed-c50693487fba-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a039d6d1-21b4-480b-b8ed-c50693487fba" (UID: "a039d6d1-21b4-480b-b8ed-c50693487fba"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:41:33 crc kubenswrapper[4854]: I1007 12:41:33.371264 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a039d6d1-21b4-480b-b8ed-c50693487fba-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a039d6d1-21b4-480b-b8ed-c50693487fba" (UID: "a039d6d1-21b4-480b-b8ed-c50693487fba"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:41:33 crc kubenswrapper[4854]: I1007 12:41:33.373931 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a039d6d1-21b4-480b-b8ed-c50693487fba-config" (OuterVolumeSpecName: "config") pod "a039d6d1-21b4-480b-b8ed-c50693487fba" (UID: "a039d6d1-21b4-480b-b8ed-c50693487fba"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:41:33 crc kubenswrapper[4854]: I1007 12:41:33.376511 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a039d6d1-21b4-480b-b8ed-c50693487fba-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a039d6d1-21b4-480b-b8ed-c50693487fba" (UID: "a039d6d1-21b4-480b-b8ed-c50693487fba"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:41:33 crc kubenswrapper[4854]: I1007 12:41:33.432598 4854 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a039d6d1-21b4-480b-b8ed-c50693487fba-config\") on node \"crc\" DevicePath \"\"" Oct 07 12:41:33 crc kubenswrapper[4854]: I1007 12:41:33.432635 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmml6\" (UniqueName: \"kubernetes.io/projected/a039d6d1-21b4-480b-b8ed-c50693487fba-kube-api-access-fmml6\") on node \"crc\" DevicePath \"\"" Oct 07 12:41:33 crc kubenswrapper[4854]: I1007 12:41:33.432645 4854 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a039d6d1-21b4-480b-b8ed-c50693487fba-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 12:41:33 crc kubenswrapper[4854]: I1007 12:41:33.432662 4854 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a039d6d1-21b4-480b-b8ed-c50693487fba-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 12:41:33 crc kubenswrapper[4854]: I1007 12:41:33.432672 4854 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a039d6d1-21b4-480b-b8ed-c50693487fba-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 12:41:34 crc kubenswrapper[4854]: I1007 12:41:34.047295 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-g5xmp" event={"ID":"a039d6d1-21b4-480b-b8ed-c50693487fba","Type":"ContainerDied","Data":"d139652ed59936e19247e2048e5ebbc0241fd32a04ffab193c1963c3e5f1fbd5"} Oct 07 12:41:34 crc kubenswrapper[4854]: I1007 12:41:34.047358 4854 scope.go:117] "RemoveContainer" containerID="59e333b744eb6889433b1f20e4487a8bd3bebd36f78607ef3df410ee233120f6" Oct 07 12:41:34 crc kubenswrapper[4854]: I1007 12:41:34.047360 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-g5xmp" Oct 07 12:41:34 crc kubenswrapper[4854]: I1007 12:41:34.069292 4854 scope.go:117] "RemoveContainer" containerID="83587e7eabfb609f59037051b59e8c8d9fc3ca00cdcbc65b9336aaea9034b914" Oct 07 12:41:34 crc kubenswrapper[4854]: I1007 12:41:34.098085 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-g5xmp"] Oct 07 12:41:34 crc kubenswrapper[4854]: I1007 12:41:34.104323 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-g5xmp"] Oct 07 12:41:34 crc kubenswrapper[4854]: I1007 12:41:34.716990 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a039d6d1-21b4-480b-b8ed-c50693487fba" path="/var/lib/kubelet/pods/a039d6d1-21b4-480b-b8ed-c50693487fba/volumes" Oct 07 12:41:36 crc kubenswrapper[4854]: I1007 12:41:36.071259 4854 generic.go:334] "Generic (PLEG): container finished" podID="5683d2a3-82a3-4e7b-b5eb-9a2e46c10fc9" containerID="6f2b3d88892a064ad90556a7c9f95263f2d028c6a003fde0f818fd8da3809ec3" exitCode=0 Oct 07 12:41:36 crc kubenswrapper[4854]: I1007 12:41:36.071313 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-f7kxk" event={"ID":"5683d2a3-82a3-4e7b-b5eb-9a2e46c10fc9","Type":"ContainerDied","Data":"6f2b3d88892a064ad90556a7c9f95263f2d028c6a003fde0f818fd8da3809ec3"} Oct 07 12:41:36 crc kubenswrapper[4854]: I1007 12:41:36.653914 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-a01a-account-create-c6crg"] Oct 07 12:41:36 crc kubenswrapper[4854]: E1007 12:41:36.654289 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a039d6d1-21b4-480b-b8ed-c50693487fba" containerName="init" Oct 07 12:41:36 crc kubenswrapper[4854]: I1007 12:41:36.654305 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="a039d6d1-21b4-480b-b8ed-c50693487fba" containerName="init" Oct 07 12:41:36 crc kubenswrapper[4854]: E1007 12:41:36.654322 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a039d6d1-21b4-480b-b8ed-c50693487fba" containerName="dnsmasq-dns" Oct 07 12:41:36 crc kubenswrapper[4854]: I1007 12:41:36.654330 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="a039d6d1-21b4-480b-b8ed-c50693487fba" containerName="dnsmasq-dns" Oct 07 12:41:36 crc kubenswrapper[4854]: E1007 12:41:36.654356 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95bc842d-1bac-4c0e-b2c8-4f0838a630ad" containerName="mariadb-database-create" Oct 07 12:41:36 crc kubenswrapper[4854]: I1007 12:41:36.654362 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="95bc842d-1bac-4c0e-b2c8-4f0838a630ad" containerName="mariadb-database-create" Oct 07 12:41:36 crc kubenswrapper[4854]: E1007 12:41:36.654372 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96132415-18c4-42a4-bb2c-cec92945602e" containerName="mariadb-database-create" Oct 07 12:41:36 crc kubenswrapper[4854]: I1007 12:41:36.654378 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="96132415-18c4-42a4-bb2c-cec92945602e" containerName="mariadb-database-create" Oct 07 12:41:36 crc kubenswrapper[4854]: E1007 12:41:36.654388 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2a2edca-4c99-45d7-b47e-12d31a7e649f" containerName="mariadb-database-create" Oct 07 12:41:36 crc kubenswrapper[4854]: I1007 12:41:36.654394 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2a2edca-4c99-45d7-b47e-12d31a7e649f" 
containerName="mariadb-database-create" Oct 07 12:41:36 crc kubenswrapper[4854]: I1007 12:41:36.654565 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2a2edca-4c99-45d7-b47e-12d31a7e649f" containerName="mariadb-database-create" Oct 07 12:41:36 crc kubenswrapper[4854]: I1007 12:41:36.654579 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="96132415-18c4-42a4-bb2c-cec92945602e" containerName="mariadb-database-create" Oct 07 12:41:36 crc kubenswrapper[4854]: I1007 12:41:36.654593 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="95bc842d-1bac-4c0e-b2c8-4f0838a630ad" containerName="mariadb-database-create" Oct 07 12:41:36 crc kubenswrapper[4854]: I1007 12:41:36.654601 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="a039d6d1-21b4-480b-b8ed-c50693487fba" containerName="dnsmasq-dns" Oct 07 12:41:36 crc kubenswrapper[4854]: I1007 12:41:36.655071 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-a01a-account-create-c6crg" Oct 07 12:41:36 crc kubenswrapper[4854]: I1007 12:41:36.656807 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Oct 07 12:41:36 crc kubenswrapper[4854]: I1007 12:41:36.661373 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-a01a-account-create-c6crg"] Oct 07 12:41:36 crc kubenswrapper[4854]: I1007 12:41:36.734486 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-c7c1-account-create-4vmxx"] Oct 07 12:41:36 crc kubenswrapper[4854]: I1007 12:41:36.735777 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-c7c1-account-create-4vmxx" Oct 07 12:41:36 crc kubenswrapper[4854]: I1007 12:41:36.738821 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Oct 07 12:41:36 crc kubenswrapper[4854]: I1007 12:41:36.756478 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-c7c1-account-create-4vmxx"] Oct 07 12:41:36 crc kubenswrapper[4854]: I1007 12:41:36.792233 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrfrv\" (UniqueName: \"kubernetes.io/projected/f512fcc4-c7f8-45ae-b165-c7367ee2d7fe-kube-api-access-jrfrv\") pod \"barbican-a01a-account-create-c6crg\" (UID: \"f512fcc4-c7f8-45ae-b165-c7367ee2d7fe\") " pod="openstack/barbican-a01a-account-create-c6crg" Oct 07 12:41:36 crc kubenswrapper[4854]: I1007 12:41:36.894236 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zm6ct\" (UniqueName: \"kubernetes.io/projected/f3bea35c-bafd-44fc-bd6f-7878a27e1c9b-kube-api-access-zm6ct\") pod \"cinder-c7c1-account-create-4vmxx\" (UID: \"f3bea35c-bafd-44fc-bd6f-7878a27e1c9b\") " pod="openstack/cinder-c7c1-account-create-4vmxx" Oct 07 12:41:36 crc kubenswrapper[4854]: I1007 12:41:36.894600 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrfrv\" (UniqueName: \"kubernetes.io/projected/f512fcc4-c7f8-45ae-b165-c7367ee2d7fe-kube-api-access-jrfrv\") pod \"barbican-a01a-account-create-c6crg\" (UID: \"f512fcc4-c7f8-45ae-b165-c7367ee2d7fe\") " pod="openstack/barbican-a01a-account-create-c6crg" Oct 07 12:41:36 crc kubenswrapper[4854]: I1007 12:41:36.916227 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrfrv\" (UniqueName: 
\"kubernetes.io/projected/f512fcc4-c7f8-45ae-b165-c7367ee2d7fe-kube-api-access-jrfrv\") pod \"barbican-a01a-account-create-c6crg\" (UID: \"f512fcc4-c7f8-45ae-b165-c7367ee2d7fe\") " pod="openstack/barbican-a01a-account-create-c6crg" Oct 07 12:41:36 crc kubenswrapper[4854]: I1007 12:41:36.991980 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-a01a-account-create-c6crg" Oct 07 12:41:36 crc kubenswrapper[4854]: I1007 12:41:36.996258 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zm6ct\" (UniqueName: \"kubernetes.io/projected/f3bea35c-bafd-44fc-bd6f-7878a27e1c9b-kube-api-access-zm6ct\") pod \"cinder-c7c1-account-create-4vmxx\" (UID: \"f3bea35c-bafd-44fc-bd6f-7878a27e1c9b\") " pod="openstack/cinder-c7c1-account-create-4vmxx" Oct 07 12:41:37 crc kubenswrapper[4854]: I1007 12:41:37.015558 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zm6ct\" (UniqueName: \"kubernetes.io/projected/f3bea35c-bafd-44fc-bd6f-7878a27e1c9b-kube-api-access-zm6ct\") pod \"cinder-c7c1-account-create-4vmxx\" (UID: \"f3bea35c-bafd-44fc-bd6f-7878a27e1c9b\") " pod="openstack/cinder-c7c1-account-create-4vmxx" Oct 07 12:41:37 crc kubenswrapper[4854]: I1007 12:41:37.034704 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-b378-account-create-gnv6t"] Oct 07 12:41:37 crc kubenswrapper[4854]: I1007 12:41:37.035753 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b378-account-create-gnv6t" Oct 07 12:41:37 crc kubenswrapper[4854]: I1007 12:41:37.037430 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Oct 07 12:41:37 crc kubenswrapper[4854]: I1007 12:41:37.053437 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-c7c1-account-create-4vmxx" Oct 07 12:41:37 crc kubenswrapper[4854]: I1007 12:41:37.054001 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b378-account-create-gnv6t"] Oct 07 12:41:37 crc kubenswrapper[4854]: I1007 12:41:37.202082 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6x4ll\" (UniqueName: \"kubernetes.io/projected/0a87eed9-2ee8-4185-b755-9a12163e51fc-kube-api-access-6x4ll\") pod \"neutron-b378-account-create-gnv6t\" (UID: \"0a87eed9-2ee8-4185-b755-9a12163e51fc\") " pod="openstack/neutron-b378-account-create-gnv6t" Oct 07 12:41:37 crc kubenswrapper[4854]: I1007 12:41:37.304537 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6x4ll\" (UniqueName: \"kubernetes.io/projected/0a87eed9-2ee8-4185-b755-9a12163e51fc-kube-api-access-6x4ll\") pod \"neutron-b378-account-create-gnv6t\" (UID: \"0a87eed9-2ee8-4185-b755-9a12163e51fc\") " pod="openstack/neutron-b378-account-create-gnv6t" Oct 07 12:41:37 crc kubenswrapper[4854]: I1007 12:41:37.324603 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6x4ll\" (UniqueName: \"kubernetes.io/projected/0a87eed9-2ee8-4185-b755-9a12163e51fc-kube-api-access-6x4ll\") pod \"neutron-b378-account-create-gnv6t\" (UID: \"0a87eed9-2ee8-4185-b755-9a12163e51fc\") " pod="openstack/neutron-b378-account-create-gnv6t" Oct 07 12:41:37 crc kubenswrapper[4854]: I1007 12:41:37.402697 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-f7kxk" Oct 07 12:41:37 crc kubenswrapper[4854]: I1007 12:41:37.456966 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-a01a-account-create-c6crg"] Oct 07 12:41:37 crc kubenswrapper[4854]: I1007 12:41:37.459201 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b378-account-create-gnv6t" Oct 07 12:41:37 crc kubenswrapper[4854]: W1007 12:41:37.460039 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf512fcc4_c7f8_45ae_b165_c7367ee2d7fe.slice/crio-544dfd1152181ccac9564a6fc25b028cd4ce9de927e6ec7927aa72baa01f1c03 WatchSource:0}: Error finding container 544dfd1152181ccac9564a6fc25b028cd4ce9de927e6ec7927aa72baa01f1c03: Status 404 returned error can't find the container with id 544dfd1152181ccac9564a6fc25b028cd4ce9de927e6ec7927aa72baa01f1c03 Oct 07 12:41:37 crc kubenswrapper[4854]: I1007 12:41:37.513442 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swdj6\" (UniqueName: \"kubernetes.io/projected/5683d2a3-82a3-4e7b-b5eb-9a2e46c10fc9-kube-api-access-swdj6\") pod \"5683d2a3-82a3-4e7b-b5eb-9a2e46c10fc9\" (UID: \"5683d2a3-82a3-4e7b-b5eb-9a2e46c10fc9\") " Oct 07 12:41:37 crc kubenswrapper[4854]: I1007 12:41:37.513579 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5683d2a3-82a3-4e7b-b5eb-9a2e46c10fc9-combined-ca-bundle\") pod \"5683d2a3-82a3-4e7b-b5eb-9a2e46c10fc9\" (UID: \"5683d2a3-82a3-4e7b-b5eb-9a2e46c10fc9\") " Oct 07 12:41:37 crc kubenswrapper[4854]: I1007 12:41:37.513650 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5683d2a3-82a3-4e7b-b5eb-9a2e46c10fc9-config-data\") pod \"5683d2a3-82a3-4e7b-b5eb-9a2e46c10fc9\" (UID: \"5683d2a3-82a3-4e7b-b5eb-9a2e46c10fc9\") " Oct 07 12:41:37 crc kubenswrapper[4854]: I1007 12:41:37.517678 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5683d2a3-82a3-4e7b-b5eb-9a2e46c10fc9-kube-api-access-swdj6" (OuterVolumeSpecName: "kube-api-access-swdj6") pod "5683d2a3-82a3-4e7b-b5eb-9a2e46c10fc9" (UID: "5683d2a3-82a3-4e7b-b5eb-9a2e46c10fc9"). InnerVolumeSpecName "kube-api-access-swdj6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:41:37 crc kubenswrapper[4854]: I1007 12:41:37.561813 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5683d2a3-82a3-4e7b-b5eb-9a2e46c10fc9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5683d2a3-82a3-4e7b-b5eb-9a2e46c10fc9" (UID: "5683d2a3-82a3-4e7b-b5eb-9a2e46c10fc9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:41:37 crc kubenswrapper[4854]: I1007 12:41:37.566344 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5683d2a3-82a3-4e7b-b5eb-9a2e46c10fc9-config-data" (OuterVolumeSpecName: "config-data") pod "5683d2a3-82a3-4e7b-b5eb-9a2e46c10fc9" (UID: "5683d2a3-82a3-4e7b-b5eb-9a2e46c10fc9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:41:37 crc kubenswrapper[4854]: W1007 12:41:37.608994 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3bea35c_bafd_44fc_bd6f_7878a27e1c9b.slice/crio-3ab982a833d06373f1c379ea64deb3aaf035a3adf6e76c1a94ca569f9a443ed4 WatchSource:0}: Error finding container 3ab982a833d06373f1c379ea64deb3aaf035a3adf6e76c1a94ca569f9a443ed4: Status 404 returned error can't find the container with id 3ab982a833d06373f1c379ea64deb3aaf035a3adf6e76c1a94ca569f9a443ed4 Oct 07 12:41:37 crc kubenswrapper[4854]: I1007 12:41:37.611169 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-c7c1-account-create-4vmxx"] Oct 07 12:41:37 crc kubenswrapper[4854]: I1007 12:41:37.616721 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swdj6\" (UniqueName: \"kubernetes.io/projected/5683d2a3-82a3-4e7b-b5eb-9a2e46c10fc9-kube-api-access-swdj6\") on node \"crc\" DevicePath \"\"" Oct 07 12:41:37 crc kubenswrapper[4854]: I1007 12:41:37.616750 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5683d2a3-82a3-4e7b-b5eb-9a2e46c10fc9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:41:37 crc kubenswrapper[4854]: I1007 12:41:37.616760 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5683d2a3-82a3-4e7b-b5eb-9a2e46c10fc9-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 12:41:37 crc kubenswrapper[4854]: I1007 12:41:37.876848 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b378-account-create-gnv6t"] Oct 07 12:41:37 crc kubenswrapper[4854]: W1007 12:41:37.924340 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a87eed9_2ee8_4185_b755_9a12163e51fc.slice/crio-a4f3acbef61553390e3a9912a3b2a79d029e0e079c88d4821461a54d4e5082e4 WatchSource:0}: Error finding container a4f3acbef61553390e3a9912a3b2a79d029e0e079c88d4821461a54d4e5082e4: Status 404 returned error can't find the container with id a4f3acbef61553390e3a9912a3b2a79d029e0e079c88d4821461a54d4e5082e4 Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.090125 4854 generic.go:334] "Generic (PLEG): container finished" podID="f3bea35c-bafd-44fc-bd6f-7878a27e1c9b" containerID="cd9b6d596186c3b7a5af7dc88f6515e0c483d24ba77e4029222b5d08fd5ab2b3" exitCode=0 Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.090203 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-c7c1-account-create-4vmxx" event={"ID":"f3bea35c-bafd-44fc-bd6f-7878a27e1c9b","Type":"ContainerDied","Data":"cd9b6d596186c3b7a5af7dc88f6515e0c483d24ba77e4029222b5d08fd5ab2b3"} Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.090540 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-c7c1-account-create-4vmxx" event={"ID":"f3bea35c-bafd-44fc-bd6f-7878a27e1c9b","Type":"ContainerStarted","Data":"3ab982a833d06373f1c379ea64deb3aaf035a3adf6e76c1a94ca569f9a443ed4"} Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.092589 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b378-account-create-gnv6t" event={"ID":"0a87eed9-2ee8-4185-b755-9a12163e51fc","Type":"ContainerStarted","Data":"a4f3acbef61553390e3a9912a3b2a79d029e0e079c88d4821461a54d4e5082e4"} Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 
12:41:38.095536 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-f7kxk" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.095529 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-f7kxk" event={"ID":"5683d2a3-82a3-4e7b-b5eb-9a2e46c10fc9","Type":"ContainerDied","Data":"fcabc9730eb13681654f46f7ae5f6b8f435e5ee9ce3f188acf54bfa14ea79ec9"} Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.095699 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fcabc9730eb13681654f46f7ae5f6b8f435e5ee9ce3f188acf54bfa14ea79ec9" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.097003 4854 generic.go:334] "Generic (PLEG): container finished" podID="f512fcc4-c7f8-45ae-b165-c7367ee2d7fe" containerID="1e97e61af866a59485cfb0388fedb4446c940191621505d289d2386b79d94bba" exitCode=0 Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.097082 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-a01a-account-create-c6crg" event={"ID":"f512fcc4-c7f8-45ae-b165-c7367ee2d7fe","Type":"ContainerDied","Data":"1e97e61af866a59485cfb0388fedb4446c940191621505d289d2386b79d94bba"} Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.097366 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-a01a-account-create-c6crg" event={"ID":"f512fcc4-c7f8-45ae-b165-c7367ee2d7fe","Type":"ContainerStarted","Data":"544dfd1152181ccac9564a6fc25b028cd4ce9de927e6ec7927aa72baa01f1c03"} Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.324222 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-ntvkx"] Oct 07 12:41:38 crc kubenswrapper[4854]: E1007 12:41:38.324639 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5683d2a3-82a3-4e7b-b5eb-9a2e46c10fc9" containerName="keystone-db-sync" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.324658 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="5683d2a3-82a3-4e7b-b5eb-9a2e46c10fc9" containerName="keystone-db-sync" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.324894 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="5683d2a3-82a3-4e7b-b5eb-9a2e46c10fc9" containerName="keystone-db-sync" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.326058 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-ntvkx" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.345403 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-ntvkx"] Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.405658 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-5464v"] Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.407219 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-5464v" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.413070 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.413501 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.413892 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.420014 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-m9mdj" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.430846 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-5464v"] Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.432028 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwgz5\" (UniqueName: \"kubernetes.io/projected/24ded2b7-c301-400f-8eb2-4039e7fafd76-kube-api-access-dwgz5\") pod \"dnsmasq-dns-847c4cc679-ntvkx\" (UID: \"24ded2b7-c301-400f-8eb2-4039e7fafd76\") " pod="openstack/dnsmasq-dns-847c4cc679-ntvkx" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.432101 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/24ded2b7-c301-400f-8eb2-4039e7fafd76-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-ntvkx\" (UID: \"24ded2b7-c301-400f-8eb2-4039e7fafd76\") " pod="openstack/dnsmasq-dns-847c4cc679-ntvkx" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.432159 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24ded2b7-c301-400f-8eb2-4039e7fafd76-config\") pod \"dnsmasq-dns-847c4cc679-ntvkx\" (UID: \"24ded2b7-c301-400f-8eb2-4039e7fafd76\") " pod="openstack/dnsmasq-dns-847c4cc679-ntvkx" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.432195 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/24ded2b7-c301-400f-8eb2-4039e7fafd76-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-ntvkx\" (UID: \"24ded2b7-c301-400f-8eb2-4039e7fafd76\") " pod="openstack/dnsmasq-dns-847c4cc679-ntvkx" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.432237 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/24ded2b7-c301-400f-8eb2-4039e7fafd76-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-ntvkx\" (UID: \"24ded2b7-c301-400f-8eb2-4039e7fafd76\") " pod="openstack/dnsmasq-dns-847c4cc679-ntvkx" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.432258 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/24ded2b7-c301-400f-8eb2-4039e7fafd76-dns-svc\") pod \"dnsmasq-dns-847c4cc679-ntvkx\" (UID: \"24ded2b7-c301-400f-8eb2-4039e7fafd76\") " pod="openstack/dnsmasq-dns-847c4cc679-ntvkx" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.533898 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/24ded2b7-c301-400f-8eb2-4039e7fafd76-config\") pod \"dnsmasq-dns-847c4cc679-ntvkx\" (UID: \"24ded2b7-c301-400f-8eb2-4039e7fafd76\") " pod="openstack/dnsmasq-dns-847c4cc679-ntvkx" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.533992 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/24ded2b7-c301-400f-8eb2-4039e7fafd76-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-ntvkx\" (UID: \"24ded2b7-c301-400f-8eb2-4039e7fafd76\") " pod="openstack/dnsmasq-dns-847c4cc679-ntvkx" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.534071 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d8d81b1b-8918-46cf-9902-ee7952aea166-credential-keys\") pod \"keystone-bootstrap-5464v\" (UID: \"d8d81b1b-8918-46cf-9902-ee7952aea166\") " pod="openstack/keystone-bootstrap-5464v" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.534094 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzffg\" (UniqueName: \"kubernetes.io/projected/d8d81b1b-8918-46cf-9902-ee7952aea166-kube-api-access-nzffg\") pod \"keystone-bootstrap-5464v\" (UID: \"d8d81b1b-8918-46cf-9902-ee7952aea166\") " pod="openstack/keystone-bootstrap-5464v" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.534253 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/24ded2b7-c301-400f-8eb2-4039e7fafd76-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-ntvkx\" (UID: \"24ded2b7-c301-400f-8eb2-4039e7fafd76\") " pod="openstack/dnsmasq-dns-847c4cc679-ntvkx" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.534284 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8d81b1b-8918-46cf-9902-ee7952aea166-scripts\") pod \"keystone-bootstrap-5464v\" (UID: \"d8d81b1b-8918-46cf-9902-ee7952aea166\") " pod="openstack/keystone-bootstrap-5464v" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.534978 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24ded2b7-c301-400f-8eb2-4039e7fafd76-config\") pod \"dnsmasq-dns-847c4cc679-ntvkx\" (UID: \"24ded2b7-c301-400f-8eb2-4039e7fafd76\") " pod="openstack/dnsmasq-dns-847c4cc679-ntvkx" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.535107 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/24ded2b7-c301-400f-8eb2-4039e7fafd76-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-ntvkx\" (UID: \"24ded2b7-c301-400f-8eb2-4039e7fafd76\") " pod="openstack/dnsmasq-dns-847c4cc679-ntvkx" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.535218 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/24ded2b7-c301-400f-8eb2-4039e7fafd76-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-ntvkx\" (UID: \"24ded2b7-c301-400f-8eb2-4039e7fafd76\") " pod="openstack/dnsmasq-dns-847c4cc679-ntvkx" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.535308 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/24ded2b7-c301-400f-8eb2-4039e7fafd76-dns-svc\") pod 
\"dnsmasq-dns-847c4cc679-ntvkx\" (UID: \"24ded2b7-c301-400f-8eb2-4039e7fafd76\") " pod="openstack/dnsmasq-dns-847c4cc679-ntvkx" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.536094 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/24ded2b7-c301-400f-8eb2-4039e7fafd76-dns-svc\") pod \"dnsmasq-dns-847c4cc679-ntvkx\" (UID: \"24ded2b7-c301-400f-8eb2-4039e7fafd76\") " pod="openstack/dnsmasq-dns-847c4cc679-ntvkx" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.536290 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwgz5\" (UniqueName: \"kubernetes.io/projected/24ded2b7-c301-400f-8eb2-4039e7fafd76-kube-api-access-dwgz5\") pod \"dnsmasq-dns-847c4cc679-ntvkx\" (UID: \"24ded2b7-c301-400f-8eb2-4039e7fafd76\") " pod="openstack/dnsmasq-dns-847c4cc679-ntvkx" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.536660 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8d81b1b-8918-46cf-9902-ee7952aea166-combined-ca-bundle\") pod \"keystone-bootstrap-5464v\" (UID: \"d8d81b1b-8918-46cf-9902-ee7952aea166\") " pod="openstack/keystone-bootstrap-5464v" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.536752 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8d81b1b-8918-46cf-9902-ee7952aea166-config-data\") pod \"keystone-bootstrap-5464v\" (UID: \"d8d81b1b-8918-46cf-9902-ee7952aea166\") " pod="openstack/keystone-bootstrap-5464v" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.536796 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d8d81b1b-8918-46cf-9902-ee7952aea166-fernet-keys\") pod \"keystone-bootstrap-5464v\" (UID: \"d8d81b1b-8918-46cf-9902-ee7952aea166\") " pod="openstack/keystone-bootstrap-5464v" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.536889 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/24ded2b7-c301-400f-8eb2-4039e7fafd76-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-ntvkx\" (UID: \"24ded2b7-c301-400f-8eb2-4039e7fafd76\") " pod="openstack/dnsmasq-dns-847c4cc679-ntvkx" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.537740 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/24ded2b7-c301-400f-8eb2-4039e7fafd76-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-ntvkx\" (UID: \"24ded2b7-c301-400f-8eb2-4039e7fafd76\") " pod="openstack/dnsmasq-dns-847c4cc679-ntvkx" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.557518 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwgz5\" (UniqueName: \"kubernetes.io/projected/24ded2b7-c301-400f-8eb2-4039e7fafd76-kube-api-access-dwgz5\") pod \"dnsmasq-dns-847c4cc679-ntvkx\" (UID: \"24ded2b7-c301-400f-8eb2-4039e7fafd76\") " pod="openstack/dnsmasq-dns-847c4cc679-ntvkx" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.587183 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.589376 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.604723 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.605082 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.609729 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.638866 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d8d81b1b-8918-46cf-9902-ee7952aea166-credential-keys\") pod \"keystone-bootstrap-5464v\" (UID: \"d8d81b1b-8918-46cf-9902-ee7952aea166\") " pod="openstack/keystone-bootstrap-5464v" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.638914 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzffg\" (UniqueName: \"kubernetes.io/projected/d8d81b1b-8918-46cf-9902-ee7952aea166-kube-api-access-nzffg\") pod \"keystone-bootstrap-5464v\" (UID: \"d8d81b1b-8918-46cf-9902-ee7952aea166\") " pod="openstack/keystone-bootstrap-5464v" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.638962 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8d81b1b-8918-46cf-9902-ee7952aea166-scripts\") pod \"keystone-bootstrap-5464v\" (UID: \"d8d81b1b-8918-46cf-9902-ee7952aea166\") " pod="openstack/keystone-bootstrap-5464v" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.639002 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8d81b1b-8918-46cf-9902-ee7952aea166-combined-ca-bundle\") pod \"keystone-bootstrap-5464v\" (UID: \"d8d81b1b-8918-46cf-9902-ee7952aea166\") " pod="openstack/keystone-bootstrap-5464v" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.639033 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8d81b1b-8918-46cf-9902-ee7952aea166-config-data\") pod \"keystone-bootstrap-5464v\" (UID: \"d8d81b1b-8918-46cf-9902-ee7952aea166\") " pod="openstack/keystone-bootstrap-5464v" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.639059 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d8d81b1b-8918-46cf-9902-ee7952aea166-fernet-keys\") pod \"keystone-bootstrap-5464v\" (UID: \"d8d81b1b-8918-46cf-9902-ee7952aea166\") " pod="openstack/keystone-bootstrap-5464v" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.645543 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8d81b1b-8918-46cf-9902-ee7952aea166-scripts\") pod \"keystone-bootstrap-5464v\" (UID: \"d8d81b1b-8918-46cf-9902-ee7952aea166\") " pod="openstack/keystone-bootstrap-5464v" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.648969 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-ntvkx" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.649049 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d8d81b1b-8918-46cf-9902-ee7952aea166-credential-keys\") pod \"keystone-bootstrap-5464v\" (UID: \"d8d81b1b-8918-46cf-9902-ee7952aea166\") " pod="openstack/keystone-bootstrap-5464v" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.650222 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8d81b1b-8918-46cf-9902-ee7952aea166-config-data\") pod \"keystone-bootstrap-5464v\" (UID: \"d8d81b1b-8918-46cf-9902-ee7952aea166\") " pod="openstack/keystone-bootstrap-5464v" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.655703 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d8d81b1b-8918-46cf-9902-ee7952aea166-fernet-keys\") pod \"keystone-bootstrap-5464v\" (UID: \"d8d81b1b-8918-46cf-9902-ee7952aea166\") " pod="openstack/keystone-bootstrap-5464v" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.656333 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8d81b1b-8918-46cf-9902-ee7952aea166-combined-ca-bundle\") pod \"keystone-bootstrap-5464v\" (UID: \"d8d81b1b-8918-46cf-9902-ee7952aea166\") " pod="openstack/keystone-bootstrap-5464v" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.680011 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzffg\" (UniqueName: \"kubernetes.io/projected/d8d81b1b-8918-46cf-9902-ee7952aea166-kube-api-access-nzffg\") pod \"keystone-bootstrap-5464v\" (UID: \"d8d81b1b-8918-46cf-9902-ee7952aea166\") " pod="openstack/keystone-bootstrap-5464v" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.727872 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-5464v" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.743691 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/211f81db-7ca9-43c6-bd31-aa60758129e6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"211f81db-7ca9-43c6-bd31-aa60758129e6\") " pod="openstack/ceilometer-0" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.743745 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/211f81db-7ca9-43c6-bd31-aa60758129e6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"211f81db-7ca9-43c6-bd31-aa60758129e6\") " pod="openstack/ceilometer-0" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.743770 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/211f81db-7ca9-43c6-bd31-aa60758129e6-config-data\") pod \"ceilometer-0\" (UID: \"211f81db-7ca9-43c6-bd31-aa60758129e6\") " pod="openstack/ceilometer-0" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.744025 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/211f81db-7ca9-43c6-bd31-aa60758129e6-log-httpd\") pod \"ceilometer-0\" (UID: \"211f81db-7ca9-43c6-bd31-aa60758129e6\") " pod="openstack/ceilometer-0" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.744182 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/211f81db-7ca9-43c6-bd31-aa60758129e6-run-httpd\") pod \"ceilometer-0\" (UID: \"211f81db-7ca9-43c6-bd31-aa60758129e6\") " pod="openstack/ceilometer-0" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.744257 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/211f81db-7ca9-43c6-bd31-aa60758129e6-scripts\") pod \"ceilometer-0\" (UID: \"211f81db-7ca9-43c6-bd31-aa60758129e6\") " pod="openstack/ceilometer-0" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.744372 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ll5l5\" (UniqueName: \"kubernetes.io/projected/211f81db-7ca9-43c6-bd31-aa60758129e6-kube-api-access-ll5l5\") pod \"ceilometer-0\" (UID: \"211f81db-7ca9-43c6-bd31-aa60758129e6\") " pod="openstack/ceilometer-0" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.808743 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-ntvkx"] Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.846540 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-jbv7l"] Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.848364 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-jbv7l" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.852309 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.852820 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-phbmn" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.852835 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.853504 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/211f81db-7ca9-43c6-bd31-aa60758129e6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"211f81db-7ca9-43c6-bd31-aa60758129e6\") " pod="openstack/ceilometer-0" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.853577 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/211f81db-7ca9-43c6-bd31-aa60758129e6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"211f81db-7ca9-43c6-bd31-aa60758129e6\") " pod="openstack/ceilometer-0" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.853611 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/211f81db-7ca9-43c6-bd31-aa60758129e6-config-data\") pod \"ceilometer-0\" (UID: \"211f81db-7ca9-43c6-bd31-aa60758129e6\") " pod="openstack/ceilometer-0" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.853830 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/211f81db-7ca9-43c6-bd31-aa60758129e6-log-httpd\") pod \"ceilometer-0\" (UID: \"211f81db-7ca9-43c6-bd31-aa60758129e6\") " pod="openstack/ceilometer-0" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.853956 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/211f81db-7ca9-43c6-bd31-aa60758129e6-run-httpd\") pod \"ceilometer-0\" (UID: \"211f81db-7ca9-43c6-bd31-aa60758129e6\") " pod="openstack/ceilometer-0" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.854040 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/211f81db-7ca9-43c6-bd31-aa60758129e6-scripts\") pod \"ceilometer-0\" (UID: \"211f81db-7ca9-43c6-bd31-aa60758129e6\") " pod="openstack/ceilometer-0" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.854249 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ll5l5\" (UniqueName: \"kubernetes.io/projected/211f81db-7ca9-43c6-bd31-aa60758129e6-kube-api-access-ll5l5\") pod \"ceilometer-0\" (UID: \"211f81db-7ca9-43c6-bd31-aa60758129e6\") " pod="openstack/ceilometer-0" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.857638 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/211f81db-7ca9-43c6-bd31-aa60758129e6-run-httpd\") pod \"ceilometer-0\" (UID: \"211f81db-7ca9-43c6-bd31-aa60758129e6\") " pod="openstack/ceilometer-0" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.861719 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" 
(UniqueName: \"kubernetes.io/secret/211f81db-7ca9-43c6-bd31-aa60758129e6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"211f81db-7ca9-43c6-bd31-aa60758129e6\") " pod="openstack/ceilometer-0" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.862003 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/211f81db-7ca9-43c6-bd31-aa60758129e6-log-httpd\") pod \"ceilometer-0\" (UID: \"211f81db-7ca9-43c6-bd31-aa60758129e6\") " pod="openstack/ceilometer-0" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.867350 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/211f81db-7ca9-43c6-bd31-aa60758129e6-scripts\") pod \"ceilometer-0\" (UID: \"211f81db-7ca9-43c6-bd31-aa60758129e6\") " pod="openstack/ceilometer-0" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.874751 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/211f81db-7ca9-43c6-bd31-aa60758129e6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"211f81db-7ca9-43c6-bd31-aa60758129e6\") " pod="openstack/ceilometer-0" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.877398 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-jbv7l"] Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.880201 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/211f81db-7ca9-43c6-bd31-aa60758129e6-config-data\") pod \"ceilometer-0\" (UID: \"211f81db-7ca9-43c6-bd31-aa60758129e6\") " pod="openstack/ceilometer-0" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.902178 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ll5l5\" (UniqueName: \"kubernetes.io/projected/211f81db-7ca9-43c6-bd31-aa60758129e6-kube-api-access-ll5l5\") pod \"ceilometer-0\" (UID: \"211f81db-7ca9-43c6-bd31-aa60758129e6\") " pod="openstack/ceilometer-0" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.903897 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-9872c"] Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.906069 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-9872c" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.931946 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.948774 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-9872c"] Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.957023 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06a39602-5206-42d4-a283-31650db9bd54-logs\") pod \"placement-db-sync-jbv7l\" (UID: \"06a39602-5206-42d4-a283-31650db9bd54\") " pod="openstack/placement-db-sync-jbv7l" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.957093 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06a39602-5206-42d4-a283-31650db9bd54-scripts\") pod \"placement-db-sync-jbv7l\" (UID: \"06a39602-5206-42d4-a283-31650db9bd54\") " pod="openstack/placement-db-sync-jbv7l" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.957194 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sf6c2\" (UniqueName: \"kubernetes.io/projected/06a39602-5206-42d4-a283-31650db9bd54-kube-api-access-sf6c2\") pod \"placement-db-sync-jbv7l\" (UID: \"06a39602-5206-42d4-a283-31650db9bd54\") " pod="openstack/placement-db-sync-jbv7l" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.957240 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06a39602-5206-42d4-a283-31650db9bd54-config-data\") pod \"placement-db-sync-jbv7l\" (UID: \"06a39602-5206-42d4-a283-31650db9bd54\") " pod="openstack/placement-db-sync-jbv7l" Oct 07 12:41:38 crc kubenswrapper[4854]: I1007 12:41:38.957334 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06a39602-5206-42d4-a283-31650db9bd54-combined-ca-bundle\") pod \"placement-db-sync-jbv7l\" (UID: \"06a39602-5206-42d4-a283-31650db9bd54\") " pod="openstack/placement-db-sync-jbv7l" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.058936 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06a39602-5206-42d4-a283-31650db9bd54-combined-ca-bundle\") pod \"placement-db-sync-jbv7l\" (UID: \"06a39602-5206-42d4-a283-31650db9bd54\") " pod="openstack/placement-db-sync-jbv7l" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.059052 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06a39602-5206-42d4-a283-31650db9bd54-logs\") pod \"placement-db-sync-jbv7l\" (UID: \"06a39602-5206-42d4-a283-31650db9bd54\") " pod="openstack/placement-db-sync-jbv7l" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.059121 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06a39602-5206-42d4-a283-31650db9bd54-scripts\") pod \"placement-db-sync-jbv7l\" (UID: \"06a39602-5206-42d4-a283-31650db9bd54\") " pod="openstack/placement-db-sync-jbv7l" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.059201 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a1a00ddb-7ed9-45dd-98bf-a2be628047df-dns-svc\") pod 
\"dnsmasq-dns-785d8bcb8c-9872c\" (UID: \"a1a00ddb-7ed9-45dd-98bf-a2be628047df\") " pod="openstack/dnsmasq-dns-785d8bcb8c-9872c" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.059262 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sf6c2\" (UniqueName: \"kubernetes.io/projected/06a39602-5206-42d4-a283-31650db9bd54-kube-api-access-sf6c2\") pod \"placement-db-sync-jbv7l\" (UID: \"06a39602-5206-42d4-a283-31650db9bd54\") " pod="openstack/placement-db-sync-jbv7l" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.059302 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06a39602-5206-42d4-a283-31650db9bd54-config-data\") pod \"placement-db-sync-jbv7l\" (UID: \"06a39602-5206-42d4-a283-31650db9bd54\") " pod="openstack/placement-db-sync-jbv7l" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.059346 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1a00ddb-7ed9-45dd-98bf-a2be628047df-config\") pod \"dnsmasq-dns-785d8bcb8c-9872c\" (UID: \"a1a00ddb-7ed9-45dd-98bf-a2be628047df\") " pod="openstack/dnsmasq-dns-785d8bcb8c-9872c" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.059376 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8jlw\" (UniqueName: \"kubernetes.io/projected/a1a00ddb-7ed9-45dd-98bf-a2be628047df-kube-api-access-r8jlw\") pod \"dnsmasq-dns-785d8bcb8c-9872c\" (UID: \"a1a00ddb-7ed9-45dd-98bf-a2be628047df\") " pod="openstack/dnsmasq-dns-785d8bcb8c-9872c" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.059422 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a1a00ddb-7ed9-45dd-98bf-a2be628047df-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-9872c\" (UID: \"a1a00ddb-7ed9-45dd-98bf-a2be628047df\") " pod="openstack/dnsmasq-dns-785d8bcb8c-9872c" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.059463 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a1a00ddb-7ed9-45dd-98bf-a2be628047df-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-9872c\" (UID: \"a1a00ddb-7ed9-45dd-98bf-a2be628047df\") " pod="openstack/dnsmasq-dns-785d8bcb8c-9872c" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.059511 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a1a00ddb-7ed9-45dd-98bf-a2be628047df-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-9872c\" (UID: \"a1a00ddb-7ed9-45dd-98bf-a2be628047df\") " pod="openstack/dnsmasq-dns-785d8bcb8c-9872c" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.060049 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06a39602-5206-42d4-a283-31650db9bd54-logs\") pod \"placement-db-sync-jbv7l\" (UID: \"06a39602-5206-42d4-a283-31650db9bd54\") " pod="openstack/placement-db-sync-jbv7l" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.064426 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06a39602-5206-42d4-a283-31650db9bd54-combined-ca-bundle\") pod 
\"placement-db-sync-jbv7l\" (UID: \"06a39602-5206-42d4-a283-31650db9bd54\") " pod="openstack/placement-db-sync-jbv7l" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.066368 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06a39602-5206-42d4-a283-31650db9bd54-scripts\") pod \"placement-db-sync-jbv7l\" (UID: \"06a39602-5206-42d4-a283-31650db9bd54\") " pod="openstack/placement-db-sync-jbv7l" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.071179 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06a39602-5206-42d4-a283-31650db9bd54-config-data\") pod \"placement-db-sync-jbv7l\" (UID: \"06a39602-5206-42d4-a283-31650db9bd54\") " pod="openstack/placement-db-sync-jbv7l" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.084264 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sf6c2\" (UniqueName: \"kubernetes.io/projected/06a39602-5206-42d4-a283-31650db9bd54-kube-api-access-sf6c2\") pod \"placement-db-sync-jbv7l\" (UID: \"06a39602-5206-42d4-a283-31650db9bd54\") " pod="openstack/placement-db-sync-jbv7l" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.114672 4854 generic.go:334] "Generic (PLEG): container finished" podID="0a87eed9-2ee8-4185-b755-9a12163e51fc" containerID="9a8de10acfbda038f5f2220e688063f4e5b34be8fcfd2fa26c8f493b2c99adb4" exitCode=0 Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.115355 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b378-account-create-gnv6t" event={"ID":"0a87eed9-2ee8-4185-b755-9a12163e51fc","Type":"ContainerDied","Data":"9a8de10acfbda038f5f2220e688063f4e5b34be8fcfd2fa26c8f493b2c99adb4"} Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.161471 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a1a00ddb-7ed9-45dd-98bf-a2be628047df-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-9872c\" (UID: \"a1a00ddb-7ed9-45dd-98bf-a2be628047df\") " pod="openstack/dnsmasq-dns-785d8bcb8c-9872c" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.161599 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a1a00ddb-7ed9-45dd-98bf-a2be628047df-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-9872c\" (UID: \"a1a00ddb-7ed9-45dd-98bf-a2be628047df\") " pod="openstack/dnsmasq-dns-785d8bcb8c-9872c" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.161632 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1a00ddb-7ed9-45dd-98bf-a2be628047df-config\") pod \"dnsmasq-dns-785d8bcb8c-9872c\" (UID: \"a1a00ddb-7ed9-45dd-98bf-a2be628047df\") " pod="openstack/dnsmasq-dns-785d8bcb8c-9872c" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.161655 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8jlw\" (UniqueName: \"kubernetes.io/projected/a1a00ddb-7ed9-45dd-98bf-a2be628047df-kube-api-access-r8jlw\") pod \"dnsmasq-dns-785d8bcb8c-9872c\" (UID: \"a1a00ddb-7ed9-45dd-98bf-a2be628047df\") " pod="openstack/dnsmasq-dns-785d8bcb8c-9872c" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.161674 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/a1a00ddb-7ed9-45dd-98bf-a2be628047df-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-9872c\" (UID: \"a1a00ddb-7ed9-45dd-98bf-a2be628047df\") " pod="openstack/dnsmasq-dns-785d8bcb8c-9872c" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.161719 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a1a00ddb-7ed9-45dd-98bf-a2be628047df-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-9872c\" (UID: \"a1a00ddb-7ed9-45dd-98bf-a2be628047df\") " pod="openstack/dnsmasq-dns-785d8bcb8c-9872c" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.162642 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a1a00ddb-7ed9-45dd-98bf-a2be628047df-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-9872c\" (UID: \"a1a00ddb-7ed9-45dd-98bf-a2be628047df\") " pod="openstack/dnsmasq-dns-785d8bcb8c-9872c" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.162776 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1a00ddb-7ed9-45dd-98bf-a2be628047df-config\") pod \"dnsmasq-dns-785d8bcb8c-9872c\" (UID: \"a1a00ddb-7ed9-45dd-98bf-a2be628047df\") " pod="openstack/dnsmasq-dns-785d8bcb8c-9872c" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.163485 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a1a00ddb-7ed9-45dd-98bf-a2be628047df-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-9872c\" (UID: \"a1a00ddb-7ed9-45dd-98bf-a2be628047df\") " pod="openstack/dnsmasq-dns-785d8bcb8c-9872c" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.164024 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a1a00ddb-7ed9-45dd-98bf-a2be628047df-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-9872c\" (UID: \"a1a00ddb-7ed9-45dd-98bf-a2be628047df\") " pod="openstack/dnsmasq-dns-785d8bcb8c-9872c" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.170977 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a1a00ddb-7ed9-45dd-98bf-a2be628047df-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-9872c\" (UID: \"a1a00ddb-7ed9-45dd-98bf-a2be628047df\") " pod="openstack/dnsmasq-dns-785d8bcb8c-9872c" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.177583 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-jbv7l" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.184392 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8jlw\" (UniqueName: \"kubernetes.io/projected/a1a00ddb-7ed9-45dd-98bf-a2be628047df-kube-api-access-r8jlw\") pod \"dnsmasq-dns-785d8bcb8c-9872c\" (UID: \"a1a00ddb-7ed9-45dd-98bf-a2be628047df\") " pod="openstack/dnsmasq-dns-785d8bcb8c-9872c" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.292825 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-5464v"] Oct 07 12:41:39 crc kubenswrapper[4854]: W1007 12:41:39.321840 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8d81b1b_8918_46cf_9902_ee7952aea166.slice/crio-79f72559a58c8740cc06ede79fe34e45041c37867e0ff61ada5281ee4b9e433b WatchSource:0}: Error finding container 79f72559a58c8740cc06ede79fe34e45041c37867e0ff61ada5281ee4b9e433b: Status 404 returned error can't find the container with id 79f72559a58c8740cc06ede79fe34e45041c37867e0ff61ada5281ee4b9e433b Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.360601 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-ntvkx"] Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.365025 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-9872c" Oct 07 12:41:39 crc kubenswrapper[4854]: W1007 12:41:39.376694 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24ded2b7_c301_400f_8eb2_4039e7fafd76.slice/crio-3a6b9e8c75eb1d03d3ff435fa0336b6f552f2c3f1667ee81926d9600820f2330 WatchSource:0}: Error finding container 3a6b9e8c75eb1d03d3ff435fa0336b6f552f2c3f1667ee81926d9600820f2330: Status 404 returned error can't find the container with id 3a6b9e8c75eb1d03d3ff435fa0336b6f552f2c3f1667ee81926d9600820f2330 Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.494359 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.502725 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.507933 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.508304 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-rplkj" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.508411 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.508571 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.512216 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.598865 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.605580 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.617803 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.619984 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.658078 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.686070 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca5c432d-812b-476b-b996-f54bd8d76eb6-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ca5c432d-812b-476b-b996-f54bd8d76eb6\") " pod="openstack/glance-default-external-api-0" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.686125 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca5c432d-812b-476b-b996-f54bd8d76eb6-scripts\") pod \"glance-default-external-api-0\" (UID: \"ca5c432d-812b-476b-b996-f54bd8d76eb6\") " pod="openstack/glance-default-external-api-0" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.686189 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca5c432d-812b-476b-b996-f54bd8d76eb6-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ca5c432d-812b-476b-b996-f54bd8d76eb6\") " pod="openstack/glance-default-external-api-0" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.686430 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vn4r\" (UniqueName: \"kubernetes.io/projected/ca5c432d-812b-476b-b996-f54bd8d76eb6-kube-api-access-2vn4r\") pod \"glance-default-external-api-0\" (UID: \"ca5c432d-812b-476b-b996-f54bd8d76eb6\") " pod="openstack/glance-default-external-api-0" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.686499 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"ca5c432d-812b-476b-b996-f54bd8d76eb6\") " pod="openstack/glance-default-external-api-0" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.686518 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca5c432d-812b-476b-b996-f54bd8d76eb6-logs\") pod \"glance-default-external-api-0\" (UID: \"ca5c432d-812b-476b-b996-f54bd8d76eb6\") " pod="openstack/glance-default-external-api-0" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.686540 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ca5c432d-812b-476b-b996-f54bd8d76eb6-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ca5c432d-812b-476b-b996-f54bd8d76eb6\") " pod="openstack/glance-default-external-api-0" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.686634 4854 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca5c432d-812b-476b-b996-f54bd8d76eb6-config-data\") pod \"glance-default-external-api-0\" (UID: \"ca5c432d-812b-476b-b996-f54bd8d76eb6\") " pod="openstack/glance-default-external-api-0" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.687945 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.692364 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-c7c1-account-create-4vmxx" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.752485 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-a01a-account-create-c6crg" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.788794 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zm6ct\" (UniqueName: \"kubernetes.io/projected/f3bea35c-bafd-44fc-bd6f-7878a27e1c9b-kube-api-access-zm6ct\") pod \"f3bea35c-bafd-44fc-bd6f-7878a27e1c9b\" (UID: \"f3bea35c-bafd-44fc-bd6f-7878a27e1c9b\") " Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.789270 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca5c432d-812b-476b-b996-f54bd8d76eb6-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ca5c432d-812b-476b-b996-f54bd8d76eb6\") " pod="openstack/glance-default-external-api-0" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.789314 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca5c432d-812b-476b-b996-f54bd8d76eb6-scripts\") pod \"glance-default-external-api-0\" (UID: \"ca5c432d-812b-476b-b996-f54bd8d76eb6\") " pod="openstack/glance-default-external-api-0" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.789336 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca5c432d-812b-476b-b996-f54bd8d76eb6-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ca5c432d-812b-476b-b996-f54bd8d76eb6\") " pod="openstack/glance-default-external-api-0" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.789376 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vn4r\" (UniqueName: \"kubernetes.io/projected/ca5c432d-812b-476b-b996-f54bd8d76eb6-kube-api-access-2vn4r\") pod \"glance-default-external-api-0\" (UID: \"ca5c432d-812b-476b-b996-f54bd8d76eb6\") " pod="openstack/glance-default-external-api-0" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.789413 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f195965c-64f5-46e5-aaed-7acdc7af3d64-logs\") pod \"glance-default-internal-api-0\" (UID: \"f195965c-64f5-46e5-aaed-7acdc7af3d64\") " pod="openstack/glance-default-internal-api-0" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.789441 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f195965c-64f5-46e5-aaed-7acdc7af3d64-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f195965c-64f5-46e5-aaed-7acdc7af3d64\") " 
pod="openstack/glance-default-internal-api-0" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.789485 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"ca5c432d-812b-476b-b996-f54bd8d76eb6\") " pod="openstack/glance-default-external-api-0" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.789524 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca5c432d-812b-476b-b996-f54bd8d76eb6-logs\") pod \"glance-default-external-api-0\" (UID: \"ca5c432d-812b-476b-b996-f54bd8d76eb6\") " pod="openstack/glance-default-external-api-0" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.789555 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ca5c432d-812b-476b-b996-f54bd8d76eb6-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ca5c432d-812b-476b-b996-f54bd8d76eb6\") " pod="openstack/glance-default-external-api-0" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.789579 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f195965c-64f5-46e5-aaed-7acdc7af3d64-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f195965c-64f5-46e5-aaed-7acdc7af3d64\") " pod="openstack/glance-default-internal-api-0" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.789595 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47nph\" (UniqueName: \"kubernetes.io/projected/f195965c-64f5-46e5-aaed-7acdc7af3d64-kube-api-access-47nph\") pod \"glance-default-internal-api-0\" (UID: \"f195965c-64f5-46e5-aaed-7acdc7af3d64\") " pod="openstack/glance-default-internal-api-0" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.789638 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"f195965c-64f5-46e5-aaed-7acdc7af3d64\") " pod="openstack/glance-default-internal-api-0" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.789670 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca5c432d-812b-476b-b996-f54bd8d76eb6-config-data\") pod \"glance-default-external-api-0\" (UID: \"ca5c432d-812b-476b-b996-f54bd8d76eb6\") " pod="openstack/glance-default-external-api-0" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.789690 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f195965c-64f5-46e5-aaed-7acdc7af3d64-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f195965c-64f5-46e5-aaed-7acdc7af3d64\") " pod="openstack/glance-default-internal-api-0" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.789716 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f195965c-64f5-46e5-aaed-7acdc7af3d64-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f195965c-64f5-46e5-aaed-7acdc7af3d64\") " 
pod="openstack/glance-default-internal-api-0" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.789771 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f195965c-64f5-46e5-aaed-7acdc7af3d64-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f195965c-64f5-46e5-aaed-7acdc7af3d64\") " pod="openstack/glance-default-internal-api-0" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.790398 4854 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"ca5c432d-812b-476b-b996-f54bd8d76eb6\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-external-api-0" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.790652 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ca5c432d-812b-476b-b996-f54bd8d76eb6-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ca5c432d-812b-476b-b996-f54bd8d76eb6\") " pod="openstack/glance-default-external-api-0" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.790953 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca5c432d-812b-476b-b996-f54bd8d76eb6-logs\") pod \"glance-default-external-api-0\" (UID: \"ca5c432d-812b-476b-b996-f54bd8d76eb6\") " pod="openstack/glance-default-external-api-0" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.794946 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3bea35c-bafd-44fc-bd6f-7878a27e1c9b-kube-api-access-zm6ct" (OuterVolumeSpecName: "kube-api-access-zm6ct") pod "f3bea35c-bafd-44fc-bd6f-7878a27e1c9b" (UID: "f3bea35c-bafd-44fc-bd6f-7878a27e1c9b"). InnerVolumeSpecName "kube-api-access-zm6ct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.797286 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca5c432d-812b-476b-b996-f54bd8d76eb6-config-data\") pod \"glance-default-external-api-0\" (UID: \"ca5c432d-812b-476b-b996-f54bd8d76eb6\") " pod="openstack/glance-default-external-api-0" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.800626 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca5c432d-812b-476b-b996-f54bd8d76eb6-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ca5c432d-812b-476b-b996-f54bd8d76eb6\") " pod="openstack/glance-default-external-api-0" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.801324 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca5c432d-812b-476b-b996-f54bd8d76eb6-scripts\") pod \"glance-default-external-api-0\" (UID: \"ca5c432d-812b-476b-b996-f54bd8d76eb6\") " pod="openstack/glance-default-external-api-0" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.808332 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca5c432d-812b-476b-b996-f54bd8d76eb6-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ca5c432d-812b-476b-b996-f54bd8d76eb6\") " pod="openstack/glance-default-external-api-0" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.811542 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vn4r\" (UniqueName: \"kubernetes.io/projected/ca5c432d-812b-476b-b996-f54bd8d76eb6-kube-api-access-2vn4r\") pod \"glance-default-external-api-0\" (UID: \"ca5c432d-812b-476b-b996-f54bd8d76eb6\") " pod="openstack/glance-default-external-api-0" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.830324 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"ca5c432d-812b-476b-b996-f54bd8d76eb6\") " pod="openstack/glance-default-external-api-0" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.869624 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.880185 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-jbv7l"] Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.891030 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrfrv\" (UniqueName: \"kubernetes.io/projected/f512fcc4-c7f8-45ae-b165-c7367ee2d7fe-kube-api-access-jrfrv\") pod \"f512fcc4-c7f8-45ae-b165-c7367ee2d7fe\" (UID: \"f512fcc4-c7f8-45ae-b165-c7367ee2d7fe\") " Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.891588 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f195965c-64f5-46e5-aaed-7acdc7af3d64-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f195965c-64f5-46e5-aaed-7acdc7af3d64\") " pod="openstack/glance-default-internal-api-0" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.891703 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f195965c-64f5-46e5-aaed-7acdc7af3d64-logs\") pod \"glance-default-internal-api-0\" (UID: \"f195965c-64f5-46e5-aaed-7acdc7af3d64\") " pod="openstack/glance-default-internal-api-0" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.891730 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f195965c-64f5-46e5-aaed-7acdc7af3d64-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f195965c-64f5-46e5-aaed-7acdc7af3d64\") " pod="openstack/glance-default-internal-api-0" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.891791 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f195965c-64f5-46e5-aaed-7acdc7af3d64-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f195965c-64f5-46e5-aaed-7acdc7af3d64\") " pod="openstack/glance-default-internal-api-0" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.891811 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47nph\" (UniqueName: \"kubernetes.io/projected/f195965c-64f5-46e5-aaed-7acdc7af3d64-kube-api-access-47nph\") pod \"glance-default-internal-api-0\" (UID: \"f195965c-64f5-46e5-aaed-7acdc7af3d64\") " pod="openstack/glance-default-internal-api-0" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.891849 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"f195965c-64f5-46e5-aaed-7acdc7af3d64\") " pod="openstack/glance-default-internal-api-0" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.891886 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f195965c-64f5-46e5-aaed-7acdc7af3d64-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f195965c-64f5-46e5-aaed-7acdc7af3d64\") " pod="openstack/glance-default-internal-api-0" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.891921 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f195965c-64f5-46e5-aaed-7acdc7af3d64-combined-ca-bundle\") pod 
\"glance-default-internal-api-0\" (UID: \"f195965c-64f5-46e5-aaed-7acdc7af3d64\") " pod="openstack/glance-default-internal-api-0" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.891985 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zm6ct\" (UniqueName: \"kubernetes.io/projected/f3bea35c-bafd-44fc-bd6f-7878a27e1c9b-kube-api-access-zm6ct\") on node \"crc\" DevicePath \"\"" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.892775 4854 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"f195965c-64f5-46e5-aaed-7acdc7af3d64\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-internal-api-0" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.892957 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f195965c-64f5-46e5-aaed-7acdc7af3d64-logs\") pod \"glance-default-internal-api-0\" (UID: \"f195965c-64f5-46e5-aaed-7acdc7af3d64\") " pod="openstack/glance-default-internal-api-0" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.894028 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f195965c-64f5-46e5-aaed-7acdc7af3d64-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f195965c-64f5-46e5-aaed-7acdc7af3d64\") " pod="openstack/glance-default-internal-api-0" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.898430 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f512fcc4-c7f8-45ae-b165-c7367ee2d7fe-kube-api-access-jrfrv" (OuterVolumeSpecName: "kube-api-access-jrfrv") pod "f512fcc4-c7f8-45ae-b165-c7367ee2d7fe" (UID: "f512fcc4-c7f8-45ae-b165-c7367ee2d7fe"). InnerVolumeSpecName "kube-api-access-jrfrv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.898920 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f195965c-64f5-46e5-aaed-7acdc7af3d64-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f195965c-64f5-46e5-aaed-7acdc7af3d64\") " pod="openstack/glance-default-internal-api-0" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.899514 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f195965c-64f5-46e5-aaed-7acdc7af3d64-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f195965c-64f5-46e5-aaed-7acdc7af3d64\") " pod="openstack/glance-default-internal-api-0" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.899861 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f195965c-64f5-46e5-aaed-7acdc7af3d64-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f195965c-64f5-46e5-aaed-7acdc7af3d64\") " pod="openstack/glance-default-internal-api-0" Oct 07 12:41:39 crc kubenswrapper[4854]: W1007 12:41:39.900471 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06a39602_5206_42d4_a283_31650db9bd54.slice/crio-4fef353d351699b99c8f1ea49ee901a8d5ff872a0f86307280b3c7dae97f5c2f WatchSource:0}: Error finding container 4fef353d351699b99c8f1ea49ee901a8d5ff872a0f86307280b3c7dae97f5c2f: Status 404 returned error can't find the container with id 4fef353d351699b99c8f1ea49ee901a8d5ff872a0f86307280b3c7dae97f5c2f Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.906310 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f195965c-64f5-46e5-aaed-7acdc7af3d64-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f195965c-64f5-46e5-aaed-7acdc7af3d64\") " pod="openstack/glance-default-internal-api-0" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.913635 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47nph\" (UniqueName: \"kubernetes.io/projected/f195965c-64f5-46e5-aaed-7acdc7af3d64-kube-api-access-47nph\") pod \"glance-default-internal-api-0\" (UID: \"f195965c-64f5-46e5-aaed-7acdc7af3d64\") " pod="openstack/glance-default-internal-api-0" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.940553 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"f195965c-64f5-46e5-aaed-7acdc7af3d64\") " pod="openstack/glance-default-internal-api-0" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.948616 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 12:41:39 crc kubenswrapper[4854]: I1007 12:41:39.994430 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrfrv\" (UniqueName: \"kubernetes.io/projected/f512fcc4-c7f8-45ae-b165-c7367ee2d7fe-kube-api-access-jrfrv\") on node \"crc\" DevicePath \"\"" Oct 07 12:41:40 crc kubenswrapper[4854]: I1007 12:41:40.052469 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-9872c"] Oct 07 12:41:40 crc kubenswrapper[4854]: I1007 12:41:40.151733 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"211f81db-7ca9-43c6-bd31-aa60758129e6","Type":"ContainerStarted","Data":"1b0a68f6314056c43b63809a365e79c8d497826950a3610f6a1f7e88ff6bfa05"} Oct 07 12:41:40 crc kubenswrapper[4854]: I1007 12:41:40.154923 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-9872c" event={"ID":"a1a00ddb-7ed9-45dd-98bf-a2be628047df","Type":"ContainerStarted","Data":"dc6378d82309d7d3a0d11d04f8a7ea0c91a0772e184d20d86c22de15ec20a98e"} Oct 07 12:41:40 crc kubenswrapper[4854]: I1007 12:41:40.157496 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5464v" event={"ID":"d8d81b1b-8918-46cf-9902-ee7952aea166","Type":"ContainerStarted","Data":"214457d553a0c0b56dcc7fb564ba8c1329e444532bf12a604618bd94f53179fc"} Oct 07 12:41:40 crc kubenswrapper[4854]: I1007 12:41:40.157532 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5464v" event={"ID":"d8d81b1b-8918-46cf-9902-ee7952aea166","Type":"ContainerStarted","Data":"79f72559a58c8740cc06ede79fe34e45041c37867e0ff61ada5281ee4b9e433b"} Oct 07 12:41:40 crc kubenswrapper[4854]: I1007 12:41:40.166218 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-c7c1-account-create-4vmxx" event={"ID":"f3bea35c-bafd-44fc-bd6f-7878a27e1c9b","Type":"ContainerDied","Data":"3ab982a833d06373f1c379ea64deb3aaf035a3adf6e76c1a94ca569f9a443ed4"} Oct 07 12:41:40 crc kubenswrapper[4854]: I1007 12:41:40.166261 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ab982a833d06373f1c379ea64deb3aaf035a3adf6e76c1a94ca569f9a443ed4" Oct 07 12:41:40 crc kubenswrapper[4854]: I1007 12:41:40.166335 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-c7c1-account-create-4vmxx" Oct 07 12:41:40 crc kubenswrapper[4854]: I1007 12:41:40.189688 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-5464v" podStartSLOduration=2.189670043 podStartE2EDuration="2.189670043s" podCreationTimestamp="2025-10-07 12:41:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:41:40.176676045 +0000 UTC m=+1016.164508300" watchObservedRunningTime="2025-10-07 12:41:40.189670043 +0000 UTC m=+1016.177502298" Oct 07 12:41:40 crc kubenswrapper[4854]: I1007 12:41:40.194321 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jbv7l" event={"ID":"06a39602-5206-42d4-a283-31650db9bd54","Type":"ContainerStarted","Data":"4fef353d351699b99c8f1ea49ee901a8d5ff872a0f86307280b3c7dae97f5c2f"} Oct 07 12:41:40 crc kubenswrapper[4854]: I1007 12:41:40.206564 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-a01a-account-create-c6crg" event={"ID":"f512fcc4-c7f8-45ae-b165-c7367ee2d7fe","Type":"ContainerDied","Data":"544dfd1152181ccac9564a6fc25b028cd4ce9de927e6ec7927aa72baa01f1c03"} Oct 07 12:41:40 crc kubenswrapper[4854]: I1007 12:41:40.206606 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="544dfd1152181ccac9564a6fc25b028cd4ce9de927e6ec7927aa72baa01f1c03" Oct 07 12:41:40 crc kubenswrapper[4854]: I1007 12:41:40.209025 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-a01a-account-create-c6crg" Oct 07 12:41:40 crc kubenswrapper[4854]: I1007 12:41:40.268740 4854 generic.go:334] "Generic (PLEG): container finished" podID="24ded2b7-c301-400f-8eb2-4039e7fafd76" containerID="d89b564630f8207190e62bf0321573fbc5f0179007383309715b8cfe6414823e" exitCode=0 Oct 07 12:41:40 crc kubenswrapper[4854]: I1007 12:41:40.268884 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-ntvkx" event={"ID":"24ded2b7-c301-400f-8eb2-4039e7fafd76","Type":"ContainerDied","Data":"d89b564630f8207190e62bf0321573fbc5f0179007383309715b8cfe6414823e"} Oct 07 12:41:40 crc kubenswrapper[4854]: I1007 12:41:40.268935 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-ntvkx" event={"ID":"24ded2b7-c301-400f-8eb2-4039e7fafd76","Type":"ContainerStarted","Data":"3a6b9e8c75eb1d03d3ff435fa0336b6f552f2c3f1667ee81926d9600820f2330"} Oct 07 12:41:40 crc kubenswrapper[4854]: I1007 12:41:40.686775 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 12:41:40 crc kubenswrapper[4854]: I1007 12:41:40.757205 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 12:41:40 crc kubenswrapper[4854]: I1007 12:41:40.798286 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 12:41:40 crc kubenswrapper[4854]: I1007 12:41:40.807644 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 12:41:40 crc kubenswrapper[4854]: I1007 12:41:40.807706 4854 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 12:41:40 crc kubenswrapper[4854]: I1007 12:41:40.809042 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b378-account-create-gnv6t" Oct 07 12:41:40 crc kubenswrapper[4854]: I1007 12:41:40.825606 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 12:41:40 crc kubenswrapper[4854]: I1007 12:41:40.842293 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6x4ll\" (UniqueName: \"kubernetes.io/projected/0a87eed9-2ee8-4185-b755-9a12163e51fc-kube-api-access-6x4ll\") pod \"0a87eed9-2ee8-4185-b755-9a12163e51fc\" (UID: \"0a87eed9-2ee8-4185-b755-9a12163e51fc\") " Oct 07 12:41:40 crc kubenswrapper[4854]: I1007 12:41:40.852367 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a87eed9-2ee8-4185-b755-9a12163e51fc-kube-api-access-6x4ll" (OuterVolumeSpecName: "kube-api-access-6x4ll") pod "0a87eed9-2ee8-4185-b755-9a12163e51fc" (UID: "0a87eed9-2ee8-4185-b755-9a12163e51fc"). InnerVolumeSpecName "kube-api-access-6x4ll". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:41:40 crc kubenswrapper[4854]: I1007 12:41:40.854729 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-ntvkx" Oct 07 12:41:40 crc kubenswrapper[4854]: I1007 12:41:40.951036 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/24ded2b7-c301-400f-8eb2-4039e7fafd76-ovsdbserver-nb\") pod \"24ded2b7-c301-400f-8eb2-4039e7fafd76\" (UID: \"24ded2b7-c301-400f-8eb2-4039e7fafd76\") " Oct 07 12:41:40 crc kubenswrapper[4854]: I1007 12:41:40.951118 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24ded2b7-c301-400f-8eb2-4039e7fafd76-config\") pod \"24ded2b7-c301-400f-8eb2-4039e7fafd76\" (UID: \"24ded2b7-c301-400f-8eb2-4039e7fafd76\") " Oct 07 12:41:40 crc kubenswrapper[4854]: I1007 12:41:40.951202 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwgz5\" (UniqueName: \"kubernetes.io/projected/24ded2b7-c301-400f-8eb2-4039e7fafd76-kube-api-access-dwgz5\") pod \"24ded2b7-c301-400f-8eb2-4039e7fafd76\" (UID: \"24ded2b7-c301-400f-8eb2-4039e7fafd76\") " Oct 07 12:41:40 crc kubenswrapper[4854]: I1007 12:41:40.951314 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/24ded2b7-c301-400f-8eb2-4039e7fafd76-dns-swift-storage-0\") pod \"24ded2b7-c301-400f-8eb2-4039e7fafd76\" (UID: \"24ded2b7-c301-400f-8eb2-4039e7fafd76\") " Oct 07 12:41:40 crc kubenswrapper[4854]: I1007 12:41:40.951367 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/24ded2b7-c301-400f-8eb2-4039e7fafd76-ovsdbserver-sb\") pod \"24ded2b7-c301-400f-8eb2-4039e7fafd76\" (UID: \"24ded2b7-c301-400f-8eb2-4039e7fafd76\") " Oct 07 12:41:40 crc kubenswrapper[4854]: I1007 12:41:40.951438 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/24ded2b7-c301-400f-8eb2-4039e7fafd76-dns-svc\") pod \"24ded2b7-c301-400f-8eb2-4039e7fafd76\" (UID: \"24ded2b7-c301-400f-8eb2-4039e7fafd76\") " Oct 07 12:41:40 crc kubenswrapper[4854]: I1007 12:41:40.951868 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6x4ll\" (UniqueName: \"kubernetes.io/projected/0a87eed9-2ee8-4185-b755-9a12163e51fc-kube-api-access-6x4ll\") on node \"crc\" DevicePath \"\"" Oct 07 12:41:40 crc kubenswrapper[4854]: I1007 12:41:40.960718 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 12:41:40 crc kubenswrapper[4854]: I1007 12:41:40.971716 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24ded2b7-c301-400f-8eb2-4039e7fafd76-kube-api-access-dwgz5" (OuterVolumeSpecName: "kube-api-access-dwgz5") pod "24ded2b7-c301-400f-8eb2-4039e7fafd76" (UID: "24ded2b7-c301-400f-8eb2-4039e7fafd76"). InnerVolumeSpecName "kube-api-access-dwgz5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:41:41 crc kubenswrapper[4854]: I1007 12:41:41.017723 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24ded2b7-c301-400f-8eb2-4039e7fafd76-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "24ded2b7-c301-400f-8eb2-4039e7fafd76" (UID: "24ded2b7-c301-400f-8eb2-4039e7fafd76"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:41:41 crc kubenswrapper[4854]: I1007 12:41:41.023224 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24ded2b7-c301-400f-8eb2-4039e7fafd76-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "24ded2b7-c301-400f-8eb2-4039e7fafd76" (UID: "24ded2b7-c301-400f-8eb2-4039e7fafd76"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:41:41 crc kubenswrapper[4854]: I1007 12:41:41.025791 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24ded2b7-c301-400f-8eb2-4039e7fafd76-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "24ded2b7-c301-400f-8eb2-4039e7fafd76" (UID: "24ded2b7-c301-400f-8eb2-4039e7fafd76"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:41:41 crc kubenswrapper[4854]: I1007 12:41:41.028864 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24ded2b7-c301-400f-8eb2-4039e7fafd76-config" (OuterVolumeSpecName: "config") pod "24ded2b7-c301-400f-8eb2-4039e7fafd76" (UID: "24ded2b7-c301-400f-8eb2-4039e7fafd76"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:41:41 crc kubenswrapper[4854]: I1007 12:41:41.029695 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24ded2b7-c301-400f-8eb2-4039e7fafd76-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "24ded2b7-c301-400f-8eb2-4039e7fafd76" (UID: "24ded2b7-c301-400f-8eb2-4039e7fafd76"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:41:41 crc kubenswrapper[4854]: I1007 12:41:41.055349 4854 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/24ded2b7-c301-400f-8eb2-4039e7fafd76-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 12:41:41 crc kubenswrapper[4854]: I1007 12:41:41.055393 4854 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/24ded2b7-c301-400f-8eb2-4039e7fafd76-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 12:41:41 crc kubenswrapper[4854]: I1007 12:41:41.055406 4854 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24ded2b7-c301-400f-8eb2-4039e7fafd76-config\") on node \"crc\" DevicePath \"\"" Oct 07 12:41:41 crc kubenswrapper[4854]: I1007 12:41:41.055418 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwgz5\" (UniqueName: \"kubernetes.io/projected/24ded2b7-c301-400f-8eb2-4039e7fafd76-kube-api-access-dwgz5\") on node \"crc\" DevicePath \"\"" Oct 07 12:41:41 crc kubenswrapper[4854]: I1007 12:41:41.055430 4854 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/24ded2b7-c301-400f-8eb2-4039e7fafd76-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 07 12:41:41 crc kubenswrapper[4854]: I1007 12:41:41.055441 4854 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/24ded2b7-c301-400f-8eb2-4039e7fafd76-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 12:41:41 crc kubenswrapper[4854]: E1007 12:41:41.067743 4854 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1a00ddb_7ed9_45dd_98bf_a2be628047df.slice/crio-conmon-5eff53ddae156645ff120293c2b04f9524dfccaeeb600fce9803a546cd9ac95a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1a00ddb_7ed9_45dd_98bf_a2be628047df.slice/crio-5eff53ddae156645ff120293c2b04f9524dfccaeeb600fce9803a546cd9ac95a.scope\": RecentStats: unable to find data in memory cache]" Oct 07 12:41:41 crc kubenswrapper[4854]: I1007 12:41:41.280811 4854 generic.go:334] "Generic (PLEG): container finished" podID="a1a00ddb-7ed9-45dd-98bf-a2be628047df" containerID="5eff53ddae156645ff120293c2b04f9524dfccaeeb600fce9803a546cd9ac95a" exitCode=0 Oct 07 12:41:41 crc kubenswrapper[4854]: I1007 12:41:41.280894 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-9872c" event={"ID":"a1a00ddb-7ed9-45dd-98bf-a2be628047df","Type":"ContainerDied","Data":"5eff53ddae156645ff120293c2b04f9524dfccaeeb600fce9803a546cd9ac95a"} Oct 07 12:41:41 crc kubenswrapper[4854]: I1007 12:41:41.286893 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b378-account-create-gnv6t" event={"ID":"0a87eed9-2ee8-4185-b755-9a12163e51fc","Type":"ContainerDied","Data":"a4f3acbef61553390e3a9912a3b2a79d029e0e079c88d4821461a54d4e5082e4"} Oct 07 12:41:41 crc kubenswrapper[4854]: I1007 12:41:41.286929 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4f3acbef61553390e3a9912a3b2a79d029e0e079c88d4821461a54d4e5082e4" Oct 07 12:41:41 crc kubenswrapper[4854]: I1007 12:41:41.286993 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-b378-account-create-gnv6t" Oct 07 12:41:41 crc kubenswrapper[4854]: I1007 12:41:41.295188 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ca5c432d-812b-476b-b996-f54bd8d76eb6","Type":"ContainerStarted","Data":"033bb90bf6173bd9f8d575695fbb8ad3302d612ed4e7c7df3b6c8fbe1b2da540"} Oct 07 12:41:41 crc kubenswrapper[4854]: I1007 12:41:41.322238 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-ntvkx" Oct 07 12:41:41 crc kubenswrapper[4854]: I1007 12:41:41.322269 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-ntvkx" event={"ID":"24ded2b7-c301-400f-8eb2-4039e7fafd76","Type":"ContainerDied","Data":"3a6b9e8c75eb1d03d3ff435fa0336b6f552f2c3f1667ee81926d9600820f2330"} Oct 07 12:41:41 crc kubenswrapper[4854]: I1007 12:41:41.322471 4854 scope.go:117] "RemoveContainer" containerID="d89b564630f8207190e62bf0321573fbc5f0179007383309715b8cfe6414823e" Oct 07 12:41:41 crc kubenswrapper[4854]: I1007 12:41:41.324769 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f195965c-64f5-46e5-aaed-7acdc7af3d64","Type":"ContainerStarted","Data":"7ef27292003a2e68dd4757f53f3f1a652ca7692f683cbf7a028c601cf3d08582"} Oct 07 12:41:41 crc kubenswrapper[4854]: I1007 12:41:41.415183 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-ntvkx"] Oct 07 12:41:41 crc kubenswrapper[4854]: I1007 12:41:41.419289 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-ntvkx"] Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.003659 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-hnhnd"] Oct 07 12:41:42 crc kubenswrapper[4854]: E1007 12:41:42.004386 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3bea35c-bafd-44fc-bd6f-7878a27e1c9b" containerName="mariadb-account-create" Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.004403 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3bea35c-bafd-44fc-bd6f-7878a27e1c9b" containerName="mariadb-account-create" Oct 07 12:41:42 crc kubenswrapper[4854]: E1007 12:41:42.004429 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f512fcc4-c7f8-45ae-b165-c7367ee2d7fe" containerName="mariadb-account-create" Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.004436 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="f512fcc4-c7f8-45ae-b165-c7367ee2d7fe" containerName="mariadb-account-create" Oct 07 12:41:42 crc kubenswrapper[4854]: E1007 12:41:42.004449 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a87eed9-2ee8-4185-b755-9a12163e51fc" containerName="mariadb-account-create" Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.004458 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a87eed9-2ee8-4185-b755-9a12163e51fc" containerName="mariadb-account-create" Oct 07 12:41:42 crc kubenswrapper[4854]: E1007 12:41:42.004468 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24ded2b7-c301-400f-8eb2-4039e7fafd76" containerName="init" Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.004476 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="24ded2b7-c301-400f-8eb2-4039e7fafd76" containerName="init" Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.004661 4854 
memory_manager.go:354] "RemoveStaleState removing state" podUID="24ded2b7-c301-400f-8eb2-4039e7fafd76" containerName="init" Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.004678 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="f512fcc4-c7f8-45ae-b165-c7367ee2d7fe" containerName="mariadb-account-create" Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.004696 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a87eed9-2ee8-4185-b755-9a12163e51fc" containerName="mariadb-account-create" Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.004704 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3bea35c-bafd-44fc-bd6f-7878a27e1c9b" containerName="mariadb-account-create" Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.005242 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-hnhnd" Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.010338 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-5vxvs" Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.010463 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.025068 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-hnhnd"] Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.039012 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-7zmqn"] Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.040394 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-7zmqn" Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.057135 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.057490 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.057799 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-7hxgb" Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.087855 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57eafbaf-440c-432a-a1ab-55032cd2f54a-combined-ca-bundle\") pod \"barbican-db-sync-hnhnd\" (UID: \"57eafbaf-440c-432a-a1ab-55032cd2f54a\") " pod="openstack/barbican-db-sync-hnhnd" Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.087932 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jkcz\" (UniqueName: \"kubernetes.io/projected/57eafbaf-440c-432a-a1ab-55032cd2f54a-kube-api-access-4jkcz\") pod \"barbican-db-sync-hnhnd\" (UID: \"57eafbaf-440c-432a-a1ab-55032cd2f54a\") " pod="openstack/barbican-db-sync-hnhnd" Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.087965 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/74714c8f-dea6-40be-9985-d254729920c9-etc-machine-id\") pod \"cinder-db-sync-7zmqn\" (UID: \"74714c8f-dea6-40be-9985-d254729920c9\") " pod="openstack/cinder-db-sync-7zmqn" Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.087998 4854 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74714c8f-dea6-40be-9985-d254729920c9-combined-ca-bundle\") pod \"cinder-db-sync-7zmqn\" (UID: \"74714c8f-dea6-40be-9985-d254729920c9\") " pod="openstack/cinder-db-sync-7zmqn" Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.088055 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/57eafbaf-440c-432a-a1ab-55032cd2f54a-db-sync-config-data\") pod \"barbican-db-sync-hnhnd\" (UID: \"57eafbaf-440c-432a-a1ab-55032cd2f54a\") " pod="openstack/barbican-db-sync-hnhnd" Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.088110 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bm65\" (UniqueName: \"kubernetes.io/projected/74714c8f-dea6-40be-9985-d254729920c9-kube-api-access-5bm65\") pod \"cinder-db-sync-7zmqn\" (UID: \"74714c8f-dea6-40be-9985-d254729920c9\") " pod="openstack/cinder-db-sync-7zmqn" Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.088204 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/74714c8f-dea6-40be-9985-d254729920c9-db-sync-config-data\") pod \"cinder-db-sync-7zmqn\" (UID: \"74714c8f-dea6-40be-9985-d254729920c9\") " pod="openstack/cinder-db-sync-7zmqn" Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.088279 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74714c8f-dea6-40be-9985-d254729920c9-config-data\") pod \"cinder-db-sync-7zmqn\" (UID: \"74714c8f-dea6-40be-9985-d254729920c9\") " pod="openstack/cinder-db-sync-7zmqn" Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.088326 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74714c8f-dea6-40be-9985-d254729920c9-scripts\") pod \"cinder-db-sync-7zmqn\" (UID: \"74714c8f-dea6-40be-9985-d254729920c9\") " pod="openstack/cinder-db-sync-7zmqn" Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.097307 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-7zmqn"] Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.189781 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57eafbaf-440c-432a-a1ab-55032cd2f54a-combined-ca-bundle\") pod \"barbican-db-sync-hnhnd\" (UID: \"57eafbaf-440c-432a-a1ab-55032cd2f54a\") " pod="openstack/barbican-db-sync-hnhnd" Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.189851 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jkcz\" (UniqueName: \"kubernetes.io/projected/57eafbaf-440c-432a-a1ab-55032cd2f54a-kube-api-access-4jkcz\") pod \"barbican-db-sync-hnhnd\" (UID: \"57eafbaf-440c-432a-a1ab-55032cd2f54a\") " pod="openstack/barbican-db-sync-hnhnd" Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.189876 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/74714c8f-dea6-40be-9985-d254729920c9-etc-machine-id\") pod \"cinder-db-sync-7zmqn\" (UID: \"74714c8f-dea6-40be-9985-d254729920c9\") " 
pod="openstack/cinder-db-sync-7zmqn" Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.189898 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74714c8f-dea6-40be-9985-d254729920c9-combined-ca-bundle\") pod \"cinder-db-sync-7zmqn\" (UID: \"74714c8f-dea6-40be-9985-d254729920c9\") " pod="openstack/cinder-db-sync-7zmqn" Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.189938 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/57eafbaf-440c-432a-a1ab-55032cd2f54a-db-sync-config-data\") pod \"barbican-db-sync-hnhnd\" (UID: \"57eafbaf-440c-432a-a1ab-55032cd2f54a\") " pod="openstack/barbican-db-sync-hnhnd" Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.189982 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bm65\" (UniqueName: \"kubernetes.io/projected/74714c8f-dea6-40be-9985-d254729920c9-kube-api-access-5bm65\") pod \"cinder-db-sync-7zmqn\" (UID: \"74714c8f-dea6-40be-9985-d254729920c9\") " pod="openstack/cinder-db-sync-7zmqn" Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.190038 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/74714c8f-dea6-40be-9985-d254729920c9-db-sync-config-data\") pod \"cinder-db-sync-7zmqn\" (UID: \"74714c8f-dea6-40be-9985-d254729920c9\") " pod="openstack/cinder-db-sync-7zmqn" Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.190100 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74714c8f-dea6-40be-9985-d254729920c9-config-data\") pod \"cinder-db-sync-7zmqn\" (UID: \"74714c8f-dea6-40be-9985-d254729920c9\") " pod="openstack/cinder-db-sync-7zmqn" Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.190139 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74714c8f-dea6-40be-9985-d254729920c9-scripts\") pod \"cinder-db-sync-7zmqn\" (UID: \"74714c8f-dea6-40be-9985-d254729920c9\") " pod="openstack/cinder-db-sync-7zmqn" Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.190784 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/74714c8f-dea6-40be-9985-d254729920c9-etc-machine-id\") pod \"cinder-db-sync-7zmqn\" (UID: \"74714c8f-dea6-40be-9985-d254729920c9\") " pod="openstack/cinder-db-sync-7zmqn" Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.197442 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57eafbaf-440c-432a-a1ab-55032cd2f54a-combined-ca-bundle\") pod \"barbican-db-sync-hnhnd\" (UID: \"57eafbaf-440c-432a-a1ab-55032cd2f54a\") " pod="openstack/barbican-db-sync-hnhnd" Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.198491 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74714c8f-dea6-40be-9985-d254729920c9-scripts\") pod \"cinder-db-sync-7zmqn\" (UID: \"74714c8f-dea6-40be-9985-d254729920c9\") " pod="openstack/cinder-db-sync-7zmqn" Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.201549 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/74714c8f-dea6-40be-9985-d254729920c9-config-data\") pod \"cinder-db-sync-7zmqn\" (UID: \"74714c8f-dea6-40be-9985-d254729920c9\") " pod="openstack/cinder-db-sync-7zmqn" Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.207590 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74714c8f-dea6-40be-9985-d254729920c9-combined-ca-bundle\") pod \"cinder-db-sync-7zmqn\" (UID: \"74714c8f-dea6-40be-9985-d254729920c9\") " pod="openstack/cinder-db-sync-7zmqn" Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.207746 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/57eafbaf-440c-432a-a1ab-55032cd2f54a-db-sync-config-data\") pod \"barbican-db-sync-hnhnd\" (UID: \"57eafbaf-440c-432a-a1ab-55032cd2f54a\") " pod="openstack/barbican-db-sync-hnhnd" Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.211384 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bm65\" (UniqueName: \"kubernetes.io/projected/74714c8f-dea6-40be-9985-d254729920c9-kube-api-access-5bm65\") pod \"cinder-db-sync-7zmqn\" (UID: \"74714c8f-dea6-40be-9985-d254729920c9\") " pod="openstack/cinder-db-sync-7zmqn" Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.211544 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/74714c8f-dea6-40be-9985-d254729920c9-db-sync-config-data\") pod \"cinder-db-sync-7zmqn\" (UID: \"74714c8f-dea6-40be-9985-d254729920c9\") " pod="openstack/cinder-db-sync-7zmqn" Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.214484 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jkcz\" (UniqueName: \"kubernetes.io/projected/57eafbaf-440c-432a-a1ab-55032cd2f54a-kube-api-access-4jkcz\") pod \"barbican-db-sync-hnhnd\" (UID: \"57eafbaf-440c-432a-a1ab-55032cd2f54a\") " pod="openstack/barbican-db-sync-hnhnd" Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.319993 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-qk4ws"] Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.321350 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-qk4ws" Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.325134 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.326571 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.326613 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-45gdk" Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.329793 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-qk4ws"] Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.339003 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-hnhnd" Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.362858 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f195965c-64f5-46e5-aaed-7acdc7af3d64","Type":"ContainerStarted","Data":"fac286218a4ce4d48b5aa6d65b2f8184e66d1fb966ae5d7227795ac793e05cb9"} Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.369499 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-9872c" event={"ID":"a1a00ddb-7ed9-45dd-98bf-a2be628047df","Type":"ContainerStarted","Data":"55fbf956fe0dfb40eef07681e52b8b94d3944ac0ae84361d414261f85bd29c80"} Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.369565 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785d8bcb8c-9872c" Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.373368 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ca5c432d-812b-476b-b996-f54bd8d76eb6","Type":"ContainerStarted","Data":"ca0db39e60e9d204fd72ee919b6b446ed0dae75bed3ce0f642c58490cabd0159"} Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.394425 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785d8bcb8c-9872c" podStartSLOduration=4.394399677 podStartE2EDuration="4.394399677s" podCreationTimestamp="2025-10-07 12:41:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:41:42.385870318 +0000 UTC m=+1018.373702583" watchObservedRunningTime="2025-10-07 12:41:42.394399677 +0000 UTC m=+1018.382231922" Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.395210 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-7zmqn" Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.395697 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8md7\" (UniqueName: \"kubernetes.io/projected/c462d02f-dfcd-48f7-b755-fb203afcb213-kube-api-access-d8md7\") pod \"neutron-db-sync-qk4ws\" (UID: \"c462d02f-dfcd-48f7-b755-fb203afcb213\") " pod="openstack/neutron-db-sync-qk4ws" Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.395970 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c462d02f-dfcd-48f7-b755-fb203afcb213-combined-ca-bundle\") pod \"neutron-db-sync-qk4ws\" (UID: \"c462d02f-dfcd-48f7-b755-fb203afcb213\") " pod="openstack/neutron-db-sync-qk4ws" Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.396081 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c462d02f-dfcd-48f7-b755-fb203afcb213-config\") pod \"neutron-db-sync-qk4ws\" (UID: \"c462d02f-dfcd-48f7-b755-fb203afcb213\") " pod="openstack/neutron-db-sync-qk4ws" Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.497586 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8md7\" (UniqueName: \"kubernetes.io/projected/c462d02f-dfcd-48f7-b755-fb203afcb213-kube-api-access-d8md7\") pod \"neutron-db-sync-qk4ws\" (UID: \"c462d02f-dfcd-48f7-b755-fb203afcb213\") " pod="openstack/neutron-db-sync-qk4ws" Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.498242 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c462d02f-dfcd-48f7-b755-fb203afcb213-combined-ca-bundle\") pod \"neutron-db-sync-qk4ws\" (UID: \"c462d02f-dfcd-48f7-b755-fb203afcb213\") " pod="openstack/neutron-db-sync-qk4ws" Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.498314 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c462d02f-dfcd-48f7-b755-fb203afcb213-config\") pod \"neutron-db-sync-qk4ws\" (UID: \"c462d02f-dfcd-48f7-b755-fb203afcb213\") " pod="openstack/neutron-db-sync-qk4ws" Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.506706 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c462d02f-dfcd-48f7-b755-fb203afcb213-combined-ca-bundle\") pod \"neutron-db-sync-qk4ws\" (UID: \"c462d02f-dfcd-48f7-b755-fb203afcb213\") " pod="openstack/neutron-db-sync-qk4ws" Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.508082 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c462d02f-dfcd-48f7-b755-fb203afcb213-config\") pod \"neutron-db-sync-qk4ws\" (UID: \"c462d02f-dfcd-48f7-b755-fb203afcb213\") " pod="openstack/neutron-db-sync-qk4ws" Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.517852 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8md7\" (UniqueName: \"kubernetes.io/projected/c462d02f-dfcd-48f7-b755-fb203afcb213-kube-api-access-d8md7\") pod \"neutron-db-sync-qk4ws\" (UID: \"c462d02f-dfcd-48f7-b755-fb203afcb213\") " pod="openstack/neutron-db-sync-qk4ws" Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.653895 4854 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-qk4ws" Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.696671 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-hnhnd"] Oct 07 12:41:42 crc kubenswrapper[4854]: I1007 12:41:42.726353 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24ded2b7-c301-400f-8eb2-4039e7fafd76" path="/var/lib/kubelet/pods/24ded2b7-c301-400f-8eb2-4039e7fafd76/volumes" Oct 07 12:41:42 crc kubenswrapper[4854]: W1007 12:41:42.731129 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57eafbaf_440c_432a_a1ab_55032cd2f54a.slice/crio-6933ceca5b59314f6307fe8916bb7d3623069b2569d09370d903ba6019f8cb0d WatchSource:0}: Error finding container 6933ceca5b59314f6307fe8916bb7d3623069b2569d09370d903ba6019f8cb0d: Status 404 returned error can't find the container with id 6933ceca5b59314f6307fe8916bb7d3623069b2569d09370d903ba6019f8cb0d Oct 07 12:41:43 crc kubenswrapper[4854]: I1007 12:41:43.002699 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-7zmqn"] Oct 07 12:41:43 crc kubenswrapper[4854]: I1007 12:41:43.174012 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-qk4ws"] Oct 07 12:41:43 crc kubenswrapper[4854]: W1007 12:41:43.188110 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc462d02f_dfcd_48f7_b755_fb203afcb213.slice/crio-c5f24997feb51a8f3b35e75eef469dc17b34d575e0f69fbf639636cde3383383 WatchSource:0}: Error finding container c5f24997feb51a8f3b35e75eef469dc17b34d575e0f69fbf639636cde3383383: Status 404 returned error can't find the container with id c5f24997feb51a8f3b35e75eef469dc17b34d575e0f69fbf639636cde3383383 Oct 07 12:41:43 crc kubenswrapper[4854]: I1007 12:41:43.384407 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f195965c-64f5-46e5-aaed-7acdc7af3d64","Type":"ContainerStarted","Data":"9abe1ec76e1bc3b2e6c1fbb61b65dab7971238d75d1de7af6e076363473ef615"} Oct 07 12:41:43 crc kubenswrapper[4854]: I1007 12:41:43.384515 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="f195965c-64f5-46e5-aaed-7acdc7af3d64" containerName="glance-log" containerID="cri-o://fac286218a4ce4d48b5aa6d65b2f8184e66d1fb966ae5d7227795ac793e05cb9" gracePeriod=30 Oct 07 12:41:43 crc kubenswrapper[4854]: I1007 12:41:43.384622 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="f195965c-64f5-46e5-aaed-7acdc7af3d64" containerName="glance-httpd" containerID="cri-o://9abe1ec76e1bc3b2e6c1fbb61b65dab7971238d75d1de7af6e076363473ef615" gracePeriod=30 Oct 07 12:41:43 crc kubenswrapper[4854]: I1007 12:41:43.389257 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-hnhnd" event={"ID":"57eafbaf-440c-432a-a1ab-55032cd2f54a","Type":"ContainerStarted","Data":"6933ceca5b59314f6307fe8916bb7d3623069b2569d09370d903ba6019f8cb0d"} Oct 07 12:41:43 crc kubenswrapper[4854]: I1007 12:41:43.392029 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-7zmqn" event={"ID":"74714c8f-dea6-40be-9985-d254729920c9","Type":"ContainerStarted","Data":"aa6ab021343becef4a411d5f7376aa4406dae9aa269d7bc64b7af99e09053dba"} Oct 07 
12:41:43 crc kubenswrapper[4854]: I1007 12:41:43.395975 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ca5c432d-812b-476b-b996-f54bd8d76eb6","Type":"ContainerStarted","Data":"033723c2d3fe66822608ed294192bb053c3e6f5a1e3050f8582dbdd9eee28d65"} Oct 07 12:41:43 crc kubenswrapper[4854]: I1007 12:41:43.396173 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="ca5c432d-812b-476b-b996-f54bd8d76eb6" containerName="glance-log" containerID="cri-o://ca0db39e60e9d204fd72ee919b6b446ed0dae75bed3ce0f642c58490cabd0159" gracePeriod=30 Oct 07 12:41:43 crc kubenswrapper[4854]: I1007 12:41:43.396476 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="ca5c432d-812b-476b-b996-f54bd8d76eb6" containerName="glance-httpd" containerID="cri-o://033723c2d3fe66822608ed294192bb053c3e6f5a1e3050f8582dbdd9eee28d65" gracePeriod=30 Oct 07 12:41:43 crc kubenswrapper[4854]: I1007 12:41:43.400531 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-qk4ws" event={"ID":"c462d02f-dfcd-48f7-b755-fb203afcb213","Type":"ContainerStarted","Data":"c5f24997feb51a8f3b35e75eef469dc17b34d575e0f69fbf639636cde3383383"} Oct 07 12:41:43 crc kubenswrapper[4854]: I1007 12:41:43.436752 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.436735882 podStartE2EDuration="5.436735882s" podCreationTimestamp="2025-10-07 12:41:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:41:43.414901027 +0000 UTC m=+1019.402733282" watchObservedRunningTime="2025-10-07 12:41:43.436735882 +0000 UTC m=+1019.424568137" Oct 07 12:41:43 crc kubenswrapper[4854]: I1007 12:41:43.438591 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.438585006 podStartE2EDuration="5.438585006s" podCreationTimestamp="2025-10-07 12:41:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:41:43.433430786 +0000 UTC m=+1019.421263061" watchObservedRunningTime="2025-10-07 12:41:43.438585006 +0000 UTC m=+1019.426417261" Oct 07 12:41:44 crc kubenswrapper[4854]: I1007 12:41:44.415609 4854 generic.go:334] "Generic (PLEG): container finished" podID="d8d81b1b-8918-46cf-9902-ee7952aea166" containerID="214457d553a0c0b56dcc7fb564ba8c1329e444532bf12a604618bd94f53179fc" exitCode=0 Oct 07 12:41:44 crc kubenswrapper[4854]: I1007 12:41:44.415697 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5464v" event={"ID":"d8d81b1b-8918-46cf-9902-ee7952aea166","Type":"ContainerDied","Data":"214457d553a0c0b56dcc7fb564ba8c1329e444532bf12a604618bd94f53179fc"} Oct 07 12:41:44 crc kubenswrapper[4854]: I1007 12:41:44.428474 4854 generic.go:334] "Generic (PLEG): container finished" podID="ca5c432d-812b-476b-b996-f54bd8d76eb6" containerID="033723c2d3fe66822608ed294192bb053c3e6f5a1e3050f8582dbdd9eee28d65" exitCode=0 Oct 07 12:41:44 crc kubenswrapper[4854]: I1007 12:41:44.428515 4854 generic.go:334] "Generic (PLEG): container finished" podID="ca5c432d-812b-476b-b996-f54bd8d76eb6" containerID="ca0db39e60e9d204fd72ee919b6b446ed0dae75bed3ce0f642c58490cabd0159" 
exitCode=143 Oct 07 12:41:44 crc kubenswrapper[4854]: I1007 12:41:44.428665 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ca5c432d-812b-476b-b996-f54bd8d76eb6","Type":"ContainerDied","Data":"033723c2d3fe66822608ed294192bb053c3e6f5a1e3050f8582dbdd9eee28d65"} Oct 07 12:41:44 crc kubenswrapper[4854]: I1007 12:41:44.428724 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ca5c432d-812b-476b-b996-f54bd8d76eb6","Type":"ContainerDied","Data":"ca0db39e60e9d204fd72ee919b6b446ed0dae75bed3ce0f642c58490cabd0159"} Oct 07 12:41:44 crc kubenswrapper[4854]: I1007 12:41:44.438428 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-qk4ws" event={"ID":"c462d02f-dfcd-48f7-b755-fb203afcb213","Type":"ContainerStarted","Data":"86def19086aabe002c256380aee9698cfdc030d6deb57886de5b569a36e128f9"} Oct 07 12:41:44 crc kubenswrapper[4854]: I1007 12:41:44.442420 4854 generic.go:334] "Generic (PLEG): container finished" podID="f195965c-64f5-46e5-aaed-7acdc7af3d64" containerID="9abe1ec76e1bc3b2e6c1fbb61b65dab7971238d75d1de7af6e076363473ef615" exitCode=143 Oct 07 12:41:44 crc kubenswrapper[4854]: I1007 12:41:44.442476 4854 generic.go:334] "Generic (PLEG): container finished" podID="f195965c-64f5-46e5-aaed-7acdc7af3d64" containerID="fac286218a4ce4d48b5aa6d65b2f8184e66d1fb966ae5d7227795ac793e05cb9" exitCode=143 Oct 07 12:41:44 crc kubenswrapper[4854]: I1007 12:41:44.442501 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f195965c-64f5-46e5-aaed-7acdc7af3d64","Type":"ContainerDied","Data":"9abe1ec76e1bc3b2e6c1fbb61b65dab7971238d75d1de7af6e076363473ef615"} Oct 07 12:41:44 crc kubenswrapper[4854]: I1007 12:41:44.442550 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f195965c-64f5-46e5-aaed-7acdc7af3d64","Type":"ContainerDied","Data":"fac286218a4ce4d48b5aa6d65b2f8184e66d1fb966ae5d7227795ac793e05cb9"} Oct 07 12:41:44 crc kubenswrapper[4854]: I1007 12:41:44.459260 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-qk4ws" podStartSLOduration=2.45924254 podStartE2EDuration="2.45924254s" podCreationTimestamp="2025-10-07 12:41:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:41:44.452459463 +0000 UTC m=+1020.440291738" watchObservedRunningTime="2025-10-07 12:41:44.45924254 +0000 UTC m=+1020.447074795" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.022913 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-5464v" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.029593 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.077848 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"f195965c-64f5-46e5-aaed-7acdc7af3d64\" (UID: \"f195965c-64f5-46e5-aaed-7acdc7af3d64\") " Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.077937 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzffg\" (UniqueName: \"kubernetes.io/projected/d8d81b1b-8918-46cf-9902-ee7952aea166-kube-api-access-nzffg\") pod \"d8d81b1b-8918-46cf-9902-ee7952aea166\" (UID: \"d8d81b1b-8918-46cf-9902-ee7952aea166\") " Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.077991 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8d81b1b-8918-46cf-9902-ee7952aea166-combined-ca-bundle\") pod \"d8d81b1b-8918-46cf-9902-ee7952aea166\" (UID: \"d8d81b1b-8918-46cf-9902-ee7952aea166\") " Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.078033 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f195965c-64f5-46e5-aaed-7acdc7af3d64-combined-ca-bundle\") pod \"f195965c-64f5-46e5-aaed-7acdc7af3d64\" (UID: \"f195965c-64f5-46e5-aaed-7acdc7af3d64\") " Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.078087 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d8d81b1b-8918-46cf-9902-ee7952aea166-credential-keys\") pod \"d8d81b1b-8918-46cf-9902-ee7952aea166\" (UID: \"d8d81b1b-8918-46cf-9902-ee7952aea166\") " Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.078167 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47nph\" (UniqueName: \"kubernetes.io/projected/f195965c-64f5-46e5-aaed-7acdc7af3d64-kube-api-access-47nph\") pod \"f195965c-64f5-46e5-aaed-7acdc7af3d64\" (UID: \"f195965c-64f5-46e5-aaed-7acdc7af3d64\") " Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.078221 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f195965c-64f5-46e5-aaed-7acdc7af3d64-scripts\") pod \"f195965c-64f5-46e5-aaed-7acdc7af3d64\" (UID: \"f195965c-64f5-46e5-aaed-7acdc7af3d64\") " Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.078246 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f195965c-64f5-46e5-aaed-7acdc7af3d64-config-data\") pod \"f195965c-64f5-46e5-aaed-7acdc7af3d64\" (UID: \"f195965c-64f5-46e5-aaed-7acdc7af3d64\") " Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.078299 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8d81b1b-8918-46cf-9902-ee7952aea166-scripts\") pod \"d8d81b1b-8918-46cf-9902-ee7952aea166\" (UID: \"d8d81b1b-8918-46cf-9902-ee7952aea166\") " Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.078330 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f195965c-64f5-46e5-aaed-7acdc7af3d64-httpd-run\") pod \"f195965c-64f5-46e5-aaed-7acdc7af3d64\" (UID: 
\"f195965c-64f5-46e5-aaed-7acdc7af3d64\") " Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.078355 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8d81b1b-8918-46cf-9902-ee7952aea166-config-data\") pod \"d8d81b1b-8918-46cf-9902-ee7952aea166\" (UID: \"d8d81b1b-8918-46cf-9902-ee7952aea166\") " Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.078393 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d8d81b1b-8918-46cf-9902-ee7952aea166-fernet-keys\") pod \"d8d81b1b-8918-46cf-9902-ee7952aea166\" (UID: \"d8d81b1b-8918-46cf-9902-ee7952aea166\") " Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.078430 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f195965c-64f5-46e5-aaed-7acdc7af3d64-logs\") pod \"f195965c-64f5-46e5-aaed-7acdc7af3d64\" (UID: \"f195965c-64f5-46e5-aaed-7acdc7af3d64\") " Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.078499 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f195965c-64f5-46e5-aaed-7acdc7af3d64-internal-tls-certs\") pod \"f195965c-64f5-46e5-aaed-7acdc7af3d64\" (UID: \"f195965c-64f5-46e5-aaed-7acdc7af3d64\") " Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.087716 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f195965c-64f5-46e5-aaed-7acdc7af3d64-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f195965c-64f5-46e5-aaed-7acdc7af3d64" (UID: "f195965c-64f5-46e5-aaed-7acdc7af3d64"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.090048 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f195965c-64f5-46e5-aaed-7acdc7af3d64-scripts" (OuterVolumeSpecName: "scripts") pod "f195965c-64f5-46e5-aaed-7acdc7af3d64" (UID: "f195965c-64f5-46e5-aaed-7acdc7af3d64"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.099181 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8d81b1b-8918-46cf-9902-ee7952aea166-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "d8d81b1b-8918-46cf-9902-ee7952aea166" (UID: "d8d81b1b-8918-46cf-9902-ee7952aea166"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.101025 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8d81b1b-8918-46cf-9902-ee7952aea166-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "d8d81b1b-8918-46cf-9902-ee7952aea166" (UID: "d8d81b1b-8918-46cf-9902-ee7952aea166"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.101031 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "f195965c-64f5-46e5-aaed-7acdc7af3d64" (UID: "f195965c-64f5-46e5-aaed-7acdc7af3d64"). InnerVolumeSpecName "local-storage02-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.101115 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8d81b1b-8918-46cf-9902-ee7952aea166-kube-api-access-nzffg" (OuterVolumeSpecName: "kube-api-access-nzffg") pod "d8d81b1b-8918-46cf-9902-ee7952aea166" (UID: "d8d81b1b-8918-46cf-9902-ee7952aea166"). InnerVolumeSpecName "kube-api-access-nzffg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.101101 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8d81b1b-8918-46cf-9902-ee7952aea166-scripts" (OuterVolumeSpecName: "scripts") pod "d8d81b1b-8918-46cf-9902-ee7952aea166" (UID: "d8d81b1b-8918-46cf-9902-ee7952aea166"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.101192 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f195965c-64f5-46e5-aaed-7acdc7af3d64-kube-api-access-47nph" (OuterVolumeSpecName: "kube-api-access-47nph") pod "f195965c-64f5-46e5-aaed-7acdc7af3d64" (UID: "f195965c-64f5-46e5-aaed-7acdc7af3d64"). InnerVolumeSpecName "kube-api-access-47nph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.101579 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f195965c-64f5-46e5-aaed-7acdc7af3d64-logs" (OuterVolumeSpecName: "logs") pod "f195965c-64f5-46e5-aaed-7acdc7af3d64" (UID: "f195965c-64f5-46e5-aaed-7acdc7af3d64"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.123065 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8d81b1b-8918-46cf-9902-ee7952aea166-config-data" (OuterVolumeSpecName: "config-data") pod "d8d81b1b-8918-46cf-9902-ee7952aea166" (UID: "d8d81b1b-8918-46cf-9902-ee7952aea166"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.165488 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f195965c-64f5-46e5-aaed-7acdc7af3d64-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f195965c-64f5-46e5-aaed-7acdc7af3d64" (UID: "f195965c-64f5-46e5-aaed-7acdc7af3d64"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.166287 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8d81b1b-8918-46cf-9902-ee7952aea166-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d8d81b1b-8918-46cf-9902-ee7952aea166" (UID: "d8d81b1b-8918-46cf-9902-ee7952aea166"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.171305 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f195965c-64f5-46e5-aaed-7acdc7af3d64-config-data" (OuterVolumeSpecName: "config-data") pod "f195965c-64f5-46e5-aaed-7acdc7af3d64" (UID: "f195965c-64f5-46e5-aaed-7acdc7af3d64"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.181315 4854 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.181349 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzffg\" (UniqueName: \"kubernetes.io/projected/d8d81b1b-8918-46cf-9902-ee7952aea166-kube-api-access-nzffg\") on node \"crc\" DevicePath \"\"" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.181364 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8d81b1b-8918-46cf-9902-ee7952aea166-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.181376 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f195965c-64f5-46e5-aaed-7acdc7af3d64-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.181388 4854 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d8d81b1b-8918-46cf-9902-ee7952aea166-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.181404 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-47nph\" (UniqueName: \"kubernetes.io/projected/f195965c-64f5-46e5-aaed-7acdc7af3d64-kube-api-access-47nph\") on node \"crc\" DevicePath \"\"" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.181417 4854 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f195965c-64f5-46e5-aaed-7acdc7af3d64-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.181432 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f195965c-64f5-46e5-aaed-7acdc7af3d64-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.181446 4854 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8d81b1b-8918-46cf-9902-ee7952aea166-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.181458 4854 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f195965c-64f5-46e5-aaed-7acdc7af3d64-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.181468 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8d81b1b-8918-46cf-9902-ee7952aea166-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.181479 4854 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d8d81b1b-8918-46cf-9902-ee7952aea166-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.181490 4854 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f195965c-64f5-46e5-aaed-7acdc7af3d64-logs\") on node \"crc\" DevicePath \"\"" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.205881 4854 
operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.216064 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f195965c-64f5-46e5-aaed-7acdc7af3d64-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f195965c-64f5-46e5-aaed-7acdc7af3d64" (UID: "f195965c-64f5-46e5-aaed-7acdc7af3d64"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.283242 4854 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f195965c-64f5-46e5-aaed-7acdc7af3d64-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.283282 4854 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.479504 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5464v" event={"ID":"d8d81b1b-8918-46cf-9902-ee7952aea166","Type":"ContainerDied","Data":"79f72559a58c8740cc06ede79fe34e45041c37867e0ff61ada5281ee4b9e433b"} Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.479789 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79f72559a58c8740cc06ede79fe34e45041c37867e0ff61ada5281ee4b9e433b" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.479528 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-5464v" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.481324 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f195965c-64f5-46e5-aaed-7acdc7af3d64","Type":"ContainerDied","Data":"7ef27292003a2e68dd4757f53f3f1a652ca7692f683cbf7a028c601cf3d08582"} Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.481376 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.481381 4854 scope.go:117] "RemoveContainer" containerID="9abe1ec76e1bc3b2e6c1fbb61b65dab7971238d75d1de7af6e076363473ef615" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.528103 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.543078 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.556166 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 12:41:46 crc kubenswrapper[4854]: E1007 12:41:46.556798 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f195965c-64f5-46e5-aaed-7acdc7af3d64" containerName="glance-httpd" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.556862 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="f195965c-64f5-46e5-aaed-7acdc7af3d64" containerName="glance-httpd" Oct 07 12:41:46 crc kubenswrapper[4854]: E1007 12:41:46.556936 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f195965c-64f5-46e5-aaed-7acdc7af3d64" containerName="glance-log" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.556985 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="f195965c-64f5-46e5-aaed-7acdc7af3d64" containerName="glance-log" Oct 07 12:41:46 crc kubenswrapper[4854]: E1007 12:41:46.557044 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8d81b1b-8918-46cf-9902-ee7952aea166" containerName="keystone-bootstrap" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.557094 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8d81b1b-8918-46cf-9902-ee7952aea166" containerName="keystone-bootstrap" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.557337 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="f195965c-64f5-46e5-aaed-7acdc7af3d64" containerName="glance-httpd" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.557420 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8d81b1b-8918-46cf-9902-ee7952aea166" containerName="keystone-bootstrap" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.557479 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="f195965c-64f5-46e5-aaed-7acdc7af3d64" containerName="glance-log" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.558468 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.560769 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.560909 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.563623 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-5464v"] Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.575121 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-5464v"] Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.586610 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.587446 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6837285-79fa-44f5-83bd-5d1e5959cb8a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b6837285-79fa-44f5-83bd-5d1e5959cb8a\") " pod="openstack/glance-default-internal-api-0" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.587618 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cd5sn\" (UniqueName: \"kubernetes.io/projected/b6837285-79fa-44f5-83bd-5d1e5959cb8a-kube-api-access-cd5sn\") pod \"glance-default-internal-api-0\" (UID: \"b6837285-79fa-44f5-83bd-5d1e5959cb8a\") " pod="openstack/glance-default-internal-api-0" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.587651 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b6837285-79fa-44f5-83bd-5d1e5959cb8a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b6837285-79fa-44f5-83bd-5d1e5959cb8a\") " pod="openstack/glance-default-internal-api-0" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.587722 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6837285-79fa-44f5-83bd-5d1e5959cb8a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b6837285-79fa-44f5-83bd-5d1e5959cb8a\") " pod="openstack/glance-default-internal-api-0" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.587814 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6837285-79fa-44f5-83bd-5d1e5959cb8a-logs\") pod \"glance-default-internal-api-0\" (UID: \"b6837285-79fa-44f5-83bd-5d1e5959cb8a\") " pod="openstack/glance-default-internal-api-0" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.587885 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6837285-79fa-44f5-83bd-5d1e5959cb8a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b6837285-79fa-44f5-83bd-5d1e5959cb8a\") " pod="openstack/glance-default-internal-api-0" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.587926 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"b6837285-79fa-44f5-83bd-5d1e5959cb8a\") " pod="openstack/glance-default-internal-api-0" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.587959 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6837285-79fa-44f5-83bd-5d1e5959cb8a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b6837285-79fa-44f5-83bd-5d1e5959cb8a\") " pod="openstack/glance-default-internal-api-0" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.647787 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-frlpd"] Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.649173 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-frlpd" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.653166 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.653467 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-m9mdj" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.653224 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.653280 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.659565 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-frlpd"] Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.692458 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6837285-79fa-44f5-83bd-5d1e5959cb8a-logs\") pod \"glance-default-internal-api-0\" (UID: \"b6837285-79fa-44f5-83bd-5d1e5959cb8a\") " pod="openstack/glance-default-internal-api-0" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.692587 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6837285-79fa-44f5-83bd-5d1e5959cb8a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b6837285-79fa-44f5-83bd-5d1e5959cb8a\") " pod="openstack/glance-default-internal-api-0" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.692628 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"b6837285-79fa-44f5-83bd-5d1e5959cb8a\") " pod="openstack/glance-default-internal-api-0" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.692677 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6837285-79fa-44f5-83bd-5d1e5959cb8a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b6837285-79fa-44f5-83bd-5d1e5959cb8a\") " pod="openstack/glance-default-internal-api-0" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.692741 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6837285-79fa-44f5-83bd-5d1e5959cb8a-internal-tls-certs\") pod 
\"glance-default-internal-api-0\" (UID: \"b6837285-79fa-44f5-83bd-5d1e5959cb8a\") " pod="openstack/glance-default-internal-api-0" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.693327 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cd5sn\" (UniqueName: \"kubernetes.io/projected/b6837285-79fa-44f5-83bd-5d1e5959cb8a-kube-api-access-cd5sn\") pod \"glance-default-internal-api-0\" (UID: \"b6837285-79fa-44f5-83bd-5d1e5959cb8a\") " pod="openstack/glance-default-internal-api-0" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.693393 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b6837285-79fa-44f5-83bd-5d1e5959cb8a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b6837285-79fa-44f5-83bd-5d1e5959cb8a\") " pod="openstack/glance-default-internal-api-0" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.693406 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6837285-79fa-44f5-83bd-5d1e5959cb8a-logs\") pod \"glance-default-internal-api-0\" (UID: \"b6837285-79fa-44f5-83bd-5d1e5959cb8a\") " pod="openstack/glance-default-internal-api-0" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.693568 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6837285-79fa-44f5-83bd-5d1e5959cb8a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b6837285-79fa-44f5-83bd-5d1e5959cb8a\") " pod="openstack/glance-default-internal-api-0" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.693725 4854 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"b6837285-79fa-44f5-83bd-5d1e5959cb8a\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-internal-api-0" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.695430 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b6837285-79fa-44f5-83bd-5d1e5959cb8a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b6837285-79fa-44f5-83bd-5d1e5959cb8a\") " pod="openstack/glance-default-internal-api-0" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.701260 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6837285-79fa-44f5-83bd-5d1e5959cb8a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b6837285-79fa-44f5-83bd-5d1e5959cb8a\") " pod="openstack/glance-default-internal-api-0" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.701283 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6837285-79fa-44f5-83bd-5d1e5959cb8a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b6837285-79fa-44f5-83bd-5d1e5959cb8a\") " pod="openstack/glance-default-internal-api-0" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.701440 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6837285-79fa-44f5-83bd-5d1e5959cb8a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b6837285-79fa-44f5-83bd-5d1e5959cb8a\") " 
pod="openstack/glance-default-internal-api-0" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.701750 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6837285-79fa-44f5-83bd-5d1e5959cb8a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b6837285-79fa-44f5-83bd-5d1e5959cb8a\") " pod="openstack/glance-default-internal-api-0" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.720345 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cd5sn\" (UniqueName: \"kubernetes.io/projected/b6837285-79fa-44f5-83bd-5d1e5959cb8a-kube-api-access-cd5sn\") pod \"glance-default-internal-api-0\" (UID: \"b6837285-79fa-44f5-83bd-5d1e5959cb8a\") " pod="openstack/glance-default-internal-api-0" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.724515 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8d81b1b-8918-46cf-9902-ee7952aea166" path="/var/lib/kubelet/pods/d8d81b1b-8918-46cf-9902-ee7952aea166/volumes" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.725108 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f195965c-64f5-46e5-aaed-7acdc7af3d64" path="/var/lib/kubelet/pods/f195965c-64f5-46e5-aaed-7acdc7af3d64/volumes" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.741543 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"b6837285-79fa-44f5-83bd-5d1e5959cb8a\") " pod="openstack/glance-default-internal-api-0" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.795647 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c69757e-7365-47af-a7ef-03c00a3aae33-config-data\") pod \"keystone-bootstrap-frlpd\" (UID: \"0c69757e-7365-47af-a7ef-03c00a3aae33\") " pod="openstack/keystone-bootstrap-frlpd" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.795723 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0c69757e-7365-47af-a7ef-03c00a3aae33-fernet-keys\") pod \"keystone-bootstrap-frlpd\" (UID: \"0c69757e-7365-47af-a7ef-03c00a3aae33\") " pod="openstack/keystone-bootstrap-frlpd" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.795765 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0c69757e-7365-47af-a7ef-03c00a3aae33-credential-keys\") pod \"keystone-bootstrap-frlpd\" (UID: \"0c69757e-7365-47af-a7ef-03c00a3aae33\") " pod="openstack/keystone-bootstrap-frlpd" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.795923 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82wmb\" (UniqueName: \"kubernetes.io/projected/0c69757e-7365-47af-a7ef-03c00a3aae33-kube-api-access-82wmb\") pod \"keystone-bootstrap-frlpd\" (UID: \"0c69757e-7365-47af-a7ef-03c00a3aae33\") " pod="openstack/keystone-bootstrap-frlpd" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.795968 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c69757e-7365-47af-a7ef-03c00a3aae33-scripts\") pod 
\"keystone-bootstrap-frlpd\" (UID: \"0c69757e-7365-47af-a7ef-03c00a3aae33\") " pod="openstack/keystone-bootstrap-frlpd" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.796244 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c69757e-7365-47af-a7ef-03c00a3aae33-combined-ca-bundle\") pod \"keystone-bootstrap-frlpd\" (UID: \"0c69757e-7365-47af-a7ef-03c00a3aae33\") " pod="openstack/keystone-bootstrap-frlpd" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.885651 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.897416 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c69757e-7365-47af-a7ef-03c00a3aae33-config-data\") pod \"keystone-bootstrap-frlpd\" (UID: \"0c69757e-7365-47af-a7ef-03c00a3aae33\") " pod="openstack/keystone-bootstrap-frlpd" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.897491 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0c69757e-7365-47af-a7ef-03c00a3aae33-fernet-keys\") pod \"keystone-bootstrap-frlpd\" (UID: \"0c69757e-7365-47af-a7ef-03c00a3aae33\") " pod="openstack/keystone-bootstrap-frlpd" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.897511 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0c69757e-7365-47af-a7ef-03c00a3aae33-credential-keys\") pod \"keystone-bootstrap-frlpd\" (UID: \"0c69757e-7365-47af-a7ef-03c00a3aae33\") " pod="openstack/keystone-bootstrap-frlpd" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.897546 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82wmb\" (UniqueName: \"kubernetes.io/projected/0c69757e-7365-47af-a7ef-03c00a3aae33-kube-api-access-82wmb\") pod \"keystone-bootstrap-frlpd\" (UID: \"0c69757e-7365-47af-a7ef-03c00a3aae33\") " pod="openstack/keystone-bootstrap-frlpd" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.897563 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c69757e-7365-47af-a7ef-03c00a3aae33-scripts\") pod \"keystone-bootstrap-frlpd\" (UID: \"0c69757e-7365-47af-a7ef-03c00a3aae33\") " pod="openstack/keystone-bootstrap-frlpd" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.897626 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c69757e-7365-47af-a7ef-03c00a3aae33-combined-ca-bundle\") pod \"keystone-bootstrap-frlpd\" (UID: \"0c69757e-7365-47af-a7ef-03c00a3aae33\") " pod="openstack/keystone-bootstrap-frlpd" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.901142 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c69757e-7365-47af-a7ef-03c00a3aae33-config-data\") pod \"keystone-bootstrap-frlpd\" (UID: \"0c69757e-7365-47af-a7ef-03c00a3aae33\") " pod="openstack/keystone-bootstrap-frlpd" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.901205 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/0c69757e-7365-47af-a7ef-03c00a3aae33-scripts\") pod \"keystone-bootstrap-frlpd\" (UID: \"0c69757e-7365-47af-a7ef-03c00a3aae33\") " pod="openstack/keystone-bootstrap-frlpd" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.901482 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c69757e-7365-47af-a7ef-03c00a3aae33-combined-ca-bundle\") pod \"keystone-bootstrap-frlpd\" (UID: \"0c69757e-7365-47af-a7ef-03c00a3aae33\") " pod="openstack/keystone-bootstrap-frlpd" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.903015 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0c69757e-7365-47af-a7ef-03c00a3aae33-credential-keys\") pod \"keystone-bootstrap-frlpd\" (UID: \"0c69757e-7365-47af-a7ef-03c00a3aae33\") " pod="openstack/keystone-bootstrap-frlpd" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.909553 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0c69757e-7365-47af-a7ef-03c00a3aae33-fernet-keys\") pod \"keystone-bootstrap-frlpd\" (UID: \"0c69757e-7365-47af-a7ef-03c00a3aae33\") " pod="openstack/keystone-bootstrap-frlpd" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.922975 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82wmb\" (UniqueName: \"kubernetes.io/projected/0c69757e-7365-47af-a7ef-03c00a3aae33-kube-api-access-82wmb\") pod \"keystone-bootstrap-frlpd\" (UID: \"0c69757e-7365-47af-a7ef-03c00a3aae33\") " pod="openstack/keystone-bootstrap-frlpd" Oct 07 12:41:46 crc kubenswrapper[4854]: I1007 12:41:46.975425 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-frlpd" Oct 07 12:41:49 crc kubenswrapper[4854]: I1007 12:41:49.367455 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-785d8bcb8c-9872c" Oct 07 12:41:49 crc kubenswrapper[4854]: I1007 12:41:49.444909 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-gzl8s"] Oct 07 12:41:49 crc kubenswrapper[4854]: I1007 12:41:49.456596 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74f6bcbc87-gzl8s" podUID="45b87c2d-9143-42f2-84b8-14db6534b894" containerName="dnsmasq-dns" containerID="cri-o://a67ba1c39b438c39f6e876b841981461fe372cdfdbd8f11d401bf07296dce3ea" gracePeriod=10 Oct 07 12:41:49 crc kubenswrapper[4854]: I1007 12:41:49.528161 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ca5c432d-812b-476b-b996-f54bd8d76eb6","Type":"ContainerDied","Data":"033bb90bf6173bd9f8d575695fbb8ad3302d612ed4e7c7df3b6c8fbe1b2da540"} Oct 07 12:41:49 crc kubenswrapper[4854]: I1007 12:41:49.528218 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="033bb90bf6173bd9f8d575695fbb8ad3302d612ed4e7c7df3b6c8fbe1b2da540" Oct 07 12:41:49 crc kubenswrapper[4854]: I1007 12:41:49.528197 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 12:41:49 crc kubenswrapper[4854]: I1007 12:41:49.649771 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ca5c432d-812b-476b-b996-f54bd8d76eb6\" (UID: \"ca5c432d-812b-476b-b996-f54bd8d76eb6\") " Oct 07 12:41:49 crc kubenswrapper[4854]: I1007 12:41:49.649829 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca5c432d-812b-476b-b996-f54bd8d76eb6-public-tls-certs\") pod \"ca5c432d-812b-476b-b996-f54bd8d76eb6\" (UID: \"ca5c432d-812b-476b-b996-f54bd8d76eb6\") " Oct 07 12:41:49 crc kubenswrapper[4854]: I1007 12:41:49.649856 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca5c432d-812b-476b-b996-f54bd8d76eb6-config-data\") pod \"ca5c432d-812b-476b-b996-f54bd8d76eb6\" (UID: \"ca5c432d-812b-476b-b996-f54bd8d76eb6\") " Oct 07 12:41:49 crc kubenswrapper[4854]: I1007 12:41:49.649889 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ca5c432d-812b-476b-b996-f54bd8d76eb6-httpd-run\") pod \"ca5c432d-812b-476b-b996-f54bd8d76eb6\" (UID: \"ca5c432d-812b-476b-b996-f54bd8d76eb6\") " Oct 07 12:41:49 crc kubenswrapper[4854]: I1007 12:41:49.649944 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca5c432d-812b-476b-b996-f54bd8d76eb6-combined-ca-bundle\") pod \"ca5c432d-812b-476b-b996-f54bd8d76eb6\" (UID: \"ca5c432d-812b-476b-b996-f54bd8d76eb6\") " Oct 07 12:41:49 crc kubenswrapper[4854]: I1007 12:41:49.650066 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca5c432d-812b-476b-b996-f54bd8d76eb6-logs\") pod \"ca5c432d-812b-476b-b996-f54bd8d76eb6\" (UID: \"ca5c432d-812b-476b-b996-f54bd8d76eb6\") " Oct 07 12:41:49 crc kubenswrapper[4854]: I1007 12:41:49.650120 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca5c432d-812b-476b-b996-f54bd8d76eb6-scripts\") pod \"ca5c432d-812b-476b-b996-f54bd8d76eb6\" (UID: \"ca5c432d-812b-476b-b996-f54bd8d76eb6\") " Oct 07 12:41:49 crc kubenswrapper[4854]: I1007 12:41:49.650167 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vn4r\" (UniqueName: \"kubernetes.io/projected/ca5c432d-812b-476b-b996-f54bd8d76eb6-kube-api-access-2vn4r\") pod \"ca5c432d-812b-476b-b996-f54bd8d76eb6\" (UID: \"ca5c432d-812b-476b-b996-f54bd8d76eb6\") " Oct 07 12:41:49 crc kubenswrapper[4854]: I1007 12:41:49.650428 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca5c432d-812b-476b-b996-f54bd8d76eb6-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ca5c432d-812b-476b-b996-f54bd8d76eb6" (UID: "ca5c432d-812b-476b-b996-f54bd8d76eb6"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:41:49 crc kubenswrapper[4854]: I1007 12:41:49.650585 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca5c432d-812b-476b-b996-f54bd8d76eb6-logs" (OuterVolumeSpecName: "logs") pod "ca5c432d-812b-476b-b996-f54bd8d76eb6" (UID: "ca5c432d-812b-476b-b996-f54bd8d76eb6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:41:49 crc kubenswrapper[4854]: I1007 12:41:49.650703 4854 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ca5c432d-812b-476b-b996-f54bd8d76eb6-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 07 12:41:49 crc kubenswrapper[4854]: I1007 12:41:49.650728 4854 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca5c432d-812b-476b-b996-f54bd8d76eb6-logs\") on node \"crc\" DevicePath \"\"" Oct 07 12:41:49 crc kubenswrapper[4854]: I1007 12:41:49.655903 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "ca5c432d-812b-476b-b996-f54bd8d76eb6" (UID: "ca5c432d-812b-476b-b996-f54bd8d76eb6"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 07 12:41:49 crc kubenswrapper[4854]: I1007 12:41:49.656867 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca5c432d-812b-476b-b996-f54bd8d76eb6-kube-api-access-2vn4r" (OuterVolumeSpecName: "kube-api-access-2vn4r") pod "ca5c432d-812b-476b-b996-f54bd8d76eb6" (UID: "ca5c432d-812b-476b-b996-f54bd8d76eb6"). InnerVolumeSpecName "kube-api-access-2vn4r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:41:49 crc kubenswrapper[4854]: I1007 12:41:49.657061 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca5c432d-812b-476b-b996-f54bd8d76eb6-scripts" (OuterVolumeSpecName: "scripts") pod "ca5c432d-812b-476b-b996-f54bd8d76eb6" (UID: "ca5c432d-812b-476b-b996-f54bd8d76eb6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:41:49 crc kubenswrapper[4854]: I1007 12:41:49.691955 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca5c432d-812b-476b-b996-f54bd8d76eb6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ca5c432d-812b-476b-b996-f54bd8d76eb6" (UID: "ca5c432d-812b-476b-b996-f54bd8d76eb6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:41:49 crc kubenswrapper[4854]: E1007 12:41:49.718865 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca5c432d-812b-476b-b996-f54bd8d76eb6-config-data podName:ca5c432d-812b-476b-b996-f54bd8d76eb6 nodeName:}" failed. No retries permitted until 2025-10-07 12:41:50.218833983 +0000 UTC m=+1026.206666238 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "config-data" (UniqueName: "kubernetes.io/secret/ca5c432d-812b-476b-b996-f54bd8d76eb6-config-data") pod "ca5c432d-812b-476b-b996-f54bd8d76eb6" (UID: "ca5c432d-812b-476b-b996-f54bd8d76eb6") : error deleting /var/lib/kubelet/pods/ca5c432d-812b-476b-b996-f54bd8d76eb6/volume-subpaths: remove /var/lib/kubelet/pods/ca5c432d-812b-476b-b996-f54bd8d76eb6/volume-subpaths: no such file or directory Oct 07 12:41:49 crc kubenswrapper[4854]: I1007 12:41:49.726760 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca5c432d-812b-476b-b996-f54bd8d76eb6-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ca5c432d-812b-476b-b996-f54bd8d76eb6" (UID: "ca5c432d-812b-476b-b996-f54bd8d76eb6"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:41:49 crc kubenswrapper[4854]: I1007 12:41:49.772008 4854 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca5c432d-812b-476b-b996-f54bd8d76eb6-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 12:41:49 crc kubenswrapper[4854]: I1007 12:41:49.772051 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vn4r\" (UniqueName: \"kubernetes.io/projected/ca5c432d-812b-476b-b996-f54bd8d76eb6-kube-api-access-2vn4r\") on node \"crc\" DevicePath \"\"" Oct 07 12:41:49 crc kubenswrapper[4854]: I1007 12:41:49.772073 4854 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Oct 07 12:41:49 crc kubenswrapper[4854]: I1007 12:41:49.772087 4854 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca5c432d-812b-476b-b996-f54bd8d76eb6-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 12:41:49 crc kubenswrapper[4854]: I1007 12:41:49.772100 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca5c432d-812b-476b-b996-f54bd8d76eb6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:41:49 crc kubenswrapper[4854]: I1007 12:41:49.821783 4854 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Oct 07 12:41:49 crc kubenswrapper[4854]: I1007 12:41:49.873329 4854 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Oct 07 12:41:50 crc kubenswrapper[4854]: I1007 12:41:50.280358 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca5c432d-812b-476b-b996-f54bd8d76eb6-config-data\") pod \"ca5c432d-812b-476b-b996-f54bd8d76eb6\" (UID: \"ca5c432d-812b-476b-b996-f54bd8d76eb6\") " Oct 07 12:41:50 crc kubenswrapper[4854]: I1007 12:41:50.285218 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca5c432d-812b-476b-b996-f54bd8d76eb6-config-data" (OuterVolumeSpecName: "config-data") pod "ca5c432d-812b-476b-b996-f54bd8d76eb6" (UID: "ca5c432d-812b-476b-b996-f54bd8d76eb6"). InnerVolumeSpecName "config-data". 
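The nestedpendingoperations error just above is the kubelet backing off after the config-data subPath cleanup for pod ca5c432d-812b-476b-b996-f54bd8d76eb6 failed because the volume-subpaths directory was already gone; the entry notes that no retry is permitted for 500 ms (durationBeforeRetry 500ms), and the follow-up entries at 12:41:50 show the unmount succeeding on that retry. Failed volume operations like this are retried with a growing delay, of which the 500 ms here is the first step. A rough, illustrative backoff-schedule generator follows; the doubling factor and the cap are assumptions for illustration, not values read from this log or from the kubelet source:

    from datetime import timedelta

    def retry_schedule(initial=timedelta(milliseconds=500),
                       cap=timedelta(minutes=2),
                       attempts=8):
        """Yield an exponential backoff schedule: double the delay each attempt, up to cap.

        The 500 ms initial delay mirrors the durationBeforeRetry in the entry above;
        the doubling and the 2-minute cap are illustrative assumptions.
        """
        delay = initial
        for _ in range(attempts):
            yield min(delay, cap)
            delay *= 2

    if __name__ == "__main__":
        print(", ".join(str(d) for d in retry_schedule()))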
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:41:50 crc kubenswrapper[4854]: I1007 12:41:50.382882 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca5c432d-812b-476b-b996-f54bd8d76eb6-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 12:41:50 crc kubenswrapper[4854]: I1007 12:41:50.536444 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 12:41:50 crc kubenswrapper[4854]: I1007 12:41:50.569100 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 12:41:50 crc kubenswrapper[4854]: I1007 12:41:50.577661 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 12:41:50 crc kubenswrapper[4854]: I1007 12:41:50.606710 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 12:41:50 crc kubenswrapper[4854]: E1007 12:41:50.607136 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca5c432d-812b-476b-b996-f54bd8d76eb6" containerName="glance-httpd" Oct 07 12:41:50 crc kubenswrapper[4854]: I1007 12:41:50.607163 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca5c432d-812b-476b-b996-f54bd8d76eb6" containerName="glance-httpd" Oct 07 12:41:50 crc kubenswrapper[4854]: E1007 12:41:50.607195 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca5c432d-812b-476b-b996-f54bd8d76eb6" containerName="glance-log" Oct 07 12:41:50 crc kubenswrapper[4854]: I1007 12:41:50.607206 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca5c432d-812b-476b-b996-f54bd8d76eb6" containerName="glance-log" Oct 07 12:41:50 crc kubenswrapper[4854]: I1007 12:41:50.607409 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca5c432d-812b-476b-b996-f54bd8d76eb6" containerName="glance-httpd" Oct 07 12:41:50 crc kubenswrapper[4854]: I1007 12:41:50.607440 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca5c432d-812b-476b-b996-f54bd8d76eb6" containerName="glance-log" Oct 07 12:41:50 crc kubenswrapper[4854]: I1007 12:41:50.608451 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 12:41:50 crc kubenswrapper[4854]: I1007 12:41:50.610317 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 07 12:41:50 crc kubenswrapper[4854]: I1007 12:41:50.614573 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 07 12:41:50 crc kubenswrapper[4854]: I1007 12:41:50.622071 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 12:41:50 crc kubenswrapper[4854]: I1007 12:41:50.713661 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca5c432d-812b-476b-b996-f54bd8d76eb6" path="/var/lib/kubelet/pods/ca5c432d-812b-476b-b996-f54bd8d76eb6/volumes" Oct 07 12:41:50 crc kubenswrapper[4854]: I1007 12:41:50.789105 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clwd6\" (UniqueName: \"kubernetes.io/projected/9dc203ca-4a7b-4508-87d6-4922a76a1fb7-kube-api-access-clwd6\") pod \"glance-default-external-api-0\" (UID: \"9dc203ca-4a7b-4508-87d6-4922a76a1fb7\") " pod="openstack/glance-default-external-api-0" Oct 07 12:41:50 crc kubenswrapper[4854]: I1007 12:41:50.789189 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9dc203ca-4a7b-4508-87d6-4922a76a1fb7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9dc203ca-4a7b-4508-87d6-4922a76a1fb7\") " pod="openstack/glance-default-external-api-0" Oct 07 12:41:50 crc kubenswrapper[4854]: I1007 12:41:50.789221 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"9dc203ca-4a7b-4508-87d6-4922a76a1fb7\") " pod="openstack/glance-default-external-api-0" Oct 07 12:41:50 crc kubenswrapper[4854]: I1007 12:41:50.789242 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9dc203ca-4a7b-4508-87d6-4922a76a1fb7-logs\") pod \"glance-default-external-api-0\" (UID: \"9dc203ca-4a7b-4508-87d6-4922a76a1fb7\") " pod="openstack/glance-default-external-api-0" Oct 07 12:41:50 crc kubenswrapper[4854]: I1007 12:41:50.789267 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dc203ca-4a7b-4508-87d6-4922a76a1fb7-config-data\") pod \"glance-default-external-api-0\" (UID: \"9dc203ca-4a7b-4508-87d6-4922a76a1fb7\") " pod="openstack/glance-default-external-api-0" Oct 07 12:41:50 crc kubenswrapper[4854]: I1007 12:41:50.789297 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9dc203ca-4a7b-4508-87d6-4922a76a1fb7-scripts\") pod \"glance-default-external-api-0\" (UID: \"9dc203ca-4a7b-4508-87d6-4922a76a1fb7\") " pod="openstack/glance-default-external-api-0" Oct 07 12:41:50 crc kubenswrapper[4854]: I1007 12:41:50.789325 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9dc203ca-4a7b-4508-87d6-4922a76a1fb7-httpd-run\") pod \"glance-default-external-api-0\" 
(UID: \"9dc203ca-4a7b-4508-87d6-4922a76a1fb7\") " pod="openstack/glance-default-external-api-0" Oct 07 12:41:50 crc kubenswrapper[4854]: I1007 12:41:50.789394 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dc203ca-4a7b-4508-87d6-4922a76a1fb7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9dc203ca-4a7b-4508-87d6-4922a76a1fb7\") " pod="openstack/glance-default-external-api-0" Oct 07 12:41:50 crc kubenswrapper[4854]: I1007 12:41:50.890944 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dc203ca-4a7b-4508-87d6-4922a76a1fb7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9dc203ca-4a7b-4508-87d6-4922a76a1fb7\") " pod="openstack/glance-default-external-api-0" Oct 07 12:41:50 crc kubenswrapper[4854]: I1007 12:41:50.891009 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clwd6\" (UniqueName: \"kubernetes.io/projected/9dc203ca-4a7b-4508-87d6-4922a76a1fb7-kube-api-access-clwd6\") pod \"glance-default-external-api-0\" (UID: \"9dc203ca-4a7b-4508-87d6-4922a76a1fb7\") " pod="openstack/glance-default-external-api-0" Oct 07 12:41:50 crc kubenswrapper[4854]: I1007 12:41:50.891045 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9dc203ca-4a7b-4508-87d6-4922a76a1fb7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9dc203ca-4a7b-4508-87d6-4922a76a1fb7\") " pod="openstack/glance-default-external-api-0" Oct 07 12:41:50 crc kubenswrapper[4854]: I1007 12:41:50.891070 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"9dc203ca-4a7b-4508-87d6-4922a76a1fb7\") " pod="openstack/glance-default-external-api-0" Oct 07 12:41:50 crc kubenswrapper[4854]: I1007 12:41:50.891096 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9dc203ca-4a7b-4508-87d6-4922a76a1fb7-logs\") pod \"glance-default-external-api-0\" (UID: \"9dc203ca-4a7b-4508-87d6-4922a76a1fb7\") " pod="openstack/glance-default-external-api-0" Oct 07 12:41:50 crc kubenswrapper[4854]: I1007 12:41:50.891130 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dc203ca-4a7b-4508-87d6-4922a76a1fb7-config-data\") pod \"glance-default-external-api-0\" (UID: \"9dc203ca-4a7b-4508-87d6-4922a76a1fb7\") " pod="openstack/glance-default-external-api-0" Oct 07 12:41:50 crc kubenswrapper[4854]: I1007 12:41:50.891165 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9dc203ca-4a7b-4508-87d6-4922a76a1fb7-scripts\") pod \"glance-default-external-api-0\" (UID: \"9dc203ca-4a7b-4508-87d6-4922a76a1fb7\") " pod="openstack/glance-default-external-api-0" Oct 07 12:41:50 crc kubenswrapper[4854]: I1007 12:41:50.891203 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9dc203ca-4a7b-4508-87d6-4922a76a1fb7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9dc203ca-4a7b-4508-87d6-4922a76a1fb7\") " 
pod="openstack/glance-default-external-api-0" Oct 07 12:41:50 crc kubenswrapper[4854]: I1007 12:41:50.892005 4854 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"9dc203ca-4a7b-4508-87d6-4922a76a1fb7\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-external-api-0" Oct 07 12:41:50 crc kubenswrapper[4854]: I1007 12:41:50.892002 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9dc203ca-4a7b-4508-87d6-4922a76a1fb7-logs\") pod \"glance-default-external-api-0\" (UID: \"9dc203ca-4a7b-4508-87d6-4922a76a1fb7\") " pod="openstack/glance-default-external-api-0" Oct 07 12:41:50 crc kubenswrapper[4854]: I1007 12:41:50.892127 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9dc203ca-4a7b-4508-87d6-4922a76a1fb7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9dc203ca-4a7b-4508-87d6-4922a76a1fb7\") " pod="openstack/glance-default-external-api-0" Oct 07 12:41:50 crc kubenswrapper[4854]: I1007 12:41:50.896342 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9dc203ca-4a7b-4508-87d6-4922a76a1fb7-scripts\") pod \"glance-default-external-api-0\" (UID: \"9dc203ca-4a7b-4508-87d6-4922a76a1fb7\") " pod="openstack/glance-default-external-api-0" Oct 07 12:41:50 crc kubenswrapper[4854]: I1007 12:41:50.899691 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dc203ca-4a7b-4508-87d6-4922a76a1fb7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9dc203ca-4a7b-4508-87d6-4922a76a1fb7\") " pod="openstack/glance-default-external-api-0" Oct 07 12:41:50 crc kubenswrapper[4854]: I1007 12:41:50.899696 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dc203ca-4a7b-4508-87d6-4922a76a1fb7-config-data\") pod \"glance-default-external-api-0\" (UID: \"9dc203ca-4a7b-4508-87d6-4922a76a1fb7\") " pod="openstack/glance-default-external-api-0" Oct 07 12:41:50 crc kubenswrapper[4854]: I1007 12:41:50.901312 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9dc203ca-4a7b-4508-87d6-4922a76a1fb7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9dc203ca-4a7b-4508-87d6-4922a76a1fb7\") " pod="openstack/glance-default-external-api-0" Oct 07 12:41:50 crc kubenswrapper[4854]: I1007 12:41:50.911073 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clwd6\" (UniqueName: \"kubernetes.io/projected/9dc203ca-4a7b-4508-87d6-4922a76a1fb7-kube-api-access-clwd6\") pod \"glance-default-external-api-0\" (UID: \"9dc203ca-4a7b-4508-87d6-4922a76a1fb7\") " pod="openstack/glance-default-external-api-0" Oct 07 12:41:50 crc kubenswrapper[4854]: I1007 12:41:50.937057 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"9dc203ca-4a7b-4508-87d6-4922a76a1fb7\") " pod="openstack/glance-default-external-api-0" Oct 07 12:41:51 crc kubenswrapper[4854]: I1007 12:41:51.238669 4854 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 12:41:51 crc kubenswrapper[4854]: I1007 12:41:51.544707 4854 generic.go:334] "Generic (PLEG): container finished" podID="45b87c2d-9143-42f2-84b8-14db6534b894" containerID="a67ba1c39b438c39f6e876b841981461fe372cdfdbd8f11d401bf07296dce3ea" exitCode=0 Oct 07 12:41:51 crc kubenswrapper[4854]: I1007 12:41:51.544753 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-gzl8s" event={"ID":"45b87c2d-9143-42f2-84b8-14db6534b894","Type":"ContainerDied","Data":"a67ba1c39b438c39f6e876b841981461fe372cdfdbd8f11d401bf07296dce3ea"} Oct 07 12:41:52 crc kubenswrapper[4854]: I1007 12:41:52.644870 4854 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-gzl8s" podUID="45b87c2d-9143-42f2-84b8-14db6534b894" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.128:5353: connect: connection refused" Oct 07 12:41:53 crc kubenswrapper[4854]: I1007 12:41:53.659323 4854 scope.go:117] "RemoveContainer" containerID="fac286218a4ce4d48b5aa6d65b2f8184e66d1fb966ae5d7227795ac793e05cb9" Oct 07 12:41:56 crc kubenswrapper[4854]: E1007 12:41:56.884850 4854 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Oct 07 12:41:56 crc kubenswrapper[4854]: E1007 12:41:56.885421 4854 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n65hc9h96h645h59dhcchf4h556hc8h5ch5cdhd8h5d9h65dh65dh86h5fdhdfh575hcdh64fh94h85h67bhb8h5bfh5fdh666h599h7ch548h54dq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ll5l5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(211f81db-7ca9-43c6-bd31-aa60758129e6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 07 12:41:57 crc kubenswrapper[4854]: I1007 12:41:57.645881 4854 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-gzl8s" podUID="45b87c2d-9143-42f2-84b8-14db6534b894" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.128:5353: connect: connection refused" Oct 07 12:42:00 crc kubenswrapper[4854]: E1007 12:42:00.510899 4854 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Oct 07 12:42:00 crc kubenswrapper[4854]: E1007 12:42:00.511394 4854 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sf6c2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-db-sync-jbv7l_openstack(06a39602-5206-42d4-a283-31650db9bd54): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 07 12:42:00 crc kubenswrapper[4854]: E1007 12:42:00.512958 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-jbv7l" podUID="06a39602-5206-42d4-a283-31650db9bd54" Oct 07 12:42:00 crc kubenswrapper[4854]: E1007 12:42:00.632409 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-jbv7l" podUID="06a39602-5206-42d4-a283-31650db9bd54" Oct 07 12:42:01 crc kubenswrapper[4854]: I1007 12:42:01.115437 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-gzl8s" Oct 07 12:42:01 crc kubenswrapper[4854]: I1007 12:42:01.200879 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45b87c2d-9143-42f2-84b8-14db6534b894-config\") pod \"45b87c2d-9143-42f2-84b8-14db6534b894\" (UID: \"45b87c2d-9143-42f2-84b8-14db6534b894\") " Oct 07 12:42:01 crc kubenswrapper[4854]: I1007 12:42:01.200947 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45b87c2d-9143-42f2-84b8-14db6534b894-dns-svc\") pod \"45b87c2d-9143-42f2-84b8-14db6534b894\" (UID: \"45b87c2d-9143-42f2-84b8-14db6534b894\") " Oct 07 12:42:01 crc kubenswrapper[4854]: I1007 12:42:01.200986 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45b87c2d-9143-42f2-84b8-14db6534b894-ovsdbserver-nb\") pod \"45b87c2d-9143-42f2-84b8-14db6534b894\" (UID: \"45b87c2d-9143-42f2-84b8-14db6534b894\") " Oct 07 12:42:01 crc kubenswrapper[4854]: I1007 12:42:01.201008 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45b87c2d-9143-42f2-84b8-14db6534b894-ovsdbserver-sb\") pod \"45b87c2d-9143-42f2-84b8-14db6534b894\" (UID: \"45b87c2d-9143-42f2-84b8-14db6534b894\") " Oct 07 12:42:01 crc kubenswrapper[4854]: I1007 12:42:01.201041 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/45b87c2d-9143-42f2-84b8-14db6534b894-dns-swift-storage-0\") pod \"45b87c2d-9143-42f2-84b8-14db6534b894\" (UID: \"45b87c2d-9143-42f2-84b8-14db6534b894\") " Oct 07 12:42:01 crc kubenswrapper[4854]: I1007 12:42:01.201206 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbckt\" (UniqueName: \"kubernetes.io/projected/45b87c2d-9143-42f2-84b8-14db6534b894-kube-api-access-xbckt\") pod \"45b87c2d-9143-42f2-84b8-14db6534b894\" (UID: \"45b87c2d-9143-42f2-84b8-14db6534b894\") " Oct 07 12:42:01 crc kubenswrapper[4854]: I1007 12:42:01.210608 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45b87c2d-9143-42f2-84b8-14db6534b894-kube-api-access-xbckt" (OuterVolumeSpecName: "kube-api-access-xbckt") pod "45b87c2d-9143-42f2-84b8-14db6534b894" (UID: "45b87c2d-9143-42f2-84b8-14db6534b894"). InnerVolumeSpecName "kube-api-access-xbckt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:42:01 crc kubenswrapper[4854]: I1007 12:42:01.253888 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45b87c2d-9143-42f2-84b8-14db6534b894-config" (OuterVolumeSpecName: "config") pod "45b87c2d-9143-42f2-84b8-14db6534b894" (UID: "45b87c2d-9143-42f2-84b8-14db6534b894"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:42:01 crc kubenswrapper[4854]: I1007 12:42:01.259377 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45b87c2d-9143-42f2-84b8-14db6534b894-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "45b87c2d-9143-42f2-84b8-14db6534b894" (UID: "45b87c2d-9143-42f2-84b8-14db6534b894"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:42:01 crc kubenswrapper[4854]: I1007 12:42:01.260123 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45b87c2d-9143-42f2-84b8-14db6534b894-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "45b87c2d-9143-42f2-84b8-14db6534b894" (UID: "45b87c2d-9143-42f2-84b8-14db6534b894"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:42:01 crc kubenswrapper[4854]: I1007 12:42:01.273748 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45b87c2d-9143-42f2-84b8-14db6534b894-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "45b87c2d-9143-42f2-84b8-14db6534b894" (UID: "45b87c2d-9143-42f2-84b8-14db6534b894"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:42:01 crc kubenswrapper[4854]: I1007 12:42:01.279749 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45b87c2d-9143-42f2-84b8-14db6534b894-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "45b87c2d-9143-42f2-84b8-14db6534b894" (UID: "45b87c2d-9143-42f2-84b8-14db6534b894"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:42:01 crc kubenswrapper[4854]: I1007 12:42:01.304374 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbckt\" (UniqueName: \"kubernetes.io/projected/45b87c2d-9143-42f2-84b8-14db6534b894-kube-api-access-xbckt\") on node \"crc\" DevicePath \"\"" Oct 07 12:42:01 crc kubenswrapper[4854]: I1007 12:42:01.304405 4854 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45b87c2d-9143-42f2-84b8-14db6534b894-config\") on node \"crc\" DevicePath \"\"" Oct 07 12:42:01 crc kubenswrapper[4854]: I1007 12:42:01.304416 4854 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45b87c2d-9143-42f2-84b8-14db6534b894-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 12:42:01 crc kubenswrapper[4854]: I1007 12:42:01.304425 4854 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45b87c2d-9143-42f2-84b8-14db6534b894-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 12:42:01 crc kubenswrapper[4854]: I1007 12:42:01.304433 4854 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45b87c2d-9143-42f2-84b8-14db6534b894-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 12:42:01 crc kubenswrapper[4854]: I1007 12:42:01.304443 4854 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/45b87c2d-9143-42f2-84b8-14db6534b894-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 07 12:42:01 crc kubenswrapper[4854]: I1007 12:42:01.639330 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-gzl8s" event={"ID":"45b87c2d-9143-42f2-84b8-14db6534b894","Type":"ContainerDied","Data":"3ebbfc7d3ca75d5cc2b878d51648535518c5b2aae128a7fc6f858d32f848d511"} Oct 07 12:42:01 crc kubenswrapper[4854]: I1007 12:42:01.639428 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-gzl8s" Oct 07 12:42:01 crc kubenswrapper[4854]: I1007 12:42:01.669499 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-gzl8s"] Oct 07 12:42:01 crc kubenswrapper[4854]: I1007 12:42:01.676581 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-gzl8s"] Oct 07 12:42:01 crc kubenswrapper[4854]: E1007 12:42:01.875259 4854 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Oct 07 12:42:01 crc kubenswrapper[4854]: E1007 12:42:01.875446 4854 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4jkcz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-hnhnd_openstack(57eafbaf-440c-432a-a1ab-55032cd2f54a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 07 12:42:01 crc kubenswrapper[4854]: E1007 12:42:01.876805 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-hnhnd" podUID="57eafbaf-440c-432a-a1ab-55032cd2f54a" Oct 07 12:42:02 crc kubenswrapper[4854]: E1007 12:42:02.651894 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-hnhnd" podUID="57eafbaf-440c-432a-a1ab-55032cd2f54a" Oct 07 12:42:02 crc kubenswrapper[4854]: I1007 12:42:02.713303 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45b87c2d-9143-42f2-84b8-14db6534b894" 
path="/var/lib/kubelet/pods/45b87c2d-9143-42f2-84b8-14db6534b894/volumes" Oct 07 12:42:10 crc kubenswrapper[4854]: I1007 12:42:10.808078 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 12:42:10 crc kubenswrapper[4854]: I1007 12:42:10.809459 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 12:42:10 crc kubenswrapper[4854]: I1007 12:42:10.809582 4854 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" Oct 07 12:42:10 crc kubenswrapper[4854]: I1007 12:42:10.810932 4854 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ce4378e583bfb2c39ca2c2683e3f5d09095f8b3086e8373f80ece7dbb8bdaa5a"} pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 12:42:10 crc kubenswrapper[4854]: I1007 12:42:10.811003 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" containerID="cri-o://ce4378e583bfb2c39ca2c2683e3f5d09095f8b3086e8373f80ece7dbb8bdaa5a" gracePeriod=600 Oct 07 12:42:13 crc kubenswrapper[4854]: I1007 12:42:13.755946 4854 scope.go:117] "RemoveContainer" containerID="a67ba1c39b438c39f6e876b841981461fe372cdfdbd8f11d401bf07296dce3ea" Oct 07 12:42:13 crc kubenswrapper[4854]: I1007 12:42:13.791035 4854 generic.go:334] "Generic (PLEG): container finished" podID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerID="ce4378e583bfb2c39ca2c2683e3f5d09095f8b3086e8373f80ece7dbb8bdaa5a" exitCode=0 Oct 07 12:42:13 crc kubenswrapper[4854]: I1007 12:42:13.791123 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" event={"ID":"40b8b82d-cfd5-41d7-8673-5774db092c85","Type":"ContainerDied","Data":"ce4378e583bfb2c39ca2c2683e3f5d09095f8b3086e8373f80ece7dbb8bdaa5a"} Oct 07 12:42:14 crc kubenswrapper[4854]: I1007 12:42:14.222763 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-frlpd"] Oct 07 12:42:14 crc kubenswrapper[4854]: I1007 12:42:14.306691 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 12:42:15 crc kubenswrapper[4854]: W1007 12:42:15.080629 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0c69757e_7365_47af_a7ef_03c00a3aae33.slice/crio-1c2f97b2e6b6704dd843badd4006d7c83da483d66d964c05ea4b614471e9641e WatchSource:0}: Error finding container 1c2f97b2e6b6704dd843badd4006d7c83da483d66d964c05ea4b614471e9641e: Status 404 returned error can't find the container with id 1c2f97b2e6b6704dd843badd4006d7c83da483d66d964c05ea4b614471e9641e Oct 07 12:42:15 crc kubenswrapper[4854]: W1007 12:42:15.085095 
4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb6837285_79fa_44f5_83bd_5d1e5959cb8a.slice/crio-a662bbf6531bbfad94c1be7360720244bcf72dcf71612b1f6b296d9dd20dd422 WatchSource:0}: Error finding container a662bbf6531bbfad94c1be7360720244bcf72dcf71612b1f6b296d9dd20dd422: Status 404 returned error can't find the container with id a662bbf6531bbfad94c1be7360720244bcf72dcf71612b1f6b296d9dd20dd422 Oct 07 12:42:15 crc kubenswrapper[4854]: I1007 12:42:15.112587 4854 scope.go:117] "RemoveContainer" containerID="6cfb1b6f86e8f5068f1ccdd5f7c19800635f6d93230702e09b129ba4a20e4333" Oct 07 12:42:15 crc kubenswrapper[4854]: E1007 12:42:15.129514 4854 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Oct 07 12:42:15 crc kubenswrapper[4854]: E1007 12:42:15.129699 4854 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5bm65,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-7zmqn_openstack(74714c8f-dea6-40be-9985-d254729920c9): ErrImagePull: rpc error: code = 
Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 07 12:42:15 crc kubenswrapper[4854]: E1007 12:42:15.130897 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-7zmqn" podUID="74714c8f-dea6-40be-9985-d254729920c9" Oct 07 12:42:15 crc kubenswrapper[4854]: I1007 12:42:15.565809 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 12:42:15 crc kubenswrapper[4854]: I1007 12:42:15.572434 4854 scope.go:117] "RemoveContainer" containerID="ad58428eb7a4eee4282ef9e4c324813616eaa8e2cac7c60386af842a1cd061ba" Oct 07 12:42:15 crc kubenswrapper[4854]: W1007 12:42:15.612685 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9dc203ca_4a7b_4508_87d6_4922a76a1fb7.slice/crio-952f6d163b565dd3772a045351572ebe9fb6a9e6aae6019001c353abc93b1a75 WatchSource:0}: Error finding container 952f6d163b565dd3772a045351572ebe9fb6a9e6aae6019001c353abc93b1a75: Status 404 returned error can't find the container with id 952f6d163b565dd3772a045351572ebe9fb6a9e6aae6019001c353abc93b1a75 Oct 07 12:42:15 crc kubenswrapper[4854]: I1007 12:42:15.816375 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b6837285-79fa-44f5-83bd-5d1e5959cb8a","Type":"ContainerStarted","Data":"a662bbf6531bbfad94c1be7360720244bcf72dcf71612b1f6b296d9dd20dd422"} Oct 07 12:42:15 crc kubenswrapper[4854]: I1007 12:42:15.825788 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-frlpd" event={"ID":"0c69757e-7365-47af-a7ef-03c00a3aae33","Type":"ContainerStarted","Data":"1c2f97b2e6b6704dd843badd4006d7c83da483d66d964c05ea4b614471e9641e"} Oct 07 12:42:15 crc kubenswrapper[4854]: I1007 12:42:15.832654 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9dc203ca-4a7b-4508-87d6-4922a76a1fb7","Type":"ContainerStarted","Data":"952f6d163b565dd3772a045351572ebe9fb6a9e6aae6019001c353abc93b1a75"} Oct 07 12:42:15 crc kubenswrapper[4854]: I1007 12:42:15.836604 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" event={"ID":"40b8b82d-cfd5-41d7-8673-5774db092c85","Type":"ContainerStarted","Data":"294ebdf1eb05d7d685f741557c635439a170a1959729e348cac4935535da7799"} Oct 07 12:42:15 crc kubenswrapper[4854]: E1007 12:42:15.838656 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-7zmqn" podUID="74714c8f-dea6-40be-9985-d254729920c9" Oct 07 12:42:16 crc kubenswrapper[4854]: I1007 12:42:16.847043 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jbv7l" event={"ID":"06a39602-5206-42d4-a283-31650db9bd54","Type":"ContainerStarted","Data":"71c7c2899fa0ebe6be9173463561bc932028e5a6606d93d0e217bab53577b794"} Oct 07 12:42:16 crc kubenswrapper[4854]: I1007 12:42:16.858101 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-hnhnd" 
event={"ID":"57eafbaf-440c-432a-a1ab-55032cd2f54a","Type":"ContainerStarted","Data":"22df39064c8d3fc8a015bb827b717afaeaa993111ac4bb03c31fa5e53e7438f3"} Oct 07 12:42:16 crc kubenswrapper[4854]: I1007 12:42:16.874673 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-jbv7l" podStartSLOduration=3.145656345 podStartE2EDuration="38.874655312s" podCreationTimestamp="2025-10-07 12:41:38 +0000 UTC" firstStartedPulling="2025-10-07 12:41:39.903130096 +0000 UTC m=+1015.890962351" lastFinishedPulling="2025-10-07 12:42:15.632129063 +0000 UTC m=+1051.619961318" observedRunningTime="2025-10-07 12:42:16.86531924 +0000 UTC m=+1052.853151495" watchObservedRunningTime="2025-10-07 12:42:16.874655312 +0000 UTC m=+1052.862487567" Oct 07 12:42:16 crc kubenswrapper[4854]: I1007 12:42:16.877590 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9dc203ca-4a7b-4508-87d6-4922a76a1fb7","Type":"ContainerStarted","Data":"374a7faa09b1f2933202f50219d10eed6a8601aaa44361171ee0ef98440e6c0c"} Oct 07 12:42:16 crc kubenswrapper[4854]: I1007 12:42:16.881685 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-hnhnd" podStartSLOduration=2.42429222 podStartE2EDuration="35.881662036s" podCreationTimestamp="2025-10-07 12:41:41 +0000 UTC" firstStartedPulling="2025-10-07 12:41:42.746288775 +0000 UTC m=+1018.734121040" lastFinishedPulling="2025-10-07 12:42:16.203658601 +0000 UTC m=+1052.191490856" observedRunningTime="2025-10-07 12:42:16.880705448 +0000 UTC m=+1052.868537703" watchObservedRunningTime="2025-10-07 12:42:16.881662036 +0000 UTC m=+1052.869494291" Oct 07 12:42:16 crc kubenswrapper[4854]: I1007 12:42:16.883628 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"211f81db-7ca9-43c6-bd31-aa60758129e6","Type":"ContainerStarted","Data":"b5e263b306dfcc4c0f407799fb0b62d3e3994d3c3b32517a3c1fbdeafab18602"} Oct 07 12:42:16 crc kubenswrapper[4854]: I1007 12:42:16.887173 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b6837285-79fa-44f5-83bd-5d1e5959cb8a","Type":"ContainerStarted","Data":"5a5aba34501996755646bff327da7dc06f8295afd26fd3399b7b27ec146c57c0"} Oct 07 12:42:16 crc kubenswrapper[4854]: I1007 12:42:16.891582 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-frlpd" event={"ID":"0c69757e-7365-47af-a7ef-03c00a3aae33","Type":"ContainerStarted","Data":"4c83e5b21d149d18964b200c85f6be36b591a3e55657f010be96d19c43e54add"} Oct 07 12:42:17 crc kubenswrapper[4854]: I1007 12:42:17.902677 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b6837285-79fa-44f5-83bd-5d1e5959cb8a","Type":"ContainerStarted","Data":"00e4df36e042f5a5d48c09c816b2ba5f585be063679da01da6210f205b34bcd6"} Oct 07 12:42:17 crc kubenswrapper[4854]: I1007 12:42:17.914398 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9dc203ca-4a7b-4508-87d6-4922a76a1fb7","Type":"ContainerStarted","Data":"a95bd116db9ea390605319534fc00572f56d0206068c040f76d694264158b473"} Oct 07 12:42:17 crc kubenswrapper[4854]: I1007 12:42:17.926419 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-frlpd" podStartSLOduration=31.926401612 podStartE2EDuration="31.926401612s" podCreationTimestamp="2025-10-07 12:41:46 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:42:16.907916 +0000 UTC m=+1052.895748255" watchObservedRunningTime="2025-10-07 12:42:17.926401612 +0000 UTC m=+1053.914233867" Oct 07 12:42:17 crc kubenswrapper[4854]: I1007 12:42:17.930473 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=31.93046056 podStartE2EDuration="31.93046056s" podCreationTimestamp="2025-10-07 12:41:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:42:17.928374739 +0000 UTC m=+1053.916207004" watchObservedRunningTime="2025-10-07 12:42:17.93046056 +0000 UTC m=+1053.918292805" Oct 07 12:42:21 crc kubenswrapper[4854]: I1007 12:42:21.238907 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 07 12:42:21 crc kubenswrapper[4854]: I1007 12:42:21.239575 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 07 12:42:21 crc kubenswrapper[4854]: I1007 12:42:21.239599 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 07 12:42:21 crc kubenswrapper[4854]: I1007 12:42:21.239620 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 07 12:42:21 crc kubenswrapper[4854]: I1007 12:42:21.279430 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 07 12:42:21 crc kubenswrapper[4854]: I1007 12:42:21.290574 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 07 12:42:21 crc kubenswrapper[4854]: I1007 12:42:21.305569 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=31.305548494 podStartE2EDuration="31.305548494s" podCreationTimestamp="2025-10-07 12:41:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:42:17.960448372 +0000 UTC m=+1053.948280647" watchObservedRunningTime="2025-10-07 12:42:21.305548494 +0000 UTC m=+1057.293380759" Oct 07 12:42:22 crc kubenswrapper[4854]: I1007 12:42:22.964182 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"211f81db-7ca9-43c6-bd31-aa60758129e6","Type":"ContainerStarted","Data":"d4b7f229a72f8b20f9d2daabf20383996d124c9b9aa548544b423f15eadcaaf0"} Oct 07 12:42:22 crc kubenswrapper[4854]: I1007 12:42:22.967183 4854 generic.go:334] "Generic (PLEG): container finished" podID="0c69757e-7365-47af-a7ef-03c00a3aae33" containerID="4c83e5b21d149d18964b200c85f6be36b591a3e55657f010be96d19c43e54add" exitCode=0 Oct 07 12:42:22 crc kubenswrapper[4854]: I1007 12:42:22.967227 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-frlpd" event={"ID":"0c69757e-7365-47af-a7ef-03c00a3aae33","Type":"ContainerDied","Data":"4c83e5b21d149d18964b200c85f6be36b591a3e55657f010be96d19c43e54add"} Oct 07 12:42:23 crc kubenswrapper[4854]: I1007 12:42:23.977639 4854 generic.go:334] "Generic (PLEG): container finished" podID="06a39602-5206-42d4-a283-31650db9bd54" 
containerID="71c7c2899fa0ebe6be9173463561bc932028e5a6606d93d0e217bab53577b794" exitCode=0 Oct 07 12:42:23 crc kubenswrapper[4854]: I1007 12:42:23.978118 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jbv7l" event={"ID":"06a39602-5206-42d4-a283-31650db9bd54","Type":"ContainerDied","Data":"71c7c2899fa0ebe6be9173463561bc932028e5a6606d93d0e217bab53577b794"} Oct 07 12:42:24 crc kubenswrapper[4854]: I1007 12:42:24.098752 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 07 12:42:24 crc kubenswrapper[4854]: I1007 12:42:24.415006 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-frlpd" Oct 07 12:42:24 crc kubenswrapper[4854]: I1007 12:42:24.505350 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0c69757e-7365-47af-a7ef-03c00a3aae33-credential-keys\") pod \"0c69757e-7365-47af-a7ef-03c00a3aae33\" (UID: \"0c69757e-7365-47af-a7ef-03c00a3aae33\") " Oct 07 12:42:24 crc kubenswrapper[4854]: I1007 12:42:24.505404 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c69757e-7365-47af-a7ef-03c00a3aae33-config-data\") pod \"0c69757e-7365-47af-a7ef-03c00a3aae33\" (UID: \"0c69757e-7365-47af-a7ef-03c00a3aae33\") " Oct 07 12:42:24 crc kubenswrapper[4854]: I1007 12:42:24.505446 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c69757e-7365-47af-a7ef-03c00a3aae33-combined-ca-bundle\") pod \"0c69757e-7365-47af-a7ef-03c00a3aae33\" (UID: \"0c69757e-7365-47af-a7ef-03c00a3aae33\") " Oct 07 12:42:24 crc kubenswrapper[4854]: I1007 12:42:24.505566 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82wmb\" (UniqueName: \"kubernetes.io/projected/0c69757e-7365-47af-a7ef-03c00a3aae33-kube-api-access-82wmb\") pod \"0c69757e-7365-47af-a7ef-03c00a3aae33\" (UID: \"0c69757e-7365-47af-a7ef-03c00a3aae33\") " Oct 07 12:42:24 crc kubenswrapper[4854]: I1007 12:42:24.505586 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0c69757e-7365-47af-a7ef-03c00a3aae33-fernet-keys\") pod \"0c69757e-7365-47af-a7ef-03c00a3aae33\" (UID: \"0c69757e-7365-47af-a7ef-03c00a3aae33\") " Oct 07 12:42:24 crc kubenswrapper[4854]: I1007 12:42:24.505607 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c69757e-7365-47af-a7ef-03c00a3aae33-scripts\") pod \"0c69757e-7365-47af-a7ef-03c00a3aae33\" (UID: \"0c69757e-7365-47af-a7ef-03c00a3aae33\") " Oct 07 12:42:24 crc kubenswrapper[4854]: I1007 12:42:24.511222 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c69757e-7365-47af-a7ef-03c00a3aae33-kube-api-access-82wmb" (OuterVolumeSpecName: "kube-api-access-82wmb") pod "0c69757e-7365-47af-a7ef-03c00a3aae33" (UID: "0c69757e-7365-47af-a7ef-03c00a3aae33"). InnerVolumeSpecName "kube-api-access-82wmb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:42:24 crc kubenswrapper[4854]: I1007 12:42:24.525223 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c69757e-7365-47af-a7ef-03c00a3aae33-scripts" (OuterVolumeSpecName: "scripts") pod "0c69757e-7365-47af-a7ef-03c00a3aae33" (UID: "0c69757e-7365-47af-a7ef-03c00a3aae33"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:42:24 crc kubenswrapper[4854]: I1007 12:42:24.526287 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c69757e-7365-47af-a7ef-03c00a3aae33-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "0c69757e-7365-47af-a7ef-03c00a3aae33" (UID: "0c69757e-7365-47af-a7ef-03c00a3aae33"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:42:24 crc kubenswrapper[4854]: I1007 12:42:24.527085 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c69757e-7365-47af-a7ef-03c00a3aae33-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "0c69757e-7365-47af-a7ef-03c00a3aae33" (UID: "0c69757e-7365-47af-a7ef-03c00a3aae33"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:42:24 crc kubenswrapper[4854]: I1007 12:42:24.539641 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c69757e-7365-47af-a7ef-03c00a3aae33-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0c69757e-7365-47af-a7ef-03c00a3aae33" (UID: "0c69757e-7365-47af-a7ef-03c00a3aae33"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:42:24 crc kubenswrapper[4854]: I1007 12:42:24.541376 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c69757e-7365-47af-a7ef-03c00a3aae33-config-data" (OuterVolumeSpecName: "config-data") pod "0c69757e-7365-47af-a7ef-03c00a3aae33" (UID: "0c69757e-7365-47af-a7ef-03c00a3aae33"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:42:24 crc kubenswrapper[4854]: I1007 12:42:24.608900 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82wmb\" (UniqueName: \"kubernetes.io/projected/0c69757e-7365-47af-a7ef-03c00a3aae33-kube-api-access-82wmb\") on node \"crc\" DevicePath \"\"" Oct 07 12:42:24 crc kubenswrapper[4854]: I1007 12:42:24.608934 4854 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0c69757e-7365-47af-a7ef-03c00a3aae33-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 07 12:42:24 crc kubenswrapper[4854]: I1007 12:42:24.608945 4854 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c69757e-7365-47af-a7ef-03c00a3aae33-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 12:42:24 crc kubenswrapper[4854]: I1007 12:42:24.608954 4854 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0c69757e-7365-47af-a7ef-03c00a3aae33-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 07 12:42:24 crc kubenswrapper[4854]: I1007 12:42:24.608961 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c69757e-7365-47af-a7ef-03c00a3aae33-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 12:42:24 crc kubenswrapper[4854]: I1007 12:42:24.608971 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c69757e-7365-47af-a7ef-03c00a3aae33-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:42:24 crc kubenswrapper[4854]: I1007 12:42:24.883733 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 07 12:42:24 crc kubenswrapper[4854]: I1007 12:42:24.995305 4854 generic.go:334] "Generic (PLEG): container finished" podID="57eafbaf-440c-432a-a1ab-55032cd2f54a" containerID="22df39064c8d3fc8a015bb827b717afaeaa993111ac4bb03c31fa5e53e7438f3" exitCode=0 Oct 07 12:42:24 crc kubenswrapper[4854]: I1007 12:42:24.995384 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-hnhnd" event={"ID":"57eafbaf-440c-432a-a1ab-55032cd2f54a","Type":"ContainerDied","Data":"22df39064c8d3fc8a015bb827b717afaeaa993111ac4bb03c31fa5e53e7438f3"} Oct 07 12:42:24 crc kubenswrapper[4854]: I1007 12:42:24.997294 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-frlpd" Oct 07 12:42:24 crc kubenswrapper[4854]: I1007 12:42:24.997290 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-frlpd" event={"ID":"0c69757e-7365-47af-a7ef-03c00a3aae33","Type":"ContainerDied","Data":"1c2f97b2e6b6704dd843badd4006d7c83da483d66d964c05ea4b614471e9641e"} Oct 07 12:42:24 crc kubenswrapper[4854]: I1007 12:42:24.997684 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c2f97b2e6b6704dd843badd4006d7c83da483d66d964c05ea4b614471e9641e" Oct 07 12:42:25 crc kubenswrapper[4854]: I1007 12:42:25.136012 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-775676c768-frcfp"] Oct 07 12:42:25 crc kubenswrapper[4854]: E1007 12:42:25.136754 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45b87c2d-9143-42f2-84b8-14db6534b894" containerName="dnsmasq-dns" Oct 07 12:42:25 crc kubenswrapper[4854]: I1007 12:42:25.136772 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="45b87c2d-9143-42f2-84b8-14db6534b894" containerName="dnsmasq-dns" Oct 07 12:42:25 crc kubenswrapper[4854]: E1007 12:42:25.136924 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c69757e-7365-47af-a7ef-03c00a3aae33" containerName="keystone-bootstrap" Oct 07 12:42:25 crc kubenswrapper[4854]: I1007 12:42:25.136934 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c69757e-7365-47af-a7ef-03c00a3aae33" containerName="keystone-bootstrap" Oct 07 12:42:25 crc kubenswrapper[4854]: E1007 12:42:25.136951 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45b87c2d-9143-42f2-84b8-14db6534b894" containerName="init" Oct 07 12:42:25 crc kubenswrapper[4854]: I1007 12:42:25.136958 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="45b87c2d-9143-42f2-84b8-14db6534b894" containerName="init" Oct 07 12:42:25 crc kubenswrapper[4854]: I1007 12:42:25.137126 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c69757e-7365-47af-a7ef-03c00a3aae33" containerName="keystone-bootstrap" Oct 07 12:42:25 crc kubenswrapper[4854]: I1007 12:42:25.137164 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="45b87c2d-9143-42f2-84b8-14db6534b894" containerName="dnsmasq-dns" Oct 07 12:42:25 crc kubenswrapper[4854]: I1007 12:42:25.137992 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-775676c768-frcfp" Oct 07 12:42:25 crc kubenswrapper[4854]: I1007 12:42:25.140953 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 07 12:42:25 crc kubenswrapper[4854]: I1007 12:42:25.141162 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 07 12:42:25 crc kubenswrapper[4854]: I1007 12:42:25.141800 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Oct 07 12:42:25 crc kubenswrapper[4854]: I1007 12:42:25.141930 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 07 12:42:25 crc kubenswrapper[4854]: I1007 12:42:25.142355 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Oct 07 12:42:25 crc kubenswrapper[4854]: I1007 12:42:25.144217 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-m9mdj" Oct 07 12:42:25 crc kubenswrapper[4854]: I1007 12:42:25.148622 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-775676c768-frcfp"] Oct 07 12:42:25 crc kubenswrapper[4854]: I1007 12:42:25.221247 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9869a8d4-db8e-4aba-82d1-6d02c3cf988e-public-tls-certs\") pod \"keystone-775676c768-frcfp\" (UID: \"9869a8d4-db8e-4aba-82d1-6d02c3cf988e\") " pod="openstack/keystone-775676c768-frcfp" Oct 07 12:42:25 crc kubenswrapper[4854]: I1007 12:42:25.221310 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9869a8d4-db8e-4aba-82d1-6d02c3cf988e-credential-keys\") pod \"keystone-775676c768-frcfp\" (UID: \"9869a8d4-db8e-4aba-82d1-6d02c3cf988e\") " pod="openstack/keystone-775676c768-frcfp" Oct 07 12:42:25 crc kubenswrapper[4854]: I1007 12:42:25.221337 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9869a8d4-db8e-4aba-82d1-6d02c3cf988e-config-data\") pod \"keystone-775676c768-frcfp\" (UID: \"9869a8d4-db8e-4aba-82d1-6d02c3cf988e\") " pod="openstack/keystone-775676c768-frcfp" Oct 07 12:42:25 crc kubenswrapper[4854]: I1007 12:42:25.221361 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9869a8d4-db8e-4aba-82d1-6d02c3cf988e-fernet-keys\") pod \"keystone-775676c768-frcfp\" (UID: \"9869a8d4-db8e-4aba-82d1-6d02c3cf988e\") " pod="openstack/keystone-775676c768-frcfp" Oct 07 12:42:25 crc kubenswrapper[4854]: I1007 12:42:25.221594 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9869a8d4-db8e-4aba-82d1-6d02c3cf988e-internal-tls-certs\") pod \"keystone-775676c768-frcfp\" (UID: \"9869a8d4-db8e-4aba-82d1-6d02c3cf988e\") " pod="openstack/keystone-775676c768-frcfp" Oct 07 12:42:25 crc kubenswrapper[4854]: I1007 12:42:25.221705 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clfkj\" (UniqueName: \"kubernetes.io/projected/9869a8d4-db8e-4aba-82d1-6d02c3cf988e-kube-api-access-clfkj\") pod \"keystone-775676c768-frcfp\" (UID: 
\"9869a8d4-db8e-4aba-82d1-6d02c3cf988e\") " pod="openstack/keystone-775676c768-frcfp" Oct 07 12:42:25 crc kubenswrapper[4854]: I1007 12:42:25.221745 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9869a8d4-db8e-4aba-82d1-6d02c3cf988e-combined-ca-bundle\") pod \"keystone-775676c768-frcfp\" (UID: \"9869a8d4-db8e-4aba-82d1-6d02c3cf988e\") " pod="openstack/keystone-775676c768-frcfp" Oct 07 12:42:25 crc kubenswrapper[4854]: I1007 12:42:25.221769 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9869a8d4-db8e-4aba-82d1-6d02c3cf988e-scripts\") pod \"keystone-775676c768-frcfp\" (UID: \"9869a8d4-db8e-4aba-82d1-6d02c3cf988e\") " pod="openstack/keystone-775676c768-frcfp" Oct 07 12:42:25 crc kubenswrapper[4854]: I1007 12:42:25.322084 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-jbv7l" Oct 07 12:42:25 crc kubenswrapper[4854]: I1007 12:42:25.323278 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9869a8d4-db8e-4aba-82d1-6d02c3cf988e-internal-tls-certs\") pod \"keystone-775676c768-frcfp\" (UID: \"9869a8d4-db8e-4aba-82d1-6d02c3cf988e\") " pod="openstack/keystone-775676c768-frcfp" Oct 07 12:42:25 crc kubenswrapper[4854]: I1007 12:42:25.323342 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clfkj\" (UniqueName: \"kubernetes.io/projected/9869a8d4-db8e-4aba-82d1-6d02c3cf988e-kube-api-access-clfkj\") pod \"keystone-775676c768-frcfp\" (UID: \"9869a8d4-db8e-4aba-82d1-6d02c3cf988e\") " pod="openstack/keystone-775676c768-frcfp" Oct 07 12:42:25 crc kubenswrapper[4854]: I1007 12:42:25.323963 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9869a8d4-db8e-4aba-82d1-6d02c3cf988e-combined-ca-bundle\") pod \"keystone-775676c768-frcfp\" (UID: \"9869a8d4-db8e-4aba-82d1-6d02c3cf988e\") " pod="openstack/keystone-775676c768-frcfp" Oct 07 12:42:25 crc kubenswrapper[4854]: I1007 12:42:25.324031 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9869a8d4-db8e-4aba-82d1-6d02c3cf988e-scripts\") pod \"keystone-775676c768-frcfp\" (UID: \"9869a8d4-db8e-4aba-82d1-6d02c3cf988e\") " pod="openstack/keystone-775676c768-frcfp" Oct 07 12:42:25 crc kubenswrapper[4854]: I1007 12:42:25.324100 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9869a8d4-db8e-4aba-82d1-6d02c3cf988e-public-tls-certs\") pod \"keystone-775676c768-frcfp\" (UID: \"9869a8d4-db8e-4aba-82d1-6d02c3cf988e\") " pod="openstack/keystone-775676c768-frcfp" Oct 07 12:42:25 crc kubenswrapper[4854]: I1007 12:42:25.324190 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9869a8d4-db8e-4aba-82d1-6d02c3cf988e-credential-keys\") pod \"keystone-775676c768-frcfp\" (UID: \"9869a8d4-db8e-4aba-82d1-6d02c3cf988e\") " pod="openstack/keystone-775676c768-frcfp" Oct 07 12:42:25 crc kubenswrapper[4854]: I1007 12:42:25.324312 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/9869a8d4-db8e-4aba-82d1-6d02c3cf988e-config-data\") pod \"keystone-775676c768-frcfp\" (UID: \"9869a8d4-db8e-4aba-82d1-6d02c3cf988e\") " pod="openstack/keystone-775676c768-frcfp" Oct 07 12:42:25 crc kubenswrapper[4854]: I1007 12:42:25.324383 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9869a8d4-db8e-4aba-82d1-6d02c3cf988e-fernet-keys\") pod \"keystone-775676c768-frcfp\" (UID: \"9869a8d4-db8e-4aba-82d1-6d02c3cf988e\") " pod="openstack/keystone-775676c768-frcfp" Oct 07 12:42:25 crc kubenswrapper[4854]: I1007 12:42:25.328876 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9869a8d4-db8e-4aba-82d1-6d02c3cf988e-internal-tls-certs\") pod \"keystone-775676c768-frcfp\" (UID: \"9869a8d4-db8e-4aba-82d1-6d02c3cf988e\") " pod="openstack/keystone-775676c768-frcfp" Oct 07 12:42:25 crc kubenswrapper[4854]: I1007 12:42:25.330750 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9869a8d4-db8e-4aba-82d1-6d02c3cf988e-fernet-keys\") pod \"keystone-775676c768-frcfp\" (UID: \"9869a8d4-db8e-4aba-82d1-6d02c3cf988e\") " pod="openstack/keystone-775676c768-frcfp" Oct 07 12:42:25 crc kubenswrapper[4854]: I1007 12:42:25.338933 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9869a8d4-db8e-4aba-82d1-6d02c3cf988e-public-tls-certs\") pod \"keystone-775676c768-frcfp\" (UID: \"9869a8d4-db8e-4aba-82d1-6d02c3cf988e\") " pod="openstack/keystone-775676c768-frcfp" Oct 07 12:42:25 crc kubenswrapper[4854]: I1007 12:42:25.343079 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9869a8d4-db8e-4aba-82d1-6d02c3cf988e-config-data\") pod \"keystone-775676c768-frcfp\" (UID: \"9869a8d4-db8e-4aba-82d1-6d02c3cf988e\") " pod="openstack/keystone-775676c768-frcfp" Oct 07 12:42:25 crc kubenswrapper[4854]: I1007 12:42:25.344783 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9869a8d4-db8e-4aba-82d1-6d02c3cf988e-credential-keys\") pod \"keystone-775676c768-frcfp\" (UID: \"9869a8d4-db8e-4aba-82d1-6d02c3cf988e\") " pod="openstack/keystone-775676c768-frcfp" Oct 07 12:42:25 crc kubenswrapper[4854]: I1007 12:42:25.349839 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clfkj\" (UniqueName: \"kubernetes.io/projected/9869a8d4-db8e-4aba-82d1-6d02c3cf988e-kube-api-access-clfkj\") pod \"keystone-775676c768-frcfp\" (UID: \"9869a8d4-db8e-4aba-82d1-6d02c3cf988e\") " pod="openstack/keystone-775676c768-frcfp" Oct 07 12:42:25 crc kubenswrapper[4854]: I1007 12:42:25.350791 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9869a8d4-db8e-4aba-82d1-6d02c3cf988e-combined-ca-bundle\") pod \"keystone-775676c768-frcfp\" (UID: \"9869a8d4-db8e-4aba-82d1-6d02c3cf988e\") " pod="openstack/keystone-775676c768-frcfp" Oct 07 12:42:25 crc kubenswrapper[4854]: I1007 12:42:25.354735 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9869a8d4-db8e-4aba-82d1-6d02c3cf988e-scripts\") pod \"keystone-775676c768-frcfp\" (UID: \"9869a8d4-db8e-4aba-82d1-6d02c3cf988e\") " 
pod="openstack/keystone-775676c768-frcfp" Oct 07 12:42:25 crc kubenswrapper[4854]: I1007 12:42:25.425422 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06a39602-5206-42d4-a283-31650db9bd54-combined-ca-bundle\") pod \"06a39602-5206-42d4-a283-31650db9bd54\" (UID: \"06a39602-5206-42d4-a283-31650db9bd54\") " Oct 07 12:42:25 crc kubenswrapper[4854]: I1007 12:42:25.425483 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06a39602-5206-42d4-a283-31650db9bd54-scripts\") pod \"06a39602-5206-42d4-a283-31650db9bd54\" (UID: \"06a39602-5206-42d4-a283-31650db9bd54\") " Oct 07 12:42:25 crc kubenswrapper[4854]: I1007 12:42:25.425505 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06a39602-5206-42d4-a283-31650db9bd54-logs\") pod \"06a39602-5206-42d4-a283-31650db9bd54\" (UID: \"06a39602-5206-42d4-a283-31650db9bd54\") " Oct 07 12:42:25 crc kubenswrapper[4854]: I1007 12:42:25.426000 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06a39602-5206-42d4-a283-31650db9bd54-logs" (OuterVolumeSpecName: "logs") pod "06a39602-5206-42d4-a283-31650db9bd54" (UID: "06a39602-5206-42d4-a283-31650db9bd54"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:42:25 crc kubenswrapper[4854]: I1007 12:42:25.425546 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sf6c2\" (UniqueName: \"kubernetes.io/projected/06a39602-5206-42d4-a283-31650db9bd54-kube-api-access-sf6c2\") pod \"06a39602-5206-42d4-a283-31650db9bd54\" (UID: \"06a39602-5206-42d4-a283-31650db9bd54\") " Oct 07 12:42:25 crc kubenswrapper[4854]: I1007 12:42:25.426563 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06a39602-5206-42d4-a283-31650db9bd54-config-data\") pod \"06a39602-5206-42d4-a283-31650db9bd54\" (UID: \"06a39602-5206-42d4-a283-31650db9bd54\") " Oct 07 12:42:25 crc kubenswrapper[4854]: I1007 12:42:25.427005 4854 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06a39602-5206-42d4-a283-31650db9bd54-logs\") on node \"crc\" DevicePath \"\"" Oct 07 12:42:25 crc kubenswrapper[4854]: I1007 12:42:25.429766 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06a39602-5206-42d4-a283-31650db9bd54-kube-api-access-sf6c2" (OuterVolumeSpecName: "kube-api-access-sf6c2") pod "06a39602-5206-42d4-a283-31650db9bd54" (UID: "06a39602-5206-42d4-a283-31650db9bd54"). InnerVolumeSpecName "kube-api-access-sf6c2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:42:25 crc kubenswrapper[4854]: I1007 12:42:25.429795 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06a39602-5206-42d4-a283-31650db9bd54-scripts" (OuterVolumeSpecName: "scripts") pod "06a39602-5206-42d4-a283-31650db9bd54" (UID: "06a39602-5206-42d4-a283-31650db9bd54"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:42:25 crc kubenswrapper[4854]: I1007 12:42:25.449131 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06a39602-5206-42d4-a283-31650db9bd54-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "06a39602-5206-42d4-a283-31650db9bd54" (UID: "06a39602-5206-42d4-a283-31650db9bd54"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:42:25 crc kubenswrapper[4854]: I1007 12:42:25.459068 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06a39602-5206-42d4-a283-31650db9bd54-config-data" (OuterVolumeSpecName: "config-data") pod "06a39602-5206-42d4-a283-31650db9bd54" (UID: "06a39602-5206-42d4-a283-31650db9bd54"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:42:25 crc kubenswrapper[4854]: I1007 12:42:25.466276 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-775676c768-frcfp" Oct 07 12:42:25 crc kubenswrapper[4854]: I1007 12:42:25.528390 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06a39602-5206-42d4-a283-31650db9bd54-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:42:25 crc kubenswrapper[4854]: I1007 12:42:25.528709 4854 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06a39602-5206-42d4-a283-31650db9bd54-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 12:42:25 crc kubenswrapper[4854]: I1007 12:42:25.528720 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sf6c2\" (UniqueName: \"kubernetes.io/projected/06a39602-5206-42d4-a283-31650db9bd54-kube-api-access-sf6c2\") on node \"crc\" DevicePath \"\"" Oct 07 12:42:25 crc kubenswrapper[4854]: I1007 12:42:25.528731 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06a39602-5206-42d4-a283-31650db9bd54-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 12:42:25 crc kubenswrapper[4854]: I1007 12:42:25.920052 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-775676c768-frcfp"] Oct 07 12:42:25 crc kubenswrapper[4854]: W1007 12:42:25.930189 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9869a8d4_db8e_4aba_82d1_6d02c3cf988e.slice/crio-f09c0ef913c08c234c9f9000f73fe2841754c0ba4b940bdb749ba13056666d4a WatchSource:0}: Error finding container f09c0ef913c08c234c9f9000f73fe2841754c0ba4b940bdb749ba13056666d4a: Status 404 returned error can't find the container with id f09c0ef913c08c234c9f9000f73fe2841754c0ba4b940bdb749ba13056666d4a Oct 07 12:42:26 crc kubenswrapper[4854]: I1007 12:42:26.008481 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-775676c768-frcfp" event={"ID":"9869a8d4-db8e-4aba-82d1-6d02c3cf988e","Type":"ContainerStarted","Data":"f09c0ef913c08c234c9f9000f73fe2841754c0ba4b940bdb749ba13056666d4a"} Oct 07 12:42:26 crc kubenswrapper[4854]: I1007 12:42:26.010580 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-jbv7l" Oct 07 12:42:26 crc kubenswrapper[4854]: I1007 12:42:26.010650 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jbv7l" event={"ID":"06a39602-5206-42d4-a283-31650db9bd54","Type":"ContainerDied","Data":"4fef353d351699b99c8f1ea49ee901a8d5ff872a0f86307280b3c7dae97f5c2f"} Oct 07 12:42:26 crc kubenswrapper[4854]: I1007 12:42:26.010685 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4fef353d351699b99c8f1ea49ee901a8d5ff872a0f86307280b3c7dae97f5c2f" Oct 07 12:42:26 crc kubenswrapper[4854]: I1007 12:42:26.161394 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6f54bfd6b4-g5gq4"] Oct 07 12:42:26 crc kubenswrapper[4854]: E1007 12:42:26.161759 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06a39602-5206-42d4-a283-31650db9bd54" containerName="placement-db-sync" Oct 07 12:42:26 crc kubenswrapper[4854]: I1007 12:42:26.161776 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="06a39602-5206-42d4-a283-31650db9bd54" containerName="placement-db-sync" Oct 07 12:42:26 crc kubenswrapper[4854]: I1007 12:42:26.161972 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="06a39602-5206-42d4-a283-31650db9bd54" containerName="placement-db-sync" Oct 07 12:42:26 crc kubenswrapper[4854]: I1007 12:42:26.162854 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6f54bfd6b4-g5gq4" Oct 07 12:42:26 crc kubenswrapper[4854]: I1007 12:42:26.167122 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Oct 07 12:42:26 crc kubenswrapper[4854]: I1007 12:42:26.167333 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-phbmn" Oct 07 12:42:26 crc kubenswrapper[4854]: I1007 12:42:26.167469 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 07 12:42:26 crc kubenswrapper[4854]: I1007 12:42:26.171819 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Oct 07 12:42:26 crc kubenswrapper[4854]: I1007 12:42:26.172674 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 07 12:42:26 crc kubenswrapper[4854]: I1007 12:42:26.184354 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6f54bfd6b4-g5gq4"] Oct 07 12:42:26 crc kubenswrapper[4854]: I1007 12:42:26.241735 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a03c4a0d-6346-43e4-8db1-f653b5dfa420-scripts\") pod \"placement-6f54bfd6b4-g5gq4\" (UID: \"a03c4a0d-6346-43e4-8db1-f653b5dfa420\") " pod="openstack/placement-6f54bfd6b4-g5gq4" Oct 07 12:42:26 crc kubenswrapper[4854]: I1007 12:42:26.241866 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a03c4a0d-6346-43e4-8db1-f653b5dfa420-logs\") pod \"placement-6f54bfd6b4-g5gq4\" (UID: \"a03c4a0d-6346-43e4-8db1-f653b5dfa420\") " pod="openstack/placement-6f54bfd6b4-g5gq4" Oct 07 12:42:26 crc kubenswrapper[4854]: I1007 12:42:26.241932 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a03c4a0d-6346-43e4-8db1-f653b5dfa420-internal-tls-certs\") pod \"placement-6f54bfd6b4-g5gq4\" (UID: \"a03c4a0d-6346-43e4-8db1-f653b5dfa420\") " pod="openstack/placement-6f54bfd6b4-g5gq4" Oct 07 12:42:26 crc kubenswrapper[4854]: I1007 12:42:26.242211 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a03c4a0d-6346-43e4-8db1-f653b5dfa420-combined-ca-bundle\") pod \"placement-6f54bfd6b4-g5gq4\" (UID: \"a03c4a0d-6346-43e4-8db1-f653b5dfa420\") " pod="openstack/placement-6f54bfd6b4-g5gq4" Oct 07 12:42:26 crc kubenswrapper[4854]: I1007 12:42:26.242254 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a03c4a0d-6346-43e4-8db1-f653b5dfa420-public-tls-certs\") pod \"placement-6f54bfd6b4-g5gq4\" (UID: \"a03c4a0d-6346-43e4-8db1-f653b5dfa420\") " pod="openstack/placement-6f54bfd6b4-g5gq4" Oct 07 12:42:26 crc kubenswrapper[4854]: I1007 12:42:26.242289 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwgrc\" (UniqueName: \"kubernetes.io/projected/a03c4a0d-6346-43e4-8db1-f653b5dfa420-kube-api-access-vwgrc\") pod \"placement-6f54bfd6b4-g5gq4\" (UID: \"a03c4a0d-6346-43e4-8db1-f653b5dfa420\") " pod="openstack/placement-6f54bfd6b4-g5gq4" Oct 07 12:42:26 crc kubenswrapper[4854]: I1007 12:42:26.242328 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a03c4a0d-6346-43e4-8db1-f653b5dfa420-config-data\") pod \"placement-6f54bfd6b4-g5gq4\" (UID: \"a03c4a0d-6346-43e4-8db1-f653b5dfa420\") " pod="openstack/placement-6f54bfd6b4-g5gq4" Oct 07 12:42:26 crc kubenswrapper[4854]: I1007 12:42:26.344272 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a03c4a0d-6346-43e4-8db1-f653b5dfa420-scripts\") pod \"placement-6f54bfd6b4-g5gq4\" (UID: \"a03c4a0d-6346-43e4-8db1-f653b5dfa420\") " pod="openstack/placement-6f54bfd6b4-g5gq4" Oct 07 12:42:26 crc kubenswrapper[4854]: I1007 12:42:26.344343 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a03c4a0d-6346-43e4-8db1-f653b5dfa420-logs\") pod \"placement-6f54bfd6b4-g5gq4\" (UID: \"a03c4a0d-6346-43e4-8db1-f653b5dfa420\") " pod="openstack/placement-6f54bfd6b4-g5gq4" Oct 07 12:42:26 crc kubenswrapper[4854]: I1007 12:42:26.344379 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a03c4a0d-6346-43e4-8db1-f653b5dfa420-internal-tls-certs\") pod \"placement-6f54bfd6b4-g5gq4\" (UID: \"a03c4a0d-6346-43e4-8db1-f653b5dfa420\") " pod="openstack/placement-6f54bfd6b4-g5gq4" Oct 07 12:42:26 crc kubenswrapper[4854]: I1007 12:42:26.344484 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a03c4a0d-6346-43e4-8db1-f653b5dfa420-combined-ca-bundle\") pod \"placement-6f54bfd6b4-g5gq4\" (UID: \"a03c4a0d-6346-43e4-8db1-f653b5dfa420\") " pod="openstack/placement-6f54bfd6b4-g5gq4" Oct 07 12:42:26 crc kubenswrapper[4854]: I1007 12:42:26.344505 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a03c4a0d-6346-43e4-8db1-f653b5dfa420-public-tls-certs\") pod \"placement-6f54bfd6b4-g5gq4\" (UID: \"a03c4a0d-6346-43e4-8db1-f653b5dfa420\") " pod="openstack/placement-6f54bfd6b4-g5gq4" Oct 07 12:42:26 crc kubenswrapper[4854]: I1007 12:42:26.344538 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwgrc\" (UniqueName: \"kubernetes.io/projected/a03c4a0d-6346-43e4-8db1-f653b5dfa420-kube-api-access-vwgrc\") pod \"placement-6f54bfd6b4-g5gq4\" (UID: \"a03c4a0d-6346-43e4-8db1-f653b5dfa420\") " pod="openstack/placement-6f54bfd6b4-g5gq4" Oct 07 12:42:26 crc kubenswrapper[4854]: I1007 12:42:26.344558 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a03c4a0d-6346-43e4-8db1-f653b5dfa420-config-data\") pod \"placement-6f54bfd6b4-g5gq4\" (UID: \"a03c4a0d-6346-43e4-8db1-f653b5dfa420\") " pod="openstack/placement-6f54bfd6b4-g5gq4" Oct 07 12:42:26 crc kubenswrapper[4854]: I1007 12:42:26.345435 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a03c4a0d-6346-43e4-8db1-f653b5dfa420-logs\") pod \"placement-6f54bfd6b4-g5gq4\" (UID: \"a03c4a0d-6346-43e4-8db1-f653b5dfa420\") " pod="openstack/placement-6f54bfd6b4-g5gq4" Oct 07 12:42:26 crc kubenswrapper[4854]: I1007 12:42:26.355730 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a03c4a0d-6346-43e4-8db1-f653b5dfa420-scripts\") pod \"placement-6f54bfd6b4-g5gq4\" (UID: \"a03c4a0d-6346-43e4-8db1-f653b5dfa420\") " pod="openstack/placement-6f54bfd6b4-g5gq4" Oct 07 12:42:26 crc kubenswrapper[4854]: I1007 12:42:26.355769 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a03c4a0d-6346-43e4-8db1-f653b5dfa420-internal-tls-certs\") pod \"placement-6f54bfd6b4-g5gq4\" (UID: \"a03c4a0d-6346-43e4-8db1-f653b5dfa420\") " pod="openstack/placement-6f54bfd6b4-g5gq4" Oct 07 12:42:26 crc kubenswrapper[4854]: I1007 12:42:26.355885 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a03c4a0d-6346-43e4-8db1-f653b5dfa420-combined-ca-bundle\") pod \"placement-6f54bfd6b4-g5gq4\" (UID: \"a03c4a0d-6346-43e4-8db1-f653b5dfa420\") " pod="openstack/placement-6f54bfd6b4-g5gq4" Oct 07 12:42:26 crc kubenswrapper[4854]: I1007 12:42:26.356116 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a03c4a0d-6346-43e4-8db1-f653b5dfa420-config-data\") pod \"placement-6f54bfd6b4-g5gq4\" (UID: \"a03c4a0d-6346-43e4-8db1-f653b5dfa420\") " pod="openstack/placement-6f54bfd6b4-g5gq4" Oct 07 12:42:26 crc kubenswrapper[4854]: I1007 12:42:26.356116 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a03c4a0d-6346-43e4-8db1-f653b5dfa420-public-tls-certs\") pod \"placement-6f54bfd6b4-g5gq4\" (UID: \"a03c4a0d-6346-43e4-8db1-f653b5dfa420\") " pod="openstack/placement-6f54bfd6b4-g5gq4" Oct 07 12:42:26 crc kubenswrapper[4854]: I1007 12:42:26.365918 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwgrc\" (UniqueName: \"kubernetes.io/projected/a03c4a0d-6346-43e4-8db1-f653b5dfa420-kube-api-access-vwgrc\") pod \"placement-6f54bfd6b4-g5gq4\" (UID: 
\"a03c4a0d-6346-43e4-8db1-f653b5dfa420\") " pod="openstack/placement-6f54bfd6b4-g5gq4" Oct 07 12:42:26 crc kubenswrapper[4854]: I1007 12:42:26.529880 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6f54bfd6b4-g5gq4" Oct 07 12:42:26 crc kubenswrapper[4854]: I1007 12:42:26.888451 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 07 12:42:26 crc kubenswrapper[4854]: I1007 12:42:26.888501 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 07 12:42:26 crc kubenswrapper[4854]: I1007 12:42:26.922819 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 07 12:42:26 crc kubenswrapper[4854]: I1007 12:42:26.934426 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 07 12:42:27 crc kubenswrapper[4854]: I1007 12:42:27.018179 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 07 12:42:27 crc kubenswrapper[4854]: I1007 12:42:27.018217 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 07 12:42:29 crc kubenswrapper[4854]: I1007 12:42:29.077421 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 07 12:42:29 crc kubenswrapper[4854]: I1007 12:42:29.077808 4854 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 07 12:42:29 crc kubenswrapper[4854]: I1007 12:42:29.082310 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 07 12:42:30 crc kubenswrapper[4854]: I1007 12:42:30.139320 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-hnhnd" Oct 07 12:42:30 crc kubenswrapper[4854]: I1007 12:42:30.219500 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jkcz\" (UniqueName: \"kubernetes.io/projected/57eafbaf-440c-432a-a1ab-55032cd2f54a-kube-api-access-4jkcz\") pod \"57eafbaf-440c-432a-a1ab-55032cd2f54a\" (UID: \"57eafbaf-440c-432a-a1ab-55032cd2f54a\") " Oct 07 12:42:30 crc kubenswrapper[4854]: I1007 12:42:30.219665 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/57eafbaf-440c-432a-a1ab-55032cd2f54a-db-sync-config-data\") pod \"57eafbaf-440c-432a-a1ab-55032cd2f54a\" (UID: \"57eafbaf-440c-432a-a1ab-55032cd2f54a\") " Oct 07 12:42:30 crc kubenswrapper[4854]: I1007 12:42:30.219717 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57eafbaf-440c-432a-a1ab-55032cd2f54a-combined-ca-bundle\") pod \"57eafbaf-440c-432a-a1ab-55032cd2f54a\" (UID: \"57eafbaf-440c-432a-a1ab-55032cd2f54a\") " Oct 07 12:42:30 crc kubenswrapper[4854]: I1007 12:42:30.230297 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57eafbaf-440c-432a-a1ab-55032cd2f54a-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "57eafbaf-440c-432a-a1ab-55032cd2f54a" (UID: "57eafbaf-440c-432a-a1ab-55032cd2f54a"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:42:30 crc kubenswrapper[4854]: I1007 12:42:30.230454 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57eafbaf-440c-432a-a1ab-55032cd2f54a-kube-api-access-4jkcz" (OuterVolumeSpecName: "kube-api-access-4jkcz") pod "57eafbaf-440c-432a-a1ab-55032cd2f54a" (UID: "57eafbaf-440c-432a-a1ab-55032cd2f54a"). InnerVolumeSpecName "kube-api-access-4jkcz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:42:30 crc kubenswrapper[4854]: I1007 12:42:30.251338 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57eafbaf-440c-432a-a1ab-55032cd2f54a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "57eafbaf-440c-432a-a1ab-55032cd2f54a" (UID: "57eafbaf-440c-432a-a1ab-55032cd2f54a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:42:30 crc kubenswrapper[4854]: I1007 12:42:30.321003 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4jkcz\" (UniqueName: \"kubernetes.io/projected/57eafbaf-440c-432a-a1ab-55032cd2f54a-kube-api-access-4jkcz\") on node \"crc\" DevicePath \"\"" Oct 07 12:42:30 crc kubenswrapper[4854]: I1007 12:42:30.321046 4854 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/57eafbaf-440c-432a-a1ab-55032cd2f54a-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 12:42:30 crc kubenswrapper[4854]: I1007 12:42:30.321058 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57eafbaf-440c-432a-a1ab-55032cd2f54a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.051052 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-hnhnd" event={"ID":"57eafbaf-440c-432a-a1ab-55032cd2f54a","Type":"ContainerDied","Data":"6933ceca5b59314f6307fe8916bb7d3623069b2569d09370d903ba6019f8cb0d"} Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.051433 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6933ceca5b59314f6307fe8916bb7d3623069b2569d09370d903ba6019f8cb0d" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.051541 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-hnhnd" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.408589 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-7b8cbbf7f5-8gzrj"] Oct 07 12:42:31 crc kubenswrapper[4854]: E1007 12:42:31.409090 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57eafbaf-440c-432a-a1ab-55032cd2f54a" containerName="barbican-db-sync" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.409434 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="57eafbaf-440c-432a-a1ab-55032cd2f54a" containerName="barbican-db-sync" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.409675 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="57eafbaf-440c-432a-a1ab-55032cd2f54a" containerName="barbican-db-sync" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.410837 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7b8cbbf7f5-8gzrj" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.413673 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.413923 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.414065 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-5vxvs" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.449086 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7b8cbbf7f5-8gzrj"] Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.486764 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-77f6984dd6-vxjwr"] Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.488240 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-77f6984dd6-vxjwr" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.490277 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.532990 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-77f6984dd6-vxjwr"] Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.541762 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g47gl\" (UniqueName: \"kubernetes.io/projected/77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e-kube-api-access-g47gl\") pod \"barbican-worker-7b8cbbf7f5-8gzrj\" (UID: \"77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e\") " pod="openstack/barbican-worker-7b8cbbf7f5-8gzrj" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.541844 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e-config-data\") pod \"barbican-worker-7b8cbbf7f5-8gzrj\" (UID: \"77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e\") " pod="openstack/barbican-worker-7b8cbbf7f5-8gzrj" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.541912 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e-combined-ca-bundle\") pod \"barbican-worker-7b8cbbf7f5-8gzrj\" (UID: \"77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e\") " pod="openstack/barbican-worker-7b8cbbf7f5-8gzrj" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.541972 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e-config-data-custom\") pod \"barbican-worker-7b8cbbf7f5-8gzrj\" (UID: \"77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e\") " pod="openstack/barbican-worker-7b8cbbf7f5-8gzrj" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.541999 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e-logs\") pod \"barbican-worker-7b8cbbf7f5-8gzrj\" (UID: \"77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e\") " 
pod="openstack/barbican-worker-7b8cbbf7f5-8gzrj" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.548765 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-w4hwc"] Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.551753 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-586bdc5f9-w4hwc" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.575889 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-w4hwc"] Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.622381 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6fbf67c4f8-wmrbx"] Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.623758 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6fbf67c4f8-wmrbx" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.631635 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.638675 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6fbf67c4f8-wmrbx"] Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.647918 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e-logs\") pod \"barbican-worker-7b8cbbf7f5-8gzrj\" (UID: \"77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e\") " pod="openstack/barbican-worker-7b8cbbf7f5-8gzrj" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.647980 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vcvs\" (UniqueName: \"kubernetes.io/projected/5673e957-d032-4112-b620-9f255b01d0d9-kube-api-access-2vcvs\") pod \"barbican-api-6fbf67c4f8-wmrbx\" (UID: \"5673e957-d032-4112-b620-9f255b01d0d9\") " pod="openstack/barbican-api-6fbf67c4f8-wmrbx" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.648025 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7e713bda-f49c-4d97-98c7-4ae024be86f4-dns-swift-storage-0\") pod \"dnsmasq-dns-586bdc5f9-w4hwc\" (UID: \"7e713bda-f49c-4d97-98c7-4ae024be86f4\") " pod="openstack/dnsmasq-dns-586bdc5f9-w4hwc" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.648060 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/39b6e215-6643-4003-9c53-d33f6af39494-config-data-custom\") pod \"barbican-keystone-listener-77f6984dd6-vxjwr\" (UID: \"39b6e215-6643-4003-9c53-d33f6af39494\") " pod="openstack/barbican-keystone-listener-77f6984dd6-vxjwr" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.648095 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39b6e215-6643-4003-9c53-d33f6af39494-config-data\") pod \"barbican-keystone-listener-77f6984dd6-vxjwr\" (UID: \"39b6e215-6643-4003-9c53-d33f6af39494\") " pod="openstack/barbican-keystone-listener-77f6984dd6-vxjwr" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.648136 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/39b6e215-6643-4003-9c53-d33f6af39494-logs\") pod \"barbican-keystone-listener-77f6984dd6-vxjwr\" (UID: \"39b6e215-6643-4003-9c53-d33f6af39494\") " pod="openstack/barbican-keystone-listener-77f6984dd6-vxjwr" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.648178 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5673e957-d032-4112-b620-9f255b01d0d9-config-data-custom\") pod \"barbican-api-6fbf67c4f8-wmrbx\" (UID: \"5673e957-d032-4112-b620-9f255b01d0d9\") " pod="openstack/barbican-api-6fbf67c4f8-wmrbx" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.648209 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e713bda-f49c-4d97-98c7-4ae024be86f4-ovsdbserver-sb\") pod \"dnsmasq-dns-586bdc5f9-w4hwc\" (UID: \"7e713bda-f49c-4d97-98c7-4ae024be86f4\") " pod="openstack/dnsmasq-dns-586bdc5f9-w4hwc" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.648236 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n7tb\" (UniqueName: \"kubernetes.io/projected/7e713bda-f49c-4d97-98c7-4ae024be86f4-kube-api-access-4n7tb\") pod \"dnsmasq-dns-586bdc5f9-w4hwc\" (UID: \"7e713bda-f49c-4d97-98c7-4ae024be86f4\") " pod="openstack/dnsmasq-dns-586bdc5f9-w4hwc" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.648269 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5673e957-d032-4112-b620-9f255b01d0d9-logs\") pod \"barbican-api-6fbf67c4f8-wmrbx\" (UID: \"5673e957-d032-4112-b620-9f255b01d0d9\") " pod="openstack/barbican-api-6fbf67c4f8-wmrbx" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.648300 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5673e957-d032-4112-b620-9f255b01d0d9-config-data\") pod \"barbican-api-6fbf67c4f8-wmrbx\" (UID: \"5673e957-d032-4112-b620-9f255b01d0d9\") " pod="openstack/barbican-api-6fbf67c4f8-wmrbx" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.648337 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvmrg\" (UniqueName: \"kubernetes.io/projected/39b6e215-6643-4003-9c53-d33f6af39494-kube-api-access-jvmrg\") pod \"barbican-keystone-listener-77f6984dd6-vxjwr\" (UID: \"39b6e215-6643-4003-9c53-d33f6af39494\") " pod="openstack/barbican-keystone-listener-77f6984dd6-vxjwr" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.648370 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g47gl\" (UniqueName: \"kubernetes.io/projected/77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e-kube-api-access-g47gl\") pod \"barbican-worker-7b8cbbf7f5-8gzrj\" (UID: \"77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e\") " pod="openstack/barbican-worker-7b8cbbf7f5-8gzrj" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.648409 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e-config-data\") pod \"barbican-worker-7b8cbbf7f5-8gzrj\" (UID: \"77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e\") " pod="openstack/barbican-worker-7b8cbbf7f5-8gzrj" Oct 07 12:42:31 crc 
kubenswrapper[4854]: I1007 12:42:31.648429 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5673e957-d032-4112-b620-9f255b01d0d9-combined-ca-bundle\") pod \"barbican-api-6fbf67c4f8-wmrbx\" (UID: \"5673e957-d032-4112-b620-9f255b01d0d9\") " pod="openstack/barbican-api-6fbf67c4f8-wmrbx" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.648466 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e-combined-ca-bundle\") pod \"barbican-worker-7b8cbbf7f5-8gzrj\" (UID: \"77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e\") " pod="openstack/barbican-worker-7b8cbbf7f5-8gzrj" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.648485 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e713bda-f49c-4d97-98c7-4ae024be86f4-dns-svc\") pod \"dnsmasq-dns-586bdc5f9-w4hwc\" (UID: \"7e713bda-f49c-4d97-98c7-4ae024be86f4\") " pod="openstack/dnsmasq-dns-586bdc5f9-w4hwc" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.648510 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e713bda-f49c-4d97-98c7-4ae024be86f4-config\") pod \"dnsmasq-dns-586bdc5f9-w4hwc\" (UID: \"7e713bda-f49c-4d97-98c7-4ae024be86f4\") " pod="openstack/dnsmasq-dns-586bdc5f9-w4hwc" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.648544 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e713bda-f49c-4d97-98c7-4ae024be86f4-ovsdbserver-nb\") pod \"dnsmasq-dns-586bdc5f9-w4hwc\" (UID: \"7e713bda-f49c-4d97-98c7-4ae024be86f4\") " pod="openstack/dnsmasq-dns-586bdc5f9-w4hwc" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.648569 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39b6e215-6643-4003-9c53-d33f6af39494-combined-ca-bundle\") pod \"barbican-keystone-listener-77f6984dd6-vxjwr\" (UID: \"39b6e215-6643-4003-9c53-d33f6af39494\") " pod="openstack/barbican-keystone-listener-77f6984dd6-vxjwr" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.648600 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e-config-data-custom\") pod \"barbican-worker-7b8cbbf7f5-8gzrj\" (UID: \"77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e\") " pod="openstack/barbican-worker-7b8cbbf7f5-8gzrj" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.649379 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e-logs\") pod \"barbican-worker-7b8cbbf7f5-8gzrj\" (UID: \"77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e\") " pod="openstack/barbican-worker-7b8cbbf7f5-8gzrj" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.655329 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e-config-data\") pod \"barbican-worker-7b8cbbf7f5-8gzrj\" (UID: \"77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e\") " 
pod="openstack/barbican-worker-7b8cbbf7f5-8gzrj" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.657894 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e-combined-ca-bundle\") pod \"barbican-worker-7b8cbbf7f5-8gzrj\" (UID: \"77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e\") " pod="openstack/barbican-worker-7b8cbbf7f5-8gzrj" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.662853 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e-config-data-custom\") pod \"barbican-worker-7b8cbbf7f5-8gzrj\" (UID: \"77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e\") " pod="openstack/barbican-worker-7b8cbbf7f5-8gzrj" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.665842 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g47gl\" (UniqueName: \"kubernetes.io/projected/77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e-kube-api-access-g47gl\") pod \"barbican-worker-7b8cbbf7f5-8gzrj\" (UID: \"77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e\") " pod="openstack/barbican-worker-7b8cbbf7f5-8gzrj" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.737662 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7b8cbbf7f5-8gzrj" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.751511 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39b6e215-6643-4003-9c53-d33f6af39494-config-data\") pod \"barbican-keystone-listener-77f6984dd6-vxjwr\" (UID: \"39b6e215-6643-4003-9c53-d33f6af39494\") " pod="openstack/barbican-keystone-listener-77f6984dd6-vxjwr" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.751619 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39b6e215-6643-4003-9c53-d33f6af39494-logs\") pod \"barbican-keystone-listener-77f6984dd6-vxjwr\" (UID: \"39b6e215-6643-4003-9c53-d33f6af39494\") " pod="openstack/barbican-keystone-listener-77f6984dd6-vxjwr" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.751665 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5673e957-d032-4112-b620-9f255b01d0d9-config-data-custom\") pod \"barbican-api-6fbf67c4f8-wmrbx\" (UID: \"5673e957-d032-4112-b620-9f255b01d0d9\") " pod="openstack/barbican-api-6fbf67c4f8-wmrbx" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.751715 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e713bda-f49c-4d97-98c7-4ae024be86f4-ovsdbserver-sb\") pod \"dnsmasq-dns-586bdc5f9-w4hwc\" (UID: \"7e713bda-f49c-4d97-98c7-4ae024be86f4\") " pod="openstack/dnsmasq-dns-586bdc5f9-w4hwc" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.751753 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4n7tb\" (UniqueName: \"kubernetes.io/projected/7e713bda-f49c-4d97-98c7-4ae024be86f4-kube-api-access-4n7tb\") pod \"dnsmasq-dns-586bdc5f9-w4hwc\" (UID: \"7e713bda-f49c-4d97-98c7-4ae024be86f4\") " pod="openstack/dnsmasq-dns-586bdc5f9-w4hwc" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.751798 4854 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5673e957-d032-4112-b620-9f255b01d0d9-logs\") pod \"barbican-api-6fbf67c4f8-wmrbx\" (UID: \"5673e957-d032-4112-b620-9f255b01d0d9\") " pod="openstack/barbican-api-6fbf67c4f8-wmrbx" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.751845 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5673e957-d032-4112-b620-9f255b01d0d9-config-data\") pod \"barbican-api-6fbf67c4f8-wmrbx\" (UID: \"5673e957-d032-4112-b620-9f255b01d0d9\") " pod="openstack/barbican-api-6fbf67c4f8-wmrbx" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.751903 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvmrg\" (UniqueName: \"kubernetes.io/projected/39b6e215-6643-4003-9c53-d33f6af39494-kube-api-access-jvmrg\") pod \"barbican-keystone-listener-77f6984dd6-vxjwr\" (UID: \"39b6e215-6643-4003-9c53-d33f6af39494\") " pod="openstack/barbican-keystone-listener-77f6984dd6-vxjwr" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.751957 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5673e957-d032-4112-b620-9f255b01d0d9-combined-ca-bundle\") pod \"barbican-api-6fbf67c4f8-wmrbx\" (UID: \"5673e957-d032-4112-b620-9f255b01d0d9\") " pod="openstack/barbican-api-6fbf67c4f8-wmrbx" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.752022 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e713bda-f49c-4d97-98c7-4ae024be86f4-dns-svc\") pod \"dnsmasq-dns-586bdc5f9-w4hwc\" (UID: \"7e713bda-f49c-4d97-98c7-4ae024be86f4\") " pod="openstack/dnsmasq-dns-586bdc5f9-w4hwc" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.752056 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e713bda-f49c-4d97-98c7-4ae024be86f4-config\") pod \"dnsmasq-dns-586bdc5f9-w4hwc\" (UID: \"7e713bda-f49c-4d97-98c7-4ae024be86f4\") " pod="openstack/dnsmasq-dns-586bdc5f9-w4hwc" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.752103 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e713bda-f49c-4d97-98c7-4ae024be86f4-ovsdbserver-nb\") pod \"dnsmasq-dns-586bdc5f9-w4hwc\" (UID: \"7e713bda-f49c-4d97-98c7-4ae024be86f4\") " pod="openstack/dnsmasq-dns-586bdc5f9-w4hwc" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.752135 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39b6e215-6643-4003-9c53-d33f6af39494-combined-ca-bundle\") pod \"barbican-keystone-listener-77f6984dd6-vxjwr\" (UID: \"39b6e215-6643-4003-9c53-d33f6af39494\") " pod="openstack/barbican-keystone-listener-77f6984dd6-vxjwr" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.752204 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vcvs\" (UniqueName: \"kubernetes.io/projected/5673e957-d032-4112-b620-9f255b01d0d9-kube-api-access-2vcvs\") pod \"barbican-api-6fbf67c4f8-wmrbx\" (UID: \"5673e957-d032-4112-b620-9f255b01d0d9\") " pod="openstack/barbican-api-6fbf67c4f8-wmrbx" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.752262 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7e713bda-f49c-4d97-98c7-4ae024be86f4-dns-swift-storage-0\") pod \"dnsmasq-dns-586bdc5f9-w4hwc\" (UID: \"7e713bda-f49c-4d97-98c7-4ae024be86f4\") " pod="openstack/dnsmasq-dns-586bdc5f9-w4hwc" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.752313 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/39b6e215-6643-4003-9c53-d33f6af39494-config-data-custom\") pod \"barbican-keystone-listener-77f6984dd6-vxjwr\" (UID: \"39b6e215-6643-4003-9c53-d33f6af39494\") " pod="openstack/barbican-keystone-listener-77f6984dd6-vxjwr" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.752338 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39b6e215-6643-4003-9c53-d33f6af39494-logs\") pod \"barbican-keystone-listener-77f6984dd6-vxjwr\" (UID: \"39b6e215-6643-4003-9c53-d33f6af39494\") " pod="openstack/barbican-keystone-listener-77f6984dd6-vxjwr" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.753811 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e713bda-f49c-4d97-98c7-4ae024be86f4-dns-svc\") pod \"dnsmasq-dns-586bdc5f9-w4hwc\" (UID: \"7e713bda-f49c-4d97-98c7-4ae024be86f4\") " pod="openstack/dnsmasq-dns-586bdc5f9-w4hwc" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.754688 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5673e957-d032-4112-b620-9f255b01d0d9-logs\") pod \"barbican-api-6fbf67c4f8-wmrbx\" (UID: \"5673e957-d032-4112-b620-9f255b01d0d9\") " pod="openstack/barbican-api-6fbf67c4f8-wmrbx" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.757697 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5673e957-d032-4112-b620-9f255b01d0d9-combined-ca-bundle\") pod \"barbican-api-6fbf67c4f8-wmrbx\" (UID: \"5673e957-d032-4112-b620-9f255b01d0d9\") " pod="openstack/barbican-api-6fbf67c4f8-wmrbx" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.758372 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e713bda-f49c-4d97-98c7-4ae024be86f4-ovsdbserver-sb\") pod \"dnsmasq-dns-586bdc5f9-w4hwc\" (UID: \"7e713bda-f49c-4d97-98c7-4ae024be86f4\") " pod="openstack/dnsmasq-dns-586bdc5f9-w4hwc" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.759461 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7e713bda-f49c-4d97-98c7-4ae024be86f4-dns-swift-storage-0\") pod \"dnsmasq-dns-586bdc5f9-w4hwc\" (UID: \"7e713bda-f49c-4d97-98c7-4ae024be86f4\") " pod="openstack/dnsmasq-dns-586bdc5f9-w4hwc" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.763175 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e713bda-f49c-4d97-98c7-4ae024be86f4-ovsdbserver-nb\") pod \"dnsmasq-dns-586bdc5f9-w4hwc\" (UID: \"7e713bda-f49c-4d97-98c7-4ae024be86f4\") " pod="openstack/dnsmasq-dns-586bdc5f9-w4hwc" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.763618 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/5673e957-d032-4112-b620-9f255b01d0d9-config-data-custom\") pod \"barbican-api-6fbf67c4f8-wmrbx\" (UID: \"5673e957-d032-4112-b620-9f255b01d0d9\") " pod="openstack/barbican-api-6fbf67c4f8-wmrbx" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.763733 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39b6e215-6643-4003-9c53-d33f6af39494-combined-ca-bundle\") pod \"barbican-keystone-listener-77f6984dd6-vxjwr\" (UID: \"39b6e215-6643-4003-9c53-d33f6af39494\") " pod="openstack/barbican-keystone-listener-77f6984dd6-vxjwr" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.764582 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5673e957-d032-4112-b620-9f255b01d0d9-config-data\") pod \"barbican-api-6fbf67c4f8-wmrbx\" (UID: \"5673e957-d032-4112-b620-9f255b01d0d9\") " pod="openstack/barbican-api-6fbf67c4f8-wmrbx" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.766544 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39b6e215-6643-4003-9c53-d33f6af39494-config-data\") pod \"barbican-keystone-listener-77f6984dd6-vxjwr\" (UID: \"39b6e215-6643-4003-9c53-d33f6af39494\") " pod="openstack/barbican-keystone-listener-77f6984dd6-vxjwr" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.768434 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/39b6e215-6643-4003-9c53-d33f6af39494-config-data-custom\") pod \"barbican-keystone-listener-77f6984dd6-vxjwr\" (UID: \"39b6e215-6643-4003-9c53-d33f6af39494\") " pod="openstack/barbican-keystone-listener-77f6984dd6-vxjwr" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.769610 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e713bda-f49c-4d97-98c7-4ae024be86f4-config\") pod \"dnsmasq-dns-586bdc5f9-w4hwc\" (UID: \"7e713bda-f49c-4d97-98c7-4ae024be86f4\") " pod="openstack/dnsmasq-dns-586bdc5f9-w4hwc" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.776091 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvmrg\" (UniqueName: \"kubernetes.io/projected/39b6e215-6643-4003-9c53-d33f6af39494-kube-api-access-jvmrg\") pod \"barbican-keystone-listener-77f6984dd6-vxjwr\" (UID: \"39b6e215-6643-4003-9c53-d33f6af39494\") " pod="openstack/barbican-keystone-listener-77f6984dd6-vxjwr" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.780660 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vcvs\" (UniqueName: \"kubernetes.io/projected/5673e957-d032-4112-b620-9f255b01d0d9-kube-api-access-2vcvs\") pod \"barbican-api-6fbf67c4f8-wmrbx\" (UID: \"5673e957-d032-4112-b620-9f255b01d0d9\") " pod="openstack/barbican-api-6fbf67c4f8-wmrbx" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.781376 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n7tb\" (UniqueName: \"kubernetes.io/projected/7e713bda-f49c-4d97-98c7-4ae024be86f4-kube-api-access-4n7tb\") pod \"dnsmasq-dns-586bdc5f9-w4hwc\" (UID: \"7e713bda-f49c-4d97-98c7-4ae024be86f4\") " pod="openstack/dnsmasq-dns-586bdc5f9-w4hwc" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.837678 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-77f6984dd6-vxjwr" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.883406 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-586bdc5f9-w4hwc" Oct 07 12:42:31 crc kubenswrapper[4854]: I1007 12:42:31.943077 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6fbf67c4f8-wmrbx" Oct 07 12:42:32 crc kubenswrapper[4854]: I1007 12:42:32.107053 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-775676c768-frcfp" event={"ID":"9869a8d4-db8e-4aba-82d1-6d02c3cf988e","Type":"ContainerStarted","Data":"d16e3dc32eb57dea6b6b77bcedd6210c8586d9780a519f65e63564f3af418dfd"} Oct 07 12:42:32 crc kubenswrapper[4854]: I1007 12:42:32.108168 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-775676c768-frcfp" Oct 07 12:42:32 crc kubenswrapper[4854]: I1007 12:42:32.129633 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-775676c768-frcfp" podStartSLOduration=7.129614888 podStartE2EDuration="7.129614888s" podCreationTimestamp="2025-10-07 12:42:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:42:32.129605938 +0000 UTC m=+1068.117438213" watchObservedRunningTime="2025-10-07 12:42:32.129614888 +0000 UTC m=+1068.117447143" Oct 07 12:42:32 crc kubenswrapper[4854]: I1007 12:42:32.170874 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7b8cbbf7f5-8gzrj"] Oct 07 12:42:32 crc kubenswrapper[4854]: W1007 12:42:32.224088 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda03c4a0d_6346_43e4_8db1_f653b5dfa420.slice/crio-37ba3c6481940bfbdda3c0521fdf02c60e40bf5891809d93abc2401822c90ae2 WatchSource:0}: Error finding container 37ba3c6481940bfbdda3c0521fdf02c60e40bf5891809d93abc2401822c90ae2: Status 404 returned error can't find the container with id 37ba3c6481940bfbdda3c0521fdf02c60e40bf5891809d93abc2401822c90ae2 Oct 07 12:42:32 crc kubenswrapper[4854]: I1007 12:42:32.225301 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6f54bfd6b4-g5gq4"] Oct 07 12:42:32 crc kubenswrapper[4854]: I1007 12:42:32.512936 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-w4hwc"] Oct 07 12:42:32 crc kubenswrapper[4854]: I1007 12:42:32.598627 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6fbf67c4f8-wmrbx"] Oct 07 12:42:32 crc kubenswrapper[4854]: W1007 12:42:32.601037 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5673e957_d032_4112_b620_9f255b01d0d9.slice/crio-7ff2c9bf035fd17367d877c52e09ee7a97496092d8ee1ef0df7600f78ea4287c WatchSource:0}: Error finding container 7ff2c9bf035fd17367d877c52e09ee7a97496092d8ee1ef0df7600f78ea4287c: Status 404 returned error can't find the container with id 7ff2c9bf035fd17367d877c52e09ee7a97496092d8ee1ef0df7600f78ea4287c Oct 07 12:42:32 crc kubenswrapper[4854]: I1007 12:42:32.616271 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-77f6984dd6-vxjwr"] Oct 07 12:42:32 crc kubenswrapper[4854]: E1007 12:42:32.700225 4854 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="211f81db-7ca9-43c6-bd31-aa60758129e6" Oct 07 12:42:33 crc kubenswrapper[4854]: I1007 12:42:33.129531 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-7zmqn" event={"ID":"74714c8f-dea6-40be-9985-d254729920c9","Type":"ContainerStarted","Data":"a426a0855633dd84dc768e5b4c01a36c6fac3eb00da8f75a86ee52fe9fa8ecd2"} Oct 07 12:42:33 crc kubenswrapper[4854]: I1007 12:42:33.132069 4854 generic.go:334] "Generic (PLEG): container finished" podID="7e713bda-f49c-4d97-98c7-4ae024be86f4" containerID="8535090bf8d2c155a19528aa29438dd2f963c8e3b2b710e4f24168be011e8eba" exitCode=0 Oct 07 12:42:33 crc kubenswrapper[4854]: I1007 12:42:33.132120 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586bdc5f9-w4hwc" event={"ID":"7e713bda-f49c-4d97-98c7-4ae024be86f4","Type":"ContainerDied","Data":"8535090bf8d2c155a19528aa29438dd2f963c8e3b2b710e4f24168be011e8eba"} Oct 07 12:42:33 crc kubenswrapper[4854]: I1007 12:42:33.132179 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586bdc5f9-w4hwc" event={"ID":"7e713bda-f49c-4d97-98c7-4ae024be86f4","Type":"ContainerStarted","Data":"371ec7b9d1f32cf275ea0f62dddd4da8ab44a0cbdae2d387ec51f71114c588db"} Oct 07 12:42:33 crc kubenswrapper[4854]: I1007 12:42:33.135566 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"211f81db-7ca9-43c6-bd31-aa60758129e6","Type":"ContainerStarted","Data":"f08c4fa6aa9391409f1bb69cd9cec917fc07862c51f5ae56b7da2f92e87b83c9"} Oct 07 12:42:33 crc kubenswrapper[4854]: I1007 12:42:33.135851 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 07 12:42:33 crc kubenswrapper[4854]: I1007 12:42:33.135839 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="211f81db-7ca9-43c6-bd31-aa60758129e6" containerName="ceilometer-notification-agent" containerID="cri-o://b5e263b306dfcc4c0f407799fb0b62d3e3994d3c3b32517a3c1fbdeafab18602" gracePeriod=30 Oct 07 12:42:33 crc kubenswrapper[4854]: I1007 12:42:33.135857 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="211f81db-7ca9-43c6-bd31-aa60758129e6" containerName="proxy-httpd" containerID="cri-o://f08c4fa6aa9391409f1bb69cd9cec917fc07862c51f5ae56b7da2f92e87b83c9" gracePeriod=30 Oct 07 12:42:33 crc kubenswrapper[4854]: I1007 12:42:33.135874 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="211f81db-7ca9-43c6-bd31-aa60758129e6" containerName="sg-core" containerID="cri-o://d4b7f229a72f8b20f9d2daabf20383996d124c9b9aa548544b423f15eadcaaf0" gracePeriod=30 Oct 07 12:42:33 crc kubenswrapper[4854]: I1007 12:42:33.142039 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6f54bfd6b4-g5gq4" event={"ID":"a03c4a0d-6346-43e4-8db1-f653b5dfa420","Type":"ContainerStarted","Data":"dda148e71ce3d2269edc8e05951c34651b383998b2ea4117e0f5773827d60523"} Oct 07 12:42:33 crc kubenswrapper[4854]: I1007 12:42:33.142082 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6f54bfd6b4-g5gq4" event={"ID":"a03c4a0d-6346-43e4-8db1-f653b5dfa420","Type":"ContainerStarted","Data":"85e71836a3ed17c4ff8324140a6164289c633f406fdb437654545bf3071cb059"} Oct 
07 12:42:33 crc kubenswrapper[4854]: I1007 12:42:33.142092 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6f54bfd6b4-g5gq4" event={"ID":"a03c4a0d-6346-43e4-8db1-f653b5dfa420","Type":"ContainerStarted","Data":"37ba3c6481940bfbdda3c0521fdf02c60e40bf5891809d93abc2401822c90ae2"} Oct 07 12:42:33 crc kubenswrapper[4854]: I1007 12:42:33.142868 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6f54bfd6b4-g5gq4" Oct 07 12:42:33 crc kubenswrapper[4854]: I1007 12:42:33.142896 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6f54bfd6b4-g5gq4" Oct 07 12:42:33 crc kubenswrapper[4854]: I1007 12:42:33.144185 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-77f6984dd6-vxjwr" event={"ID":"39b6e215-6643-4003-9c53-d33f6af39494","Type":"ContainerStarted","Data":"f73b792b25607fa6b1c25ff040b16a00fcfb986450169a010234f049614c0558"} Oct 07 12:42:33 crc kubenswrapper[4854]: I1007 12:42:33.148802 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7b8cbbf7f5-8gzrj" event={"ID":"77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e","Type":"ContainerStarted","Data":"62b231c9792c08f3532e265cd8d38d6096ed1f5c4336b1ec0de88c85e3d1a9a5"} Oct 07 12:42:33 crc kubenswrapper[4854]: I1007 12:42:33.148765 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-7zmqn" podStartSLOduration=3.26775338 podStartE2EDuration="52.148740459s" podCreationTimestamp="2025-10-07 12:41:41 +0000 UTC" firstStartedPulling="2025-10-07 12:41:43.022986945 +0000 UTC m=+1019.010819200" lastFinishedPulling="2025-10-07 12:42:31.903974024 +0000 UTC m=+1067.891806279" observedRunningTime="2025-10-07 12:42:33.145888076 +0000 UTC m=+1069.133720331" watchObservedRunningTime="2025-10-07 12:42:33.148740459 +0000 UTC m=+1069.136572734" Oct 07 12:42:33 crc kubenswrapper[4854]: I1007 12:42:33.154509 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6fbf67c4f8-wmrbx" event={"ID":"5673e957-d032-4112-b620-9f255b01d0d9","Type":"ContainerStarted","Data":"8138d4b4267015e79d13138c0c538582f0f03d240febe0ced67849accae5eda1"} Oct 07 12:42:33 crc kubenswrapper[4854]: I1007 12:42:33.154585 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6fbf67c4f8-wmrbx" event={"ID":"5673e957-d032-4112-b620-9f255b01d0d9","Type":"ContainerStarted","Data":"7ff2c9bf035fd17367d877c52e09ee7a97496092d8ee1ef0df7600f78ea4287c"} Oct 07 12:42:33 crc kubenswrapper[4854]: I1007 12:42:33.222452 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6f54bfd6b4-g5gq4" podStartSLOduration=7.222431923 podStartE2EDuration="7.222431923s" podCreationTimestamp="2025-10-07 12:42:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:42:33.207158669 +0000 UTC m=+1069.194990924" watchObservedRunningTime="2025-10-07 12:42:33.222431923 +0000 UTC m=+1069.210264188" Oct 07 12:42:34 crc kubenswrapper[4854]: I1007 12:42:34.201499 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586bdc5f9-w4hwc" event={"ID":"7e713bda-f49c-4d97-98c7-4ae024be86f4","Type":"ContainerStarted","Data":"c2e5b1927b214e05f2b1fbfb109fc80be6a75e65f99b2c3b05086606f024b163"} Oct 07 12:42:34 crc kubenswrapper[4854]: I1007 12:42:34.202137 4854 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/dnsmasq-dns-586bdc5f9-w4hwc" Oct 07 12:42:34 crc kubenswrapper[4854]: I1007 12:42:34.221008 4854 generic.go:334] "Generic (PLEG): container finished" podID="211f81db-7ca9-43c6-bd31-aa60758129e6" containerID="f08c4fa6aa9391409f1bb69cd9cec917fc07862c51f5ae56b7da2f92e87b83c9" exitCode=0 Oct 07 12:42:34 crc kubenswrapper[4854]: I1007 12:42:34.221039 4854 generic.go:334] "Generic (PLEG): container finished" podID="211f81db-7ca9-43c6-bd31-aa60758129e6" containerID="d4b7f229a72f8b20f9d2daabf20383996d124c9b9aa548544b423f15eadcaaf0" exitCode=2 Oct 07 12:42:34 crc kubenswrapper[4854]: I1007 12:42:34.221088 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"211f81db-7ca9-43c6-bd31-aa60758129e6","Type":"ContainerDied","Data":"f08c4fa6aa9391409f1bb69cd9cec917fc07862c51f5ae56b7da2f92e87b83c9"} Oct 07 12:42:34 crc kubenswrapper[4854]: I1007 12:42:34.221114 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"211f81db-7ca9-43c6-bd31-aa60758129e6","Type":"ContainerDied","Data":"d4b7f229a72f8b20f9d2daabf20383996d124c9b9aa548544b423f15eadcaaf0"} Oct 07 12:42:34 crc kubenswrapper[4854]: I1007 12:42:34.225980 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7b8cbbf7f5-8gzrj" event={"ID":"77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e","Type":"ContainerStarted","Data":"0b7c87e468f2b8ba32d54b131d0c6194d6f52f79c88bd597eeb0b8875705a1d6"} Oct 07 12:42:34 crc kubenswrapper[4854]: I1007 12:42:34.226030 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7b8cbbf7f5-8gzrj" event={"ID":"77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e","Type":"ContainerStarted","Data":"6a8cb87e42de1baa0fe321962804fbc2f8635887612d2573377887b576e22a6c"} Oct 07 12:42:34 crc kubenswrapper[4854]: I1007 12:42:34.235204 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6fbf67c4f8-wmrbx" event={"ID":"5673e957-d032-4112-b620-9f255b01d0d9","Type":"ContainerStarted","Data":"4693cbd94ea9abb6515ae8732113c7cf02b75e3ac39ffd080462cbb8f082a46e"} Oct 07 12:42:34 crc kubenswrapper[4854]: I1007 12:42:34.235237 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6fbf67c4f8-wmrbx" Oct 07 12:42:34 crc kubenswrapper[4854]: I1007 12:42:34.235620 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6fbf67c4f8-wmrbx" Oct 07 12:42:34 crc kubenswrapper[4854]: I1007 12:42:34.237945 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-586bdc5f9-w4hwc" podStartSLOduration=3.237923508 podStartE2EDuration="3.237923508s" podCreationTimestamp="2025-10-07 12:42:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:42:34.234055675 +0000 UTC m=+1070.221887950" watchObservedRunningTime="2025-10-07 12:42:34.237923508 +0000 UTC m=+1070.225755763" Oct 07 12:42:34 crc kubenswrapper[4854]: I1007 12:42:34.260246 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-7b8cbbf7f5-8gzrj" podStartSLOduration=1.948668549 podStartE2EDuration="3.260227517s" podCreationTimestamp="2025-10-07 12:42:31 +0000 UTC" firstStartedPulling="2025-10-07 12:42:32.218437543 +0000 UTC m=+1068.206269798" lastFinishedPulling="2025-10-07 12:42:33.529996511 +0000 UTC m=+1069.517828766" observedRunningTime="2025-10-07 
12:42:34.249018191 +0000 UTC m=+1070.236850446" watchObservedRunningTime="2025-10-07 12:42:34.260227517 +0000 UTC m=+1070.248059772" Oct 07 12:42:34 crc kubenswrapper[4854]: I1007 12:42:34.273048 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6fbf67c4f8-wmrbx" podStartSLOduration=3.273028869 podStartE2EDuration="3.273028869s" podCreationTimestamp="2025-10-07 12:42:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:42:34.267417566 +0000 UTC m=+1070.255249891" watchObservedRunningTime="2025-10-07 12:42:34.273028869 +0000 UTC m=+1070.260861124" Oct 07 12:42:34 crc kubenswrapper[4854]: I1007 12:42:34.890916 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7595d98994-smt7c"] Oct 07 12:42:34 crc kubenswrapper[4854]: I1007 12:42:34.893910 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7595d98994-smt7c" Oct 07 12:42:34 crc kubenswrapper[4854]: I1007 12:42:34.897853 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Oct 07 12:42:34 crc kubenswrapper[4854]: I1007 12:42:34.898429 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Oct 07 12:42:34 crc kubenswrapper[4854]: I1007 12:42:34.903005 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7595d98994-smt7c"] Oct 07 12:42:35 crc kubenswrapper[4854]: I1007 12:42:35.029559 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6002f7a4-27d6-4554-a486-87926ebcf57e-config-data-custom\") pod \"barbican-api-7595d98994-smt7c\" (UID: \"6002f7a4-27d6-4554-a486-87926ebcf57e\") " pod="openstack/barbican-api-7595d98994-smt7c" Oct 07 12:42:35 crc kubenswrapper[4854]: I1007 12:42:35.029608 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6002f7a4-27d6-4554-a486-87926ebcf57e-public-tls-certs\") pod \"barbican-api-7595d98994-smt7c\" (UID: \"6002f7a4-27d6-4554-a486-87926ebcf57e\") " pod="openstack/barbican-api-7595d98994-smt7c" Oct 07 12:42:35 crc kubenswrapper[4854]: I1007 12:42:35.029685 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6002f7a4-27d6-4554-a486-87926ebcf57e-internal-tls-certs\") pod \"barbican-api-7595d98994-smt7c\" (UID: \"6002f7a4-27d6-4554-a486-87926ebcf57e\") " pod="openstack/barbican-api-7595d98994-smt7c" Oct 07 12:42:35 crc kubenswrapper[4854]: I1007 12:42:35.029706 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6002f7a4-27d6-4554-a486-87926ebcf57e-config-data\") pod \"barbican-api-7595d98994-smt7c\" (UID: \"6002f7a4-27d6-4554-a486-87926ebcf57e\") " pod="openstack/barbican-api-7595d98994-smt7c" Oct 07 12:42:35 crc kubenswrapper[4854]: I1007 12:42:35.029736 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6002f7a4-27d6-4554-a486-87926ebcf57e-combined-ca-bundle\") pod \"barbican-api-7595d98994-smt7c\" (UID: 
\"6002f7a4-27d6-4554-a486-87926ebcf57e\") " pod="openstack/barbican-api-7595d98994-smt7c" Oct 07 12:42:35 crc kubenswrapper[4854]: I1007 12:42:35.029790 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6002f7a4-27d6-4554-a486-87926ebcf57e-logs\") pod \"barbican-api-7595d98994-smt7c\" (UID: \"6002f7a4-27d6-4554-a486-87926ebcf57e\") " pod="openstack/barbican-api-7595d98994-smt7c" Oct 07 12:42:35 crc kubenswrapper[4854]: I1007 12:42:35.029849 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2jwc\" (UniqueName: \"kubernetes.io/projected/6002f7a4-27d6-4554-a486-87926ebcf57e-kube-api-access-q2jwc\") pod \"barbican-api-7595d98994-smt7c\" (UID: \"6002f7a4-27d6-4554-a486-87926ebcf57e\") " pod="openstack/barbican-api-7595d98994-smt7c" Oct 07 12:42:35 crc kubenswrapper[4854]: I1007 12:42:35.131059 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2jwc\" (UniqueName: \"kubernetes.io/projected/6002f7a4-27d6-4554-a486-87926ebcf57e-kube-api-access-q2jwc\") pod \"barbican-api-7595d98994-smt7c\" (UID: \"6002f7a4-27d6-4554-a486-87926ebcf57e\") " pod="openstack/barbican-api-7595d98994-smt7c" Oct 07 12:42:35 crc kubenswrapper[4854]: I1007 12:42:35.131164 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6002f7a4-27d6-4554-a486-87926ebcf57e-config-data-custom\") pod \"barbican-api-7595d98994-smt7c\" (UID: \"6002f7a4-27d6-4554-a486-87926ebcf57e\") " pod="openstack/barbican-api-7595d98994-smt7c" Oct 07 12:42:35 crc kubenswrapper[4854]: I1007 12:42:35.131188 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6002f7a4-27d6-4554-a486-87926ebcf57e-public-tls-certs\") pod \"barbican-api-7595d98994-smt7c\" (UID: \"6002f7a4-27d6-4554-a486-87926ebcf57e\") " pod="openstack/barbican-api-7595d98994-smt7c" Oct 07 12:42:35 crc kubenswrapper[4854]: I1007 12:42:35.131251 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6002f7a4-27d6-4554-a486-87926ebcf57e-internal-tls-certs\") pod \"barbican-api-7595d98994-smt7c\" (UID: \"6002f7a4-27d6-4554-a486-87926ebcf57e\") " pod="openstack/barbican-api-7595d98994-smt7c" Oct 07 12:42:35 crc kubenswrapper[4854]: I1007 12:42:35.131270 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6002f7a4-27d6-4554-a486-87926ebcf57e-config-data\") pod \"barbican-api-7595d98994-smt7c\" (UID: \"6002f7a4-27d6-4554-a486-87926ebcf57e\") " pod="openstack/barbican-api-7595d98994-smt7c" Oct 07 12:42:35 crc kubenswrapper[4854]: I1007 12:42:35.131297 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6002f7a4-27d6-4554-a486-87926ebcf57e-combined-ca-bundle\") pod \"barbican-api-7595d98994-smt7c\" (UID: \"6002f7a4-27d6-4554-a486-87926ebcf57e\") " pod="openstack/barbican-api-7595d98994-smt7c" Oct 07 12:42:35 crc kubenswrapper[4854]: I1007 12:42:35.131357 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6002f7a4-27d6-4554-a486-87926ebcf57e-logs\") pod \"barbican-api-7595d98994-smt7c\" (UID: 
\"6002f7a4-27d6-4554-a486-87926ebcf57e\") " pod="openstack/barbican-api-7595d98994-smt7c" Oct 07 12:42:35 crc kubenswrapper[4854]: I1007 12:42:35.131944 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6002f7a4-27d6-4554-a486-87926ebcf57e-logs\") pod \"barbican-api-7595d98994-smt7c\" (UID: \"6002f7a4-27d6-4554-a486-87926ebcf57e\") " pod="openstack/barbican-api-7595d98994-smt7c" Oct 07 12:42:35 crc kubenswrapper[4854]: I1007 12:42:35.136251 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6002f7a4-27d6-4554-a486-87926ebcf57e-combined-ca-bundle\") pod \"barbican-api-7595d98994-smt7c\" (UID: \"6002f7a4-27d6-4554-a486-87926ebcf57e\") " pod="openstack/barbican-api-7595d98994-smt7c" Oct 07 12:42:35 crc kubenswrapper[4854]: I1007 12:42:35.136328 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6002f7a4-27d6-4554-a486-87926ebcf57e-internal-tls-certs\") pod \"barbican-api-7595d98994-smt7c\" (UID: \"6002f7a4-27d6-4554-a486-87926ebcf57e\") " pod="openstack/barbican-api-7595d98994-smt7c" Oct 07 12:42:35 crc kubenswrapper[4854]: I1007 12:42:35.137258 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6002f7a4-27d6-4554-a486-87926ebcf57e-config-data\") pod \"barbican-api-7595d98994-smt7c\" (UID: \"6002f7a4-27d6-4554-a486-87926ebcf57e\") " pod="openstack/barbican-api-7595d98994-smt7c" Oct 07 12:42:35 crc kubenswrapper[4854]: I1007 12:42:35.137602 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6002f7a4-27d6-4554-a486-87926ebcf57e-config-data-custom\") pod \"barbican-api-7595d98994-smt7c\" (UID: \"6002f7a4-27d6-4554-a486-87926ebcf57e\") " pod="openstack/barbican-api-7595d98994-smt7c" Oct 07 12:42:35 crc kubenswrapper[4854]: I1007 12:42:35.142303 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6002f7a4-27d6-4554-a486-87926ebcf57e-public-tls-certs\") pod \"barbican-api-7595d98994-smt7c\" (UID: \"6002f7a4-27d6-4554-a486-87926ebcf57e\") " pod="openstack/barbican-api-7595d98994-smt7c" Oct 07 12:42:35 crc kubenswrapper[4854]: I1007 12:42:35.150640 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2jwc\" (UniqueName: \"kubernetes.io/projected/6002f7a4-27d6-4554-a486-87926ebcf57e-kube-api-access-q2jwc\") pod \"barbican-api-7595d98994-smt7c\" (UID: \"6002f7a4-27d6-4554-a486-87926ebcf57e\") " pod="openstack/barbican-api-7595d98994-smt7c" Oct 07 12:42:35 crc kubenswrapper[4854]: I1007 12:42:35.245410 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-77f6984dd6-vxjwr" event={"ID":"39b6e215-6643-4003-9c53-d33f6af39494","Type":"ContainerStarted","Data":"5e0219a6ee2ee21252dc5506f149e4cf4652bd9388a64c5f4c96dec5453578f6"} Oct 07 12:42:35 crc kubenswrapper[4854]: I1007 12:42:35.245668 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-77f6984dd6-vxjwr" event={"ID":"39b6e215-6643-4003-9c53-d33f6af39494","Type":"ContainerStarted","Data":"55c2f083da705ca36f1812e92722e74d07fb8806209d2dfd59e204e7c4638f72"} Oct 07 12:42:35 crc kubenswrapper[4854]: I1007 12:42:35.263871 4854 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/barbican-keystone-listener-77f6984dd6-vxjwr" podStartSLOduration=2.244077223 podStartE2EDuration="4.263851015s" podCreationTimestamp="2025-10-07 12:42:31 +0000 UTC" firstStartedPulling="2025-10-07 12:42:32.625737653 +0000 UTC m=+1068.613569908" lastFinishedPulling="2025-10-07 12:42:34.645511445 +0000 UTC m=+1070.633343700" observedRunningTime="2025-10-07 12:42:35.259440377 +0000 UTC m=+1071.247272632" watchObservedRunningTime="2025-10-07 12:42:35.263851015 +0000 UTC m=+1071.251683270" Oct 07 12:42:35 crc kubenswrapper[4854]: I1007 12:42:35.346595 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7595d98994-smt7c" Oct 07 12:42:35 crc kubenswrapper[4854]: I1007 12:42:35.830614 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7595d98994-smt7c"] Oct 07 12:42:36 crc kubenswrapper[4854]: I1007 12:42:36.262043 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7595d98994-smt7c" event={"ID":"6002f7a4-27d6-4554-a486-87926ebcf57e","Type":"ContainerStarted","Data":"f0ff8ce507731e5f1fa6188aabef5933b1d02dc7dee5ef5454692c6f625fabed"} Oct 07 12:42:36 crc kubenswrapper[4854]: I1007 12:42:36.262312 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7595d98994-smt7c" event={"ID":"6002f7a4-27d6-4554-a486-87926ebcf57e","Type":"ContainerStarted","Data":"ab6b9b6df7a6df7e7a7e15ca4a8d64babc1a2ae8feac7c3dc87e742af29e9944"} Oct 07 12:42:36 crc kubenswrapper[4854]: I1007 12:42:36.262329 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7595d98994-smt7c" event={"ID":"6002f7a4-27d6-4554-a486-87926ebcf57e","Type":"ContainerStarted","Data":"9aa2233d29f445f6dbb48e7902cbb60c9de1fb72b3cf982ec1c19a1afd9da099"} Oct 07 12:42:36 crc kubenswrapper[4854]: I1007 12:42:36.291628 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7595d98994-smt7c" podStartSLOduration=2.291603307 podStartE2EDuration="2.291603307s" podCreationTimestamp="2025-10-07 12:42:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:42:36.28174572 +0000 UTC m=+1072.269577975" watchObservedRunningTime="2025-10-07 12:42:36.291603307 +0000 UTC m=+1072.279435572" Oct 07 12:42:37 crc kubenswrapper[4854]: I1007 12:42:37.270707 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7595d98994-smt7c" Oct 07 12:42:37 crc kubenswrapper[4854]: I1007 12:42:37.270774 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7595d98994-smt7c" Oct 07 12:42:37 crc kubenswrapper[4854]: I1007 12:42:37.719018 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 12:42:37 crc kubenswrapper[4854]: I1007 12:42:37.884746 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/211f81db-7ca9-43c6-bd31-aa60758129e6-scripts\") pod \"211f81db-7ca9-43c6-bd31-aa60758129e6\" (UID: \"211f81db-7ca9-43c6-bd31-aa60758129e6\") " Oct 07 12:42:37 crc kubenswrapper[4854]: I1007 12:42:37.884840 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/211f81db-7ca9-43c6-bd31-aa60758129e6-config-data\") pod \"211f81db-7ca9-43c6-bd31-aa60758129e6\" (UID: \"211f81db-7ca9-43c6-bd31-aa60758129e6\") " Oct 07 12:42:37 crc kubenswrapper[4854]: I1007 12:42:37.884950 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/211f81db-7ca9-43c6-bd31-aa60758129e6-run-httpd\") pod \"211f81db-7ca9-43c6-bd31-aa60758129e6\" (UID: \"211f81db-7ca9-43c6-bd31-aa60758129e6\") " Oct 07 12:42:37 crc kubenswrapper[4854]: I1007 12:42:37.885074 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/211f81db-7ca9-43c6-bd31-aa60758129e6-sg-core-conf-yaml\") pod \"211f81db-7ca9-43c6-bd31-aa60758129e6\" (UID: \"211f81db-7ca9-43c6-bd31-aa60758129e6\") " Oct 07 12:42:37 crc kubenswrapper[4854]: I1007 12:42:37.885116 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ll5l5\" (UniqueName: \"kubernetes.io/projected/211f81db-7ca9-43c6-bd31-aa60758129e6-kube-api-access-ll5l5\") pod \"211f81db-7ca9-43c6-bd31-aa60758129e6\" (UID: \"211f81db-7ca9-43c6-bd31-aa60758129e6\") " Oct 07 12:42:37 crc kubenswrapper[4854]: I1007 12:42:37.885200 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/211f81db-7ca9-43c6-bd31-aa60758129e6-combined-ca-bundle\") pod \"211f81db-7ca9-43c6-bd31-aa60758129e6\" (UID: \"211f81db-7ca9-43c6-bd31-aa60758129e6\") " Oct 07 12:42:37 crc kubenswrapper[4854]: I1007 12:42:37.885306 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/211f81db-7ca9-43c6-bd31-aa60758129e6-log-httpd\") pod \"211f81db-7ca9-43c6-bd31-aa60758129e6\" (UID: \"211f81db-7ca9-43c6-bd31-aa60758129e6\") " Oct 07 12:42:37 crc kubenswrapper[4854]: I1007 12:42:37.885529 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/211f81db-7ca9-43c6-bd31-aa60758129e6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "211f81db-7ca9-43c6-bd31-aa60758129e6" (UID: "211f81db-7ca9-43c6-bd31-aa60758129e6"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:42:37 crc kubenswrapper[4854]: I1007 12:42:37.886450 4854 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/211f81db-7ca9-43c6-bd31-aa60758129e6-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 12:42:37 crc kubenswrapper[4854]: I1007 12:42:37.887523 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/211f81db-7ca9-43c6-bd31-aa60758129e6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "211f81db-7ca9-43c6-bd31-aa60758129e6" (UID: "211f81db-7ca9-43c6-bd31-aa60758129e6"). 
InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:42:37 crc kubenswrapper[4854]: I1007 12:42:37.891195 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/211f81db-7ca9-43c6-bd31-aa60758129e6-scripts" (OuterVolumeSpecName: "scripts") pod "211f81db-7ca9-43c6-bd31-aa60758129e6" (UID: "211f81db-7ca9-43c6-bd31-aa60758129e6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:42:37 crc kubenswrapper[4854]: I1007 12:42:37.893346 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/211f81db-7ca9-43c6-bd31-aa60758129e6-kube-api-access-ll5l5" (OuterVolumeSpecName: "kube-api-access-ll5l5") pod "211f81db-7ca9-43c6-bd31-aa60758129e6" (UID: "211f81db-7ca9-43c6-bd31-aa60758129e6"). InnerVolumeSpecName "kube-api-access-ll5l5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:42:37 crc kubenswrapper[4854]: I1007 12:42:37.923530 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/211f81db-7ca9-43c6-bd31-aa60758129e6-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "211f81db-7ca9-43c6-bd31-aa60758129e6" (UID: "211f81db-7ca9-43c6-bd31-aa60758129e6"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:42:37 crc kubenswrapper[4854]: I1007 12:42:37.955070 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/211f81db-7ca9-43c6-bd31-aa60758129e6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "211f81db-7ca9-43c6-bd31-aa60758129e6" (UID: "211f81db-7ca9-43c6-bd31-aa60758129e6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:42:37 crc kubenswrapper[4854]: I1007 12:42:37.985914 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/211f81db-7ca9-43c6-bd31-aa60758129e6-config-data" (OuterVolumeSpecName: "config-data") pod "211f81db-7ca9-43c6-bd31-aa60758129e6" (UID: "211f81db-7ca9-43c6-bd31-aa60758129e6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:42:37 crc kubenswrapper[4854]: I1007 12:42:37.988056 4854 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/211f81db-7ca9-43c6-bd31-aa60758129e6-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 07 12:42:37 crc kubenswrapper[4854]: I1007 12:42:37.988097 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ll5l5\" (UniqueName: \"kubernetes.io/projected/211f81db-7ca9-43c6-bd31-aa60758129e6-kube-api-access-ll5l5\") on node \"crc\" DevicePath \"\"" Oct 07 12:42:37 crc kubenswrapper[4854]: I1007 12:42:37.988109 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/211f81db-7ca9-43c6-bd31-aa60758129e6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:42:37 crc kubenswrapper[4854]: I1007 12:42:37.988119 4854 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/211f81db-7ca9-43c6-bd31-aa60758129e6-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 12:42:37 crc kubenswrapper[4854]: I1007 12:42:37.988129 4854 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/211f81db-7ca9-43c6-bd31-aa60758129e6-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 12:42:37 crc kubenswrapper[4854]: I1007 12:42:37.988138 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/211f81db-7ca9-43c6-bd31-aa60758129e6-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 12:42:38 crc kubenswrapper[4854]: I1007 12:42:38.294912 4854 generic.go:334] "Generic (PLEG): container finished" podID="211f81db-7ca9-43c6-bd31-aa60758129e6" containerID="b5e263b306dfcc4c0f407799fb0b62d3e3994d3c3b32517a3c1fbdeafab18602" exitCode=0 Oct 07 12:42:38 crc kubenswrapper[4854]: I1007 12:42:38.295066 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 12:42:38 crc kubenswrapper[4854]: I1007 12:42:38.295094 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"211f81db-7ca9-43c6-bd31-aa60758129e6","Type":"ContainerDied","Data":"b5e263b306dfcc4c0f407799fb0b62d3e3994d3c3b32517a3c1fbdeafab18602"} Oct 07 12:42:38 crc kubenswrapper[4854]: I1007 12:42:38.295864 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"211f81db-7ca9-43c6-bd31-aa60758129e6","Type":"ContainerDied","Data":"1b0a68f6314056c43b63809a365e79c8d497826950a3610f6a1f7e88ff6bfa05"} Oct 07 12:42:38 crc kubenswrapper[4854]: I1007 12:42:38.295919 4854 scope.go:117] "RemoveContainer" containerID="f08c4fa6aa9391409f1bb69cd9cec917fc07862c51f5ae56b7da2f92e87b83c9" Oct 07 12:42:38 crc kubenswrapper[4854]: I1007 12:42:38.348306 4854 scope.go:117] "RemoveContainer" containerID="d4b7f229a72f8b20f9d2daabf20383996d124c9b9aa548544b423f15eadcaaf0" Oct 07 12:42:38 crc kubenswrapper[4854]: I1007 12:42:38.395317 4854 scope.go:117] "RemoveContainer" containerID="b5e263b306dfcc4c0f407799fb0b62d3e3994d3c3b32517a3c1fbdeafab18602" Oct 07 12:42:38 crc kubenswrapper[4854]: I1007 12:42:38.403263 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 12:42:38 crc kubenswrapper[4854]: I1007 12:42:38.441318 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 07 12:42:38 crc kubenswrapper[4854]: I1007 12:42:38.462034 4854 scope.go:117] "RemoveContainer" containerID="f08c4fa6aa9391409f1bb69cd9cec917fc07862c51f5ae56b7da2f92e87b83c9" Oct 07 12:42:38 crc kubenswrapper[4854]: E1007 12:42:38.466255 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f08c4fa6aa9391409f1bb69cd9cec917fc07862c51f5ae56b7da2f92e87b83c9\": container with ID starting with f08c4fa6aa9391409f1bb69cd9cec917fc07862c51f5ae56b7da2f92e87b83c9 not found: ID does not exist" containerID="f08c4fa6aa9391409f1bb69cd9cec917fc07862c51f5ae56b7da2f92e87b83c9" Oct 07 12:42:38 crc kubenswrapper[4854]: I1007 12:42:38.466299 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f08c4fa6aa9391409f1bb69cd9cec917fc07862c51f5ae56b7da2f92e87b83c9"} err="failed to get container status \"f08c4fa6aa9391409f1bb69cd9cec917fc07862c51f5ae56b7da2f92e87b83c9\": rpc error: code = NotFound desc = could not find container \"f08c4fa6aa9391409f1bb69cd9cec917fc07862c51f5ae56b7da2f92e87b83c9\": container with ID starting with f08c4fa6aa9391409f1bb69cd9cec917fc07862c51f5ae56b7da2f92e87b83c9 not found: ID does not exist" Oct 07 12:42:38 crc kubenswrapper[4854]: I1007 12:42:38.466518 4854 scope.go:117] "RemoveContainer" containerID="d4b7f229a72f8b20f9d2daabf20383996d124c9b9aa548544b423f15eadcaaf0" Oct 07 12:42:38 crc kubenswrapper[4854]: I1007 12:42:38.470219 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 07 12:42:38 crc kubenswrapper[4854]: E1007 12:42:38.470767 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="211f81db-7ca9-43c6-bd31-aa60758129e6" containerName="sg-core" Oct 07 12:42:38 crc kubenswrapper[4854]: I1007 12:42:38.470783 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="211f81db-7ca9-43c6-bd31-aa60758129e6" containerName="sg-core" Oct 07 12:42:38 crc kubenswrapper[4854]: E1007 12:42:38.470802 4854 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="211f81db-7ca9-43c6-bd31-aa60758129e6" containerName="ceilometer-notification-agent" Oct 07 12:42:38 crc kubenswrapper[4854]: I1007 12:42:38.470809 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="211f81db-7ca9-43c6-bd31-aa60758129e6" containerName="ceilometer-notification-agent" Oct 07 12:42:38 crc kubenswrapper[4854]: E1007 12:42:38.470842 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="211f81db-7ca9-43c6-bd31-aa60758129e6" containerName="proxy-httpd" Oct 07 12:42:38 crc kubenswrapper[4854]: I1007 12:42:38.470850 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="211f81db-7ca9-43c6-bd31-aa60758129e6" containerName="proxy-httpd" Oct 07 12:42:38 crc kubenswrapper[4854]: I1007 12:42:38.471061 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="211f81db-7ca9-43c6-bd31-aa60758129e6" containerName="proxy-httpd" Oct 07 12:42:38 crc kubenswrapper[4854]: I1007 12:42:38.471080 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="211f81db-7ca9-43c6-bd31-aa60758129e6" containerName="ceilometer-notification-agent" Oct 07 12:42:38 crc kubenswrapper[4854]: I1007 12:42:38.471103 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="211f81db-7ca9-43c6-bd31-aa60758129e6" containerName="sg-core" Oct 07 12:42:38 crc kubenswrapper[4854]: E1007 12:42:38.473716 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4b7f229a72f8b20f9d2daabf20383996d124c9b9aa548544b423f15eadcaaf0\": container with ID starting with d4b7f229a72f8b20f9d2daabf20383996d124c9b9aa548544b423f15eadcaaf0 not found: ID does not exist" containerID="d4b7f229a72f8b20f9d2daabf20383996d124c9b9aa548544b423f15eadcaaf0" Oct 07 12:42:38 crc kubenswrapper[4854]: I1007 12:42:38.473755 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4b7f229a72f8b20f9d2daabf20383996d124c9b9aa548544b423f15eadcaaf0"} err="failed to get container status \"d4b7f229a72f8b20f9d2daabf20383996d124c9b9aa548544b423f15eadcaaf0\": rpc error: code = NotFound desc = could not find container \"d4b7f229a72f8b20f9d2daabf20383996d124c9b9aa548544b423f15eadcaaf0\": container with ID starting with d4b7f229a72f8b20f9d2daabf20383996d124c9b9aa548544b423f15eadcaaf0 not found: ID does not exist" Oct 07 12:42:38 crc kubenswrapper[4854]: I1007 12:42:38.473779 4854 scope.go:117] "RemoveContainer" containerID="b5e263b306dfcc4c0f407799fb0b62d3e3994d3c3b32517a3c1fbdeafab18602" Oct 07 12:42:38 crc kubenswrapper[4854]: I1007 12:42:38.474718 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 12:42:38 crc kubenswrapper[4854]: E1007 12:42:38.474743 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5e263b306dfcc4c0f407799fb0b62d3e3994d3c3b32517a3c1fbdeafab18602\": container with ID starting with b5e263b306dfcc4c0f407799fb0b62d3e3994d3c3b32517a3c1fbdeafab18602 not found: ID does not exist" containerID="b5e263b306dfcc4c0f407799fb0b62d3e3994d3c3b32517a3c1fbdeafab18602" Oct 07 12:42:38 crc kubenswrapper[4854]: I1007 12:42:38.475793 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5e263b306dfcc4c0f407799fb0b62d3e3994d3c3b32517a3c1fbdeafab18602"} err="failed to get container status \"b5e263b306dfcc4c0f407799fb0b62d3e3994d3c3b32517a3c1fbdeafab18602\": rpc error: code = NotFound desc = could not find container \"b5e263b306dfcc4c0f407799fb0b62d3e3994d3c3b32517a3c1fbdeafab18602\": container with ID starting with b5e263b306dfcc4c0f407799fb0b62d3e3994d3c3b32517a3c1fbdeafab18602 not found: ID does not exist" Oct 07 12:42:38 crc kubenswrapper[4854]: I1007 12:42:38.482221 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 07 12:42:38 crc kubenswrapper[4854]: I1007 12:42:38.482339 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 07 12:42:38 crc kubenswrapper[4854]: I1007 12:42:38.492485 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 12:42:38 crc kubenswrapper[4854]: I1007 12:42:38.602018 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7cdv\" (UniqueName: \"kubernetes.io/projected/efc4aa31-c1b9-4615-bca1-04c5f81d3024-kube-api-access-z7cdv\") pod \"ceilometer-0\" (UID: \"efc4aa31-c1b9-4615-bca1-04c5f81d3024\") " pod="openstack/ceilometer-0" Oct 07 12:42:38 crc kubenswrapper[4854]: I1007 12:42:38.602134 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/efc4aa31-c1b9-4615-bca1-04c5f81d3024-log-httpd\") pod \"ceilometer-0\" (UID: \"efc4aa31-c1b9-4615-bca1-04c5f81d3024\") " pod="openstack/ceilometer-0" Oct 07 12:42:38 crc kubenswrapper[4854]: I1007 12:42:38.602380 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/efc4aa31-c1b9-4615-bca1-04c5f81d3024-run-httpd\") pod \"ceilometer-0\" (UID: \"efc4aa31-c1b9-4615-bca1-04c5f81d3024\") " pod="openstack/ceilometer-0" Oct 07 12:42:38 crc kubenswrapper[4854]: I1007 12:42:38.602442 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/efc4aa31-c1b9-4615-bca1-04c5f81d3024-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"efc4aa31-c1b9-4615-bca1-04c5f81d3024\") " pod="openstack/ceilometer-0" Oct 07 12:42:38 crc kubenswrapper[4854]: I1007 12:42:38.602587 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efc4aa31-c1b9-4615-bca1-04c5f81d3024-scripts\") pod \"ceilometer-0\" (UID: \"efc4aa31-c1b9-4615-bca1-04c5f81d3024\") " pod="openstack/ceilometer-0" Oct 07 12:42:38 crc kubenswrapper[4854]: I1007 12:42:38.602656 4854 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efc4aa31-c1b9-4615-bca1-04c5f81d3024-config-data\") pod \"ceilometer-0\" (UID: \"efc4aa31-c1b9-4615-bca1-04c5f81d3024\") " pod="openstack/ceilometer-0" Oct 07 12:42:38 crc kubenswrapper[4854]: I1007 12:42:38.602697 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efc4aa31-c1b9-4615-bca1-04c5f81d3024-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"efc4aa31-c1b9-4615-bca1-04c5f81d3024\") " pod="openstack/ceilometer-0" Oct 07 12:42:38 crc kubenswrapper[4854]: I1007 12:42:38.704427 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/efc4aa31-c1b9-4615-bca1-04c5f81d3024-log-httpd\") pod \"ceilometer-0\" (UID: \"efc4aa31-c1b9-4615-bca1-04c5f81d3024\") " pod="openstack/ceilometer-0" Oct 07 12:42:38 crc kubenswrapper[4854]: I1007 12:42:38.704499 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/efc4aa31-c1b9-4615-bca1-04c5f81d3024-run-httpd\") pod \"ceilometer-0\" (UID: \"efc4aa31-c1b9-4615-bca1-04c5f81d3024\") " pod="openstack/ceilometer-0" Oct 07 12:42:38 crc kubenswrapper[4854]: I1007 12:42:38.704586 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/efc4aa31-c1b9-4615-bca1-04c5f81d3024-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"efc4aa31-c1b9-4615-bca1-04c5f81d3024\") " pod="openstack/ceilometer-0" Oct 07 12:42:38 crc kubenswrapper[4854]: I1007 12:42:38.704708 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efc4aa31-c1b9-4615-bca1-04c5f81d3024-scripts\") pod \"ceilometer-0\" (UID: \"efc4aa31-c1b9-4615-bca1-04c5f81d3024\") " pod="openstack/ceilometer-0" Oct 07 12:42:38 crc kubenswrapper[4854]: I1007 12:42:38.704765 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efc4aa31-c1b9-4615-bca1-04c5f81d3024-config-data\") pod \"ceilometer-0\" (UID: \"efc4aa31-c1b9-4615-bca1-04c5f81d3024\") " pod="openstack/ceilometer-0" Oct 07 12:42:38 crc kubenswrapper[4854]: I1007 12:42:38.704807 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efc4aa31-c1b9-4615-bca1-04c5f81d3024-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"efc4aa31-c1b9-4615-bca1-04c5f81d3024\") " pod="openstack/ceilometer-0" Oct 07 12:42:38 crc kubenswrapper[4854]: I1007 12:42:38.704865 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7cdv\" (UniqueName: \"kubernetes.io/projected/efc4aa31-c1b9-4615-bca1-04c5f81d3024-kube-api-access-z7cdv\") pod \"ceilometer-0\" (UID: \"efc4aa31-c1b9-4615-bca1-04c5f81d3024\") " pod="openstack/ceilometer-0" Oct 07 12:42:38 crc kubenswrapper[4854]: I1007 12:42:38.705014 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/efc4aa31-c1b9-4615-bca1-04c5f81d3024-run-httpd\") pod \"ceilometer-0\" (UID: \"efc4aa31-c1b9-4615-bca1-04c5f81d3024\") " pod="openstack/ceilometer-0" Oct 07 12:42:38 crc kubenswrapper[4854]: I1007 12:42:38.705119 4854 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/efc4aa31-c1b9-4615-bca1-04c5f81d3024-log-httpd\") pod \"ceilometer-0\" (UID: \"efc4aa31-c1b9-4615-bca1-04c5f81d3024\") " pod="openstack/ceilometer-0" Oct 07 12:42:38 crc kubenswrapper[4854]: I1007 12:42:38.712500 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efc4aa31-c1b9-4615-bca1-04c5f81d3024-scripts\") pod \"ceilometer-0\" (UID: \"efc4aa31-c1b9-4615-bca1-04c5f81d3024\") " pod="openstack/ceilometer-0" Oct 07 12:42:38 crc kubenswrapper[4854]: I1007 12:42:38.714040 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efc4aa31-c1b9-4615-bca1-04c5f81d3024-config-data\") pod \"ceilometer-0\" (UID: \"efc4aa31-c1b9-4615-bca1-04c5f81d3024\") " pod="openstack/ceilometer-0" Oct 07 12:42:38 crc kubenswrapper[4854]: I1007 12:42:38.714442 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efc4aa31-c1b9-4615-bca1-04c5f81d3024-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"efc4aa31-c1b9-4615-bca1-04c5f81d3024\") " pod="openstack/ceilometer-0" Oct 07 12:42:38 crc kubenswrapper[4854]: I1007 12:42:38.714587 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/efc4aa31-c1b9-4615-bca1-04c5f81d3024-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"efc4aa31-c1b9-4615-bca1-04c5f81d3024\") " pod="openstack/ceilometer-0" Oct 07 12:42:38 crc kubenswrapper[4854]: I1007 12:42:38.724341 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="211f81db-7ca9-43c6-bd31-aa60758129e6" path="/var/lib/kubelet/pods/211f81db-7ca9-43c6-bd31-aa60758129e6/volumes" Oct 07 12:42:38 crc kubenswrapper[4854]: I1007 12:42:38.726778 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7cdv\" (UniqueName: \"kubernetes.io/projected/efc4aa31-c1b9-4615-bca1-04c5f81d3024-kube-api-access-z7cdv\") pod \"ceilometer-0\" (UID: \"efc4aa31-c1b9-4615-bca1-04c5f81d3024\") " pod="openstack/ceilometer-0" Oct 07 12:42:38 crc kubenswrapper[4854]: I1007 12:42:38.834367 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 12:42:39 crc kubenswrapper[4854]: I1007 12:42:39.309588 4854 generic.go:334] "Generic (PLEG): container finished" podID="74714c8f-dea6-40be-9985-d254729920c9" containerID="a426a0855633dd84dc768e5b4c01a36c6fac3eb00da8f75a86ee52fe9fa8ecd2" exitCode=0 Oct 07 12:42:39 crc kubenswrapper[4854]: I1007 12:42:39.309667 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-7zmqn" event={"ID":"74714c8f-dea6-40be-9985-d254729920c9","Type":"ContainerDied","Data":"a426a0855633dd84dc768e5b4c01a36c6fac3eb00da8f75a86ee52fe9fa8ecd2"} Oct 07 12:42:39 crc kubenswrapper[4854]: I1007 12:42:39.316338 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 12:42:40 crc kubenswrapper[4854]: I1007 12:42:40.322224 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"efc4aa31-c1b9-4615-bca1-04c5f81d3024","Type":"ContainerStarted","Data":"6f33c5e04ac8af8ed233bdc01118f9feb7f99398c45ef3115fd12dfe9b7dcf34"} Oct 07 12:42:40 crc kubenswrapper[4854]: I1007 12:42:40.324964 4854 generic.go:334] "Generic (PLEG): container finished" podID="c462d02f-dfcd-48f7-b755-fb203afcb213" containerID="86def19086aabe002c256380aee9698cfdc030d6deb57886de5b569a36e128f9" exitCode=0 Oct 07 12:42:40 crc kubenswrapper[4854]: I1007 12:42:40.325096 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-qk4ws" event={"ID":"c462d02f-dfcd-48f7-b755-fb203afcb213","Type":"ContainerDied","Data":"86def19086aabe002c256380aee9698cfdc030d6deb57886de5b569a36e128f9"} Oct 07 12:42:40 crc kubenswrapper[4854]: I1007 12:42:40.739753 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-7zmqn" Oct 07 12:42:40 crc kubenswrapper[4854]: I1007 12:42:40.743704 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/74714c8f-dea6-40be-9985-d254729920c9-db-sync-config-data\") pod \"74714c8f-dea6-40be-9985-d254729920c9\" (UID: \"74714c8f-dea6-40be-9985-d254729920c9\") " Oct 07 12:42:40 crc kubenswrapper[4854]: I1007 12:42:40.743779 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/74714c8f-dea6-40be-9985-d254729920c9-etc-machine-id\") pod \"74714c8f-dea6-40be-9985-d254729920c9\" (UID: \"74714c8f-dea6-40be-9985-d254729920c9\") " Oct 07 12:42:40 crc kubenswrapper[4854]: I1007 12:42:40.743822 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bm65\" (UniqueName: \"kubernetes.io/projected/74714c8f-dea6-40be-9985-d254729920c9-kube-api-access-5bm65\") pod \"74714c8f-dea6-40be-9985-d254729920c9\" (UID: \"74714c8f-dea6-40be-9985-d254729920c9\") " Oct 07 12:42:40 crc kubenswrapper[4854]: I1007 12:42:40.743885 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74714c8f-dea6-40be-9985-d254729920c9-scripts\") pod \"74714c8f-dea6-40be-9985-d254729920c9\" (UID: \"74714c8f-dea6-40be-9985-d254729920c9\") " Oct 07 12:42:40 crc kubenswrapper[4854]: I1007 12:42:40.744039 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74714c8f-dea6-40be-9985-d254729920c9-combined-ca-bundle\") pod \"74714c8f-dea6-40be-9985-d254729920c9\" (UID: 
\"74714c8f-dea6-40be-9985-d254729920c9\") " Oct 07 12:42:40 crc kubenswrapper[4854]: I1007 12:42:40.744068 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74714c8f-dea6-40be-9985-d254729920c9-config-data\") pod \"74714c8f-dea6-40be-9985-d254729920c9\" (UID: \"74714c8f-dea6-40be-9985-d254729920c9\") " Oct 07 12:42:40 crc kubenswrapper[4854]: I1007 12:42:40.749176 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/74714c8f-dea6-40be-9985-d254729920c9-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "74714c8f-dea6-40be-9985-d254729920c9" (UID: "74714c8f-dea6-40be-9985-d254729920c9"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 12:42:40 crc kubenswrapper[4854]: I1007 12:42:40.749457 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74714c8f-dea6-40be-9985-d254729920c9-kube-api-access-5bm65" (OuterVolumeSpecName: "kube-api-access-5bm65") pod "74714c8f-dea6-40be-9985-d254729920c9" (UID: "74714c8f-dea6-40be-9985-d254729920c9"). InnerVolumeSpecName "kube-api-access-5bm65". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:42:40 crc kubenswrapper[4854]: I1007 12:42:40.752016 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74714c8f-dea6-40be-9985-d254729920c9-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "74714c8f-dea6-40be-9985-d254729920c9" (UID: "74714c8f-dea6-40be-9985-d254729920c9"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:42:40 crc kubenswrapper[4854]: I1007 12:42:40.758029 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74714c8f-dea6-40be-9985-d254729920c9-scripts" (OuterVolumeSpecName: "scripts") pod "74714c8f-dea6-40be-9985-d254729920c9" (UID: "74714c8f-dea6-40be-9985-d254729920c9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:42:40 crc kubenswrapper[4854]: I1007 12:42:40.812765 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74714c8f-dea6-40be-9985-d254729920c9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "74714c8f-dea6-40be-9985-d254729920c9" (UID: "74714c8f-dea6-40be-9985-d254729920c9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:42:40 crc kubenswrapper[4854]: I1007 12:42:40.829536 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74714c8f-dea6-40be-9985-d254729920c9-config-data" (OuterVolumeSpecName: "config-data") pod "74714c8f-dea6-40be-9985-d254729920c9" (UID: "74714c8f-dea6-40be-9985-d254729920c9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:42:40 crc kubenswrapper[4854]: I1007 12:42:40.846254 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74714c8f-dea6-40be-9985-d254729920c9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:42:40 crc kubenswrapper[4854]: I1007 12:42:40.846287 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74714c8f-dea6-40be-9985-d254729920c9-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 12:42:40 crc kubenswrapper[4854]: I1007 12:42:40.846298 4854 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/74714c8f-dea6-40be-9985-d254729920c9-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 12:42:40 crc kubenswrapper[4854]: I1007 12:42:40.846308 4854 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/74714c8f-dea6-40be-9985-d254729920c9-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 07 12:42:40 crc kubenswrapper[4854]: I1007 12:42:40.846319 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bm65\" (UniqueName: \"kubernetes.io/projected/74714c8f-dea6-40be-9985-d254729920c9-kube-api-access-5bm65\") on node \"crc\" DevicePath \"\"" Oct 07 12:42:40 crc kubenswrapper[4854]: I1007 12:42:40.846333 4854 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74714c8f-dea6-40be-9985-d254729920c9-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 12:42:41 crc kubenswrapper[4854]: I1007 12:42:41.338633 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-7zmqn" event={"ID":"74714c8f-dea6-40be-9985-d254729920c9","Type":"ContainerDied","Data":"aa6ab021343becef4a411d5f7376aa4406dae9aa269d7bc64b7af99e09053dba"} Oct 07 12:42:41 crc kubenswrapper[4854]: I1007 12:42:41.338901 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa6ab021343becef4a411d5f7376aa4406dae9aa269d7bc64b7af99e09053dba" Oct 07 12:42:41 crc kubenswrapper[4854]: I1007 12:42:41.338676 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-7zmqn" Oct 07 12:42:41 crc kubenswrapper[4854]: I1007 12:42:41.341548 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"efc4aa31-c1b9-4615-bca1-04c5f81d3024","Type":"ContainerStarted","Data":"3ab4bbcf97db87a62ff3a0e0264374c75b015402a12f1a9cec1680d13ecfcb9d"} Oct 07 12:42:41 crc kubenswrapper[4854]: I1007 12:42:41.713619 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 12:42:41 crc kubenswrapper[4854]: E1007 12:42:41.714044 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74714c8f-dea6-40be-9985-d254729920c9" containerName="cinder-db-sync" Oct 07 12:42:41 crc kubenswrapper[4854]: I1007 12:42:41.714057 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="74714c8f-dea6-40be-9985-d254729920c9" containerName="cinder-db-sync" Oct 07 12:42:41 crc kubenswrapper[4854]: I1007 12:42:41.714310 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="74714c8f-dea6-40be-9985-d254729920c9" containerName="cinder-db-sync" Oct 07 12:42:41 crc kubenswrapper[4854]: I1007 12:42:41.715307 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 07 12:42:41 crc kubenswrapper[4854]: I1007 12:42:41.719629 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 07 12:42:41 crc kubenswrapper[4854]: I1007 12:42:41.719790 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-7hxgb" Oct 07 12:42:41 crc kubenswrapper[4854]: I1007 12:42:41.719926 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 07 12:42:41 crc kubenswrapper[4854]: I1007 12:42:41.722060 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 07 12:42:41 crc kubenswrapper[4854]: I1007 12:42:41.723732 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 12:42:41 crc kubenswrapper[4854]: I1007 12:42:41.760500 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-w4hwc"] Oct 07 12:42:41 crc kubenswrapper[4854]: I1007 12:42:41.760741 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-586bdc5f9-w4hwc" podUID="7e713bda-f49c-4d97-98c7-4ae024be86f4" containerName="dnsmasq-dns" containerID="cri-o://c2e5b1927b214e05f2b1fbfb109fc80be6a75e65f99b2c3b05086606f024b163" gracePeriod=10 Oct 07 12:42:41 crc kubenswrapper[4854]: I1007 12:42:41.769179 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-586bdc5f9-w4hwc" Oct 07 12:42:41 crc kubenswrapper[4854]: I1007 12:42:41.808126 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d4d1ac6-6080-4082-9d3b-29e7cb7a2266-scripts\") pod \"cinder-scheduler-0\" (UID: \"4d4d1ac6-6080-4082-9d3b-29e7cb7a2266\") " pod="openstack/cinder-scheduler-0" Oct 07 12:42:41 crc kubenswrapper[4854]: I1007 12:42:41.808264 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4d4d1ac6-6080-4082-9d3b-29e7cb7a2266-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"4d4d1ac6-6080-4082-9d3b-29e7cb7a2266\") " pod="openstack/cinder-scheduler-0" Oct 07 12:42:41 crc kubenswrapper[4854]: I1007 12:42:41.808286 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4d4d1ac6-6080-4082-9d3b-29e7cb7a2266-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"4d4d1ac6-6080-4082-9d3b-29e7cb7a2266\") " pod="openstack/cinder-scheduler-0" Oct 07 12:42:41 crc kubenswrapper[4854]: I1007 12:42:41.808312 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7xrx\" (UniqueName: \"kubernetes.io/projected/4d4d1ac6-6080-4082-9d3b-29e7cb7a2266-kube-api-access-d7xrx\") pod \"cinder-scheduler-0\" (UID: \"4d4d1ac6-6080-4082-9d3b-29e7cb7a2266\") " pod="openstack/cinder-scheduler-0" Oct 07 12:42:41 crc kubenswrapper[4854]: I1007 12:42:41.808337 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d4d1ac6-6080-4082-9d3b-29e7cb7a2266-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"4d4d1ac6-6080-4082-9d3b-29e7cb7a2266\") " pod="openstack/cinder-scheduler-0" Oct 07 12:42:41 crc 
kubenswrapper[4854]: I1007 12:42:41.808416 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d4d1ac6-6080-4082-9d3b-29e7cb7a2266-config-data\") pod \"cinder-scheduler-0\" (UID: \"4d4d1ac6-6080-4082-9d3b-29e7cb7a2266\") " pod="openstack/cinder-scheduler-0" Oct 07 12:42:41 crc kubenswrapper[4854]: I1007 12:42:41.821297 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-795f4db4bc-89c6h"] Oct 07 12:42:41 crc kubenswrapper[4854]: I1007 12:42:41.823124 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-795f4db4bc-89c6h" Oct 07 12:42:41 crc kubenswrapper[4854]: I1007 12:42:41.884580 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-795f4db4bc-89c6h"] Oct 07 12:42:41 crc kubenswrapper[4854]: I1007 12:42:41.885653 4854 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-586bdc5f9-w4hwc" podUID="7e713bda-f49c-4d97-98c7-4ae024be86f4" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.153:5353: connect: connection refused" Oct 07 12:42:41 crc kubenswrapper[4854]: I1007 12:42:41.905102 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 07 12:42:41 crc kubenswrapper[4854]: I1007 12:42:41.906976 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 07 12:42:41 crc kubenswrapper[4854]: I1007 12:42:41.910490 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21abefeb-fe0f-4c5b-b908-ae201a4cabdb-config\") pod \"dnsmasq-dns-795f4db4bc-89c6h\" (UID: \"21abefeb-fe0f-4c5b-b908-ae201a4cabdb\") " pod="openstack/dnsmasq-dns-795f4db4bc-89c6h" Oct 07 12:42:41 crc kubenswrapper[4854]: I1007 12:42:41.910626 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4d4d1ac6-6080-4082-9d3b-29e7cb7a2266-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"4d4d1ac6-6080-4082-9d3b-29e7cb7a2266\") " pod="openstack/cinder-scheduler-0" Oct 07 12:42:41 crc kubenswrapper[4854]: I1007 12:42:41.910706 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4d4d1ac6-6080-4082-9d3b-29e7cb7a2266-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"4d4d1ac6-6080-4082-9d3b-29e7cb7a2266\") " pod="openstack/cinder-scheduler-0" Oct 07 12:42:41 crc kubenswrapper[4854]: I1007 12:42:41.910779 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/21abefeb-fe0f-4c5b-b908-ae201a4cabdb-ovsdbserver-nb\") pod \"dnsmasq-dns-795f4db4bc-89c6h\" (UID: \"21abefeb-fe0f-4c5b-b908-ae201a4cabdb\") " pod="openstack/dnsmasq-dns-795f4db4bc-89c6h" Oct 07 12:42:41 crc kubenswrapper[4854]: I1007 12:42:41.911090 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21abefeb-fe0f-4c5b-b908-ae201a4cabdb-dns-svc\") pod \"dnsmasq-dns-795f4db4bc-89c6h\" (UID: \"21abefeb-fe0f-4c5b-b908-ae201a4cabdb\") " pod="openstack/dnsmasq-dns-795f4db4bc-89c6h" Oct 07 12:42:41 crc kubenswrapper[4854]: I1007 12:42:41.911183 4854 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-d7xrx\" (UniqueName: \"kubernetes.io/projected/4d4d1ac6-6080-4082-9d3b-29e7cb7a2266-kube-api-access-d7xrx\") pod \"cinder-scheduler-0\" (UID: \"4d4d1ac6-6080-4082-9d3b-29e7cb7a2266\") " pod="openstack/cinder-scheduler-0" Oct 07 12:42:41 crc kubenswrapper[4854]: I1007 12:42:41.911363 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d4d1ac6-6080-4082-9d3b-29e7cb7a2266-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"4d4d1ac6-6080-4082-9d3b-29e7cb7a2266\") " pod="openstack/cinder-scheduler-0" Oct 07 12:42:41 crc kubenswrapper[4854]: I1007 12:42:41.911445 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqfmj\" (UniqueName: \"kubernetes.io/projected/21abefeb-fe0f-4c5b-b908-ae201a4cabdb-kube-api-access-hqfmj\") pod \"dnsmasq-dns-795f4db4bc-89c6h\" (UID: \"21abefeb-fe0f-4c5b-b908-ae201a4cabdb\") " pod="openstack/dnsmasq-dns-795f4db4bc-89c6h" Oct 07 12:42:41 crc kubenswrapper[4854]: I1007 12:42:41.911610 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d4d1ac6-6080-4082-9d3b-29e7cb7a2266-config-data\") pod \"cinder-scheduler-0\" (UID: \"4d4d1ac6-6080-4082-9d3b-29e7cb7a2266\") " pod="openstack/cinder-scheduler-0" Oct 07 12:42:41 crc kubenswrapper[4854]: I1007 12:42:41.911698 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/21abefeb-fe0f-4c5b-b908-ae201a4cabdb-ovsdbserver-sb\") pod \"dnsmasq-dns-795f4db4bc-89c6h\" (UID: \"21abefeb-fe0f-4c5b-b908-ae201a4cabdb\") " pod="openstack/dnsmasq-dns-795f4db4bc-89c6h" Oct 07 12:42:41 crc kubenswrapper[4854]: I1007 12:42:41.912709 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d4d1ac6-6080-4082-9d3b-29e7cb7a2266-scripts\") pod \"cinder-scheduler-0\" (UID: \"4d4d1ac6-6080-4082-9d3b-29e7cb7a2266\") " pod="openstack/cinder-scheduler-0" Oct 07 12:42:41 crc kubenswrapper[4854]: I1007 12:42:41.913270 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/21abefeb-fe0f-4c5b-b908-ae201a4cabdb-dns-swift-storage-0\") pod \"dnsmasq-dns-795f4db4bc-89c6h\" (UID: \"21abefeb-fe0f-4c5b-b908-ae201a4cabdb\") " pod="openstack/dnsmasq-dns-795f4db4bc-89c6h" Oct 07 12:42:41 crc kubenswrapper[4854]: I1007 12:42:41.911259 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4d4d1ac6-6080-4082-9d3b-29e7cb7a2266-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"4d4d1ac6-6080-4082-9d3b-29e7cb7a2266\") " pod="openstack/cinder-scheduler-0" Oct 07 12:42:41 crc kubenswrapper[4854]: I1007 12:42:41.917126 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 07 12:42:41 crc kubenswrapper[4854]: I1007 12:42:41.925350 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d4d1ac6-6080-4082-9d3b-29e7cb7a2266-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"4d4d1ac6-6080-4082-9d3b-29e7cb7a2266\") " pod="openstack/cinder-scheduler-0" Oct 07 12:42:41 crc 
kubenswrapper[4854]: I1007 12:42:41.929703 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d4d1ac6-6080-4082-9d3b-29e7cb7a2266-scripts\") pod \"cinder-scheduler-0\" (UID: \"4d4d1ac6-6080-4082-9d3b-29e7cb7a2266\") " pod="openstack/cinder-scheduler-0" Oct 07 12:42:41 crc kubenswrapper[4854]: I1007 12:42:41.930705 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4d4d1ac6-6080-4082-9d3b-29e7cb7a2266-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"4d4d1ac6-6080-4082-9d3b-29e7cb7a2266\") " pod="openstack/cinder-scheduler-0" Oct 07 12:42:41 crc kubenswrapper[4854]: I1007 12:42:41.937162 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d4d1ac6-6080-4082-9d3b-29e7cb7a2266-config-data\") pod \"cinder-scheduler-0\" (UID: \"4d4d1ac6-6080-4082-9d3b-29e7cb7a2266\") " pod="openstack/cinder-scheduler-0" Oct 07 12:42:41 crc kubenswrapper[4854]: I1007 12:42:41.942688 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 07 12:42:41 crc kubenswrapper[4854]: I1007 12:42:41.958942 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7xrx\" (UniqueName: \"kubernetes.io/projected/4d4d1ac6-6080-4082-9d3b-29e7cb7a2266-kube-api-access-d7xrx\") pod \"cinder-scheduler-0\" (UID: \"4d4d1ac6-6080-4082-9d3b-29e7cb7a2266\") " pod="openstack/cinder-scheduler-0" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.024178 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21abefeb-fe0f-4c5b-b908-ae201a4cabdb-config\") pod \"dnsmasq-dns-795f4db4bc-89c6h\" (UID: \"21abefeb-fe0f-4c5b-b908-ae201a4cabdb\") " pod="openstack/dnsmasq-dns-795f4db4bc-89c6h" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.024461 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2214523f-cc36-4152-b4cd-ad6731ceecfd-config-data\") pod \"cinder-api-0\" (UID: \"2214523f-cc36-4152-b4cd-ad6731ceecfd\") " pod="openstack/cinder-api-0" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.024570 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/21abefeb-fe0f-4c5b-b908-ae201a4cabdb-ovsdbserver-nb\") pod \"dnsmasq-dns-795f4db4bc-89c6h\" (UID: \"21abefeb-fe0f-4c5b-b908-ae201a4cabdb\") " pod="openstack/dnsmasq-dns-795f4db4bc-89c6h" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.024671 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21abefeb-fe0f-4c5b-b908-ae201a4cabdb-dns-svc\") pod \"dnsmasq-dns-795f4db4bc-89c6h\" (UID: \"21abefeb-fe0f-4c5b-b908-ae201a4cabdb\") " pod="openstack/dnsmasq-dns-795f4db4bc-89c6h" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.024795 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqfmj\" (UniqueName: \"kubernetes.io/projected/21abefeb-fe0f-4c5b-b908-ae201a4cabdb-kube-api-access-hqfmj\") pod \"dnsmasq-dns-795f4db4bc-89c6h\" (UID: \"21abefeb-fe0f-4c5b-b908-ae201a4cabdb\") " pod="openstack/dnsmasq-dns-795f4db4bc-89c6h" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.025031 4854 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2214523f-cc36-4152-b4cd-ad6731ceecfd-config-data-custom\") pod \"cinder-api-0\" (UID: \"2214523f-cc36-4152-b4cd-ad6731ceecfd\") " pod="openstack/cinder-api-0" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.025138 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dmwg\" (UniqueName: \"kubernetes.io/projected/2214523f-cc36-4152-b4cd-ad6731ceecfd-kube-api-access-5dmwg\") pod \"cinder-api-0\" (UID: \"2214523f-cc36-4152-b4cd-ad6731ceecfd\") " pod="openstack/cinder-api-0" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.025393 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2214523f-cc36-4152-b4cd-ad6731ceecfd-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2214523f-cc36-4152-b4cd-ad6731ceecfd\") " pod="openstack/cinder-api-0" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.025546 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2214523f-cc36-4152-b4cd-ad6731ceecfd-logs\") pod \"cinder-api-0\" (UID: \"2214523f-cc36-4152-b4cd-ad6731ceecfd\") " pod="openstack/cinder-api-0" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.025703 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2214523f-cc36-4152-b4cd-ad6731ceecfd-scripts\") pod \"cinder-api-0\" (UID: \"2214523f-cc36-4152-b4cd-ad6731ceecfd\") " pod="openstack/cinder-api-0" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.025806 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/21abefeb-fe0f-4c5b-b908-ae201a4cabdb-ovsdbserver-sb\") pod \"dnsmasq-dns-795f4db4bc-89c6h\" (UID: \"21abefeb-fe0f-4c5b-b908-ae201a4cabdb\") " pod="openstack/dnsmasq-dns-795f4db4bc-89c6h" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.025923 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/21abefeb-fe0f-4c5b-b908-ae201a4cabdb-dns-swift-storage-0\") pod \"dnsmasq-dns-795f4db4bc-89c6h\" (UID: \"21abefeb-fe0f-4c5b-b908-ae201a4cabdb\") " pod="openstack/dnsmasq-dns-795f4db4bc-89c6h" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.026035 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/21abefeb-fe0f-4c5b-b908-ae201a4cabdb-ovsdbserver-nb\") pod \"dnsmasq-dns-795f4db4bc-89c6h\" (UID: \"21abefeb-fe0f-4c5b-b908-ae201a4cabdb\") " pod="openstack/dnsmasq-dns-795f4db4bc-89c6h" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.026037 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2214523f-cc36-4152-b4cd-ad6731ceecfd-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2214523f-cc36-4152-b4cd-ad6731ceecfd\") " pod="openstack/cinder-api-0" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.026647 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/21abefeb-fe0f-4c5b-b908-ae201a4cabdb-config\") pod \"dnsmasq-dns-795f4db4bc-89c6h\" (UID: \"21abefeb-fe0f-4c5b-b908-ae201a4cabdb\") " pod="openstack/dnsmasq-dns-795f4db4bc-89c6h" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.027139 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21abefeb-fe0f-4c5b-b908-ae201a4cabdb-dns-svc\") pod \"dnsmasq-dns-795f4db4bc-89c6h\" (UID: \"21abefeb-fe0f-4c5b-b908-ae201a4cabdb\") " pod="openstack/dnsmasq-dns-795f4db4bc-89c6h" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.027716 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/21abefeb-fe0f-4c5b-b908-ae201a4cabdb-dns-swift-storage-0\") pod \"dnsmasq-dns-795f4db4bc-89c6h\" (UID: \"21abefeb-fe0f-4c5b-b908-ae201a4cabdb\") " pod="openstack/dnsmasq-dns-795f4db4bc-89c6h" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.047336 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/21abefeb-fe0f-4c5b-b908-ae201a4cabdb-ovsdbserver-sb\") pod \"dnsmasq-dns-795f4db4bc-89c6h\" (UID: \"21abefeb-fe0f-4c5b-b908-ae201a4cabdb\") " pod="openstack/dnsmasq-dns-795f4db4bc-89c6h" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.062232 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqfmj\" (UniqueName: \"kubernetes.io/projected/21abefeb-fe0f-4c5b-b908-ae201a4cabdb-kube-api-access-hqfmj\") pod \"dnsmasq-dns-795f4db4bc-89c6h\" (UID: \"21abefeb-fe0f-4c5b-b908-ae201a4cabdb\") " pod="openstack/dnsmasq-dns-795f4db4bc-89c6h" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.072674 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.083976 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-795f4db4bc-89c6h" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.106449 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-qk4ws" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.127295 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2214523f-cc36-4152-b4cd-ad6731ceecfd-config-data\") pod \"cinder-api-0\" (UID: \"2214523f-cc36-4152-b4cd-ad6731ceecfd\") " pod="openstack/cinder-api-0" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.127368 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2214523f-cc36-4152-b4cd-ad6731ceecfd-config-data-custom\") pod \"cinder-api-0\" (UID: \"2214523f-cc36-4152-b4cd-ad6731ceecfd\") " pod="openstack/cinder-api-0" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.127398 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dmwg\" (UniqueName: \"kubernetes.io/projected/2214523f-cc36-4152-b4cd-ad6731ceecfd-kube-api-access-5dmwg\") pod \"cinder-api-0\" (UID: \"2214523f-cc36-4152-b4cd-ad6731ceecfd\") " pod="openstack/cinder-api-0" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.127414 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2214523f-cc36-4152-b4cd-ad6731ceecfd-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2214523f-cc36-4152-b4cd-ad6731ceecfd\") " pod="openstack/cinder-api-0" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.127435 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2214523f-cc36-4152-b4cd-ad6731ceecfd-logs\") pod \"cinder-api-0\" (UID: \"2214523f-cc36-4152-b4cd-ad6731ceecfd\") " pod="openstack/cinder-api-0" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.127477 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2214523f-cc36-4152-b4cd-ad6731ceecfd-scripts\") pod \"cinder-api-0\" (UID: \"2214523f-cc36-4152-b4cd-ad6731ceecfd\") " pod="openstack/cinder-api-0" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.127514 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2214523f-cc36-4152-b4cd-ad6731ceecfd-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2214523f-cc36-4152-b4cd-ad6731ceecfd\") " pod="openstack/cinder-api-0" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.127972 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2214523f-cc36-4152-b4cd-ad6731ceecfd-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2214523f-cc36-4152-b4cd-ad6731ceecfd\") " pod="openstack/cinder-api-0" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.128275 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2214523f-cc36-4152-b4cd-ad6731ceecfd-logs\") pod \"cinder-api-0\" (UID: \"2214523f-cc36-4152-b4cd-ad6731ceecfd\") " pod="openstack/cinder-api-0" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.142237 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2214523f-cc36-4152-b4cd-ad6731ceecfd-config-data\") pod \"cinder-api-0\" (UID: \"2214523f-cc36-4152-b4cd-ad6731ceecfd\") " 
pod="openstack/cinder-api-0" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.142723 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2214523f-cc36-4152-b4cd-ad6731ceecfd-config-data-custom\") pod \"cinder-api-0\" (UID: \"2214523f-cc36-4152-b4cd-ad6731ceecfd\") " pod="openstack/cinder-api-0" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.147619 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2214523f-cc36-4152-b4cd-ad6731ceecfd-scripts\") pod \"cinder-api-0\" (UID: \"2214523f-cc36-4152-b4cd-ad6731ceecfd\") " pod="openstack/cinder-api-0" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.147848 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2214523f-cc36-4152-b4cd-ad6731ceecfd-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2214523f-cc36-4152-b4cd-ad6731ceecfd\") " pod="openstack/cinder-api-0" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.154710 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dmwg\" (UniqueName: \"kubernetes.io/projected/2214523f-cc36-4152-b4cd-ad6731ceecfd-kube-api-access-5dmwg\") pod \"cinder-api-0\" (UID: \"2214523f-cc36-4152-b4cd-ad6731ceecfd\") " pod="openstack/cinder-api-0" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.233380 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7595d98994-smt7c" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.237619 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c462d02f-dfcd-48f7-b755-fb203afcb213-combined-ca-bundle\") pod \"c462d02f-dfcd-48f7-b755-fb203afcb213\" (UID: \"c462d02f-dfcd-48f7-b755-fb203afcb213\") " Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.237806 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8md7\" (UniqueName: \"kubernetes.io/projected/c462d02f-dfcd-48f7-b755-fb203afcb213-kube-api-access-d8md7\") pod \"c462d02f-dfcd-48f7-b755-fb203afcb213\" (UID: \"c462d02f-dfcd-48f7-b755-fb203afcb213\") " Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.237824 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c462d02f-dfcd-48f7-b755-fb203afcb213-config\") pod \"c462d02f-dfcd-48f7-b755-fb203afcb213\" (UID: \"c462d02f-dfcd-48f7-b755-fb203afcb213\") " Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.245340 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c462d02f-dfcd-48f7-b755-fb203afcb213-kube-api-access-d8md7" (OuterVolumeSpecName: "kube-api-access-d8md7") pod "c462d02f-dfcd-48f7-b755-fb203afcb213" (UID: "c462d02f-dfcd-48f7-b755-fb203afcb213"). InnerVolumeSpecName "kube-api-access-d8md7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.331269 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c462d02f-dfcd-48f7-b755-fb203afcb213-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c462d02f-dfcd-48f7-b755-fb203afcb213" (UID: "c462d02f-dfcd-48f7-b755-fb203afcb213"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.343052 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8md7\" (UniqueName: \"kubernetes.io/projected/c462d02f-dfcd-48f7-b755-fb203afcb213-kube-api-access-d8md7\") on node \"crc\" DevicePath \"\"" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.343320 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c462d02f-dfcd-48f7-b755-fb203afcb213-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.360364 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c462d02f-dfcd-48f7-b755-fb203afcb213-config" (OuterVolumeSpecName: "config") pod "c462d02f-dfcd-48f7-b755-fb203afcb213" (UID: "c462d02f-dfcd-48f7-b755-fb203afcb213"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.401045 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-qk4ws" event={"ID":"c462d02f-dfcd-48f7-b755-fb203afcb213","Type":"ContainerDied","Data":"c5f24997feb51a8f3b35e75eef469dc17b34d575e0f69fbf639636cde3383383"} Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.401083 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5f24997feb51a8f3b35e75eef469dc17b34d575e0f69fbf639636cde3383383" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.401159 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-qk4ws" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.417595 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.445353 4854 generic.go:334] "Generic (PLEG): container finished" podID="7e713bda-f49c-4d97-98c7-4ae024be86f4" containerID="c2e5b1927b214e05f2b1fbfb109fc80be6a75e65f99b2c3b05086606f024b163" exitCode=0 Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.445398 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586bdc5f9-w4hwc" event={"ID":"7e713bda-f49c-4d97-98c7-4ae024be86f4","Type":"ContainerDied","Data":"c2e5b1927b214e05f2b1fbfb109fc80be6a75e65f99b2c3b05086606f024b163"} Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.446264 4854 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/c462d02f-dfcd-48f7-b755-fb203afcb213-config\") on node \"crc\" DevicePath \"\"" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.645946 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-795f4db4bc-89c6h"] Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.689120 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-b8c6459d6-knb66"] Oct 07 12:42:42 crc kubenswrapper[4854]: E1007 12:42:42.689507 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c462d02f-dfcd-48f7-b755-fb203afcb213" containerName="neutron-db-sync" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.689581 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="c462d02f-dfcd-48f7-b755-fb203afcb213" containerName="neutron-db-sync" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.689786 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="c462d02f-dfcd-48f7-b755-fb203afcb213" containerName="neutron-db-sync" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.690687 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b8c6459d6-knb66" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.692884 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-45gdk" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.693029 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.693484 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.693872 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.769547 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b8c6459d6-knb66"] Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.769576 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-795f4db4bc-89c6h"] Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.769587 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-wjtgg"] Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.770909 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-wjtgg" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.792873 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-wjtgg"] Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.850225 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.858468 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1279e99a-d16c-441f-9aa8-33efa1cd351f-ovndb-tls-certs\") pod \"neutron-b8c6459d6-knb66\" (UID: \"1279e99a-d16c-441f-9aa8-33efa1cd351f\") " pod="openstack/neutron-b8c6459d6-knb66" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.858614 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1279e99a-d16c-441f-9aa8-33efa1cd351f-combined-ca-bundle\") pod \"neutron-b8c6459d6-knb66\" (UID: \"1279e99a-d16c-441f-9aa8-33efa1cd351f\") " pod="openstack/neutron-b8c6459d6-knb66" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.860499 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1279e99a-d16c-441f-9aa8-33efa1cd351f-httpd-config\") pod \"neutron-b8c6459d6-knb66\" (UID: \"1279e99a-d16c-441f-9aa8-33efa1cd351f\") " pod="openstack/neutron-b8c6459d6-knb66" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.860560 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1279e99a-d16c-441f-9aa8-33efa1cd351f-config\") pod \"neutron-b8c6459d6-knb66\" (UID: \"1279e99a-d16c-441f-9aa8-33efa1cd351f\") " pod="openstack/neutron-b8c6459d6-knb66" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.860805 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-465dq\" (UniqueName: \"kubernetes.io/projected/1279e99a-d16c-441f-9aa8-33efa1cd351f-kube-api-access-465dq\") pod \"neutron-b8c6459d6-knb66\" (UID: \"1279e99a-d16c-441f-9aa8-33efa1cd351f\") " pod="openstack/neutron-b8c6459d6-knb66" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.961976 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1279e99a-d16c-441f-9aa8-33efa1cd351f-ovndb-tls-certs\") pod \"neutron-b8c6459d6-knb66\" (UID: \"1279e99a-d16c-441f-9aa8-33efa1cd351f\") " pod="openstack/neutron-b8c6459d6-knb66" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.962015 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1279e99a-d16c-441f-9aa8-33efa1cd351f-combined-ca-bundle\") pod \"neutron-b8c6459d6-knb66\" (UID: \"1279e99a-d16c-441f-9aa8-33efa1cd351f\") " pod="openstack/neutron-b8c6459d6-knb66" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.962058 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1279e99a-d16c-441f-9aa8-33efa1cd351f-httpd-config\") pod \"neutron-b8c6459d6-knb66\" (UID: \"1279e99a-d16c-441f-9aa8-33efa1cd351f\") " pod="openstack/neutron-b8c6459d6-knb66" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 
12:42:42.962079 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3d29bd97-53f9-4154-8712-c564d07a07e0-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-wjtgg\" (UID: \"3d29bd97-53f9-4154-8712-c564d07a07e0\") " pod="openstack/dnsmasq-dns-5c9776ccc5-wjtgg" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.962099 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1279e99a-d16c-441f-9aa8-33efa1cd351f-config\") pod \"neutron-b8c6459d6-knb66\" (UID: \"1279e99a-d16c-441f-9aa8-33efa1cd351f\") " pod="openstack/neutron-b8c6459d6-knb66" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.962244 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pxtf\" (UniqueName: \"kubernetes.io/projected/3d29bd97-53f9-4154-8712-c564d07a07e0-kube-api-access-6pxtf\") pod \"dnsmasq-dns-5c9776ccc5-wjtgg\" (UID: \"3d29bd97-53f9-4154-8712-c564d07a07e0\") " pod="openstack/dnsmasq-dns-5c9776ccc5-wjtgg" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.962275 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d29bd97-53f9-4154-8712-c564d07a07e0-config\") pod \"dnsmasq-dns-5c9776ccc5-wjtgg\" (UID: \"3d29bd97-53f9-4154-8712-c564d07a07e0\") " pod="openstack/dnsmasq-dns-5c9776ccc5-wjtgg" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.962298 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-465dq\" (UniqueName: \"kubernetes.io/projected/1279e99a-d16c-441f-9aa8-33efa1cd351f-kube-api-access-465dq\") pod \"neutron-b8c6459d6-knb66\" (UID: \"1279e99a-d16c-441f-9aa8-33efa1cd351f\") " pod="openstack/neutron-b8c6459d6-knb66" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.962318 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3d29bd97-53f9-4154-8712-c564d07a07e0-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-wjtgg\" (UID: \"3d29bd97-53f9-4154-8712-c564d07a07e0\") " pod="openstack/dnsmasq-dns-5c9776ccc5-wjtgg" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.962363 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d29bd97-53f9-4154-8712-c564d07a07e0-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-wjtgg\" (UID: \"3d29bd97-53f9-4154-8712-c564d07a07e0\") " pod="openstack/dnsmasq-dns-5c9776ccc5-wjtgg" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.962381 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3d29bd97-53f9-4154-8712-c564d07a07e0-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-wjtgg\" (UID: \"3d29bd97-53f9-4154-8712-c564d07a07e0\") " pod="openstack/dnsmasq-dns-5c9776ccc5-wjtgg" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.980584 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/1279e99a-d16c-441f-9aa8-33efa1cd351f-config\") pod \"neutron-b8c6459d6-knb66\" (UID: \"1279e99a-d16c-441f-9aa8-33efa1cd351f\") " pod="openstack/neutron-b8c6459d6-knb66" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 
12:42:42.980773 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1279e99a-d16c-441f-9aa8-33efa1cd351f-httpd-config\") pod \"neutron-b8c6459d6-knb66\" (UID: \"1279e99a-d16c-441f-9aa8-33efa1cd351f\") " pod="openstack/neutron-b8c6459d6-knb66" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.982467 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1279e99a-d16c-441f-9aa8-33efa1cd351f-ovndb-tls-certs\") pod \"neutron-b8c6459d6-knb66\" (UID: \"1279e99a-d16c-441f-9aa8-33efa1cd351f\") " pod="openstack/neutron-b8c6459d6-knb66" Oct 07 12:42:42 crc kubenswrapper[4854]: I1007 12:42:42.993975 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1279e99a-d16c-441f-9aa8-33efa1cd351f-combined-ca-bundle\") pod \"neutron-b8c6459d6-knb66\" (UID: \"1279e99a-d16c-441f-9aa8-33efa1cd351f\") " pod="openstack/neutron-b8c6459d6-knb66" Oct 07 12:42:43 crc kubenswrapper[4854]: I1007 12:42:43.002586 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-465dq\" (UniqueName: \"kubernetes.io/projected/1279e99a-d16c-441f-9aa8-33efa1cd351f-kube-api-access-465dq\") pod \"neutron-b8c6459d6-knb66\" (UID: \"1279e99a-d16c-441f-9aa8-33efa1cd351f\") " pod="openstack/neutron-b8c6459d6-knb66" Oct 07 12:42:43 crc kubenswrapper[4854]: I1007 12:42:43.065124 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3d29bd97-53f9-4154-8712-c564d07a07e0-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-wjtgg\" (UID: \"3d29bd97-53f9-4154-8712-c564d07a07e0\") " pod="openstack/dnsmasq-dns-5c9776ccc5-wjtgg" Oct 07 12:42:43 crc kubenswrapper[4854]: I1007 12:42:43.065876 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pxtf\" (UniqueName: \"kubernetes.io/projected/3d29bd97-53f9-4154-8712-c564d07a07e0-kube-api-access-6pxtf\") pod \"dnsmasq-dns-5c9776ccc5-wjtgg\" (UID: \"3d29bd97-53f9-4154-8712-c564d07a07e0\") " pod="openstack/dnsmasq-dns-5c9776ccc5-wjtgg" Oct 07 12:42:43 crc kubenswrapper[4854]: I1007 12:42:43.065959 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d29bd97-53f9-4154-8712-c564d07a07e0-config\") pod \"dnsmasq-dns-5c9776ccc5-wjtgg\" (UID: \"3d29bd97-53f9-4154-8712-c564d07a07e0\") " pod="openstack/dnsmasq-dns-5c9776ccc5-wjtgg" Oct 07 12:42:43 crc kubenswrapper[4854]: I1007 12:42:43.066014 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3d29bd97-53f9-4154-8712-c564d07a07e0-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-wjtgg\" (UID: \"3d29bd97-53f9-4154-8712-c564d07a07e0\") " pod="openstack/dnsmasq-dns-5c9776ccc5-wjtgg" Oct 07 12:42:43 crc kubenswrapper[4854]: I1007 12:42:43.066106 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d29bd97-53f9-4154-8712-c564d07a07e0-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-wjtgg\" (UID: \"3d29bd97-53f9-4154-8712-c564d07a07e0\") " pod="openstack/dnsmasq-dns-5c9776ccc5-wjtgg" Oct 07 12:42:43 crc kubenswrapper[4854]: I1007 12:42:43.066315 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/3d29bd97-53f9-4154-8712-c564d07a07e0-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-wjtgg\" (UID: \"3d29bd97-53f9-4154-8712-c564d07a07e0\") " pod="openstack/dnsmasq-dns-5c9776ccc5-wjtgg" Oct 07 12:42:43 crc kubenswrapper[4854]: I1007 12:42:43.066853 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d29bd97-53f9-4154-8712-c564d07a07e0-config\") pod \"dnsmasq-dns-5c9776ccc5-wjtgg\" (UID: \"3d29bd97-53f9-4154-8712-c564d07a07e0\") " pod="openstack/dnsmasq-dns-5c9776ccc5-wjtgg" Oct 07 12:42:43 crc kubenswrapper[4854]: I1007 12:42:43.067701 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3d29bd97-53f9-4154-8712-c564d07a07e0-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-wjtgg\" (UID: \"3d29bd97-53f9-4154-8712-c564d07a07e0\") " pod="openstack/dnsmasq-dns-5c9776ccc5-wjtgg" Oct 07 12:42:43 crc kubenswrapper[4854]: I1007 12:42:43.067722 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3d29bd97-53f9-4154-8712-c564d07a07e0-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-wjtgg\" (UID: \"3d29bd97-53f9-4154-8712-c564d07a07e0\") " pod="openstack/dnsmasq-dns-5c9776ccc5-wjtgg" Oct 07 12:42:43 crc kubenswrapper[4854]: I1007 12:42:43.068518 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3d29bd97-53f9-4154-8712-c564d07a07e0-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-wjtgg\" (UID: \"3d29bd97-53f9-4154-8712-c564d07a07e0\") " pod="openstack/dnsmasq-dns-5c9776ccc5-wjtgg" Oct 07 12:42:43 crc kubenswrapper[4854]: I1007 12:42:43.070240 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d29bd97-53f9-4154-8712-c564d07a07e0-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-wjtgg\" (UID: \"3d29bd97-53f9-4154-8712-c564d07a07e0\") " pod="openstack/dnsmasq-dns-5c9776ccc5-wjtgg" Oct 07 12:42:43 crc kubenswrapper[4854]: I1007 12:42:43.079519 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b8c6459d6-knb66" Oct 07 12:42:43 crc kubenswrapper[4854]: I1007 12:42:43.091776 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pxtf\" (UniqueName: \"kubernetes.io/projected/3d29bd97-53f9-4154-8712-c564d07a07e0-kube-api-access-6pxtf\") pod \"dnsmasq-dns-5c9776ccc5-wjtgg\" (UID: \"3d29bd97-53f9-4154-8712-c564d07a07e0\") " pod="openstack/dnsmasq-dns-5c9776ccc5-wjtgg" Oct 07 12:42:43 crc kubenswrapper[4854]: I1007 12:42:43.093600 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-586bdc5f9-w4hwc" Oct 07 12:42:43 crc kubenswrapper[4854]: I1007 12:42:43.097666 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-wjtgg" Oct 07 12:42:43 crc kubenswrapper[4854]: I1007 12:42:43.272646 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e713bda-f49c-4d97-98c7-4ae024be86f4-ovsdbserver-sb\") pod \"7e713bda-f49c-4d97-98c7-4ae024be86f4\" (UID: \"7e713bda-f49c-4d97-98c7-4ae024be86f4\") " Oct 07 12:42:43 crc kubenswrapper[4854]: I1007 12:42:43.272913 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e713bda-f49c-4d97-98c7-4ae024be86f4-ovsdbserver-nb\") pod \"7e713bda-f49c-4d97-98c7-4ae024be86f4\" (UID: \"7e713bda-f49c-4d97-98c7-4ae024be86f4\") " Oct 07 12:42:43 crc kubenswrapper[4854]: I1007 12:42:43.272942 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7e713bda-f49c-4d97-98c7-4ae024be86f4-dns-swift-storage-0\") pod \"7e713bda-f49c-4d97-98c7-4ae024be86f4\" (UID: \"7e713bda-f49c-4d97-98c7-4ae024be86f4\") " Oct 07 12:42:43 crc kubenswrapper[4854]: I1007 12:42:43.273029 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4n7tb\" (UniqueName: \"kubernetes.io/projected/7e713bda-f49c-4d97-98c7-4ae024be86f4-kube-api-access-4n7tb\") pod \"7e713bda-f49c-4d97-98c7-4ae024be86f4\" (UID: \"7e713bda-f49c-4d97-98c7-4ae024be86f4\") " Oct 07 12:42:43 crc kubenswrapper[4854]: I1007 12:42:43.273068 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e713bda-f49c-4d97-98c7-4ae024be86f4-config\") pod \"7e713bda-f49c-4d97-98c7-4ae024be86f4\" (UID: \"7e713bda-f49c-4d97-98c7-4ae024be86f4\") " Oct 07 12:42:43 crc kubenswrapper[4854]: I1007 12:42:43.273362 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e713bda-f49c-4d97-98c7-4ae024be86f4-dns-svc\") pod \"7e713bda-f49c-4d97-98c7-4ae024be86f4\" (UID: \"7e713bda-f49c-4d97-98c7-4ae024be86f4\") " Oct 07 12:42:43 crc kubenswrapper[4854]: I1007 12:42:43.280414 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e713bda-f49c-4d97-98c7-4ae024be86f4-kube-api-access-4n7tb" (OuterVolumeSpecName: "kube-api-access-4n7tb") pod "7e713bda-f49c-4d97-98c7-4ae024be86f4" (UID: "7e713bda-f49c-4d97-98c7-4ae024be86f4"). InnerVolumeSpecName "kube-api-access-4n7tb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:42:43 crc kubenswrapper[4854]: I1007 12:42:43.324660 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 07 12:42:43 crc kubenswrapper[4854]: I1007 12:42:43.375118 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4n7tb\" (UniqueName: \"kubernetes.io/projected/7e713bda-f49c-4d97-98c7-4ae024be86f4-kube-api-access-4n7tb\") on node \"crc\" DevicePath \"\"" Oct 07 12:42:43 crc kubenswrapper[4854]: I1007 12:42:43.378657 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e713bda-f49c-4d97-98c7-4ae024be86f4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7e713bda-f49c-4d97-98c7-4ae024be86f4" (UID: "7e713bda-f49c-4d97-98c7-4ae024be86f4"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:42:43 crc kubenswrapper[4854]: I1007 12:42:43.477723 4854 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e713bda-f49c-4d97-98c7-4ae024be86f4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 12:42:43 crc kubenswrapper[4854]: I1007 12:42:43.497788 4854 generic.go:334] "Generic (PLEG): container finished" podID="21abefeb-fe0f-4c5b-b908-ae201a4cabdb" containerID="93aa871019615946934170062639bc63d3ce8dfb98daaff8a261f9e9809150d8" exitCode=0 Oct 07 12:42:43 crc kubenswrapper[4854]: I1007 12:42:43.497891 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-795f4db4bc-89c6h" event={"ID":"21abefeb-fe0f-4c5b-b908-ae201a4cabdb","Type":"ContainerDied","Data":"93aa871019615946934170062639bc63d3ce8dfb98daaff8a261f9e9809150d8"} Oct 07 12:42:43 crc kubenswrapper[4854]: I1007 12:42:43.497922 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-795f4db4bc-89c6h" event={"ID":"21abefeb-fe0f-4c5b-b908-ae201a4cabdb","Type":"ContainerStarted","Data":"c6bd684bb3d1c71f1fad398b17113934273c2784b068d832b2639208cbebe342"} Oct 07 12:42:43 crc kubenswrapper[4854]: I1007 12:42:43.525852 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e713bda-f49c-4d97-98c7-4ae024be86f4-config" (OuterVolumeSpecName: "config") pod "7e713bda-f49c-4d97-98c7-4ae024be86f4" (UID: "7e713bda-f49c-4d97-98c7-4ae024be86f4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:42:43 crc kubenswrapper[4854]: I1007 12:42:43.528208 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586bdc5f9-w4hwc" event={"ID":"7e713bda-f49c-4d97-98c7-4ae024be86f4","Type":"ContainerDied","Data":"371ec7b9d1f32cf275ea0f62dddd4da8ab44a0cbdae2d387ec51f71114c588db"} Oct 07 12:42:43 crc kubenswrapper[4854]: I1007 12:42:43.528268 4854 scope.go:117] "RemoveContainer" containerID="c2e5b1927b214e05f2b1fbfb109fc80be6a75e65f99b2c3b05086606f024b163" Oct 07 12:42:43 crc kubenswrapper[4854]: I1007 12:42:43.528427 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-586bdc5f9-w4hwc" Oct 07 12:42:43 crc kubenswrapper[4854]: I1007 12:42:43.552839 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e713bda-f49c-4d97-98c7-4ae024be86f4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7e713bda-f49c-4d97-98c7-4ae024be86f4" (UID: "7e713bda-f49c-4d97-98c7-4ae024be86f4"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:42:43 crc kubenswrapper[4854]: I1007 12:42:43.562819 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"efc4aa31-c1b9-4615-bca1-04c5f81d3024","Type":"ContainerStarted","Data":"dd6037f3dd57699d1eb4185e084b92eb28906c0fadfe662dc09d6034dba57d4b"} Oct 07 12:42:43 crc kubenswrapper[4854]: I1007 12:42:43.564724 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2214523f-cc36-4152-b4cd-ad6731ceecfd","Type":"ContainerStarted","Data":"e0158e5e4326c510908493c3c500819d3f97e7b2c09c2a0f184d87e0e4991116"} Oct 07 12:42:43 crc kubenswrapper[4854]: I1007 12:42:43.567570 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4d4d1ac6-6080-4082-9d3b-29e7cb7a2266","Type":"ContainerStarted","Data":"5d1e15c52508850fd4e996e3f2ffeaae496156c9c816b894a664b514a4278c2d"} Oct 07 12:42:43 crc kubenswrapper[4854]: I1007 12:42:43.568684 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e713bda-f49c-4d97-98c7-4ae024be86f4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7e713bda-f49c-4d97-98c7-4ae024be86f4" (UID: "7e713bda-f49c-4d97-98c7-4ae024be86f4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:42:43 crc kubenswrapper[4854]: I1007 12:42:43.580048 4854 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e713bda-f49c-4d97-98c7-4ae024be86f4-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 12:42:43 crc kubenswrapper[4854]: I1007 12:42:43.580089 4854 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e713bda-f49c-4d97-98c7-4ae024be86f4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 12:42:43 crc kubenswrapper[4854]: I1007 12:42:43.580102 4854 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e713bda-f49c-4d97-98c7-4ae024be86f4-config\") on node \"crc\" DevicePath \"\"" Oct 07 12:42:43 crc kubenswrapper[4854]: I1007 12:42:43.613888 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e713bda-f49c-4d97-98c7-4ae024be86f4-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7e713bda-f49c-4d97-98c7-4ae024be86f4" (UID: "7e713bda-f49c-4d97-98c7-4ae024be86f4"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:42:43 crc kubenswrapper[4854]: I1007 12:42:43.681318 4854 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7e713bda-f49c-4d97-98c7-4ae024be86f4-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 07 12:42:43 crc kubenswrapper[4854]: I1007 12:42:43.746022 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-wjtgg"] Oct 07 12:42:43 crc kubenswrapper[4854]: I1007 12:42:43.916131 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-w4hwc"] Oct 07 12:42:43 crc kubenswrapper[4854]: I1007 12:42:43.925306 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-w4hwc"] Oct 07 12:42:43 crc kubenswrapper[4854]: I1007 12:42:43.943449 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b8c6459d6-knb66"] Oct 07 12:42:44 crc kubenswrapper[4854]: I1007 12:42:44.021017 4854 scope.go:117] "RemoveContainer" containerID="8535090bf8d2c155a19528aa29438dd2f963c8e3b2b710e4f24168be011e8eba" Oct 07 12:42:44 crc kubenswrapper[4854]: I1007 12:42:44.167984 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-795f4db4bc-89c6h" Oct 07 12:42:44 crc kubenswrapper[4854]: I1007 12:42:44.303511 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/21abefeb-fe0f-4c5b-b908-ae201a4cabdb-ovsdbserver-sb\") pod \"21abefeb-fe0f-4c5b-b908-ae201a4cabdb\" (UID: \"21abefeb-fe0f-4c5b-b908-ae201a4cabdb\") " Oct 07 12:42:44 crc kubenswrapper[4854]: I1007 12:42:44.303586 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21abefeb-fe0f-4c5b-b908-ae201a4cabdb-dns-svc\") pod \"21abefeb-fe0f-4c5b-b908-ae201a4cabdb\" (UID: \"21abefeb-fe0f-4c5b-b908-ae201a4cabdb\") " Oct 07 12:42:44 crc kubenswrapper[4854]: I1007 12:42:44.303653 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/21abefeb-fe0f-4c5b-b908-ae201a4cabdb-ovsdbserver-nb\") pod \"21abefeb-fe0f-4c5b-b908-ae201a4cabdb\" (UID: \"21abefeb-fe0f-4c5b-b908-ae201a4cabdb\") " Oct 07 12:42:44 crc kubenswrapper[4854]: I1007 12:42:44.303694 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqfmj\" (UniqueName: \"kubernetes.io/projected/21abefeb-fe0f-4c5b-b908-ae201a4cabdb-kube-api-access-hqfmj\") pod \"21abefeb-fe0f-4c5b-b908-ae201a4cabdb\" (UID: \"21abefeb-fe0f-4c5b-b908-ae201a4cabdb\") " Oct 07 12:42:44 crc kubenswrapper[4854]: I1007 12:42:44.303780 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/21abefeb-fe0f-4c5b-b908-ae201a4cabdb-dns-swift-storage-0\") pod \"21abefeb-fe0f-4c5b-b908-ae201a4cabdb\" (UID: \"21abefeb-fe0f-4c5b-b908-ae201a4cabdb\") " Oct 07 12:42:44 crc kubenswrapper[4854]: I1007 12:42:44.303859 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21abefeb-fe0f-4c5b-b908-ae201a4cabdb-config\") pod \"21abefeb-fe0f-4c5b-b908-ae201a4cabdb\" (UID: \"21abefeb-fe0f-4c5b-b908-ae201a4cabdb\") " Oct 07 12:42:44 crc kubenswrapper[4854]: I1007 12:42:44.326382 4854 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21abefeb-fe0f-4c5b-b908-ae201a4cabdb-kube-api-access-hqfmj" (OuterVolumeSpecName: "kube-api-access-hqfmj") pod "21abefeb-fe0f-4c5b-b908-ae201a4cabdb" (UID: "21abefeb-fe0f-4c5b-b908-ae201a4cabdb"). InnerVolumeSpecName "kube-api-access-hqfmj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:42:44 crc kubenswrapper[4854]: I1007 12:42:44.359453 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21abefeb-fe0f-4c5b-b908-ae201a4cabdb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "21abefeb-fe0f-4c5b-b908-ae201a4cabdb" (UID: "21abefeb-fe0f-4c5b-b908-ae201a4cabdb"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:42:44 crc kubenswrapper[4854]: I1007 12:42:44.361703 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21abefeb-fe0f-4c5b-b908-ae201a4cabdb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "21abefeb-fe0f-4c5b-b908-ae201a4cabdb" (UID: "21abefeb-fe0f-4c5b-b908-ae201a4cabdb"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:42:44 crc kubenswrapper[4854]: I1007 12:42:44.383647 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21abefeb-fe0f-4c5b-b908-ae201a4cabdb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "21abefeb-fe0f-4c5b-b908-ae201a4cabdb" (UID: "21abefeb-fe0f-4c5b-b908-ae201a4cabdb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:42:44 crc kubenswrapper[4854]: I1007 12:42:44.396049 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21abefeb-fe0f-4c5b-b908-ae201a4cabdb-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "21abefeb-fe0f-4c5b-b908-ae201a4cabdb" (UID: "21abefeb-fe0f-4c5b-b908-ae201a4cabdb"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:42:44 crc kubenswrapper[4854]: I1007 12:42:44.396319 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21abefeb-fe0f-4c5b-b908-ae201a4cabdb-config" (OuterVolumeSpecName: "config") pod "21abefeb-fe0f-4c5b-b908-ae201a4cabdb" (UID: "21abefeb-fe0f-4c5b-b908-ae201a4cabdb"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:42:44 crc kubenswrapper[4854]: I1007 12:42:44.416720 4854 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/21abefeb-fe0f-4c5b-b908-ae201a4cabdb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 12:42:44 crc kubenswrapper[4854]: I1007 12:42:44.416751 4854 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21abefeb-fe0f-4c5b-b908-ae201a4cabdb-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 12:42:44 crc kubenswrapper[4854]: I1007 12:42:44.416762 4854 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/21abefeb-fe0f-4c5b-b908-ae201a4cabdb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 12:42:44 crc kubenswrapper[4854]: I1007 12:42:44.416773 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hqfmj\" (UniqueName: \"kubernetes.io/projected/21abefeb-fe0f-4c5b-b908-ae201a4cabdb-kube-api-access-hqfmj\") on node \"crc\" DevicePath \"\"" Oct 07 12:42:44 crc kubenswrapper[4854]: I1007 12:42:44.416785 4854 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/21abefeb-fe0f-4c5b-b908-ae201a4cabdb-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 07 12:42:44 crc kubenswrapper[4854]: I1007 12:42:44.416793 4854 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21abefeb-fe0f-4c5b-b908-ae201a4cabdb-config\") on node \"crc\" DevicePath \"\"" Oct 07 12:42:44 crc kubenswrapper[4854]: I1007 12:42:44.565498 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6fbf67c4f8-wmrbx" Oct 07 12:42:44 crc kubenswrapper[4854]: I1007 12:42:44.621515 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7595d98994-smt7c" Oct 07 12:42:44 crc kubenswrapper[4854]: I1007 12:42:44.631193 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"efc4aa31-c1b9-4615-bca1-04c5f81d3024","Type":"ContainerStarted","Data":"fc76adf34e01c5b363c966a85cb79552f2e821e0a6ad3c35ba4df05b7ba60bec"} Oct 07 12:42:44 crc kubenswrapper[4854]: I1007 12:42:44.633003 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-wjtgg" event={"ID":"3d29bd97-53f9-4154-8712-c564d07a07e0","Type":"ContainerStarted","Data":"e7f09dfd7d37a98d0b037c431da83d6af53cdfed5c21d6ee01524c266f4fe0f0"} Oct 07 12:42:44 crc kubenswrapper[4854]: I1007 12:42:44.692083 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b8c6459d6-knb66" event={"ID":"1279e99a-d16c-441f-9aa8-33efa1cd351f","Type":"ContainerStarted","Data":"6c0a474801b8f493e4770e6082eaf3f4b418df172e291fd80d1aee9313faf60c"} Oct 07 12:42:44 crc kubenswrapper[4854]: I1007 12:42:44.692115 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b8c6459d6-knb66" event={"ID":"1279e99a-d16c-441f-9aa8-33efa1cd351f","Type":"ContainerStarted","Data":"8b9cf9f1e5f913a2b0d2a38d6340ce6fb1474fa2f985aa231489e03e27ec7e48"} Oct 07 12:42:44 crc kubenswrapper[4854]: I1007 12:42:44.840805 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-795f4db4bc-89c6h" Oct 07 12:42:44 crc kubenswrapper[4854]: I1007 12:42:44.843419 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e713bda-f49c-4d97-98c7-4ae024be86f4" path="/var/lib/kubelet/pods/7e713bda-f49c-4d97-98c7-4ae024be86f4/volumes" Oct 07 12:42:44 crc kubenswrapper[4854]: I1007 12:42:44.844346 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-795f4db4bc-89c6h" event={"ID":"21abefeb-fe0f-4c5b-b908-ae201a4cabdb","Type":"ContainerDied","Data":"c6bd684bb3d1c71f1fad398b17113934273c2784b068d832b2639208cbebe342"} Oct 07 12:42:44 crc kubenswrapper[4854]: I1007 12:42:44.844372 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6fbf67c4f8-wmrbx"] Oct 07 12:42:44 crc kubenswrapper[4854]: I1007 12:42:44.844478 4854 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 07 12:42:44 crc kubenswrapper[4854]: I1007 12:42:44.844620 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6fbf67c4f8-wmrbx" podUID="5673e957-d032-4112-b620-9f255b01d0d9" containerName="barbican-api-log" containerID="cri-o://8138d4b4267015e79d13138c0c538582f0f03d240febe0ced67849accae5eda1" gracePeriod=30 Oct 07 12:42:44 crc kubenswrapper[4854]: I1007 12:42:44.845786 4854 scope.go:117] "RemoveContainer" containerID="93aa871019615946934170062639bc63d3ce8dfb98daaff8a261f9e9809150d8" Oct 07 12:42:44 crc kubenswrapper[4854]: I1007 12:42:44.845653 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6fbf67c4f8-wmrbx" podUID="5673e957-d032-4112-b620-9f255b01d0d9" containerName="barbican-api" containerID="cri-o://4693cbd94ea9abb6515ae8732113c7cf02b75e3ac39ffd080462cbb8f082a46e" gracePeriod=30 Oct 07 12:42:44 crc kubenswrapper[4854]: I1007 12:42:44.857358 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 07 12:42:45 crc kubenswrapper[4854]: I1007 12:42:44.888440 4854 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6fbf67c4f8-wmrbx" podUID="5673e957-d032-4112-b620-9f255b01d0d9" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.154:9311/healthcheck\": EOF" Oct 07 12:42:45 crc kubenswrapper[4854]: I1007 12:42:44.888687 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-6fbf67c4f8-wmrbx" podUID="5673e957-d032-4112-b620-9f255b01d0d9" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.154:9311/healthcheck\": EOF" Oct 07 12:42:45 crc kubenswrapper[4854]: I1007 12:42:44.889776 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-6fbf67c4f8-wmrbx" podUID="5673e957-d032-4112-b620-9f255b01d0d9" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.154:9311/healthcheck\": EOF" Oct 07 12:42:45 crc kubenswrapper[4854]: I1007 12:42:45.331141 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-795f4db4bc-89c6h"] Oct 07 12:42:45 crc kubenswrapper[4854]: I1007 12:42:45.345204 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-795f4db4bc-89c6h"] Oct 07 12:42:45 crc kubenswrapper[4854]: I1007 12:42:45.787529 4854 generic.go:334] "Generic (PLEG): container finished" podID="5673e957-d032-4112-b620-9f255b01d0d9" containerID="8138d4b4267015e79d13138c0c538582f0f03d240febe0ced67849accae5eda1" exitCode=143 Oct 07 
12:42:45 crc kubenswrapper[4854]: I1007 12:42:45.787799 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6fbf67c4f8-wmrbx" event={"ID":"5673e957-d032-4112-b620-9f255b01d0d9","Type":"ContainerDied","Data":"8138d4b4267015e79d13138c0c538582f0f03d240febe0ced67849accae5eda1"} Oct 07 12:42:45 crc kubenswrapper[4854]: I1007 12:42:45.808386 4854 generic.go:334] "Generic (PLEG): container finished" podID="3d29bd97-53f9-4154-8712-c564d07a07e0" containerID="13f9096214a06dfbf1d19f339d4fca37c68814b59579cab2f9d96d047bd885fa" exitCode=0 Oct 07 12:42:45 crc kubenswrapper[4854]: I1007 12:42:45.808525 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-wjtgg" event={"ID":"3d29bd97-53f9-4154-8712-c564d07a07e0","Type":"ContainerDied","Data":"13f9096214a06dfbf1d19f339d4fca37c68814b59579cab2f9d96d047bd885fa"} Oct 07 12:42:45 crc kubenswrapper[4854]: I1007 12:42:45.821426 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4d4d1ac6-6080-4082-9d3b-29e7cb7a2266","Type":"ContainerStarted","Data":"dbd10ccec47843a1d93b17b20b1643e188945b8548aedfb57de0e60b81354838"} Oct 07 12:42:45 crc kubenswrapper[4854]: I1007 12:42:45.824479 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b8c6459d6-knb66" event={"ID":"1279e99a-d16c-441f-9aa8-33efa1cd351f","Type":"ContainerStarted","Data":"b6914afc6e91abbd192b372434d91fd604cfb5c5e273f64244c294561b68e139"} Oct 07 12:42:45 crc kubenswrapper[4854]: I1007 12:42:45.825218 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-b8c6459d6-knb66" Oct 07 12:42:45 crc kubenswrapper[4854]: I1007 12:42:45.847780 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2214523f-cc36-4152-b4cd-ad6731ceecfd","Type":"ContainerStarted","Data":"20c68f9823cef77d64816fddd3e11baddbaf7a317ee56f3ee4aa7ff0e639e1d3"} Oct 07 12:42:45 crc kubenswrapper[4854]: I1007 12:42:45.880898 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-b8c6459d6-knb66" podStartSLOduration=3.880880136 podStartE2EDuration="3.880880136s" podCreationTimestamp="2025-10-07 12:42:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:42:45.866934501 +0000 UTC m=+1081.854766756" watchObservedRunningTime="2025-10-07 12:42:45.880880136 +0000 UTC m=+1081.868712381" Oct 07 12:42:46 crc kubenswrapper[4854]: I1007 12:42:46.716734 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21abefeb-fe0f-4c5b-b908-ae201a4cabdb" path="/var/lib/kubelet/pods/21abefeb-fe0f-4c5b-b908-ae201a4cabdb/volumes" Oct 07 12:42:46 crc kubenswrapper[4854]: I1007 12:42:46.857065 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"efc4aa31-c1b9-4615-bca1-04c5f81d3024","Type":"ContainerStarted","Data":"0e27fa684f81118b3d69ffb7ca8797ef87e5550ba223a5f3b435edcb3fcbd78f"} Oct 07 12:42:46 crc kubenswrapper[4854]: I1007 12:42:46.858473 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 07 12:42:46 crc kubenswrapper[4854]: I1007 12:42:46.860582 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2214523f-cc36-4152-b4cd-ad6731ceecfd","Type":"ContainerStarted","Data":"b71e0bf0e727975ce30a7566fee4ff385dc97177def1d35ad77b0e3c1a53219c"} Oct 07 12:42:46 crc 
kubenswrapper[4854]: I1007 12:42:46.861018 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="2214523f-cc36-4152-b4cd-ad6731ceecfd" containerName="cinder-api-log" containerID="cri-o://20c68f9823cef77d64816fddd3e11baddbaf7a317ee56f3ee4aa7ff0e639e1d3" gracePeriod=30 Oct 07 12:42:46 crc kubenswrapper[4854]: I1007 12:42:46.861341 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="2214523f-cc36-4152-b4cd-ad6731ceecfd" containerName="cinder-api" containerID="cri-o://b71e0bf0e727975ce30a7566fee4ff385dc97177def1d35ad77b0e3c1a53219c" gracePeriod=30 Oct 07 12:42:46 crc kubenswrapper[4854]: I1007 12:42:46.861362 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 07 12:42:46 crc kubenswrapper[4854]: I1007 12:42:46.875766 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-wjtgg" event={"ID":"3d29bd97-53f9-4154-8712-c564d07a07e0","Type":"ContainerStarted","Data":"6011f6e5c82aac945460eaf9b07dd392036697ba6d3a9396036ef971f5d74aa5"} Oct 07 12:42:46 crc kubenswrapper[4854]: I1007 12:42:46.875923 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-wjtgg" Oct 07 12:42:46 crc kubenswrapper[4854]: I1007 12:42:46.887083 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4d4d1ac6-6080-4082-9d3b-29e7cb7a2266","Type":"ContainerStarted","Data":"537ec4107d7a3ada8e4108ccb53c402db4fd57c429c13e6e8127aef6f3fedc65"} Oct 07 12:42:46 crc kubenswrapper[4854]: I1007 12:42:46.952926 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.413377675 podStartE2EDuration="8.952905546s" podCreationTimestamp="2025-10-07 12:42:38 +0000 UTC" firstStartedPulling="2025-10-07 12:42:39.314458773 +0000 UTC m=+1075.302291028" lastFinishedPulling="2025-10-07 12:42:45.853986644 +0000 UTC m=+1081.841818899" observedRunningTime="2025-10-07 12:42:46.940259228 +0000 UTC m=+1082.928091483" watchObservedRunningTime="2025-10-07 12:42:46.952905546 +0000 UTC m=+1082.940737801" Oct 07 12:42:47 crc kubenswrapper[4854]: I1007 12:42:47.030804 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=6.030787632 podStartE2EDuration="6.030787632s" podCreationTimestamp="2025-10-07 12:42:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:42:47.005135576 +0000 UTC m=+1082.992967831" watchObservedRunningTime="2025-10-07 12:42:47.030787632 +0000 UTC m=+1083.018619887" Oct 07 12:42:47 crc kubenswrapper[4854]: I1007 12:42:47.036093 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-wjtgg" podStartSLOduration=5.036077266 podStartE2EDuration="5.036077266s" podCreationTimestamp="2025-10-07 12:42:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:42:47.033404858 +0000 UTC m=+1083.021237113" watchObservedRunningTime="2025-10-07 12:42:47.036077266 +0000 UTC m=+1083.023909521" Oct 07 12:42:47 crc kubenswrapper[4854]: I1007 12:42:47.061923 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" 
podStartSLOduration=4.764378957 podStartE2EDuration="6.061907697s" podCreationTimestamp="2025-10-07 12:42:41 +0000 UTC" firstStartedPulling="2025-10-07 12:42:42.879721721 +0000 UTC m=+1078.867553976" lastFinishedPulling="2025-10-07 12:42:44.177250461 +0000 UTC m=+1080.165082716" observedRunningTime="2025-10-07 12:42:47.058329743 +0000 UTC m=+1083.046161998" watchObservedRunningTime="2025-10-07 12:42:47.061907697 +0000 UTC m=+1083.049739952" Oct 07 12:42:47 crc kubenswrapper[4854]: I1007 12:42:47.073238 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 07 12:42:47 crc kubenswrapper[4854]: I1007 12:42:47.902925 4854 generic.go:334] "Generic (PLEG): container finished" podID="2214523f-cc36-4152-b4cd-ad6731ceecfd" containerID="20c68f9823cef77d64816fddd3e11baddbaf7a317ee56f3ee4aa7ff0e639e1d3" exitCode=143 Oct 07 12:42:47 crc kubenswrapper[4854]: I1007 12:42:47.903833 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2214523f-cc36-4152-b4cd-ad6731ceecfd","Type":"ContainerDied","Data":"20c68f9823cef77d64816fddd3e11baddbaf7a317ee56f3ee4aa7ff0e639e1d3"} Oct 07 12:42:48 crc kubenswrapper[4854]: I1007 12:42:48.342684 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7445f79585-rckdn"] Oct 07 12:42:48 crc kubenswrapper[4854]: E1007 12:42:48.343083 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21abefeb-fe0f-4c5b-b908-ae201a4cabdb" containerName="init" Oct 07 12:42:48 crc kubenswrapper[4854]: I1007 12:42:48.343099 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="21abefeb-fe0f-4c5b-b908-ae201a4cabdb" containerName="init" Oct 07 12:42:48 crc kubenswrapper[4854]: E1007 12:42:48.343111 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e713bda-f49c-4d97-98c7-4ae024be86f4" containerName="init" Oct 07 12:42:48 crc kubenswrapper[4854]: I1007 12:42:48.343117 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e713bda-f49c-4d97-98c7-4ae024be86f4" containerName="init" Oct 07 12:42:48 crc kubenswrapper[4854]: E1007 12:42:48.343130 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e713bda-f49c-4d97-98c7-4ae024be86f4" containerName="dnsmasq-dns" Oct 07 12:42:48 crc kubenswrapper[4854]: I1007 12:42:48.343137 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e713bda-f49c-4d97-98c7-4ae024be86f4" containerName="dnsmasq-dns" Oct 07 12:42:48 crc kubenswrapper[4854]: I1007 12:42:48.343346 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="21abefeb-fe0f-4c5b-b908-ae201a4cabdb" containerName="init" Oct 07 12:42:48 crc kubenswrapper[4854]: I1007 12:42:48.343370 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e713bda-f49c-4d97-98c7-4ae024be86f4" containerName="dnsmasq-dns" Oct 07 12:42:48 crc kubenswrapper[4854]: I1007 12:42:48.344304 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7445f79585-rckdn" Oct 07 12:42:48 crc kubenswrapper[4854]: I1007 12:42:48.346557 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Oct 07 12:42:48 crc kubenswrapper[4854]: I1007 12:42:48.347078 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Oct 07 12:42:48 crc kubenswrapper[4854]: I1007 12:42:48.356269 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7445f79585-rckdn"] Oct 07 12:42:48 crc kubenswrapper[4854]: I1007 12:42:48.449804 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c2c26a76-531a-4a6b-ac0f-6aa23680f903-config\") pod \"neutron-7445f79585-rckdn\" (UID: \"c2c26a76-531a-4a6b-ac0f-6aa23680f903\") " pod="openstack/neutron-7445f79585-rckdn" Oct 07 12:42:48 crc kubenswrapper[4854]: I1007 12:42:48.449902 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6226q\" (UniqueName: \"kubernetes.io/projected/c2c26a76-531a-4a6b-ac0f-6aa23680f903-kube-api-access-6226q\") pod \"neutron-7445f79585-rckdn\" (UID: \"c2c26a76-531a-4a6b-ac0f-6aa23680f903\") " pod="openstack/neutron-7445f79585-rckdn" Oct 07 12:42:48 crc kubenswrapper[4854]: I1007 12:42:48.449958 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2c26a76-531a-4a6b-ac0f-6aa23680f903-public-tls-certs\") pod \"neutron-7445f79585-rckdn\" (UID: \"c2c26a76-531a-4a6b-ac0f-6aa23680f903\") " pod="openstack/neutron-7445f79585-rckdn" Oct 07 12:42:48 crc kubenswrapper[4854]: I1007 12:42:48.450123 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2c26a76-531a-4a6b-ac0f-6aa23680f903-ovndb-tls-certs\") pod \"neutron-7445f79585-rckdn\" (UID: \"c2c26a76-531a-4a6b-ac0f-6aa23680f903\") " pod="openstack/neutron-7445f79585-rckdn" Oct 07 12:42:48 crc kubenswrapper[4854]: I1007 12:42:48.450254 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2c26a76-531a-4a6b-ac0f-6aa23680f903-internal-tls-certs\") pod \"neutron-7445f79585-rckdn\" (UID: \"c2c26a76-531a-4a6b-ac0f-6aa23680f903\") " pod="openstack/neutron-7445f79585-rckdn" Oct 07 12:42:48 crc kubenswrapper[4854]: I1007 12:42:48.450324 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c2c26a76-531a-4a6b-ac0f-6aa23680f903-httpd-config\") pod \"neutron-7445f79585-rckdn\" (UID: \"c2c26a76-531a-4a6b-ac0f-6aa23680f903\") " pod="openstack/neutron-7445f79585-rckdn" Oct 07 12:42:48 crc kubenswrapper[4854]: I1007 12:42:48.450424 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2c26a76-531a-4a6b-ac0f-6aa23680f903-combined-ca-bundle\") pod \"neutron-7445f79585-rckdn\" (UID: \"c2c26a76-531a-4a6b-ac0f-6aa23680f903\") " pod="openstack/neutron-7445f79585-rckdn" Oct 07 12:42:48 crc kubenswrapper[4854]: I1007 12:42:48.551996 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c2c26a76-531a-4a6b-ac0f-6aa23680f903-ovndb-tls-certs\") pod \"neutron-7445f79585-rckdn\" (UID: \"c2c26a76-531a-4a6b-ac0f-6aa23680f903\") " pod="openstack/neutron-7445f79585-rckdn" Oct 07 12:42:48 crc kubenswrapper[4854]: I1007 12:42:48.552058 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2c26a76-531a-4a6b-ac0f-6aa23680f903-internal-tls-certs\") pod \"neutron-7445f79585-rckdn\" (UID: \"c2c26a76-531a-4a6b-ac0f-6aa23680f903\") " pod="openstack/neutron-7445f79585-rckdn" Oct 07 12:42:48 crc kubenswrapper[4854]: I1007 12:42:48.552087 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c2c26a76-531a-4a6b-ac0f-6aa23680f903-httpd-config\") pod \"neutron-7445f79585-rckdn\" (UID: \"c2c26a76-531a-4a6b-ac0f-6aa23680f903\") " pod="openstack/neutron-7445f79585-rckdn" Oct 07 12:42:48 crc kubenswrapper[4854]: I1007 12:42:48.552132 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2c26a76-531a-4a6b-ac0f-6aa23680f903-combined-ca-bundle\") pod \"neutron-7445f79585-rckdn\" (UID: \"c2c26a76-531a-4a6b-ac0f-6aa23680f903\") " pod="openstack/neutron-7445f79585-rckdn" Oct 07 12:42:48 crc kubenswrapper[4854]: I1007 12:42:48.552173 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c2c26a76-531a-4a6b-ac0f-6aa23680f903-config\") pod \"neutron-7445f79585-rckdn\" (UID: \"c2c26a76-531a-4a6b-ac0f-6aa23680f903\") " pod="openstack/neutron-7445f79585-rckdn" Oct 07 12:42:48 crc kubenswrapper[4854]: I1007 12:42:48.552214 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6226q\" (UniqueName: \"kubernetes.io/projected/c2c26a76-531a-4a6b-ac0f-6aa23680f903-kube-api-access-6226q\") pod \"neutron-7445f79585-rckdn\" (UID: \"c2c26a76-531a-4a6b-ac0f-6aa23680f903\") " pod="openstack/neutron-7445f79585-rckdn" Oct 07 12:42:48 crc kubenswrapper[4854]: I1007 12:42:48.552261 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2c26a76-531a-4a6b-ac0f-6aa23680f903-public-tls-certs\") pod \"neutron-7445f79585-rckdn\" (UID: \"c2c26a76-531a-4a6b-ac0f-6aa23680f903\") " pod="openstack/neutron-7445f79585-rckdn" Oct 07 12:42:48 crc kubenswrapper[4854]: I1007 12:42:48.559604 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2c26a76-531a-4a6b-ac0f-6aa23680f903-internal-tls-certs\") pod \"neutron-7445f79585-rckdn\" (UID: \"c2c26a76-531a-4a6b-ac0f-6aa23680f903\") " pod="openstack/neutron-7445f79585-rckdn" Oct 07 12:42:48 crc kubenswrapper[4854]: I1007 12:42:48.561386 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2c26a76-531a-4a6b-ac0f-6aa23680f903-public-tls-certs\") pod \"neutron-7445f79585-rckdn\" (UID: \"c2c26a76-531a-4a6b-ac0f-6aa23680f903\") " pod="openstack/neutron-7445f79585-rckdn" Oct 07 12:42:48 crc kubenswrapper[4854]: I1007 12:42:48.561751 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2c26a76-531a-4a6b-ac0f-6aa23680f903-ovndb-tls-certs\") pod \"neutron-7445f79585-rckdn\" (UID: 
\"c2c26a76-531a-4a6b-ac0f-6aa23680f903\") " pod="openstack/neutron-7445f79585-rckdn" Oct 07 12:42:48 crc kubenswrapper[4854]: I1007 12:42:48.562009 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2c26a76-531a-4a6b-ac0f-6aa23680f903-combined-ca-bundle\") pod \"neutron-7445f79585-rckdn\" (UID: \"c2c26a76-531a-4a6b-ac0f-6aa23680f903\") " pod="openstack/neutron-7445f79585-rckdn" Oct 07 12:42:48 crc kubenswrapper[4854]: I1007 12:42:48.563340 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c2c26a76-531a-4a6b-ac0f-6aa23680f903-config\") pod \"neutron-7445f79585-rckdn\" (UID: \"c2c26a76-531a-4a6b-ac0f-6aa23680f903\") " pod="openstack/neutron-7445f79585-rckdn" Oct 07 12:42:48 crc kubenswrapper[4854]: I1007 12:42:48.563496 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c2c26a76-531a-4a6b-ac0f-6aa23680f903-httpd-config\") pod \"neutron-7445f79585-rckdn\" (UID: \"c2c26a76-531a-4a6b-ac0f-6aa23680f903\") " pod="openstack/neutron-7445f79585-rckdn" Oct 07 12:42:48 crc kubenswrapper[4854]: I1007 12:42:48.573556 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6226q\" (UniqueName: \"kubernetes.io/projected/c2c26a76-531a-4a6b-ac0f-6aa23680f903-kube-api-access-6226q\") pod \"neutron-7445f79585-rckdn\" (UID: \"c2c26a76-531a-4a6b-ac0f-6aa23680f903\") " pod="openstack/neutron-7445f79585-rckdn" Oct 07 12:42:48 crc kubenswrapper[4854]: I1007 12:42:48.663988 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7445f79585-rckdn" Oct 07 12:42:49 crc kubenswrapper[4854]: I1007 12:42:49.301550 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7445f79585-rckdn"] Oct 07 12:42:49 crc kubenswrapper[4854]: I1007 12:42:49.930440 4854 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6fbf67c4f8-wmrbx" podUID="5673e957-d032-4112-b620-9f255b01d0d9" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.154:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 07 12:42:49 crc kubenswrapper[4854]: I1007 12:42:49.932015 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7445f79585-rckdn" event={"ID":"c2c26a76-531a-4a6b-ac0f-6aa23680f903","Type":"ContainerStarted","Data":"4442c10882343c29c4d45b4876f4c5029aa8758304578d81129d781083582260"} Oct 07 12:42:49 crc kubenswrapper[4854]: I1007 12:42:49.932049 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7445f79585-rckdn" event={"ID":"c2c26a76-531a-4a6b-ac0f-6aa23680f903","Type":"ContainerStarted","Data":"82adcb574899325e5c5ccb3b6bb13576d4a4618eac34de03953fb2c28cd68412"} Oct 07 12:42:49 crc kubenswrapper[4854]: I1007 12:42:49.932075 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7445f79585-rckdn" event={"ID":"c2c26a76-531a-4a6b-ac0f-6aa23680f903","Type":"ContainerStarted","Data":"30c8c40f65d07566ebbf2ad35ec8aa1ad52ea6bd2d2446546bd98b3e799c22e1"} Oct 07 12:42:49 crc kubenswrapper[4854]: I1007 12:42:49.932197 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7445f79585-rckdn" Oct 07 12:42:49 crc kubenswrapper[4854]: I1007 12:42:49.956822 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/neutron-7445f79585-rckdn" podStartSLOduration=1.956803871 podStartE2EDuration="1.956803871s" podCreationTimestamp="2025-10-07 12:42:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:42:49.954304588 +0000 UTC m=+1085.942136843" watchObservedRunningTime="2025-10-07 12:42:49.956803871 +0000 UTC m=+1085.944636126" Oct 07 12:42:50 crc kubenswrapper[4854]: I1007 12:42:50.282944 4854 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6fbf67c4f8-wmrbx" podUID="5673e957-d032-4112-b620-9f255b01d0d9" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.154:9311/healthcheck\": read tcp 10.217.0.2:36144->10.217.0.154:9311: read: connection reset by peer" Oct 07 12:42:50 crc kubenswrapper[4854]: I1007 12:42:50.283562 4854 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6fbf67c4f8-wmrbx" podUID="5673e957-d032-4112-b620-9f255b01d0d9" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.154:9311/healthcheck\": read tcp 10.217.0.2:45596->10.217.0.154:9311: read: connection reset by peer" Oct 07 12:42:50 crc kubenswrapper[4854]: I1007 12:42:50.772294 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6fbf67c4f8-wmrbx" Oct 07 12:42:50 crc kubenswrapper[4854]: I1007 12:42:50.795509 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5673e957-d032-4112-b620-9f255b01d0d9-combined-ca-bundle\") pod \"5673e957-d032-4112-b620-9f255b01d0d9\" (UID: \"5673e957-d032-4112-b620-9f255b01d0d9\") " Oct 07 12:42:50 crc kubenswrapper[4854]: I1007 12:42:50.795899 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5673e957-d032-4112-b620-9f255b01d0d9-config-data\") pod \"5673e957-d032-4112-b620-9f255b01d0d9\" (UID: \"5673e957-d032-4112-b620-9f255b01d0d9\") " Oct 07 12:42:50 crc kubenswrapper[4854]: I1007 12:42:50.796117 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vcvs\" (UniqueName: \"kubernetes.io/projected/5673e957-d032-4112-b620-9f255b01d0d9-kube-api-access-2vcvs\") pod \"5673e957-d032-4112-b620-9f255b01d0d9\" (UID: \"5673e957-d032-4112-b620-9f255b01d0d9\") " Oct 07 12:42:50 crc kubenswrapper[4854]: I1007 12:42:50.796323 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5673e957-d032-4112-b620-9f255b01d0d9-config-data-custom\") pod \"5673e957-d032-4112-b620-9f255b01d0d9\" (UID: \"5673e957-d032-4112-b620-9f255b01d0d9\") " Oct 07 12:42:50 crc kubenswrapper[4854]: I1007 12:42:50.796449 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5673e957-d032-4112-b620-9f255b01d0d9-logs\") pod \"5673e957-d032-4112-b620-9f255b01d0d9\" (UID: \"5673e957-d032-4112-b620-9f255b01d0d9\") " Oct 07 12:42:50 crc kubenswrapper[4854]: I1007 12:42:50.797216 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5673e957-d032-4112-b620-9f255b01d0d9-logs" (OuterVolumeSpecName: "logs") pod "5673e957-d032-4112-b620-9f255b01d0d9" (UID: "5673e957-d032-4112-b620-9f255b01d0d9"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:42:50 crc kubenswrapper[4854]: I1007 12:42:50.802326 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5673e957-d032-4112-b620-9f255b01d0d9-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5673e957-d032-4112-b620-9f255b01d0d9" (UID: "5673e957-d032-4112-b620-9f255b01d0d9"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:42:50 crc kubenswrapper[4854]: I1007 12:42:50.802381 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5673e957-d032-4112-b620-9f255b01d0d9-kube-api-access-2vcvs" (OuterVolumeSpecName: "kube-api-access-2vcvs") pod "5673e957-d032-4112-b620-9f255b01d0d9" (UID: "5673e957-d032-4112-b620-9f255b01d0d9"). InnerVolumeSpecName "kube-api-access-2vcvs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:42:50 crc kubenswrapper[4854]: I1007 12:42:50.837285 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5673e957-d032-4112-b620-9f255b01d0d9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5673e957-d032-4112-b620-9f255b01d0d9" (UID: "5673e957-d032-4112-b620-9f255b01d0d9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:42:50 crc kubenswrapper[4854]: I1007 12:42:50.853118 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5673e957-d032-4112-b620-9f255b01d0d9-config-data" (OuterVolumeSpecName: "config-data") pod "5673e957-d032-4112-b620-9f255b01d0d9" (UID: "5673e957-d032-4112-b620-9f255b01d0d9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:42:50 crc kubenswrapper[4854]: I1007 12:42:50.907461 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vcvs\" (UniqueName: \"kubernetes.io/projected/5673e957-d032-4112-b620-9f255b01d0d9-kube-api-access-2vcvs\") on node \"crc\" DevicePath \"\"" Oct 07 12:42:50 crc kubenswrapper[4854]: I1007 12:42:50.907497 4854 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5673e957-d032-4112-b620-9f255b01d0d9-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 07 12:42:50 crc kubenswrapper[4854]: I1007 12:42:50.907510 4854 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5673e957-d032-4112-b620-9f255b01d0d9-logs\") on node \"crc\" DevicePath \"\"" Oct 07 12:42:50 crc kubenswrapper[4854]: I1007 12:42:50.907525 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5673e957-d032-4112-b620-9f255b01d0d9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:42:50 crc kubenswrapper[4854]: I1007 12:42:50.907535 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5673e957-d032-4112-b620-9f255b01d0d9-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 12:42:50 crc kubenswrapper[4854]: I1007 12:42:50.945365 4854 generic.go:334] "Generic (PLEG): container finished" podID="5673e957-d032-4112-b620-9f255b01d0d9" containerID="4693cbd94ea9abb6515ae8732113c7cf02b75e3ac39ffd080462cbb8f082a46e" exitCode=0 Oct 07 12:42:50 crc kubenswrapper[4854]: I1007 12:42:50.946214 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6fbf67c4f8-wmrbx" Oct 07 12:42:50 crc kubenswrapper[4854]: I1007 12:42:50.946362 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6fbf67c4f8-wmrbx" event={"ID":"5673e957-d032-4112-b620-9f255b01d0d9","Type":"ContainerDied","Data":"4693cbd94ea9abb6515ae8732113c7cf02b75e3ac39ffd080462cbb8f082a46e"} Oct 07 12:42:50 crc kubenswrapper[4854]: I1007 12:42:50.946405 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6fbf67c4f8-wmrbx" event={"ID":"5673e957-d032-4112-b620-9f255b01d0d9","Type":"ContainerDied","Data":"7ff2c9bf035fd17367d877c52e09ee7a97496092d8ee1ef0df7600f78ea4287c"} Oct 07 12:42:50 crc kubenswrapper[4854]: I1007 12:42:50.946421 4854 scope.go:117] "RemoveContainer" containerID="4693cbd94ea9abb6515ae8732113c7cf02b75e3ac39ffd080462cbb8f082a46e" Oct 07 12:42:50 crc kubenswrapper[4854]: I1007 12:42:50.989335 4854 scope.go:117] "RemoveContainer" containerID="8138d4b4267015e79d13138c0c538582f0f03d240febe0ced67849accae5eda1" Oct 07 12:42:50 crc kubenswrapper[4854]: I1007 12:42:50.993801 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6fbf67c4f8-wmrbx"] Oct 07 12:42:51 crc kubenswrapper[4854]: I1007 12:42:51.001667 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6fbf67c4f8-wmrbx"] Oct 07 12:42:51 crc kubenswrapper[4854]: I1007 12:42:51.009460 4854 scope.go:117] "RemoveContainer" containerID="4693cbd94ea9abb6515ae8732113c7cf02b75e3ac39ffd080462cbb8f082a46e" Oct 07 12:42:51 crc kubenswrapper[4854]: E1007 12:42:51.009962 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4693cbd94ea9abb6515ae8732113c7cf02b75e3ac39ffd080462cbb8f082a46e\": container with ID starting with 4693cbd94ea9abb6515ae8732113c7cf02b75e3ac39ffd080462cbb8f082a46e not found: ID does not exist" containerID="4693cbd94ea9abb6515ae8732113c7cf02b75e3ac39ffd080462cbb8f082a46e" Oct 07 12:42:51 crc kubenswrapper[4854]: I1007 12:42:51.010052 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4693cbd94ea9abb6515ae8732113c7cf02b75e3ac39ffd080462cbb8f082a46e"} err="failed to get container status \"4693cbd94ea9abb6515ae8732113c7cf02b75e3ac39ffd080462cbb8f082a46e\": rpc error: code = NotFound desc = could not find container \"4693cbd94ea9abb6515ae8732113c7cf02b75e3ac39ffd080462cbb8f082a46e\": container with ID starting with 4693cbd94ea9abb6515ae8732113c7cf02b75e3ac39ffd080462cbb8f082a46e not found: ID does not exist" Oct 07 12:42:51 crc kubenswrapper[4854]: I1007 12:42:51.010135 4854 scope.go:117] "RemoveContainer" containerID="8138d4b4267015e79d13138c0c538582f0f03d240febe0ced67849accae5eda1" Oct 07 12:42:51 crc kubenswrapper[4854]: E1007 12:42:51.010500 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8138d4b4267015e79d13138c0c538582f0f03d240febe0ced67849accae5eda1\": container with ID starting with 8138d4b4267015e79d13138c0c538582f0f03d240febe0ced67849accae5eda1 not found: ID does not exist" containerID="8138d4b4267015e79d13138c0c538582f0f03d240febe0ced67849accae5eda1" Oct 07 12:42:51 crc kubenswrapper[4854]: I1007 12:42:51.010590 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8138d4b4267015e79d13138c0c538582f0f03d240febe0ced67849accae5eda1"} err="failed to get container status 
\"8138d4b4267015e79d13138c0c538582f0f03d240febe0ced67849accae5eda1\": rpc error: code = NotFound desc = could not find container \"8138d4b4267015e79d13138c0c538582f0f03d240febe0ced67849accae5eda1\": container with ID starting with 8138d4b4267015e79d13138c0c538582f0f03d240febe0ced67849accae5eda1 not found: ID does not exist" Oct 07 12:42:52 crc kubenswrapper[4854]: I1007 12:42:52.324210 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 07 12:42:52 crc kubenswrapper[4854]: I1007 12:42:52.390833 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 12:42:52 crc kubenswrapper[4854]: I1007 12:42:52.733872 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5673e957-d032-4112-b620-9f255b01d0d9" path="/var/lib/kubelet/pods/5673e957-d032-4112-b620-9f255b01d0d9/volumes" Oct 07 12:42:52 crc kubenswrapper[4854]: I1007 12:42:52.963808 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="4d4d1ac6-6080-4082-9d3b-29e7cb7a2266" containerName="cinder-scheduler" containerID="cri-o://dbd10ccec47843a1d93b17b20b1643e188945b8548aedfb57de0e60b81354838" gracePeriod=30 Oct 07 12:42:52 crc kubenswrapper[4854]: I1007 12:42:52.963925 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="4d4d1ac6-6080-4082-9d3b-29e7cb7a2266" containerName="probe" containerID="cri-o://537ec4107d7a3ada8e4108ccb53c402db4fd57c429c13e6e8127aef6f3fedc65" gracePeriod=30 Oct 07 12:42:53 crc kubenswrapper[4854]: I1007 12:42:53.099918 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9776ccc5-wjtgg" Oct 07 12:42:53 crc kubenswrapper[4854]: I1007 12:42:53.215054 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-9872c"] Oct 07 12:42:53 crc kubenswrapper[4854]: I1007 12:42:53.215422 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-785d8bcb8c-9872c" podUID="a1a00ddb-7ed9-45dd-98bf-a2be628047df" containerName="dnsmasq-dns" containerID="cri-o://55fbf956fe0dfb40eef07681e52b8b94d3944ac0ae84361d414261f85bd29c80" gracePeriod=10 Oct 07 12:42:53 crc kubenswrapper[4854]: I1007 12:42:53.892444 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-9872c" Oct 07 12:42:53 crc kubenswrapper[4854]: I1007 12:42:53.978255 4854 generic.go:334] "Generic (PLEG): container finished" podID="a1a00ddb-7ed9-45dd-98bf-a2be628047df" containerID="55fbf956fe0dfb40eef07681e52b8b94d3944ac0ae84361d414261f85bd29c80" exitCode=0 Oct 07 12:42:53 crc kubenswrapper[4854]: I1007 12:42:53.978299 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-9872c" event={"ID":"a1a00ddb-7ed9-45dd-98bf-a2be628047df","Type":"ContainerDied","Data":"55fbf956fe0dfb40eef07681e52b8b94d3944ac0ae84361d414261f85bd29c80"} Oct 07 12:42:53 crc kubenswrapper[4854]: I1007 12:42:53.978326 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-9872c" event={"ID":"a1a00ddb-7ed9-45dd-98bf-a2be628047df","Type":"ContainerDied","Data":"dc6378d82309d7d3a0d11d04f8a7ea0c91a0772e184d20d86c22de15ec20a98e"} Oct 07 12:42:53 crc kubenswrapper[4854]: I1007 12:42:53.978332 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-9872c" Oct 07 12:42:53 crc kubenswrapper[4854]: I1007 12:42:53.978344 4854 scope.go:117] "RemoveContainer" containerID="55fbf956fe0dfb40eef07681e52b8b94d3944ac0ae84361d414261f85bd29c80" Oct 07 12:42:54 crc kubenswrapper[4854]: I1007 12:42:54.002659 4854 scope.go:117] "RemoveContainer" containerID="5eff53ddae156645ff120293c2b04f9524dfccaeeb600fce9803a546cd9ac95a" Oct 07 12:42:54 crc kubenswrapper[4854]: I1007 12:42:54.035136 4854 scope.go:117] "RemoveContainer" containerID="55fbf956fe0dfb40eef07681e52b8b94d3944ac0ae84361d414261f85bd29c80" Oct 07 12:42:54 crc kubenswrapper[4854]: E1007 12:42:54.035724 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55fbf956fe0dfb40eef07681e52b8b94d3944ac0ae84361d414261f85bd29c80\": container with ID starting with 55fbf956fe0dfb40eef07681e52b8b94d3944ac0ae84361d414261f85bd29c80 not found: ID does not exist" containerID="55fbf956fe0dfb40eef07681e52b8b94d3944ac0ae84361d414261f85bd29c80" Oct 07 12:42:54 crc kubenswrapper[4854]: I1007 12:42:54.035757 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55fbf956fe0dfb40eef07681e52b8b94d3944ac0ae84361d414261f85bd29c80"} err="failed to get container status \"55fbf956fe0dfb40eef07681e52b8b94d3944ac0ae84361d414261f85bd29c80\": rpc error: code = NotFound desc = could not find container \"55fbf956fe0dfb40eef07681e52b8b94d3944ac0ae84361d414261f85bd29c80\": container with ID starting with 55fbf956fe0dfb40eef07681e52b8b94d3944ac0ae84361d414261f85bd29c80 not found: ID does not exist" Oct 07 12:42:54 crc kubenswrapper[4854]: I1007 12:42:54.035780 4854 scope.go:117] "RemoveContainer" containerID="5eff53ddae156645ff120293c2b04f9524dfccaeeb600fce9803a546cd9ac95a" Oct 07 12:42:54 crc kubenswrapper[4854]: E1007 12:42:54.036081 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5eff53ddae156645ff120293c2b04f9524dfccaeeb600fce9803a546cd9ac95a\": container with ID starting with 5eff53ddae156645ff120293c2b04f9524dfccaeeb600fce9803a546cd9ac95a not found: ID does not exist" containerID="5eff53ddae156645ff120293c2b04f9524dfccaeeb600fce9803a546cd9ac95a" Oct 07 12:42:54 crc kubenswrapper[4854]: I1007 12:42:54.036113 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5eff53ddae156645ff120293c2b04f9524dfccaeeb600fce9803a546cd9ac95a"} err="failed to get container status \"5eff53ddae156645ff120293c2b04f9524dfccaeeb600fce9803a546cd9ac95a\": rpc error: code = NotFound desc = could not find container \"5eff53ddae156645ff120293c2b04f9524dfccaeeb600fce9803a546cd9ac95a\": container with ID starting with 5eff53ddae156645ff120293c2b04f9524dfccaeeb600fce9803a546cd9ac95a not found: ID does not exist" Oct 07 12:42:54 crc kubenswrapper[4854]: I1007 12:42:54.077022 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a1a00ddb-7ed9-45dd-98bf-a2be628047df-dns-swift-storage-0\") pod \"a1a00ddb-7ed9-45dd-98bf-a2be628047df\" (UID: \"a1a00ddb-7ed9-45dd-98bf-a2be628047df\") " Oct 07 12:42:54 crc kubenswrapper[4854]: I1007 12:42:54.077338 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a1a00ddb-7ed9-45dd-98bf-a2be628047df-ovsdbserver-nb\") pod 
\"a1a00ddb-7ed9-45dd-98bf-a2be628047df\" (UID: \"a1a00ddb-7ed9-45dd-98bf-a2be628047df\") " Oct 07 12:42:54 crc kubenswrapper[4854]: I1007 12:42:54.077939 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1a00ddb-7ed9-45dd-98bf-a2be628047df-config\") pod \"a1a00ddb-7ed9-45dd-98bf-a2be628047df\" (UID: \"a1a00ddb-7ed9-45dd-98bf-a2be628047df\") " Oct 07 12:42:54 crc kubenswrapper[4854]: I1007 12:42:54.078513 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8jlw\" (UniqueName: \"kubernetes.io/projected/a1a00ddb-7ed9-45dd-98bf-a2be628047df-kube-api-access-r8jlw\") pod \"a1a00ddb-7ed9-45dd-98bf-a2be628047df\" (UID: \"a1a00ddb-7ed9-45dd-98bf-a2be628047df\") " Oct 07 12:42:54 crc kubenswrapper[4854]: I1007 12:42:54.078673 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a1a00ddb-7ed9-45dd-98bf-a2be628047df-ovsdbserver-sb\") pod \"a1a00ddb-7ed9-45dd-98bf-a2be628047df\" (UID: \"a1a00ddb-7ed9-45dd-98bf-a2be628047df\") " Oct 07 12:42:54 crc kubenswrapper[4854]: I1007 12:42:54.079517 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a1a00ddb-7ed9-45dd-98bf-a2be628047df-dns-svc\") pod \"a1a00ddb-7ed9-45dd-98bf-a2be628047df\" (UID: \"a1a00ddb-7ed9-45dd-98bf-a2be628047df\") " Oct 07 12:42:54 crc kubenswrapper[4854]: I1007 12:42:54.090789 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1a00ddb-7ed9-45dd-98bf-a2be628047df-kube-api-access-r8jlw" (OuterVolumeSpecName: "kube-api-access-r8jlw") pod "a1a00ddb-7ed9-45dd-98bf-a2be628047df" (UID: "a1a00ddb-7ed9-45dd-98bf-a2be628047df"). InnerVolumeSpecName "kube-api-access-r8jlw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:42:54 crc kubenswrapper[4854]: I1007 12:42:54.127989 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1a00ddb-7ed9-45dd-98bf-a2be628047df-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a1a00ddb-7ed9-45dd-98bf-a2be628047df" (UID: "a1a00ddb-7ed9-45dd-98bf-a2be628047df"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:42:54 crc kubenswrapper[4854]: I1007 12:42:54.131560 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1a00ddb-7ed9-45dd-98bf-a2be628047df-config" (OuterVolumeSpecName: "config") pod "a1a00ddb-7ed9-45dd-98bf-a2be628047df" (UID: "a1a00ddb-7ed9-45dd-98bf-a2be628047df"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:42:54 crc kubenswrapper[4854]: I1007 12:42:54.132377 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1a00ddb-7ed9-45dd-98bf-a2be628047df-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a1a00ddb-7ed9-45dd-98bf-a2be628047df" (UID: "a1a00ddb-7ed9-45dd-98bf-a2be628047df"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:42:54 crc kubenswrapper[4854]: I1007 12:42:54.137265 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1a00ddb-7ed9-45dd-98bf-a2be628047df-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a1a00ddb-7ed9-45dd-98bf-a2be628047df" (UID: "a1a00ddb-7ed9-45dd-98bf-a2be628047df"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:42:54 crc kubenswrapper[4854]: I1007 12:42:54.142500 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1a00ddb-7ed9-45dd-98bf-a2be628047df-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a1a00ddb-7ed9-45dd-98bf-a2be628047df" (UID: "a1a00ddb-7ed9-45dd-98bf-a2be628047df"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:42:54 crc kubenswrapper[4854]: I1007 12:42:54.182220 4854 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a1a00ddb-7ed9-45dd-98bf-a2be628047df-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 07 12:42:54 crc kubenswrapper[4854]: I1007 12:42:54.182255 4854 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a1a00ddb-7ed9-45dd-98bf-a2be628047df-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 12:42:54 crc kubenswrapper[4854]: I1007 12:42:54.182266 4854 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1a00ddb-7ed9-45dd-98bf-a2be628047df-config\") on node \"crc\" DevicePath \"\"" Oct 07 12:42:54 crc kubenswrapper[4854]: I1007 12:42:54.182277 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8jlw\" (UniqueName: \"kubernetes.io/projected/a1a00ddb-7ed9-45dd-98bf-a2be628047df-kube-api-access-r8jlw\") on node \"crc\" DevicePath \"\"" Oct 07 12:42:54 crc kubenswrapper[4854]: I1007 12:42:54.182287 4854 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a1a00ddb-7ed9-45dd-98bf-a2be628047df-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 12:42:54 crc kubenswrapper[4854]: I1007 12:42:54.182298 4854 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a1a00ddb-7ed9-45dd-98bf-a2be628047df-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 12:42:54 crc kubenswrapper[4854]: I1007 12:42:54.317213 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-9872c"] Oct 07 12:42:54 crc kubenswrapper[4854]: I1007 12:42:54.325468 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-9872c"] Oct 07 12:42:54 crc kubenswrapper[4854]: I1007 12:42:54.715858 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1a00ddb-7ed9-45dd-98bf-a2be628047df" path="/var/lib/kubelet/pods/a1a00ddb-7ed9-45dd-98bf-a2be628047df/volumes" Oct 07 12:42:54 crc kubenswrapper[4854]: I1007 12:42:54.918737 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 07 12:42:54 crc kubenswrapper[4854]: I1007 12:42:54.998434 4854 generic.go:334] "Generic (PLEG): container finished" podID="4d4d1ac6-6080-4082-9d3b-29e7cb7a2266" containerID="537ec4107d7a3ada8e4108ccb53c402db4fd57c429c13e6e8127aef6f3fedc65" exitCode=0 Oct 07 
12:42:54 crc kubenswrapper[4854]: I1007 12:42:54.998479 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4d4d1ac6-6080-4082-9d3b-29e7cb7a2266","Type":"ContainerDied","Data":"537ec4107d7a3ada8e4108ccb53c402db4fd57c429c13e6e8127aef6f3fedc65"} Oct 07 12:42:57 crc kubenswrapper[4854]: I1007 12:42:57.035347 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-775676c768-frcfp" Oct 07 12:42:57 crc kubenswrapper[4854]: I1007 12:42:57.711150 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6f54bfd6b4-g5gq4" Oct 07 12:42:57 crc kubenswrapper[4854]: I1007 12:42:57.885279 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 07 12:42:58 crc kubenswrapper[4854]: I1007 12:42:58.009630 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6f54bfd6b4-g5gq4" Oct 07 12:42:58 crc kubenswrapper[4854]: I1007 12:42:58.028425 4854 generic.go:334] "Generic (PLEG): container finished" podID="4d4d1ac6-6080-4082-9d3b-29e7cb7a2266" containerID="dbd10ccec47843a1d93b17b20b1643e188945b8548aedfb57de0e60b81354838" exitCode=0 Oct 07 12:42:58 crc kubenswrapper[4854]: I1007 12:42:58.028470 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4d4d1ac6-6080-4082-9d3b-29e7cb7a2266","Type":"ContainerDied","Data":"dbd10ccec47843a1d93b17b20b1643e188945b8548aedfb57de0e60b81354838"} Oct 07 12:42:58 crc kubenswrapper[4854]: I1007 12:42:58.028515 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4d4d1ac6-6080-4082-9d3b-29e7cb7a2266","Type":"ContainerDied","Data":"5d1e15c52508850fd4e996e3f2ffeaae496156c9c816b894a664b514a4278c2d"} Oct 07 12:42:58 crc kubenswrapper[4854]: I1007 12:42:58.028533 4854 scope.go:117] "RemoveContainer" containerID="537ec4107d7a3ada8e4108ccb53c402db4fd57c429c13e6e8127aef6f3fedc65" Oct 07 12:42:58 crc kubenswrapper[4854]: I1007 12:42:58.028533 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 07 12:42:58 crc kubenswrapper[4854]: I1007 12:42:58.057522 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d4d1ac6-6080-4082-9d3b-29e7cb7a2266-combined-ca-bundle\") pod \"4d4d1ac6-6080-4082-9d3b-29e7cb7a2266\" (UID: \"4d4d1ac6-6080-4082-9d3b-29e7cb7a2266\") " Oct 07 12:42:58 crc kubenswrapper[4854]: I1007 12:42:58.057583 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4d4d1ac6-6080-4082-9d3b-29e7cb7a2266-config-data-custom\") pod \"4d4d1ac6-6080-4082-9d3b-29e7cb7a2266\" (UID: \"4d4d1ac6-6080-4082-9d3b-29e7cb7a2266\") " Oct 07 12:42:58 crc kubenswrapper[4854]: I1007 12:42:58.057628 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7xrx\" (UniqueName: \"kubernetes.io/projected/4d4d1ac6-6080-4082-9d3b-29e7cb7a2266-kube-api-access-d7xrx\") pod \"4d4d1ac6-6080-4082-9d3b-29e7cb7a2266\" (UID: \"4d4d1ac6-6080-4082-9d3b-29e7cb7a2266\") " Oct 07 12:42:58 crc kubenswrapper[4854]: I1007 12:42:58.057686 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4d4d1ac6-6080-4082-9d3b-29e7cb7a2266-etc-machine-id\") pod \"4d4d1ac6-6080-4082-9d3b-29e7cb7a2266\" (UID: \"4d4d1ac6-6080-4082-9d3b-29e7cb7a2266\") " Oct 07 12:42:58 crc kubenswrapper[4854]: I1007 12:42:58.057794 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d4d1ac6-6080-4082-9d3b-29e7cb7a2266-scripts\") pod \"4d4d1ac6-6080-4082-9d3b-29e7cb7a2266\" (UID: \"4d4d1ac6-6080-4082-9d3b-29e7cb7a2266\") " Oct 07 12:42:58 crc kubenswrapper[4854]: I1007 12:42:58.057812 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d4d1ac6-6080-4082-9d3b-29e7cb7a2266-config-data\") pod \"4d4d1ac6-6080-4082-9d3b-29e7cb7a2266\" (UID: \"4d4d1ac6-6080-4082-9d3b-29e7cb7a2266\") " Oct 07 12:42:58 crc kubenswrapper[4854]: I1007 12:42:58.062235 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4d4d1ac6-6080-4082-9d3b-29e7cb7a2266-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "4d4d1ac6-6080-4082-9d3b-29e7cb7a2266" (UID: "4d4d1ac6-6080-4082-9d3b-29e7cb7a2266"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 12:42:58 crc kubenswrapper[4854]: I1007 12:42:58.062350 4854 scope.go:117] "RemoveContainer" containerID="dbd10ccec47843a1d93b17b20b1643e188945b8548aedfb57de0e60b81354838" Oct 07 12:42:58 crc kubenswrapper[4854]: I1007 12:42:58.082585 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d4d1ac6-6080-4082-9d3b-29e7cb7a2266-scripts" (OuterVolumeSpecName: "scripts") pod "4d4d1ac6-6080-4082-9d3b-29e7cb7a2266" (UID: "4d4d1ac6-6080-4082-9d3b-29e7cb7a2266"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:42:58 crc kubenswrapper[4854]: I1007 12:42:58.087423 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d4d1ac6-6080-4082-9d3b-29e7cb7a2266-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "4d4d1ac6-6080-4082-9d3b-29e7cb7a2266" (UID: "4d4d1ac6-6080-4082-9d3b-29e7cb7a2266"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:42:58 crc kubenswrapper[4854]: I1007 12:42:58.110716 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d4d1ac6-6080-4082-9d3b-29e7cb7a2266-kube-api-access-d7xrx" (OuterVolumeSpecName: "kube-api-access-d7xrx") pod "4d4d1ac6-6080-4082-9d3b-29e7cb7a2266" (UID: "4d4d1ac6-6080-4082-9d3b-29e7cb7a2266"). InnerVolumeSpecName "kube-api-access-d7xrx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:42:58 crc kubenswrapper[4854]: I1007 12:42:58.118149 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d4d1ac6-6080-4082-9d3b-29e7cb7a2266-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4d4d1ac6-6080-4082-9d3b-29e7cb7a2266" (UID: "4d4d1ac6-6080-4082-9d3b-29e7cb7a2266"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:42:58 crc kubenswrapper[4854]: I1007 12:42:58.163651 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d4d1ac6-6080-4082-9d3b-29e7cb7a2266-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:42:58 crc kubenswrapper[4854]: I1007 12:42:58.163686 4854 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4d4d1ac6-6080-4082-9d3b-29e7cb7a2266-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 07 12:42:58 crc kubenswrapper[4854]: I1007 12:42:58.163699 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7xrx\" (UniqueName: \"kubernetes.io/projected/4d4d1ac6-6080-4082-9d3b-29e7cb7a2266-kube-api-access-d7xrx\") on node \"crc\" DevicePath \"\"" Oct 07 12:42:58 crc kubenswrapper[4854]: I1007 12:42:58.163713 4854 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4d4d1ac6-6080-4082-9d3b-29e7cb7a2266-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 07 12:42:58 crc kubenswrapper[4854]: I1007 12:42:58.163724 4854 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d4d1ac6-6080-4082-9d3b-29e7cb7a2266-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 12:42:58 crc kubenswrapper[4854]: I1007 12:42:58.176304 4854 scope.go:117] "RemoveContainer" containerID="537ec4107d7a3ada8e4108ccb53c402db4fd57c429c13e6e8127aef6f3fedc65" Oct 07 12:42:58 crc kubenswrapper[4854]: E1007 12:42:58.177243 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"537ec4107d7a3ada8e4108ccb53c402db4fd57c429c13e6e8127aef6f3fedc65\": container with ID starting with 537ec4107d7a3ada8e4108ccb53c402db4fd57c429c13e6e8127aef6f3fedc65 not found: ID does not exist" containerID="537ec4107d7a3ada8e4108ccb53c402db4fd57c429c13e6e8127aef6f3fedc65" Oct 07 12:42:58 crc kubenswrapper[4854]: I1007 12:42:58.177274 4854 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"537ec4107d7a3ada8e4108ccb53c402db4fd57c429c13e6e8127aef6f3fedc65"} err="failed to get container status \"537ec4107d7a3ada8e4108ccb53c402db4fd57c429c13e6e8127aef6f3fedc65\": rpc error: code = NotFound desc = could not find container \"537ec4107d7a3ada8e4108ccb53c402db4fd57c429c13e6e8127aef6f3fedc65\": container with ID starting with 537ec4107d7a3ada8e4108ccb53c402db4fd57c429c13e6e8127aef6f3fedc65 not found: ID does not exist" Oct 07 12:42:58 crc kubenswrapper[4854]: I1007 12:42:58.177294 4854 scope.go:117] "RemoveContainer" containerID="dbd10ccec47843a1d93b17b20b1643e188945b8548aedfb57de0e60b81354838" Oct 07 12:42:58 crc kubenswrapper[4854]: E1007 12:42:58.178297 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbd10ccec47843a1d93b17b20b1643e188945b8548aedfb57de0e60b81354838\": container with ID starting with dbd10ccec47843a1d93b17b20b1643e188945b8548aedfb57de0e60b81354838 not found: ID does not exist" containerID="dbd10ccec47843a1d93b17b20b1643e188945b8548aedfb57de0e60b81354838" Oct 07 12:42:58 crc kubenswrapper[4854]: I1007 12:42:58.178325 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbd10ccec47843a1d93b17b20b1643e188945b8548aedfb57de0e60b81354838"} err="failed to get container status \"dbd10ccec47843a1d93b17b20b1643e188945b8548aedfb57de0e60b81354838\": rpc error: code = NotFound desc = could not find container \"dbd10ccec47843a1d93b17b20b1643e188945b8548aedfb57de0e60b81354838\": container with ID starting with dbd10ccec47843a1d93b17b20b1643e188945b8548aedfb57de0e60b81354838 not found: ID does not exist" Oct 07 12:42:58 crc kubenswrapper[4854]: I1007 12:42:58.229927 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d4d1ac6-6080-4082-9d3b-29e7cb7a2266-config-data" (OuterVolumeSpecName: "config-data") pod "4d4d1ac6-6080-4082-9d3b-29e7cb7a2266" (UID: "4d4d1ac6-6080-4082-9d3b-29e7cb7a2266"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:42:58 crc kubenswrapper[4854]: I1007 12:42:58.265487 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d4d1ac6-6080-4082-9d3b-29e7cb7a2266-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 12:42:58 crc kubenswrapper[4854]: I1007 12:42:58.358692 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 12:42:58 crc kubenswrapper[4854]: I1007 12:42:58.365983 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 12:42:58 crc kubenswrapper[4854]: I1007 12:42:58.381383 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 12:42:58 crc kubenswrapper[4854]: E1007 12:42:58.381738 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5673e957-d032-4112-b620-9f255b01d0d9" containerName="barbican-api" Oct 07 12:42:58 crc kubenswrapper[4854]: I1007 12:42:58.381753 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="5673e957-d032-4112-b620-9f255b01d0d9" containerName="barbican-api" Oct 07 12:42:58 crc kubenswrapper[4854]: E1007 12:42:58.381771 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1a00ddb-7ed9-45dd-98bf-a2be628047df" containerName="init" Oct 07 12:42:58 crc kubenswrapper[4854]: I1007 12:42:58.381777 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1a00ddb-7ed9-45dd-98bf-a2be628047df" containerName="init" Oct 07 12:42:58 crc kubenswrapper[4854]: E1007 12:42:58.381788 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1a00ddb-7ed9-45dd-98bf-a2be628047df" containerName="dnsmasq-dns" Oct 07 12:42:58 crc kubenswrapper[4854]: I1007 12:42:58.381794 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1a00ddb-7ed9-45dd-98bf-a2be628047df" containerName="dnsmasq-dns" Oct 07 12:42:58 crc kubenswrapper[4854]: E1007 12:42:58.381802 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d4d1ac6-6080-4082-9d3b-29e7cb7a2266" containerName="probe" Oct 07 12:42:58 crc kubenswrapper[4854]: I1007 12:42:58.381807 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d4d1ac6-6080-4082-9d3b-29e7cb7a2266" containerName="probe" Oct 07 12:42:58 crc kubenswrapper[4854]: E1007 12:42:58.381818 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5673e957-d032-4112-b620-9f255b01d0d9" containerName="barbican-api-log" Oct 07 12:42:58 crc kubenswrapper[4854]: I1007 12:42:58.381823 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="5673e957-d032-4112-b620-9f255b01d0d9" containerName="barbican-api-log" Oct 07 12:42:58 crc kubenswrapper[4854]: E1007 12:42:58.381837 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d4d1ac6-6080-4082-9d3b-29e7cb7a2266" containerName="cinder-scheduler" Oct 07 12:42:58 crc kubenswrapper[4854]: I1007 12:42:58.381843 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d4d1ac6-6080-4082-9d3b-29e7cb7a2266" containerName="cinder-scheduler" Oct 07 12:42:58 crc kubenswrapper[4854]: I1007 12:42:58.382004 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1a00ddb-7ed9-45dd-98bf-a2be628047df" containerName="dnsmasq-dns" Oct 07 12:42:58 crc kubenswrapper[4854]: I1007 12:42:58.382017 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d4d1ac6-6080-4082-9d3b-29e7cb7a2266" containerName="cinder-scheduler" Oct 07 12:42:58 crc kubenswrapper[4854]: I1007 
12:42:58.382034 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="5673e957-d032-4112-b620-9f255b01d0d9" containerName="barbican-api-log" Oct 07 12:42:58 crc kubenswrapper[4854]: I1007 12:42:58.382047 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d4d1ac6-6080-4082-9d3b-29e7cb7a2266" containerName="probe" Oct 07 12:42:58 crc kubenswrapper[4854]: I1007 12:42:58.382052 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="5673e957-d032-4112-b620-9f255b01d0d9" containerName="barbican-api" Oct 07 12:42:58 crc kubenswrapper[4854]: I1007 12:42:58.382923 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 07 12:42:58 crc kubenswrapper[4854]: I1007 12:42:58.385066 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 07 12:42:58 crc kubenswrapper[4854]: I1007 12:42:58.404444 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 12:42:58 crc kubenswrapper[4854]: I1007 12:42:58.468776 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21065050-7bdc-4f4e-9a7b-9dbcc2dab200-scripts\") pod \"cinder-scheduler-0\" (UID: \"21065050-7bdc-4f4e-9a7b-9dbcc2dab200\") " pod="openstack/cinder-scheduler-0" Oct 07 12:42:58 crc kubenswrapper[4854]: I1007 12:42:58.468859 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/21065050-7bdc-4f4e-9a7b-9dbcc2dab200-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"21065050-7bdc-4f4e-9a7b-9dbcc2dab200\") " pod="openstack/cinder-scheduler-0" Oct 07 12:42:58 crc kubenswrapper[4854]: I1007 12:42:58.468885 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21065050-7bdc-4f4e-9a7b-9dbcc2dab200-config-data\") pod \"cinder-scheduler-0\" (UID: \"21065050-7bdc-4f4e-9a7b-9dbcc2dab200\") " pod="openstack/cinder-scheduler-0" Oct 07 12:42:58 crc kubenswrapper[4854]: I1007 12:42:58.468933 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21065050-7bdc-4f4e-9a7b-9dbcc2dab200-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"21065050-7bdc-4f4e-9a7b-9dbcc2dab200\") " pod="openstack/cinder-scheduler-0" Oct 07 12:42:58 crc kubenswrapper[4854]: I1007 12:42:58.468968 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/21065050-7bdc-4f4e-9a7b-9dbcc2dab200-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"21065050-7bdc-4f4e-9a7b-9dbcc2dab200\") " pod="openstack/cinder-scheduler-0" Oct 07 12:42:58 crc kubenswrapper[4854]: I1007 12:42:58.469069 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk5v4\" (UniqueName: \"kubernetes.io/projected/21065050-7bdc-4f4e-9a7b-9dbcc2dab200-kube-api-access-rk5v4\") pod \"cinder-scheduler-0\" (UID: \"21065050-7bdc-4f4e-9a7b-9dbcc2dab200\") " pod="openstack/cinder-scheduler-0" Oct 07 12:42:58 crc kubenswrapper[4854]: I1007 12:42:58.570998 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/21065050-7bdc-4f4e-9a7b-9dbcc2dab200-scripts\") pod \"cinder-scheduler-0\" (UID: \"21065050-7bdc-4f4e-9a7b-9dbcc2dab200\") " pod="openstack/cinder-scheduler-0" Oct 07 12:42:58 crc kubenswrapper[4854]: I1007 12:42:58.571099 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/21065050-7bdc-4f4e-9a7b-9dbcc2dab200-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"21065050-7bdc-4f4e-9a7b-9dbcc2dab200\") " pod="openstack/cinder-scheduler-0" Oct 07 12:42:58 crc kubenswrapper[4854]: I1007 12:42:58.571125 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21065050-7bdc-4f4e-9a7b-9dbcc2dab200-config-data\") pod \"cinder-scheduler-0\" (UID: \"21065050-7bdc-4f4e-9a7b-9dbcc2dab200\") " pod="openstack/cinder-scheduler-0" Oct 07 12:42:58 crc kubenswrapper[4854]: I1007 12:42:58.571191 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21065050-7bdc-4f4e-9a7b-9dbcc2dab200-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"21065050-7bdc-4f4e-9a7b-9dbcc2dab200\") " pod="openstack/cinder-scheduler-0" Oct 07 12:42:58 crc kubenswrapper[4854]: I1007 12:42:58.571225 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/21065050-7bdc-4f4e-9a7b-9dbcc2dab200-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"21065050-7bdc-4f4e-9a7b-9dbcc2dab200\") " pod="openstack/cinder-scheduler-0" Oct 07 12:42:58 crc kubenswrapper[4854]: I1007 12:42:58.571248 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rk5v4\" (UniqueName: \"kubernetes.io/projected/21065050-7bdc-4f4e-9a7b-9dbcc2dab200-kube-api-access-rk5v4\") pod \"cinder-scheduler-0\" (UID: \"21065050-7bdc-4f4e-9a7b-9dbcc2dab200\") " pod="openstack/cinder-scheduler-0" Oct 07 12:42:58 crc kubenswrapper[4854]: I1007 12:42:58.571253 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/21065050-7bdc-4f4e-9a7b-9dbcc2dab200-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"21065050-7bdc-4f4e-9a7b-9dbcc2dab200\") " pod="openstack/cinder-scheduler-0" Oct 07 12:42:58 crc kubenswrapper[4854]: I1007 12:42:58.576194 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21065050-7bdc-4f4e-9a7b-9dbcc2dab200-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"21065050-7bdc-4f4e-9a7b-9dbcc2dab200\") " pod="openstack/cinder-scheduler-0" Oct 07 12:42:58 crc kubenswrapper[4854]: I1007 12:42:58.576793 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/21065050-7bdc-4f4e-9a7b-9dbcc2dab200-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"21065050-7bdc-4f4e-9a7b-9dbcc2dab200\") " pod="openstack/cinder-scheduler-0" Oct 07 12:42:58 crc kubenswrapper[4854]: I1007 12:42:58.578663 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21065050-7bdc-4f4e-9a7b-9dbcc2dab200-scripts\") pod \"cinder-scheduler-0\" (UID: \"21065050-7bdc-4f4e-9a7b-9dbcc2dab200\") " pod="openstack/cinder-scheduler-0" Oct 07 12:42:58 crc kubenswrapper[4854]: I1007 12:42:58.584355 4854 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21065050-7bdc-4f4e-9a7b-9dbcc2dab200-config-data\") pod \"cinder-scheduler-0\" (UID: \"21065050-7bdc-4f4e-9a7b-9dbcc2dab200\") " pod="openstack/cinder-scheduler-0" Oct 07 12:42:58 crc kubenswrapper[4854]: I1007 12:42:58.589861 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk5v4\" (UniqueName: \"kubernetes.io/projected/21065050-7bdc-4f4e-9a7b-9dbcc2dab200-kube-api-access-rk5v4\") pod \"cinder-scheduler-0\" (UID: \"21065050-7bdc-4f4e-9a7b-9dbcc2dab200\") " pod="openstack/cinder-scheduler-0" Oct 07 12:42:58 crc kubenswrapper[4854]: I1007 12:42:58.715811 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d4d1ac6-6080-4082-9d3b-29e7cb7a2266" path="/var/lib/kubelet/pods/4d4d1ac6-6080-4082-9d3b-29e7cb7a2266/volumes" Oct 07 12:42:58 crc kubenswrapper[4854]: I1007 12:42:58.736953 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 07 12:42:59 crc kubenswrapper[4854]: I1007 12:42:59.203960 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 12:42:59 crc kubenswrapper[4854]: W1007 12:42:59.207180 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21065050_7bdc_4f4e_9a7b_9dbcc2dab200.slice/crio-ce7b642e4ff9856acd7c51c12f4b1bffb1d2e910fc862dfc77237c8e0e6bacc9 WatchSource:0}: Error finding container ce7b642e4ff9856acd7c51c12f4b1bffb1d2e910fc862dfc77237c8e0e6bacc9: Status 404 returned error can't find the container with id ce7b642e4ff9856acd7c51c12f4b1bffb1d2e910fc862dfc77237c8e0e6bacc9 Oct 07 12:43:00 crc kubenswrapper[4854]: I1007 12:43:00.053455 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"21065050-7bdc-4f4e-9a7b-9dbcc2dab200","Type":"ContainerStarted","Data":"32ad432a8d636a25c21369c163b0875cef6bace27819c32cee125a524960df32"} Oct 07 12:43:00 crc kubenswrapper[4854]: I1007 12:43:00.053804 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"21065050-7bdc-4f4e-9a7b-9dbcc2dab200","Type":"ContainerStarted","Data":"ce7b642e4ff9856acd7c51c12f4b1bffb1d2e910fc862dfc77237c8e0e6bacc9"} Oct 07 12:43:00 crc kubenswrapper[4854]: I1007 12:43:00.752317 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 07 12:43:00 crc kubenswrapper[4854]: I1007 12:43:00.754013 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 07 12:43:00 crc kubenswrapper[4854]: I1007 12:43:00.756790 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Oct 07 12:43:00 crc kubenswrapper[4854]: I1007 12:43:00.757263 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Oct 07 12:43:00 crc kubenswrapper[4854]: I1007 12:43:00.762441 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-77km4" Oct 07 12:43:00 crc kubenswrapper[4854]: I1007 12:43:00.765920 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 07 12:43:00 crc kubenswrapper[4854]: I1007 12:43:00.915020 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0416b15d-a0a1-4bf2-bd86-4209c14c8e48-openstack-config-secret\") pod \"openstackclient\" (UID: \"0416b15d-a0a1-4bf2-bd86-4209c14c8e48\") " pod="openstack/openstackclient" Oct 07 12:43:00 crc kubenswrapper[4854]: I1007 12:43:00.915270 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0416b15d-a0a1-4bf2-bd86-4209c14c8e48-combined-ca-bundle\") pod \"openstackclient\" (UID: \"0416b15d-a0a1-4bf2-bd86-4209c14c8e48\") " pod="openstack/openstackclient" Oct 07 12:43:00 crc kubenswrapper[4854]: I1007 12:43:00.915465 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0416b15d-a0a1-4bf2-bd86-4209c14c8e48-openstack-config\") pod \"openstackclient\" (UID: \"0416b15d-a0a1-4bf2-bd86-4209c14c8e48\") " pod="openstack/openstackclient" Oct 07 12:43:00 crc kubenswrapper[4854]: I1007 12:43:00.915514 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssz77\" (UniqueName: \"kubernetes.io/projected/0416b15d-a0a1-4bf2-bd86-4209c14c8e48-kube-api-access-ssz77\") pod \"openstackclient\" (UID: \"0416b15d-a0a1-4bf2-bd86-4209c14c8e48\") " pod="openstack/openstackclient" Oct 07 12:43:01 crc kubenswrapper[4854]: I1007 12:43:01.017761 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0416b15d-a0a1-4bf2-bd86-4209c14c8e48-openstack-config\") pod \"openstackclient\" (UID: \"0416b15d-a0a1-4bf2-bd86-4209c14c8e48\") " pod="openstack/openstackclient" Oct 07 12:43:01 crc kubenswrapper[4854]: I1007 12:43:01.017814 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssz77\" (UniqueName: \"kubernetes.io/projected/0416b15d-a0a1-4bf2-bd86-4209c14c8e48-kube-api-access-ssz77\") pod \"openstackclient\" (UID: \"0416b15d-a0a1-4bf2-bd86-4209c14c8e48\") " pod="openstack/openstackclient" Oct 07 12:43:01 crc kubenswrapper[4854]: I1007 12:43:01.017857 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0416b15d-a0a1-4bf2-bd86-4209c14c8e48-openstack-config-secret\") pod \"openstackclient\" (UID: \"0416b15d-a0a1-4bf2-bd86-4209c14c8e48\") " pod="openstack/openstackclient" Oct 07 12:43:01 crc kubenswrapper[4854]: I1007 12:43:01.017947 4854 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0416b15d-a0a1-4bf2-bd86-4209c14c8e48-combined-ca-bundle\") pod \"openstackclient\" (UID: \"0416b15d-a0a1-4bf2-bd86-4209c14c8e48\") " pod="openstack/openstackclient" Oct 07 12:43:01 crc kubenswrapper[4854]: I1007 12:43:01.019648 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0416b15d-a0a1-4bf2-bd86-4209c14c8e48-openstack-config\") pod \"openstackclient\" (UID: \"0416b15d-a0a1-4bf2-bd86-4209c14c8e48\") " pod="openstack/openstackclient" Oct 07 12:43:01 crc kubenswrapper[4854]: I1007 12:43:01.024999 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0416b15d-a0a1-4bf2-bd86-4209c14c8e48-combined-ca-bundle\") pod \"openstackclient\" (UID: \"0416b15d-a0a1-4bf2-bd86-4209c14c8e48\") " pod="openstack/openstackclient" Oct 07 12:43:01 crc kubenswrapper[4854]: I1007 12:43:01.035695 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0416b15d-a0a1-4bf2-bd86-4209c14c8e48-openstack-config-secret\") pod \"openstackclient\" (UID: \"0416b15d-a0a1-4bf2-bd86-4209c14c8e48\") " pod="openstack/openstackclient" Oct 07 12:43:01 crc kubenswrapper[4854]: I1007 12:43:01.045310 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssz77\" (UniqueName: \"kubernetes.io/projected/0416b15d-a0a1-4bf2-bd86-4209c14c8e48-kube-api-access-ssz77\") pod \"openstackclient\" (UID: \"0416b15d-a0a1-4bf2-bd86-4209c14c8e48\") " pod="openstack/openstackclient" Oct 07 12:43:01 crc kubenswrapper[4854]: I1007 12:43:01.063710 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"21065050-7bdc-4f4e-9a7b-9dbcc2dab200","Type":"ContainerStarted","Data":"3cca971c6b0c47f58de9e946459a6537fae5efe97def4c9f7b9f8c2749fe2fc8"} Oct 07 12:43:01 crc kubenswrapper[4854]: I1007 12:43:01.074646 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 07 12:43:01 crc kubenswrapper[4854]: I1007 12:43:01.093871 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.093845741 podStartE2EDuration="3.093845741s" podCreationTimestamp="2025-10-07 12:42:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:43:01.082851921 +0000 UTC m=+1097.070684176" watchObservedRunningTime="2025-10-07 12:43:01.093845741 +0000 UTC m=+1097.081678006" Oct 07 12:43:01 crc kubenswrapper[4854]: I1007 12:43:01.527914 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 07 12:43:02 crc kubenswrapper[4854]: I1007 12:43:02.074211 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"0416b15d-a0a1-4bf2-bd86-4209c14c8e48","Type":"ContainerStarted","Data":"4d617428e101f4a8a78873ee7bd8ec46bdae498934886611677b1f7d1248a23b"} Oct 07 12:43:03 crc kubenswrapper[4854]: I1007 12:43:03.737249 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 07 12:43:04 crc kubenswrapper[4854]: I1007 12:43:04.353580 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-667c455579-lnd9l"] Oct 07 12:43:04 crc kubenswrapper[4854]: I1007 12:43:04.355850 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-667c455579-lnd9l" Oct 07 12:43:04 crc kubenswrapper[4854]: I1007 12:43:04.369246 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 07 12:43:04 crc kubenswrapper[4854]: I1007 12:43:04.369492 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Oct 07 12:43:04 crc kubenswrapper[4854]: I1007 12:43:04.370323 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Oct 07 12:43:04 crc kubenswrapper[4854]: I1007 12:43:04.372712 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-667c455579-lnd9l"] Oct 07 12:43:04 crc kubenswrapper[4854]: I1007 12:43:04.484353 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c8167ae-8941-4616-bee2-ff0fb5e98c16-internal-tls-certs\") pod \"swift-proxy-667c455579-lnd9l\" (UID: \"2c8167ae-8941-4616-bee2-ff0fb5e98c16\") " pod="openstack/swift-proxy-667c455579-lnd9l" Oct 07 12:43:04 crc kubenswrapper[4854]: I1007 12:43:04.484417 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c8167ae-8941-4616-bee2-ff0fb5e98c16-run-httpd\") pod \"swift-proxy-667c455579-lnd9l\" (UID: \"2c8167ae-8941-4616-bee2-ff0fb5e98c16\") " pod="openstack/swift-proxy-667c455579-lnd9l" Oct 07 12:43:04 crc kubenswrapper[4854]: I1007 12:43:04.484450 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c8167ae-8941-4616-bee2-ff0fb5e98c16-log-httpd\") pod \"swift-proxy-667c455579-lnd9l\" (UID: \"2c8167ae-8941-4616-bee2-ff0fb5e98c16\") " pod="openstack/swift-proxy-667c455579-lnd9l" Oct 07 12:43:04 crc kubenswrapper[4854]: I1007 12:43:04.484487 4854 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2c8167ae-8941-4616-bee2-ff0fb5e98c16-etc-swift\") pod \"swift-proxy-667c455579-lnd9l\" (UID: \"2c8167ae-8941-4616-bee2-ff0fb5e98c16\") " pod="openstack/swift-proxy-667c455579-lnd9l" Oct 07 12:43:04 crc kubenswrapper[4854]: I1007 12:43:04.484510 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c8167ae-8941-4616-bee2-ff0fb5e98c16-config-data\") pod \"swift-proxy-667c455579-lnd9l\" (UID: \"2c8167ae-8941-4616-bee2-ff0fb5e98c16\") " pod="openstack/swift-proxy-667c455579-lnd9l" Oct 07 12:43:04 crc kubenswrapper[4854]: I1007 12:43:04.484573 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c8167ae-8941-4616-bee2-ff0fb5e98c16-public-tls-certs\") pod \"swift-proxy-667c455579-lnd9l\" (UID: \"2c8167ae-8941-4616-bee2-ff0fb5e98c16\") " pod="openstack/swift-proxy-667c455579-lnd9l" Oct 07 12:43:04 crc kubenswrapper[4854]: I1007 12:43:04.484590 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7qdc\" (UniqueName: \"kubernetes.io/projected/2c8167ae-8941-4616-bee2-ff0fb5e98c16-kube-api-access-l7qdc\") pod \"swift-proxy-667c455579-lnd9l\" (UID: \"2c8167ae-8941-4616-bee2-ff0fb5e98c16\") " pod="openstack/swift-proxy-667c455579-lnd9l" Oct 07 12:43:04 crc kubenswrapper[4854]: I1007 12:43:04.484641 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c8167ae-8941-4616-bee2-ff0fb5e98c16-combined-ca-bundle\") pod \"swift-proxy-667c455579-lnd9l\" (UID: \"2c8167ae-8941-4616-bee2-ff0fb5e98c16\") " pod="openstack/swift-proxy-667c455579-lnd9l" Oct 07 12:43:04 crc kubenswrapper[4854]: I1007 12:43:04.586724 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c8167ae-8941-4616-bee2-ff0fb5e98c16-public-tls-certs\") pod \"swift-proxy-667c455579-lnd9l\" (UID: \"2c8167ae-8941-4616-bee2-ff0fb5e98c16\") " pod="openstack/swift-proxy-667c455579-lnd9l" Oct 07 12:43:04 crc kubenswrapper[4854]: I1007 12:43:04.586767 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7qdc\" (UniqueName: \"kubernetes.io/projected/2c8167ae-8941-4616-bee2-ff0fb5e98c16-kube-api-access-l7qdc\") pod \"swift-proxy-667c455579-lnd9l\" (UID: \"2c8167ae-8941-4616-bee2-ff0fb5e98c16\") " pod="openstack/swift-proxy-667c455579-lnd9l" Oct 07 12:43:04 crc kubenswrapper[4854]: I1007 12:43:04.586841 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c8167ae-8941-4616-bee2-ff0fb5e98c16-combined-ca-bundle\") pod \"swift-proxy-667c455579-lnd9l\" (UID: \"2c8167ae-8941-4616-bee2-ff0fb5e98c16\") " pod="openstack/swift-proxy-667c455579-lnd9l" Oct 07 12:43:04 crc kubenswrapper[4854]: I1007 12:43:04.586900 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c8167ae-8941-4616-bee2-ff0fb5e98c16-internal-tls-certs\") pod \"swift-proxy-667c455579-lnd9l\" (UID: \"2c8167ae-8941-4616-bee2-ff0fb5e98c16\") " pod="openstack/swift-proxy-667c455579-lnd9l" 
Oct 07 12:43:04 crc kubenswrapper[4854]: I1007 12:43:04.586934 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c8167ae-8941-4616-bee2-ff0fb5e98c16-run-httpd\") pod \"swift-proxy-667c455579-lnd9l\" (UID: \"2c8167ae-8941-4616-bee2-ff0fb5e98c16\") " pod="openstack/swift-proxy-667c455579-lnd9l" Oct 07 12:43:04 crc kubenswrapper[4854]: I1007 12:43:04.586961 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c8167ae-8941-4616-bee2-ff0fb5e98c16-log-httpd\") pod \"swift-proxy-667c455579-lnd9l\" (UID: \"2c8167ae-8941-4616-bee2-ff0fb5e98c16\") " pod="openstack/swift-proxy-667c455579-lnd9l" Oct 07 12:43:04 crc kubenswrapper[4854]: I1007 12:43:04.586995 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2c8167ae-8941-4616-bee2-ff0fb5e98c16-etc-swift\") pod \"swift-proxy-667c455579-lnd9l\" (UID: \"2c8167ae-8941-4616-bee2-ff0fb5e98c16\") " pod="openstack/swift-proxy-667c455579-lnd9l" Oct 07 12:43:04 crc kubenswrapper[4854]: I1007 12:43:04.587038 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c8167ae-8941-4616-bee2-ff0fb5e98c16-config-data\") pod \"swift-proxy-667c455579-lnd9l\" (UID: \"2c8167ae-8941-4616-bee2-ff0fb5e98c16\") " pod="openstack/swift-proxy-667c455579-lnd9l" Oct 07 12:43:04 crc kubenswrapper[4854]: I1007 12:43:04.588108 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c8167ae-8941-4616-bee2-ff0fb5e98c16-log-httpd\") pod \"swift-proxy-667c455579-lnd9l\" (UID: \"2c8167ae-8941-4616-bee2-ff0fb5e98c16\") " pod="openstack/swift-proxy-667c455579-lnd9l" Oct 07 12:43:04 crc kubenswrapper[4854]: I1007 12:43:04.588364 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c8167ae-8941-4616-bee2-ff0fb5e98c16-run-httpd\") pod \"swift-proxy-667c455579-lnd9l\" (UID: \"2c8167ae-8941-4616-bee2-ff0fb5e98c16\") " pod="openstack/swift-proxy-667c455579-lnd9l" Oct 07 12:43:04 crc kubenswrapper[4854]: I1007 12:43:04.593454 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c8167ae-8941-4616-bee2-ff0fb5e98c16-public-tls-certs\") pod \"swift-proxy-667c455579-lnd9l\" (UID: \"2c8167ae-8941-4616-bee2-ff0fb5e98c16\") " pod="openstack/swift-proxy-667c455579-lnd9l" Oct 07 12:43:04 crc kubenswrapper[4854]: I1007 12:43:04.593500 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c8167ae-8941-4616-bee2-ff0fb5e98c16-combined-ca-bundle\") pod \"swift-proxy-667c455579-lnd9l\" (UID: \"2c8167ae-8941-4616-bee2-ff0fb5e98c16\") " pod="openstack/swift-proxy-667c455579-lnd9l" Oct 07 12:43:04 crc kubenswrapper[4854]: I1007 12:43:04.597978 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2c8167ae-8941-4616-bee2-ff0fb5e98c16-etc-swift\") pod \"swift-proxy-667c455579-lnd9l\" (UID: \"2c8167ae-8941-4616-bee2-ff0fb5e98c16\") " pod="openstack/swift-proxy-667c455579-lnd9l" Oct 07 12:43:04 crc kubenswrapper[4854]: I1007 12:43:04.599284 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c8167ae-8941-4616-bee2-ff0fb5e98c16-internal-tls-certs\") pod \"swift-proxy-667c455579-lnd9l\" (UID: \"2c8167ae-8941-4616-bee2-ff0fb5e98c16\") " pod="openstack/swift-proxy-667c455579-lnd9l" Oct 07 12:43:04 crc kubenswrapper[4854]: I1007 12:43:04.599977 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c8167ae-8941-4616-bee2-ff0fb5e98c16-config-data\") pod \"swift-proxy-667c455579-lnd9l\" (UID: \"2c8167ae-8941-4616-bee2-ff0fb5e98c16\") " pod="openstack/swift-proxy-667c455579-lnd9l" Oct 07 12:43:04 crc kubenswrapper[4854]: I1007 12:43:04.611891 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7qdc\" (UniqueName: \"kubernetes.io/projected/2c8167ae-8941-4616-bee2-ff0fb5e98c16-kube-api-access-l7qdc\") pod \"swift-proxy-667c455579-lnd9l\" (UID: \"2c8167ae-8941-4616-bee2-ff0fb5e98c16\") " pod="openstack/swift-proxy-667c455579-lnd9l" Oct 07 12:43:04 crc kubenswrapper[4854]: I1007 12:43:04.688535 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-667c455579-lnd9l" Oct 07 12:43:05 crc kubenswrapper[4854]: I1007 12:43:05.266033 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-667c455579-lnd9l"] Oct 07 12:43:05 crc kubenswrapper[4854]: I1007 12:43:05.536876 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 12:43:05 crc kubenswrapper[4854]: I1007 12:43:05.537423 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="efc4aa31-c1b9-4615-bca1-04c5f81d3024" containerName="ceilometer-central-agent" containerID="cri-o://3ab4bbcf97db87a62ff3a0e0264374c75b015402a12f1a9cec1680d13ecfcb9d" gracePeriod=30 Oct 07 12:43:05 crc kubenswrapper[4854]: I1007 12:43:05.537709 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="efc4aa31-c1b9-4615-bca1-04c5f81d3024" containerName="sg-core" containerID="cri-o://fc76adf34e01c5b363c966a85cb79552f2e821e0a6ad3c35ba4df05b7ba60bec" gracePeriod=30 Oct 07 12:43:05 crc kubenswrapper[4854]: I1007 12:43:05.537859 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="efc4aa31-c1b9-4615-bca1-04c5f81d3024" containerName="proxy-httpd" containerID="cri-o://0e27fa684f81118b3d69ffb7ca8797ef87e5550ba223a5f3b435edcb3fcbd78f" gracePeriod=30 Oct 07 12:43:05 crc kubenswrapper[4854]: I1007 12:43:05.537910 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="efc4aa31-c1b9-4615-bca1-04c5f81d3024" containerName="ceilometer-notification-agent" containerID="cri-o://dd6037f3dd57699d1eb4185e084b92eb28906c0fadfe662dc09d6034dba57d4b" gracePeriod=30 Oct 07 12:43:05 crc kubenswrapper[4854]: I1007 12:43:05.545909 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 07 12:43:06 crc kubenswrapper[4854]: I1007 12:43:06.119200 4854 generic.go:334] "Generic (PLEG): container finished" podID="efc4aa31-c1b9-4615-bca1-04c5f81d3024" containerID="0e27fa684f81118b3d69ffb7ca8797ef87e5550ba223a5f3b435edcb3fcbd78f" exitCode=0 Oct 07 12:43:06 crc kubenswrapper[4854]: I1007 12:43:06.119434 4854 generic.go:334] "Generic (PLEG): container finished" podID="efc4aa31-c1b9-4615-bca1-04c5f81d3024" 
containerID="fc76adf34e01c5b363c966a85cb79552f2e821e0a6ad3c35ba4df05b7ba60bec" exitCode=2 Oct 07 12:43:06 crc kubenswrapper[4854]: I1007 12:43:06.119442 4854 generic.go:334] "Generic (PLEG): container finished" podID="efc4aa31-c1b9-4615-bca1-04c5f81d3024" containerID="3ab4bbcf97db87a62ff3a0e0264374c75b015402a12f1a9cec1680d13ecfcb9d" exitCode=0 Oct 07 12:43:06 crc kubenswrapper[4854]: I1007 12:43:06.119276 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"efc4aa31-c1b9-4615-bca1-04c5f81d3024","Type":"ContainerDied","Data":"0e27fa684f81118b3d69ffb7ca8797ef87e5550ba223a5f3b435edcb3fcbd78f"} Oct 07 12:43:06 crc kubenswrapper[4854]: I1007 12:43:06.119498 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"efc4aa31-c1b9-4615-bca1-04c5f81d3024","Type":"ContainerDied","Data":"fc76adf34e01c5b363c966a85cb79552f2e821e0a6ad3c35ba4df05b7ba60bec"} Oct 07 12:43:06 crc kubenswrapper[4854]: I1007 12:43:06.119509 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"efc4aa31-c1b9-4615-bca1-04c5f81d3024","Type":"ContainerDied","Data":"3ab4bbcf97db87a62ff3a0e0264374c75b015402a12f1a9cec1680d13ecfcb9d"} Oct 07 12:43:06 crc kubenswrapper[4854]: I1007 12:43:06.121659 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-667c455579-lnd9l" event={"ID":"2c8167ae-8941-4616-bee2-ff0fb5e98c16","Type":"ContainerStarted","Data":"dd65a138aac9fa721dada3f006a871724f98f90b0fa13687c508bb7e6049a302"} Oct 07 12:43:06 crc kubenswrapper[4854]: I1007 12:43:06.121682 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-667c455579-lnd9l" event={"ID":"2c8167ae-8941-4616-bee2-ff0fb5e98c16","Type":"ContainerStarted","Data":"364708de26903cb2d2d7224428a3d27a659926ef0d8ef5da26e1579482e5acf8"} Oct 07 12:43:06 crc kubenswrapper[4854]: I1007 12:43:06.121692 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-667c455579-lnd9l" event={"ID":"2c8167ae-8941-4616-bee2-ff0fb5e98c16","Type":"ContainerStarted","Data":"8d1b853dcbf4379c9355c85ed7807d68fc02707946cc9d5d5c9ec4959b8edb49"} Oct 07 12:43:06 crc kubenswrapper[4854]: I1007 12:43:06.121783 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-667c455579-lnd9l" Oct 07 12:43:06 crc kubenswrapper[4854]: I1007 12:43:06.142486 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-667c455579-lnd9l" podStartSLOduration=2.142467695 podStartE2EDuration="2.142467695s" podCreationTimestamp="2025-10-07 12:43:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:43:06.142014382 +0000 UTC m=+1102.129846637" watchObservedRunningTime="2025-10-07 12:43:06.142467695 +0000 UTC m=+1102.130299950" Oct 07 12:43:07 crc kubenswrapper[4854]: I1007 12:43:07.132633 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-667c455579-lnd9l" Oct 07 12:43:08 crc kubenswrapper[4854]: I1007 12:43:08.835196 4854 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="efc4aa31-c1b9-4615-bca1-04c5f81d3024" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.156:3000/\": dial tcp 10.217.0.156:3000: connect: connection refused" Oct 07 12:43:09 crc kubenswrapper[4854]: I1007 12:43:09.007930 4854 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 07 12:43:10 crc kubenswrapper[4854]: I1007 12:43:10.060898 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 07 12:43:10 crc kubenswrapper[4854]: I1007 12:43:10.061454 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="6c132b4c-1591-4194-8912-637f54cea863" containerName="kube-state-metrics" containerID="cri-o://a2b65fae5dce349adef0285c33c91c4d9dbae7be68e750c07159d89c13fba861" gracePeriod=30 Oct 07 12:43:10 crc kubenswrapper[4854]: I1007 12:43:10.372953 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-vz2cc"] Oct 07 12:43:10 crc kubenswrapper[4854]: I1007 12:43:10.374083 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-vz2cc" Oct 07 12:43:10 crc kubenswrapper[4854]: I1007 12:43:10.385097 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-vz2cc"] Oct 07 12:43:10 crc kubenswrapper[4854]: I1007 12:43:10.464106 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-8bbnw"] Oct 07 12:43:10 crc kubenswrapper[4854]: I1007 12:43:10.465182 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-8bbnw" Oct 07 12:43:10 crc kubenswrapper[4854]: I1007 12:43:10.483264 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-8bbnw"] Oct 07 12:43:10 crc kubenswrapper[4854]: I1007 12:43:10.515602 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpfkq\" (UniqueName: \"kubernetes.io/projected/50a7ebb6-06f1-44cf-806a-f824afac8cf9-kube-api-access-bpfkq\") pod \"nova-api-db-create-vz2cc\" (UID: \"50a7ebb6-06f1-44cf-806a-f824afac8cf9\") " pod="openstack/nova-api-db-create-vz2cc" Oct 07 12:43:10 crc kubenswrapper[4854]: I1007 12:43:10.576485 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-k7545"] Oct 07 12:43:10 crc kubenswrapper[4854]: I1007 12:43:10.577867 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-k7545" Oct 07 12:43:10 crc kubenswrapper[4854]: I1007 12:43:10.588418 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-k7545"] Oct 07 12:43:10 crc kubenswrapper[4854]: I1007 12:43:10.616816 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqp9v\" (UniqueName: \"kubernetes.io/projected/a356da3f-0fbb-4d65-8b1f-a4fc6a0a881f-kube-api-access-fqp9v\") pod \"nova-cell0-db-create-8bbnw\" (UID: \"a356da3f-0fbb-4d65-8b1f-a4fc6a0a881f\") " pod="openstack/nova-cell0-db-create-8bbnw" Oct 07 12:43:10 crc kubenswrapper[4854]: I1007 12:43:10.616878 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpfkq\" (UniqueName: \"kubernetes.io/projected/50a7ebb6-06f1-44cf-806a-f824afac8cf9-kube-api-access-bpfkq\") pod \"nova-api-db-create-vz2cc\" (UID: \"50a7ebb6-06f1-44cf-806a-f824afac8cf9\") " pod="openstack/nova-api-db-create-vz2cc" Oct 07 12:43:10 crc kubenswrapper[4854]: I1007 12:43:10.636645 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpfkq\" (UniqueName: \"kubernetes.io/projected/50a7ebb6-06f1-44cf-806a-f824afac8cf9-kube-api-access-bpfkq\") pod \"nova-api-db-create-vz2cc\" (UID: \"50a7ebb6-06f1-44cf-806a-f824afac8cf9\") " pod="openstack/nova-api-db-create-vz2cc" Oct 07 12:43:10 crc kubenswrapper[4854]: I1007 12:43:10.718985 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8ttj\" (UniqueName: \"kubernetes.io/projected/982baa82-c0c6-4cc7-8c1e-fd351b582446-kube-api-access-f8ttj\") pod \"nova-cell1-db-create-k7545\" (UID: \"982baa82-c0c6-4cc7-8c1e-fd351b582446\") " pod="openstack/nova-cell1-db-create-k7545" Oct 07 12:43:10 crc kubenswrapper[4854]: I1007 12:43:10.719524 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqp9v\" (UniqueName: \"kubernetes.io/projected/a356da3f-0fbb-4d65-8b1f-a4fc6a0a881f-kube-api-access-fqp9v\") pod \"nova-cell0-db-create-8bbnw\" (UID: \"a356da3f-0fbb-4d65-8b1f-a4fc6a0a881f\") " pod="openstack/nova-cell0-db-create-8bbnw" Oct 07 12:43:10 crc kubenswrapper[4854]: I1007 12:43:10.742358 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqp9v\" (UniqueName: \"kubernetes.io/projected/a356da3f-0fbb-4d65-8b1f-a4fc6a0a881f-kube-api-access-fqp9v\") pod \"nova-cell0-db-create-8bbnw\" (UID: \"a356da3f-0fbb-4d65-8b1f-a4fc6a0a881f\") " pod="openstack/nova-cell0-db-create-8bbnw" Oct 07 12:43:10 crc kubenswrapper[4854]: I1007 12:43:10.806273 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-vz2cc" Oct 07 12:43:10 crc kubenswrapper[4854]: I1007 12:43:10.807062 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-8bbnw" Oct 07 12:43:10 crc kubenswrapper[4854]: I1007 12:43:10.821815 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8ttj\" (UniqueName: \"kubernetes.io/projected/982baa82-c0c6-4cc7-8c1e-fd351b582446-kube-api-access-f8ttj\") pod \"nova-cell1-db-create-k7545\" (UID: \"982baa82-c0c6-4cc7-8c1e-fd351b582446\") " pod="openstack/nova-cell1-db-create-k7545" Oct 07 12:43:10 crc kubenswrapper[4854]: I1007 12:43:10.838707 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8ttj\" (UniqueName: \"kubernetes.io/projected/982baa82-c0c6-4cc7-8c1e-fd351b582446-kube-api-access-f8ttj\") pod \"nova-cell1-db-create-k7545\" (UID: \"982baa82-c0c6-4cc7-8c1e-fd351b582446\") " pod="openstack/nova-cell1-db-create-k7545" Oct 07 12:43:10 crc kubenswrapper[4854]: I1007 12:43:10.845600 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 12:43:10 crc kubenswrapper[4854]: I1007 12:43:10.845966 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="b6837285-79fa-44f5-83bd-5d1e5959cb8a" containerName="glance-httpd" containerID="cri-o://00e4df36e042f5a5d48c09c816b2ba5f585be063679da01da6210f205b34bcd6" gracePeriod=30 Oct 07 12:43:10 crc kubenswrapper[4854]: I1007 12:43:10.846339 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="b6837285-79fa-44f5-83bd-5d1e5959cb8a" containerName="glance-log" containerID="cri-o://5a5aba34501996755646bff327da7dc06f8295afd26fd3399b7b27ec146c57c0" gracePeriod=30 Oct 07 12:43:10 crc kubenswrapper[4854]: I1007 12:43:10.898881 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-k7545" Oct 07 12:43:11 crc kubenswrapper[4854]: I1007 12:43:11.185956 4854 generic.go:334] "Generic (PLEG): container finished" podID="6c132b4c-1591-4194-8912-637f54cea863" containerID="a2b65fae5dce349adef0285c33c91c4d9dbae7be68e750c07159d89c13fba861" exitCode=2 Oct 07 12:43:11 crc kubenswrapper[4854]: I1007 12:43:11.186035 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"6c132b4c-1591-4194-8912-637f54cea863","Type":"ContainerDied","Data":"a2b65fae5dce349adef0285c33c91c4d9dbae7be68e750c07159d89c13fba861"} Oct 07 12:43:11 crc kubenswrapper[4854]: I1007 12:43:11.189655 4854 generic.go:334] "Generic (PLEG): container finished" podID="efc4aa31-c1b9-4615-bca1-04c5f81d3024" containerID="dd6037f3dd57699d1eb4185e084b92eb28906c0fadfe662dc09d6034dba57d4b" exitCode=0 Oct 07 12:43:11 crc kubenswrapper[4854]: I1007 12:43:11.189728 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"efc4aa31-c1b9-4615-bca1-04c5f81d3024","Type":"ContainerDied","Data":"dd6037f3dd57699d1eb4185e084b92eb28906c0fadfe662dc09d6034dba57d4b"} Oct 07 12:43:11 crc kubenswrapper[4854]: I1007 12:43:11.191788 4854 generic.go:334] "Generic (PLEG): container finished" podID="b6837285-79fa-44f5-83bd-5d1e5959cb8a" containerID="5a5aba34501996755646bff327da7dc06f8295afd26fd3399b7b27ec146c57c0" exitCode=143 Oct 07 12:43:11 crc kubenswrapper[4854]: I1007 12:43:11.191826 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b6837285-79fa-44f5-83bd-5d1e5959cb8a","Type":"ContainerDied","Data":"5a5aba34501996755646bff327da7dc06f8295afd26fd3399b7b27ec146c57c0"} Oct 07 12:43:11 crc kubenswrapper[4854]: I1007 12:43:11.477805 4854 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/kube-state-metrics-0" podUID="6c132b4c-1591-4194-8912-637f54cea863" containerName="kube-state-metrics" probeResult="failure" output="Get \"http://10.217.0.107:8081/readyz\": dial tcp 10.217.0.107:8081: connect: connection refused" Oct 07 12:43:12 crc kubenswrapper[4854]: I1007 12:43:12.397874 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 07 12:43:12 crc kubenswrapper[4854]: I1007 12:43:12.447648 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2vhn\" (UniqueName: \"kubernetes.io/projected/6c132b4c-1591-4194-8912-637f54cea863-kube-api-access-d2vhn\") pod \"6c132b4c-1591-4194-8912-637f54cea863\" (UID: \"6c132b4c-1591-4194-8912-637f54cea863\") " Oct 07 12:43:12 crc kubenswrapper[4854]: I1007 12:43:12.452355 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c132b4c-1591-4194-8912-637f54cea863-kube-api-access-d2vhn" (OuterVolumeSpecName: "kube-api-access-d2vhn") pod "6c132b4c-1591-4194-8912-637f54cea863" (UID: "6c132b4c-1591-4194-8912-637f54cea863"). InnerVolumeSpecName "kube-api-access-d2vhn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:43:12 crc kubenswrapper[4854]: I1007 12:43:12.540315 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 12:43:12 crc kubenswrapper[4854]: I1007 12:43:12.549553 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2vhn\" (UniqueName: \"kubernetes.io/projected/6c132b4c-1591-4194-8912-637f54cea863-kube-api-access-d2vhn\") on node \"crc\" DevicePath \"\"" Oct 07 12:43:12 crc kubenswrapper[4854]: I1007 12:43:12.650817 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efc4aa31-c1b9-4615-bca1-04c5f81d3024-scripts\") pod \"efc4aa31-c1b9-4615-bca1-04c5f81d3024\" (UID: \"efc4aa31-c1b9-4615-bca1-04c5f81d3024\") " Oct 07 12:43:12 crc kubenswrapper[4854]: I1007 12:43:12.650878 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/efc4aa31-c1b9-4615-bca1-04c5f81d3024-log-httpd\") pod \"efc4aa31-c1b9-4615-bca1-04c5f81d3024\" (UID: \"efc4aa31-c1b9-4615-bca1-04c5f81d3024\") " Oct 07 12:43:12 crc kubenswrapper[4854]: I1007 12:43:12.650922 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7cdv\" (UniqueName: \"kubernetes.io/projected/efc4aa31-c1b9-4615-bca1-04c5f81d3024-kube-api-access-z7cdv\") pod \"efc4aa31-c1b9-4615-bca1-04c5f81d3024\" (UID: \"efc4aa31-c1b9-4615-bca1-04c5f81d3024\") " Oct 07 12:43:12 crc kubenswrapper[4854]: I1007 12:43:12.650942 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/efc4aa31-c1b9-4615-bca1-04c5f81d3024-sg-core-conf-yaml\") pod \"efc4aa31-c1b9-4615-bca1-04c5f81d3024\" (UID: \"efc4aa31-c1b9-4615-bca1-04c5f81d3024\") " Oct 07 12:43:12 crc kubenswrapper[4854]: I1007 12:43:12.650978 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/efc4aa31-c1b9-4615-bca1-04c5f81d3024-run-httpd\") pod \"efc4aa31-c1b9-4615-bca1-04c5f81d3024\" (UID: \"efc4aa31-c1b9-4615-bca1-04c5f81d3024\") " Oct 07 12:43:12 crc kubenswrapper[4854]: I1007 12:43:12.651010 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efc4aa31-c1b9-4615-bca1-04c5f81d3024-combined-ca-bundle\") pod \"efc4aa31-c1b9-4615-bca1-04c5f81d3024\" (UID: \"efc4aa31-c1b9-4615-bca1-04c5f81d3024\") " Oct 07 12:43:12 crc kubenswrapper[4854]: I1007 12:43:12.651031 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efc4aa31-c1b9-4615-bca1-04c5f81d3024-config-data\") pod \"efc4aa31-c1b9-4615-bca1-04c5f81d3024\" (UID: \"efc4aa31-c1b9-4615-bca1-04c5f81d3024\") " Oct 07 12:43:12 crc kubenswrapper[4854]: I1007 12:43:12.651614 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/efc4aa31-c1b9-4615-bca1-04c5f81d3024-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "efc4aa31-c1b9-4615-bca1-04c5f81d3024" (UID: "efc4aa31-c1b9-4615-bca1-04c5f81d3024"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:43:12 crc kubenswrapper[4854]: I1007 12:43:12.652059 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/efc4aa31-c1b9-4615-bca1-04c5f81d3024-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "efc4aa31-c1b9-4615-bca1-04c5f81d3024" (UID: "efc4aa31-c1b9-4615-bca1-04c5f81d3024"). 
InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:43:12 crc kubenswrapper[4854]: I1007 12:43:12.659892 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efc4aa31-c1b9-4615-bca1-04c5f81d3024-kube-api-access-z7cdv" (OuterVolumeSpecName: "kube-api-access-z7cdv") pod "efc4aa31-c1b9-4615-bca1-04c5f81d3024" (UID: "efc4aa31-c1b9-4615-bca1-04c5f81d3024"). InnerVolumeSpecName "kube-api-access-z7cdv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:43:12 crc kubenswrapper[4854]: I1007 12:43:12.661124 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efc4aa31-c1b9-4615-bca1-04c5f81d3024-scripts" (OuterVolumeSpecName: "scripts") pod "efc4aa31-c1b9-4615-bca1-04c5f81d3024" (UID: "efc4aa31-c1b9-4615-bca1-04c5f81d3024"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:43:12 crc kubenswrapper[4854]: I1007 12:43:12.683426 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efc4aa31-c1b9-4615-bca1-04c5f81d3024-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "efc4aa31-c1b9-4615-bca1-04c5f81d3024" (UID: "efc4aa31-c1b9-4615-bca1-04c5f81d3024"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:43:12 crc kubenswrapper[4854]: I1007 12:43:12.748915 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-k7545"] Oct 07 12:43:12 crc kubenswrapper[4854]: I1007 12:43:12.754236 4854 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/efc4aa31-c1b9-4615-bca1-04c5f81d3024-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 12:43:12 crc kubenswrapper[4854]: I1007 12:43:12.754263 4854 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efc4aa31-c1b9-4615-bca1-04c5f81d3024-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 12:43:12 crc kubenswrapper[4854]: I1007 12:43:12.754273 4854 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/efc4aa31-c1b9-4615-bca1-04c5f81d3024-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 12:43:12 crc kubenswrapper[4854]: I1007 12:43:12.754283 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7cdv\" (UniqueName: \"kubernetes.io/projected/efc4aa31-c1b9-4615-bca1-04c5f81d3024-kube-api-access-z7cdv\") on node \"crc\" DevicePath \"\"" Oct 07 12:43:12 crc kubenswrapper[4854]: I1007 12:43:12.754294 4854 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/efc4aa31-c1b9-4615-bca1-04c5f81d3024-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 07 12:43:12 crc kubenswrapper[4854]: W1007 12:43:12.757599 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50a7ebb6_06f1_44cf_806a_f824afac8cf9.slice/crio-26fbbf334768177558e81c7ff0c62f5e1400bebb52948cbbd3223f4166c145d5 WatchSource:0}: Error finding container 26fbbf334768177558e81c7ff0c62f5e1400bebb52948cbbd3223f4166c145d5: Status 404 returned error can't find the container with id 26fbbf334768177558e81c7ff0c62f5e1400bebb52948cbbd3223f4166c145d5 Oct 07 12:43:12 crc kubenswrapper[4854]: I1007 12:43:12.761638 4854 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/secret/efc4aa31-c1b9-4615-bca1-04c5f81d3024-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "efc4aa31-c1b9-4615-bca1-04c5f81d3024" (UID: "efc4aa31-c1b9-4615-bca1-04c5f81d3024"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:43:12 crc kubenswrapper[4854]: I1007 12:43:12.766641 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-vz2cc"] Oct 07 12:43:12 crc kubenswrapper[4854]: I1007 12:43:12.806972 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efc4aa31-c1b9-4615-bca1-04c5f81d3024-config-data" (OuterVolumeSpecName: "config-data") pod "efc4aa31-c1b9-4615-bca1-04c5f81d3024" (UID: "efc4aa31-c1b9-4615-bca1-04c5f81d3024"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:43:12 crc kubenswrapper[4854]: I1007 12:43:12.855659 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efc4aa31-c1b9-4615-bca1-04c5f81d3024-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:43:12 crc kubenswrapper[4854]: I1007 12:43:12.855688 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efc4aa31-c1b9-4615-bca1-04c5f81d3024-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 12:43:12 crc kubenswrapper[4854]: I1007 12:43:12.900001 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-8bbnw"] Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.098238 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-b8c6459d6-knb66" Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.236517 4854 generic.go:334] "Generic (PLEG): container finished" podID="a356da3f-0fbb-4d65-8b1f-a4fc6a0a881f" containerID="b307bfacb066ee957083e0a594099299bdfed229bc968e0f23b8125a6b5c50a4" exitCode=0 Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.236939 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-8bbnw" event={"ID":"a356da3f-0fbb-4d65-8b1f-a4fc6a0a881f","Type":"ContainerDied","Data":"b307bfacb066ee957083e0a594099299bdfed229bc968e0f23b8125a6b5c50a4"} Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.237197 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-8bbnw" event={"ID":"a356da3f-0fbb-4d65-8b1f-a4fc6a0a881f","Type":"ContainerStarted","Data":"acb47181e208ad8395c1798b2833209a1c4c4e08062161c3ab9bf6c577d55091"} Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.241247 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-vz2cc" event={"ID":"50a7ebb6-06f1-44cf-806a-f824afac8cf9","Type":"ContainerDied","Data":"d7769a32795acfc7b8c09a637fbe56dba9fefbe79baa9e6e7baa998c4c190434"} Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.239796 4854 generic.go:334] "Generic (PLEG): container finished" podID="50a7ebb6-06f1-44cf-806a-f824afac8cf9" containerID="d7769a32795acfc7b8c09a637fbe56dba9fefbe79baa9e6e7baa998c4c190434" exitCode=0 Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.241490 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-vz2cc" event={"ID":"50a7ebb6-06f1-44cf-806a-f824afac8cf9","Type":"ContainerStarted","Data":"26fbbf334768177558e81c7ff0c62f5e1400bebb52948cbbd3223f4166c145d5"} Oct 07 
12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.243907 4854 generic.go:334] "Generic (PLEG): container finished" podID="982baa82-c0c6-4cc7-8c1e-fd351b582446" containerID="0a7cb163d43e07e5f8bed43aadca4cd323a0ec917a18fb9d72237c12ae237809" exitCode=0 Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.243959 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-k7545" event={"ID":"982baa82-c0c6-4cc7-8c1e-fd351b582446","Type":"ContainerDied","Data":"0a7cb163d43e07e5f8bed43aadca4cd323a0ec917a18fb9d72237c12ae237809"} Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.243978 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-k7545" event={"ID":"982baa82-c0c6-4cc7-8c1e-fd351b582446","Type":"ContainerStarted","Data":"f06582b41b537b07f189565087c42b289acbf0e8fc341efc389c3120a7eb3513"} Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.251994 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"0416b15d-a0a1-4bf2-bd86-4209c14c8e48","Type":"ContainerStarted","Data":"d08d57f3c4a2642e473841c6986c3097f5c785af5a96f4a083b730e4a989d74c"} Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.253931 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.253948 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"6c132b4c-1591-4194-8912-637f54cea863","Type":"ContainerDied","Data":"556ab3cd7fa1d85d049f918caeb3a5b0e00dd0377241a23fc44567af89a5a0f2"} Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.253990 4854 scope.go:117] "RemoveContainer" containerID="a2b65fae5dce349adef0285c33c91c4d9dbae7be68e750c07159d89c13fba861" Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.256978 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"efc4aa31-c1b9-4615-bca1-04c5f81d3024","Type":"ContainerDied","Data":"6f33c5e04ac8af8ed233bdc01118f9feb7f99398c45ef3115fd12dfe9b7dcf34"} Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.257079 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.300584 4854 scope.go:117] "RemoveContainer" containerID="0e27fa684f81118b3d69ffb7ca8797ef87e5550ba223a5f3b435edcb3fcbd78f" Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.319335 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.661896464 podStartE2EDuration="13.319316372s" podCreationTimestamp="2025-10-07 12:43:00 +0000 UTC" firstStartedPulling="2025-10-07 12:43:01.550828706 +0000 UTC m=+1097.538660961" lastFinishedPulling="2025-10-07 12:43:12.208248614 +0000 UTC m=+1108.196080869" observedRunningTime="2025-10-07 12:43:13.304600772 +0000 UTC m=+1109.292433027" watchObservedRunningTime="2025-10-07 12:43:13.319316372 +0000 UTC m=+1109.307148627" Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.343544 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.350936 4854 scope.go:117] "RemoveContainer" containerID="fc76adf34e01c5b363c966a85cb79552f2e821e0a6ad3c35ba4df05b7ba60bec" Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.351949 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.369255 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.383762 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.389555 4854 scope.go:117] "RemoveContainer" containerID="dd6037f3dd57699d1eb4185e084b92eb28906c0fadfe662dc09d6034dba57d4b" Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.392464 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 07 12:43:13 crc kubenswrapper[4854]: E1007 12:43:13.392883 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efc4aa31-c1b9-4615-bca1-04c5f81d3024" containerName="sg-core" Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.392899 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="efc4aa31-c1b9-4615-bca1-04c5f81d3024" containerName="sg-core" Oct 07 12:43:13 crc kubenswrapper[4854]: E1007 12:43:13.392911 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efc4aa31-c1b9-4615-bca1-04c5f81d3024" containerName="proxy-httpd" Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.392918 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="efc4aa31-c1b9-4615-bca1-04c5f81d3024" containerName="proxy-httpd" Oct 07 12:43:13 crc kubenswrapper[4854]: E1007 12:43:13.392932 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efc4aa31-c1b9-4615-bca1-04c5f81d3024" containerName="ceilometer-notification-agent" Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.392938 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="efc4aa31-c1b9-4615-bca1-04c5f81d3024" containerName="ceilometer-notification-agent" Oct 07 12:43:13 crc kubenswrapper[4854]: E1007 12:43:13.392968 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efc4aa31-c1b9-4615-bca1-04c5f81d3024" containerName="ceilometer-central-agent" Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.392974 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="efc4aa31-c1b9-4615-bca1-04c5f81d3024" 
containerName="ceilometer-central-agent" Oct 07 12:43:13 crc kubenswrapper[4854]: E1007 12:43:13.392981 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c132b4c-1591-4194-8912-637f54cea863" containerName="kube-state-metrics" Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.392988 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c132b4c-1591-4194-8912-637f54cea863" containerName="kube-state-metrics" Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.393171 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c132b4c-1591-4194-8912-637f54cea863" containerName="kube-state-metrics" Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.393183 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="efc4aa31-c1b9-4615-bca1-04c5f81d3024" containerName="sg-core" Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.393195 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="efc4aa31-c1b9-4615-bca1-04c5f81d3024" containerName="ceilometer-notification-agent" Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.393204 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="efc4aa31-c1b9-4615-bca1-04c5f81d3024" containerName="ceilometer-central-agent" Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.393218 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="efc4aa31-c1b9-4615-bca1-04c5f81d3024" containerName="proxy-httpd" Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.393865 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.397280 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-z298z" Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.398195 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.398384 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.412269 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.439823 4854 scope.go:117] "RemoveContainer" containerID="3ab4bbcf97db87a62ff3a0e0264374c75b015402a12f1a9cec1680d13ecfcb9d" Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.443105 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.445964 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.448182 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.448559 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.448764 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.471127 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.472417 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e309be64-7a5a-4156-89a6-d1201eaaff63-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"e309be64-7a5a-4156-89a6-d1201eaaff63\") " pod="openstack/kube-state-metrics-0" Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.472623 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/e309be64-7a5a-4156-89a6-d1201eaaff63-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"e309be64-7a5a-4156-89a6-d1201eaaff63\") " pod="openstack/kube-state-metrics-0" Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.472734 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-td7lc\" (UniqueName: \"kubernetes.io/projected/e309be64-7a5a-4156-89a6-d1201eaaff63-kube-api-access-td7lc\") pod \"kube-state-metrics-0\" (UID: \"e309be64-7a5a-4156-89a6-d1201eaaff63\") " pod="openstack/kube-state-metrics-0" Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.473094 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/e309be64-7a5a-4156-89a6-d1201eaaff63-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"e309be64-7a5a-4156-89a6-d1201eaaff63\") " pod="openstack/kube-state-metrics-0" Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.575139 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7a15303e-66e9-4c76-ba02-902d1949cf49-log-httpd\") pod \"ceilometer-0\" (UID: \"7a15303e-66e9-4c76-ba02-902d1949cf49\") " pod="openstack/ceilometer-0" Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.575264 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a15303e-66e9-4c76-ba02-902d1949cf49-scripts\") pod \"ceilometer-0\" (UID: \"7a15303e-66e9-4c76-ba02-902d1949cf49\") " pod="openstack/ceilometer-0" Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.575307 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/e309be64-7a5a-4156-89a6-d1201eaaff63-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"e309be64-7a5a-4156-89a6-d1201eaaff63\") " pod="openstack/kube-state-metrics-0" Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.575341 4854 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7a15303e-66e9-4c76-ba02-902d1949cf49-run-httpd\") pod \"ceilometer-0\" (UID: \"7a15303e-66e9-4c76-ba02-902d1949cf49\") " pod="openstack/ceilometer-0" Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.575365 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a15303e-66e9-4c76-ba02-902d1949cf49-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7a15303e-66e9-4c76-ba02-902d1949cf49\") " pod="openstack/ceilometer-0" Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.575417 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e309be64-7a5a-4156-89a6-d1201eaaff63-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"e309be64-7a5a-4156-89a6-d1201eaaff63\") " pod="openstack/kube-state-metrics-0" Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.575492 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a15303e-66e9-4c76-ba02-902d1949cf49-config-data\") pod \"ceilometer-0\" (UID: \"7a15303e-66e9-4c76-ba02-902d1949cf49\") " pod="openstack/ceilometer-0" Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.575556 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8r9rc\" (UniqueName: \"kubernetes.io/projected/7a15303e-66e9-4c76-ba02-902d1949cf49-kube-api-access-8r9rc\") pod \"ceilometer-0\" (UID: \"7a15303e-66e9-4c76-ba02-902d1949cf49\") " pod="openstack/ceilometer-0" Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.575602 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a15303e-66e9-4c76-ba02-902d1949cf49-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7a15303e-66e9-4c76-ba02-902d1949cf49\") " pod="openstack/ceilometer-0" Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.575638 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-td7lc\" (UniqueName: \"kubernetes.io/projected/e309be64-7a5a-4156-89a6-d1201eaaff63-kube-api-access-td7lc\") pod \"kube-state-metrics-0\" (UID: \"e309be64-7a5a-4156-89a6-d1201eaaff63\") " pod="openstack/kube-state-metrics-0" Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.575668 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/e309be64-7a5a-4156-89a6-d1201eaaff63-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"e309be64-7a5a-4156-89a6-d1201eaaff63\") " pod="openstack/kube-state-metrics-0" Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.575717 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7a15303e-66e9-4c76-ba02-902d1949cf49-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7a15303e-66e9-4c76-ba02-902d1949cf49\") " pod="openstack/ceilometer-0" Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.581892 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/e309be64-7a5a-4156-89a6-d1201eaaff63-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"e309be64-7a5a-4156-89a6-d1201eaaff63\") " pod="openstack/kube-state-metrics-0" Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.582661 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e309be64-7a5a-4156-89a6-d1201eaaff63-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"e309be64-7a5a-4156-89a6-d1201eaaff63\") " pod="openstack/kube-state-metrics-0" Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.600350 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-td7lc\" (UniqueName: \"kubernetes.io/projected/e309be64-7a5a-4156-89a6-d1201eaaff63-kube-api-access-td7lc\") pod \"kube-state-metrics-0\" (UID: \"e309be64-7a5a-4156-89a6-d1201eaaff63\") " pod="openstack/kube-state-metrics-0" Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.600863 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/e309be64-7a5a-4156-89a6-d1201eaaff63-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"e309be64-7a5a-4156-89a6-d1201eaaff63\") " pod="openstack/kube-state-metrics-0" Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.677548 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a15303e-66e9-4c76-ba02-902d1949cf49-config-data\") pod \"ceilometer-0\" (UID: \"7a15303e-66e9-4c76-ba02-902d1949cf49\") " pod="openstack/ceilometer-0" Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.677610 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8r9rc\" (UniqueName: \"kubernetes.io/projected/7a15303e-66e9-4c76-ba02-902d1949cf49-kube-api-access-8r9rc\") pod \"ceilometer-0\" (UID: \"7a15303e-66e9-4c76-ba02-902d1949cf49\") " pod="openstack/ceilometer-0" Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.677641 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a15303e-66e9-4c76-ba02-902d1949cf49-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7a15303e-66e9-4c76-ba02-902d1949cf49\") " pod="openstack/ceilometer-0" Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.677670 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7a15303e-66e9-4c76-ba02-902d1949cf49-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7a15303e-66e9-4c76-ba02-902d1949cf49\") " pod="openstack/ceilometer-0" Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.677693 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7a15303e-66e9-4c76-ba02-902d1949cf49-log-httpd\") pod \"ceilometer-0\" (UID: \"7a15303e-66e9-4c76-ba02-902d1949cf49\") " pod="openstack/ceilometer-0" Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.677744 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a15303e-66e9-4c76-ba02-902d1949cf49-scripts\") pod \"ceilometer-0\" (UID: \"7a15303e-66e9-4c76-ba02-902d1949cf49\") " pod="openstack/ceilometer-0" Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.677773 4854 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7a15303e-66e9-4c76-ba02-902d1949cf49-run-httpd\") pod \"ceilometer-0\" (UID: \"7a15303e-66e9-4c76-ba02-902d1949cf49\") " pod="openstack/ceilometer-0" Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.677790 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a15303e-66e9-4c76-ba02-902d1949cf49-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7a15303e-66e9-4c76-ba02-902d1949cf49\") " pod="openstack/ceilometer-0" Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.678575 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7a15303e-66e9-4c76-ba02-902d1949cf49-log-httpd\") pod \"ceilometer-0\" (UID: \"7a15303e-66e9-4c76-ba02-902d1949cf49\") " pod="openstack/ceilometer-0" Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.678791 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7a15303e-66e9-4c76-ba02-902d1949cf49-run-httpd\") pod \"ceilometer-0\" (UID: \"7a15303e-66e9-4c76-ba02-902d1949cf49\") " pod="openstack/ceilometer-0" Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.681440 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a15303e-66e9-4c76-ba02-902d1949cf49-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7a15303e-66e9-4c76-ba02-902d1949cf49\") " pod="openstack/ceilometer-0" Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.681558 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7a15303e-66e9-4c76-ba02-902d1949cf49-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7a15303e-66e9-4c76-ba02-902d1949cf49\") " pod="openstack/ceilometer-0" Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.681559 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a15303e-66e9-4c76-ba02-902d1949cf49-scripts\") pod \"ceilometer-0\" (UID: \"7a15303e-66e9-4c76-ba02-902d1949cf49\") " pod="openstack/ceilometer-0" Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.682229 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a15303e-66e9-4c76-ba02-902d1949cf49-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7a15303e-66e9-4c76-ba02-902d1949cf49\") " pod="openstack/ceilometer-0" Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.682578 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a15303e-66e9-4c76-ba02-902d1949cf49-config-data\") pod \"ceilometer-0\" (UID: \"7a15303e-66e9-4c76-ba02-902d1949cf49\") " pod="openstack/ceilometer-0" Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.697676 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8r9rc\" (UniqueName: \"kubernetes.io/projected/7a15303e-66e9-4c76-ba02-902d1949cf49-kube-api-access-8r9rc\") pod \"ceilometer-0\" (UID: \"7a15303e-66e9-4c76-ba02-902d1949cf49\") " pod="openstack/ceilometer-0" Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.723946 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 07 12:43:13 crc kubenswrapper[4854]: I1007 12:43:13.763518 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 12:43:14 crc kubenswrapper[4854]: I1007 12:43:14.075583 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 07 12:43:14 crc kubenswrapper[4854]: I1007 12:43:14.272159 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e309be64-7a5a-4156-89a6-d1201eaaff63","Type":"ContainerStarted","Data":"26680bebc3fd5e6e9b2b69dbb0e21addd670512464a9a360a906df7833a01d0e"} Oct 07 12:43:14 crc kubenswrapper[4854]: I1007 12:43:14.334504 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 12:43:14 crc kubenswrapper[4854]: W1007 12:43:14.362081 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a15303e_66e9_4c76_ba02_902d1949cf49.slice/crio-daeb0808f9280ca4fadd369f7263738401cbd4f3ae302e4e8cf237c5864f7a73 WatchSource:0}: Error finding container daeb0808f9280ca4fadd369f7263738401cbd4f3ae302e4e8cf237c5864f7a73: Status 404 returned error can't find the container with id daeb0808f9280ca4fadd369f7263738401cbd4f3ae302e4e8cf237c5864f7a73 Oct 07 12:43:14 crc kubenswrapper[4854]: I1007 12:43:14.478815 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 12:43:14 crc kubenswrapper[4854]: I1007 12:43:14.695349 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-667c455579-lnd9l" Oct 07 12:43:14 crc kubenswrapper[4854]: I1007 12:43:14.695931 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-667c455579-lnd9l" Oct 07 12:43:14 crc kubenswrapper[4854]: I1007 12:43:14.742408 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c132b4c-1591-4194-8912-637f54cea863" path="/var/lib/kubelet/pods/6c132b4c-1591-4194-8912-637f54cea863/volumes" Oct 07 12:43:14 crc kubenswrapper[4854]: I1007 12:43:14.754919 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efc4aa31-c1b9-4615-bca1-04c5f81d3024" path="/var/lib/kubelet/pods/efc4aa31-c1b9-4615-bca1-04c5f81d3024/volumes" Oct 07 12:43:14 crc kubenswrapper[4854]: I1007 12:43:14.889901 4854 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod21abefeb-fe0f-4c5b-b908-ae201a4cabdb"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod21abefeb-fe0f-4c5b-b908-ae201a4cabdb] : Timed out while waiting for systemd to remove kubepods-besteffort-pod21abefeb_fe0f_4c5b_b908_ae201a4cabdb.slice" Oct 07 12:43:14 crc kubenswrapper[4854]: I1007 12:43:14.915503 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-vz2cc" Oct 07 12:43:14 crc kubenswrapper[4854]: I1007 12:43:14.946921 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-k7545" Oct 07 12:43:14 crc kubenswrapper[4854]: I1007 12:43:14.948844 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-8bbnw" Oct 07 12:43:15 crc kubenswrapper[4854]: I1007 12:43:15.006541 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqp9v\" (UniqueName: \"kubernetes.io/projected/a356da3f-0fbb-4d65-8b1f-a4fc6a0a881f-kube-api-access-fqp9v\") pod \"a356da3f-0fbb-4d65-8b1f-a4fc6a0a881f\" (UID: \"a356da3f-0fbb-4d65-8b1f-a4fc6a0a881f\") " Oct 07 12:43:15 crc kubenswrapper[4854]: I1007 12:43:15.006773 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpfkq\" (UniqueName: \"kubernetes.io/projected/50a7ebb6-06f1-44cf-806a-f824afac8cf9-kube-api-access-bpfkq\") pod \"50a7ebb6-06f1-44cf-806a-f824afac8cf9\" (UID: \"50a7ebb6-06f1-44cf-806a-f824afac8cf9\") " Oct 07 12:43:15 crc kubenswrapper[4854]: I1007 12:43:15.006919 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8ttj\" (UniqueName: \"kubernetes.io/projected/982baa82-c0c6-4cc7-8c1e-fd351b582446-kube-api-access-f8ttj\") pod \"982baa82-c0c6-4cc7-8c1e-fd351b582446\" (UID: \"982baa82-c0c6-4cc7-8c1e-fd351b582446\") " Oct 07 12:43:15 crc kubenswrapper[4854]: I1007 12:43:15.012819 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50a7ebb6-06f1-44cf-806a-f824afac8cf9-kube-api-access-bpfkq" (OuterVolumeSpecName: "kube-api-access-bpfkq") pod "50a7ebb6-06f1-44cf-806a-f824afac8cf9" (UID: "50a7ebb6-06f1-44cf-806a-f824afac8cf9"). InnerVolumeSpecName "kube-api-access-bpfkq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:43:15 crc kubenswrapper[4854]: I1007 12:43:15.013261 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/982baa82-c0c6-4cc7-8c1e-fd351b582446-kube-api-access-f8ttj" (OuterVolumeSpecName: "kube-api-access-f8ttj") pod "982baa82-c0c6-4cc7-8c1e-fd351b582446" (UID: "982baa82-c0c6-4cc7-8c1e-fd351b582446"). InnerVolumeSpecName "kube-api-access-f8ttj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:43:15 crc kubenswrapper[4854]: I1007 12:43:15.016209 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a356da3f-0fbb-4d65-8b1f-a4fc6a0a881f-kube-api-access-fqp9v" (OuterVolumeSpecName: "kube-api-access-fqp9v") pod "a356da3f-0fbb-4d65-8b1f-a4fc6a0a881f" (UID: "a356da3f-0fbb-4d65-8b1f-a4fc6a0a881f"). InnerVolumeSpecName "kube-api-access-fqp9v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:43:15 crc kubenswrapper[4854]: I1007 12:43:15.109649 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpfkq\" (UniqueName: \"kubernetes.io/projected/50a7ebb6-06f1-44cf-806a-f824afac8cf9-kube-api-access-bpfkq\") on node \"crc\" DevicePath \"\"" Oct 07 12:43:15 crc kubenswrapper[4854]: I1007 12:43:15.109908 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8ttj\" (UniqueName: \"kubernetes.io/projected/982baa82-c0c6-4cc7-8c1e-fd351b582446-kube-api-access-f8ttj\") on node \"crc\" DevicePath \"\"" Oct 07 12:43:15 crc kubenswrapper[4854]: I1007 12:43:15.109922 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqp9v\" (UniqueName: \"kubernetes.io/projected/a356da3f-0fbb-4d65-8b1f-a4fc6a0a881f-kube-api-access-fqp9v\") on node \"crc\" DevicePath \"\"" Oct 07 12:43:15 crc kubenswrapper[4854]: I1007 12:43:15.288257 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7a15303e-66e9-4c76-ba02-902d1949cf49","Type":"ContainerStarted","Data":"daeb0808f9280ca4fadd369f7263738401cbd4f3ae302e4e8cf237c5864f7a73"} Oct 07 12:43:15 crc kubenswrapper[4854]: I1007 12:43:15.292224 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-vz2cc" Oct 07 12:43:15 crc kubenswrapper[4854]: I1007 12:43:15.292236 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-vz2cc" event={"ID":"50a7ebb6-06f1-44cf-806a-f824afac8cf9","Type":"ContainerDied","Data":"26fbbf334768177558e81c7ff0c62f5e1400bebb52948cbbd3223f4166c145d5"} Oct 07 12:43:15 crc kubenswrapper[4854]: I1007 12:43:15.292269 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="26fbbf334768177558e81c7ff0c62f5e1400bebb52948cbbd3223f4166c145d5" Oct 07 12:43:15 crc kubenswrapper[4854]: I1007 12:43:15.301616 4854 generic.go:334] "Generic (PLEG): container finished" podID="b6837285-79fa-44f5-83bd-5d1e5959cb8a" containerID="00e4df36e042f5a5d48c09c816b2ba5f585be063679da01da6210f205b34bcd6" exitCode=0 Oct 07 12:43:15 crc kubenswrapper[4854]: I1007 12:43:15.301699 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b6837285-79fa-44f5-83bd-5d1e5959cb8a","Type":"ContainerDied","Data":"00e4df36e042f5a5d48c09c816b2ba5f585be063679da01da6210f205b34bcd6"} Oct 07 12:43:15 crc kubenswrapper[4854]: I1007 12:43:15.306791 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-k7545" event={"ID":"982baa82-c0c6-4cc7-8c1e-fd351b582446","Type":"ContainerDied","Data":"f06582b41b537b07f189565087c42b289acbf0e8fc341efc389c3120a7eb3513"} Oct 07 12:43:15 crc kubenswrapper[4854]: I1007 12:43:15.306830 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f06582b41b537b07f189565087c42b289acbf0e8fc341efc389c3120a7eb3513" Oct 07 12:43:15 crc kubenswrapper[4854]: I1007 12:43:15.306895 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-k7545" Oct 07 12:43:15 crc kubenswrapper[4854]: I1007 12:43:15.310074 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-8bbnw" Oct 07 12:43:15 crc kubenswrapper[4854]: I1007 12:43:15.310277 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-8bbnw" event={"ID":"a356da3f-0fbb-4d65-8b1f-a4fc6a0a881f","Type":"ContainerDied","Data":"acb47181e208ad8395c1798b2833209a1c4c4e08062161c3ab9bf6c577d55091"} Oct 07 12:43:15 crc kubenswrapper[4854]: I1007 12:43:15.310320 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="acb47181e208ad8395c1798b2833209a1c4c4e08062161c3ab9bf6c577d55091" Oct 07 12:43:15 crc kubenswrapper[4854]: I1007 12:43:15.853004 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 12:43:15 crc kubenswrapper[4854]: I1007 12:43:15.931040 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6837285-79fa-44f5-83bd-5d1e5959cb8a-scripts\") pod \"b6837285-79fa-44f5-83bd-5d1e5959cb8a\" (UID: \"b6837285-79fa-44f5-83bd-5d1e5959cb8a\") " Oct 07 12:43:15 crc kubenswrapper[4854]: I1007 12:43:15.931133 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6837285-79fa-44f5-83bd-5d1e5959cb8a-internal-tls-certs\") pod \"b6837285-79fa-44f5-83bd-5d1e5959cb8a\" (UID: \"b6837285-79fa-44f5-83bd-5d1e5959cb8a\") " Oct 07 12:43:15 crc kubenswrapper[4854]: I1007 12:43:15.931212 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cd5sn\" (UniqueName: \"kubernetes.io/projected/b6837285-79fa-44f5-83bd-5d1e5959cb8a-kube-api-access-cd5sn\") pod \"b6837285-79fa-44f5-83bd-5d1e5959cb8a\" (UID: \"b6837285-79fa-44f5-83bd-5d1e5959cb8a\") " Oct 07 12:43:15 crc kubenswrapper[4854]: I1007 12:43:15.931273 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b6837285-79fa-44f5-83bd-5d1e5959cb8a-httpd-run\") pod \"b6837285-79fa-44f5-83bd-5d1e5959cb8a\" (UID: \"b6837285-79fa-44f5-83bd-5d1e5959cb8a\") " Oct 07 12:43:15 crc kubenswrapper[4854]: I1007 12:43:15.931355 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6837285-79fa-44f5-83bd-5d1e5959cb8a-combined-ca-bundle\") pod \"b6837285-79fa-44f5-83bd-5d1e5959cb8a\" (UID: \"b6837285-79fa-44f5-83bd-5d1e5959cb8a\") " Oct 07 12:43:15 crc kubenswrapper[4854]: I1007 12:43:15.931396 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6837285-79fa-44f5-83bd-5d1e5959cb8a-logs\") pod \"b6837285-79fa-44f5-83bd-5d1e5959cb8a\" (UID: \"b6837285-79fa-44f5-83bd-5d1e5959cb8a\") " Oct 07 12:43:15 crc kubenswrapper[4854]: I1007 12:43:15.931484 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"b6837285-79fa-44f5-83bd-5d1e5959cb8a\" (UID: \"b6837285-79fa-44f5-83bd-5d1e5959cb8a\") " Oct 07 12:43:15 crc kubenswrapper[4854]: I1007 12:43:15.931516 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6837285-79fa-44f5-83bd-5d1e5959cb8a-config-data\") pod \"b6837285-79fa-44f5-83bd-5d1e5959cb8a\" (UID: \"b6837285-79fa-44f5-83bd-5d1e5959cb8a\") " Oct 
07 12:43:15 crc kubenswrapper[4854]: I1007 12:43:15.932644 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6837285-79fa-44f5-83bd-5d1e5959cb8a-logs" (OuterVolumeSpecName: "logs") pod "b6837285-79fa-44f5-83bd-5d1e5959cb8a" (UID: "b6837285-79fa-44f5-83bd-5d1e5959cb8a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:43:15 crc kubenswrapper[4854]: I1007 12:43:15.932661 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6837285-79fa-44f5-83bd-5d1e5959cb8a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "b6837285-79fa-44f5-83bd-5d1e5959cb8a" (UID: "b6837285-79fa-44f5-83bd-5d1e5959cb8a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:43:15 crc kubenswrapper[4854]: I1007 12:43:15.943083 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6837285-79fa-44f5-83bd-5d1e5959cb8a-scripts" (OuterVolumeSpecName: "scripts") pod "b6837285-79fa-44f5-83bd-5d1e5959cb8a" (UID: "b6837285-79fa-44f5-83bd-5d1e5959cb8a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:43:15 crc kubenswrapper[4854]: I1007 12:43:15.945863 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6837285-79fa-44f5-83bd-5d1e5959cb8a-kube-api-access-cd5sn" (OuterVolumeSpecName: "kube-api-access-cd5sn") pod "b6837285-79fa-44f5-83bd-5d1e5959cb8a" (UID: "b6837285-79fa-44f5-83bd-5d1e5959cb8a"). InnerVolumeSpecName "kube-api-access-cd5sn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:43:15 crc kubenswrapper[4854]: I1007 12:43:15.946419 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "b6837285-79fa-44f5-83bd-5d1e5959cb8a" (UID: "b6837285-79fa-44f5-83bd-5d1e5959cb8a"). InnerVolumeSpecName "local-storage02-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 07 12:43:16 crc kubenswrapper[4854]: I1007 12:43:16.044487 4854 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6837285-79fa-44f5-83bd-5d1e5959cb8a-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 12:43:16 crc kubenswrapper[4854]: I1007 12:43:16.044520 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cd5sn\" (UniqueName: \"kubernetes.io/projected/b6837285-79fa-44f5-83bd-5d1e5959cb8a-kube-api-access-cd5sn\") on node \"crc\" DevicePath \"\"" Oct 07 12:43:16 crc kubenswrapper[4854]: I1007 12:43:16.044531 4854 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b6837285-79fa-44f5-83bd-5d1e5959cb8a-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 07 12:43:16 crc kubenswrapper[4854]: I1007 12:43:16.044541 4854 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6837285-79fa-44f5-83bd-5d1e5959cb8a-logs\") on node \"crc\" DevicePath \"\"" Oct 07 12:43:16 crc kubenswrapper[4854]: I1007 12:43:16.044562 4854 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Oct 07 12:43:16 crc kubenswrapper[4854]: I1007 12:43:16.073483 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6837285-79fa-44f5-83bd-5d1e5959cb8a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b6837285-79fa-44f5-83bd-5d1e5959cb8a" (UID: "b6837285-79fa-44f5-83bd-5d1e5959cb8a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:43:16 crc kubenswrapper[4854]: I1007 12:43:16.079257 4854 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Oct 07 12:43:16 crc kubenswrapper[4854]: I1007 12:43:16.091448 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6837285-79fa-44f5-83bd-5d1e5959cb8a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b6837285-79fa-44f5-83bd-5d1e5959cb8a" (UID: "b6837285-79fa-44f5-83bd-5d1e5959cb8a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:43:16 crc kubenswrapper[4854]: I1007 12:43:16.102847 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6837285-79fa-44f5-83bd-5d1e5959cb8a-config-data" (OuterVolumeSpecName: "config-data") pod "b6837285-79fa-44f5-83bd-5d1e5959cb8a" (UID: "b6837285-79fa-44f5-83bd-5d1e5959cb8a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:43:16 crc kubenswrapper[4854]: I1007 12:43:16.146592 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6837285-79fa-44f5-83bd-5d1e5959cb8a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:43:16 crc kubenswrapper[4854]: I1007 12:43:16.146637 4854 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Oct 07 12:43:16 crc kubenswrapper[4854]: I1007 12:43:16.146653 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6837285-79fa-44f5-83bd-5d1e5959cb8a-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 12:43:16 crc kubenswrapper[4854]: I1007 12:43:16.146665 4854 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6837285-79fa-44f5-83bd-5d1e5959cb8a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 12:43:16 crc kubenswrapper[4854]: I1007 12:43:16.319504 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b6837285-79fa-44f5-83bd-5d1e5959cb8a","Type":"ContainerDied","Data":"a662bbf6531bbfad94c1be7360720244bcf72dcf71612b1f6b296d9dd20dd422"} Oct 07 12:43:16 crc kubenswrapper[4854]: I1007 12:43:16.319564 4854 scope.go:117] "RemoveContainer" containerID="00e4df36e042f5a5d48c09c816b2ba5f585be063679da01da6210f205b34bcd6" Oct 07 12:43:16 crc kubenswrapper[4854]: I1007 12:43:16.319521 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 12:43:16 crc kubenswrapper[4854]: I1007 12:43:16.323340 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e309be64-7a5a-4156-89a6-d1201eaaff63","Type":"ContainerStarted","Data":"ca038c665c300165c2b08c130cce95306c24d5eee4ccd22ba09577872f53b9c2"} Oct 07 12:43:16 crc kubenswrapper[4854]: I1007 12:43:16.324338 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 07 12:43:16 crc kubenswrapper[4854]: I1007 12:43:16.326851 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7a15303e-66e9-4c76-ba02-902d1949cf49","Type":"ContainerStarted","Data":"512023c1348affae988ba42ea4c9852e6519486ab0ddc2e0e6e48fce90d79cf9"} Oct 07 12:43:16 crc kubenswrapper[4854]: I1007 12:43:16.341482 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.098120669 podStartE2EDuration="3.341457719s" podCreationTimestamp="2025-10-07 12:43:13 +0000 UTC" firstStartedPulling="2025-10-07 12:43:14.106098546 +0000 UTC m=+1110.093930801" lastFinishedPulling="2025-10-07 12:43:15.349435596 +0000 UTC m=+1111.337267851" observedRunningTime="2025-10-07 12:43:16.340003447 +0000 UTC m=+1112.327835702" watchObservedRunningTime="2025-10-07 12:43:16.341457719 +0000 UTC m=+1112.329289974" Oct 07 12:43:16 crc kubenswrapper[4854]: I1007 12:43:16.362115 4854 scope.go:117] "RemoveContainer" containerID="5a5aba34501996755646bff327da7dc06f8295afd26fd3399b7b27ec146c57c0" Oct 07 12:43:16 crc kubenswrapper[4854]: I1007 12:43:16.366601 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 12:43:16 crc 
kubenswrapper[4854]: I1007 12:43:16.377227 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 12:43:16 crc kubenswrapper[4854]: I1007 12:43:16.383681 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 12:43:16 crc kubenswrapper[4854]: E1007 12:43:16.384087 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6837285-79fa-44f5-83bd-5d1e5959cb8a" containerName="glance-log" Oct 07 12:43:16 crc kubenswrapper[4854]: I1007 12:43:16.384112 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6837285-79fa-44f5-83bd-5d1e5959cb8a" containerName="glance-log" Oct 07 12:43:16 crc kubenswrapper[4854]: E1007 12:43:16.384168 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50a7ebb6-06f1-44cf-806a-f824afac8cf9" containerName="mariadb-database-create" Oct 07 12:43:16 crc kubenswrapper[4854]: I1007 12:43:16.384178 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="50a7ebb6-06f1-44cf-806a-f824afac8cf9" containerName="mariadb-database-create" Oct 07 12:43:16 crc kubenswrapper[4854]: E1007 12:43:16.384185 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="982baa82-c0c6-4cc7-8c1e-fd351b582446" containerName="mariadb-database-create" Oct 07 12:43:16 crc kubenswrapper[4854]: I1007 12:43:16.384190 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="982baa82-c0c6-4cc7-8c1e-fd351b582446" containerName="mariadb-database-create" Oct 07 12:43:16 crc kubenswrapper[4854]: E1007 12:43:16.384201 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6837285-79fa-44f5-83bd-5d1e5959cb8a" containerName="glance-httpd" Oct 07 12:43:16 crc kubenswrapper[4854]: I1007 12:43:16.384208 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6837285-79fa-44f5-83bd-5d1e5959cb8a" containerName="glance-httpd" Oct 07 12:43:16 crc kubenswrapper[4854]: E1007 12:43:16.384229 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a356da3f-0fbb-4d65-8b1f-a4fc6a0a881f" containerName="mariadb-database-create" Oct 07 12:43:16 crc kubenswrapper[4854]: I1007 12:43:16.384236 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="a356da3f-0fbb-4d65-8b1f-a4fc6a0a881f" containerName="mariadb-database-create" Oct 07 12:43:16 crc kubenswrapper[4854]: I1007 12:43:16.384415 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6837285-79fa-44f5-83bd-5d1e5959cb8a" containerName="glance-httpd" Oct 07 12:43:16 crc kubenswrapper[4854]: I1007 12:43:16.384425 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="982baa82-c0c6-4cc7-8c1e-fd351b582446" containerName="mariadb-database-create" Oct 07 12:43:16 crc kubenswrapper[4854]: I1007 12:43:16.384446 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="50a7ebb6-06f1-44cf-806a-f824afac8cf9" containerName="mariadb-database-create" Oct 07 12:43:16 crc kubenswrapper[4854]: I1007 12:43:16.384460 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6837285-79fa-44f5-83bd-5d1e5959cb8a" containerName="glance-log" Oct 07 12:43:16 crc kubenswrapper[4854]: I1007 12:43:16.384469 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="a356da3f-0fbb-4d65-8b1f-a4fc6a0a881f" containerName="mariadb-database-create" Oct 07 12:43:16 crc kubenswrapper[4854]: I1007 12:43:16.385526 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 12:43:16 crc kubenswrapper[4854]: I1007 12:43:16.394164 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 07 12:43:16 crc kubenswrapper[4854]: I1007 12:43:16.394426 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 07 12:43:16 crc kubenswrapper[4854]: I1007 12:43:16.401550 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 12:43:16 crc kubenswrapper[4854]: I1007 12:43:16.454376 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"13453d02-4f55-45a7-98be-1cd41c741a3e\") " pod="openstack/glance-default-internal-api-0" Oct 07 12:43:16 crc kubenswrapper[4854]: I1007 12:43:16.454555 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13453d02-4f55-45a7-98be-1cd41c741a3e-logs\") pod \"glance-default-internal-api-0\" (UID: \"13453d02-4f55-45a7-98be-1cd41c741a3e\") " pod="openstack/glance-default-internal-api-0" Oct 07 12:43:16 crc kubenswrapper[4854]: I1007 12:43:16.454622 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13453d02-4f55-45a7-98be-1cd41c741a3e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"13453d02-4f55-45a7-98be-1cd41c741a3e\") " pod="openstack/glance-default-internal-api-0" Oct 07 12:43:16 crc kubenswrapper[4854]: I1007 12:43:16.454667 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13453d02-4f55-45a7-98be-1cd41c741a3e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"13453d02-4f55-45a7-98be-1cd41c741a3e\") " pod="openstack/glance-default-internal-api-0" Oct 07 12:43:16 crc kubenswrapper[4854]: I1007 12:43:16.454721 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/13453d02-4f55-45a7-98be-1cd41c741a3e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"13453d02-4f55-45a7-98be-1cd41c741a3e\") " pod="openstack/glance-default-internal-api-0" Oct 07 12:43:16 crc kubenswrapper[4854]: I1007 12:43:16.454925 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/13453d02-4f55-45a7-98be-1cd41c741a3e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"13453d02-4f55-45a7-98be-1cd41c741a3e\") " pod="openstack/glance-default-internal-api-0" Oct 07 12:43:16 crc kubenswrapper[4854]: I1007 12:43:16.454983 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sw9xk\" (UniqueName: \"kubernetes.io/projected/13453d02-4f55-45a7-98be-1cd41c741a3e-kube-api-access-sw9xk\") pod \"glance-default-internal-api-0\" (UID: \"13453d02-4f55-45a7-98be-1cd41c741a3e\") " pod="openstack/glance-default-internal-api-0" Oct 07 12:43:16 crc kubenswrapper[4854]: I1007 12:43:16.455115 4854 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13453d02-4f55-45a7-98be-1cd41c741a3e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"13453d02-4f55-45a7-98be-1cd41c741a3e\") " pod="openstack/glance-default-internal-api-0" Oct 07 12:43:16 crc kubenswrapper[4854]: I1007 12:43:16.556659 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/13453d02-4f55-45a7-98be-1cd41c741a3e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"13453d02-4f55-45a7-98be-1cd41c741a3e\") " pod="openstack/glance-default-internal-api-0" Oct 07 12:43:16 crc kubenswrapper[4854]: I1007 12:43:16.556709 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sw9xk\" (UniqueName: \"kubernetes.io/projected/13453d02-4f55-45a7-98be-1cd41c741a3e-kube-api-access-sw9xk\") pod \"glance-default-internal-api-0\" (UID: \"13453d02-4f55-45a7-98be-1cd41c741a3e\") " pod="openstack/glance-default-internal-api-0" Oct 07 12:43:16 crc kubenswrapper[4854]: I1007 12:43:16.556751 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13453d02-4f55-45a7-98be-1cd41c741a3e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"13453d02-4f55-45a7-98be-1cd41c741a3e\") " pod="openstack/glance-default-internal-api-0" Oct 07 12:43:16 crc kubenswrapper[4854]: I1007 12:43:16.556786 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"13453d02-4f55-45a7-98be-1cd41c741a3e\") " pod="openstack/glance-default-internal-api-0" Oct 07 12:43:16 crc kubenswrapper[4854]: I1007 12:43:16.556815 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13453d02-4f55-45a7-98be-1cd41c741a3e-logs\") pod \"glance-default-internal-api-0\" (UID: \"13453d02-4f55-45a7-98be-1cd41c741a3e\") " pod="openstack/glance-default-internal-api-0" Oct 07 12:43:16 crc kubenswrapper[4854]: I1007 12:43:16.556841 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13453d02-4f55-45a7-98be-1cd41c741a3e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"13453d02-4f55-45a7-98be-1cd41c741a3e\") " pod="openstack/glance-default-internal-api-0" Oct 07 12:43:16 crc kubenswrapper[4854]: I1007 12:43:16.556858 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13453d02-4f55-45a7-98be-1cd41c741a3e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"13453d02-4f55-45a7-98be-1cd41c741a3e\") " pod="openstack/glance-default-internal-api-0" Oct 07 12:43:16 crc kubenswrapper[4854]: I1007 12:43:16.556882 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/13453d02-4f55-45a7-98be-1cd41c741a3e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"13453d02-4f55-45a7-98be-1cd41c741a3e\") " pod="openstack/glance-default-internal-api-0" Oct 07 12:43:16 crc kubenswrapper[4854]: I1007 12:43:16.557388 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/13453d02-4f55-45a7-98be-1cd41c741a3e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"13453d02-4f55-45a7-98be-1cd41c741a3e\") " pod="openstack/glance-default-internal-api-0" Oct 07 12:43:16 crc kubenswrapper[4854]: I1007 12:43:16.557801 4854 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"13453d02-4f55-45a7-98be-1cd41c741a3e\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-internal-api-0" Oct 07 12:43:16 crc kubenswrapper[4854]: I1007 12:43:16.558270 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13453d02-4f55-45a7-98be-1cd41c741a3e-logs\") pod \"glance-default-internal-api-0\" (UID: \"13453d02-4f55-45a7-98be-1cd41c741a3e\") " pod="openstack/glance-default-internal-api-0" Oct 07 12:43:16 crc kubenswrapper[4854]: I1007 12:43:16.565721 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13453d02-4f55-45a7-98be-1cd41c741a3e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"13453d02-4f55-45a7-98be-1cd41c741a3e\") " pod="openstack/glance-default-internal-api-0" Oct 07 12:43:16 crc kubenswrapper[4854]: I1007 12:43:16.567066 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13453d02-4f55-45a7-98be-1cd41c741a3e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"13453d02-4f55-45a7-98be-1cd41c741a3e\") " pod="openstack/glance-default-internal-api-0" Oct 07 12:43:16 crc kubenswrapper[4854]: I1007 12:43:16.567642 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/13453d02-4f55-45a7-98be-1cd41c741a3e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"13453d02-4f55-45a7-98be-1cd41c741a3e\") " pod="openstack/glance-default-internal-api-0" Oct 07 12:43:16 crc kubenswrapper[4854]: I1007 12:43:16.568007 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13453d02-4f55-45a7-98be-1cd41c741a3e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"13453d02-4f55-45a7-98be-1cd41c741a3e\") " pod="openstack/glance-default-internal-api-0" Oct 07 12:43:16 crc kubenswrapper[4854]: I1007 12:43:16.577477 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sw9xk\" (UniqueName: \"kubernetes.io/projected/13453d02-4f55-45a7-98be-1cd41c741a3e-kube-api-access-sw9xk\") pod \"glance-default-internal-api-0\" (UID: \"13453d02-4f55-45a7-98be-1cd41c741a3e\") " pod="openstack/glance-default-internal-api-0" Oct 07 12:43:16 crc kubenswrapper[4854]: I1007 12:43:16.594167 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"13453d02-4f55-45a7-98be-1cd41c741a3e\") " pod="openstack/glance-default-internal-api-0" Oct 07 12:43:16 crc kubenswrapper[4854]: I1007 12:43:16.707731 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 12:43:16 crc kubenswrapper[4854]: I1007 12:43:16.714919 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6837285-79fa-44f5-83bd-5d1e5959cb8a" path="/var/lib/kubelet/pods/b6837285-79fa-44f5-83bd-5d1e5959cb8a/volumes" Oct 07 12:43:17 crc kubenswrapper[4854]: I1007 12:43:17.321709 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 12:43:17 crc kubenswrapper[4854]: I1007 12:43:17.349815 4854 generic.go:334] "Generic (PLEG): container finished" podID="2214523f-cc36-4152-b4cd-ad6731ceecfd" containerID="b71e0bf0e727975ce30a7566fee4ff385dc97177def1d35ad77b0e3c1a53219c" exitCode=137 Oct 07 12:43:17 crc kubenswrapper[4854]: I1007 12:43:17.349883 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2214523f-cc36-4152-b4cd-ad6731ceecfd","Type":"ContainerDied","Data":"b71e0bf0e727975ce30a7566fee4ff385dc97177def1d35ad77b0e3c1a53219c"} Oct 07 12:43:17 crc kubenswrapper[4854]: I1007 12:43:17.356042 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7a15303e-66e9-4c76-ba02-902d1949cf49","Type":"ContainerStarted","Data":"4fec556e8b7c7348f34af57027490f7e654cae87c6f4e4c77e42d40f0f3e5c81"} Oct 07 12:43:17 crc kubenswrapper[4854]: I1007 12:43:17.523003 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 07 12:43:17 crc kubenswrapper[4854]: I1007 12:43:17.576132 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2214523f-cc36-4152-b4cd-ad6731ceecfd-combined-ca-bundle\") pod \"2214523f-cc36-4152-b4cd-ad6731ceecfd\" (UID: \"2214523f-cc36-4152-b4cd-ad6731ceecfd\") " Oct 07 12:43:17 crc kubenswrapper[4854]: I1007 12:43:17.576384 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2214523f-cc36-4152-b4cd-ad6731ceecfd-config-data-custom\") pod \"2214523f-cc36-4152-b4cd-ad6731ceecfd\" (UID: \"2214523f-cc36-4152-b4cd-ad6731ceecfd\") " Oct 07 12:43:17 crc kubenswrapper[4854]: I1007 12:43:17.576426 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2214523f-cc36-4152-b4cd-ad6731ceecfd-config-data\") pod \"2214523f-cc36-4152-b4cd-ad6731ceecfd\" (UID: \"2214523f-cc36-4152-b4cd-ad6731ceecfd\") " Oct 07 12:43:17 crc kubenswrapper[4854]: I1007 12:43:17.576447 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2214523f-cc36-4152-b4cd-ad6731ceecfd-scripts\") pod \"2214523f-cc36-4152-b4cd-ad6731ceecfd\" (UID: \"2214523f-cc36-4152-b4cd-ad6731ceecfd\") " Oct 07 12:43:17 crc kubenswrapper[4854]: I1007 12:43:17.576503 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2214523f-cc36-4152-b4cd-ad6731ceecfd-etc-machine-id\") pod \"2214523f-cc36-4152-b4cd-ad6731ceecfd\" (UID: \"2214523f-cc36-4152-b4cd-ad6731ceecfd\") " Oct 07 12:43:17 crc kubenswrapper[4854]: I1007 12:43:17.576546 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dmwg\" (UniqueName: \"kubernetes.io/projected/2214523f-cc36-4152-b4cd-ad6731ceecfd-kube-api-access-5dmwg\") pod 
\"2214523f-cc36-4152-b4cd-ad6731ceecfd\" (UID: \"2214523f-cc36-4152-b4cd-ad6731ceecfd\") " Oct 07 12:43:17 crc kubenswrapper[4854]: I1007 12:43:17.576571 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2214523f-cc36-4152-b4cd-ad6731ceecfd-logs\") pod \"2214523f-cc36-4152-b4cd-ad6731ceecfd\" (UID: \"2214523f-cc36-4152-b4cd-ad6731ceecfd\") " Oct 07 12:43:17 crc kubenswrapper[4854]: I1007 12:43:17.577594 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2214523f-cc36-4152-b4cd-ad6731ceecfd-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "2214523f-cc36-4152-b4cd-ad6731ceecfd" (UID: "2214523f-cc36-4152-b4cd-ad6731ceecfd"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 12:43:17 crc kubenswrapper[4854]: I1007 12:43:17.577710 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2214523f-cc36-4152-b4cd-ad6731ceecfd-logs" (OuterVolumeSpecName: "logs") pod "2214523f-cc36-4152-b4cd-ad6731ceecfd" (UID: "2214523f-cc36-4152-b4cd-ad6731ceecfd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:43:17 crc kubenswrapper[4854]: I1007 12:43:17.585913 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2214523f-cc36-4152-b4cd-ad6731ceecfd-scripts" (OuterVolumeSpecName: "scripts") pod "2214523f-cc36-4152-b4cd-ad6731ceecfd" (UID: "2214523f-cc36-4152-b4cd-ad6731ceecfd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:43:17 crc kubenswrapper[4854]: I1007 12:43:17.585982 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2214523f-cc36-4152-b4cd-ad6731ceecfd-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2214523f-cc36-4152-b4cd-ad6731ceecfd" (UID: "2214523f-cc36-4152-b4cd-ad6731ceecfd"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:43:17 crc kubenswrapper[4854]: I1007 12:43:17.586058 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2214523f-cc36-4152-b4cd-ad6731ceecfd-kube-api-access-5dmwg" (OuterVolumeSpecName: "kube-api-access-5dmwg") pod "2214523f-cc36-4152-b4cd-ad6731ceecfd" (UID: "2214523f-cc36-4152-b4cd-ad6731ceecfd"). InnerVolumeSpecName "kube-api-access-5dmwg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:43:17 crc kubenswrapper[4854]: I1007 12:43:17.620654 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2214523f-cc36-4152-b4cd-ad6731ceecfd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2214523f-cc36-4152-b4cd-ad6731ceecfd" (UID: "2214523f-cc36-4152-b4cd-ad6731ceecfd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:43:17 crc kubenswrapper[4854]: I1007 12:43:17.640450 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2214523f-cc36-4152-b4cd-ad6731ceecfd-config-data" (OuterVolumeSpecName: "config-data") pod "2214523f-cc36-4152-b4cd-ad6731ceecfd" (UID: "2214523f-cc36-4152-b4cd-ad6731ceecfd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:43:17 crc kubenswrapper[4854]: I1007 12:43:17.678353 4854 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2214523f-cc36-4152-b4cd-ad6731ceecfd-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 07 12:43:17 crc kubenswrapper[4854]: I1007 12:43:17.678379 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2214523f-cc36-4152-b4cd-ad6731ceecfd-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 12:43:17 crc kubenswrapper[4854]: I1007 12:43:17.678388 4854 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2214523f-cc36-4152-b4cd-ad6731ceecfd-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 12:43:17 crc kubenswrapper[4854]: I1007 12:43:17.678396 4854 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2214523f-cc36-4152-b4cd-ad6731ceecfd-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 07 12:43:17 crc kubenswrapper[4854]: I1007 12:43:17.678404 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dmwg\" (UniqueName: \"kubernetes.io/projected/2214523f-cc36-4152-b4cd-ad6731ceecfd-kube-api-access-5dmwg\") on node \"crc\" DevicePath \"\"" Oct 07 12:43:17 crc kubenswrapper[4854]: I1007 12:43:17.678413 4854 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2214523f-cc36-4152-b4cd-ad6731ceecfd-logs\") on node \"crc\" DevicePath \"\"" Oct 07 12:43:17 crc kubenswrapper[4854]: I1007 12:43:17.678421 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2214523f-cc36-4152-b4cd-ad6731ceecfd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:43:18 crc kubenswrapper[4854]: I1007 12:43:18.367850 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2214523f-cc36-4152-b4cd-ad6731ceecfd","Type":"ContainerDied","Data":"e0158e5e4326c510908493c3c500819d3f97e7b2c09c2a0f184d87e0e4991116"} Oct 07 12:43:18 crc kubenswrapper[4854]: I1007 12:43:18.368194 4854 scope.go:117] "RemoveContainer" containerID="b71e0bf0e727975ce30a7566fee4ff385dc97177def1d35ad77b0e3c1a53219c" Oct 07 12:43:18 crc kubenswrapper[4854]: I1007 12:43:18.367883 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 07 12:43:18 crc kubenswrapper[4854]: I1007 12:43:18.372755 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"13453d02-4f55-45a7-98be-1cd41c741a3e","Type":"ContainerStarted","Data":"b70abd6d9bcc3a200f109c4995793bf820e1b93bdac5dd9db4c650a6ff934a3d"} Oct 07 12:43:18 crc kubenswrapper[4854]: I1007 12:43:18.372794 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"13453d02-4f55-45a7-98be-1cd41c741a3e","Type":"ContainerStarted","Data":"491449b30345fa51266807b560fb43c69324ed35091a73d2f9c7b2e839da0060"} Oct 07 12:43:18 crc kubenswrapper[4854]: I1007 12:43:18.376000 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7a15303e-66e9-4c76-ba02-902d1949cf49","Type":"ContainerStarted","Data":"a9eb89ef4efa661c4083f60a2506bb42ebbedcf85f2dfd9263e2f0e6599f9588"} Oct 07 12:43:18 crc kubenswrapper[4854]: I1007 12:43:18.408508 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 07 12:43:18 crc kubenswrapper[4854]: I1007 12:43:18.419413 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 07 12:43:18 crc kubenswrapper[4854]: I1007 12:43:18.428934 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 07 12:43:18 crc kubenswrapper[4854]: E1007 12:43:18.429394 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2214523f-cc36-4152-b4cd-ad6731ceecfd" containerName="cinder-api-log" Oct 07 12:43:18 crc kubenswrapper[4854]: I1007 12:43:18.429421 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="2214523f-cc36-4152-b4cd-ad6731ceecfd" containerName="cinder-api-log" Oct 07 12:43:18 crc kubenswrapper[4854]: E1007 12:43:18.429457 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2214523f-cc36-4152-b4cd-ad6731ceecfd" containerName="cinder-api" Oct 07 12:43:18 crc kubenswrapper[4854]: I1007 12:43:18.429464 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="2214523f-cc36-4152-b4cd-ad6731ceecfd" containerName="cinder-api" Oct 07 12:43:18 crc kubenswrapper[4854]: I1007 12:43:18.429658 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="2214523f-cc36-4152-b4cd-ad6731ceecfd" containerName="cinder-api" Oct 07 12:43:18 crc kubenswrapper[4854]: I1007 12:43:18.429680 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="2214523f-cc36-4152-b4cd-ad6731ceecfd" containerName="cinder-api-log" Oct 07 12:43:18 crc kubenswrapper[4854]: I1007 12:43:18.430641 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 07 12:43:18 crc kubenswrapper[4854]: I1007 12:43:18.433519 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Oct 07 12:43:18 crc kubenswrapper[4854]: I1007 12:43:18.439680 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 07 12:43:18 crc kubenswrapper[4854]: I1007 12:43:18.440018 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Oct 07 12:43:18 crc kubenswrapper[4854]: I1007 12:43:18.452763 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 07 12:43:18 crc kubenswrapper[4854]: I1007 12:43:18.466684 4854 scope.go:117] "RemoveContainer" containerID="20c68f9823cef77d64816fddd3e11baddbaf7a317ee56f3ee4aa7ff0e639e1d3" Oct 07 12:43:18 crc kubenswrapper[4854]: I1007 12:43:18.492210 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bcb9c6a-b0b3-438e-9e00-b3706ea71adf-scripts\") pod \"cinder-api-0\" (UID: \"2bcb9c6a-b0b3-438e-9e00-b3706ea71adf\") " pod="openstack/cinder-api-0" Oct 07 12:43:18 crc kubenswrapper[4854]: I1007 12:43:18.492299 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bcb9c6a-b0b3-438e-9e00-b3706ea71adf-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2bcb9c6a-b0b3-438e-9e00-b3706ea71adf\") " pod="openstack/cinder-api-0" Oct 07 12:43:18 crc kubenswrapper[4854]: I1007 12:43:18.492343 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2bcb9c6a-b0b3-438e-9e00-b3706ea71adf-config-data-custom\") pod \"cinder-api-0\" (UID: \"2bcb9c6a-b0b3-438e-9e00-b3706ea71adf\") " pod="openstack/cinder-api-0" Oct 07 12:43:18 crc kubenswrapper[4854]: I1007 12:43:18.492364 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2bcb9c6a-b0b3-438e-9e00-b3706ea71adf-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2bcb9c6a-b0b3-438e-9e00-b3706ea71adf\") " pod="openstack/cinder-api-0" Oct 07 12:43:18 crc kubenswrapper[4854]: I1007 12:43:18.492410 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbfrg\" (UniqueName: \"kubernetes.io/projected/2bcb9c6a-b0b3-438e-9e00-b3706ea71adf-kube-api-access-bbfrg\") pod \"cinder-api-0\" (UID: \"2bcb9c6a-b0b3-438e-9e00-b3706ea71adf\") " pod="openstack/cinder-api-0" Oct 07 12:43:18 crc kubenswrapper[4854]: I1007 12:43:18.492438 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bcb9c6a-b0b3-438e-9e00-b3706ea71adf-public-tls-certs\") pod \"cinder-api-0\" (UID: \"2bcb9c6a-b0b3-438e-9e00-b3706ea71adf\") " pod="openstack/cinder-api-0" Oct 07 12:43:18 crc kubenswrapper[4854]: I1007 12:43:18.492473 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2bcb9c6a-b0b3-438e-9e00-b3706ea71adf-logs\") pod \"cinder-api-0\" (UID: \"2bcb9c6a-b0b3-438e-9e00-b3706ea71adf\") " pod="openstack/cinder-api-0" Oct 07 12:43:18 crc kubenswrapper[4854]: I1007 
12:43:18.492496 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bcb9c6a-b0b3-438e-9e00-b3706ea71adf-config-data\") pod \"cinder-api-0\" (UID: \"2bcb9c6a-b0b3-438e-9e00-b3706ea71adf\") " pod="openstack/cinder-api-0" Oct 07 12:43:18 crc kubenswrapper[4854]: I1007 12:43:18.492588 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bcb9c6a-b0b3-438e-9e00-b3706ea71adf-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"2bcb9c6a-b0b3-438e-9e00-b3706ea71adf\") " pod="openstack/cinder-api-0" Oct 07 12:43:18 crc kubenswrapper[4854]: I1007 12:43:18.594250 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bcb9c6a-b0b3-438e-9e00-b3706ea71adf-scripts\") pod \"cinder-api-0\" (UID: \"2bcb9c6a-b0b3-438e-9e00-b3706ea71adf\") " pod="openstack/cinder-api-0" Oct 07 12:43:18 crc kubenswrapper[4854]: I1007 12:43:18.594310 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bcb9c6a-b0b3-438e-9e00-b3706ea71adf-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2bcb9c6a-b0b3-438e-9e00-b3706ea71adf\") " pod="openstack/cinder-api-0" Oct 07 12:43:18 crc kubenswrapper[4854]: I1007 12:43:18.594333 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2bcb9c6a-b0b3-438e-9e00-b3706ea71adf-config-data-custom\") pod \"cinder-api-0\" (UID: \"2bcb9c6a-b0b3-438e-9e00-b3706ea71adf\") " pod="openstack/cinder-api-0" Oct 07 12:43:18 crc kubenswrapper[4854]: I1007 12:43:18.594355 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2bcb9c6a-b0b3-438e-9e00-b3706ea71adf-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2bcb9c6a-b0b3-438e-9e00-b3706ea71adf\") " pod="openstack/cinder-api-0" Oct 07 12:43:18 crc kubenswrapper[4854]: I1007 12:43:18.594390 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbfrg\" (UniqueName: \"kubernetes.io/projected/2bcb9c6a-b0b3-438e-9e00-b3706ea71adf-kube-api-access-bbfrg\") pod \"cinder-api-0\" (UID: \"2bcb9c6a-b0b3-438e-9e00-b3706ea71adf\") " pod="openstack/cinder-api-0" Oct 07 12:43:18 crc kubenswrapper[4854]: I1007 12:43:18.594418 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bcb9c6a-b0b3-438e-9e00-b3706ea71adf-public-tls-certs\") pod \"cinder-api-0\" (UID: \"2bcb9c6a-b0b3-438e-9e00-b3706ea71adf\") " pod="openstack/cinder-api-0" Oct 07 12:43:18 crc kubenswrapper[4854]: I1007 12:43:18.594445 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2bcb9c6a-b0b3-438e-9e00-b3706ea71adf-logs\") pod \"cinder-api-0\" (UID: \"2bcb9c6a-b0b3-438e-9e00-b3706ea71adf\") " pod="openstack/cinder-api-0" Oct 07 12:43:18 crc kubenswrapper[4854]: I1007 12:43:18.594460 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bcb9c6a-b0b3-438e-9e00-b3706ea71adf-config-data\") pod \"cinder-api-0\" (UID: \"2bcb9c6a-b0b3-438e-9e00-b3706ea71adf\") " pod="openstack/cinder-api-0" Oct 07 12:43:18 
crc kubenswrapper[4854]: I1007 12:43:18.594511 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bcb9c6a-b0b3-438e-9e00-b3706ea71adf-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"2bcb9c6a-b0b3-438e-9e00-b3706ea71adf\") " pod="openstack/cinder-api-0" Oct 07 12:43:18 crc kubenswrapper[4854]: I1007 12:43:18.595402 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2bcb9c6a-b0b3-438e-9e00-b3706ea71adf-logs\") pod \"cinder-api-0\" (UID: \"2bcb9c6a-b0b3-438e-9e00-b3706ea71adf\") " pod="openstack/cinder-api-0" Oct 07 12:43:18 crc kubenswrapper[4854]: I1007 12:43:18.595557 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2bcb9c6a-b0b3-438e-9e00-b3706ea71adf-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2bcb9c6a-b0b3-438e-9e00-b3706ea71adf\") " pod="openstack/cinder-api-0" Oct 07 12:43:18 crc kubenswrapper[4854]: I1007 12:43:18.600762 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bcb9c6a-b0b3-438e-9e00-b3706ea71adf-public-tls-certs\") pod \"cinder-api-0\" (UID: \"2bcb9c6a-b0b3-438e-9e00-b3706ea71adf\") " pod="openstack/cinder-api-0" Oct 07 12:43:18 crc kubenswrapper[4854]: I1007 12:43:18.605494 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bcb9c6a-b0b3-438e-9e00-b3706ea71adf-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"2bcb9c6a-b0b3-438e-9e00-b3706ea71adf\") " pod="openstack/cinder-api-0" Oct 07 12:43:18 crc kubenswrapper[4854]: I1007 12:43:18.605629 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bcb9c6a-b0b3-438e-9e00-b3706ea71adf-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2bcb9c6a-b0b3-438e-9e00-b3706ea71adf\") " pod="openstack/cinder-api-0" Oct 07 12:43:18 crc kubenswrapper[4854]: I1007 12:43:18.606232 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bcb9c6a-b0b3-438e-9e00-b3706ea71adf-config-data\") pod \"cinder-api-0\" (UID: \"2bcb9c6a-b0b3-438e-9e00-b3706ea71adf\") " pod="openstack/cinder-api-0" Oct 07 12:43:18 crc kubenswrapper[4854]: I1007 12:43:18.606678 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2bcb9c6a-b0b3-438e-9e00-b3706ea71adf-config-data-custom\") pod \"cinder-api-0\" (UID: \"2bcb9c6a-b0b3-438e-9e00-b3706ea71adf\") " pod="openstack/cinder-api-0" Oct 07 12:43:18 crc kubenswrapper[4854]: I1007 12:43:18.618792 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bcb9c6a-b0b3-438e-9e00-b3706ea71adf-scripts\") pod \"cinder-api-0\" (UID: \"2bcb9c6a-b0b3-438e-9e00-b3706ea71adf\") " pod="openstack/cinder-api-0" Oct 07 12:43:18 crc kubenswrapper[4854]: I1007 12:43:18.626948 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbfrg\" (UniqueName: \"kubernetes.io/projected/2bcb9c6a-b0b3-438e-9e00-b3706ea71adf-kube-api-access-bbfrg\") pod \"cinder-api-0\" (UID: \"2bcb9c6a-b0b3-438e-9e00-b3706ea71adf\") " pod="openstack/cinder-api-0" Oct 07 12:43:18 crc kubenswrapper[4854]: I1007 12:43:18.717089 4854 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2214523f-cc36-4152-b4cd-ad6731ceecfd" path="/var/lib/kubelet/pods/2214523f-cc36-4152-b4cd-ad6731ceecfd/volumes" Oct 07 12:43:18 crc kubenswrapper[4854]: I1007 12:43:18.717704 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7445f79585-rckdn" Oct 07 12:43:18 crc kubenswrapper[4854]: I1007 12:43:18.755956 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 07 12:43:18 crc kubenswrapper[4854]: I1007 12:43:18.780943 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-b8c6459d6-knb66"] Oct 07 12:43:18 crc kubenswrapper[4854]: I1007 12:43:18.781389 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-b8c6459d6-knb66" podUID="1279e99a-d16c-441f-9aa8-33efa1cd351f" containerName="neutron-api" containerID="cri-o://6c0a474801b8f493e4770e6082eaf3f4b418df172e291fd80d1aee9313faf60c" gracePeriod=30 Oct 07 12:43:18 crc kubenswrapper[4854]: I1007 12:43:18.781890 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-b8c6459d6-knb66" podUID="1279e99a-d16c-441f-9aa8-33efa1cd351f" containerName="neutron-httpd" containerID="cri-o://b6914afc6e91abbd192b372434d91fd604cfb5c5e273f64244c294561b68e139" gracePeriod=30 Oct 07 12:43:18 crc kubenswrapper[4854]: I1007 12:43:18.910468 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 12:43:18 crc kubenswrapper[4854]: I1007 12:43:18.910871 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="9dc203ca-4a7b-4508-87d6-4922a76a1fb7" containerName="glance-log" containerID="cri-o://374a7faa09b1f2933202f50219d10eed6a8601aaa44361171ee0ef98440e6c0c" gracePeriod=30 Oct 07 12:43:18 crc kubenswrapper[4854]: I1007 12:43:18.911241 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="9dc203ca-4a7b-4508-87d6-4922a76a1fb7" containerName="glance-httpd" containerID="cri-o://a95bd116db9ea390605319534fc00572f56d0206068c040f76d694264158b473" gracePeriod=30 Oct 07 12:43:19 crc kubenswrapper[4854]: I1007 12:43:19.347586 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 07 12:43:19 crc kubenswrapper[4854]: I1007 12:43:19.405084 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"13453d02-4f55-45a7-98be-1cd41c741a3e","Type":"ContainerStarted","Data":"6ab3625afd9d5554eb97be59fc81dda17398623a3a3702f50c0d684653f13606"} Oct 07 12:43:19 crc kubenswrapper[4854]: I1007 12:43:19.410671 4854 generic.go:334] "Generic (PLEG): container finished" podID="1279e99a-d16c-441f-9aa8-33efa1cd351f" containerID="b6914afc6e91abbd192b372434d91fd604cfb5c5e273f64244c294561b68e139" exitCode=0 Oct 07 12:43:19 crc kubenswrapper[4854]: I1007 12:43:19.410758 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b8c6459d6-knb66" event={"ID":"1279e99a-d16c-441f-9aa8-33efa1cd351f","Type":"ContainerDied","Data":"b6914afc6e91abbd192b372434d91fd604cfb5c5e273f64244c294561b68e139"} Oct 07 12:43:19 crc kubenswrapper[4854]: I1007 12:43:19.413128 4854 generic.go:334] "Generic (PLEG): container finished" podID="9dc203ca-4a7b-4508-87d6-4922a76a1fb7" containerID="374a7faa09b1f2933202f50219d10eed6a8601aaa44361171ee0ef98440e6c0c" 
exitCode=143 Oct 07 12:43:19 crc kubenswrapper[4854]: I1007 12:43:19.413198 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9dc203ca-4a7b-4508-87d6-4922a76a1fb7","Type":"ContainerDied","Data":"374a7faa09b1f2933202f50219d10eed6a8601aaa44361171ee0ef98440e6c0c"} Oct 07 12:43:19 crc kubenswrapper[4854]: I1007 12:43:19.417388 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2bcb9c6a-b0b3-438e-9e00-b3706ea71adf","Type":"ContainerStarted","Data":"27072ab6ef04a6c47adee37687942b234b2555f7c9dbff7def6b6c3eebce4240"} Oct 07 12:43:19 crc kubenswrapper[4854]: I1007 12:43:19.438292 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.438268373 podStartE2EDuration="3.438268373s" podCreationTimestamp="2025-10-07 12:43:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:43:19.430714031 +0000 UTC m=+1115.418546296" watchObservedRunningTime="2025-10-07 12:43:19.438268373 +0000 UTC m=+1115.426100628" Oct 07 12:43:20 crc kubenswrapper[4854]: I1007 12:43:20.449355 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7a15303e-66e9-4c76-ba02-902d1949cf49","Type":"ContainerStarted","Data":"aed246739a200dc3292b614b7f083808741f3422b08b991f5eeb78b24a7512a7"} Oct 07 12:43:20 crc kubenswrapper[4854]: I1007 12:43:20.449774 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7a15303e-66e9-4c76-ba02-902d1949cf49" containerName="ceilometer-central-agent" containerID="cri-o://512023c1348affae988ba42ea4c9852e6519486ab0ddc2e0e6e48fce90d79cf9" gracePeriod=30 Oct 07 12:43:20 crc kubenswrapper[4854]: I1007 12:43:20.450097 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 07 12:43:20 crc kubenswrapper[4854]: I1007 12:43:20.450141 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7a15303e-66e9-4c76-ba02-902d1949cf49" containerName="proxy-httpd" containerID="cri-o://aed246739a200dc3292b614b7f083808741f3422b08b991f5eeb78b24a7512a7" gracePeriod=30 Oct 07 12:43:20 crc kubenswrapper[4854]: I1007 12:43:20.450228 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7a15303e-66e9-4c76-ba02-902d1949cf49" containerName="sg-core" containerID="cri-o://a9eb89ef4efa661c4083f60a2506bb42ebbedcf85f2dfd9263e2f0e6599f9588" gracePeriod=30 Oct 07 12:43:20 crc kubenswrapper[4854]: I1007 12:43:20.450268 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7a15303e-66e9-4c76-ba02-902d1949cf49" containerName="ceilometer-notification-agent" containerID="cri-o://4fec556e8b7c7348f34af57027490f7e654cae87c6f4e4c77e42d40f0f3e5c81" gracePeriod=30 Oct 07 12:43:20 crc kubenswrapper[4854]: I1007 12:43:20.459943 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2bcb9c6a-b0b3-438e-9e00-b3706ea71adf","Type":"ContainerStarted","Data":"0775213d263bbb54ab38859826cdc168c18bef5c43a091d586b83ae90efc35b8"} Oct 07 12:43:20 crc kubenswrapper[4854]: I1007 12:43:20.483081 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" 
podStartSLOduration=2.5397335070000002 podStartE2EDuration="7.48305963s" podCreationTimestamp="2025-10-07 12:43:13 +0000 UTC" firstStartedPulling="2025-10-07 12:43:14.374270457 +0000 UTC m=+1110.362102712" lastFinishedPulling="2025-10-07 12:43:19.31759658 +0000 UTC m=+1115.305428835" observedRunningTime="2025-10-07 12:43:20.482651548 +0000 UTC m=+1116.470483803" watchObservedRunningTime="2025-10-07 12:43:20.48305963 +0000 UTC m=+1116.470891885" Oct 07 12:43:20 crc kubenswrapper[4854]: I1007 12:43:20.565373 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-643b-account-create-q5znc"] Oct 07 12:43:20 crc kubenswrapper[4854]: I1007 12:43:20.566483 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-643b-account-create-q5znc" Oct 07 12:43:20 crc kubenswrapper[4854]: I1007 12:43:20.569645 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Oct 07 12:43:20 crc kubenswrapper[4854]: I1007 12:43:20.585135 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-643b-account-create-q5znc"] Oct 07 12:43:20 crc kubenswrapper[4854]: I1007 12:43:20.655805 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ds64s\" (UniqueName: \"kubernetes.io/projected/757432e0-dbdf-40a7-b6e4-2534d6600bea-kube-api-access-ds64s\") pod \"nova-api-643b-account-create-q5znc\" (UID: \"757432e0-dbdf-40a7-b6e4-2534d6600bea\") " pod="openstack/nova-api-643b-account-create-q5znc" Oct 07 12:43:20 crc kubenswrapper[4854]: I1007 12:43:20.758002 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ds64s\" (UniqueName: \"kubernetes.io/projected/757432e0-dbdf-40a7-b6e4-2534d6600bea-kube-api-access-ds64s\") pod \"nova-api-643b-account-create-q5znc\" (UID: \"757432e0-dbdf-40a7-b6e4-2534d6600bea\") " pod="openstack/nova-api-643b-account-create-q5znc" Oct 07 12:43:20 crc kubenswrapper[4854]: I1007 12:43:20.793286 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ds64s\" (UniqueName: \"kubernetes.io/projected/757432e0-dbdf-40a7-b6e4-2534d6600bea-kube-api-access-ds64s\") pod \"nova-api-643b-account-create-q5znc\" (UID: \"757432e0-dbdf-40a7-b6e4-2534d6600bea\") " pod="openstack/nova-api-643b-account-create-q5znc" Oct 07 12:43:20 crc kubenswrapper[4854]: I1007 12:43:20.958197 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-643b-account-create-q5znc" Oct 07 12:43:21 crc kubenswrapper[4854]: I1007 12:43:21.474780 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2bcb9c6a-b0b3-438e-9e00-b3706ea71adf","Type":"ContainerStarted","Data":"66d61a36b5d71d29e4fc833793e7b8e2d512bb92c6cdc2ceee1421b70515026b"} Oct 07 12:43:21 crc kubenswrapper[4854]: I1007 12:43:21.475034 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 07 12:43:21 crc kubenswrapper[4854]: I1007 12:43:21.478719 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-643b-account-create-q5znc"] Oct 07 12:43:21 crc kubenswrapper[4854]: I1007 12:43:21.482793 4854 generic.go:334] "Generic (PLEG): container finished" podID="7a15303e-66e9-4c76-ba02-902d1949cf49" containerID="aed246739a200dc3292b614b7f083808741f3422b08b991f5eeb78b24a7512a7" exitCode=0 Oct 07 12:43:21 crc kubenswrapper[4854]: I1007 12:43:21.482829 4854 generic.go:334] "Generic (PLEG): container finished" podID="7a15303e-66e9-4c76-ba02-902d1949cf49" containerID="a9eb89ef4efa661c4083f60a2506bb42ebbedcf85f2dfd9263e2f0e6599f9588" exitCode=2 Oct 07 12:43:21 crc kubenswrapper[4854]: I1007 12:43:21.482839 4854 generic.go:334] "Generic (PLEG): container finished" podID="7a15303e-66e9-4c76-ba02-902d1949cf49" containerID="4fec556e8b7c7348f34af57027490f7e654cae87c6f4e4c77e42d40f0f3e5c81" exitCode=0 Oct 07 12:43:21 crc kubenswrapper[4854]: I1007 12:43:21.482865 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7a15303e-66e9-4c76-ba02-902d1949cf49","Type":"ContainerDied","Data":"aed246739a200dc3292b614b7f083808741f3422b08b991f5eeb78b24a7512a7"} Oct 07 12:43:21 crc kubenswrapper[4854]: I1007 12:43:21.482896 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7a15303e-66e9-4c76-ba02-902d1949cf49","Type":"ContainerDied","Data":"a9eb89ef4efa661c4083f60a2506bb42ebbedcf85f2dfd9263e2f0e6599f9588"} Oct 07 12:43:21 crc kubenswrapper[4854]: I1007 12:43:21.482909 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7a15303e-66e9-4c76-ba02-902d1949cf49","Type":"ContainerDied","Data":"4fec556e8b7c7348f34af57027490f7e654cae87c6f4e4c77e42d40f0f3e5c81"} Oct 07 12:43:21 crc kubenswrapper[4854]: I1007 12:43:21.498739 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.498720004 podStartE2EDuration="3.498720004s" podCreationTimestamp="2025-10-07 12:43:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:43:21.497396326 +0000 UTC m=+1117.485228591" watchObservedRunningTime="2025-10-07 12:43:21.498720004 +0000 UTC m=+1117.486552259" Oct 07 12:43:22 crc kubenswrapper[4854]: I1007 12:43:22.000946 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-b8c6459d6-knb66" Oct 07 12:43:22 crc kubenswrapper[4854]: I1007 12:43:22.082645 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1279e99a-d16c-441f-9aa8-33efa1cd351f-combined-ca-bundle\") pod \"1279e99a-d16c-441f-9aa8-33efa1cd351f\" (UID: \"1279e99a-d16c-441f-9aa8-33efa1cd351f\") " Oct 07 12:43:22 crc kubenswrapper[4854]: I1007 12:43:22.082717 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-465dq\" (UniqueName: \"kubernetes.io/projected/1279e99a-d16c-441f-9aa8-33efa1cd351f-kube-api-access-465dq\") pod \"1279e99a-d16c-441f-9aa8-33efa1cd351f\" (UID: \"1279e99a-d16c-441f-9aa8-33efa1cd351f\") " Oct 07 12:43:22 crc kubenswrapper[4854]: I1007 12:43:22.082796 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1279e99a-d16c-441f-9aa8-33efa1cd351f-ovndb-tls-certs\") pod \"1279e99a-d16c-441f-9aa8-33efa1cd351f\" (UID: \"1279e99a-d16c-441f-9aa8-33efa1cd351f\") " Oct 07 12:43:22 crc kubenswrapper[4854]: I1007 12:43:22.082826 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1279e99a-d16c-441f-9aa8-33efa1cd351f-config\") pod \"1279e99a-d16c-441f-9aa8-33efa1cd351f\" (UID: \"1279e99a-d16c-441f-9aa8-33efa1cd351f\") " Oct 07 12:43:22 crc kubenswrapper[4854]: I1007 12:43:22.082857 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1279e99a-d16c-441f-9aa8-33efa1cd351f-httpd-config\") pod \"1279e99a-d16c-441f-9aa8-33efa1cd351f\" (UID: \"1279e99a-d16c-441f-9aa8-33efa1cd351f\") " Oct 07 12:43:22 crc kubenswrapper[4854]: I1007 12:43:22.082906 4854 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="9dc203ca-4a7b-4508-87d6-4922a76a1fb7" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.148:9292/healthcheck\": read tcp 10.217.0.2:36736->10.217.0.148:9292: read: connection reset by peer" Oct 07 12:43:22 crc kubenswrapper[4854]: I1007 12:43:22.082951 4854 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="9dc203ca-4a7b-4508-87d6-4922a76a1fb7" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.148:9292/healthcheck\": read tcp 10.217.0.2:36734->10.217.0.148:9292: read: connection reset by peer" Oct 07 12:43:22 crc kubenswrapper[4854]: I1007 12:43:22.088208 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1279e99a-d16c-441f-9aa8-33efa1cd351f-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "1279e99a-d16c-441f-9aa8-33efa1cd351f" (UID: "1279e99a-d16c-441f-9aa8-33efa1cd351f"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:43:22 crc kubenswrapper[4854]: I1007 12:43:22.090709 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1279e99a-d16c-441f-9aa8-33efa1cd351f-kube-api-access-465dq" (OuterVolumeSpecName: "kube-api-access-465dq") pod "1279e99a-d16c-441f-9aa8-33efa1cd351f" (UID: "1279e99a-d16c-441f-9aa8-33efa1cd351f"). InnerVolumeSpecName "kube-api-access-465dq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:43:22 crc kubenswrapper[4854]: I1007 12:43:22.153716 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1279e99a-d16c-441f-9aa8-33efa1cd351f-config" (OuterVolumeSpecName: "config") pod "1279e99a-d16c-441f-9aa8-33efa1cd351f" (UID: "1279e99a-d16c-441f-9aa8-33efa1cd351f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:43:22 crc kubenswrapper[4854]: I1007 12:43:22.159328 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1279e99a-d16c-441f-9aa8-33efa1cd351f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1279e99a-d16c-441f-9aa8-33efa1cd351f" (UID: "1279e99a-d16c-441f-9aa8-33efa1cd351f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:43:22 crc kubenswrapper[4854]: I1007 12:43:22.172338 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1279e99a-d16c-441f-9aa8-33efa1cd351f-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "1279e99a-d16c-441f-9aa8-33efa1cd351f" (UID: "1279e99a-d16c-441f-9aa8-33efa1cd351f"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:43:22 crc kubenswrapper[4854]: I1007 12:43:22.184675 4854 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1279e99a-d16c-441f-9aa8-33efa1cd351f-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 12:43:22 crc kubenswrapper[4854]: I1007 12:43:22.184708 4854 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/1279e99a-d16c-441f-9aa8-33efa1cd351f-config\") on node \"crc\" DevicePath \"\"" Oct 07 12:43:22 crc kubenswrapper[4854]: I1007 12:43:22.184717 4854 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1279e99a-d16c-441f-9aa8-33efa1cd351f-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 07 12:43:22 crc kubenswrapper[4854]: I1007 12:43:22.184726 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1279e99a-d16c-441f-9aa8-33efa1cd351f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:43:22 crc kubenswrapper[4854]: I1007 12:43:22.184735 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-465dq\" (UniqueName: \"kubernetes.io/projected/1279e99a-d16c-441f-9aa8-33efa1cd351f-kube-api-access-465dq\") on node \"crc\" DevicePath \"\"" Oct 07 12:43:22 crc kubenswrapper[4854]: I1007 12:43:22.422843 4854 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="2214523f-cc36-4152-b4cd-ad6731ceecfd" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.0.159:8776/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 07 12:43:22 crc kubenswrapper[4854]: I1007 12:43:22.496010 4854 generic.go:334] "Generic (PLEG): container finished" podID="1279e99a-d16c-441f-9aa8-33efa1cd351f" containerID="6c0a474801b8f493e4770e6082eaf3f4b418df172e291fd80d1aee9313faf60c" exitCode=0 Oct 07 12:43:22 crc kubenswrapper[4854]: I1007 12:43:22.496088 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b8c6459d6-knb66" 
event={"ID":"1279e99a-d16c-441f-9aa8-33efa1cd351f","Type":"ContainerDied","Data":"6c0a474801b8f493e4770e6082eaf3f4b418df172e291fd80d1aee9313faf60c"} Oct 07 12:43:22 crc kubenswrapper[4854]: I1007 12:43:22.496122 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b8c6459d6-knb66" event={"ID":"1279e99a-d16c-441f-9aa8-33efa1cd351f","Type":"ContainerDied","Data":"8b9cf9f1e5f913a2b0d2a38d6340ce6fb1474fa2f985aa231489e03e27ec7e48"} Oct 07 12:43:22 crc kubenswrapper[4854]: I1007 12:43:22.496161 4854 scope.go:117] "RemoveContainer" containerID="b6914afc6e91abbd192b372434d91fd604cfb5c5e273f64244c294561b68e139" Oct 07 12:43:22 crc kubenswrapper[4854]: I1007 12:43:22.496309 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b8c6459d6-knb66" Oct 07 12:43:22 crc kubenswrapper[4854]: I1007 12:43:22.502887 4854 generic.go:334] "Generic (PLEG): container finished" podID="9dc203ca-4a7b-4508-87d6-4922a76a1fb7" containerID="a95bd116db9ea390605319534fc00572f56d0206068c040f76d694264158b473" exitCode=0 Oct 07 12:43:22 crc kubenswrapper[4854]: I1007 12:43:22.503010 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9dc203ca-4a7b-4508-87d6-4922a76a1fb7","Type":"ContainerDied","Data":"a95bd116db9ea390605319534fc00572f56d0206068c040f76d694264158b473"} Oct 07 12:43:22 crc kubenswrapper[4854]: I1007 12:43:22.503045 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9dc203ca-4a7b-4508-87d6-4922a76a1fb7","Type":"ContainerDied","Data":"952f6d163b565dd3772a045351572ebe9fb6a9e6aae6019001c353abc93b1a75"} Oct 07 12:43:22 crc kubenswrapper[4854]: I1007 12:43:22.503060 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="952f6d163b565dd3772a045351572ebe9fb6a9e6aae6019001c353abc93b1a75" Oct 07 12:43:22 crc kubenswrapper[4854]: I1007 12:43:22.505467 4854 generic.go:334] "Generic (PLEG): container finished" podID="757432e0-dbdf-40a7-b6e4-2534d6600bea" containerID="583495e30479442aa4785338cb425c3825f402c3aefd36d65751b6b6adc984b2" exitCode=0 Oct 07 12:43:22 crc kubenswrapper[4854]: I1007 12:43:22.506867 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-643b-account-create-q5znc" event={"ID":"757432e0-dbdf-40a7-b6e4-2534d6600bea","Type":"ContainerDied","Data":"583495e30479442aa4785338cb425c3825f402c3aefd36d65751b6b6adc984b2"} Oct 07 12:43:22 crc kubenswrapper[4854]: I1007 12:43:22.506901 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-643b-account-create-q5znc" event={"ID":"757432e0-dbdf-40a7-b6e4-2534d6600bea","Type":"ContainerStarted","Data":"3e2d1a354d5d2fc9d51216328eb9a8450dff49dd4a0ba5659d5837352cbd3ddf"} Oct 07 12:43:22 crc kubenswrapper[4854]: I1007 12:43:22.574867 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 12:43:22 crc kubenswrapper[4854]: I1007 12:43:22.596466 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-b8c6459d6-knb66"] Oct 07 12:43:22 crc kubenswrapper[4854]: I1007 12:43:22.599034 4854 scope.go:117] "RemoveContainer" containerID="6c0a474801b8f493e4770e6082eaf3f4b418df172e291fd80d1aee9313faf60c" Oct 07 12:43:22 crc kubenswrapper[4854]: I1007 12:43:22.603356 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-b8c6459d6-knb66"] Oct 07 12:43:22 crc kubenswrapper[4854]: I1007 12:43:22.635946 4854 scope.go:117] "RemoveContainer" containerID="b6914afc6e91abbd192b372434d91fd604cfb5c5e273f64244c294561b68e139" Oct 07 12:43:22 crc kubenswrapper[4854]: E1007 12:43:22.636392 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6914afc6e91abbd192b372434d91fd604cfb5c5e273f64244c294561b68e139\": container with ID starting with b6914afc6e91abbd192b372434d91fd604cfb5c5e273f64244c294561b68e139 not found: ID does not exist" containerID="b6914afc6e91abbd192b372434d91fd604cfb5c5e273f64244c294561b68e139" Oct 07 12:43:22 crc kubenswrapper[4854]: I1007 12:43:22.636417 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6914afc6e91abbd192b372434d91fd604cfb5c5e273f64244c294561b68e139"} err="failed to get container status \"b6914afc6e91abbd192b372434d91fd604cfb5c5e273f64244c294561b68e139\": rpc error: code = NotFound desc = could not find container \"b6914afc6e91abbd192b372434d91fd604cfb5c5e273f64244c294561b68e139\": container with ID starting with b6914afc6e91abbd192b372434d91fd604cfb5c5e273f64244c294561b68e139 not found: ID does not exist" Oct 07 12:43:22 crc kubenswrapper[4854]: I1007 12:43:22.636438 4854 scope.go:117] "RemoveContainer" containerID="6c0a474801b8f493e4770e6082eaf3f4b418df172e291fd80d1aee9313faf60c" Oct 07 12:43:22 crc kubenswrapper[4854]: E1007 12:43:22.636633 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c0a474801b8f493e4770e6082eaf3f4b418df172e291fd80d1aee9313faf60c\": container with ID starting with 6c0a474801b8f493e4770e6082eaf3f4b418df172e291fd80d1aee9313faf60c not found: ID does not exist" containerID="6c0a474801b8f493e4770e6082eaf3f4b418df172e291fd80d1aee9313faf60c" Oct 07 12:43:22 crc kubenswrapper[4854]: I1007 12:43:22.636650 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c0a474801b8f493e4770e6082eaf3f4b418df172e291fd80d1aee9313faf60c"} err="failed to get container status \"6c0a474801b8f493e4770e6082eaf3f4b418df172e291fd80d1aee9313faf60c\": rpc error: code = NotFound desc = could not find container \"6c0a474801b8f493e4770e6082eaf3f4b418df172e291fd80d1aee9313faf60c\": container with ID starting with 6c0a474801b8f493e4770e6082eaf3f4b418df172e291fd80d1aee9313faf60c not found: ID does not exist" Oct 07 12:43:22 crc kubenswrapper[4854]: I1007 12:43:22.698906 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dc203ca-4a7b-4508-87d6-4922a76a1fb7-combined-ca-bundle\") pod \"9dc203ca-4a7b-4508-87d6-4922a76a1fb7\" (UID: \"9dc203ca-4a7b-4508-87d6-4922a76a1fb7\") " Oct 07 12:43:22 crc kubenswrapper[4854]: I1007 12:43:22.698957 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-clwd6\" (UniqueName: \"kubernetes.io/projected/9dc203ca-4a7b-4508-87d6-4922a76a1fb7-kube-api-access-clwd6\") pod \"9dc203ca-4a7b-4508-87d6-4922a76a1fb7\" (UID: \"9dc203ca-4a7b-4508-87d6-4922a76a1fb7\") " Oct 07 12:43:22 crc kubenswrapper[4854]: I1007 12:43:22.698991 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9dc203ca-4a7b-4508-87d6-4922a76a1fb7-public-tls-certs\") pod \"9dc203ca-4a7b-4508-87d6-4922a76a1fb7\" (UID: \"9dc203ca-4a7b-4508-87d6-4922a76a1fb7\") " Oct 07 12:43:22 crc kubenswrapper[4854]: I1007 12:43:22.699080 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9dc203ca-4a7b-4508-87d6-4922a76a1fb7-scripts\") pod \"9dc203ca-4a7b-4508-87d6-4922a76a1fb7\" (UID: \"9dc203ca-4a7b-4508-87d6-4922a76a1fb7\") " Oct 07 12:43:22 crc kubenswrapper[4854]: I1007 12:43:22.699108 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"9dc203ca-4a7b-4508-87d6-4922a76a1fb7\" (UID: \"9dc203ca-4a7b-4508-87d6-4922a76a1fb7\") " Oct 07 12:43:22 crc kubenswrapper[4854]: I1007 12:43:22.699137 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dc203ca-4a7b-4508-87d6-4922a76a1fb7-config-data\") pod \"9dc203ca-4a7b-4508-87d6-4922a76a1fb7\" (UID: \"9dc203ca-4a7b-4508-87d6-4922a76a1fb7\") " Oct 07 12:43:22 crc kubenswrapper[4854]: I1007 12:43:22.699721 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9dc203ca-4a7b-4508-87d6-4922a76a1fb7-httpd-run\") pod \"9dc203ca-4a7b-4508-87d6-4922a76a1fb7\" (UID: \"9dc203ca-4a7b-4508-87d6-4922a76a1fb7\") " Oct 07 12:43:22 crc kubenswrapper[4854]: I1007 12:43:22.699826 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9dc203ca-4a7b-4508-87d6-4922a76a1fb7-logs\") pod \"9dc203ca-4a7b-4508-87d6-4922a76a1fb7\" (UID: \"9dc203ca-4a7b-4508-87d6-4922a76a1fb7\") " Oct 07 12:43:22 crc kubenswrapper[4854]: I1007 12:43:22.700000 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9dc203ca-4a7b-4508-87d6-4922a76a1fb7-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "9dc203ca-4a7b-4508-87d6-4922a76a1fb7" (UID: "9dc203ca-4a7b-4508-87d6-4922a76a1fb7"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:43:22 crc kubenswrapper[4854]: I1007 12:43:22.700251 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9dc203ca-4a7b-4508-87d6-4922a76a1fb7-logs" (OuterVolumeSpecName: "logs") pod "9dc203ca-4a7b-4508-87d6-4922a76a1fb7" (UID: "9dc203ca-4a7b-4508-87d6-4922a76a1fb7"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:43:22 crc kubenswrapper[4854]: I1007 12:43:22.700735 4854 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9dc203ca-4a7b-4508-87d6-4922a76a1fb7-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 07 12:43:22 crc kubenswrapper[4854]: I1007 12:43:22.700761 4854 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9dc203ca-4a7b-4508-87d6-4922a76a1fb7-logs\") on node \"crc\" DevicePath \"\"" Oct 07 12:43:22 crc kubenswrapper[4854]: I1007 12:43:22.704059 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "9dc203ca-4a7b-4508-87d6-4922a76a1fb7" (UID: "9dc203ca-4a7b-4508-87d6-4922a76a1fb7"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 07 12:43:22 crc kubenswrapper[4854]: I1007 12:43:22.706018 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dc203ca-4a7b-4508-87d6-4922a76a1fb7-scripts" (OuterVolumeSpecName: "scripts") pod "9dc203ca-4a7b-4508-87d6-4922a76a1fb7" (UID: "9dc203ca-4a7b-4508-87d6-4922a76a1fb7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:43:22 crc kubenswrapper[4854]: I1007 12:43:22.706345 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9dc203ca-4a7b-4508-87d6-4922a76a1fb7-kube-api-access-clwd6" (OuterVolumeSpecName: "kube-api-access-clwd6") pod "9dc203ca-4a7b-4508-87d6-4922a76a1fb7" (UID: "9dc203ca-4a7b-4508-87d6-4922a76a1fb7"). InnerVolumeSpecName "kube-api-access-clwd6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:43:22 crc kubenswrapper[4854]: I1007 12:43:22.716743 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1279e99a-d16c-441f-9aa8-33efa1cd351f" path="/var/lib/kubelet/pods/1279e99a-d16c-441f-9aa8-33efa1cd351f/volumes" Oct 07 12:43:22 crc kubenswrapper[4854]: I1007 12:43:22.733774 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dc203ca-4a7b-4508-87d6-4922a76a1fb7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9dc203ca-4a7b-4508-87d6-4922a76a1fb7" (UID: "9dc203ca-4a7b-4508-87d6-4922a76a1fb7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:43:22 crc kubenswrapper[4854]: I1007 12:43:22.751206 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dc203ca-4a7b-4508-87d6-4922a76a1fb7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "9dc203ca-4a7b-4508-87d6-4922a76a1fb7" (UID: "9dc203ca-4a7b-4508-87d6-4922a76a1fb7"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:43:22 crc kubenswrapper[4854]: I1007 12:43:22.761631 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dc203ca-4a7b-4508-87d6-4922a76a1fb7-config-data" (OuterVolumeSpecName: "config-data") pod "9dc203ca-4a7b-4508-87d6-4922a76a1fb7" (UID: "9dc203ca-4a7b-4508-87d6-4922a76a1fb7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:43:22 crc kubenswrapper[4854]: I1007 12:43:22.802780 4854 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9dc203ca-4a7b-4508-87d6-4922a76a1fb7-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 12:43:22 crc kubenswrapper[4854]: I1007 12:43:22.802814 4854 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9dc203ca-4a7b-4508-87d6-4922a76a1fb7-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 12:43:22 crc kubenswrapper[4854]: I1007 12:43:22.802834 4854 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Oct 07 12:43:22 crc kubenswrapper[4854]: I1007 12:43:22.802843 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dc203ca-4a7b-4508-87d6-4922a76a1fb7-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 12:43:22 crc kubenswrapper[4854]: I1007 12:43:22.802854 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dc203ca-4a7b-4508-87d6-4922a76a1fb7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:43:22 crc kubenswrapper[4854]: I1007 12:43:22.802864 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clwd6\" (UniqueName: \"kubernetes.io/projected/9dc203ca-4a7b-4508-87d6-4922a76a1fb7-kube-api-access-clwd6\") on node \"crc\" DevicePath \"\"" Oct 07 12:43:22 crc kubenswrapper[4854]: I1007 12:43:22.822256 4854 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Oct 07 12:43:22 crc kubenswrapper[4854]: I1007 12:43:22.905404 4854 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Oct 07 12:43:23 crc kubenswrapper[4854]: I1007 12:43:23.515392 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 12:43:23 crc kubenswrapper[4854]: I1007 12:43:23.552163 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 12:43:23 crc kubenswrapper[4854]: I1007 12:43:23.566709 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 12:43:23 crc kubenswrapper[4854]: I1007 12:43:23.580792 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 12:43:23 crc kubenswrapper[4854]: E1007 12:43:23.581261 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1279e99a-d16c-441f-9aa8-33efa1cd351f" containerName="neutron-api" Oct 07 12:43:23 crc kubenswrapper[4854]: I1007 12:43:23.581287 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="1279e99a-d16c-441f-9aa8-33efa1cd351f" containerName="neutron-api" Oct 07 12:43:23 crc kubenswrapper[4854]: E1007 12:43:23.581331 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dc203ca-4a7b-4508-87d6-4922a76a1fb7" containerName="glance-log" Oct 07 12:43:23 crc kubenswrapper[4854]: I1007 12:43:23.581341 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dc203ca-4a7b-4508-87d6-4922a76a1fb7" containerName="glance-log" Oct 07 12:43:23 crc kubenswrapper[4854]: E1007 12:43:23.581372 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dc203ca-4a7b-4508-87d6-4922a76a1fb7" containerName="glance-httpd" Oct 07 12:43:23 crc kubenswrapper[4854]: I1007 12:43:23.581382 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dc203ca-4a7b-4508-87d6-4922a76a1fb7" containerName="glance-httpd" Oct 07 12:43:23 crc kubenswrapper[4854]: E1007 12:43:23.581401 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1279e99a-d16c-441f-9aa8-33efa1cd351f" containerName="neutron-httpd" Oct 07 12:43:23 crc kubenswrapper[4854]: I1007 12:43:23.581410 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="1279e99a-d16c-441f-9aa8-33efa1cd351f" containerName="neutron-httpd" Oct 07 12:43:23 crc kubenswrapper[4854]: I1007 12:43:23.581604 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="1279e99a-d16c-441f-9aa8-33efa1cd351f" containerName="neutron-api" Oct 07 12:43:23 crc kubenswrapper[4854]: I1007 12:43:23.581620 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="9dc203ca-4a7b-4508-87d6-4922a76a1fb7" containerName="glance-log" Oct 07 12:43:23 crc kubenswrapper[4854]: I1007 12:43:23.581630 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="1279e99a-d16c-441f-9aa8-33efa1cd351f" containerName="neutron-httpd" Oct 07 12:43:23 crc kubenswrapper[4854]: I1007 12:43:23.581636 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="9dc203ca-4a7b-4508-87d6-4922a76a1fb7" containerName="glance-httpd" Oct 07 12:43:23 crc kubenswrapper[4854]: I1007 12:43:23.583859 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 12:43:23 crc kubenswrapper[4854]: I1007 12:43:23.591836 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 07 12:43:23 crc kubenswrapper[4854]: I1007 12:43:23.591844 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 07 12:43:23 crc kubenswrapper[4854]: I1007 12:43:23.596746 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 12:43:23 crc kubenswrapper[4854]: I1007 12:43:23.620130 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a936b898-5163-4c5e-ac30-64af1533cec7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a936b898-5163-4c5e-ac30-64af1533cec7\") " pod="openstack/glance-default-external-api-0" Oct 07 12:43:23 crc kubenswrapper[4854]: I1007 12:43:23.620233 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a936b898-5163-4c5e-ac30-64af1533cec7-config-data\") pod \"glance-default-external-api-0\" (UID: \"a936b898-5163-4c5e-ac30-64af1533cec7\") " pod="openstack/glance-default-external-api-0" Oct 07 12:43:23 crc kubenswrapper[4854]: I1007 12:43:23.620313 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a936b898-5163-4c5e-ac30-64af1533cec7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a936b898-5163-4c5e-ac30-64af1533cec7\") " pod="openstack/glance-default-external-api-0" Oct 07 12:43:23 crc kubenswrapper[4854]: I1007 12:43:23.620423 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"a936b898-5163-4c5e-ac30-64af1533cec7\") " pod="openstack/glance-default-external-api-0" Oct 07 12:43:23 crc kubenswrapper[4854]: I1007 12:43:23.620449 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a936b898-5163-4c5e-ac30-64af1533cec7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a936b898-5163-4c5e-ac30-64af1533cec7\") " pod="openstack/glance-default-external-api-0" Oct 07 12:43:23 crc kubenswrapper[4854]: I1007 12:43:23.620488 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a936b898-5163-4c5e-ac30-64af1533cec7-scripts\") pod \"glance-default-external-api-0\" (UID: \"a936b898-5163-4c5e-ac30-64af1533cec7\") " pod="openstack/glance-default-external-api-0" Oct 07 12:43:23 crc kubenswrapper[4854]: I1007 12:43:23.621519 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a936b898-5163-4c5e-ac30-64af1533cec7-logs\") pod \"glance-default-external-api-0\" (UID: \"a936b898-5163-4c5e-ac30-64af1533cec7\") " pod="openstack/glance-default-external-api-0" Oct 07 12:43:23 crc kubenswrapper[4854]: I1007 12:43:23.625056 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-tlwq8\" (UniqueName: \"kubernetes.io/projected/a936b898-5163-4c5e-ac30-64af1533cec7-kube-api-access-tlwq8\") pod \"glance-default-external-api-0\" (UID: \"a936b898-5163-4c5e-ac30-64af1533cec7\") " pod="openstack/glance-default-external-api-0" Oct 07 12:43:23 crc kubenswrapper[4854]: I1007 12:43:23.726323 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a936b898-5163-4c5e-ac30-64af1533cec7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a936b898-5163-4c5e-ac30-64af1533cec7\") " pod="openstack/glance-default-external-api-0" Oct 07 12:43:23 crc kubenswrapper[4854]: I1007 12:43:23.726382 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a936b898-5163-4c5e-ac30-64af1533cec7-scripts\") pod \"glance-default-external-api-0\" (UID: \"a936b898-5163-4c5e-ac30-64af1533cec7\") " pod="openstack/glance-default-external-api-0" Oct 07 12:43:23 crc kubenswrapper[4854]: I1007 12:43:23.726412 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a936b898-5163-4c5e-ac30-64af1533cec7-logs\") pod \"glance-default-external-api-0\" (UID: \"a936b898-5163-4c5e-ac30-64af1533cec7\") " pod="openstack/glance-default-external-api-0" Oct 07 12:43:23 crc kubenswrapper[4854]: I1007 12:43:23.726457 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlwq8\" (UniqueName: \"kubernetes.io/projected/a936b898-5163-4c5e-ac30-64af1533cec7-kube-api-access-tlwq8\") pod \"glance-default-external-api-0\" (UID: \"a936b898-5163-4c5e-ac30-64af1533cec7\") " pod="openstack/glance-default-external-api-0" Oct 07 12:43:23 crc kubenswrapper[4854]: I1007 12:43:23.726497 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a936b898-5163-4c5e-ac30-64af1533cec7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a936b898-5163-4c5e-ac30-64af1533cec7\") " pod="openstack/glance-default-external-api-0" Oct 07 12:43:23 crc kubenswrapper[4854]: I1007 12:43:23.726519 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a936b898-5163-4c5e-ac30-64af1533cec7-config-data\") pod \"glance-default-external-api-0\" (UID: \"a936b898-5163-4c5e-ac30-64af1533cec7\") " pod="openstack/glance-default-external-api-0" Oct 07 12:43:23 crc kubenswrapper[4854]: I1007 12:43:23.726573 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a936b898-5163-4c5e-ac30-64af1533cec7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a936b898-5163-4c5e-ac30-64af1533cec7\") " pod="openstack/glance-default-external-api-0" Oct 07 12:43:23 crc kubenswrapper[4854]: I1007 12:43:23.726640 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"a936b898-5163-4c5e-ac30-64af1533cec7\") " pod="openstack/glance-default-external-api-0" Oct 07 12:43:23 crc kubenswrapper[4854]: I1007 12:43:23.728019 4854 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"a936b898-5163-4c5e-ac30-64af1533cec7\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-external-api-0" Oct 07 12:43:23 crc kubenswrapper[4854]: I1007 12:43:23.728217 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a936b898-5163-4c5e-ac30-64af1533cec7-logs\") pod \"glance-default-external-api-0\" (UID: \"a936b898-5163-4c5e-ac30-64af1533cec7\") " pod="openstack/glance-default-external-api-0" Oct 07 12:43:23 crc kubenswrapper[4854]: I1007 12:43:23.728358 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a936b898-5163-4c5e-ac30-64af1533cec7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a936b898-5163-4c5e-ac30-64af1533cec7\") " pod="openstack/glance-default-external-api-0" Oct 07 12:43:23 crc kubenswrapper[4854]: I1007 12:43:23.769085 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a936b898-5163-4c5e-ac30-64af1533cec7-scripts\") pod \"glance-default-external-api-0\" (UID: \"a936b898-5163-4c5e-ac30-64af1533cec7\") " pod="openstack/glance-default-external-api-0" Oct 07 12:43:23 crc kubenswrapper[4854]: I1007 12:43:23.769512 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a936b898-5163-4c5e-ac30-64af1533cec7-config-data\") pod \"glance-default-external-api-0\" (UID: \"a936b898-5163-4c5e-ac30-64af1533cec7\") " pod="openstack/glance-default-external-api-0" Oct 07 12:43:23 crc kubenswrapper[4854]: I1007 12:43:23.769892 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a936b898-5163-4c5e-ac30-64af1533cec7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a936b898-5163-4c5e-ac30-64af1533cec7\") " pod="openstack/glance-default-external-api-0" Oct 07 12:43:23 crc kubenswrapper[4854]: I1007 12:43:23.773756 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 07 12:43:23 crc kubenswrapper[4854]: I1007 12:43:23.777174 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlwq8\" (UniqueName: \"kubernetes.io/projected/a936b898-5163-4c5e-ac30-64af1533cec7-kube-api-access-tlwq8\") pod \"glance-default-external-api-0\" (UID: \"a936b898-5163-4c5e-ac30-64af1533cec7\") " pod="openstack/glance-default-external-api-0" Oct 07 12:43:23 crc kubenswrapper[4854]: I1007 12:43:23.795888 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"a936b898-5163-4c5e-ac30-64af1533cec7\") " pod="openstack/glance-default-external-api-0" Oct 07 12:43:23 crc kubenswrapper[4854]: I1007 12:43:23.803948 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a936b898-5163-4c5e-ac30-64af1533cec7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a936b898-5163-4c5e-ac30-64af1533cec7\") " pod="openstack/glance-default-external-api-0" Oct 07 12:43:23 crc kubenswrapper[4854]: I1007 12:43:23.917512 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 12:43:23 crc kubenswrapper[4854]: I1007 12:43:23.936954 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-643b-account-create-q5znc" Oct 07 12:43:24 crc kubenswrapper[4854]: I1007 12:43:24.033136 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ds64s\" (UniqueName: \"kubernetes.io/projected/757432e0-dbdf-40a7-b6e4-2534d6600bea-kube-api-access-ds64s\") pod \"757432e0-dbdf-40a7-b6e4-2534d6600bea\" (UID: \"757432e0-dbdf-40a7-b6e4-2534d6600bea\") " Oct 07 12:43:24 crc kubenswrapper[4854]: I1007 12:43:24.038646 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/757432e0-dbdf-40a7-b6e4-2534d6600bea-kube-api-access-ds64s" (OuterVolumeSpecName: "kube-api-access-ds64s") pod "757432e0-dbdf-40a7-b6e4-2534d6600bea" (UID: "757432e0-dbdf-40a7-b6e4-2534d6600bea"). InnerVolumeSpecName "kube-api-access-ds64s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:43:24 crc kubenswrapper[4854]: I1007 12:43:24.135710 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ds64s\" (UniqueName: \"kubernetes.io/projected/757432e0-dbdf-40a7-b6e4-2534d6600bea-kube-api-access-ds64s\") on node \"crc\" DevicePath \"\"" Oct 07 12:43:24 crc kubenswrapper[4854]: I1007 12:43:24.537277 4854 generic.go:334] "Generic (PLEG): container finished" podID="7a15303e-66e9-4c76-ba02-902d1949cf49" containerID="512023c1348affae988ba42ea4c9852e6519486ab0ddc2e0e6e48fce90d79cf9" exitCode=0 Oct 07 12:43:24 crc kubenswrapper[4854]: I1007 12:43:24.537389 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7a15303e-66e9-4c76-ba02-902d1949cf49","Type":"ContainerDied","Data":"512023c1348affae988ba42ea4c9852e6519486ab0ddc2e0e6e48fce90d79cf9"} Oct 07 12:43:24 crc kubenswrapper[4854]: I1007 12:43:24.537791 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7a15303e-66e9-4c76-ba02-902d1949cf49","Type":"ContainerDied","Data":"daeb0808f9280ca4fadd369f7263738401cbd4f3ae302e4e8cf237c5864f7a73"} Oct 07 12:43:24 crc kubenswrapper[4854]: I1007 12:43:24.537813 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="daeb0808f9280ca4fadd369f7263738401cbd4f3ae302e4e8cf237c5864f7a73" Oct 07 12:43:24 crc kubenswrapper[4854]: I1007 12:43:24.540193 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-643b-account-create-q5znc" event={"ID":"757432e0-dbdf-40a7-b6e4-2534d6600bea","Type":"ContainerDied","Data":"3e2d1a354d5d2fc9d51216328eb9a8450dff49dd4a0ba5659d5837352cbd3ddf"} Oct 07 12:43:24 crc kubenswrapper[4854]: I1007 12:43:24.540211 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-643b-account-create-q5znc" Oct 07 12:43:24 crc kubenswrapper[4854]: I1007 12:43:24.540226 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e2d1a354d5d2fc9d51216328eb9a8450dff49dd4a0ba5659d5837352cbd3ddf" Oct 07 12:43:24 crc kubenswrapper[4854]: W1007 12:43:24.550432 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda936b898_5163_4c5e_ac30_64af1533cec7.slice/crio-178b76a44ed3e91ddc157457a0a57a8f8807036b5e3c023f4860ac6fa7893eb4 WatchSource:0}: Error finding container 178b76a44ed3e91ddc157457a0a57a8f8807036b5e3c023f4860ac6fa7893eb4: Status 404 returned error can't find the container with id 178b76a44ed3e91ddc157457a0a57a8f8807036b5e3c023f4860ac6fa7893eb4 Oct 07 12:43:24 crc kubenswrapper[4854]: I1007 12:43:24.551583 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 12:43:24 crc kubenswrapper[4854]: I1007 12:43:24.614809 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 12:43:24 crc kubenswrapper[4854]: I1007 12:43:24.646135 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a15303e-66e9-4c76-ba02-902d1949cf49-combined-ca-bundle\") pod \"7a15303e-66e9-4c76-ba02-902d1949cf49\" (UID: \"7a15303e-66e9-4c76-ba02-902d1949cf49\") " Oct 07 12:43:24 crc kubenswrapper[4854]: I1007 12:43:24.646267 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7a15303e-66e9-4c76-ba02-902d1949cf49-log-httpd\") pod \"7a15303e-66e9-4c76-ba02-902d1949cf49\" (UID: \"7a15303e-66e9-4c76-ba02-902d1949cf49\") " Oct 07 12:43:24 crc kubenswrapper[4854]: I1007 12:43:24.646311 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8r9rc\" (UniqueName: \"kubernetes.io/projected/7a15303e-66e9-4c76-ba02-902d1949cf49-kube-api-access-8r9rc\") pod \"7a15303e-66e9-4c76-ba02-902d1949cf49\" (UID: \"7a15303e-66e9-4c76-ba02-902d1949cf49\") " Oct 07 12:43:24 crc kubenswrapper[4854]: I1007 12:43:24.646358 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7a15303e-66e9-4c76-ba02-902d1949cf49-run-httpd\") pod \"7a15303e-66e9-4c76-ba02-902d1949cf49\" (UID: \"7a15303e-66e9-4c76-ba02-902d1949cf49\") " Oct 07 12:43:24 crc kubenswrapper[4854]: I1007 12:43:24.646476 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a15303e-66e9-4c76-ba02-902d1949cf49-config-data\") pod \"7a15303e-66e9-4c76-ba02-902d1949cf49\" (UID: \"7a15303e-66e9-4c76-ba02-902d1949cf49\") " Oct 07 12:43:24 crc kubenswrapper[4854]: I1007 12:43:24.646513 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a15303e-66e9-4c76-ba02-902d1949cf49-scripts\") pod \"7a15303e-66e9-4c76-ba02-902d1949cf49\" (UID: \"7a15303e-66e9-4c76-ba02-902d1949cf49\") " Oct 07 12:43:24 crc kubenswrapper[4854]: I1007 12:43:24.646552 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7a15303e-66e9-4c76-ba02-902d1949cf49-sg-core-conf-yaml\") pod 
\"7a15303e-66e9-4c76-ba02-902d1949cf49\" (UID: \"7a15303e-66e9-4c76-ba02-902d1949cf49\") " Oct 07 12:43:24 crc kubenswrapper[4854]: I1007 12:43:24.646652 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a15303e-66e9-4c76-ba02-902d1949cf49-ceilometer-tls-certs\") pod \"7a15303e-66e9-4c76-ba02-902d1949cf49\" (UID: \"7a15303e-66e9-4c76-ba02-902d1949cf49\") " Oct 07 12:43:24 crc kubenswrapper[4854]: I1007 12:43:24.647194 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a15303e-66e9-4c76-ba02-902d1949cf49-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7a15303e-66e9-4c76-ba02-902d1949cf49" (UID: "7a15303e-66e9-4c76-ba02-902d1949cf49"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:43:24 crc kubenswrapper[4854]: I1007 12:43:24.650115 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a15303e-66e9-4c76-ba02-902d1949cf49-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7a15303e-66e9-4c76-ba02-902d1949cf49" (UID: "7a15303e-66e9-4c76-ba02-902d1949cf49"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:43:24 crc kubenswrapper[4854]: I1007 12:43:24.660980 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a15303e-66e9-4c76-ba02-902d1949cf49-scripts" (OuterVolumeSpecName: "scripts") pod "7a15303e-66e9-4c76-ba02-902d1949cf49" (UID: "7a15303e-66e9-4c76-ba02-902d1949cf49"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:43:24 crc kubenswrapper[4854]: I1007 12:43:24.673391 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a15303e-66e9-4c76-ba02-902d1949cf49-kube-api-access-8r9rc" (OuterVolumeSpecName: "kube-api-access-8r9rc") pod "7a15303e-66e9-4c76-ba02-902d1949cf49" (UID: "7a15303e-66e9-4c76-ba02-902d1949cf49"). InnerVolumeSpecName "kube-api-access-8r9rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:43:24 crc kubenswrapper[4854]: I1007 12:43:24.688072 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a15303e-66e9-4c76-ba02-902d1949cf49-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7a15303e-66e9-4c76-ba02-902d1949cf49" (UID: "7a15303e-66e9-4c76-ba02-902d1949cf49"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:43:24 crc kubenswrapper[4854]: I1007 12:43:24.720986 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9dc203ca-4a7b-4508-87d6-4922a76a1fb7" path="/var/lib/kubelet/pods/9dc203ca-4a7b-4508-87d6-4922a76a1fb7/volumes" Oct 07 12:43:24 crc kubenswrapper[4854]: I1007 12:43:24.749095 4854 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7a15303e-66e9-4c76-ba02-902d1949cf49-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 12:43:24 crc kubenswrapper[4854]: I1007 12:43:24.749134 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8r9rc\" (UniqueName: \"kubernetes.io/projected/7a15303e-66e9-4c76-ba02-902d1949cf49-kube-api-access-8r9rc\") on node \"crc\" DevicePath \"\"" Oct 07 12:43:24 crc kubenswrapper[4854]: I1007 12:43:24.749163 4854 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7a15303e-66e9-4c76-ba02-902d1949cf49-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 12:43:24 crc kubenswrapper[4854]: I1007 12:43:24.749174 4854 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a15303e-66e9-4c76-ba02-902d1949cf49-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 12:43:24 crc kubenswrapper[4854]: I1007 12:43:24.749186 4854 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7a15303e-66e9-4c76-ba02-902d1949cf49-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 07 12:43:24 crc kubenswrapper[4854]: I1007 12:43:24.763133 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a15303e-66e9-4c76-ba02-902d1949cf49-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7a15303e-66e9-4c76-ba02-902d1949cf49" (UID: "7a15303e-66e9-4c76-ba02-902d1949cf49"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:43:24 crc kubenswrapper[4854]: I1007 12:43:24.771486 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a15303e-66e9-4c76-ba02-902d1949cf49-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "7a15303e-66e9-4c76-ba02-902d1949cf49" (UID: "7a15303e-66e9-4c76-ba02-902d1949cf49"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:43:24 crc kubenswrapper[4854]: I1007 12:43:24.811410 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a15303e-66e9-4c76-ba02-902d1949cf49-config-data" (OuterVolumeSpecName: "config-data") pod "7a15303e-66e9-4c76-ba02-902d1949cf49" (UID: "7a15303e-66e9-4c76-ba02-902d1949cf49"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:43:24 crc kubenswrapper[4854]: I1007 12:43:24.853695 4854 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a15303e-66e9-4c76-ba02-902d1949cf49-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 12:43:24 crc kubenswrapper[4854]: I1007 12:43:24.853727 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a15303e-66e9-4c76-ba02-902d1949cf49-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:43:24 crc kubenswrapper[4854]: I1007 12:43:24.853737 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a15303e-66e9-4c76-ba02-902d1949cf49-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 12:43:25 crc kubenswrapper[4854]: I1007 12:43:25.560748 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 12:43:25 crc kubenswrapper[4854]: I1007 12:43:25.560839 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a936b898-5163-4c5e-ac30-64af1533cec7","Type":"ContainerStarted","Data":"ab9b63d9a92d2b3c1a80cf5ab47e0bdb681cabb6aa9a7547201bf16b8e4df31e"} Oct 07 12:43:25 crc kubenswrapper[4854]: I1007 12:43:25.561268 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a936b898-5163-4c5e-ac30-64af1533cec7","Type":"ContainerStarted","Data":"178b76a44ed3e91ddc157457a0a57a8f8807036b5e3c023f4860ac6fa7893eb4"} Oct 07 12:43:25 crc kubenswrapper[4854]: I1007 12:43:25.624567 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 12:43:25 crc kubenswrapper[4854]: I1007 12:43:25.636734 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 07 12:43:25 crc kubenswrapper[4854]: I1007 12:43:25.646769 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 07 12:43:25 crc kubenswrapper[4854]: E1007 12:43:25.647195 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a15303e-66e9-4c76-ba02-902d1949cf49" containerName="sg-core" Oct 07 12:43:25 crc kubenswrapper[4854]: I1007 12:43:25.647214 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a15303e-66e9-4c76-ba02-902d1949cf49" containerName="sg-core" Oct 07 12:43:25 crc kubenswrapper[4854]: E1007 12:43:25.647234 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a15303e-66e9-4c76-ba02-902d1949cf49" containerName="ceilometer-central-agent" Oct 07 12:43:25 crc kubenswrapper[4854]: I1007 12:43:25.647241 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a15303e-66e9-4c76-ba02-902d1949cf49" containerName="ceilometer-central-agent" Oct 07 12:43:25 crc kubenswrapper[4854]: E1007 12:43:25.647258 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="757432e0-dbdf-40a7-b6e4-2534d6600bea" containerName="mariadb-account-create" Oct 07 12:43:25 crc kubenswrapper[4854]: I1007 12:43:25.647266 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="757432e0-dbdf-40a7-b6e4-2534d6600bea" containerName="mariadb-account-create" Oct 07 12:43:25 crc kubenswrapper[4854]: E1007 12:43:25.647276 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a15303e-66e9-4c76-ba02-902d1949cf49" containerName="ceilometer-notification-agent" Oct 07 12:43:25 crc 
kubenswrapper[4854]: I1007 12:43:25.647283 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a15303e-66e9-4c76-ba02-902d1949cf49" containerName="ceilometer-notification-agent" Oct 07 12:43:25 crc kubenswrapper[4854]: E1007 12:43:25.647303 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a15303e-66e9-4c76-ba02-902d1949cf49" containerName="proxy-httpd" Oct 07 12:43:25 crc kubenswrapper[4854]: I1007 12:43:25.647310 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a15303e-66e9-4c76-ba02-902d1949cf49" containerName="proxy-httpd" Oct 07 12:43:25 crc kubenswrapper[4854]: I1007 12:43:25.647474 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a15303e-66e9-4c76-ba02-902d1949cf49" containerName="sg-core" Oct 07 12:43:25 crc kubenswrapper[4854]: I1007 12:43:25.647492 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a15303e-66e9-4c76-ba02-902d1949cf49" containerName="proxy-httpd" Oct 07 12:43:25 crc kubenswrapper[4854]: I1007 12:43:25.647502 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a15303e-66e9-4c76-ba02-902d1949cf49" containerName="ceilometer-central-agent" Oct 07 12:43:25 crc kubenswrapper[4854]: I1007 12:43:25.647512 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="757432e0-dbdf-40a7-b6e4-2534d6600bea" containerName="mariadb-account-create" Oct 07 12:43:25 crc kubenswrapper[4854]: I1007 12:43:25.647520 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a15303e-66e9-4c76-ba02-902d1949cf49" containerName="ceilometer-notification-agent" Oct 07 12:43:25 crc kubenswrapper[4854]: I1007 12:43:25.649711 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 12:43:25 crc kubenswrapper[4854]: I1007 12:43:25.651851 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 07 12:43:25 crc kubenswrapper[4854]: I1007 12:43:25.652291 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 07 12:43:25 crc kubenswrapper[4854]: I1007 12:43:25.652487 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 07 12:43:25 crc kubenswrapper[4854]: I1007 12:43:25.662949 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 12:43:25 crc kubenswrapper[4854]: I1007 12:43:25.768794 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198\") " pod="openstack/ceilometer-0" Oct 07 12:43:25 crc kubenswrapper[4854]: I1007 12:43:25.768858 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198-log-httpd\") pod \"ceilometer-0\" (UID: \"bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198\") " pod="openstack/ceilometer-0" Oct 07 12:43:25 crc kubenswrapper[4854]: I1007 12:43:25.768924 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbvl4\" (UniqueName: \"kubernetes.io/projected/bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198-kube-api-access-cbvl4\") pod \"ceilometer-0\" (UID: \"bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198\") " 
pod="openstack/ceilometer-0" Oct 07 12:43:25 crc kubenswrapper[4854]: I1007 12:43:25.768942 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198\") " pod="openstack/ceilometer-0" Oct 07 12:43:25 crc kubenswrapper[4854]: I1007 12:43:25.769017 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198-scripts\") pod \"ceilometer-0\" (UID: \"bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198\") " pod="openstack/ceilometer-0" Oct 07 12:43:25 crc kubenswrapper[4854]: I1007 12:43:25.769079 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198\") " pod="openstack/ceilometer-0" Oct 07 12:43:25 crc kubenswrapper[4854]: I1007 12:43:25.769113 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198-config-data\") pod \"ceilometer-0\" (UID: \"bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198\") " pod="openstack/ceilometer-0" Oct 07 12:43:25 crc kubenswrapper[4854]: I1007 12:43:25.769167 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198-run-httpd\") pod \"ceilometer-0\" (UID: \"bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198\") " pod="openstack/ceilometer-0" Oct 07 12:43:25 crc kubenswrapper[4854]: I1007 12:43:25.870126 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbvl4\" (UniqueName: \"kubernetes.io/projected/bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198-kube-api-access-cbvl4\") pod \"ceilometer-0\" (UID: \"bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198\") " pod="openstack/ceilometer-0" Oct 07 12:43:25 crc kubenswrapper[4854]: I1007 12:43:25.870178 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198\") " pod="openstack/ceilometer-0" Oct 07 12:43:25 crc kubenswrapper[4854]: I1007 12:43:25.870228 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198-scripts\") pod \"ceilometer-0\" (UID: \"bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198\") " pod="openstack/ceilometer-0" Oct 07 12:43:25 crc kubenswrapper[4854]: I1007 12:43:25.870262 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198\") " pod="openstack/ceilometer-0" Oct 07 12:43:25 crc kubenswrapper[4854]: I1007 12:43:25.870282 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198-config-data\") pod \"ceilometer-0\" (UID: \"bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198\") " pod="openstack/ceilometer-0" Oct 07 12:43:25 crc kubenswrapper[4854]: I1007 12:43:25.870319 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198-run-httpd\") pod \"ceilometer-0\" (UID: \"bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198\") " pod="openstack/ceilometer-0" Oct 07 12:43:25 crc kubenswrapper[4854]: I1007 12:43:25.870346 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198\") " pod="openstack/ceilometer-0" Oct 07 12:43:25 crc kubenswrapper[4854]: I1007 12:43:25.870372 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198-log-httpd\") pod \"ceilometer-0\" (UID: \"bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198\") " pod="openstack/ceilometer-0" Oct 07 12:43:25 crc kubenswrapper[4854]: I1007 12:43:25.870725 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198-log-httpd\") pod \"ceilometer-0\" (UID: \"bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198\") " pod="openstack/ceilometer-0" Oct 07 12:43:25 crc kubenswrapper[4854]: I1007 12:43:25.870986 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198-run-httpd\") pod \"ceilometer-0\" (UID: \"bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198\") " pod="openstack/ceilometer-0" Oct 07 12:43:25 crc kubenswrapper[4854]: I1007 12:43:25.875572 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198\") " pod="openstack/ceilometer-0" Oct 07 12:43:25 crc kubenswrapper[4854]: I1007 12:43:25.876750 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198-config-data\") pod \"ceilometer-0\" (UID: \"bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198\") " pod="openstack/ceilometer-0" Oct 07 12:43:25 crc kubenswrapper[4854]: I1007 12:43:25.877567 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198\") " pod="openstack/ceilometer-0" Oct 07 12:43:25 crc kubenswrapper[4854]: I1007 12:43:25.884852 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198-scripts\") pod \"ceilometer-0\" (UID: \"bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198\") " pod="openstack/ceilometer-0" Oct 07 12:43:25 crc kubenswrapper[4854]: I1007 12:43:25.885503 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198-combined-ca-bundle\") pod 
\"ceilometer-0\" (UID: \"bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198\") " pod="openstack/ceilometer-0" Oct 07 12:43:25 crc kubenswrapper[4854]: I1007 12:43:25.894374 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbvl4\" (UniqueName: \"kubernetes.io/projected/bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198-kube-api-access-cbvl4\") pod \"ceilometer-0\" (UID: \"bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198\") " pod="openstack/ceilometer-0" Oct 07 12:43:25 crc kubenswrapper[4854]: I1007 12:43:25.970652 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 12:43:26 crc kubenswrapper[4854]: I1007 12:43:26.417442 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 12:43:26 crc kubenswrapper[4854]: W1007 12:43:26.420591 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc4429b8_da8e_46fd_8ec6_ea5fcc1a6198.slice/crio-6b598b749d7da90fa20b95c6503e2ac1cc4eb2cf6bd3c1ba4ca638751246a6cc WatchSource:0}: Error finding container 6b598b749d7da90fa20b95c6503e2ac1cc4eb2cf6bd3c1ba4ca638751246a6cc: Status 404 returned error can't find the container with id 6b598b749d7da90fa20b95c6503e2ac1cc4eb2cf6bd3c1ba4ca638751246a6cc Oct 07 12:43:26 crc kubenswrapper[4854]: I1007 12:43:26.571556 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a936b898-5163-4c5e-ac30-64af1533cec7","Type":"ContainerStarted","Data":"a963e64a5e60796e5674815e7ff05f24676f7da4a8d14514981bf0f0c9a9c739"} Oct 07 12:43:26 crc kubenswrapper[4854]: I1007 12:43:26.573833 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198","Type":"ContainerStarted","Data":"6b598b749d7da90fa20b95c6503e2ac1cc4eb2cf6bd3c1ba4ca638751246a6cc"} Oct 07 12:43:26 crc kubenswrapper[4854]: I1007 12:43:26.606801 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 12:43:26 crc kubenswrapper[4854]: I1007 12:43:26.716585 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a15303e-66e9-4c76-ba02-902d1949cf49" path="/var/lib/kubelet/pods/7a15303e-66e9-4c76-ba02-902d1949cf49/volumes" Oct 07 12:43:26 crc kubenswrapper[4854]: I1007 12:43:26.717449 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 07 12:43:26 crc kubenswrapper[4854]: I1007 12:43:26.717477 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 07 12:43:26 crc kubenswrapper[4854]: I1007 12:43:26.737979 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 07 12:43:26 crc kubenswrapper[4854]: I1007 12:43:26.755210 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 07 12:43:26 crc kubenswrapper[4854]: I1007 12:43:26.766570 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.766552708 podStartE2EDuration="3.766552708s" podCreationTimestamp="2025-10-07 12:43:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:43:26.619968856 +0000 UTC m=+1122.607801111" 
watchObservedRunningTime="2025-10-07 12:43:26.766552708 +0000 UTC m=+1122.754384963" Oct 07 12:43:27 crc kubenswrapper[4854]: I1007 12:43:27.584007 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198","Type":"ContainerStarted","Data":"0e66bb5e4a5802427e46bf735c2796c91b895f20ecd886ea205aa6d66c586737"} Oct 07 12:43:27 crc kubenswrapper[4854]: I1007 12:43:27.584495 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 07 12:43:27 crc kubenswrapper[4854]: I1007 12:43:27.584530 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 07 12:43:28 crc kubenswrapper[4854]: I1007 12:43:28.595865 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198","Type":"ContainerStarted","Data":"866c4fc273b718201e4a0c66415d7bc4a0c2b56bcaab916d63410b88d97edfc8"} Oct 07 12:43:29 crc kubenswrapper[4854]: I1007 12:43:29.622314 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198","Type":"ContainerStarted","Data":"4c9aba28c14e24369dc5854de285866b8c150d6dc18aed6fb25f5d0ef76b09ed"} Oct 07 12:43:29 crc kubenswrapper[4854]: I1007 12:43:29.782128 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 07 12:43:29 crc kubenswrapper[4854]: I1007 12:43:29.782282 4854 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 07 12:43:29 crc kubenswrapper[4854]: I1007 12:43:29.784221 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 07 12:43:30 crc kubenswrapper[4854]: I1007 12:43:30.797180 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-e568-account-create-hjhqj"] Oct 07 12:43:30 crc kubenswrapper[4854]: I1007 12:43:30.798499 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-e568-account-create-hjhqj" Oct 07 12:43:30 crc kubenswrapper[4854]: I1007 12:43:30.802885 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Oct 07 12:43:30 crc kubenswrapper[4854]: I1007 12:43:30.812726 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-e568-account-create-hjhqj"] Oct 07 12:43:30 crc kubenswrapper[4854]: I1007 12:43:30.972977 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-531a-account-create-zzlft"] Oct 07 12:43:30 crc kubenswrapper[4854]: I1007 12:43:30.974351 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-531a-account-create-zzlft" Oct 07 12:43:30 crc kubenswrapper[4854]: I1007 12:43:30.977187 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Oct 07 12:43:30 crc kubenswrapper[4854]: I1007 12:43:30.992136 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrr7q\" (UniqueName: \"kubernetes.io/projected/3a366a87-147e-465b-80f0-484c351b07c4-kube-api-access-qrr7q\") pod \"nova-cell0-e568-account-create-hjhqj\" (UID: \"3a366a87-147e-465b-80f0-484c351b07c4\") " pod="openstack/nova-cell0-e568-account-create-hjhqj" Oct 07 12:43:31 crc kubenswrapper[4854]: I1007 12:43:31.002732 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-531a-account-create-zzlft"] Oct 07 12:43:31 crc kubenswrapper[4854]: I1007 12:43:31.093914 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrr7q\" (UniqueName: \"kubernetes.io/projected/3a366a87-147e-465b-80f0-484c351b07c4-kube-api-access-qrr7q\") pod \"nova-cell0-e568-account-create-hjhqj\" (UID: \"3a366a87-147e-465b-80f0-484c351b07c4\") " pod="openstack/nova-cell0-e568-account-create-hjhqj" Oct 07 12:43:31 crc kubenswrapper[4854]: I1007 12:43:31.094076 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwm9b\" (UniqueName: \"kubernetes.io/projected/4e76b358-7d43-4d2b-8aec-c90b3e4e1021-kube-api-access-kwm9b\") pod \"nova-cell1-531a-account-create-zzlft\" (UID: \"4e76b358-7d43-4d2b-8aec-c90b3e4e1021\") " pod="openstack/nova-cell1-531a-account-create-zzlft" Oct 07 12:43:31 crc kubenswrapper[4854]: I1007 12:43:31.111958 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrr7q\" (UniqueName: \"kubernetes.io/projected/3a366a87-147e-465b-80f0-484c351b07c4-kube-api-access-qrr7q\") pod \"nova-cell0-e568-account-create-hjhqj\" (UID: \"3a366a87-147e-465b-80f0-484c351b07c4\") " pod="openstack/nova-cell0-e568-account-create-hjhqj" Oct 07 12:43:31 crc kubenswrapper[4854]: I1007 12:43:31.117679 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-e568-account-create-hjhqj" Oct 07 12:43:31 crc kubenswrapper[4854]: I1007 12:43:31.166903 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 07 12:43:31 crc kubenswrapper[4854]: I1007 12:43:31.196197 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwm9b\" (UniqueName: \"kubernetes.io/projected/4e76b358-7d43-4d2b-8aec-c90b3e4e1021-kube-api-access-kwm9b\") pod \"nova-cell1-531a-account-create-zzlft\" (UID: \"4e76b358-7d43-4d2b-8aec-c90b3e4e1021\") " pod="openstack/nova-cell1-531a-account-create-zzlft" Oct 07 12:43:31 crc kubenswrapper[4854]: I1007 12:43:31.234460 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwm9b\" (UniqueName: \"kubernetes.io/projected/4e76b358-7d43-4d2b-8aec-c90b3e4e1021-kube-api-access-kwm9b\") pod \"nova-cell1-531a-account-create-zzlft\" (UID: \"4e76b358-7d43-4d2b-8aec-c90b3e4e1021\") " pod="openstack/nova-cell1-531a-account-create-zzlft" Oct 07 12:43:31 crc kubenswrapper[4854]: I1007 12:43:31.366645 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-531a-account-create-zzlft" Oct 07 12:43:31 crc kubenswrapper[4854]: I1007 12:43:31.659505 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198","Type":"ContainerStarted","Data":"128d2414d1af63511ce849e03979372ff7ae46dcd0e1cc78521005e83a16a040"} Oct 07 12:43:31 crc kubenswrapper[4854]: I1007 12:43:31.660015 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198" containerName="ceilometer-central-agent" containerID="cri-o://0e66bb5e4a5802427e46bf735c2796c91b895f20ecd886ea205aa6d66c586737" gracePeriod=30 Oct 07 12:43:31 crc kubenswrapper[4854]: I1007 12:43:31.660117 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 07 12:43:31 crc kubenswrapper[4854]: I1007 12:43:31.660533 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198" containerName="proxy-httpd" containerID="cri-o://128d2414d1af63511ce849e03979372ff7ae46dcd0e1cc78521005e83a16a040" gracePeriod=30 Oct 07 12:43:31 crc kubenswrapper[4854]: I1007 12:43:31.660576 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198" containerName="sg-core" containerID="cri-o://4c9aba28c14e24369dc5854de285866b8c150d6dc18aed6fb25f5d0ef76b09ed" gracePeriod=30 Oct 07 12:43:31 crc kubenswrapper[4854]: I1007 12:43:31.660614 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198" containerName="ceilometer-notification-agent" containerID="cri-o://866c4fc273b718201e4a0c66415d7bc4a0c2b56bcaab916d63410b88d97edfc8" gracePeriod=30 Oct 07 12:43:31 crc kubenswrapper[4854]: I1007 12:43:31.699340 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.248054755 podStartE2EDuration="6.694115789s" podCreationTimestamp="2025-10-07 12:43:25 +0000 UTC" firstStartedPulling="2025-10-07 12:43:26.423266548 +0000 UTC m=+1122.411098803" lastFinishedPulling="2025-10-07 12:43:30.869327582 +0000 UTC m=+1126.857159837" observedRunningTime="2025-10-07 12:43:31.686693582 +0000 UTC m=+1127.674525837" watchObservedRunningTime="2025-10-07 12:43:31.694115789 +0000 UTC m=+1127.681948044" Oct 07 12:43:31 crc kubenswrapper[4854]: I1007 12:43:31.769323 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-e568-account-create-hjhqj"] Oct 07 12:43:31 crc kubenswrapper[4854]: I1007 12:43:31.927185 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-531a-account-create-zzlft"] Oct 07 12:43:31 crc kubenswrapper[4854]: W1007 12:43:31.978999 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e76b358_7d43_4d2b_8aec_c90b3e4e1021.slice/crio-5ad86dfc6e16f4daff947d79173d2b6bece73fd4b33a941f8b0f018f46e1cffe WatchSource:0}: Error finding container 5ad86dfc6e16f4daff947d79173d2b6bece73fd4b33a941f8b0f018f46e1cffe: Status 404 returned error can't find the container with id 5ad86dfc6e16f4daff947d79173d2b6bece73fd4b33a941f8b0f018f46e1cffe Oct 07 12:43:32 crc kubenswrapper[4854]: I1007 12:43:32.669090 4854 generic.go:334] "Generic (PLEG): container finished" 
podID="3a366a87-147e-465b-80f0-484c351b07c4" containerID="22436301dc51bb07908e34417e3def5bdc13345529116eef4d8a01af67741c06" exitCode=0 Oct 07 12:43:32 crc kubenswrapper[4854]: I1007 12:43:32.669124 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-e568-account-create-hjhqj" event={"ID":"3a366a87-147e-465b-80f0-484c351b07c4","Type":"ContainerDied","Data":"22436301dc51bb07908e34417e3def5bdc13345529116eef4d8a01af67741c06"} Oct 07 12:43:32 crc kubenswrapper[4854]: I1007 12:43:32.669228 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-e568-account-create-hjhqj" event={"ID":"3a366a87-147e-465b-80f0-484c351b07c4","Type":"ContainerStarted","Data":"be3dbaf83f8473449a97c3b708fee5cdd83f383e0f371b44f7e4ba8d581ad9dc"} Oct 07 12:43:32 crc kubenswrapper[4854]: I1007 12:43:32.671010 4854 generic.go:334] "Generic (PLEG): container finished" podID="4e76b358-7d43-4d2b-8aec-c90b3e4e1021" containerID="4f650846279a6172ef21f2029a264b0a99efd4af6906193c022b89126c8051db" exitCode=0 Oct 07 12:43:32 crc kubenswrapper[4854]: I1007 12:43:32.671069 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-531a-account-create-zzlft" event={"ID":"4e76b358-7d43-4d2b-8aec-c90b3e4e1021","Type":"ContainerDied","Data":"4f650846279a6172ef21f2029a264b0a99efd4af6906193c022b89126c8051db"} Oct 07 12:43:32 crc kubenswrapper[4854]: I1007 12:43:32.671189 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-531a-account-create-zzlft" event={"ID":"4e76b358-7d43-4d2b-8aec-c90b3e4e1021","Type":"ContainerStarted","Data":"5ad86dfc6e16f4daff947d79173d2b6bece73fd4b33a941f8b0f018f46e1cffe"} Oct 07 12:43:32 crc kubenswrapper[4854]: I1007 12:43:32.674473 4854 generic.go:334] "Generic (PLEG): container finished" podID="bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198" containerID="128d2414d1af63511ce849e03979372ff7ae46dcd0e1cc78521005e83a16a040" exitCode=0 Oct 07 12:43:32 crc kubenswrapper[4854]: I1007 12:43:32.674516 4854 generic.go:334] "Generic (PLEG): container finished" podID="bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198" containerID="4c9aba28c14e24369dc5854de285866b8c150d6dc18aed6fb25f5d0ef76b09ed" exitCode=2 Oct 07 12:43:32 crc kubenswrapper[4854]: I1007 12:43:32.674526 4854 generic.go:334] "Generic (PLEG): container finished" podID="bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198" containerID="866c4fc273b718201e4a0c66415d7bc4a0c2b56bcaab916d63410b88d97edfc8" exitCode=0 Oct 07 12:43:32 crc kubenswrapper[4854]: I1007 12:43:32.674554 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198","Type":"ContainerDied","Data":"128d2414d1af63511ce849e03979372ff7ae46dcd0e1cc78521005e83a16a040"} Oct 07 12:43:32 crc kubenswrapper[4854]: I1007 12:43:32.674586 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198","Type":"ContainerDied","Data":"4c9aba28c14e24369dc5854de285866b8c150d6dc18aed6fb25f5d0ef76b09ed"} Oct 07 12:43:32 crc kubenswrapper[4854]: I1007 12:43:32.674600 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198","Type":"ContainerDied","Data":"866c4fc273b718201e4a0c66415d7bc4a0c2b56bcaab916d63410b88d97edfc8"} Oct 07 12:43:33 crc kubenswrapper[4854]: I1007 12:43:33.918773 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 07 12:43:33 crc 
kubenswrapper[4854]: I1007 12:43:33.919387 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 07 12:43:33 crc kubenswrapper[4854]: I1007 12:43:33.956811 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 07 12:43:33 crc kubenswrapper[4854]: I1007 12:43:33.995194 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 07 12:43:34 crc kubenswrapper[4854]: I1007 12:43:34.110535 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-e568-account-create-hjhqj" Oct 07 12:43:34 crc kubenswrapper[4854]: I1007 12:43:34.116584 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-531a-account-create-zzlft" Oct 07 12:43:34 crc kubenswrapper[4854]: I1007 12:43:34.269336 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrr7q\" (UniqueName: \"kubernetes.io/projected/3a366a87-147e-465b-80f0-484c351b07c4-kube-api-access-qrr7q\") pod \"3a366a87-147e-465b-80f0-484c351b07c4\" (UID: \"3a366a87-147e-465b-80f0-484c351b07c4\") " Oct 07 12:43:34 crc kubenswrapper[4854]: I1007 12:43:34.269582 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwm9b\" (UniqueName: \"kubernetes.io/projected/4e76b358-7d43-4d2b-8aec-c90b3e4e1021-kube-api-access-kwm9b\") pod \"4e76b358-7d43-4d2b-8aec-c90b3e4e1021\" (UID: \"4e76b358-7d43-4d2b-8aec-c90b3e4e1021\") " Oct 07 12:43:34 crc kubenswrapper[4854]: I1007 12:43:34.296733 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e76b358-7d43-4d2b-8aec-c90b3e4e1021-kube-api-access-kwm9b" (OuterVolumeSpecName: "kube-api-access-kwm9b") pod "4e76b358-7d43-4d2b-8aec-c90b3e4e1021" (UID: "4e76b358-7d43-4d2b-8aec-c90b3e4e1021"). InnerVolumeSpecName "kube-api-access-kwm9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:43:34 crc kubenswrapper[4854]: I1007 12:43:34.297911 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a366a87-147e-465b-80f0-484c351b07c4-kube-api-access-qrr7q" (OuterVolumeSpecName: "kube-api-access-qrr7q") pod "3a366a87-147e-465b-80f0-484c351b07c4" (UID: "3a366a87-147e-465b-80f0-484c351b07c4"). InnerVolumeSpecName "kube-api-access-qrr7q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:43:34 crc kubenswrapper[4854]: I1007 12:43:34.371281 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrr7q\" (UniqueName: \"kubernetes.io/projected/3a366a87-147e-465b-80f0-484c351b07c4-kube-api-access-qrr7q\") on node \"crc\" DevicePath \"\"" Oct 07 12:43:34 crc kubenswrapper[4854]: I1007 12:43:34.371309 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwm9b\" (UniqueName: \"kubernetes.io/projected/4e76b358-7d43-4d2b-8aec-c90b3e4e1021-kube-api-access-kwm9b\") on node \"crc\" DevicePath \"\"" Oct 07 12:43:34 crc kubenswrapper[4854]: I1007 12:43:34.694853 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-e568-account-create-hjhqj" event={"ID":"3a366a87-147e-465b-80f0-484c351b07c4","Type":"ContainerDied","Data":"be3dbaf83f8473449a97c3b708fee5cdd83f383e0f371b44f7e4ba8d581ad9dc"} Oct 07 12:43:34 crc kubenswrapper[4854]: I1007 12:43:34.695227 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be3dbaf83f8473449a97c3b708fee5cdd83f383e0f371b44f7e4ba8d581ad9dc" Oct 07 12:43:34 crc kubenswrapper[4854]: I1007 12:43:34.694894 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-e568-account-create-hjhqj" Oct 07 12:43:34 crc kubenswrapper[4854]: I1007 12:43:34.696522 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-531a-account-create-zzlft" event={"ID":"4e76b358-7d43-4d2b-8aec-c90b3e4e1021","Type":"ContainerDied","Data":"5ad86dfc6e16f4daff947d79173d2b6bece73fd4b33a941f8b0f018f46e1cffe"} Oct 07 12:43:34 crc kubenswrapper[4854]: I1007 12:43:34.696642 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ad86dfc6e16f4daff947d79173d2b6bece73fd4b33a941f8b0f018f46e1cffe" Oct 07 12:43:34 crc kubenswrapper[4854]: I1007 12:43:34.696731 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 07 12:43:34 crc kubenswrapper[4854]: I1007 12:43:34.696791 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 07 12:43:34 crc kubenswrapper[4854]: I1007 12:43:34.696559 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-531a-account-create-zzlft" Oct 07 12:43:36 crc kubenswrapper[4854]: I1007 12:43:36.105783 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rrljj"] Oct 07 12:43:36 crc kubenswrapper[4854]: E1007 12:43:36.106173 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e76b358-7d43-4d2b-8aec-c90b3e4e1021" containerName="mariadb-account-create" Oct 07 12:43:36 crc kubenswrapper[4854]: I1007 12:43:36.106185 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e76b358-7d43-4d2b-8aec-c90b3e4e1021" containerName="mariadb-account-create" Oct 07 12:43:36 crc kubenswrapper[4854]: E1007 12:43:36.106196 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a366a87-147e-465b-80f0-484c351b07c4" containerName="mariadb-account-create" Oct 07 12:43:36 crc kubenswrapper[4854]: I1007 12:43:36.106201 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a366a87-147e-465b-80f0-484c351b07c4" containerName="mariadb-account-create" Oct 07 12:43:36 crc kubenswrapper[4854]: I1007 12:43:36.106390 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a366a87-147e-465b-80f0-484c351b07c4" containerName="mariadb-account-create" Oct 07 12:43:36 crc kubenswrapper[4854]: I1007 12:43:36.106409 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e76b358-7d43-4d2b-8aec-c90b3e4e1021" containerName="mariadb-account-create" Oct 07 12:43:36 crc kubenswrapper[4854]: I1007 12:43:36.106987 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-rrljj" Oct 07 12:43:36 crc kubenswrapper[4854]: I1007 12:43:36.111667 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Oct 07 12:43:36 crc kubenswrapper[4854]: I1007 12:43:36.111772 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 07 12:43:36 crc kubenswrapper[4854]: I1007 12:43:36.112320 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-5xvmn" Oct 07 12:43:36 crc kubenswrapper[4854]: I1007 12:43:36.114469 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rrljj"] Oct 07 12:43:36 crc kubenswrapper[4854]: I1007 12:43:36.208194 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85fde367-0bd7-457c-bfa7-6d72cece519e-scripts\") pod \"nova-cell0-conductor-db-sync-rrljj\" (UID: \"85fde367-0bd7-457c-bfa7-6d72cece519e\") " pod="openstack/nova-cell0-conductor-db-sync-rrljj" Oct 07 12:43:36 crc kubenswrapper[4854]: I1007 12:43:36.208260 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85fde367-0bd7-457c-bfa7-6d72cece519e-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-rrljj\" (UID: \"85fde367-0bd7-457c-bfa7-6d72cece519e\") " pod="openstack/nova-cell0-conductor-db-sync-rrljj" Oct 07 12:43:36 crc kubenswrapper[4854]: I1007 12:43:36.208296 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hhjs\" (UniqueName: \"kubernetes.io/projected/85fde367-0bd7-457c-bfa7-6d72cece519e-kube-api-access-5hhjs\") pod \"nova-cell0-conductor-db-sync-rrljj\" (UID: 
\"85fde367-0bd7-457c-bfa7-6d72cece519e\") " pod="openstack/nova-cell0-conductor-db-sync-rrljj" Oct 07 12:43:36 crc kubenswrapper[4854]: I1007 12:43:36.208368 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85fde367-0bd7-457c-bfa7-6d72cece519e-config-data\") pod \"nova-cell0-conductor-db-sync-rrljj\" (UID: \"85fde367-0bd7-457c-bfa7-6d72cece519e\") " pod="openstack/nova-cell0-conductor-db-sync-rrljj" Oct 07 12:43:36 crc kubenswrapper[4854]: I1007 12:43:36.310663 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85fde367-0bd7-457c-bfa7-6d72cece519e-config-data\") pod \"nova-cell0-conductor-db-sync-rrljj\" (UID: \"85fde367-0bd7-457c-bfa7-6d72cece519e\") " pod="openstack/nova-cell0-conductor-db-sync-rrljj" Oct 07 12:43:36 crc kubenswrapper[4854]: I1007 12:43:36.310821 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85fde367-0bd7-457c-bfa7-6d72cece519e-scripts\") pod \"nova-cell0-conductor-db-sync-rrljj\" (UID: \"85fde367-0bd7-457c-bfa7-6d72cece519e\") " pod="openstack/nova-cell0-conductor-db-sync-rrljj" Oct 07 12:43:36 crc kubenswrapper[4854]: I1007 12:43:36.310865 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85fde367-0bd7-457c-bfa7-6d72cece519e-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-rrljj\" (UID: \"85fde367-0bd7-457c-bfa7-6d72cece519e\") " pod="openstack/nova-cell0-conductor-db-sync-rrljj" Oct 07 12:43:36 crc kubenswrapper[4854]: I1007 12:43:36.310912 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hhjs\" (UniqueName: \"kubernetes.io/projected/85fde367-0bd7-457c-bfa7-6d72cece519e-kube-api-access-5hhjs\") pod \"nova-cell0-conductor-db-sync-rrljj\" (UID: \"85fde367-0bd7-457c-bfa7-6d72cece519e\") " pod="openstack/nova-cell0-conductor-db-sync-rrljj" Oct 07 12:43:36 crc kubenswrapper[4854]: I1007 12:43:36.317919 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85fde367-0bd7-457c-bfa7-6d72cece519e-config-data\") pod \"nova-cell0-conductor-db-sync-rrljj\" (UID: \"85fde367-0bd7-457c-bfa7-6d72cece519e\") " pod="openstack/nova-cell0-conductor-db-sync-rrljj" Oct 07 12:43:36 crc kubenswrapper[4854]: I1007 12:43:36.318037 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85fde367-0bd7-457c-bfa7-6d72cece519e-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-rrljj\" (UID: \"85fde367-0bd7-457c-bfa7-6d72cece519e\") " pod="openstack/nova-cell0-conductor-db-sync-rrljj" Oct 07 12:43:36 crc kubenswrapper[4854]: I1007 12:43:36.318730 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85fde367-0bd7-457c-bfa7-6d72cece519e-scripts\") pod \"nova-cell0-conductor-db-sync-rrljj\" (UID: \"85fde367-0bd7-457c-bfa7-6d72cece519e\") " pod="openstack/nova-cell0-conductor-db-sync-rrljj" Oct 07 12:43:36 crc kubenswrapper[4854]: I1007 12:43:36.334678 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hhjs\" (UniqueName: \"kubernetes.io/projected/85fde367-0bd7-457c-bfa7-6d72cece519e-kube-api-access-5hhjs\") pod 
\"nova-cell0-conductor-db-sync-rrljj\" (UID: \"85fde367-0bd7-457c-bfa7-6d72cece519e\") " pod="openstack/nova-cell0-conductor-db-sync-rrljj" Oct 07 12:43:36 crc kubenswrapper[4854]: I1007 12:43:36.426384 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-rrljj" Oct 07 12:43:36 crc kubenswrapper[4854]: I1007 12:43:36.885748 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rrljj"] Oct 07 12:43:36 crc kubenswrapper[4854]: W1007 12:43:36.891576 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85fde367_0bd7_457c_bfa7_6d72cece519e.slice/crio-94eee5c450cdc50166881e1fcf74851557ad16ab2236ad7dc151363a2ab9081b WatchSource:0}: Error finding container 94eee5c450cdc50166881e1fcf74851557ad16ab2236ad7dc151363a2ab9081b: Status 404 returned error can't find the container with id 94eee5c450cdc50166881e1fcf74851557ad16ab2236ad7dc151363a2ab9081b Oct 07 12:43:36 crc kubenswrapper[4854]: I1007 12:43:36.966664 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 07 12:43:36 crc kubenswrapper[4854]: I1007 12:43:36.966789 4854 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 07 12:43:36 crc kubenswrapper[4854]: I1007 12:43:36.973448 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 07 12:43:37 crc kubenswrapper[4854]: I1007 12:43:37.726075 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-rrljj" event={"ID":"85fde367-0bd7-457c-bfa7-6d72cece519e","Type":"ContainerStarted","Data":"94eee5c450cdc50166881e1fcf74851557ad16ab2236ad7dc151363a2ab9081b"} Oct 07 12:43:42 crc kubenswrapper[4854]: I1007 12:43:42.823246 4854 generic.go:334] "Generic (PLEG): container finished" podID="bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198" containerID="0e66bb5e4a5802427e46bf735c2796c91b895f20ecd886ea205aa6d66c586737" exitCode=0 Oct 07 12:43:42 crc kubenswrapper[4854]: I1007 12:43:42.824854 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198","Type":"ContainerDied","Data":"0e66bb5e4a5802427e46bf735c2796c91b895f20ecd886ea205aa6d66c586737"} Oct 07 12:43:44 crc kubenswrapper[4854]: I1007 12:43:44.348066 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 12:43:44 crc kubenswrapper[4854]: I1007 12:43:44.368256 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198-combined-ca-bundle\") pod \"bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198\" (UID: \"bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198\") " Oct 07 12:43:44 crc kubenswrapper[4854]: I1007 12:43:44.368336 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198-config-data\") pod \"bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198\" (UID: \"bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198\") " Oct 07 12:43:44 crc kubenswrapper[4854]: I1007 12:43:44.368453 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198-sg-core-conf-yaml\") pod \"bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198\" (UID: \"bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198\") " Oct 07 12:43:44 crc kubenswrapper[4854]: I1007 12:43:44.368501 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198-log-httpd\") pod \"bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198\" (UID: \"bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198\") " Oct 07 12:43:44 crc kubenswrapper[4854]: I1007 12:43:44.368585 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198-ceilometer-tls-certs\") pod \"bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198\" (UID: \"bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198\") " Oct 07 12:43:44 crc kubenswrapper[4854]: I1007 12:43:44.368618 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198-scripts\") pod \"bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198\" (UID: \"bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198\") " Oct 07 12:43:44 crc kubenswrapper[4854]: I1007 12:43:44.368646 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198-run-httpd\") pod \"bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198\" (UID: \"bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198\") " Oct 07 12:43:44 crc kubenswrapper[4854]: I1007 12:43:44.368673 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbvl4\" (UniqueName: \"kubernetes.io/projected/bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198-kube-api-access-cbvl4\") pod \"bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198\" (UID: \"bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198\") " Oct 07 12:43:44 crc kubenswrapper[4854]: I1007 12:43:44.369282 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198" (UID: "bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:43:44 crc kubenswrapper[4854]: I1007 12:43:44.369388 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198" (UID: "bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:43:44 crc kubenswrapper[4854]: I1007 12:43:44.377320 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198-scripts" (OuterVolumeSpecName: "scripts") pod "bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198" (UID: "bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:43:44 crc kubenswrapper[4854]: I1007 12:43:44.391357 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198-kube-api-access-cbvl4" (OuterVolumeSpecName: "kube-api-access-cbvl4") pod "bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198" (UID: "bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198"). InnerVolumeSpecName "kube-api-access-cbvl4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:43:44 crc kubenswrapper[4854]: I1007 12:43:44.467395 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198" (UID: "bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:43:44 crc kubenswrapper[4854]: I1007 12:43:44.468162 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198" (UID: "bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:43:44 crc kubenswrapper[4854]: I1007 12:43:44.470359 4854 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 12:43:44 crc kubenswrapper[4854]: I1007 12:43:44.470405 4854 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 12:43:44 crc kubenswrapper[4854]: I1007 12:43:44.470417 4854 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 12:43:44 crc kubenswrapper[4854]: I1007 12:43:44.470428 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbvl4\" (UniqueName: \"kubernetes.io/projected/bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198-kube-api-access-cbvl4\") on node \"crc\" DevicePath \"\"" Oct 07 12:43:44 crc kubenswrapper[4854]: I1007 12:43:44.470438 4854 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 07 12:43:44 crc kubenswrapper[4854]: I1007 12:43:44.470471 4854 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 12:43:44 crc kubenswrapper[4854]: I1007 12:43:44.513332 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198" (UID: "bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:43:44 crc kubenswrapper[4854]: I1007 12:43:44.572243 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:43:44 crc kubenswrapper[4854]: I1007 12:43:44.611029 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198-config-data" (OuterVolumeSpecName: "config-data") pod "bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198" (UID: "bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:43:44 crc kubenswrapper[4854]: I1007 12:43:44.673626 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 12:43:44 crc kubenswrapper[4854]: I1007 12:43:44.864448 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-rrljj" event={"ID":"85fde367-0bd7-457c-bfa7-6d72cece519e","Type":"ContainerStarted","Data":"82f009346826759cec317a3226ffe23dbce339f0f5f46da29da41f5605075cb6"} Oct 07 12:43:44 crc kubenswrapper[4854]: I1007 12:43:44.869105 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198","Type":"ContainerDied","Data":"6b598b749d7da90fa20b95c6503e2ac1cc4eb2cf6bd3c1ba4ca638751246a6cc"} Oct 07 12:43:44 crc kubenswrapper[4854]: I1007 12:43:44.869166 4854 scope.go:117] "RemoveContainer" containerID="128d2414d1af63511ce849e03979372ff7ae46dcd0e1cc78521005e83a16a040" Oct 07 12:43:44 crc kubenswrapper[4854]: I1007 12:43:44.869405 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 12:43:44 crc kubenswrapper[4854]: I1007 12:43:44.888040 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-rrljj" podStartSLOduration=1.7364978770000001 podStartE2EDuration="8.888019308s" podCreationTimestamp="2025-10-07 12:43:36 +0000 UTC" firstStartedPulling="2025-10-07 12:43:36.893112386 +0000 UTC m=+1132.880944641" lastFinishedPulling="2025-10-07 12:43:44.044633817 +0000 UTC m=+1140.032466072" observedRunningTime="2025-10-07 12:43:44.884725572 +0000 UTC m=+1140.872557857" watchObservedRunningTime="2025-10-07 12:43:44.888019308 +0000 UTC m=+1140.875851563" Oct 07 12:43:44 crc kubenswrapper[4854]: I1007 12:43:44.916495 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 12:43:44 crc kubenswrapper[4854]: I1007 12:43:44.922237 4854 scope.go:117] "RemoveContainer" containerID="4c9aba28c14e24369dc5854de285866b8c150d6dc18aed6fb25f5d0ef76b09ed" Oct 07 12:43:44 crc kubenswrapper[4854]: I1007 12:43:44.928361 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 07 12:43:44 crc kubenswrapper[4854]: I1007 12:43:44.938222 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 07 12:43:44 crc kubenswrapper[4854]: E1007 12:43:44.938992 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198" containerName="proxy-httpd" Oct 07 12:43:44 crc kubenswrapper[4854]: I1007 12:43:44.939097 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198" containerName="proxy-httpd" Oct 07 12:43:44 crc kubenswrapper[4854]: E1007 12:43:44.939208 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198" containerName="ceilometer-central-agent" Oct 07 12:43:44 crc kubenswrapper[4854]: I1007 12:43:44.939308 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198" containerName="ceilometer-central-agent" Oct 07 12:43:44 crc kubenswrapper[4854]: E1007 12:43:44.939424 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198" 
containerName="ceilometer-notification-agent" Oct 07 12:43:44 crc kubenswrapper[4854]: I1007 12:43:44.939498 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198" containerName="ceilometer-notification-agent" Oct 07 12:43:44 crc kubenswrapper[4854]: E1007 12:43:44.939596 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198" containerName="sg-core" Oct 07 12:43:44 crc kubenswrapper[4854]: I1007 12:43:44.939669 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198" containerName="sg-core" Oct 07 12:43:44 crc kubenswrapper[4854]: I1007 12:43:44.940090 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198" containerName="proxy-httpd" Oct 07 12:43:44 crc kubenswrapper[4854]: I1007 12:43:44.940239 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198" containerName="sg-core" Oct 07 12:43:44 crc kubenswrapper[4854]: I1007 12:43:44.940333 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198" containerName="ceilometer-central-agent" Oct 07 12:43:44 crc kubenswrapper[4854]: I1007 12:43:44.940430 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198" containerName="ceilometer-notification-agent" Oct 07 12:43:44 crc kubenswrapper[4854]: I1007 12:43:44.942737 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 12:43:44 crc kubenswrapper[4854]: I1007 12:43:44.943045 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 12:43:44 crc kubenswrapper[4854]: I1007 12:43:44.945018 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 07 12:43:44 crc kubenswrapper[4854]: I1007 12:43:44.945192 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 07 12:43:44 crc kubenswrapper[4854]: I1007 12:43:44.946325 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 07 12:43:44 crc kubenswrapper[4854]: I1007 12:43:44.953040 4854 scope.go:117] "RemoveContainer" containerID="866c4fc273b718201e4a0c66415d7bc4a0c2b56bcaab916d63410b88d97edfc8" Oct 07 12:43:44 crc kubenswrapper[4854]: I1007 12:43:44.989642 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01251017-3c38-4f4d-a3cb-5fd3bf8eba53-log-httpd\") pod \"ceilometer-0\" (UID: \"01251017-3c38-4f4d-a3cb-5fd3bf8eba53\") " pod="openstack/ceilometer-0" Oct 07 12:43:44 crc kubenswrapper[4854]: I1007 12:43:44.989678 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/01251017-3c38-4f4d-a3cb-5fd3bf8eba53-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"01251017-3c38-4f4d-a3cb-5fd3bf8eba53\") " pod="openstack/ceilometer-0" Oct 07 12:43:44 crc kubenswrapper[4854]: I1007 12:43:44.989705 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01251017-3c38-4f4d-a3cb-5fd3bf8eba53-config-data\") pod \"ceilometer-0\" (UID: \"01251017-3c38-4f4d-a3cb-5fd3bf8eba53\") " 
pod="openstack/ceilometer-0" Oct 07 12:43:44 crc kubenswrapper[4854]: I1007 12:43:44.989730 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01251017-3c38-4f4d-a3cb-5fd3bf8eba53-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"01251017-3c38-4f4d-a3cb-5fd3bf8eba53\") " pod="openstack/ceilometer-0" Oct 07 12:43:44 crc kubenswrapper[4854]: I1007 12:43:44.989746 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01251017-3c38-4f4d-a3cb-5fd3bf8eba53-run-httpd\") pod \"ceilometer-0\" (UID: \"01251017-3c38-4f4d-a3cb-5fd3bf8eba53\") " pod="openstack/ceilometer-0" Oct 07 12:43:44 crc kubenswrapper[4854]: I1007 12:43:44.989807 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpg7r\" (UniqueName: \"kubernetes.io/projected/01251017-3c38-4f4d-a3cb-5fd3bf8eba53-kube-api-access-mpg7r\") pod \"ceilometer-0\" (UID: \"01251017-3c38-4f4d-a3cb-5fd3bf8eba53\") " pod="openstack/ceilometer-0" Oct 07 12:43:44 crc kubenswrapper[4854]: I1007 12:43:44.989853 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/01251017-3c38-4f4d-a3cb-5fd3bf8eba53-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"01251017-3c38-4f4d-a3cb-5fd3bf8eba53\") " pod="openstack/ceilometer-0" Oct 07 12:43:44 crc kubenswrapper[4854]: I1007 12:43:44.989924 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01251017-3c38-4f4d-a3cb-5fd3bf8eba53-scripts\") pod \"ceilometer-0\" (UID: \"01251017-3c38-4f4d-a3cb-5fd3bf8eba53\") " pod="openstack/ceilometer-0" Oct 07 12:43:45 crc kubenswrapper[4854]: I1007 12:43:45.075215 4854 scope.go:117] "RemoveContainer" containerID="0e66bb5e4a5802427e46bf735c2796c91b895f20ecd886ea205aa6d66c586737" Oct 07 12:43:45 crc kubenswrapper[4854]: I1007 12:43:45.091976 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01251017-3c38-4f4d-a3cb-5fd3bf8eba53-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"01251017-3c38-4f4d-a3cb-5fd3bf8eba53\") " pod="openstack/ceilometer-0" Oct 07 12:43:45 crc kubenswrapper[4854]: I1007 12:43:45.092016 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01251017-3c38-4f4d-a3cb-5fd3bf8eba53-run-httpd\") pod \"ceilometer-0\" (UID: \"01251017-3c38-4f4d-a3cb-5fd3bf8eba53\") " pod="openstack/ceilometer-0" Oct 07 12:43:45 crc kubenswrapper[4854]: I1007 12:43:45.092080 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpg7r\" (UniqueName: \"kubernetes.io/projected/01251017-3c38-4f4d-a3cb-5fd3bf8eba53-kube-api-access-mpg7r\") pod \"ceilometer-0\" (UID: \"01251017-3c38-4f4d-a3cb-5fd3bf8eba53\") " pod="openstack/ceilometer-0" Oct 07 12:43:45 crc kubenswrapper[4854]: I1007 12:43:45.092126 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/01251017-3c38-4f4d-a3cb-5fd3bf8eba53-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"01251017-3c38-4f4d-a3cb-5fd3bf8eba53\") " pod="openstack/ceilometer-0" Oct 07 12:43:45 crc 
kubenswrapper[4854]: I1007 12:43:45.092377 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01251017-3c38-4f4d-a3cb-5fd3bf8eba53-scripts\") pod \"ceilometer-0\" (UID: \"01251017-3c38-4f4d-a3cb-5fd3bf8eba53\") " pod="openstack/ceilometer-0" Oct 07 12:43:45 crc kubenswrapper[4854]: I1007 12:43:45.092588 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01251017-3c38-4f4d-a3cb-5fd3bf8eba53-run-httpd\") pod \"ceilometer-0\" (UID: \"01251017-3c38-4f4d-a3cb-5fd3bf8eba53\") " pod="openstack/ceilometer-0" Oct 07 12:43:45 crc kubenswrapper[4854]: I1007 12:43:45.092765 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01251017-3c38-4f4d-a3cb-5fd3bf8eba53-log-httpd\") pod \"ceilometer-0\" (UID: \"01251017-3c38-4f4d-a3cb-5fd3bf8eba53\") " pod="openstack/ceilometer-0" Oct 07 12:43:45 crc kubenswrapper[4854]: I1007 12:43:45.092445 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01251017-3c38-4f4d-a3cb-5fd3bf8eba53-log-httpd\") pod \"ceilometer-0\" (UID: \"01251017-3c38-4f4d-a3cb-5fd3bf8eba53\") " pod="openstack/ceilometer-0" Oct 07 12:43:45 crc kubenswrapper[4854]: I1007 12:43:45.092817 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/01251017-3c38-4f4d-a3cb-5fd3bf8eba53-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"01251017-3c38-4f4d-a3cb-5fd3bf8eba53\") " pod="openstack/ceilometer-0" Oct 07 12:43:45 crc kubenswrapper[4854]: I1007 12:43:45.092841 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01251017-3c38-4f4d-a3cb-5fd3bf8eba53-config-data\") pod \"ceilometer-0\" (UID: \"01251017-3c38-4f4d-a3cb-5fd3bf8eba53\") " pod="openstack/ceilometer-0" Oct 07 12:43:45 crc kubenswrapper[4854]: I1007 12:43:45.100022 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01251017-3c38-4f4d-a3cb-5fd3bf8eba53-scripts\") pod \"ceilometer-0\" (UID: \"01251017-3c38-4f4d-a3cb-5fd3bf8eba53\") " pod="openstack/ceilometer-0" Oct 07 12:43:45 crc kubenswrapper[4854]: I1007 12:43:45.101565 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/01251017-3c38-4f4d-a3cb-5fd3bf8eba53-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"01251017-3c38-4f4d-a3cb-5fd3bf8eba53\") " pod="openstack/ceilometer-0" Oct 07 12:43:45 crc kubenswrapper[4854]: I1007 12:43:45.102486 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01251017-3c38-4f4d-a3cb-5fd3bf8eba53-config-data\") pod \"ceilometer-0\" (UID: \"01251017-3c38-4f4d-a3cb-5fd3bf8eba53\") " pod="openstack/ceilometer-0" Oct 07 12:43:45 crc kubenswrapper[4854]: I1007 12:43:45.107277 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/01251017-3c38-4f4d-a3cb-5fd3bf8eba53-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"01251017-3c38-4f4d-a3cb-5fd3bf8eba53\") " pod="openstack/ceilometer-0" Oct 07 12:43:45 crc kubenswrapper[4854]: I1007 12:43:45.108047 4854 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01251017-3c38-4f4d-a3cb-5fd3bf8eba53-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"01251017-3c38-4f4d-a3cb-5fd3bf8eba53\") " pod="openstack/ceilometer-0" Oct 07 12:43:45 crc kubenswrapper[4854]: I1007 12:43:45.110640 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpg7r\" (UniqueName: \"kubernetes.io/projected/01251017-3c38-4f4d-a3cb-5fd3bf8eba53-kube-api-access-mpg7r\") pod \"ceilometer-0\" (UID: \"01251017-3c38-4f4d-a3cb-5fd3bf8eba53\") " pod="openstack/ceilometer-0" Oct 07 12:43:45 crc kubenswrapper[4854]: I1007 12:43:45.372594 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 12:43:45 crc kubenswrapper[4854]: W1007 12:43:45.881315 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01251017_3c38_4f4d_a3cb_5fd3bf8eba53.slice/crio-de9cd7a6e7faacf596ad95f0f7250775d6ad9cd26e7c441c179b4c0630bf4fa8 WatchSource:0}: Error finding container de9cd7a6e7faacf596ad95f0f7250775d6ad9cd26e7c441c179b4c0630bf4fa8: Status 404 returned error can't find the container with id de9cd7a6e7faacf596ad95f0f7250775d6ad9cd26e7c441c179b4c0630bf4fa8 Oct 07 12:43:45 crc kubenswrapper[4854]: I1007 12:43:45.882832 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 12:43:46 crc kubenswrapper[4854]: I1007 12:43:46.713992 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198" path="/var/lib/kubelet/pods/bc4429b8-da8e-46fd-8ec6-ea5fcc1a6198/volumes" Oct 07 12:43:46 crc kubenswrapper[4854]: I1007 12:43:46.890289 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01251017-3c38-4f4d-a3cb-5fd3bf8eba53","Type":"ContainerStarted","Data":"e52a5fe19c22ed0d1d7f7e107733d571c35976c26ad6d63e0d2d4bd822824beb"} Oct 07 12:43:46 crc kubenswrapper[4854]: I1007 12:43:46.890336 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01251017-3c38-4f4d-a3cb-5fd3bf8eba53","Type":"ContainerStarted","Data":"de9cd7a6e7faacf596ad95f0f7250775d6ad9cd26e7c441c179b4c0630bf4fa8"} Oct 07 12:43:47 crc kubenswrapper[4854]: I1007 12:43:47.921801 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01251017-3c38-4f4d-a3cb-5fd3bf8eba53","Type":"ContainerStarted","Data":"bc607149edb68e28a38fa027d6f7857c2f8eea9ca9f90368d61c3f02967a426e"} Oct 07 12:43:48 crc kubenswrapper[4854]: I1007 12:43:48.936776 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01251017-3c38-4f4d-a3cb-5fd3bf8eba53","Type":"ContainerStarted","Data":"a52f4c10743192f2c44c050f76016293b1b1b4e4acb038044e33e6710bfa676c"} Oct 07 12:43:49 crc kubenswrapper[4854]: I1007 12:43:49.954850 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01251017-3c38-4f4d-a3cb-5fd3bf8eba53","Type":"ContainerStarted","Data":"ada33b6dd438b612ea3e209d98c214192ea4704b45010905b5df97852888b529"} Oct 07 12:43:49 crc kubenswrapper[4854]: I1007 12:43:49.955479 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 07 12:43:49 crc kubenswrapper[4854]: I1007 12:43:49.980555 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" 
podStartSLOduration=2.6303113160000002 podStartE2EDuration="5.980531218s" podCreationTimestamp="2025-10-07 12:43:44 +0000 UTC" firstStartedPulling="2025-10-07 12:43:45.883776971 +0000 UTC m=+1141.871609226" lastFinishedPulling="2025-10-07 12:43:49.233996873 +0000 UTC m=+1145.221829128" observedRunningTime="2025-10-07 12:43:49.977121648 +0000 UTC m=+1145.964953903" watchObservedRunningTime="2025-10-07 12:43:49.980531218 +0000 UTC m=+1145.968363473" Oct 07 12:44:00 crc kubenswrapper[4854]: I1007 12:44:00.075907 4854 generic.go:334] "Generic (PLEG): container finished" podID="85fde367-0bd7-457c-bfa7-6d72cece519e" containerID="82f009346826759cec317a3226ffe23dbce339f0f5f46da29da41f5605075cb6" exitCode=0 Oct 07 12:44:00 crc kubenswrapper[4854]: I1007 12:44:00.076113 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-rrljj" event={"ID":"85fde367-0bd7-457c-bfa7-6d72cece519e","Type":"ContainerDied","Data":"82f009346826759cec317a3226ffe23dbce339f0f5f46da29da41f5605075cb6"} Oct 07 12:44:01 crc kubenswrapper[4854]: I1007 12:44:01.477944 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-rrljj" Oct 07 12:44:01 crc kubenswrapper[4854]: I1007 12:44:01.561988 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85fde367-0bd7-457c-bfa7-6d72cece519e-scripts\") pod \"85fde367-0bd7-457c-bfa7-6d72cece519e\" (UID: \"85fde367-0bd7-457c-bfa7-6d72cece519e\") " Oct 07 12:44:01 crc kubenswrapper[4854]: I1007 12:44:01.562083 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85fde367-0bd7-457c-bfa7-6d72cece519e-config-data\") pod \"85fde367-0bd7-457c-bfa7-6d72cece519e\" (UID: \"85fde367-0bd7-457c-bfa7-6d72cece519e\") " Oct 07 12:44:01 crc kubenswrapper[4854]: I1007 12:44:01.562197 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hhjs\" (UniqueName: \"kubernetes.io/projected/85fde367-0bd7-457c-bfa7-6d72cece519e-kube-api-access-5hhjs\") pod \"85fde367-0bd7-457c-bfa7-6d72cece519e\" (UID: \"85fde367-0bd7-457c-bfa7-6d72cece519e\") " Oct 07 12:44:01 crc kubenswrapper[4854]: I1007 12:44:01.562469 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85fde367-0bd7-457c-bfa7-6d72cece519e-combined-ca-bundle\") pod \"85fde367-0bd7-457c-bfa7-6d72cece519e\" (UID: \"85fde367-0bd7-457c-bfa7-6d72cece519e\") " Oct 07 12:44:01 crc kubenswrapper[4854]: I1007 12:44:01.567979 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85fde367-0bd7-457c-bfa7-6d72cece519e-scripts" (OuterVolumeSpecName: "scripts") pod "85fde367-0bd7-457c-bfa7-6d72cece519e" (UID: "85fde367-0bd7-457c-bfa7-6d72cece519e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:44:01 crc kubenswrapper[4854]: I1007 12:44:01.569454 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85fde367-0bd7-457c-bfa7-6d72cece519e-kube-api-access-5hhjs" (OuterVolumeSpecName: "kube-api-access-5hhjs") pod "85fde367-0bd7-457c-bfa7-6d72cece519e" (UID: "85fde367-0bd7-457c-bfa7-6d72cece519e"). InnerVolumeSpecName "kube-api-access-5hhjs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:44:01 crc kubenswrapper[4854]: I1007 12:44:01.590121 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85fde367-0bd7-457c-bfa7-6d72cece519e-config-data" (OuterVolumeSpecName: "config-data") pod "85fde367-0bd7-457c-bfa7-6d72cece519e" (UID: "85fde367-0bd7-457c-bfa7-6d72cece519e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:44:01 crc kubenswrapper[4854]: I1007 12:44:01.591870 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85fde367-0bd7-457c-bfa7-6d72cece519e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "85fde367-0bd7-457c-bfa7-6d72cece519e" (UID: "85fde367-0bd7-457c-bfa7-6d72cece519e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:44:01 crc kubenswrapper[4854]: I1007 12:44:01.665128 4854 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85fde367-0bd7-457c-bfa7-6d72cece519e-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 12:44:01 crc kubenswrapper[4854]: I1007 12:44:01.665190 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85fde367-0bd7-457c-bfa7-6d72cece519e-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 12:44:01 crc kubenswrapper[4854]: I1007 12:44:01.665200 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hhjs\" (UniqueName: \"kubernetes.io/projected/85fde367-0bd7-457c-bfa7-6d72cece519e-kube-api-access-5hhjs\") on node \"crc\" DevicePath \"\"" Oct 07 12:44:01 crc kubenswrapper[4854]: I1007 12:44:01.665209 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85fde367-0bd7-457c-bfa7-6d72cece519e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:44:02 crc kubenswrapper[4854]: I1007 12:44:02.103450 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-rrljj" event={"ID":"85fde367-0bd7-457c-bfa7-6d72cece519e","Type":"ContainerDied","Data":"94eee5c450cdc50166881e1fcf74851557ad16ab2236ad7dc151363a2ab9081b"} Oct 07 12:44:02 crc kubenswrapper[4854]: I1007 12:44:02.103928 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94eee5c450cdc50166881e1fcf74851557ad16ab2236ad7dc151363a2ab9081b" Oct 07 12:44:02 crc kubenswrapper[4854]: I1007 12:44:02.103751 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-rrljj" Oct 07 12:44:02 crc kubenswrapper[4854]: I1007 12:44:02.285480 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 07 12:44:02 crc kubenswrapper[4854]: E1007 12:44:02.286095 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85fde367-0bd7-457c-bfa7-6d72cece519e" containerName="nova-cell0-conductor-db-sync" Oct 07 12:44:02 crc kubenswrapper[4854]: I1007 12:44:02.286127 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="85fde367-0bd7-457c-bfa7-6d72cece519e" containerName="nova-cell0-conductor-db-sync" Oct 07 12:44:02 crc kubenswrapper[4854]: I1007 12:44:02.286491 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="85fde367-0bd7-457c-bfa7-6d72cece519e" containerName="nova-cell0-conductor-db-sync" Oct 07 12:44:02 crc kubenswrapper[4854]: I1007 12:44:02.287530 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 07 12:44:02 crc kubenswrapper[4854]: I1007 12:44:02.309449 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 07 12:44:02 crc kubenswrapper[4854]: I1007 12:44:02.309917 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-5xvmn" Oct 07 12:44:02 crc kubenswrapper[4854]: I1007 12:44:02.324646 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 07 12:44:02 crc kubenswrapper[4854]: I1007 12:44:02.380049 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3565e266-6994-4000-a4f2-2901e22f6682-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"3565e266-6994-4000-a4f2-2901e22f6682\") " pod="openstack/nova-cell0-conductor-0" Oct 07 12:44:02 crc kubenswrapper[4854]: I1007 12:44:02.380133 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76jz4\" (UniqueName: \"kubernetes.io/projected/3565e266-6994-4000-a4f2-2901e22f6682-kube-api-access-76jz4\") pod \"nova-cell0-conductor-0\" (UID: \"3565e266-6994-4000-a4f2-2901e22f6682\") " pod="openstack/nova-cell0-conductor-0" Oct 07 12:44:02 crc kubenswrapper[4854]: I1007 12:44:02.380393 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3565e266-6994-4000-a4f2-2901e22f6682-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"3565e266-6994-4000-a4f2-2901e22f6682\") " pod="openstack/nova-cell0-conductor-0" Oct 07 12:44:02 crc kubenswrapper[4854]: I1007 12:44:02.481648 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3565e266-6994-4000-a4f2-2901e22f6682-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"3565e266-6994-4000-a4f2-2901e22f6682\") " pod="openstack/nova-cell0-conductor-0" Oct 07 12:44:02 crc kubenswrapper[4854]: I1007 12:44:02.481746 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3565e266-6994-4000-a4f2-2901e22f6682-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"3565e266-6994-4000-a4f2-2901e22f6682\") " pod="openstack/nova-cell0-conductor-0" Oct 07 12:44:02 crc kubenswrapper[4854]: I1007 
12:44:02.481810 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76jz4\" (UniqueName: \"kubernetes.io/projected/3565e266-6994-4000-a4f2-2901e22f6682-kube-api-access-76jz4\") pod \"nova-cell0-conductor-0\" (UID: \"3565e266-6994-4000-a4f2-2901e22f6682\") " pod="openstack/nova-cell0-conductor-0" Oct 07 12:44:02 crc kubenswrapper[4854]: I1007 12:44:02.486009 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3565e266-6994-4000-a4f2-2901e22f6682-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"3565e266-6994-4000-a4f2-2901e22f6682\") " pod="openstack/nova-cell0-conductor-0" Oct 07 12:44:02 crc kubenswrapper[4854]: I1007 12:44:02.486293 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3565e266-6994-4000-a4f2-2901e22f6682-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"3565e266-6994-4000-a4f2-2901e22f6682\") " pod="openstack/nova-cell0-conductor-0" Oct 07 12:44:02 crc kubenswrapper[4854]: I1007 12:44:02.496764 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76jz4\" (UniqueName: \"kubernetes.io/projected/3565e266-6994-4000-a4f2-2901e22f6682-kube-api-access-76jz4\") pod \"nova-cell0-conductor-0\" (UID: \"3565e266-6994-4000-a4f2-2901e22f6682\") " pod="openstack/nova-cell0-conductor-0" Oct 07 12:44:02 crc kubenswrapper[4854]: I1007 12:44:02.620781 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 07 12:44:03 crc kubenswrapper[4854]: I1007 12:44:03.169462 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 07 12:44:04 crc kubenswrapper[4854]: I1007 12:44:04.126484 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"3565e266-6994-4000-a4f2-2901e22f6682","Type":"ContainerStarted","Data":"6b55461da6eb4b748690750f389a1e88f87574727ef1064972561b0e7d0151ea"} Oct 07 12:44:04 crc kubenswrapper[4854]: I1007 12:44:04.127853 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 07 12:44:04 crc kubenswrapper[4854]: I1007 12:44:04.127925 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"3565e266-6994-4000-a4f2-2901e22f6682","Type":"ContainerStarted","Data":"cb1ea59674a5427ac3dffbdb7b70cd2fd85f11c37af0b58c75c5288f3840438a"} Oct 07 12:44:04 crc kubenswrapper[4854]: I1007 12:44:04.148184 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.148124783 podStartE2EDuration="2.148124783s" podCreationTimestamp="2025-10-07 12:44:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:44:04.145841256 +0000 UTC m=+1160.133673531" watchObservedRunningTime="2025-10-07 12:44:04.148124783 +0000 UTC m=+1160.135957038" Oct 07 12:44:12 crc kubenswrapper[4854]: I1007 12:44:12.669121 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.216725 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-phsdv"] Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.219439 
4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-phsdv" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.231220 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.231539 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.264926 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-phsdv"] Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.352217 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/563fb438-9713-4237-8ad3-16a4614cbcfd-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-phsdv\" (UID: \"563fb438-9713-4237-8ad3-16a4614cbcfd\") " pod="openstack/nova-cell0-cell-mapping-phsdv" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.352811 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/563fb438-9713-4237-8ad3-16a4614cbcfd-scripts\") pod \"nova-cell0-cell-mapping-phsdv\" (UID: \"563fb438-9713-4237-8ad3-16a4614cbcfd\") " pod="openstack/nova-cell0-cell-mapping-phsdv" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.352859 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/563fb438-9713-4237-8ad3-16a4614cbcfd-config-data\") pod \"nova-cell0-cell-mapping-phsdv\" (UID: \"563fb438-9713-4237-8ad3-16a4614cbcfd\") " pod="openstack/nova-cell0-cell-mapping-phsdv" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.352896 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mh5vr\" (UniqueName: \"kubernetes.io/projected/563fb438-9713-4237-8ad3-16a4614cbcfd-kube-api-access-mh5vr\") pod \"nova-cell0-cell-mapping-phsdv\" (UID: \"563fb438-9713-4237-8ad3-16a4614cbcfd\") " pod="openstack/nova-cell0-cell-mapping-phsdv" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.402059 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.403291 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.408140 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.419036 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.445189 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.446698 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.453914 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.454966 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/563fb438-9713-4237-8ad3-16a4614cbcfd-config-data\") pod \"nova-cell0-cell-mapping-phsdv\" (UID: \"563fb438-9713-4237-8ad3-16a4614cbcfd\") " pod="openstack/nova-cell0-cell-mapping-phsdv" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.455039 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mh5vr\" (UniqueName: \"kubernetes.io/projected/563fb438-9713-4237-8ad3-16a4614cbcfd-kube-api-access-mh5vr\") pod \"nova-cell0-cell-mapping-phsdv\" (UID: \"563fb438-9713-4237-8ad3-16a4614cbcfd\") " pod="openstack/nova-cell0-cell-mapping-phsdv" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.455094 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/563fb438-9713-4237-8ad3-16a4614cbcfd-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-phsdv\" (UID: \"563fb438-9713-4237-8ad3-16a4614cbcfd\") " pod="openstack/nova-cell0-cell-mapping-phsdv" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.455203 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/563fb438-9713-4237-8ad3-16a4614cbcfd-scripts\") pod \"nova-cell0-cell-mapping-phsdv\" (UID: \"563fb438-9713-4237-8ad3-16a4614cbcfd\") " pod="openstack/nova-cell0-cell-mapping-phsdv" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.469328 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/563fb438-9713-4237-8ad3-16a4614cbcfd-scripts\") pod \"nova-cell0-cell-mapping-phsdv\" (UID: \"563fb438-9713-4237-8ad3-16a4614cbcfd\") " pod="openstack/nova-cell0-cell-mapping-phsdv" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.470361 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/563fb438-9713-4237-8ad3-16a4614cbcfd-config-data\") pod \"nova-cell0-cell-mapping-phsdv\" (UID: \"563fb438-9713-4237-8ad3-16a4614cbcfd\") " pod="openstack/nova-cell0-cell-mapping-phsdv" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.483786 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.486949 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/563fb438-9713-4237-8ad3-16a4614cbcfd-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-phsdv\" (UID: \"563fb438-9713-4237-8ad3-16a4614cbcfd\") " pod="openstack/nova-cell0-cell-mapping-phsdv" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.499014 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.500933 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.501821 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mh5vr\" (UniqueName: \"kubernetes.io/projected/563fb438-9713-4237-8ad3-16a4614cbcfd-kube-api-access-mh5vr\") pod \"nova-cell0-cell-mapping-phsdv\" (UID: \"563fb438-9713-4237-8ad3-16a4614cbcfd\") " pod="openstack/nova-cell0-cell-mapping-phsdv" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.504840 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.522651 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.556417 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/821711bc-a957-460f-8614-5cb1d5ed1629-logs\") pod \"nova-api-0\" (UID: \"821711bc-a957-460f-8614-5cb1d5ed1629\") " pod="openstack/nova-api-0" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.556499 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2062490-ee95-46b5-9f6c-20ecd2ab4003-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a2062490-ee95-46b5-9f6c-20ecd2ab4003\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.556542 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/821711bc-a957-460f-8614-5cb1d5ed1629-config-data\") pod \"nova-api-0\" (UID: \"821711bc-a957-460f-8614-5cb1d5ed1629\") " pod="openstack/nova-api-0" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.556604 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/821711bc-a957-460f-8614-5cb1d5ed1629-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"821711bc-a957-460f-8614-5cb1d5ed1629\") " pod="openstack/nova-api-0" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.556625 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vb2ql\" (UniqueName: \"kubernetes.io/projected/a2062490-ee95-46b5-9f6c-20ecd2ab4003-kube-api-access-vb2ql\") pod \"nova-cell1-novncproxy-0\" (UID: \"a2062490-ee95-46b5-9f6c-20ecd2ab4003\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.556679 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cc4n\" (UniqueName: \"kubernetes.io/projected/821711bc-a957-460f-8614-5cb1d5ed1629-kube-api-access-4cc4n\") pod \"nova-api-0\" (UID: \"821711bc-a957-460f-8614-5cb1d5ed1629\") " pod="openstack/nova-api-0" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.556709 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2062490-ee95-46b5-9f6c-20ecd2ab4003-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a2062490-ee95-46b5-9f6c-20ecd2ab4003\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.557983 4854 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-phsdv" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.593934 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.595499 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.605528 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.658930 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/821711bc-a957-460f-8614-5cb1d5ed1629-logs\") pod \"nova-api-0\" (UID: \"821711bc-a957-460f-8614-5cb1d5ed1629\") " pod="openstack/nova-api-0" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.659035 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6sgx\" (UniqueName: \"kubernetes.io/projected/2b7a0f72-0c95-44aa-86bb-7e4e090e6cf6-kube-api-access-w6sgx\") pod \"nova-scheduler-0\" (UID: \"2b7a0f72-0c95-44aa-86bb-7e4e090e6cf6\") " pod="openstack/nova-scheduler-0" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.659107 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2062490-ee95-46b5-9f6c-20ecd2ab4003-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a2062490-ee95-46b5-9f6c-20ecd2ab4003\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.659180 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/821711bc-a957-460f-8614-5cb1d5ed1629-config-data\") pod \"nova-api-0\" (UID: \"821711bc-a957-460f-8614-5cb1d5ed1629\") " pod="openstack/nova-api-0" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.659231 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b7a0f72-0c95-44aa-86bb-7e4e090e6cf6-config-data\") pod \"nova-scheduler-0\" (UID: \"2b7a0f72-0c95-44aa-86bb-7e4e090e6cf6\") " pod="openstack/nova-scheduler-0" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.659267 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/821711bc-a957-460f-8614-5cb1d5ed1629-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"821711bc-a957-460f-8614-5cb1d5ed1629\") " pod="openstack/nova-api-0" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.659290 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vb2ql\" (UniqueName: \"kubernetes.io/projected/a2062490-ee95-46b5-9f6c-20ecd2ab4003-kube-api-access-vb2ql\") pod \"nova-cell1-novncproxy-0\" (UID: \"a2062490-ee95-46b5-9f6c-20ecd2ab4003\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.659360 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cc4n\" (UniqueName: \"kubernetes.io/projected/821711bc-a957-460f-8614-5cb1d5ed1629-kube-api-access-4cc4n\") pod \"nova-api-0\" (UID: \"821711bc-a957-460f-8614-5cb1d5ed1629\") " pod="openstack/nova-api-0" Oct 07 12:44:13 crc 
kubenswrapper[4854]: I1007 12:44:13.659391 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2062490-ee95-46b5-9f6c-20ecd2ab4003-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a2062490-ee95-46b5-9f6c-20ecd2ab4003\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.659429 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b7a0f72-0c95-44aa-86bb-7e4e090e6cf6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2b7a0f72-0c95-44aa-86bb-7e4e090e6cf6\") " pod="openstack/nova-scheduler-0" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.660124 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/821711bc-a957-460f-8614-5cb1d5ed1629-logs\") pod \"nova-api-0\" (UID: \"821711bc-a957-460f-8614-5cb1d5ed1629\") " pod="openstack/nova-api-0" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.672138 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/821711bc-a957-460f-8614-5cb1d5ed1629-config-data\") pod \"nova-api-0\" (UID: \"821711bc-a957-460f-8614-5cb1d5ed1629\") " pod="openstack/nova-api-0" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.674515 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/821711bc-a957-460f-8614-5cb1d5ed1629-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"821711bc-a957-460f-8614-5cb1d5ed1629\") " pod="openstack/nova-api-0" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.679961 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2062490-ee95-46b5-9f6c-20ecd2ab4003-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a2062490-ee95-46b5-9f6c-20ecd2ab4003\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.684742 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2062490-ee95-46b5-9f6c-20ecd2ab4003-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a2062490-ee95-46b5-9f6c-20ecd2ab4003\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.684872 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.697873 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cc4n\" (UniqueName: \"kubernetes.io/projected/821711bc-a957-460f-8614-5cb1d5ed1629-kube-api-access-4cc4n\") pod \"nova-api-0\" (UID: \"821711bc-a957-460f-8614-5cb1d5ed1629\") " pod="openstack/nova-api-0" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.705831 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vb2ql\" (UniqueName: \"kubernetes.io/projected/a2062490-ee95-46b5-9f6c-20ecd2ab4003-kube-api-access-vb2ql\") pod \"nova-cell1-novncproxy-0\" (UID: \"a2062490-ee95-46b5-9f6c-20ecd2ab4003\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.717437 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.750721 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-qrkmh"] Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.756013 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-qrkmh" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.767790 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b7a0f72-0c95-44aa-86bb-7e4e090e6cf6-config-data\") pod \"nova-scheduler-0\" (UID: \"2b7a0f72-0c95-44aa-86bb-7e4e090e6cf6\") " pod="openstack/nova-scheduler-0" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.767852 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9qlg\" (UniqueName: \"kubernetes.io/projected/f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b-kube-api-access-r9qlg\") pod \"nova-metadata-0\" (UID: \"f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b\") " pod="openstack/nova-metadata-0" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.767964 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b-logs\") pod \"nova-metadata-0\" (UID: \"f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b\") " pod="openstack/nova-metadata-0" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.767995 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b7a0f72-0c95-44aa-86bb-7e4e090e6cf6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2b7a0f72-0c95-44aa-86bb-7e4e090e6cf6\") " pod="openstack/nova-scheduler-0" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.768119 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6sgx\" (UniqueName: \"kubernetes.io/projected/2b7a0f72-0c95-44aa-86bb-7e4e090e6cf6-kube-api-access-w6sgx\") pod \"nova-scheduler-0\" (UID: \"2b7a0f72-0c95-44aa-86bb-7e4e090e6cf6\") " pod="openstack/nova-scheduler-0" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.768159 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b-config-data\") pod \"nova-metadata-0\" (UID: \"f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b\") " pod="openstack/nova-metadata-0" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.768193 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b\") " pod="openstack/nova-metadata-0" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.775936 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-qrkmh"] Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.778849 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b7a0f72-0c95-44aa-86bb-7e4e090e6cf6-config-data\") pod \"nova-scheduler-0\" (UID: \"2b7a0f72-0c95-44aa-86bb-7e4e090e6cf6\") " pod="openstack/nova-scheduler-0" Oct 07 
12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.787816 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b7a0f72-0c95-44aa-86bb-7e4e090e6cf6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2b7a0f72-0c95-44aa-86bb-7e4e090e6cf6\") " pod="openstack/nova-scheduler-0" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.789471 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6sgx\" (UniqueName: \"kubernetes.io/projected/2b7a0f72-0c95-44aa-86bb-7e4e090e6cf6-kube-api-access-w6sgx\") pod \"nova-scheduler-0\" (UID: \"2b7a0f72-0c95-44aa-86bb-7e4e090e6cf6\") " pod="openstack/nova-scheduler-0" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.862843 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.872793 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9qlg\" (UniqueName: \"kubernetes.io/projected/f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b-kube-api-access-r9qlg\") pod \"nova-metadata-0\" (UID: \"f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b\") " pod="openstack/nova-metadata-0" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.874532 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b-logs\") pod \"nova-metadata-0\" (UID: \"f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b\") " pod="openstack/nova-metadata-0" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.874586 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/08918bc4-e9c7-4b84-8983-ea3c5a62aa3b-dns-svc\") pod \"dnsmasq-dns-757b4f8459-qrkmh\" (UID: \"08918bc4-e9c7-4b84-8983-ea3c5a62aa3b\") " pod="openstack/dnsmasq-dns-757b4f8459-qrkmh" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.874603 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08918bc4-e9c7-4b84-8983-ea3c5a62aa3b-config\") pod \"dnsmasq-dns-757b4f8459-qrkmh\" (UID: \"08918bc4-e9c7-4b84-8983-ea3c5a62aa3b\") " pod="openstack/dnsmasq-dns-757b4f8459-qrkmh" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.877378 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b-logs\") pod \"nova-metadata-0\" (UID: \"f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b\") " pod="openstack/nova-metadata-0" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.877803 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.878552 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/08918bc4-e9c7-4b84-8983-ea3c5a62aa3b-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-qrkmh\" (UID: \"08918bc4-e9c7-4b84-8983-ea3c5a62aa3b\") " pod="openstack/dnsmasq-dns-757b4f8459-qrkmh" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.878610 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b-config-data\") pod \"nova-metadata-0\" (UID: \"f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b\") " pod="openstack/nova-metadata-0" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.878639 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b\") " pod="openstack/nova-metadata-0" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.878708 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/08918bc4-e9c7-4b84-8983-ea3c5a62aa3b-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-qrkmh\" (UID: \"08918bc4-e9c7-4b84-8983-ea3c5a62aa3b\") " pod="openstack/dnsmasq-dns-757b4f8459-qrkmh" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.878734 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/08918bc4-e9c7-4b84-8983-ea3c5a62aa3b-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-qrkmh\" (UID: \"08918bc4-e9c7-4b84-8983-ea3c5a62aa3b\") " pod="openstack/dnsmasq-dns-757b4f8459-qrkmh" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.878752 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8gbx\" (UniqueName: \"kubernetes.io/projected/08918bc4-e9c7-4b84-8983-ea3c5a62aa3b-kube-api-access-k8gbx\") pod \"dnsmasq-dns-757b4f8459-qrkmh\" (UID: \"08918bc4-e9c7-4b84-8983-ea3c5a62aa3b\") " pod="openstack/dnsmasq-dns-757b4f8459-qrkmh" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.894652 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b-config-data\") pod \"nova-metadata-0\" (UID: \"f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b\") " pod="openstack/nova-metadata-0" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.898663 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9qlg\" (UniqueName: \"kubernetes.io/projected/f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b-kube-api-access-r9qlg\") pod \"nova-metadata-0\" (UID: \"f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b\") " pod="openstack/nova-metadata-0" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.899263 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b\") " pod="openstack/nova-metadata-0" Oct 07 12:44:13 crc kubenswrapper[4854]: 
I1007 12:44:13.981855 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08918bc4-e9c7-4b84-8983-ea3c5a62aa3b-config\") pod \"dnsmasq-dns-757b4f8459-qrkmh\" (UID: \"08918bc4-e9c7-4b84-8983-ea3c5a62aa3b\") " pod="openstack/dnsmasq-dns-757b4f8459-qrkmh" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.982216 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/08918bc4-e9c7-4b84-8983-ea3c5a62aa3b-dns-svc\") pod \"dnsmasq-dns-757b4f8459-qrkmh\" (UID: \"08918bc4-e9c7-4b84-8983-ea3c5a62aa3b\") " pod="openstack/dnsmasq-dns-757b4f8459-qrkmh" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.982272 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/08918bc4-e9c7-4b84-8983-ea3c5a62aa3b-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-qrkmh\" (UID: \"08918bc4-e9c7-4b84-8983-ea3c5a62aa3b\") " pod="openstack/dnsmasq-dns-757b4f8459-qrkmh" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.982361 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/08918bc4-e9c7-4b84-8983-ea3c5a62aa3b-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-qrkmh\" (UID: \"08918bc4-e9c7-4b84-8983-ea3c5a62aa3b\") " pod="openstack/dnsmasq-dns-757b4f8459-qrkmh" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.982388 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/08918bc4-e9c7-4b84-8983-ea3c5a62aa3b-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-qrkmh\" (UID: \"08918bc4-e9c7-4b84-8983-ea3c5a62aa3b\") " pod="openstack/dnsmasq-dns-757b4f8459-qrkmh" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.982417 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8gbx\" (UniqueName: \"kubernetes.io/projected/08918bc4-e9c7-4b84-8983-ea3c5a62aa3b-kube-api-access-k8gbx\") pod \"dnsmasq-dns-757b4f8459-qrkmh\" (UID: \"08918bc4-e9c7-4b84-8983-ea3c5a62aa3b\") " pod="openstack/dnsmasq-dns-757b4f8459-qrkmh" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.983551 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/08918bc4-e9c7-4b84-8983-ea3c5a62aa3b-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-qrkmh\" (UID: \"08918bc4-e9c7-4b84-8983-ea3c5a62aa3b\") " pod="openstack/dnsmasq-dns-757b4f8459-qrkmh" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.983766 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/08918bc4-e9c7-4b84-8983-ea3c5a62aa3b-dns-svc\") pod \"dnsmasq-dns-757b4f8459-qrkmh\" (UID: \"08918bc4-e9c7-4b84-8983-ea3c5a62aa3b\") " pod="openstack/dnsmasq-dns-757b4f8459-qrkmh" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.984398 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/08918bc4-e9c7-4b84-8983-ea3c5a62aa3b-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-qrkmh\" (UID: \"08918bc4-e9c7-4b84-8983-ea3c5a62aa3b\") " pod="openstack/dnsmasq-dns-757b4f8459-qrkmh" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.984633 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/08918bc4-e9c7-4b84-8983-ea3c5a62aa3b-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-qrkmh\" (UID: \"08918bc4-e9c7-4b84-8983-ea3c5a62aa3b\") " pod="openstack/dnsmasq-dns-757b4f8459-qrkmh" Oct 07 12:44:13 crc kubenswrapper[4854]: I1007 12:44:13.985076 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08918bc4-e9c7-4b84-8983-ea3c5a62aa3b-config\") pod \"dnsmasq-dns-757b4f8459-qrkmh\" (UID: \"08918bc4-e9c7-4b84-8983-ea3c5a62aa3b\") " pod="openstack/dnsmasq-dns-757b4f8459-qrkmh" Oct 07 12:44:14 crc kubenswrapper[4854]: I1007 12:44:14.007591 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8gbx\" (UniqueName: \"kubernetes.io/projected/08918bc4-e9c7-4b84-8983-ea3c5a62aa3b-kube-api-access-k8gbx\") pod \"dnsmasq-dns-757b4f8459-qrkmh\" (UID: \"08918bc4-e9c7-4b84-8983-ea3c5a62aa3b\") " pod="openstack/dnsmasq-dns-757b4f8459-qrkmh" Oct 07 12:44:14 crc kubenswrapper[4854]: I1007 12:44:14.031213 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 12:44:14 crc kubenswrapper[4854]: I1007 12:44:14.090316 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-qrkmh" Oct 07 12:44:14 crc kubenswrapper[4854]: I1007 12:44:14.180616 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-phsdv"] Oct 07 12:44:14 crc kubenswrapper[4854]: I1007 12:44:14.258698 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-phsdv" event={"ID":"563fb438-9713-4237-8ad3-16a4614cbcfd","Type":"ContainerStarted","Data":"f9b2307e416c7eb16bdda55cdd71b91683f79d1d349eb4acfd30fa09f2b61544"} Oct 07 12:44:14 crc kubenswrapper[4854]: I1007 12:44:14.304701 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zpl49"] Oct 07 12:44:14 crc kubenswrapper[4854]: I1007 12:44:14.306069 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-zpl49" Oct 07 12:44:14 crc kubenswrapper[4854]: I1007 12:44:14.308804 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Oct 07 12:44:14 crc kubenswrapper[4854]: I1007 12:44:14.308885 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 07 12:44:14 crc kubenswrapper[4854]: I1007 12:44:14.311192 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zpl49"] Oct 07 12:44:14 crc kubenswrapper[4854]: I1007 12:44:14.364979 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 12:44:14 crc kubenswrapper[4854]: W1007 12:44:14.377004 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda2062490_ee95_46b5_9f6c_20ecd2ab4003.slice/crio-ca58e32c17a75fe6731625375facdf27bde90fd9e790abe4a71addb31f882239 WatchSource:0}: Error finding container ca58e32c17a75fe6731625375facdf27bde90fd9e790abe4a71addb31f882239: Status 404 returned error can't find the container with id ca58e32c17a75fe6731625375facdf27bde90fd9e790abe4a71addb31f882239 Oct 07 12:44:14 crc kubenswrapper[4854]: I1007 12:44:14.490741 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00fa0fe0-f7e2-4049-9158-248e0d3f1ea6-scripts\") pod \"nova-cell1-conductor-db-sync-zpl49\" (UID: \"00fa0fe0-f7e2-4049-9158-248e0d3f1ea6\") " pod="openstack/nova-cell1-conductor-db-sync-zpl49" Oct 07 12:44:14 crc kubenswrapper[4854]: I1007 12:44:14.490826 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdlq7\" (UniqueName: \"kubernetes.io/projected/00fa0fe0-f7e2-4049-9158-248e0d3f1ea6-kube-api-access-vdlq7\") pod \"nova-cell1-conductor-db-sync-zpl49\" (UID: \"00fa0fe0-f7e2-4049-9158-248e0d3f1ea6\") " pod="openstack/nova-cell1-conductor-db-sync-zpl49" Oct 07 12:44:14 crc kubenswrapper[4854]: I1007 12:44:14.490846 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00fa0fe0-f7e2-4049-9158-248e0d3f1ea6-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-zpl49\" (UID: \"00fa0fe0-f7e2-4049-9158-248e0d3f1ea6\") " pod="openstack/nova-cell1-conductor-db-sync-zpl49" Oct 07 12:44:14 crc kubenswrapper[4854]: I1007 12:44:14.490878 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00fa0fe0-f7e2-4049-9158-248e0d3f1ea6-config-data\") pod \"nova-cell1-conductor-db-sync-zpl49\" (UID: \"00fa0fe0-f7e2-4049-9158-248e0d3f1ea6\") " pod="openstack/nova-cell1-conductor-db-sync-zpl49" Oct 07 12:44:14 crc kubenswrapper[4854]: I1007 12:44:14.491447 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 12:44:14 crc kubenswrapper[4854]: I1007 12:44:14.532611 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 07 12:44:14 crc kubenswrapper[4854]: I1007 12:44:14.592953 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdlq7\" (UniqueName: \"kubernetes.io/projected/00fa0fe0-f7e2-4049-9158-248e0d3f1ea6-kube-api-access-vdlq7\") pod 
\"nova-cell1-conductor-db-sync-zpl49\" (UID: \"00fa0fe0-f7e2-4049-9158-248e0d3f1ea6\") " pod="openstack/nova-cell1-conductor-db-sync-zpl49" Oct 07 12:44:14 crc kubenswrapper[4854]: I1007 12:44:14.593295 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00fa0fe0-f7e2-4049-9158-248e0d3f1ea6-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-zpl49\" (UID: \"00fa0fe0-f7e2-4049-9158-248e0d3f1ea6\") " pod="openstack/nova-cell1-conductor-db-sync-zpl49" Oct 07 12:44:14 crc kubenswrapper[4854]: I1007 12:44:14.593375 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00fa0fe0-f7e2-4049-9158-248e0d3f1ea6-config-data\") pod \"nova-cell1-conductor-db-sync-zpl49\" (UID: \"00fa0fe0-f7e2-4049-9158-248e0d3f1ea6\") " pod="openstack/nova-cell1-conductor-db-sync-zpl49" Oct 07 12:44:14 crc kubenswrapper[4854]: I1007 12:44:14.593495 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00fa0fe0-f7e2-4049-9158-248e0d3f1ea6-scripts\") pod \"nova-cell1-conductor-db-sync-zpl49\" (UID: \"00fa0fe0-f7e2-4049-9158-248e0d3f1ea6\") " pod="openstack/nova-cell1-conductor-db-sync-zpl49" Oct 07 12:44:14 crc kubenswrapper[4854]: I1007 12:44:14.605951 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00fa0fe0-f7e2-4049-9158-248e0d3f1ea6-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-zpl49\" (UID: \"00fa0fe0-f7e2-4049-9158-248e0d3f1ea6\") " pod="openstack/nova-cell1-conductor-db-sync-zpl49" Oct 07 12:44:14 crc kubenswrapper[4854]: I1007 12:44:14.606010 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00fa0fe0-f7e2-4049-9158-248e0d3f1ea6-config-data\") pod \"nova-cell1-conductor-db-sync-zpl49\" (UID: \"00fa0fe0-f7e2-4049-9158-248e0d3f1ea6\") " pod="openstack/nova-cell1-conductor-db-sync-zpl49" Oct 07 12:44:14 crc kubenswrapper[4854]: I1007 12:44:14.605968 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00fa0fe0-f7e2-4049-9158-248e0d3f1ea6-scripts\") pod \"nova-cell1-conductor-db-sync-zpl49\" (UID: \"00fa0fe0-f7e2-4049-9158-248e0d3f1ea6\") " pod="openstack/nova-cell1-conductor-db-sync-zpl49" Oct 07 12:44:14 crc kubenswrapper[4854]: I1007 12:44:14.620037 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdlq7\" (UniqueName: \"kubernetes.io/projected/00fa0fe0-f7e2-4049-9158-248e0d3f1ea6-kube-api-access-vdlq7\") pod \"nova-cell1-conductor-db-sync-zpl49\" (UID: \"00fa0fe0-f7e2-4049-9158-248e0d3f1ea6\") " pod="openstack/nova-cell1-conductor-db-sync-zpl49" Oct 07 12:44:14 crc kubenswrapper[4854]: I1007 12:44:14.701016 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-zpl49" Oct 07 12:44:14 crc kubenswrapper[4854]: I1007 12:44:14.784990 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-qrkmh"] Oct 07 12:44:14 crc kubenswrapper[4854]: I1007 12:44:14.848496 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 12:44:15 crc kubenswrapper[4854]: I1007 12:44:15.185345 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zpl49"] Oct 07 12:44:15 crc kubenswrapper[4854]: I1007 12:44:15.272424 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-phsdv" event={"ID":"563fb438-9713-4237-8ad3-16a4614cbcfd","Type":"ContainerStarted","Data":"f08094d673885fd0314ed4b83c9aab5d37e0546bd66ad1d1040044725431047a"} Oct 07 12:44:15 crc kubenswrapper[4854]: I1007 12:44:15.275523 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b","Type":"ContainerStarted","Data":"a93fa624873aaa45ac7118549105ef118962481217284a6474d2ecb5be505cc4"} Oct 07 12:44:15 crc kubenswrapper[4854]: I1007 12:44:15.278974 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2b7a0f72-0c95-44aa-86bb-7e4e090e6cf6","Type":"ContainerStarted","Data":"a55a0a22c791de5aa769b30af6c52362bf7aa370ab1b7e3dd85f11cab3af6818"} Oct 07 12:44:15 crc kubenswrapper[4854]: I1007 12:44:15.281461 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a2062490-ee95-46b5-9f6c-20ecd2ab4003","Type":"ContainerStarted","Data":"ca58e32c17a75fe6731625375facdf27bde90fd9e790abe4a71addb31f882239"} Oct 07 12:44:15 crc kubenswrapper[4854]: I1007 12:44:15.282798 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-zpl49" event={"ID":"00fa0fe0-f7e2-4049-9158-248e0d3f1ea6","Type":"ContainerStarted","Data":"84e861d253b4356ccbd0ef47957e6426f339f97699dbbf5b6b70f1c706b6eea4"} Oct 07 12:44:15 crc kubenswrapper[4854]: I1007 12:44:15.284386 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"821711bc-a957-460f-8614-5cb1d5ed1629","Type":"ContainerStarted","Data":"4d619af198ba3526c352a486008ed1840cfc695c8b1127ad3b576c0e7d22377c"} Oct 07 12:44:15 crc kubenswrapper[4854]: I1007 12:44:15.292416 4854 generic.go:334] "Generic (PLEG): container finished" podID="08918bc4-e9c7-4b84-8983-ea3c5a62aa3b" containerID="a201fe1b30730ebacc80338598ad0e731d64d92665cd87dafb32a0d8fee2a829" exitCode=0 Oct 07 12:44:15 crc kubenswrapper[4854]: I1007 12:44:15.292476 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-qrkmh" event={"ID":"08918bc4-e9c7-4b84-8983-ea3c5a62aa3b","Type":"ContainerDied","Data":"a201fe1b30730ebacc80338598ad0e731d64d92665cd87dafb32a0d8fee2a829"} Oct 07 12:44:15 crc kubenswrapper[4854]: I1007 12:44:15.292508 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-qrkmh" event={"ID":"08918bc4-e9c7-4b84-8983-ea3c5a62aa3b","Type":"ContainerStarted","Data":"8461a64008afbb098d179ebbd8511acd49c252c0dd6c91c1a66d9df5c6d89aa3"} Oct 07 12:44:15 crc kubenswrapper[4854]: I1007 12:44:15.295596 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-phsdv" podStartSLOduration=2.29557898 podStartE2EDuration="2.29557898s" 
podCreationTimestamp="2025-10-07 12:44:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:44:15.287193944 +0000 UTC m=+1171.275026199" watchObservedRunningTime="2025-10-07 12:44:15.29557898 +0000 UTC m=+1171.283411235" Oct 07 12:44:15 crc kubenswrapper[4854]: I1007 12:44:15.422487 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 07 12:44:16 crc kubenswrapper[4854]: I1007 12:44:16.308761 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-zpl49" event={"ID":"00fa0fe0-f7e2-4049-9158-248e0d3f1ea6","Type":"ContainerStarted","Data":"b3abd6b1e0a9375f23c25035816e904a52eb9a5e4ddd0a5fa6d460553c77a7c4"} Oct 07 12:44:16 crc kubenswrapper[4854]: I1007 12:44:16.311391 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-qrkmh" event={"ID":"08918bc4-e9c7-4b84-8983-ea3c5a62aa3b","Type":"ContainerStarted","Data":"bd5c499f0e3e8753c745d12687bdb59d32b4860f35ea257290cecbb1a4600428"} Oct 07 12:44:16 crc kubenswrapper[4854]: I1007 12:44:16.311818 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-757b4f8459-qrkmh" Oct 07 12:44:16 crc kubenswrapper[4854]: I1007 12:44:16.360929 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-zpl49" podStartSLOduration=2.360907389 podStartE2EDuration="2.360907389s" podCreationTimestamp="2025-10-07 12:44:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:44:16.324423301 +0000 UTC m=+1172.312255576" watchObservedRunningTime="2025-10-07 12:44:16.360907389 +0000 UTC m=+1172.348739644" Oct 07 12:44:16 crc kubenswrapper[4854]: I1007 12:44:16.373521 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-757b4f8459-qrkmh" podStartSLOduration=3.373502168 podStartE2EDuration="3.373502168s" podCreationTimestamp="2025-10-07 12:44:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:44:16.342554992 +0000 UTC m=+1172.330387257" watchObservedRunningTime="2025-10-07 12:44:16.373502168 +0000 UTC m=+1172.361334423" Oct 07 12:44:16 crc kubenswrapper[4854]: I1007 12:44:16.924081 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 12:44:16 crc kubenswrapper[4854]: I1007 12:44:16.968888 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 12:44:18 crc kubenswrapper[4854]: I1007 12:44:18.332573 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2b7a0f72-0c95-44aa-86bb-7e4e090e6cf6","Type":"ContainerStarted","Data":"47f68e491ac5a34fe7785c9f6d6c6ea758453bb4929da41a0de88f06a94d27ad"} Oct 07 12:44:18 crc kubenswrapper[4854]: I1007 12:44:18.334940 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a2062490-ee95-46b5-9f6c-20ecd2ab4003","Type":"ContainerStarted","Data":"3318ca888028ea7d06858c1fbf4206179181c704acdcc3a3ddadfaea72e5b3a0"} Oct 07 12:44:18 crc kubenswrapper[4854]: I1007 12:44:18.335086 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" 
podUID="a2062490-ee95-46b5-9f6c-20ecd2ab4003" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://3318ca888028ea7d06858c1fbf4206179181c704acdcc3a3ddadfaea72e5b3a0" gracePeriod=30 Oct 07 12:44:18 crc kubenswrapper[4854]: I1007 12:44:18.338245 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"821711bc-a957-460f-8614-5cb1d5ed1629","Type":"ContainerStarted","Data":"c5818a9a203876d0a745e806000f69d03a43b6347b0d2635aeb9d78c60fc9348"} Oct 07 12:44:18 crc kubenswrapper[4854]: I1007 12:44:18.338294 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"821711bc-a957-460f-8614-5cb1d5ed1629","Type":"ContainerStarted","Data":"f42b18664bd90b68979032118d734abfa9226c23ca07312ac6c6b1f0bbb02aba"} Oct 07 12:44:18 crc kubenswrapper[4854]: I1007 12:44:18.342478 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b","Type":"ContainerStarted","Data":"ed5fef242c07e5006a5c9e021bb5ac1cf15a88bbaf3824b5c1abeec3b9cc3490"} Oct 07 12:44:18 crc kubenswrapper[4854]: I1007 12:44:18.342527 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b","Type":"ContainerStarted","Data":"503b644b1615c7eaf8b54360f22e93631c08812f2633b92cb50fd767300a0958"} Oct 07 12:44:18 crc kubenswrapper[4854]: I1007 12:44:18.342643 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b" containerName="nova-metadata-log" containerID="cri-o://503b644b1615c7eaf8b54360f22e93631c08812f2633b92cb50fd767300a0958" gracePeriod=30 Oct 07 12:44:18 crc kubenswrapper[4854]: I1007 12:44:18.343112 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b" containerName="nova-metadata-metadata" containerID="cri-o://ed5fef242c07e5006a5c9e021bb5ac1cf15a88bbaf3824b5c1abeec3b9cc3490" gracePeriod=30 Oct 07 12:44:18 crc kubenswrapper[4854]: I1007 12:44:18.352838 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.394766764 podStartE2EDuration="5.352821425s" podCreationTimestamp="2025-10-07 12:44:13 +0000 UTC" firstStartedPulling="2025-10-07 12:44:14.580881827 +0000 UTC m=+1170.568714082" lastFinishedPulling="2025-10-07 12:44:17.538936488 +0000 UTC m=+1173.526768743" observedRunningTime="2025-10-07 12:44:18.348819118 +0000 UTC m=+1174.336651373" watchObservedRunningTime="2025-10-07 12:44:18.352821425 +0000 UTC m=+1174.340653680" Oct 07 12:44:18 crc kubenswrapper[4854]: I1007 12:44:18.371123 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.412487533 podStartE2EDuration="5.37110454s" podCreationTimestamp="2025-10-07 12:44:13 +0000 UTC" firstStartedPulling="2025-10-07 12:44:14.582057392 +0000 UTC m=+1170.569889647" lastFinishedPulling="2025-10-07 12:44:17.540674409 +0000 UTC m=+1173.528506654" observedRunningTime="2025-10-07 12:44:18.363852698 +0000 UTC m=+1174.351684953" watchObservedRunningTime="2025-10-07 12:44:18.37110454 +0000 UTC m=+1174.358936795" Oct 07 12:44:18 crc kubenswrapper[4854]: I1007 12:44:18.397009 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.244349561 
podStartE2EDuration="5.396986868s" podCreationTimestamp="2025-10-07 12:44:13 +0000 UTC" firstStartedPulling="2025-10-07 12:44:14.387406913 +0000 UTC m=+1170.375239168" lastFinishedPulling="2025-10-07 12:44:17.54004422 +0000 UTC m=+1173.527876475" observedRunningTime="2025-10-07 12:44:18.38851614 +0000 UTC m=+1174.376348395" watchObservedRunningTime="2025-10-07 12:44:18.396986868 +0000 UTC m=+1174.384819123" Oct 07 12:44:18 crc kubenswrapper[4854]: I1007 12:44:18.415759 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.662065749 podStartE2EDuration="5.415740977s" podCreationTimestamp="2025-10-07 12:44:13 +0000 UTC" firstStartedPulling="2025-10-07 12:44:14.799857997 +0000 UTC m=+1170.787690252" lastFinishedPulling="2025-10-07 12:44:17.553533185 +0000 UTC m=+1173.541365480" observedRunningTime="2025-10-07 12:44:18.414616984 +0000 UTC m=+1174.402449239" watchObservedRunningTime="2025-10-07 12:44:18.415740977 +0000 UTC m=+1174.403573232" Oct 07 12:44:18 crc kubenswrapper[4854]: I1007 12:44:18.719290 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 07 12:44:18 crc kubenswrapper[4854]: I1007 12:44:18.878365 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 07 12:44:19 crc kubenswrapper[4854]: I1007 12:44:19.032348 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 07 12:44:19 crc kubenswrapper[4854]: I1007 12:44:19.032408 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 07 12:44:19 crc kubenswrapper[4854]: I1007 12:44:19.354345 4854 generic.go:334] "Generic (PLEG): container finished" podID="f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b" containerID="503b644b1615c7eaf8b54360f22e93631c08812f2633b92cb50fd767300a0958" exitCode=143 Oct 07 12:44:19 crc kubenswrapper[4854]: I1007 12:44:19.354396 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b","Type":"ContainerDied","Data":"503b644b1615c7eaf8b54360f22e93631c08812f2633b92cb50fd767300a0958"} Oct 07 12:44:22 crc kubenswrapper[4854]: I1007 12:44:22.385319 4854 generic.go:334] "Generic (PLEG): container finished" podID="563fb438-9713-4237-8ad3-16a4614cbcfd" containerID="f08094d673885fd0314ed4b83c9aab5d37e0546bd66ad1d1040044725431047a" exitCode=0 Oct 07 12:44:22 crc kubenswrapper[4854]: I1007 12:44:22.385410 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-phsdv" event={"ID":"563fb438-9713-4237-8ad3-16a4614cbcfd","Type":"ContainerDied","Data":"f08094d673885fd0314ed4b83c9aab5d37e0546bd66ad1d1040044725431047a"} Oct 07 12:44:23 crc kubenswrapper[4854]: I1007 12:44:23.401215 4854 generic.go:334] "Generic (PLEG): container finished" podID="00fa0fe0-f7e2-4049-9158-248e0d3f1ea6" containerID="b3abd6b1e0a9375f23c25035816e904a52eb9a5e4ddd0a5fa6d460553c77a7c4" exitCode=0 Oct 07 12:44:23 crc kubenswrapper[4854]: I1007 12:44:23.401319 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-zpl49" event={"ID":"00fa0fe0-f7e2-4049-9158-248e0d3f1ea6","Type":"ContainerDied","Data":"b3abd6b1e0a9375f23c25035816e904a52eb9a5e4ddd0a5fa6d460553c77a7c4"} Oct 07 12:44:23 crc kubenswrapper[4854]: I1007 12:44:23.864155 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" 
Oct 07 12:44:23 crc kubenswrapper[4854]: I1007 12:44:23.864207 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 07 12:44:23 crc kubenswrapper[4854]: I1007 12:44:23.878016 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 07 12:44:23 crc kubenswrapper[4854]: I1007 12:44:23.879234 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-phsdv" Oct 07 12:44:23 crc kubenswrapper[4854]: I1007 12:44:23.924590 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 07 12:44:24 crc kubenswrapper[4854]: I1007 12:44:24.019913 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mh5vr\" (UniqueName: \"kubernetes.io/projected/563fb438-9713-4237-8ad3-16a4614cbcfd-kube-api-access-mh5vr\") pod \"563fb438-9713-4237-8ad3-16a4614cbcfd\" (UID: \"563fb438-9713-4237-8ad3-16a4614cbcfd\") " Oct 07 12:44:24 crc kubenswrapper[4854]: I1007 12:44:24.020243 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/563fb438-9713-4237-8ad3-16a4614cbcfd-config-data\") pod \"563fb438-9713-4237-8ad3-16a4614cbcfd\" (UID: \"563fb438-9713-4237-8ad3-16a4614cbcfd\") " Oct 07 12:44:24 crc kubenswrapper[4854]: I1007 12:44:24.020287 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/563fb438-9713-4237-8ad3-16a4614cbcfd-scripts\") pod \"563fb438-9713-4237-8ad3-16a4614cbcfd\" (UID: \"563fb438-9713-4237-8ad3-16a4614cbcfd\") " Oct 07 12:44:24 crc kubenswrapper[4854]: I1007 12:44:24.020417 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/563fb438-9713-4237-8ad3-16a4614cbcfd-combined-ca-bundle\") pod \"563fb438-9713-4237-8ad3-16a4614cbcfd\" (UID: \"563fb438-9713-4237-8ad3-16a4614cbcfd\") " Oct 07 12:44:24 crc kubenswrapper[4854]: I1007 12:44:24.025836 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/563fb438-9713-4237-8ad3-16a4614cbcfd-scripts" (OuterVolumeSpecName: "scripts") pod "563fb438-9713-4237-8ad3-16a4614cbcfd" (UID: "563fb438-9713-4237-8ad3-16a4614cbcfd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:44:24 crc kubenswrapper[4854]: I1007 12:44:24.026440 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/563fb438-9713-4237-8ad3-16a4614cbcfd-kube-api-access-mh5vr" (OuterVolumeSpecName: "kube-api-access-mh5vr") pod "563fb438-9713-4237-8ad3-16a4614cbcfd" (UID: "563fb438-9713-4237-8ad3-16a4614cbcfd"). InnerVolumeSpecName "kube-api-access-mh5vr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:44:24 crc kubenswrapper[4854]: I1007 12:44:24.063839 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/563fb438-9713-4237-8ad3-16a4614cbcfd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "563fb438-9713-4237-8ad3-16a4614cbcfd" (UID: "563fb438-9713-4237-8ad3-16a4614cbcfd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:44:24 crc kubenswrapper[4854]: I1007 12:44:24.063890 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/563fb438-9713-4237-8ad3-16a4614cbcfd-config-data" (OuterVolumeSpecName: "config-data") pod "563fb438-9713-4237-8ad3-16a4614cbcfd" (UID: "563fb438-9713-4237-8ad3-16a4614cbcfd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:44:24 crc kubenswrapper[4854]: I1007 12:44:24.092269 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-757b4f8459-qrkmh" Oct 07 12:44:24 crc kubenswrapper[4854]: I1007 12:44:24.122580 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/563fb438-9713-4237-8ad3-16a4614cbcfd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:44:24 crc kubenswrapper[4854]: I1007 12:44:24.122619 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mh5vr\" (UniqueName: \"kubernetes.io/projected/563fb438-9713-4237-8ad3-16a4614cbcfd-kube-api-access-mh5vr\") on node \"crc\" DevicePath \"\"" Oct 07 12:44:24 crc kubenswrapper[4854]: I1007 12:44:24.122635 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/563fb438-9713-4237-8ad3-16a4614cbcfd-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 12:44:24 crc kubenswrapper[4854]: I1007 12:44:24.122645 4854 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/563fb438-9713-4237-8ad3-16a4614cbcfd-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 12:44:24 crc kubenswrapper[4854]: I1007 12:44:24.172816 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-wjtgg"] Oct 07 12:44:24 crc kubenswrapper[4854]: I1007 12:44:24.173196 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-wjtgg" podUID="3d29bd97-53f9-4154-8712-c564d07a07e0" containerName="dnsmasq-dns" containerID="cri-o://6011f6e5c82aac945460eaf9b07dd392036697ba6d3a9396036ef971f5d74aa5" gracePeriod=10 Oct 07 12:44:24 crc kubenswrapper[4854]: I1007 12:44:24.414684 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-phsdv" event={"ID":"563fb438-9713-4237-8ad3-16a4614cbcfd","Type":"ContainerDied","Data":"f9b2307e416c7eb16bdda55cdd71b91683f79d1d349eb4acfd30fa09f2b61544"} Oct 07 12:44:24 crc kubenswrapper[4854]: I1007 12:44:24.414744 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9b2307e416c7eb16bdda55cdd71b91683f79d1d349eb4acfd30fa09f2b61544" Oct 07 12:44:24 crc kubenswrapper[4854]: I1007 12:44:24.415464 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-phsdv" Oct 07 12:44:24 crc kubenswrapper[4854]: I1007 12:44:24.424927 4854 generic.go:334] "Generic (PLEG): container finished" podID="3d29bd97-53f9-4154-8712-c564d07a07e0" containerID="6011f6e5c82aac945460eaf9b07dd392036697ba6d3a9396036ef971f5d74aa5" exitCode=0 Oct 07 12:44:24 crc kubenswrapper[4854]: I1007 12:44:24.425010 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-wjtgg" event={"ID":"3d29bd97-53f9-4154-8712-c564d07a07e0","Type":"ContainerDied","Data":"6011f6e5c82aac945460eaf9b07dd392036697ba6d3a9396036ef971f5d74aa5"} Oct 07 12:44:24 crc kubenswrapper[4854]: I1007 12:44:24.477968 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 07 12:44:24 crc kubenswrapper[4854]: I1007 12:44:24.608316 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 07 12:44:24 crc kubenswrapper[4854]: I1007 12:44:24.609093 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="821711bc-a957-460f-8614-5cb1d5ed1629" containerName="nova-api-api" containerID="cri-o://c5818a9a203876d0a745e806000f69d03a43b6347b0d2635aeb9d78c60fc9348" gracePeriod=30 Oct 07 12:44:24 crc kubenswrapper[4854]: I1007 12:44:24.608638 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="821711bc-a957-460f-8614-5cb1d5ed1629" containerName="nova-api-log" containerID="cri-o://f42b18664bd90b68979032118d734abfa9226c23ca07312ac6c6b1f0bbb02aba" gracePeriod=30 Oct 07 12:44:24 crc kubenswrapper[4854]: I1007 12:44:24.619448 4854 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="821711bc-a957-460f-8614-5cb1d5ed1629" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.183:8774/\": EOF" Oct 07 12:44:24 crc kubenswrapper[4854]: I1007 12:44:24.663336 4854 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="821711bc-a957-460f-8614-5cb1d5ed1629" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.183:8774/\": EOF" Oct 07 12:44:24 crc kubenswrapper[4854]: I1007 12:44:24.887573 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-wjtgg" Oct 07 12:44:24 crc kubenswrapper[4854]: I1007 12:44:24.957401 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3d29bd97-53f9-4154-8712-c564d07a07e0-dns-swift-storage-0\") pod \"3d29bd97-53f9-4154-8712-c564d07a07e0\" (UID: \"3d29bd97-53f9-4154-8712-c564d07a07e0\") " Oct 07 12:44:24 crc kubenswrapper[4854]: I1007 12:44:24.957461 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3d29bd97-53f9-4154-8712-c564d07a07e0-ovsdbserver-nb\") pod \"3d29bd97-53f9-4154-8712-c564d07a07e0\" (UID: \"3d29bd97-53f9-4154-8712-c564d07a07e0\") " Oct 07 12:44:24 crc kubenswrapper[4854]: I1007 12:44:24.957481 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6pxtf\" (UniqueName: \"kubernetes.io/projected/3d29bd97-53f9-4154-8712-c564d07a07e0-kube-api-access-6pxtf\") pod \"3d29bd97-53f9-4154-8712-c564d07a07e0\" (UID: \"3d29bd97-53f9-4154-8712-c564d07a07e0\") " Oct 07 12:44:24 crc kubenswrapper[4854]: I1007 12:44:24.957505 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3d29bd97-53f9-4154-8712-c564d07a07e0-ovsdbserver-sb\") pod \"3d29bd97-53f9-4154-8712-c564d07a07e0\" (UID: \"3d29bd97-53f9-4154-8712-c564d07a07e0\") " Oct 07 12:44:24 crc kubenswrapper[4854]: I1007 12:44:24.957538 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d29bd97-53f9-4154-8712-c564d07a07e0-config\") pod \"3d29bd97-53f9-4154-8712-c564d07a07e0\" (UID: \"3d29bd97-53f9-4154-8712-c564d07a07e0\") " Oct 07 12:44:24 crc kubenswrapper[4854]: I1007 12:44:24.957591 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d29bd97-53f9-4154-8712-c564d07a07e0-dns-svc\") pod \"3d29bd97-53f9-4154-8712-c564d07a07e0\" (UID: \"3d29bd97-53f9-4154-8712-c564d07a07e0\") " Oct 07 12:44:24 crc kubenswrapper[4854]: I1007 12:44:24.979496 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d29bd97-53f9-4154-8712-c564d07a07e0-kube-api-access-6pxtf" (OuterVolumeSpecName: "kube-api-access-6pxtf") pod "3d29bd97-53f9-4154-8712-c564d07a07e0" (UID: "3d29bd97-53f9-4154-8712-c564d07a07e0"). InnerVolumeSpecName "kube-api-access-6pxtf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:44:25 crc kubenswrapper[4854]: I1007 12:44:25.013354 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d29bd97-53f9-4154-8712-c564d07a07e0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3d29bd97-53f9-4154-8712-c564d07a07e0" (UID: "3d29bd97-53f9-4154-8712-c564d07a07e0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:44:25 crc kubenswrapper[4854]: I1007 12:44:25.032067 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d29bd97-53f9-4154-8712-c564d07a07e0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3d29bd97-53f9-4154-8712-c564d07a07e0" (UID: "3d29bd97-53f9-4154-8712-c564d07a07e0"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:44:25 crc kubenswrapper[4854]: I1007 12:44:25.052029 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d29bd97-53f9-4154-8712-c564d07a07e0-config" (OuterVolumeSpecName: "config") pod "3d29bd97-53f9-4154-8712-c564d07a07e0" (UID: "3d29bd97-53f9-4154-8712-c564d07a07e0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:44:25 crc kubenswrapper[4854]: I1007 12:44:25.060051 4854 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3d29bd97-53f9-4154-8712-c564d07a07e0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 12:44:25 crc kubenswrapper[4854]: I1007 12:44:25.060091 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6pxtf\" (UniqueName: \"kubernetes.io/projected/3d29bd97-53f9-4154-8712-c564d07a07e0-kube-api-access-6pxtf\") on node \"crc\" DevicePath \"\"" Oct 07 12:44:25 crc kubenswrapper[4854]: I1007 12:44:25.060103 4854 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3d29bd97-53f9-4154-8712-c564d07a07e0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 12:44:25 crc kubenswrapper[4854]: I1007 12:44:25.060115 4854 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d29bd97-53f9-4154-8712-c564d07a07e0-config\") on node \"crc\" DevicePath \"\"" Oct 07 12:44:25 crc kubenswrapper[4854]: I1007 12:44:25.060410 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d29bd97-53f9-4154-8712-c564d07a07e0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3d29bd97-53f9-4154-8712-c564d07a07e0" (UID: "3d29bd97-53f9-4154-8712-c564d07a07e0"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:44:25 crc kubenswrapper[4854]: I1007 12:44:25.082516 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-zpl49" Oct 07 12:44:25 crc kubenswrapper[4854]: I1007 12:44:25.093525 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d29bd97-53f9-4154-8712-c564d07a07e0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3d29bd97-53f9-4154-8712-c564d07a07e0" (UID: "3d29bd97-53f9-4154-8712-c564d07a07e0"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:44:25 crc kubenswrapper[4854]: I1007 12:44:25.097573 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 12:44:25 crc kubenswrapper[4854]: I1007 12:44:25.161448 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00fa0fe0-f7e2-4049-9158-248e0d3f1ea6-config-data\") pod \"00fa0fe0-f7e2-4049-9158-248e0d3f1ea6\" (UID: \"00fa0fe0-f7e2-4049-9158-248e0d3f1ea6\") " Oct 07 12:44:25 crc kubenswrapper[4854]: I1007 12:44:25.161514 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00fa0fe0-f7e2-4049-9158-248e0d3f1ea6-scripts\") pod \"00fa0fe0-f7e2-4049-9158-248e0d3f1ea6\" (UID: \"00fa0fe0-f7e2-4049-9158-248e0d3f1ea6\") " Oct 07 12:44:25 crc kubenswrapper[4854]: I1007 12:44:25.161598 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdlq7\" (UniqueName: \"kubernetes.io/projected/00fa0fe0-f7e2-4049-9158-248e0d3f1ea6-kube-api-access-vdlq7\") pod \"00fa0fe0-f7e2-4049-9158-248e0d3f1ea6\" (UID: \"00fa0fe0-f7e2-4049-9158-248e0d3f1ea6\") " Oct 07 12:44:25 crc kubenswrapper[4854]: I1007 12:44:25.161675 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00fa0fe0-f7e2-4049-9158-248e0d3f1ea6-combined-ca-bundle\") pod \"00fa0fe0-f7e2-4049-9158-248e0d3f1ea6\" (UID: \"00fa0fe0-f7e2-4049-9158-248e0d3f1ea6\") " Oct 07 12:44:25 crc kubenswrapper[4854]: I1007 12:44:25.162055 4854 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3d29bd97-53f9-4154-8712-c564d07a07e0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 07 12:44:25 crc kubenswrapper[4854]: I1007 12:44:25.162071 4854 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d29bd97-53f9-4154-8712-c564d07a07e0-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 12:44:25 crc kubenswrapper[4854]: I1007 12:44:25.166122 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00fa0fe0-f7e2-4049-9158-248e0d3f1ea6-kube-api-access-vdlq7" (OuterVolumeSpecName: "kube-api-access-vdlq7") pod "00fa0fe0-f7e2-4049-9158-248e0d3f1ea6" (UID: "00fa0fe0-f7e2-4049-9158-248e0d3f1ea6"). InnerVolumeSpecName "kube-api-access-vdlq7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:44:25 crc kubenswrapper[4854]: I1007 12:44:25.166135 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00fa0fe0-f7e2-4049-9158-248e0d3f1ea6-scripts" (OuterVolumeSpecName: "scripts") pod "00fa0fe0-f7e2-4049-9158-248e0d3f1ea6" (UID: "00fa0fe0-f7e2-4049-9158-248e0d3f1ea6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:44:25 crc kubenswrapper[4854]: I1007 12:44:25.197406 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00fa0fe0-f7e2-4049-9158-248e0d3f1ea6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "00fa0fe0-f7e2-4049-9158-248e0d3f1ea6" (UID: "00fa0fe0-f7e2-4049-9158-248e0d3f1ea6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:44:25 crc kubenswrapper[4854]: I1007 12:44:25.197746 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00fa0fe0-f7e2-4049-9158-248e0d3f1ea6-config-data" (OuterVolumeSpecName: "config-data") pod "00fa0fe0-f7e2-4049-9158-248e0d3f1ea6" (UID: "00fa0fe0-f7e2-4049-9158-248e0d3f1ea6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:44:25 crc kubenswrapper[4854]: I1007 12:44:25.264121 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00fa0fe0-f7e2-4049-9158-248e0d3f1ea6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:44:25 crc kubenswrapper[4854]: I1007 12:44:25.264187 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00fa0fe0-f7e2-4049-9158-248e0d3f1ea6-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 12:44:25 crc kubenswrapper[4854]: I1007 12:44:25.264200 4854 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00fa0fe0-f7e2-4049-9158-248e0d3f1ea6-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 12:44:25 crc kubenswrapper[4854]: I1007 12:44:25.264212 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdlq7\" (UniqueName: \"kubernetes.io/projected/00fa0fe0-f7e2-4049-9158-248e0d3f1ea6-kube-api-access-vdlq7\") on node \"crc\" DevicePath \"\"" Oct 07 12:44:25 crc kubenswrapper[4854]: I1007 12:44:25.437451 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-zpl49" event={"ID":"00fa0fe0-f7e2-4049-9158-248e0d3f1ea6","Type":"ContainerDied","Data":"84e861d253b4356ccbd0ef47957e6426f339f97699dbbf5b6b70f1c706b6eea4"} Oct 07 12:44:25 crc kubenswrapper[4854]: I1007 12:44:25.437508 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84e861d253b4356ccbd0ef47957e6426f339f97699dbbf5b6b70f1c706b6eea4" Oct 07 12:44:25 crc kubenswrapper[4854]: I1007 12:44:25.437473 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-zpl49" Oct 07 12:44:25 crc kubenswrapper[4854]: I1007 12:44:25.440090 4854 generic.go:334] "Generic (PLEG): container finished" podID="821711bc-a957-460f-8614-5cb1d5ed1629" containerID="f42b18664bd90b68979032118d734abfa9226c23ca07312ac6c6b1f0bbb02aba" exitCode=143 Oct 07 12:44:25 crc kubenswrapper[4854]: I1007 12:44:25.440226 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"821711bc-a957-460f-8614-5cb1d5ed1629","Type":"ContainerDied","Data":"f42b18664bd90b68979032118d734abfa9226c23ca07312ac6c6b1f0bbb02aba"} Oct 07 12:44:25 crc kubenswrapper[4854]: I1007 12:44:25.444057 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-wjtgg" Oct 07 12:44:25 crc kubenswrapper[4854]: I1007 12:44:25.447265 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-wjtgg" event={"ID":"3d29bd97-53f9-4154-8712-c564d07a07e0","Type":"ContainerDied","Data":"e7f09dfd7d37a98d0b037c431da83d6af53cdfed5c21d6ee01524c266f4fe0f0"} Oct 07 12:44:25 crc kubenswrapper[4854]: I1007 12:44:25.447355 4854 scope.go:117] "RemoveContainer" containerID="6011f6e5c82aac945460eaf9b07dd392036697ba6d3a9396036ef971f5d74aa5" Oct 07 12:44:25 crc kubenswrapper[4854]: I1007 12:44:25.485877 4854 scope.go:117] "RemoveContainer" containerID="13f9096214a06dfbf1d19f339d4fca37c68814b59579cab2f9d96d047bd885fa" Oct 07 12:44:25 crc kubenswrapper[4854]: I1007 12:44:25.521214 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-wjtgg"] Oct 07 12:44:25 crc kubenswrapper[4854]: I1007 12:44:25.534914 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-wjtgg"] Oct 07 12:44:25 crc kubenswrapper[4854]: I1007 12:44:25.543528 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 07 12:44:25 crc kubenswrapper[4854]: E1007 12:44:25.543959 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00fa0fe0-f7e2-4049-9158-248e0d3f1ea6" containerName="nova-cell1-conductor-db-sync" Oct 07 12:44:25 crc kubenswrapper[4854]: I1007 12:44:25.543977 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="00fa0fe0-f7e2-4049-9158-248e0d3f1ea6" containerName="nova-cell1-conductor-db-sync" Oct 07 12:44:25 crc kubenswrapper[4854]: E1007 12:44:25.543995 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d29bd97-53f9-4154-8712-c564d07a07e0" containerName="init" Oct 07 12:44:25 crc kubenswrapper[4854]: I1007 12:44:25.544003 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d29bd97-53f9-4154-8712-c564d07a07e0" containerName="init" Oct 07 12:44:25 crc kubenswrapper[4854]: E1007 12:44:25.544011 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d29bd97-53f9-4154-8712-c564d07a07e0" containerName="dnsmasq-dns" Oct 07 12:44:25 crc kubenswrapper[4854]: I1007 12:44:25.544018 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d29bd97-53f9-4154-8712-c564d07a07e0" containerName="dnsmasq-dns" Oct 07 12:44:25 crc kubenswrapper[4854]: E1007 12:44:25.544040 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="563fb438-9713-4237-8ad3-16a4614cbcfd" containerName="nova-manage" Oct 07 12:44:25 crc kubenswrapper[4854]: I1007 12:44:25.544049 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="563fb438-9713-4237-8ad3-16a4614cbcfd" containerName="nova-manage" Oct 07 12:44:25 crc kubenswrapper[4854]: I1007 12:44:25.544248 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="00fa0fe0-f7e2-4049-9158-248e0d3f1ea6" containerName="nova-cell1-conductor-db-sync" Oct 07 12:44:25 crc kubenswrapper[4854]: I1007 12:44:25.544269 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d29bd97-53f9-4154-8712-c564d07a07e0" containerName="dnsmasq-dns" Oct 07 12:44:25 crc kubenswrapper[4854]: I1007 12:44:25.544298 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="563fb438-9713-4237-8ad3-16a4614cbcfd" containerName="nova-manage" Oct 07 12:44:25 crc kubenswrapper[4854]: I1007 12:44:25.544927 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 07 12:44:25 crc kubenswrapper[4854]: I1007 12:44:25.548417 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 07 12:44:25 crc kubenswrapper[4854]: I1007 12:44:25.556115 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 07 12:44:25 crc kubenswrapper[4854]: I1007 12:44:25.571380 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhxlw\" (UniqueName: \"kubernetes.io/projected/cac73de2-996a-4e04-abde-1153b44058bc-kube-api-access-lhxlw\") pod \"nova-cell1-conductor-0\" (UID: \"cac73de2-996a-4e04-abde-1153b44058bc\") " pod="openstack/nova-cell1-conductor-0" Oct 07 12:44:25 crc kubenswrapper[4854]: I1007 12:44:25.571467 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cac73de2-996a-4e04-abde-1153b44058bc-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"cac73de2-996a-4e04-abde-1153b44058bc\") " pod="openstack/nova-cell1-conductor-0" Oct 07 12:44:25 crc kubenswrapper[4854]: I1007 12:44:25.571595 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cac73de2-996a-4e04-abde-1153b44058bc-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"cac73de2-996a-4e04-abde-1153b44058bc\") " pod="openstack/nova-cell1-conductor-0" Oct 07 12:44:25 crc kubenswrapper[4854]: I1007 12:44:25.672804 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cac73de2-996a-4e04-abde-1153b44058bc-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"cac73de2-996a-4e04-abde-1153b44058bc\") " pod="openstack/nova-cell1-conductor-0" Oct 07 12:44:25 crc kubenswrapper[4854]: I1007 12:44:25.672918 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cac73de2-996a-4e04-abde-1153b44058bc-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"cac73de2-996a-4e04-abde-1153b44058bc\") " pod="openstack/nova-cell1-conductor-0" Oct 07 12:44:25 crc kubenswrapper[4854]: I1007 12:44:25.673007 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhxlw\" (UniqueName: \"kubernetes.io/projected/cac73de2-996a-4e04-abde-1153b44058bc-kube-api-access-lhxlw\") pod \"nova-cell1-conductor-0\" (UID: \"cac73de2-996a-4e04-abde-1153b44058bc\") " pod="openstack/nova-cell1-conductor-0" Oct 07 12:44:25 crc kubenswrapper[4854]: I1007 12:44:25.688976 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cac73de2-996a-4e04-abde-1153b44058bc-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"cac73de2-996a-4e04-abde-1153b44058bc\") " pod="openstack/nova-cell1-conductor-0" Oct 07 12:44:25 crc kubenswrapper[4854]: I1007 12:44:25.692844 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cac73de2-996a-4e04-abde-1153b44058bc-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"cac73de2-996a-4e04-abde-1153b44058bc\") " pod="openstack/nova-cell1-conductor-0" Oct 07 12:44:25 crc kubenswrapper[4854]: I1007 12:44:25.699038 4854 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhxlw\" (UniqueName: \"kubernetes.io/projected/cac73de2-996a-4e04-abde-1153b44058bc-kube-api-access-lhxlw\") pod \"nova-cell1-conductor-0\" (UID: \"cac73de2-996a-4e04-abde-1153b44058bc\") " pod="openstack/nova-cell1-conductor-0" Oct 07 12:44:25 crc kubenswrapper[4854]: I1007 12:44:25.883548 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 07 12:44:26 crc kubenswrapper[4854]: I1007 12:44:26.357686 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 07 12:44:26 crc kubenswrapper[4854]: I1007 12:44:26.460040 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"cac73de2-996a-4e04-abde-1153b44058bc","Type":"ContainerStarted","Data":"e57d352e529b217f9eee7228993f4cb047af8376ef7600b4f508594cb288e477"} Oct 07 12:44:26 crc kubenswrapper[4854]: I1007 12:44:26.460221 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="2b7a0f72-0c95-44aa-86bb-7e4e090e6cf6" containerName="nova-scheduler-scheduler" containerID="cri-o://47f68e491ac5a34fe7785c9f6d6c6ea758453bb4929da41a0de88f06a94d27ad" gracePeriod=30 Oct 07 12:44:26 crc kubenswrapper[4854]: I1007 12:44:26.716491 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d29bd97-53f9-4154-8712-c564d07a07e0" path="/var/lib/kubelet/pods/3d29bd97-53f9-4154-8712-c564d07a07e0/volumes" Oct 07 12:44:27 crc kubenswrapper[4854]: I1007 12:44:27.475882 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"cac73de2-996a-4e04-abde-1153b44058bc","Type":"ContainerStarted","Data":"efd6f20e41e37b885bf213a2da547584c13e4bdab4263b0b4585c26d56600700"} Oct 07 12:44:27 crc kubenswrapper[4854]: I1007 12:44:27.476115 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 07 12:44:27 crc kubenswrapper[4854]: I1007 12:44:27.505487 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.505460801 podStartE2EDuration="2.505460801s" podCreationTimestamp="2025-10-07 12:44:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:44:27.498384564 +0000 UTC m=+1183.486216839" watchObservedRunningTime="2025-10-07 12:44:27.505460801 +0000 UTC m=+1183.493293096" Oct 07 12:44:28 crc kubenswrapper[4854]: E1007 12:44:28.880446 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="47f68e491ac5a34fe7785c9f6d6c6ea758453bb4929da41a0de88f06a94d27ad" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 07 12:44:28 crc kubenswrapper[4854]: E1007 12:44:28.882885 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="47f68e491ac5a34fe7785c9f6d6c6ea758453bb4929da41a0de88f06a94d27ad" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 07 12:44:28 crc kubenswrapper[4854]: E1007 12:44:28.885302 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown 
desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="47f68e491ac5a34fe7785c9f6d6c6ea758453bb4929da41a0de88f06a94d27ad" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 07 12:44:28 crc kubenswrapper[4854]: E1007 12:44:28.885405 4854 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="2b7a0f72-0c95-44aa-86bb-7e4e090e6cf6" containerName="nova-scheduler-scheduler" Oct 07 12:44:29 crc kubenswrapper[4854]: I1007 12:44:29.513768 4854 generic.go:334] "Generic (PLEG): container finished" podID="2b7a0f72-0c95-44aa-86bb-7e4e090e6cf6" containerID="47f68e491ac5a34fe7785c9f6d6c6ea758453bb4929da41a0de88f06a94d27ad" exitCode=0 Oct 07 12:44:29 crc kubenswrapper[4854]: I1007 12:44:29.514026 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2b7a0f72-0c95-44aa-86bb-7e4e090e6cf6","Type":"ContainerDied","Data":"47f68e491ac5a34fe7785c9f6d6c6ea758453bb4929da41a0de88f06a94d27ad"} Oct 07 12:44:29 crc kubenswrapper[4854]: I1007 12:44:29.769715 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 12:44:29 crc kubenswrapper[4854]: I1007 12:44:29.868363 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6sgx\" (UniqueName: \"kubernetes.io/projected/2b7a0f72-0c95-44aa-86bb-7e4e090e6cf6-kube-api-access-w6sgx\") pod \"2b7a0f72-0c95-44aa-86bb-7e4e090e6cf6\" (UID: \"2b7a0f72-0c95-44aa-86bb-7e4e090e6cf6\") " Oct 07 12:44:29 crc kubenswrapper[4854]: I1007 12:44:29.868522 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b7a0f72-0c95-44aa-86bb-7e4e090e6cf6-combined-ca-bundle\") pod \"2b7a0f72-0c95-44aa-86bb-7e4e090e6cf6\" (UID: \"2b7a0f72-0c95-44aa-86bb-7e4e090e6cf6\") " Oct 07 12:44:29 crc kubenswrapper[4854]: I1007 12:44:29.868816 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b7a0f72-0c95-44aa-86bb-7e4e090e6cf6-config-data\") pod \"2b7a0f72-0c95-44aa-86bb-7e4e090e6cf6\" (UID: \"2b7a0f72-0c95-44aa-86bb-7e4e090e6cf6\") " Oct 07 12:44:29 crc kubenswrapper[4854]: I1007 12:44:29.874980 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b7a0f72-0c95-44aa-86bb-7e4e090e6cf6-kube-api-access-w6sgx" (OuterVolumeSpecName: "kube-api-access-w6sgx") pod "2b7a0f72-0c95-44aa-86bb-7e4e090e6cf6" (UID: "2b7a0f72-0c95-44aa-86bb-7e4e090e6cf6"). InnerVolumeSpecName "kube-api-access-w6sgx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:44:29 crc kubenswrapper[4854]: I1007 12:44:29.915496 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b7a0f72-0c95-44aa-86bb-7e4e090e6cf6-config-data" (OuterVolumeSpecName: "config-data") pod "2b7a0f72-0c95-44aa-86bb-7e4e090e6cf6" (UID: "2b7a0f72-0c95-44aa-86bb-7e4e090e6cf6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:44:29 crc kubenswrapper[4854]: I1007 12:44:29.917300 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b7a0f72-0c95-44aa-86bb-7e4e090e6cf6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2b7a0f72-0c95-44aa-86bb-7e4e090e6cf6" (UID: "2b7a0f72-0c95-44aa-86bb-7e4e090e6cf6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:44:29 crc kubenswrapper[4854]: I1007 12:44:29.972385 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b7a0f72-0c95-44aa-86bb-7e4e090e6cf6-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 12:44:29 crc kubenswrapper[4854]: I1007 12:44:29.972429 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6sgx\" (UniqueName: \"kubernetes.io/projected/2b7a0f72-0c95-44aa-86bb-7e4e090e6cf6-kube-api-access-w6sgx\") on node \"crc\" DevicePath \"\"" Oct 07 12:44:29 crc kubenswrapper[4854]: I1007 12:44:29.972446 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b7a0f72-0c95-44aa-86bb-7e4e090e6cf6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:44:30 crc kubenswrapper[4854]: I1007 12:44:30.531027 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2b7a0f72-0c95-44aa-86bb-7e4e090e6cf6","Type":"ContainerDied","Data":"a55a0a22c791de5aa769b30af6c52362bf7aa370ab1b7e3dd85f11cab3af6818"} Oct 07 12:44:30 crc kubenswrapper[4854]: I1007 12:44:30.531450 4854 scope.go:117] "RemoveContainer" containerID="47f68e491ac5a34fe7785c9f6d6c6ea758453bb4929da41a0de88f06a94d27ad" Oct 07 12:44:30 crc kubenswrapper[4854]: I1007 12:44:30.531156 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 12:44:30 crc kubenswrapper[4854]: I1007 12:44:30.582293 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 12:44:30 crc kubenswrapper[4854]: I1007 12:44:30.594478 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 12:44:30 crc kubenswrapper[4854]: I1007 12:44:30.610515 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 12:44:30 crc kubenswrapper[4854]: E1007 12:44:30.611067 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b7a0f72-0c95-44aa-86bb-7e4e090e6cf6" containerName="nova-scheduler-scheduler" Oct 07 12:44:30 crc kubenswrapper[4854]: I1007 12:44:30.611088 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b7a0f72-0c95-44aa-86bb-7e4e090e6cf6" containerName="nova-scheduler-scheduler" Oct 07 12:44:30 crc kubenswrapper[4854]: I1007 12:44:30.611335 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b7a0f72-0c95-44aa-86bb-7e4e090e6cf6" containerName="nova-scheduler-scheduler" Oct 07 12:44:30 crc kubenswrapper[4854]: I1007 12:44:30.612180 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 12:44:30 crc kubenswrapper[4854]: I1007 12:44:30.615203 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 07 12:44:30 crc kubenswrapper[4854]: I1007 12:44:30.622355 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 12:44:30 crc kubenswrapper[4854]: I1007 12:44:30.714505 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b7a0f72-0c95-44aa-86bb-7e4e090e6cf6" path="/var/lib/kubelet/pods/2b7a0f72-0c95-44aa-86bb-7e4e090e6cf6/volumes" Oct 07 12:44:30 crc kubenswrapper[4854]: I1007 12:44:30.785536 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjfcf\" (UniqueName: \"kubernetes.io/projected/a9b25c4d-8bd7-4135-bf3e-aff177971588-kube-api-access-gjfcf\") pod \"nova-scheduler-0\" (UID: \"a9b25c4d-8bd7-4135-bf3e-aff177971588\") " pod="openstack/nova-scheduler-0" Oct 07 12:44:30 crc kubenswrapper[4854]: I1007 12:44:30.786286 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9b25c4d-8bd7-4135-bf3e-aff177971588-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a9b25c4d-8bd7-4135-bf3e-aff177971588\") " pod="openstack/nova-scheduler-0" Oct 07 12:44:30 crc kubenswrapper[4854]: I1007 12:44:30.786884 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9b25c4d-8bd7-4135-bf3e-aff177971588-config-data\") pod \"nova-scheduler-0\" (UID: \"a9b25c4d-8bd7-4135-bf3e-aff177971588\") " pod="openstack/nova-scheduler-0" Oct 07 12:44:30 crc kubenswrapper[4854]: I1007 12:44:30.888958 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9b25c4d-8bd7-4135-bf3e-aff177971588-config-data\") pod \"nova-scheduler-0\" (UID: \"a9b25c4d-8bd7-4135-bf3e-aff177971588\") " pod="openstack/nova-scheduler-0" Oct 07 12:44:30 crc kubenswrapper[4854]: I1007 12:44:30.889063 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjfcf\" (UniqueName: \"kubernetes.io/projected/a9b25c4d-8bd7-4135-bf3e-aff177971588-kube-api-access-gjfcf\") pod \"nova-scheduler-0\" (UID: \"a9b25c4d-8bd7-4135-bf3e-aff177971588\") " pod="openstack/nova-scheduler-0" Oct 07 12:44:30 crc kubenswrapper[4854]: I1007 12:44:30.889310 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9b25c4d-8bd7-4135-bf3e-aff177971588-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a9b25c4d-8bd7-4135-bf3e-aff177971588\") " pod="openstack/nova-scheduler-0" Oct 07 12:44:30 crc kubenswrapper[4854]: I1007 12:44:30.893919 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9b25c4d-8bd7-4135-bf3e-aff177971588-config-data\") pod \"nova-scheduler-0\" (UID: \"a9b25c4d-8bd7-4135-bf3e-aff177971588\") " pod="openstack/nova-scheduler-0" Oct 07 12:44:30 crc kubenswrapper[4854]: I1007 12:44:30.902941 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9b25c4d-8bd7-4135-bf3e-aff177971588-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: 
\"a9b25c4d-8bd7-4135-bf3e-aff177971588\") " pod="openstack/nova-scheduler-0" Oct 07 12:44:30 crc kubenswrapper[4854]: I1007 12:44:30.905563 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjfcf\" (UniqueName: \"kubernetes.io/projected/a9b25c4d-8bd7-4135-bf3e-aff177971588-kube-api-access-gjfcf\") pod \"nova-scheduler-0\" (UID: \"a9b25c4d-8bd7-4135-bf3e-aff177971588\") " pod="openstack/nova-scheduler-0" Oct 07 12:44:30 crc kubenswrapper[4854]: I1007 12:44:30.980628 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 12:44:31 crc kubenswrapper[4854]: I1007 12:44:31.483675 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 07 12:44:31 crc kubenswrapper[4854]: I1007 12:44:31.500463 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cc4n\" (UniqueName: \"kubernetes.io/projected/821711bc-a957-460f-8614-5cb1d5ed1629-kube-api-access-4cc4n\") pod \"821711bc-a957-460f-8614-5cb1d5ed1629\" (UID: \"821711bc-a957-460f-8614-5cb1d5ed1629\") " Oct 07 12:44:31 crc kubenswrapper[4854]: I1007 12:44:31.500728 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/821711bc-a957-460f-8614-5cb1d5ed1629-combined-ca-bundle\") pod \"821711bc-a957-460f-8614-5cb1d5ed1629\" (UID: \"821711bc-a957-460f-8614-5cb1d5ed1629\") " Oct 07 12:44:31 crc kubenswrapper[4854]: I1007 12:44:31.500869 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/821711bc-a957-460f-8614-5cb1d5ed1629-config-data\") pod \"821711bc-a957-460f-8614-5cb1d5ed1629\" (UID: \"821711bc-a957-460f-8614-5cb1d5ed1629\") " Oct 07 12:44:31 crc kubenswrapper[4854]: I1007 12:44:31.500988 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/821711bc-a957-460f-8614-5cb1d5ed1629-logs\") pod \"821711bc-a957-460f-8614-5cb1d5ed1629\" (UID: \"821711bc-a957-460f-8614-5cb1d5ed1629\") " Oct 07 12:44:31 crc kubenswrapper[4854]: I1007 12:44:31.502002 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/821711bc-a957-460f-8614-5cb1d5ed1629-logs" (OuterVolumeSpecName: "logs") pod "821711bc-a957-460f-8614-5cb1d5ed1629" (UID: "821711bc-a957-460f-8614-5cb1d5ed1629"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:44:31 crc kubenswrapper[4854]: I1007 12:44:31.509261 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/821711bc-a957-460f-8614-5cb1d5ed1629-kube-api-access-4cc4n" (OuterVolumeSpecName: "kube-api-access-4cc4n") pod "821711bc-a957-460f-8614-5cb1d5ed1629" (UID: "821711bc-a957-460f-8614-5cb1d5ed1629"). InnerVolumeSpecName "kube-api-access-4cc4n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:44:31 crc kubenswrapper[4854]: I1007 12:44:31.526621 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 12:44:31 crc kubenswrapper[4854]: W1007 12:44:31.545471 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda9b25c4d_8bd7_4135_bf3e_aff177971588.slice/crio-a7931ab57de1e6d0d5b3f61e5524fc1387582f8dca784d689925b1d44198bb8e WatchSource:0}: Error finding container a7931ab57de1e6d0d5b3f61e5524fc1387582f8dca784d689925b1d44198bb8e: Status 404 returned error can't find the container with id a7931ab57de1e6d0d5b3f61e5524fc1387582f8dca784d689925b1d44198bb8e Oct 07 12:44:31 crc kubenswrapper[4854]: I1007 12:44:31.549575 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/821711bc-a957-460f-8614-5cb1d5ed1629-config-data" (OuterVolumeSpecName: "config-data") pod "821711bc-a957-460f-8614-5cb1d5ed1629" (UID: "821711bc-a957-460f-8614-5cb1d5ed1629"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:44:31 crc kubenswrapper[4854]: I1007 12:44:31.554220 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a9b25c4d-8bd7-4135-bf3e-aff177971588","Type":"ContainerStarted","Data":"a7931ab57de1e6d0d5b3f61e5524fc1387582f8dca784d689925b1d44198bb8e"} Oct 07 12:44:31 crc kubenswrapper[4854]: I1007 12:44:31.560882 4854 generic.go:334] "Generic (PLEG): container finished" podID="821711bc-a957-460f-8614-5cb1d5ed1629" containerID="c5818a9a203876d0a745e806000f69d03a43b6347b0d2635aeb9d78c60fc9348" exitCode=0 Oct 07 12:44:31 crc kubenswrapper[4854]: I1007 12:44:31.560928 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"821711bc-a957-460f-8614-5cb1d5ed1629","Type":"ContainerDied","Data":"c5818a9a203876d0a745e806000f69d03a43b6347b0d2635aeb9d78c60fc9348"} Oct 07 12:44:31 crc kubenswrapper[4854]: I1007 12:44:31.561208 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"821711bc-a957-460f-8614-5cb1d5ed1629","Type":"ContainerDied","Data":"4d619af198ba3526c352a486008ed1840cfc695c8b1127ad3b576c0e7d22377c"} Oct 07 12:44:31 crc kubenswrapper[4854]: I1007 12:44:31.560963 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 07 12:44:31 crc kubenswrapper[4854]: I1007 12:44:31.561257 4854 scope.go:117] "RemoveContainer" containerID="c5818a9a203876d0a745e806000f69d03a43b6347b0d2635aeb9d78c60fc9348" Oct 07 12:44:31 crc kubenswrapper[4854]: I1007 12:44:31.576525 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/821711bc-a957-460f-8614-5cb1d5ed1629-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "821711bc-a957-460f-8614-5cb1d5ed1629" (UID: "821711bc-a957-460f-8614-5cb1d5ed1629"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:44:31 crc kubenswrapper[4854]: I1007 12:44:31.586867 4854 scope.go:117] "RemoveContainer" containerID="f42b18664bd90b68979032118d734abfa9226c23ca07312ac6c6b1f0bbb02aba" Oct 07 12:44:31 crc kubenswrapper[4854]: I1007 12:44:31.602270 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4cc4n\" (UniqueName: \"kubernetes.io/projected/821711bc-a957-460f-8614-5cb1d5ed1629-kube-api-access-4cc4n\") on node \"crc\" DevicePath \"\"" Oct 07 12:44:31 crc kubenswrapper[4854]: I1007 12:44:31.602300 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/821711bc-a957-460f-8614-5cb1d5ed1629-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:44:31 crc kubenswrapper[4854]: I1007 12:44:31.602310 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/821711bc-a957-460f-8614-5cb1d5ed1629-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 12:44:31 crc kubenswrapper[4854]: I1007 12:44:31.602337 4854 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/821711bc-a957-460f-8614-5cb1d5ed1629-logs\") on node \"crc\" DevicePath \"\"" Oct 07 12:44:31 crc kubenswrapper[4854]: I1007 12:44:31.629492 4854 scope.go:117] "RemoveContainer" containerID="c5818a9a203876d0a745e806000f69d03a43b6347b0d2635aeb9d78c60fc9348" Oct 07 12:44:31 crc kubenswrapper[4854]: E1007 12:44:31.630124 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5818a9a203876d0a745e806000f69d03a43b6347b0d2635aeb9d78c60fc9348\": container with ID starting with c5818a9a203876d0a745e806000f69d03a43b6347b0d2635aeb9d78c60fc9348 not found: ID does not exist" containerID="c5818a9a203876d0a745e806000f69d03a43b6347b0d2635aeb9d78c60fc9348" Oct 07 12:44:31 crc kubenswrapper[4854]: I1007 12:44:31.630193 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5818a9a203876d0a745e806000f69d03a43b6347b0d2635aeb9d78c60fc9348"} err="failed to get container status \"c5818a9a203876d0a745e806000f69d03a43b6347b0d2635aeb9d78c60fc9348\": rpc error: code = NotFound desc = could not find container \"c5818a9a203876d0a745e806000f69d03a43b6347b0d2635aeb9d78c60fc9348\": container with ID starting with c5818a9a203876d0a745e806000f69d03a43b6347b0d2635aeb9d78c60fc9348 not found: ID does not exist" Oct 07 12:44:31 crc kubenswrapper[4854]: I1007 12:44:31.630220 4854 scope.go:117] "RemoveContainer" containerID="f42b18664bd90b68979032118d734abfa9226c23ca07312ac6c6b1f0bbb02aba" Oct 07 12:44:31 crc kubenswrapper[4854]: E1007 12:44:31.630814 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f42b18664bd90b68979032118d734abfa9226c23ca07312ac6c6b1f0bbb02aba\": container with ID starting with f42b18664bd90b68979032118d734abfa9226c23ca07312ac6c6b1f0bbb02aba not found: ID does not exist" containerID="f42b18664bd90b68979032118d734abfa9226c23ca07312ac6c6b1f0bbb02aba" Oct 07 12:44:31 crc kubenswrapper[4854]: I1007 12:44:31.630856 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f42b18664bd90b68979032118d734abfa9226c23ca07312ac6c6b1f0bbb02aba"} err="failed to get container status \"f42b18664bd90b68979032118d734abfa9226c23ca07312ac6c6b1f0bbb02aba\": rpc error: code = NotFound desc = could 
not find container \"f42b18664bd90b68979032118d734abfa9226c23ca07312ac6c6b1f0bbb02aba\": container with ID starting with f42b18664bd90b68979032118d734abfa9226c23ca07312ac6c6b1f0bbb02aba not found: ID does not exist" Oct 07 12:44:31 crc kubenswrapper[4854]: I1007 12:44:31.910655 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 07 12:44:31 crc kubenswrapper[4854]: I1007 12:44:31.916788 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 07 12:44:31 crc kubenswrapper[4854]: I1007 12:44:31.951604 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 07 12:44:31 crc kubenswrapper[4854]: E1007 12:44:31.952211 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="821711bc-a957-460f-8614-5cb1d5ed1629" containerName="nova-api-log" Oct 07 12:44:31 crc kubenswrapper[4854]: I1007 12:44:31.952238 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="821711bc-a957-460f-8614-5cb1d5ed1629" containerName="nova-api-log" Oct 07 12:44:31 crc kubenswrapper[4854]: E1007 12:44:31.952264 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="821711bc-a957-460f-8614-5cb1d5ed1629" containerName="nova-api-api" Oct 07 12:44:31 crc kubenswrapper[4854]: I1007 12:44:31.952273 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="821711bc-a957-460f-8614-5cb1d5ed1629" containerName="nova-api-api" Oct 07 12:44:31 crc kubenswrapper[4854]: I1007 12:44:31.952514 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="821711bc-a957-460f-8614-5cb1d5ed1629" containerName="nova-api-api" Oct 07 12:44:31 crc kubenswrapper[4854]: I1007 12:44:31.952540 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="821711bc-a957-460f-8614-5cb1d5ed1629" containerName="nova-api-log" Oct 07 12:44:31 crc kubenswrapper[4854]: I1007 12:44:31.960543 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 07 12:44:31 crc kubenswrapper[4854]: I1007 12:44:31.962946 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 07 12:44:31 crc kubenswrapper[4854]: I1007 12:44:31.966213 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 07 12:44:32 crc kubenswrapper[4854]: I1007 12:44:32.011880 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69xx6\" (UniqueName: \"kubernetes.io/projected/63f2f561-9c9b-4ab6-8c4f-562f2c07b939-kube-api-access-69xx6\") pod \"nova-api-0\" (UID: \"63f2f561-9c9b-4ab6-8c4f-562f2c07b939\") " pod="openstack/nova-api-0" Oct 07 12:44:32 crc kubenswrapper[4854]: I1007 12:44:32.011957 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63f2f561-9c9b-4ab6-8c4f-562f2c07b939-config-data\") pod \"nova-api-0\" (UID: \"63f2f561-9c9b-4ab6-8c4f-562f2c07b939\") " pod="openstack/nova-api-0" Oct 07 12:44:32 crc kubenswrapper[4854]: I1007 12:44:32.012259 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63f2f561-9c9b-4ab6-8c4f-562f2c07b939-logs\") pod \"nova-api-0\" (UID: \"63f2f561-9c9b-4ab6-8c4f-562f2c07b939\") " pod="openstack/nova-api-0" Oct 07 12:44:32 crc kubenswrapper[4854]: I1007 12:44:32.012358 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63f2f561-9c9b-4ab6-8c4f-562f2c07b939-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"63f2f561-9c9b-4ab6-8c4f-562f2c07b939\") " pod="openstack/nova-api-0" Oct 07 12:44:32 crc kubenswrapper[4854]: I1007 12:44:32.113296 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69xx6\" (UniqueName: \"kubernetes.io/projected/63f2f561-9c9b-4ab6-8c4f-562f2c07b939-kube-api-access-69xx6\") pod \"nova-api-0\" (UID: \"63f2f561-9c9b-4ab6-8c4f-562f2c07b939\") " pod="openstack/nova-api-0" Oct 07 12:44:32 crc kubenswrapper[4854]: I1007 12:44:32.113413 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63f2f561-9c9b-4ab6-8c4f-562f2c07b939-config-data\") pod \"nova-api-0\" (UID: \"63f2f561-9c9b-4ab6-8c4f-562f2c07b939\") " pod="openstack/nova-api-0" Oct 07 12:44:32 crc kubenswrapper[4854]: I1007 12:44:32.113562 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63f2f561-9c9b-4ab6-8c4f-562f2c07b939-logs\") pod \"nova-api-0\" (UID: \"63f2f561-9c9b-4ab6-8c4f-562f2c07b939\") " pod="openstack/nova-api-0" Oct 07 12:44:32 crc kubenswrapper[4854]: I1007 12:44:32.113632 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63f2f561-9c9b-4ab6-8c4f-562f2c07b939-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"63f2f561-9c9b-4ab6-8c4f-562f2c07b939\") " pod="openstack/nova-api-0" Oct 07 12:44:32 crc kubenswrapper[4854]: I1007 12:44:32.114560 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63f2f561-9c9b-4ab6-8c4f-562f2c07b939-logs\") pod \"nova-api-0\" (UID: \"63f2f561-9c9b-4ab6-8c4f-562f2c07b939\") " 
pod="openstack/nova-api-0" Oct 07 12:44:32 crc kubenswrapper[4854]: I1007 12:44:32.119545 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63f2f561-9c9b-4ab6-8c4f-562f2c07b939-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"63f2f561-9c9b-4ab6-8c4f-562f2c07b939\") " pod="openstack/nova-api-0" Oct 07 12:44:32 crc kubenswrapper[4854]: I1007 12:44:32.120496 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63f2f561-9c9b-4ab6-8c4f-562f2c07b939-config-data\") pod \"nova-api-0\" (UID: \"63f2f561-9c9b-4ab6-8c4f-562f2c07b939\") " pod="openstack/nova-api-0" Oct 07 12:44:32 crc kubenswrapper[4854]: I1007 12:44:32.130436 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69xx6\" (UniqueName: \"kubernetes.io/projected/63f2f561-9c9b-4ab6-8c4f-562f2c07b939-kube-api-access-69xx6\") pod \"nova-api-0\" (UID: \"63f2f561-9c9b-4ab6-8c4f-562f2c07b939\") " pod="openstack/nova-api-0" Oct 07 12:44:32 crc kubenswrapper[4854]: I1007 12:44:32.305863 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 07 12:44:32 crc kubenswrapper[4854]: I1007 12:44:32.574613 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a9b25c4d-8bd7-4135-bf3e-aff177971588","Type":"ContainerStarted","Data":"39620a647b460de5f35393c3799df63e681c9e03d0decae7a6f3e51d1f325986"} Oct 07 12:44:32 crc kubenswrapper[4854]: I1007 12:44:32.606756 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 07 12:44:32 crc kubenswrapper[4854]: W1007 12:44:32.609073 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63f2f561_9c9b_4ab6_8c4f_562f2c07b939.slice/crio-8ba51421484d8209c449156b91bf7ca4b0024c8c8fcae78f8110477210131222 WatchSource:0}: Error finding container 8ba51421484d8209c449156b91bf7ca4b0024c8c8fcae78f8110477210131222: Status 404 returned error can't find the container with id 8ba51421484d8209c449156b91bf7ca4b0024c8c8fcae78f8110477210131222 Oct 07 12:44:32 crc kubenswrapper[4854]: I1007 12:44:32.609473 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.609453007 podStartE2EDuration="2.609453007s" podCreationTimestamp="2025-10-07 12:44:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:44:32.591530873 +0000 UTC m=+1188.579363148" watchObservedRunningTime="2025-10-07 12:44:32.609453007 +0000 UTC m=+1188.597285262" Oct 07 12:44:32 crc kubenswrapper[4854]: I1007 12:44:32.720957 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="821711bc-a957-460f-8614-5cb1d5ed1629" path="/var/lib/kubelet/pods/821711bc-a957-460f-8614-5cb1d5ed1629/volumes" Oct 07 12:44:33 crc kubenswrapper[4854]: I1007 12:44:33.594430 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"63f2f561-9c9b-4ab6-8c4f-562f2c07b939","Type":"ContainerStarted","Data":"fe49d9c9485668cdd667d1d48273f49c58b9ea0277e5bd42a8d64aea4581e174"} Oct 07 12:44:33 crc kubenswrapper[4854]: I1007 12:44:33.594811 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"63f2f561-9c9b-4ab6-8c4f-562f2c07b939","Type":"ContainerStarted","Data":"ce88328c293e5e115f27097a6d6dbc585ad5507df43cc4bd760f45459b22e3f9"} Oct 07 12:44:33 crc kubenswrapper[4854]: I1007 12:44:33.594835 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"63f2f561-9c9b-4ab6-8c4f-562f2c07b939","Type":"ContainerStarted","Data":"8ba51421484d8209c449156b91bf7ca4b0024c8c8fcae78f8110477210131222"} Oct 07 12:44:33 crc kubenswrapper[4854]: I1007 12:44:33.624755 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.624734611 podStartE2EDuration="2.624734611s" podCreationTimestamp="2025-10-07 12:44:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:44:33.622132655 +0000 UTC m=+1189.609964980" watchObservedRunningTime="2025-10-07 12:44:33.624734611 +0000 UTC m=+1189.612566876" Oct 07 12:44:35 crc kubenswrapper[4854]: I1007 12:44:35.917609 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 07 12:44:35 crc kubenswrapper[4854]: I1007 12:44:35.980830 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 07 12:44:40 crc kubenswrapper[4854]: I1007 12:44:40.808690 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 12:44:40 crc kubenswrapper[4854]: I1007 12:44:40.809043 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 12:44:40 crc kubenswrapper[4854]: I1007 12:44:40.981658 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 07 12:44:41 crc kubenswrapper[4854]: I1007 12:44:41.013692 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 07 12:44:41 crc kubenswrapper[4854]: I1007 12:44:41.739925 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 07 12:44:42 crc kubenswrapper[4854]: I1007 12:44:42.306566 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 07 12:44:42 crc kubenswrapper[4854]: I1007 12:44:42.306635 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 07 12:44:43 crc kubenswrapper[4854]: I1007 12:44:43.388355 4854 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="63f2f561-9c9b-4ab6-8c4f-562f2c07b939" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.190:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 07 12:44:43 crc kubenswrapper[4854]: I1007 12:44:43.388356 4854 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="63f2f561-9c9b-4ab6-8c4f-562f2c07b939" containerName="nova-api-log" probeResult="failure" output="Get 
\"http://10.217.0.190:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 07 12:44:48 crc kubenswrapper[4854]: E1007 12:44:48.618450 4854 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda2062490_ee95_46b5_9f6c_20ecd2ab4003.slice/crio-conmon-3318ca888028ea7d06858c1fbf4206179181c704acdcc3a3ddadfaea72e5b3a0.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf61f5de0_c6b5_4761_a8ee_9f54ecdb3e0b.slice/crio-conmon-ed5fef242c07e5006a5c9e021bb5ac1cf15a88bbaf3824b5c1abeec3b9cc3490.scope\": RecentStats: unable to find data in memory cache]" Oct 07 12:44:48 crc kubenswrapper[4854]: I1007 12:44:48.768950 4854 generic.go:334] "Generic (PLEG): container finished" podID="f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b" containerID="ed5fef242c07e5006a5c9e021bb5ac1cf15a88bbaf3824b5c1abeec3b9cc3490" exitCode=137 Oct 07 12:44:48 crc kubenswrapper[4854]: I1007 12:44:48.769227 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b","Type":"ContainerDied","Data":"ed5fef242c07e5006a5c9e021bb5ac1cf15a88bbaf3824b5c1abeec3b9cc3490"} Oct 07 12:44:48 crc kubenswrapper[4854]: I1007 12:44:48.769346 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b","Type":"ContainerDied","Data":"a93fa624873aaa45ac7118549105ef118962481217284a6474d2ecb5be505cc4"} Oct 07 12:44:48 crc kubenswrapper[4854]: I1007 12:44:48.769363 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a93fa624873aaa45ac7118549105ef118962481217284a6474d2ecb5be505cc4" Oct 07 12:44:48 crc kubenswrapper[4854]: I1007 12:44:48.771404 4854 generic.go:334] "Generic (PLEG): container finished" podID="a2062490-ee95-46b5-9f6c-20ecd2ab4003" containerID="3318ca888028ea7d06858c1fbf4206179181c704acdcc3a3ddadfaea72e5b3a0" exitCode=137 Oct 07 12:44:48 crc kubenswrapper[4854]: I1007 12:44:48.772320 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a2062490-ee95-46b5-9f6c-20ecd2ab4003","Type":"ContainerDied","Data":"3318ca888028ea7d06858c1fbf4206179181c704acdcc3a3ddadfaea72e5b3a0"} Oct 07 12:44:48 crc kubenswrapper[4854]: I1007 12:44:48.772359 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a2062490-ee95-46b5-9f6c-20ecd2ab4003","Type":"ContainerDied","Data":"ca58e32c17a75fe6731625375facdf27bde90fd9e790abe4a71addb31f882239"} Oct 07 12:44:48 crc kubenswrapper[4854]: I1007 12:44:48.772375 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca58e32c17a75fe6731625375facdf27bde90fd9e790abe4a71addb31f882239" Oct 07 12:44:48 crc kubenswrapper[4854]: I1007 12:44:48.811022 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 07 12:44:48 crc kubenswrapper[4854]: I1007 12:44:48.817175 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 12:44:48 crc kubenswrapper[4854]: I1007 12:44:48.887554 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2062490-ee95-46b5-9f6c-20ecd2ab4003-combined-ca-bundle\") pod \"a2062490-ee95-46b5-9f6c-20ecd2ab4003\" (UID: \"a2062490-ee95-46b5-9f6c-20ecd2ab4003\") " Oct 07 12:44:48 crc kubenswrapper[4854]: I1007 12:44:48.887649 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2062490-ee95-46b5-9f6c-20ecd2ab4003-config-data\") pod \"a2062490-ee95-46b5-9f6c-20ecd2ab4003\" (UID: \"a2062490-ee95-46b5-9f6c-20ecd2ab4003\") " Oct 07 12:44:48 crc kubenswrapper[4854]: I1007 12:44:48.887697 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vb2ql\" (UniqueName: \"kubernetes.io/projected/a2062490-ee95-46b5-9f6c-20ecd2ab4003-kube-api-access-vb2ql\") pod \"a2062490-ee95-46b5-9f6c-20ecd2ab4003\" (UID: \"a2062490-ee95-46b5-9f6c-20ecd2ab4003\") " Oct 07 12:44:48 crc kubenswrapper[4854]: I1007 12:44:48.893403 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2062490-ee95-46b5-9f6c-20ecd2ab4003-kube-api-access-vb2ql" (OuterVolumeSpecName: "kube-api-access-vb2ql") pod "a2062490-ee95-46b5-9f6c-20ecd2ab4003" (UID: "a2062490-ee95-46b5-9f6c-20ecd2ab4003"). InnerVolumeSpecName "kube-api-access-vb2ql". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:44:48 crc kubenswrapper[4854]: I1007 12:44:48.917274 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2062490-ee95-46b5-9f6c-20ecd2ab4003-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a2062490-ee95-46b5-9f6c-20ecd2ab4003" (UID: "a2062490-ee95-46b5-9f6c-20ecd2ab4003"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:44:48 crc kubenswrapper[4854]: I1007 12:44:48.947444 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2062490-ee95-46b5-9f6c-20ecd2ab4003-config-data" (OuterVolumeSpecName: "config-data") pod "a2062490-ee95-46b5-9f6c-20ecd2ab4003" (UID: "a2062490-ee95-46b5-9f6c-20ecd2ab4003"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:44:48 crc kubenswrapper[4854]: I1007 12:44:48.989280 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b-config-data\") pod \"f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b\" (UID: \"f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b\") " Oct 07 12:44:48 crc kubenswrapper[4854]: I1007 12:44:48.989367 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b-logs\") pod \"f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b\" (UID: \"f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b\") " Oct 07 12:44:48 crc kubenswrapper[4854]: I1007 12:44:48.989405 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b-combined-ca-bundle\") pod \"f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b\" (UID: \"f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b\") " Oct 07 12:44:48 crc kubenswrapper[4854]: I1007 12:44:48.989463 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9qlg\" (UniqueName: \"kubernetes.io/projected/f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b-kube-api-access-r9qlg\") pod \"f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b\" (UID: \"f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b\") " Oct 07 12:44:48 crc kubenswrapper[4854]: I1007 12:44:48.989821 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2062490-ee95-46b5-9f6c-20ecd2ab4003-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:44:48 crc kubenswrapper[4854]: I1007 12:44:48.989837 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2062490-ee95-46b5-9f6c-20ecd2ab4003-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 12:44:48 crc kubenswrapper[4854]: I1007 12:44:48.989845 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vb2ql\" (UniqueName: \"kubernetes.io/projected/a2062490-ee95-46b5-9f6c-20ecd2ab4003-kube-api-access-vb2ql\") on node \"crc\" DevicePath \"\"" Oct 07 12:44:48 crc kubenswrapper[4854]: I1007 12:44:48.990350 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b-logs" (OuterVolumeSpecName: "logs") pod "f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b" (UID: "f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:44:48 crc kubenswrapper[4854]: I1007 12:44:48.993141 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b-kube-api-access-r9qlg" (OuterVolumeSpecName: "kube-api-access-r9qlg") pod "f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b" (UID: "f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b"). InnerVolumeSpecName "kube-api-access-r9qlg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:44:49 crc kubenswrapper[4854]: I1007 12:44:49.013453 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b" (UID: "f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:44:49 crc kubenswrapper[4854]: I1007 12:44:49.015459 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b-config-data" (OuterVolumeSpecName: "config-data") pod "f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b" (UID: "f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:44:49 crc kubenswrapper[4854]: I1007 12:44:49.092918 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 12:44:49 crc kubenswrapper[4854]: I1007 12:44:49.092964 4854 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b-logs\") on node \"crc\" DevicePath \"\"" Oct 07 12:44:49 crc kubenswrapper[4854]: I1007 12:44:49.092979 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:44:49 crc kubenswrapper[4854]: I1007 12:44:49.092994 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9qlg\" (UniqueName: \"kubernetes.io/projected/f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b-kube-api-access-r9qlg\") on node \"crc\" DevicePath \"\"" Oct 07 12:44:49 crc kubenswrapper[4854]: I1007 12:44:49.779578 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 07 12:44:49 crc kubenswrapper[4854]: I1007 12:44:49.779578 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 12:44:49 crc kubenswrapper[4854]: I1007 12:44:49.815725 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 12:44:49 crc kubenswrapper[4854]: I1007 12:44:49.825542 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 12:44:49 crc kubenswrapper[4854]: I1007 12:44:49.836652 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 12:44:49 crc kubenswrapper[4854]: I1007 12:44:49.847919 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 12:44:49 crc kubenswrapper[4854]: I1007 12:44:49.856333 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 12:44:49 crc kubenswrapper[4854]: E1007 12:44:49.856799 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b" containerName="nova-metadata-metadata" Oct 07 12:44:49 crc kubenswrapper[4854]: I1007 12:44:49.856827 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b" containerName="nova-metadata-metadata" Oct 07 12:44:49 crc kubenswrapper[4854]: E1007 12:44:49.856860 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2062490-ee95-46b5-9f6c-20ecd2ab4003" containerName="nova-cell1-novncproxy-novncproxy" Oct 07 12:44:49 crc kubenswrapper[4854]: I1007 12:44:49.856870 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2062490-ee95-46b5-9f6c-20ecd2ab4003" containerName="nova-cell1-novncproxy-novncproxy" Oct 07 12:44:49 crc kubenswrapper[4854]: E1007 12:44:49.856897 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b" containerName="nova-metadata-log" Oct 07 12:44:49 crc kubenswrapper[4854]: I1007 12:44:49.856907 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b" containerName="nova-metadata-log" Oct 07 12:44:49 crc kubenswrapper[4854]: I1007 12:44:49.857159 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2062490-ee95-46b5-9f6c-20ecd2ab4003" containerName="nova-cell1-novncproxy-novncproxy" Oct 07 12:44:49 crc kubenswrapper[4854]: I1007 12:44:49.857192 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b" containerName="nova-metadata-metadata" Oct 07 12:44:49 crc kubenswrapper[4854]: I1007 12:44:49.857209 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b" containerName="nova-metadata-log" Oct 07 12:44:49 crc kubenswrapper[4854]: I1007 12:44:49.857982 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 07 12:44:49 crc kubenswrapper[4854]: I1007 12:44:49.860108 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 07 12:44:49 crc kubenswrapper[4854]: I1007 12:44:49.860336 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Oct 07 12:44:49 crc kubenswrapper[4854]: I1007 12:44:49.860501 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Oct 07 12:44:49 crc kubenswrapper[4854]: I1007 12:44:49.865232 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 07 12:44:49 crc kubenswrapper[4854]: I1007 12:44:49.867051 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 12:44:49 crc kubenswrapper[4854]: I1007 12:44:49.869647 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 07 12:44:49 crc kubenswrapper[4854]: I1007 12:44:49.870015 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 07 12:44:49 crc kubenswrapper[4854]: I1007 12:44:49.888576 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 12:44:49 crc kubenswrapper[4854]: I1007 12:44:49.903429 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 12:44:50 crc kubenswrapper[4854]: I1007 12:44:50.008713 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f6c2d92-425d-4406-9436-b98f0f0a313f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4f6c2d92-425d-4406-9436-b98f0f0a313f\") " pod="openstack/nova-metadata-0" Oct 07 12:44:50 crc kubenswrapper[4854]: I1007 12:44:50.008772 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqbj7\" (UniqueName: \"kubernetes.io/projected/4f6c2d92-425d-4406-9436-b98f0f0a313f-kube-api-access-nqbj7\") pod \"nova-metadata-0\" (UID: \"4f6c2d92-425d-4406-9436-b98f0f0a313f\") " pod="openstack/nova-metadata-0" Oct 07 12:44:50 crc kubenswrapper[4854]: I1007 12:44:50.008809 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2m56\" (UniqueName: \"kubernetes.io/projected/22343ee2-64b7-4496-b74c-c9860920e953-kube-api-access-g2m56\") pod \"nova-cell1-novncproxy-0\" (UID: \"22343ee2-64b7-4496-b74c-c9860920e953\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 12:44:50 crc kubenswrapper[4854]: I1007 12:44:50.008872 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f6c2d92-425d-4406-9436-b98f0f0a313f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4f6c2d92-425d-4406-9436-b98f0f0a313f\") " pod="openstack/nova-metadata-0" Oct 07 12:44:50 crc kubenswrapper[4854]: I1007 12:44:50.008954 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f6c2d92-425d-4406-9436-b98f0f0a313f-logs\") pod \"nova-metadata-0\" (UID: \"4f6c2d92-425d-4406-9436-b98f0f0a313f\") " pod="openstack/nova-metadata-0" Oct 07 12:44:50 crc 
kubenswrapper[4854]: I1007 12:44:50.008982 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f6c2d92-425d-4406-9436-b98f0f0a313f-config-data\") pod \"nova-metadata-0\" (UID: \"4f6c2d92-425d-4406-9436-b98f0f0a313f\") " pod="openstack/nova-metadata-0" Oct 07 12:44:50 crc kubenswrapper[4854]: I1007 12:44:50.009011 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/22343ee2-64b7-4496-b74c-c9860920e953-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"22343ee2-64b7-4496-b74c-c9860920e953\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 12:44:50 crc kubenswrapper[4854]: I1007 12:44:50.009079 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22343ee2-64b7-4496-b74c-c9860920e953-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"22343ee2-64b7-4496-b74c-c9860920e953\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 12:44:50 crc kubenswrapper[4854]: I1007 12:44:50.009105 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22343ee2-64b7-4496-b74c-c9860920e953-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"22343ee2-64b7-4496-b74c-c9860920e953\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 12:44:50 crc kubenswrapper[4854]: I1007 12:44:50.009132 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/22343ee2-64b7-4496-b74c-c9860920e953-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"22343ee2-64b7-4496-b74c-c9860920e953\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 12:44:50 crc kubenswrapper[4854]: I1007 12:44:50.110914 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f6c2d92-425d-4406-9436-b98f0f0a313f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4f6c2d92-425d-4406-9436-b98f0f0a313f\") " pod="openstack/nova-metadata-0" Oct 07 12:44:50 crc kubenswrapper[4854]: I1007 12:44:50.111009 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqbj7\" (UniqueName: \"kubernetes.io/projected/4f6c2d92-425d-4406-9436-b98f0f0a313f-kube-api-access-nqbj7\") pod \"nova-metadata-0\" (UID: \"4f6c2d92-425d-4406-9436-b98f0f0a313f\") " pod="openstack/nova-metadata-0" Oct 07 12:44:50 crc kubenswrapper[4854]: I1007 12:44:50.111039 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2m56\" (UniqueName: \"kubernetes.io/projected/22343ee2-64b7-4496-b74c-c9860920e953-kube-api-access-g2m56\") pod \"nova-cell1-novncproxy-0\" (UID: \"22343ee2-64b7-4496-b74c-c9860920e953\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 12:44:50 crc kubenswrapper[4854]: I1007 12:44:50.111085 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f6c2d92-425d-4406-9436-b98f0f0a313f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4f6c2d92-425d-4406-9436-b98f0f0a313f\") " pod="openstack/nova-metadata-0" Oct 07 12:44:50 crc kubenswrapper[4854]: I1007 12:44:50.111183 4854 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f6c2d92-425d-4406-9436-b98f0f0a313f-logs\") pod \"nova-metadata-0\" (UID: \"4f6c2d92-425d-4406-9436-b98f0f0a313f\") " pod="openstack/nova-metadata-0" Oct 07 12:44:50 crc kubenswrapper[4854]: I1007 12:44:50.111210 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f6c2d92-425d-4406-9436-b98f0f0a313f-config-data\") pod \"nova-metadata-0\" (UID: \"4f6c2d92-425d-4406-9436-b98f0f0a313f\") " pod="openstack/nova-metadata-0" Oct 07 12:44:50 crc kubenswrapper[4854]: I1007 12:44:50.111251 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/22343ee2-64b7-4496-b74c-c9860920e953-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"22343ee2-64b7-4496-b74c-c9860920e953\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 12:44:50 crc kubenswrapper[4854]: I1007 12:44:50.111291 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22343ee2-64b7-4496-b74c-c9860920e953-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"22343ee2-64b7-4496-b74c-c9860920e953\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 12:44:50 crc kubenswrapper[4854]: I1007 12:44:50.111320 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22343ee2-64b7-4496-b74c-c9860920e953-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"22343ee2-64b7-4496-b74c-c9860920e953\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 12:44:50 crc kubenswrapper[4854]: I1007 12:44:50.111356 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/22343ee2-64b7-4496-b74c-c9860920e953-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"22343ee2-64b7-4496-b74c-c9860920e953\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 12:44:50 crc kubenswrapper[4854]: I1007 12:44:50.112041 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f6c2d92-425d-4406-9436-b98f0f0a313f-logs\") pod \"nova-metadata-0\" (UID: \"4f6c2d92-425d-4406-9436-b98f0f0a313f\") " pod="openstack/nova-metadata-0" Oct 07 12:44:50 crc kubenswrapper[4854]: I1007 12:44:50.115845 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/22343ee2-64b7-4496-b74c-c9860920e953-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"22343ee2-64b7-4496-b74c-c9860920e953\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 12:44:50 crc kubenswrapper[4854]: I1007 12:44:50.116746 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f6c2d92-425d-4406-9436-b98f0f0a313f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4f6c2d92-425d-4406-9436-b98f0f0a313f\") " pod="openstack/nova-metadata-0" Oct 07 12:44:50 crc kubenswrapper[4854]: I1007 12:44:50.117115 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22343ee2-64b7-4496-b74c-c9860920e953-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"22343ee2-64b7-4496-b74c-c9860920e953\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 12:44:50 crc kubenswrapper[4854]: I1007 12:44:50.117327 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f6c2d92-425d-4406-9436-b98f0f0a313f-config-data\") pod \"nova-metadata-0\" (UID: \"4f6c2d92-425d-4406-9436-b98f0f0a313f\") " pod="openstack/nova-metadata-0" Oct 07 12:44:50 crc kubenswrapper[4854]: I1007 12:44:50.118334 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/22343ee2-64b7-4496-b74c-c9860920e953-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"22343ee2-64b7-4496-b74c-c9860920e953\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 12:44:50 crc kubenswrapper[4854]: I1007 12:44:50.118606 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22343ee2-64b7-4496-b74c-c9860920e953-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"22343ee2-64b7-4496-b74c-c9860920e953\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 12:44:50 crc kubenswrapper[4854]: I1007 12:44:50.119005 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f6c2d92-425d-4406-9436-b98f0f0a313f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4f6c2d92-425d-4406-9436-b98f0f0a313f\") " pod="openstack/nova-metadata-0" Oct 07 12:44:50 crc kubenswrapper[4854]: I1007 12:44:50.128703 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2m56\" (UniqueName: \"kubernetes.io/projected/22343ee2-64b7-4496-b74c-c9860920e953-kube-api-access-g2m56\") pod \"nova-cell1-novncproxy-0\" (UID: \"22343ee2-64b7-4496-b74c-c9860920e953\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 12:44:50 crc kubenswrapper[4854]: I1007 12:44:50.129504 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqbj7\" (UniqueName: \"kubernetes.io/projected/4f6c2d92-425d-4406-9436-b98f0f0a313f-kube-api-access-nqbj7\") pod \"nova-metadata-0\" (UID: \"4f6c2d92-425d-4406-9436-b98f0f0a313f\") " pod="openstack/nova-metadata-0" Oct 07 12:44:50 crc kubenswrapper[4854]: I1007 12:44:50.187421 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 07 12:44:50 crc kubenswrapper[4854]: I1007 12:44:50.198107 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 12:44:50 crc kubenswrapper[4854]: I1007 12:44:50.501441 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 12:44:50 crc kubenswrapper[4854]: I1007 12:44:50.639022 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 12:44:50 crc kubenswrapper[4854]: I1007 12:44:50.716342 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2062490-ee95-46b5-9f6c-20ecd2ab4003" path="/var/lib/kubelet/pods/a2062490-ee95-46b5-9f6c-20ecd2ab4003/volumes" Oct 07 12:44:50 crc kubenswrapper[4854]: I1007 12:44:50.717010 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b" path="/var/lib/kubelet/pods/f61f5de0-c6b5-4761-a8ee-9f54ecdb3e0b/volumes" Oct 07 12:44:50 crc kubenswrapper[4854]: I1007 12:44:50.802289 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"22343ee2-64b7-4496-b74c-c9860920e953","Type":"ContainerStarted","Data":"d3a004987a0b14c1a22e75d69824fa876f445af6182a991647fba7625ef4d751"} Oct 07 12:44:50 crc kubenswrapper[4854]: I1007 12:44:50.806902 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4f6c2d92-425d-4406-9436-b98f0f0a313f","Type":"ContainerStarted","Data":"761ad40ba1f8ab89eeb04811a35954452820d329763fa49ec7f5120628991d06"} Oct 07 12:44:51 crc kubenswrapper[4854]: I1007 12:44:51.821913 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4f6c2d92-425d-4406-9436-b98f0f0a313f","Type":"ContainerStarted","Data":"66b39f54008ac95dd36d0bd04bcb71d6007fd479b442082345ebd139fb1623e3"} Oct 07 12:44:51 crc kubenswrapper[4854]: I1007 12:44:51.822364 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4f6c2d92-425d-4406-9436-b98f0f0a313f","Type":"ContainerStarted","Data":"07698d73044f4030e36d18f8657ce12a43b74bcd1e6b2ca2802cb8dc6179c19d"} Oct 07 12:44:51 crc kubenswrapper[4854]: I1007 12:44:51.823778 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"22343ee2-64b7-4496-b74c-c9860920e953","Type":"ContainerStarted","Data":"59e04fd8406ec48a71ffcc20755a831bd6b1ec8e9383f0d8c7df80e46452738a"} Oct 07 12:44:51 crc kubenswrapper[4854]: I1007 12:44:51.863449 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.863399501 podStartE2EDuration="2.863399501s" podCreationTimestamp="2025-10-07 12:44:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:44:51.861074973 +0000 UTC m=+1207.848907258" watchObservedRunningTime="2025-10-07 12:44:51.863399501 +0000 UTC m=+1207.851231776" Oct 07 12:44:51 crc kubenswrapper[4854]: I1007 12:44:51.887509 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.887487336 podStartE2EDuration="2.887487336s" podCreationTimestamp="2025-10-07 12:44:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:44:51.87735251 +0000 UTC m=+1207.865184785" watchObservedRunningTime="2025-10-07 12:44:51.887487336 +0000 UTC m=+1207.875319591" Oct 07 12:44:52 crc kubenswrapper[4854]: 
I1007 12:44:52.309852 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 07 12:44:52 crc kubenswrapper[4854]: I1007 12:44:52.310521 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 07 12:44:52 crc kubenswrapper[4854]: I1007 12:44:52.310565 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 07 12:44:52 crc kubenswrapper[4854]: I1007 12:44:52.313805 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 07 12:44:52 crc kubenswrapper[4854]: I1007 12:44:52.836734 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 07 12:44:52 crc kubenswrapper[4854]: I1007 12:44:52.841346 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 07 12:44:53 crc kubenswrapper[4854]: I1007 12:44:53.035624 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-qshkq"] Oct 07 12:44:53 crc kubenswrapper[4854]: I1007 12:44:53.037570 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-qshkq" Oct 07 12:44:53 crc kubenswrapper[4854]: I1007 12:44:53.053768 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-qshkq"] Oct 07 12:44:53 crc kubenswrapper[4854]: I1007 12:44:53.172619 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/43bd8c48-fc92-4f02-af3e-db78673cacb9-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-qshkq\" (UID: \"43bd8c48-fc92-4f02-af3e-db78673cacb9\") " pod="openstack/dnsmasq-dns-89c5cd4d5-qshkq" Oct 07 12:44:53 crc kubenswrapper[4854]: I1007 12:44:53.172861 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43bd8c48-fc92-4f02-af3e-db78673cacb9-config\") pod \"dnsmasq-dns-89c5cd4d5-qshkq\" (UID: \"43bd8c48-fc92-4f02-af3e-db78673cacb9\") " pod="openstack/dnsmasq-dns-89c5cd4d5-qshkq" Oct 07 12:44:53 crc kubenswrapper[4854]: I1007 12:44:53.173021 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2lrz\" (UniqueName: \"kubernetes.io/projected/43bd8c48-fc92-4f02-af3e-db78673cacb9-kube-api-access-x2lrz\") pod \"dnsmasq-dns-89c5cd4d5-qshkq\" (UID: \"43bd8c48-fc92-4f02-af3e-db78673cacb9\") " pod="openstack/dnsmasq-dns-89c5cd4d5-qshkq" Oct 07 12:44:53 crc kubenswrapper[4854]: I1007 12:44:53.173321 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/43bd8c48-fc92-4f02-af3e-db78673cacb9-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-qshkq\" (UID: \"43bd8c48-fc92-4f02-af3e-db78673cacb9\") " pod="openstack/dnsmasq-dns-89c5cd4d5-qshkq" Oct 07 12:44:53 crc kubenswrapper[4854]: I1007 12:44:53.173403 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/43bd8c48-fc92-4f02-af3e-db78673cacb9-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-qshkq\" (UID: \"43bd8c48-fc92-4f02-af3e-db78673cacb9\") " pod="openstack/dnsmasq-dns-89c5cd4d5-qshkq" Oct 07 12:44:53 crc kubenswrapper[4854]: I1007 12:44:53.173676 4854 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/43bd8c48-fc92-4f02-af3e-db78673cacb9-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-qshkq\" (UID: \"43bd8c48-fc92-4f02-af3e-db78673cacb9\") " pod="openstack/dnsmasq-dns-89c5cd4d5-qshkq" Oct 07 12:44:53 crc kubenswrapper[4854]: I1007 12:44:53.276170 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/43bd8c48-fc92-4f02-af3e-db78673cacb9-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-qshkq\" (UID: \"43bd8c48-fc92-4f02-af3e-db78673cacb9\") " pod="openstack/dnsmasq-dns-89c5cd4d5-qshkq" Oct 07 12:44:53 crc kubenswrapper[4854]: I1007 12:44:53.276255 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43bd8c48-fc92-4f02-af3e-db78673cacb9-config\") pod \"dnsmasq-dns-89c5cd4d5-qshkq\" (UID: \"43bd8c48-fc92-4f02-af3e-db78673cacb9\") " pod="openstack/dnsmasq-dns-89c5cd4d5-qshkq" Oct 07 12:44:53 crc kubenswrapper[4854]: I1007 12:44:53.276349 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2lrz\" (UniqueName: \"kubernetes.io/projected/43bd8c48-fc92-4f02-af3e-db78673cacb9-kube-api-access-x2lrz\") pod \"dnsmasq-dns-89c5cd4d5-qshkq\" (UID: \"43bd8c48-fc92-4f02-af3e-db78673cacb9\") " pod="openstack/dnsmasq-dns-89c5cd4d5-qshkq" Oct 07 12:44:53 crc kubenswrapper[4854]: I1007 12:44:53.276458 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/43bd8c48-fc92-4f02-af3e-db78673cacb9-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-qshkq\" (UID: \"43bd8c48-fc92-4f02-af3e-db78673cacb9\") " pod="openstack/dnsmasq-dns-89c5cd4d5-qshkq" Oct 07 12:44:53 crc kubenswrapper[4854]: I1007 12:44:53.276505 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/43bd8c48-fc92-4f02-af3e-db78673cacb9-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-qshkq\" (UID: \"43bd8c48-fc92-4f02-af3e-db78673cacb9\") " pod="openstack/dnsmasq-dns-89c5cd4d5-qshkq" Oct 07 12:44:53 crc kubenswrapper[4854]: I1007 12:44:53.276620 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/43bd8c48-fc92-4f02-af3e-db78673cacb9-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-qshkq\" (UID: \"43bd8c48-fc92-4f02-af3e-db78673cacb9\") " pod="openstack/dnsmasq-dns-89c5cd4d5-qshkq" Oct 07 12:44:53 crc kubenswrapper[4854]: I1007 12:44:53.277888 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/43bd8c48-fc92-4f02-af3e-db78673cacb9-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-qshkq\" (UID: \"43bd8c48-fc92-4f02-af3e-db78673cacb9\") " pod="openstack/dnsmasq-dns-89c5cd4d5-qshkq" Oct 07 12:44:53 crc kubenswrapper[4854]: I1007 12:44:53.278008 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/43bd8c48-fc92-4f02-af3e-db78673cacb9-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-qshkq\" (UID: \"43bd8c48-fc92-4f02-af3e-db78673cacb9\") " pod="openstack/dnsmasq-dns-89c5cd4d5-qshkq" Oct 07 12:44:53 crc kubenswrapper[4854]: I1007 12:44:53.278462 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/43bd8c48-fc92-4f02-af3e-db78673cacb9-config\") pod \"dnsmasq-dns-89c5cd4d5-qshkq\" (UID: \"43bd8c48-fc92-4f02-af3e-db78673cacb9\") " pod="openstack/dnsmasq-dns-89c5cd4d5-qshkq" Oct 07 12:44:53 crc kubenswrapper[4854]: I1007 12:44:53.279081 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/43bd8c48-fc92-4f02-af3e-db78673cacb9-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-qshkq\" (UID: \"43bd8c48-fc92-4f02-af3e-db78673cacb9\") " pod="openstack/dnsmasq-dns-89c5cd4d5-qshkq" Oct 07 12:44:53 crc kubenswrapper[4854]: I1007 12:44:53.279078 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/43bd8c48-fc92-4f02-af3e-db78673cacb9-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-qshkq\" (UID: \"43bd8c48-fc92-4f02-af3e-db78673cacb9\") " pod="openstack/dnsmasq-dns-89c5cd4d5-qshkq" Oct 07 12:44:53 crc kubenswrapper[4854]: I1007 12:44:53.300579 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2lrz\" (UniqueName: \"kubernetes.io/projected/43bd8c48-fc92-4f02-af3e-db78673cacb9-kube-api-access-x2lrz\") pod \"dnsmasq-dns-89c5cd4d5-qshkq\" (UID: \"43bd8c48-fc92-4f02-af3e-db78673cacb9\") " pod="openstack/dnsmasq-dns-89c5cd4d5-qshkq" Oct 07 12:44:53 crc kubenswrapper[4854]: I1007 12:44:53.366341 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-qshkq" Oct 07 12:44:53 crc kubenswrapper[4854]: I1007 12:44:53.877939 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-qshkq"] Oct 07 12:44:54 crc kubenswrapper[4854]: I1007 12:44:54.864771 4854 generic.go:334] "Generic (PLEG): container finished" podID="43bd8c48-fc92-4f02-af3e-db78673cacb9" containerID="a50dfeaa00f16b6d847f6b9bed54bcb51e19dccdc1b497636e9040901056764e" exitCode=0 Oct 07 12:44:54 crc kubenswrapper[4854]: I1007 12:44:54.866227 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-qshkq" event={"ID":"43bd8c48-fc92-4f02-af3e-db78673cacb9","Type":"ContainerDied","Data":"a50dfeaa00f16b6d847f6b9bed54bcb51e19dccdc1b497636e9040901056764e"} Oct 07 12:44:54 crc kubenswrapper[4854]: I1007 12:44:54.866257 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-qshkq" event={"ID":"43bd8c48-fc92-4f02-af3e-db78673cacb9","Type":"ContainerStarted","Data":"0ddbe79b9602fd8d93add2ee7a3ad6de13e93baac76695f4efc390c72e1935f0"} Oct 07 12:44:55 crc kubenswrapper[4854]: I1007 12:44:55.188052 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 07 12:44:55 crc kubenswrapper[4854]: I1007 12:44:55.199088 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 07 12:44:55 crc kubenswrapper[4854]: I1007 12:44:55.199159 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 07 12:44:55 crc kubenswrapper[4854]: I1007 12:44:55.224386 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 12:44:55 crc kubenswrapper[4854]: I1007 12:44:55.224674 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="01251017-3c38-4f4d-a3cb-5fd3bf8eba53" containerName="ceilometer-central-agent" 
containerID="cri-o://e52a5fe19c22ed0d1d7f7e107733d571c35976c26ad6d63e0d2d4bd822824beb" gracePeriod=30 Oct 07 12:44:55 crc kubenswrapper[4854]: I1007 12:44:55.224782 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="01251017-3c38-4f4d-a3cb-5fd3bf8eba53" containerName="sg-core" containerID="cri-o://a52f4c10743192f2c44c050f76016293b1b1b4e4acb038044e33e6710bfa676c" gracePeriod=30 Oct 07 12:44:55 crc kubenswrapper[4854]: I1007 12:44:55.224807 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="01251017-3c38-4f4d-a3cb-5fd3bf8eba53" containerName="ceilometer-notification-agent" containerID="cri-o://bc607149edb68e28a38fa027d6f7857c2f8eea9ca9f90368d61c3f02967a426e" gracePeriod=30 Oct 07 12:44:55 crc kubenswrapper[4854]: I1007 12:44:55.224955 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="01251017-3c38-4f4d-a3cb-5fd3bf8eba53" containerName="proxy-httpd" containerID="cri-o://ada33b6dd438b612ea3e209d98c214192ea4704b45010905b5df97852888b529" gracePeriod=30 Oct 07 12:44:55 crc kubenswrapper[4854]: I1007 12:44:55.876044 4854 generic.go:334] "Generic (PLEG): container finished" podID="01251017-3c38-4f4d-a3cb-5fd3bf8eba53" containerID="ada33b6dd438b612ea3e209d98c214192ea4704b45010905b5df97852888b529" exitCode=0 Oct 07 12:44:55 crc kubenswrapper[4854]: I1007 12:44:55.876366 4854 generic.go:334] "Generic (PLEG): container finished" podID="01251017-3c38-4f4d-a3cb-5fd3bf8eba53" containerID="a52f4c10743192f2c44c050f76016293b1b1b4e4acb038044e33e6710bfa676c" exitCode=2 Oct 07 12:44:55 crc kubenswrapper[4854]: I1007 12:44:55.876378 4854 generic.go:334] "Generic (PLEG): container finished" podID="01251017-3c38-4f4d-a3cb-5fd3bf8eba53" containerID="e52a5fe19c22ed0d1d7f7e107733d571c35976c26ad6d63e0d2d4bd822824beb" exitCode=0 Oct 07 12:44:55 crc kubenswrapper[4854]: I1007 12:44:55.876109 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01251017-3c38-4f4d-a3cb-5fd3bf8eba53","Type":"ContainerDied","Data":"ada33b6dd438b612ea3e209d98c214192ea4704b45010905b5df97852888b529"} Oct 07 12:44:55 crc kubenswrapper[4854]: I1007 12:44:55.876449 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01251017-3c38-4f4d-a3cb-5fd3bf8eba53","Type":"ContainerDied","Data":"a52f4c10743192f2c44c050f76016293b1b1b4e4acb038044e33e6710bfa676c"} Oct 07 12:44:55 crc kubenswrapper[4854]: I1007 12:44:55.876465 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01251017-3c38-4f4d-a3cb-5fd3bf8eba53","Type":"ContainerDied","Data":"e52a5fe19c22ed0d1d7f7e107733d571c35976c26ad6d63e0d2d4bd822824beb"} Oct 07 12:44:55 crc kubenswrapper[4854]: I1007 12:44:55.878576 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-qshkq" event={"ID":"43bd8c48-fc92-4f02-af3e-db78673cacb9","Type":"ContainerStarted","Data":"222c24d46a308342692e4bc3749f04b6f56b937db2818c8260cd786457220f04"} Oct 07 12:44:55 crc kubenswrapper[4854]: I1007 12:44:55.878776 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-89c5cd4d5-qshkq" Oct 07 12:44:55 crc kubenswrapper[4854]: I1007 12:44:55.899694 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-89c5cd4d5-qshkq" podStartSLOduration=2.899618827 podStartE2EDuration="2.899618827s" 
podCreationTimestamp="2025-10-07 12:44:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:44:55.895893048 +0000 UTC m=+1211.883725333" watchObservedRunningTime="2025-10-07 12:44:55.899618827 +0000 UTC m=+1211.887451092" Oct 07 12:44:56 crc kubenswrapper[4854]: I1007 12:44:56.310538 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 07 12:44:56 crc kubenswrapper[4854]: I1007 12:44:56.310803 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="63f2f561-9c9b-4ab6-8c4f-562f2c07b939" containerName="nova-api-log" containerID="cri-o://ce88328c293e5e115f27097a6d6dbc585ad5507df43cc4bd760f45459b22e3f9" gracePeriod=30 Oct 07 12:44:56 crc kubenswrapper[4854]: I1007 12:44:56.310853 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="63f2f561-9c9b-4ab6-8c4f-562f2c07b939" containerName="nova-api-api" containerID="cri-o://fe49d9c9485668cdd667d1d48273f49c58b9ea0277e5bd42a8d64aea4581e174" gracePeriod=30 Oct 07 12:44:56 crc kubenswrapper[4854]: I1007 12:44:56.889272 4854 generic.go:334] "Generic (PLEG): container finished" podID="63f2f561-9c9b-4ab6-8c4f-562f2c07b939" containerID="ce88328c293e5e115f27097a6d6dbc585ad5507df43cc4bd760f45459b22e3f9" exitCode=143 Oct 07 12:44:56 crc kubenswrapper[4854]: I1007 12:44:56.889354 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"63f2f561-9c9b-4ab6-8c4f-562f2c07b939","Type":"ContainerDied","Data":"ce88328c293e5e115f27097a6d6dbc585ad5507df43cc4bd760f45459b22e3f9"} Oct 07 12:44:59 crc kubenswrapper[4854]: I1007 12:44:59.940757 4854 generic.go:334] "Generic (PLEG): container finished" podID="63f2f561-9c9b-4ab6-8c4f-562f2c07b939" containerID="fe49d9c9485668cdd667d1d48273f49c58b9ea0277e5bd42a8d64aea4581e174" exitCode=0 Oct 07 12:44:59 crc kubenswrapper[4854]: I1007 12:44:59.941420 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"63f2f561-9c9b-4ab6-8c4f-562f2c07b939","Type":"ContainerDied","Data":"fe49d9c9485668cdd667d1d48273f49c58b9ea0277e5bd42a8d64aea4581e174"} Oct 07 12:45:00 crc kubenswrapper[4854]: I1007 12:45:00.156231 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330685-qwz9s"] Oct 07 12:45:00 crc kubenswrapper[4854]: I1007 12:45:00.157753 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330685-qwz9s" Oct 07 12:45:00 crc kubenswrapper[4854]: I1007 12:45:00.165059 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 07 12:45:00 crc kubenswrapper[4854]: I1007 12:45:00.166239 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 07 12:45:00 crc kubenswrapper[4854]: I1007 12:45:00.170599 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330685-qwz9s"] Oct 07 12:45:00 crc kubenswrapper[4854]: I1007 12:45:00.188597 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Oct 07 12:45:00 crc kubenswrapper[4854]: I1007 12:45:00.195875 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 07 12:45:00 crc kubenswrapper[4854]: I1007 12:45:00.200704 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 07 12:45:00 crc kubenswrapper[4854]: I1007 12:45:00.201787 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 07 12:45:00 crc kubenswrapper[4854]: I1007 12:45:00.249759 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Oct 07 12:45:00 crc kubenswrapper[4854]: I1007 12:45:00.319205 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63f2f561-9c9b-4ab6-8c4f-562f2c07b939-logs\") pod \"63f2f561-9c9b-4ab6-8c4f-562f2c07b939\" (UID: \"63f2f561-9c9b-4ab6-8c4f-562f2c07b939\") " Oct 07 12:45:00 crc kubenswrapper[4854]: I1007 12:45:00.319427 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63f2f561-9c9b-4ab6-8c4f-562f2c07b939-combined-ca-bundle\") pod \"63f2f561-9c9b-4ab6-8c4f-562f2c07b939\" (UID: \"63f2f561-9c9b-4ab6-8c4f-562f2c07b939\") " Oct 07 12:45:00 crc kubenswrapper[4854]: I1007 12:45:00.319523 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63f2f561-9c9b-4ab6-8c4f-562f2c07b939-config-data\") pod \"63f2f561-9c9b-4ab6-8c4f-562f2c07b939\" (UID: \"63f2f561-9c9b-4ab6-8c4f-562f2c07b939\") " Oct 07 12:45:00 crc kubenswrapper[4854]: I1007 12:45:00.319629 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69xx6\" (UniqueName: \"kubernetes.io/projected/63f2f561-9c9b-4ab6-8c4f-562f2c07b939-kube-api-access-69xx6\") pod \"63f2f561-9c9b-4ab6-8c4f-562f2c07b939\" (UID: \"63f2f561-9c9b-4ab6-8c4f-562f2c07b939\") " Oct 07 12:45:00 crc kubenswrapper[4854]: I1007 12:45:00.319825 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63f2f561-9c9b-4ab6-8c4f-562f2c07b939-logs" (OuterVolumeSpecName: "logs") pod "63f2f561-9c9b-4ab6-8c4f-562f2c07b939" (UID: "63f2f561-9c9b-4ab6-8c4f-562f2c07b939"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:45:00 crc kubenswrapper[4854]: I1007 12:45:00.320096 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1779af00-05df-4865-b313-cd038772c19f-secret-volume\") pod \"collect-profiles-29330685-qwz9s\" (UID: \"1779af00-05df-4865-b313-cd038772c19f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330685-qwz9s" Oct 07 12:45:00 crc kubenswrapper[4854]: I1007 12:45:00.320225 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1779af00-05df-4865-b313-cd038772c19f-config-volume\") pod \"collect-profiles-29330685-qwz9s\" (UID: \"1779af00-05df-4865-b313-cd038772c19f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330685-qwz9s" Oct 07 12:45:00 crc kubenswrapper[4854]: I1007 12:45:00.320332 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gsc7\" (UniqueName: \"kubernetes.io/projected/1779af00-05df-4865-b313-cd038772c19f-kube-api-access-9gsc7\") pod \"collect-profiles-29330685-qwz9s\" (UID: \"1779af00-05df-4865-b313-cd038772c19f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330685-qwz9s" Oct 07 12:45:00 crc kubenswrapper[4854]: I1007 12:45:00.320410 4854 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63f2f561-9c9b-4ab6-8c4f-562f2c07b939-logs\") on node \"crc\" DevicePath \"\"" Oct 07 12:45:00 crc kubenswrapper[4854]: I1007 12:45:00.325277 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63f2f561-9c9b-4ab6-8c4f-562f2c07b939-kube-api-access-69xx6" (OuterVolumeSpecName: "kube-api-access-69xx6") pod "63f2f561-9c9b-4ab6-8c4f-562f2c07b939" (UID: "63f2f561-9c9b-4ab6-8c4f-562f2c07b939"). InnerVolumeSpecName "kube-api-access-69xx6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:45:00 crc kubenswrapper[4854]: I1007 12:45:00.367772 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63f2f561-9c9b-4ab6-8c4f-562f2c07b939-config-data" (OuterVolumeSpecName: "config-data") pod "63f2f561-9c9b-4ab6-8c4f-562f2c07b939" (UID: "63f2f561-9c9b-4ab6-8c4f-562f2c07b939"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:45:00 crc kubenswrapper[4854]: I1007 12:45:00.414282 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63f2f561-9c9b-4ab6-8c4f-562f2c07b939-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "63f2f561-9c9b-4ab6-8c4f-562f2c07b939" (UID: "63f2f561-9c9b-4ab6-8c4f-562f2c07b939"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:45:00 crc kubenswrapper[4854]: I1007 12:45:00.422035 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1779af00-05df-4865-b313-cd038772c19f-config-volume\") pod \"collect-profiles-29330685-qwz9s\" (UID: \"1779af00-05df-4865-b313-cd038772c19f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330685-qwz9s" Oct 07 12:45:00 crc kubenswrapper[4854]: I1007 12:45:00.422135 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gsc7\" (UniqueName: \"kubernetes.io/projected/1779af00-05df-4865-b313-cd038772c19f-kube-api-access-9gsc7\") pod \"collect-profiles-29330685-qwz9s\" (UID: \"1779af00-05df-4865-b313-cd038772c19f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330685-qwz9s" Oct 07 12:45:00 crc kubenswrapper[4854]: I1007 12:45:00.422342 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1779af00-05df-4865-b313-cd038772c19f-secret-volume\") pod \"collect-profiles-29330685-qwz9s\" (UID: \"1779af00-05df-4865-b313-cd038772c19f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330685-qwz9s" Oct 07 12:45:00 crc kubenswrapper[4854]: I1007 12:45:00.422431 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-69xx6\" (UniqueName: \"kubernetes.io/projected/63f2f561-9c9b-4ab6-8c4f-562f2c07b939-kube-api-access-69xx6\") on node \"crc\" DevicePath \"\"" Oct 07 12:45:00 crc kubenswrapper[4854]: I1007 12:45:00.422448 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63f2f561-9c9b-4ab6-8c4f-562f2c07b939-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:45:00 crc kubenswrapper[4854]: I1007 12:45:00.422461 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63f2f561-9c9b-4ab6-8c4f-562f2c07b939-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 12:45:00 crc kubenswrapper[4854]: I1007 12:45:00.423290 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1779af00-05df-4865-b313-cd038772c19f-config-volume\") pod \"collect-profiles-29330685-qwz9s\" (UID: \"1779af00-05df-4865-b313-cd038772c19f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330685-qwz9s" Oct 07 12:45:00 crc kubenswrapper[4854]: I1007 12:45:00.425861 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1779af00-05df-4865-b313-cd038772c19f-secret-volume\") pod \"collect-profiles-29330685-qwz9s\" (UID: \"1779af00-05df-4865-b313-cd038772c19f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330685-qwz9s" Oct 07 12:45:00 crc kubenswrapper[4854]: I1007 12:45:00.438298 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gsc7\" (UniqueName: \"kubernetes.io/projected/1779af00-05df-4865-b313-cd038772c19f-kube-api-access-9gsc7\") pod \"collect-profiles-29330685-qwz9s\" (UID: \"1779af00-05df-4865-b313-cd038772c19f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330685-qwz9s" Oct 07 12:45:00 crc kubenswrapper[4854]: I1007 12:45:00.508029 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330685-qwz9s" Oct 07 12:45:00 crc kubenswrapper[4854]: I1007 12:45:00.670491 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 12:45:00 crc kubenswrapper[4854]: I1007 12:45:00.839975 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01251017-3c38-4f4d-a3cb-5fd3bf8eba53-run-httpd\") pod \"01251017-3c38-4f4d-a3cb-5fd3bf8eba53\" (UID: \"01251017-3c38-4f4d-a3cb-5fd3bf8eba53\") " Oct 07 12:45:00 crc kubenswrapper[4854]: I1007 12:45:00.840048 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01251017-3c38-4f4d-a3cb-5fd3bf8eba53-config-data\") pod \"01251017-3c38-4f4d-a3cb-5fd3bf8eba53\" (UID: \"01251017-3c38-4f4d-a3cb-5fd3bf8eba53\") " Oct 07 12:45:00 crc kubenswrapper[4854]: I1007 12:45:00.840221 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01251017-3c38-4f4d-a3cb-5fd3bf8eba53-scripts\") pod \"01251017-3c38-4f4d-a3cb-5fd3bf8eba53\" (UID: \"01251017-3c38-4f4d-a3cb-5fd3bf8eba53\") " Oct 07 12:45:00 crc kubenswrapper[4854]: I1007 12:45:00.840243 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mpg7r\" (UniqueName: \"kubernetes.io/projected/01251017-3c38-4f4d-a3cb-5fd3bf8eba53-kube-api-access-mpg7r\") pod \"01251017-3c38-4f4d-a3cb-5fd3bf8eba53\" (UID: \"01251017-3c38-4f4d-a3cb-5fd3bf8eba53\") " Oct 07 12:45:00 crc kubenswrapper[4854]: I1007 12:45:00.840292 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01251017-3c38-4f4d-a3cb-5fd3bf8eba53-combined-ca-bundle\") pod \"01251017-3c38-4f4d-a3cb-5fd3bf8eba53\" (UID: \"01251017-3c38-4f4d-a3cb-5fd3bf8eba53\") " Oct 07 12:45:00 crc kubenswrapper[4854]: I1007 12:45:00.840347 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/01251017-3c38-4f4d-a3cb-5fd3bf8eba53-ceilometer-tls-certs\") pod \"01251017-3c38-4f4d-a3cb-5fd3bf8eba53\" (UID: \"01251017-3c38-4f4d-a3cb-5fd3bf8eba53\") " Oct 07 12:45:00 crc kubenswrapper[4854]: I1007 12:45:00.840379 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01251017-3c38-4f4d-a3cb-5fd3bf8eba53-log-httpd\") pod \"01251017-3c38-4f4d-a3cb-5fd3bf8eba53\" (UID: \"01251017-3c38-4f4d-a3cb-5fd3bf8eba53\") " Oct 07 12:45:00 crc kubenswrapper[4854]: I1007 12:45:00.840415 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/01251017-3c38-4f4d-a3cb-5fd3bf8eba53-sg-core-conf-yaml\") pod \"01251017-3c38-4f4d-a3cb-5fd3bf8eba53\" (UID: \"01251017-3c38-4f4d-a3cb-5fd3bf8eba53\") " Oct 07 12:45:00 crc kubenswrapper[4854]: I1007 12:45:00.840535 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01251017-3c38-4f4d-a3cb-5fd3bf8eba53-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "01251017-3c38-4f4d-a3cb-5fd3bf8eba53" (UID: "01251017-3c38-4f4d-a3cb-5fd3bf8eba53"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:45:00 crc kubenswrapper[4854]: I1007 12:45:00.840823 4854 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01251017-3c38-4f4d-a3cb-5fd3bf8eba53-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 12:45:00 crc kubenswrapper[4854]: I1007 12:45:00.841634 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01251017-3c38-4f4d-a3cb-5fd3bf8eba53-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "01251017-3c38-4f4d-a3cb-5fd3bf8eba53" (UID: "01251017-3c38-4f4d-a3cb-5fd3bf8eba53"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:45:00 crc kubenswrapper[4854]: I1007 12:45:00.846309 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01251017-3c38-4f4d-a3cb-5fd3bf8eba53-scripts" (OuterVolumeSpecName: "scripts") pod "01251017-3c38-4f4d-a3cb-5fd3bf8eba53" (UID: "01251017-3c38-4f4d-a3cb-5fd3bf8eba53"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:45:00 crc kubenswrapper[4854]: I1007 12:45:00.860174 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01251017-3c38-4f4d-a3cb-5fd3bf8eba53-kube-api-access-mpg7r" (OuterVolumeSpecName: "kube-api-access-mpg7r") pod "01251017-3c38-4f4d-a3cb-5fd3bf8eba53" (UID: "01251017-3c38-4f4d-a3cb-5fd3bf8eba53"). InnerVolumeSpecName "kube-api-access-mpg7r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:45:00 crc kubenswrapper[4854]: I1007 12:45:00.929245 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01251017-3c38-4f4d-a3cb-5fd3bf8eba53-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "01251017-3c38-4f4d-a3cb-5fd3bf8eba53" (UID: "01251017-3c38-4f4d-a3cb-5fd3bf8eba53"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:45:00 crc kubenswrapper[4854]: I1007 12:45:00.944235 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mpg7r\" (UniqueName: \"kubernetes.io/projected/01251017-3c38-4f4d-a3cb-5fd3bf8eba53-kube-api-access-mpg7r\") on node \"crc\" DevicePath \"\"" Oct 07 12:45:00 crc kubenswrapper[4854]: I1007 12:45:00.944267 4854 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01251017-3c38-4f4d-a3cb-5fd3bf8eba53-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 12:45:00 crc kubenswrapper[4854]: I1007 12:45:00.944279 4854 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/01251017-3c38-4f4d-a3cb-5fd3bf8eba53-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 07 12:45:00 crc kubenswrapper[4854]: I1007 12:45:00.944291 4854 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01251017-3c38-4f4d-a3cb-5fd3bf8eba53-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 12:45:00 crc kubenswrapper[4854]: I1007 12:45:00.952081 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01251017-3c38-4f4d-a3cb-5fd3bf8eba53-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "01251017-3c38-4f4d-a3cb-5fd3bf8eba53" (UID: "01251017-3c38-4f4d-a3cb-5fd3bf8eba53"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:45:00 crc kubenswrapper[4854]: I1007 12:45:00.967014 4854 generic.go:334] "Generic (PLEG): container finished" podID="01251017-3c38-4f4d-a3cb-5fd3bf8eba53" containerID="bc607149edb68e28a38fa027d6f7857c2f8eea9ca9f90368d61c3f02967a426e" exitCode=0 Oct 07 12:45:00 crc kubenswrapper[4854]: I1007 12:45:00.967085 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01251017-3c38-4f4d-a3cb-5fd3bf8eba53","Type":"ContainerDied","Data":"bc607149edb68e28a38fa027d6f7857c2f8eea9ca9f90368d61c3f02967a426e"} Oct 07 12:45:00 crc kubenswrapper[4854]: I1007 12:45:00.967096 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 12:45:00 crc kubenswrapper[4854]: I1007 12:45:00.967113 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01251017-3c38-4f4d-a3cb-5fd3bf8eba53","Type":"ContainerDied","Data":"de9cd7a6e7faacf596ad95f0f7250775d6ad9cd26e7c441c179b4c0630bf4fa8"} Oct 07 12:45:00 crc kubenswrapper[4854]: I1007 12:45:00.967130 4854 scope.go:117] "RemoveContainer" containerID="ada33b6dd438b612ea3e209d98c214192ea4704b45010905b5df97852888b529" Oct 07 12:45:00 crc kubenswrapper[4854]: I1007 12:45:00.975084 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 07 12:45:00 crc kubenswrapper[4854]: I1007 12:45:00.976386 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"63f2f561-9c9b-4ab6-8c4f-562f2c07b939","Type":"ContainerDied","Data":"8ba51421484d8209c449156b91bf7ca4b0024c8c8fcae78f8110477210131222"} Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.032120 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.046230 4854 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/01251017-3c38-4f4d-a3cb-5fd3bf8eba53-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.050430 4854 scope.go:117] "RemoveContainer" containerID="a52f4c10743192f2c44c050f76016293b1b1b4e4acb038044e33e6710bfa676c" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.054867 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.073234 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.073306 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01251017-3c38-4f4d-a3cb-5fd3bf8eba53-config-data" (OuterVolumeSpecName: "config-data") pod "01251017-3c38-4f4d-a3cb-5fd3bf8eba53" (UID: "01251017-3c38-4f4d-a3cb-5fd3bf8eba53"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.075282 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01251017-3c38-4f4d-a3cb-5fd3bf8eba53-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "01251017-3c38-4f4d-a3cb-5fd3bf8eba53" (UID: "01251017-3c38-4f4d-a3cb-5fd3bf8eba53"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.088096 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 07 12:45:01 crc kubenswrapper[4854]: E1007 12:45:01.088511 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01251017-3c38-4f4d-a3cb-5fd3bf8eba53" containerName="sg-core" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.088532 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="01251017-3c38-4f4d-a3cb-5fd3bf8eba53" containerName="sg-core" Oct 07 12:45:01 crc kubenswrapper[4854]: E1007 12:45:01.088545 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63f2f561-9c9b-4ab6-8c4f-562f2c07b939" containerName="nova-api-api" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.088552 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="63f2f561-9c9b-4ab6-8c4f-562f2c07b939" containerName="nova-api-api" Oct 07 12:45:01 crc kubenswrapper[4854]: E1007 12:45:01.088564 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01251017-3c38-4f4d-a3cb-5fd3bf8eba53" containerName="proxy-httpd" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.088573 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="01251017-3c38-4f4d-a3cb-5fd3bf8eba53" containerName="proxy-httpd" Oct 07 12:45:01 crc kubenswrapper[4854]: E1007 12:45:01.088588 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63f2f561-9c9b-4ab6-8c4f-562f2c07b939" containerName="nova-api-log" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.088595 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="63f2f561-9c9b-4ab6-8c4f-562f2c07b939" containerName="nova-api-log" Oct 07 12:45:01 crc kubenswrapper[4854]: E1007 12:45:01.088609 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01251017-3c38-4f4d-a3cb-5fd3bf8eba53" containerName="ceilometer-notification-agent" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.088614 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="01251017-3c38-4f4d-a3cb-5fd3bf8eba53" containerName="ceilometer-notification-agent" Oct 07 12:45:01 crc kubenswrapper[4854]: E1007 12:45:01.088624 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01251017-3c38-4f4d-a3cb-5fd3bf8eba53" containerName="ceilometer-central-agent" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.088630 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="01251017-3c38-4f4d-a3cb-5fd3bf8eba53" containerName="ceilometer-central-agent" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.088830 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="01251017-3c38-4f4d-a3cb-5fd3bf8eba53" containerName="proxy-httpd" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.088853 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="01251017-3c38-4f4d-a3cb-5fd3bf8eba53" containerName="sg-core" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.088862 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="63f2f561-9c9b-4ab6-8c4f-562f2c07b939" containerName="nova-api-api" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.088878 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="01251017-3c38-4f4d-a3cb-5fd3bf8eba53" containerName="ceilometer-notification-agent" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.088885 4854 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="63f2f561-9c9b-4ab6-8c4f-562f2c07b939" containerName="nova-api-log" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.088893 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="01251017-3c38-4f4d-a3cb-5fd3bf8eba53" containerName="ceilometer-central-agent" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.090005 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.093968 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.094008 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.095613 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.105407 4854 scope.go:117] "RemoveContainer" containerID="bc607149edb68e28a38fa027d6f7857c2f8eea9ca9f90368d61c3f02967a426e" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.145274 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.151750 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01251017-3c38-4f4d-a3cb-5fd3bf8eba53-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.151784 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01251017-3c38-4f4d-a3cb-5fd3bf8eba53-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.153905 4854 scope.go:117] "RemoveContainer" containerID="e52a5fe19c22ed0d1d7f7e107733d571c35976c26ad6d63e0d2d4bd822824beb" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.164200 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330685-qwz9s"] Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.206638 4854 scope.go:117] "RemoveContainer" containerID="ada33b6dd438b612ea3e209d98c214192ea4704b45010905b5df97852888b529" Oct 07 12:45:01 crc kubenswrapper[4854]: E1007 12:45:01.207432 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ada33b6dd438b612ea3e209d98c214192ea4704b45010905b5df97852888b529\": container with ID starting with ada33b6dd438b612ea3e209d98c214192ea4704b45010905b5df97852888b529 not found: ID does not exist" containerID="ada33b6dd438b612ea3e209d98c214192ea4704b45010905b5df97852888b529" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.207472 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ada33b6dd438b612ea3e209d98c214192ea4704b45010905b5df97852888b529"} err="failed to get container status \"ada33b6dd438b612ea3e209d98c214192ea4704b45010905b5df97852888b529\": rpc error: code = NotFound desc = could not find container \"ada33b6dd438b612ea3e209d98c214192ea4704b45010905b5df97852888b529\": container with ID starting with ada33b6dd438b612ea3e209d98c214192ea4704b45010905b5df97852888b529 not found: ID does not exist" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.207502 4854 scope.go:117] "RemoveContainer" 
containerID="a52f4c10743192f2c44c050f76016293b1b1b4e4acb038044e33e6710bfa676c" Oct 07 12:45:01 crc kubenswrapper[4854]: E1007 12:45:01.208024 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a52f4c10743192f2c44c050f76016293b1b1b4e4acb038044e33e6710bfa676c\": container with ID starting with a52f4c10743192f2c44c050f76016293b1b1b4e4acb038044e33e6710bfa676c not found: ID does not exist" containerID="a52f4c10743192f2c44c050f76016293b1b1b4e4acb038044e33e6710bfa676c" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.208075 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a52f4c10743192f2c44c050f76016293b1b1b4e4acb038044e33e6710bfa676c"} err="failed to get container status \"a52f4c10743192f2c44c050f76016293b1b1b4e4acb038044e33e6710bfa676c\": rpc error: code = NotFound desc = could not find container \"a52f4c10743192f2c44c050f76016293b1b1b4e4acb038044e33e6710bfa676c\": container with ID starting with a52f4c10743192f2c44c050f76016293b1b1b4e4acb038044e33e6710bfa676c not found: ID does not exist" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.208106 4854 scope.go:117] "RemoveContainer" containerID="bc607149edb68e28a38fa027d6f7857c2f8eea9ca9f90368d61c3f02967a426e" Oct 07 12:45:01 crc kubenswrapper[4854]: E1007 12:45:01.208624 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc607149edb68e28a38fa027d6f7857c2f8eea9ca9f90368d61c3f02967a426e\": container with ID starting with bc607149edb68e28a38fa027d6f7857c2f8eea9ca9f90368d61c3f02967a426e not found: ID does not exist" containerID="bc607149edb68e28a38fa027d6f7857c2f8eea9ca9f90368d61c3f02967a426e" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.208652 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc607149edb68e28a38fa027d6f7857c2f8eea9ca9f90368d61c3f02967a426e"} err="failed to get container status \"bc607149edb68e28a38fa027d6f7857c2f8eea9ca9f90368d61c3f02967a426e\": rpc error: code = NotFound desc = could not find container \"bc607149edb68e28a38fa027d6f7857c2f8eea9ca9f90368d61c3f02967a426e\": container with ID starting with bc607149edb68e28a38fa027d6f7857c2f8eea9ca9f90368d61c3f02967a426e not found: ID does not exist" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.208677 4854 scope.go:117] "RemoveContainer" containerID="e52a5fe19c22ed0d1d7f7e107733d571c35976c26ad6d63e0d2d4bd822824beb" Oct 07 12:45:01 crc kubenswrapper[4854]: E1007 12:45:01.209029 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e52a5fe19c22ed0d1d7f7e107733d571c35976c26ad6d63e0d2d4bd822824beb\": container with ID starting with e52a5fe19c22ed0d1d7f7e107733d571c35976c26ad6d63e0d2d4bd822824beb not found: ID does not exist" containerID="e52a5fe19c22ed0d1d7f7e107733d571c35976c26ad6d63e0d2d4bd822824beb" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.209064 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e52a5fe19c22ed0d1d7f7e107733d571c35976c26ad6d63e0d2d4bd822824beb"} err="failed to get container status \"e52a5fe19c22ed0d1d7f7e107733d571c35976c26ad6d63e0d2d4bd822824beb\": rpc error: code = NotFound desc = could not find container \"e52a5fe19c22ed0d1d7f7e107733d571c35976c26ad6d63e0d2d4bd822824beb\": container with ID starting with 
e52a5fe19c22ed0d1d7f7e107733d571c35976c26ad6d63e0d2d4bd822824beb not found: ID does not exist" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.209084 4854 scope.go:117] "RemoveContainer" containerID="fe49d9c9485668cdd667d1d48273f49c58b9ea0277e5bd42a8d64aea4581e174" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.231281 4854 scope.go:117] "RemoveContainer" containerID="ce88328c293e5e115f27097a6d6dbc585ad5507df43cc4bd760f45459b22e3f9" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.237496 4854 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="4f6c2d92-425d-4406-9436-b98f0f0a313f" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.192:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.237966 4854 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="4f6c2d92-425d-4406-9436-b98f0f0a313f" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.192:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.254080 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86de1377-2b1d-4938-ac24-8b7d4f48902d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"86de1377-2b1d-4938-ac24-8b7d4f48902d\") " pod="openstack/nova-api-0" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.254405 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86de1377-2b1d-4938-ac24-8b7d4f48902d-config-data\") pod \"nova-api-0\" (UID: \"86de1377-2b1d-4938-ac24-8b7d4f48902d\") " pod="openstack/nova-api-0" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.254593 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h74rg\" (UniqueName: \"kubernetes.io/projected/86de1377-2b1d-4938-ac24-8b7d4f48902d-kube-api-access-h74rg\") pod \"nova-api-0\" (UID: \"86de1377-2b1d-4938-ac24-8b7d4f48902d\") " pod="openstack/nova-api-0" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.254729 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/86de1377-2b1d-4938-ac24-8b7d4f48902d-public-tls-certs\") pod \"nova-api-0\" (UID: \"86de1377-2b1d-4938-ac24-8b7d4f48902d\") " pod="openstack/nova-api-0" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.254879 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/86de1377-2b1d-4938-ac24-8b7d4f48902d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"86de1377-2b1d-4938-ac24-8b7d4f48902d\") " pod="openstack/nova-api-0" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.255043 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86de1377-2b1d-4938-ac24-8b7d4f48902d-logs\") pod \"nova-api-0\" (UID: \"86de1377-2b1d-4938-ac24-8b7d4f48902d\") " pod="openstack/nova-api-0" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.317900 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/ceilometer-0"] Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.329606 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.340674 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.343012 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.345580 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.345740 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.345643 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.358281 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86de1377-2b1d-4938-ac24-8b7d4f48902d-logs\") pod \"nova-api-0\" (UID: \"86de1377-2b1d-4938-ac24-8b7d4f48902d\") " pod="openstack/nova-api-0" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.358576 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86de1377-2b1d-4938-ac24-8b7d4f48902d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"86de1377-2b1d-4938-ac24-8b7d4f48902d\") " pod="openstack/nova-api-0" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.358710 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86de1377-2b1d-4938-ac24-8b7d4f48902d-config-data\") pod \"nova-api-0\" (UID: \"86de1377-2b1d-4938-ac24-8b7d4f48902d\") " pod="openstack/nova-api-0" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.358832 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h74rg\" (UniqueName: \"kubernetes.io/projected/86de1377-2b1d-4938-ac24-8b7d4f48902d-kube-api-access-h74rg\") pod \"nova-api-0\" (UID: \"86de1377-2b1d-4938-ac24-8b7d4f48902d\") " pod="openstack/nova-api-0" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.359112 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/86de1377-2b1d-4938-ac24-8b7d4f48902d-public-tls-certs\") pod \"nova-api-0\" (UID: \"86de1377-2b1d-4938-ac24-8b7d4f48902d\") " pod="openstack/nova-api-0" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.359371 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/86de1377-2b1d-4938-ac24-8b7d4f48902d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"86de1377-2b1d-4938-ac24-8b7d4f48902d\") " pod="openstack/nova-api-0" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.361398 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86de1377-2b1d-4938-ac24-8b7d4f48902d-logs\") pod \"nova-api-0\" (UID: \"86de1377-2b1d-4938-ac24-8b7d4f48902d\") " pod="openstack/nova-api-0" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.362139 4854 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-cell1-cell-mapping-prrs8"] Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.362979 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86de1377-2b1d-4938-ac24-8b7d4f48902d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"86de1377-2b1d-4938-ac24-8b7d4f48902d\") " pod="openstack/nova-api-0" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.363367 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-prrs8" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.364474 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/86de1377-2b1d-4938-ac24-8b7d4f48902d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"86de1377-2b1d-4938-ac24-8b7d4f48902d\") " pod="openstack/nova-api-0" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.368776 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/86de1377-2b1d-4938-ac24-8b7d4f48902d-public-tls-certs\") pod \"nova-api-0\" (UID: \"86de1377-2b1d-4938-ac24-8b7d4f48902d\") " pod="openstack/nova-api-0" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.372001 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.372332 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.383387 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h74rg\" (UniqueName: \"kubernetes.io/projected/86de1377-2b1d-4938-ac24-8b7d4f48902d-kube-api-access-h74rg\") pod \"nova-api-0\" (UID: \"86de1377-2b1d-4938-ac24-8b7d4f48902d\") " pod="openstack/nova-api-0" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.387051 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.388363 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86de1377-2b1d-4938-ac24-8b7d4f48902d-config-data\") pod \"nova-api-0\" (UID: \"86de1377-2b1d-4938-ac24-8b7d4f48902d\") " pod="openstack/nova-api-0" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.399780 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-prrs8"] Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.426140 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.460772 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/47e6a48a-4ef0-4764-a132-50140d86a6b2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"47e6a48a-4ef0-4764-a132-50140d86a6b2\") " pod="openstack/ceilometer-0" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.460815 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47e6a48a-4ef0-4764-a132-50140d86a6b2-scripts\") pod \"ceilometer-0\" (UID: \"47e6a48a-4ef0-4764-a132-50140d86a6b2\") " pod="openstack/ceilometer-0" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.460850 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c42pz\" (UniqueName: \"kubernetes.io/projected/47e6a48a-4ef0-4764-a132-50140d86a6b2-kube-api-access-c42pz\") pod \"ceilometer-0\" (UID: \"47e6a48a-4ef0-4764-a132-50140d86a6b2\") " pod="openstack/ceilometer-0" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.460877 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bf3cb62-a4b1-4252-bbee-41e1ccc47dd5-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-prrs8\" (UID: \"7bf3cb62-a4b1-4252-bbee-41e1ccc47dd5\") " pod="openstack/nova-cell1-cell-mapping-prrs8" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.460906 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7bf3cb62-a4b1-4252-bbee-41e1ccc47dd5-scripts\") pod \"nova-cell1-cell-mapping-prrs8\" (UID: \"7bf3cb62-a4b1-4252-bbee-41e1ccc47dd5\") " pod="openstack/nova-cell1-cell-mapping-prrs8" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.460921 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pmhj\" (UniqueName: \"kubernetes.io/projected/7bf3cb62-a4b1-4252-bbee-41e1ccc47dd5-kube-api-access-5pmhj\") pod \"nova-cell1-cell-mapping-prrs8\" (UID: \"7bf3cb62-a4b1-4252-bbee-41e1ccc47dd5\") " pod="openstack/nova-cell1-cell-mapping-prrs8" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.460937 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/47e6a48a-4ef0-4764-a132-50140d86a6b2-log-httpd\") pod \"ceilometer-0\" (UID: \"47e6a48a-4ef0-4764-a132-50140d86a6b2\") " pod="openstack/ceilometer-0" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.460986 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bf3cb62-a4b1-4252-bbee-41e1ccc47dd5-config-data\") pod \"nova-cell1-cell-mapping-prrs8\" (UID: \"7bf3cb62-a4b1-4252-bbee-41e1ccc47dd5\") " pod="openstack/nova-cell1-cell-mapping-prrs8" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.461040 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/47e6a48a-4ef0-4764-a132-50140d86a6b2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"47e6a48a-4ef0-4764-a132-50140d86a6b2\") " 
pod="openstack/ceilometer-0" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.461059 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47e6a48a-4ef0-4764-a132-50140d86a6b2-config-data\") pod \"ceilometer-0\" (UID: \"47e6a48a-4ef0-4764-a132-50140d86a6b2\") " pod="openstack/ceilometer-0" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.461073 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47e6a48a-4ef0-4764-a132-50140d86a6b2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"47e6a48a-4ef0-4764-a132-50140d86a6b2\") " pod="openstack/ceilometer-0" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.461094 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/47e6a48a-4ef0-4764-a132-50140d86a6b2-run-httpd\") pod \"ceilometer-0\" (UID: \"47e6a48a-4ef0-4764-a132-50140d86a6b2\") " pod="openstack/ceilometer-0" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.563129 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c42pz\" (UniqueName: \"kubernetes.io/projected/47e6a48a-4ef0-4764-a132-50140d86a6b2-kube-api-access-c42pz\") pod \"ceilometer-0\" (UID: \"47e6a48a-4ef0-4764-a132-50140d86a6b2\") " pod="openstack/ceilometer-0" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.563188 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bf3cb62-a4b1-4252-bbee-41e1ccc47dd5-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-prrs8\" (UID: \"7bf3cb62-a4b1-4252-bbee-41e1ccc47dd5\") " pod="openstack/nova-cell1-cell-mapping-prrs8" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.563227 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7bf3cb62-a4b1-4252-bbee-41e1ccc47dd5-scripts\") pod \"nova-cell1-cell-mapping-prrs8\" (UID: \"7bf3cb62-a4b1-4252-bbee-41e1ccc47dd5\") " pod="openstack/nova-cell1-cell-mapping-prrs8" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.563243 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pmhj\" (UniqueName: \"kubernetes.io/projected/7bf3cb62-a4b1-4252-bbee-41e1ccc47dd5-kube-api-access-5pmhj\") pod \"nova-cell1-cell-mapping-prrs8\" (UID: \"7bf3cb62-a4b1-4252-bbee-41e1ccc47dd5\") " pod="openstack/nova-cell1-cell-mapping-prrs8" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.563260 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/47e6a48a-4ef0-4764-a132-50140d86a6b2-log-httpd\") pod \"ceilometer-0\" (UID: \"47e6a48a-4ef0-4764-a132-50140d86a6b2\") " pod="openstack/ceilometer-0" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.563325 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bf3cb62-a4b1-4252-bbee-41e1ccc47dd5-config-data\") pod \"nova-cell1-cell-mapping-prrs8\" (UID: \"7bf3cb62-a4b1-4252-bbee-41e1ccc47dd5\") " pod="openstack/nova-cell1-cell-mapping-prrs8" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.563394 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/47e6a48a-4ef0-4764-a132-50140d86a6b2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"47e6a48a-4ef0-4764-a132-50140d86a6b2\") " pod="openstack/ceilometer-0" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.563420 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47e6a48a-4ef0-4764-a132-50140d86a6b2-config-data\") pod \"ceilometer-0\" (UID: \"47e6a48a-4ef0-4764-a132-50140d86a6b2\") " pod="openstack/ceilometer-0" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.563440 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47e6a48a-4ef0-4764-a132-50140d86a6b2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"47e6a48a-4ef0-4764-a132-50140d86a6b2\") " pod="openstack/ceilometer-0" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.563469 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/47e6a48a-4ef0-4764-a132-50140d86a6b2-run-httpd\") pod \"ceilometer-0\" (UID: \"47e6a48a-4ef0-4764-a132-50140d86a6b2\") " pod="openstack/ceilometer-0" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.563495 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/47e6a48a-4ef0-4764-a132-50140d86a6b2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"47e6a48a-4ef0-4764-a132-50140d86a6b2\") " pod="openstack/ceilometer-0" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.563513 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47e6a48a-4ef0-4764-a132-50140d86a6b2-scripts\") pod \"ceilometer-0\" (UID: \"47e6a48a-4ef0-4764-a132-50140d86a6b2\") " pod="openstack/ceilometer-0" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.566617 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/47e6a48a-4ef0-4764-a132-50140d86a6b2-run-httpd\") pod \"ceilometer-0\" (UID: \"47e6a48a-4ef0-4764-a132-50140d86a6b2\") " pod="openstack/ceilometer-0" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.569368 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/47e6a48a-4ef0-4764-a132-50140d86a6b2-log-httpd\") pod \"ceilometer-0\" (UID: \"47e6a48a-4ef0-4764-a132-50140d86a6b2\") " pod="openstack/ceilometer-0" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.570560 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47e6a48a-4ef0-4764-a132-50140d86a6b2-scripts\") pod \"ceilometer-0\" (UID: \"47e6a48a-4ef0-4764-a132-50140d86a6b2\") " pod="openstack/ceilometer-0" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.571263 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/47e6a48a-4ef0-4764-a132-50140d86a6b2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"47e6a48a-4ef0-4764-a132-50140d86a6b2\") " pod="openstack/ceilometer-0" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.571816 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/7bf3cb62-a4b1-4252-bbee-41e1ccc47dd5-scripts\") pod \"nova-cell1-cell-mapping-prrs8\" (UID: \"7bf3cb62-a4b1-4252-bbee-41e1ccc47dd5\") " pod="openstack/nova-cell1-cell-mapping-prrs8" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.571890 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bf3cb62-a4b1-4252-bbee-41e1ccc47dd5-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-prrs8\" (UID: \"7bf3cb62-a4b1-4252-bbee-41e1ccc47dd5\") " pod="openstack/nova-cell1-cell-mapping-prrs8" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.572637 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47e6a48a-4ef0-4764-a132-50140d86a6b2-config-data\") pod \"ceilometer-0\" (UID: \"47e6a48a-4ef0-4764-a132-50140d86a6b2\") " pod="openstack/ceilometer-0" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.575533 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/47e6a48a-4ef0-4764-a132-50140d86a6b2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"47e6a48a-4ef0-4764-a132-50140d86a6b2\") " pod="openstack/ceilometer-0" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.577129 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bf3cb62-a4b1-4252-bbee-41e1ccc47dd5-config-data\") pod \"nova-cell1-cell-mapping-prrs8\" (UID: \"7bf3cb62-a4b1-4252-bbee-41e1ccc47dd5\") " pod="openstack/nova-cell1-cell-mapping-prrs8" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.594874 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pmhj\" (UniqueName: \"kubernetes.io/projected/7bf3cb62-a4b1-4252-bbee-41e1ccc47dd5-kube-api-access-5pmhj\") pod \"nova-cell1-cell-mapping-prrs8\" (UID: \"7bf3cb62-a4b1-4252-bbee-41e1ccc47dd5\") " pod="openstack/nova-cell1-cell-mapping-prrs8" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.597313 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c42pz\" (UniqueName: \"kubernetes.io/projected/47e6a48a-4ef0-4764-a132-50140d86a6b2-kube-api-access-c42pz\") pod \"ceilometer-0\" (UID: \"47e6a48a-4ef0-4764-a132-50140d86a6b2\") " pod="openstack/ceilometer-0" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.601056 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47e6a48a-4ef0-4764-a132-50140d86a6b2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"47e6a48a-4ef0-4764-a132-50140d86a6b2\") " pod="openstack/ceilometer-0" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.665875 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 12:45:01 crc kubenswrapper[4854]: I1007 12:45:01.696053 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-prrs8" Oct 07 12:45:02 crc kubenswrapper[4854]: I1007 12:45:02.013424 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"86de1377-2b1d-4938-ac24-8b7d4f48902d","Type":"ContainerStarted","Data":"162a224bb8a2bd94170be801240a9f973f388c76df776a3ce8959824b8b59a60"} Oct 07 12:45:02 crc kubenswrapper[4854]: I1007 12:45:02.022025 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 07 12:45:02 crc kubenswrapper[4854]: I1007 12:45:02.034742 4854 generic.go:334] "Generic (PLEG): container finished" podID="1779af00-05df-4865-b313-cd038772c19f" containerID="acb66eb6bd6eb8076f9dddda371d67eb437edfd33d9a97a46ccdb5edca43390f" exitCode=0 Oct 07 12:45:02 crc kubenswrapper[4854]: I1007 12:45:02.036061 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330685-qwz9s" event={"ID":"1779af00-05df-4865-b313-cd038772c19f","Type":"ContainerDied","Data":"acb66eb6bd6eb8076f9dddda371d67eb437edfd33d9a97a46ccdb5edca43390f"} Oct 07 12:45:02 crc kubenswrapper[4854]: I1007 12:45:02.036124 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330685-qwz9s" event={"ID":"1779af00-05df-4865-b313-cd038772c19f","Type":"ContainerStarted","Data":"e3279955c059b8f0a8bfbd045043259109239b38df6915731bef9dd542a101a4"} Oct 07 12:45:02 crc kubenswrapper[4854]: I1007 12:45:02.282100 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 12:45:02 crc kubenswrapper[4854]: I1007 12:45:02.292125 4854 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 12:45:02 crc kubenswrapper[4854]: I1007 12:45:02.389328 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-prrs8"] Oct 07 12:45:02 crc kubenswrapper[4854]: W1007 12:45:02.393636 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7bf3cb62_a4b1_4252_bbee_41e1ccc47dd5.slice/crio-189a4e3f6c323eeca51735a8cfa0bfeaf238437dd4f2d0dc3772daa68401ae8a WatchSource:0}: Error finding container 189a4e3f6c323eeca51735a8cfa0bfeaf238437dd4f2d0dc3772daa68401ae8a: Status 404 returned error can't find the container with id 189a4e3f6c323eeca51735a8cfa0bfeaf238437dd4f2d0dc3772daa68401ae8a Oct 07 12:45:02 crc kubenswrapper[4854]: I1007 12:45:02.714982 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01251017-3c38-4f4d-a3cb-5fd3bf8eba53" path="/var/lib/kubelet/pods/01251017-3c38-4f4d-a3cb-5fd3bf8eba53/volumes" Oct 07 12:45:02 crc kubenswrapper[4854]: I1007 12:45:02.716628 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63f2f561-9c9b-4ab6-8c4f-562f2c07b939" path="/var/lib/kubelet/pods/63f2f561-9c9b-4ab6-8c4f-562f2c07b939/volumes" Oct 07 12:45:03 crc kubenswrapper[4854]: I1007 12:45:03.046108 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"47e6a48a-4ef0-4764-a132-50140d86a6b2","Type":"ContainerStarted","Data":"e5ddfb8ca73ff551352cd917d51450d9d9d9a3f562d75cea88f86c7793cfc38e"} Oct 07 12:45:03 crc kubenswrapper[4854]: I1007 12:45:03.049121 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"86de1377-2b1d-4938-ac24-8b7d4f48902d","Type":"ContainerStarted","Data":"0a1f3aad25966cd187b8b362761b7844752a34a05083d48eddf28bdf289e3981"} Oct 07 12:45:03 crc kubenswrapper[4854]: I1007 12:45:03.049177 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"86de1377-2b1d-4938-ac24-8b7d4f48902d","Type":"ContainerStarted","Data":"307a63eda99192726f7331057f4b16b6d2bce27c85c3a3ed6702a2be2d018470"} Oct 07 12:45:03 crc kubenswrapper[4854]: I1007 12:45:03.054303 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-prrs8" event={"ID":"7bf3cb62-a4b1-4252-bbee-41e1ccc47dd5","Type":"ContainerStarted","Data":"9b03a80d2912fe2dddcd14b778e7f6883a6f0136c4cf7cecbaf81439b6bff4fd"} Oct 07 12:45:03 crc kubenswrapper[4854]: I1007 12:45:03.054363 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-prrs8" event={"ID":"7bf3cb62-a4b1-4252-bbee-41e1ccc47dd5","Type":"ContainerStarted","Data":"189a4e3f6c323eeca51735a8cfa0bfeaf238437dd4f2d0dc3772daa68401ae8a"} Oct 07 12:45:03 crc kubenswrapper[4854]: I1007 12:45:03.112409 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.1123920800000002 podStartE2EDuration="2.11239208s" podCreationTimestamp="2025-10-07 12:45:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:45:03.07549803 +0000 UTC m=+1219.063330285" watchObservedRunningTime="2025-10-07 12:45:03.11239208 +0000 UTC m=+1219.100224325" Oct 07 12:45:03 crc kubenswrapper[4854]: I1007 12:45:03.115227 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-prrs8" podStartSLOduration=2.1152184529999998 podStartE2EDuration="2.115218453s" podCreationTimestamp="2025-10-07 12:45:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:45:03.105257521 +0000 UTC m=+1219.093089776" watchObservedRunningTime="2025-10-07 12:45:03.115218453 +0000 UTC m=+1219.103050708" Oct 07 12:45:06 crc kubenswrapper[4854]: I1007 12:45:03.368314 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-89c5cd4d5-qshkq" Oct 07 12:45:06 crc kubenswrapper[4854]: I1007 12:45:03.434018 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-qrkmh"] Oct 07 12:45:06 crc kubenswrapper[4854]: I1007 12:45:03.434329 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-757b4f8459-qrkmh" podUID="08918bc4-e9c7-4b84-8983-ea3c5a62aa3b" containerName="dnsmasq-dns" containerID="cri-o://bd5c499f0e3e8753c745d12687bdb59d32b4860f35ea257290cecbb1a4600428" gracePeriod=10 Oct 07 12:45:06 crc kubenswrapper[4854]: I1007 12:45:03.517590 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330685-qwz9s" Oct 07 12:45:06 crc kubenswrapper[4854]: I1007 12:45:03.624319 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1779af00-05df-4865-b313-cd038772c19f-secret-volume\") pod \"1779af00-05df-4865-b313-cd038772c19f\" (UID: \"1779af00-05df-4865-b313-cd038772c19f\") " Oct 07 12:45:06 crc kubenswrapper[4854]: I1007 12:45:03.624583 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gsc7\" (UniqueName: \"kubernetes.io/projected/1779af00-05df-4865-b313-cd038772c19f-kube-api-access-9gsc7\") pod \"1779af00-05df-4865-b313-cd038772c19f\" (UID: \"1779af00-05df-4865-b313-cd038772c19f\") " Oct 07 12:45:06 crc kubenswrapper[4854]: I1007 12:45:03.624621 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1779af00-05df-4865-b313-cd038772c19f-config-volume\") pod \"1779af00-05df-4865-b313-cd038772c19f\" (UID: \"1779af00-05df-4865-b313-cd038772c19f\") " Oct 07 12:45:06 crc kubenswrapper[4854]: I1007 12:45:03.625438 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1779af00-05df-4865-b313-cd038772c19f-config-volume" (OuterVolumeSpecName: "config-volume") pod "1779af00-05df-4865-b313-cd038772c19f" (UID: "1779af00-05df-4865-b313-cd038772c19f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:45:06 crc kubenswrapper[4854]: I1007 12:45:03.642051 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1779af00-05df-4865-b313-cd038772c19f-kube-api-access-9gsc7" (OuterVolumeSpecName: "kube-api-access-9gsc7") pod "1779af00-05df-4865-b313-cd038772c19f" (UID: "1779af00-05df-4865-b313-cd038772c19f"). InnerVolumeSpecName "kube-api-access-9gsc7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:45:06 crc kubenswrapper[4854]: I1007 12:45:03.642213 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1779af00-05df-4865-b313-cd038772c19f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1779af00-05df-4865-b313-cd038772c19f" (UID: "1779af00-05df-4865-b313-cd038772c19f"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:45:06 crc kubenswrapper[4854]: I1007 12:45:03.726518 4854 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1779af00-05df-4865-b313-cd038772c19f-config-volume\") on node \"crc\" DevicePath \"\"" Oct 07 12:45:06 crc kubenswrapper[4854]: I1007 12:45:03.726553 4854 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1779af00-05df-4865-b313-cd038772c19f-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 07 12:45:06 crc kubenswrapper[4854]: I1007 12:45:03.726563 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gsc7\" (UniqueName: \"kubernetes.io/projected/1779af00-05df-4865-b313-cd038772c19f-kube-api-access-9gsc7\") on node \"crc\" DevicePath \"\"" Oct 07 12:45:06 crc kubenswrapper[4854]: I1007 12:45:04.065081 4854 generic.go:334] "Generic (PLEG): container finished" podID="08918bc4-e9c7-4b84-8983-ea3c5a62aa3b" containerID="bd5c499f0e3e8753c745d12687bdb59d32b4860f35ea257290cecbb1a4600428" exitCode=0 Oct 07 12:45:06 crc kubenswrapper[4854]: I1007 12:45:04.065363 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-qrkmh" event={"ID":"08918bc4-e9c7-4b84-8983-ea3c5a62aa3b","Type":"ContainerDied","Data":"bd5c499f0e3e8753c745d12687bdb59d32b4860f35ea257290cecbb1a4600428"} Oct 07 12:45:06 crc kubenswrapper[4854]: I1007 12:45:04.069380 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330685-qwz9s" Oct 07 12:45:06 crc kubenswrapper[4854]: I1007 12:45:04.071421 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330685-qwz9s" event={"ID":"1779af00-05df-4865-b313-cd038772c19f","Type":"ContainerDied","Data":"e3279955c059b8f0a8bfbd045043259109239b38df6915731bef9dd542a101a4"} Oct 07 12:45:06 crc kubenswrapper[4854]: I1007 12:45:04.071449 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3279955c059b8f0a8bfbd045043259109239b38df6915731bef9dd542a101a4" Oct 07 12:45:06 crc kubenswrapper[4854]: I1007 12:45:04.091749 4854 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-757b4f8459-qrkmh" podUID="08918bc4-e9c7-4b84-8983-ea3c5a62aa3b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.186:5353: connect: connection refused" Oct 07 12:45:06 crc kubenswrapper[4854]: I1007 12:45:05.091132 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"47e6a48a-4ef0-4764-a132-50140d86a6b2","Type":"ContainerStarted","Data":"b9891d843c836c188cc5c0d2b4df4db333aab6e636378878b75cfbd65ecdb10b"} Oct 07 12:45:06 crc kubenswrapper[4854]: I1007 12:45:06.104980 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"47e6a48a-4ef0-4764-a132-50140d86a6b2","Type":"ContainerStarted","Data":"e2647a1c022554b80ce34c49ff2bbcb7d62b1334b8589db6fa5444a0549bf597"} Oct 07 12:45:06 crc kubenswrapper[4854]: I1007 12:45:06.353625 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-qrkmh" Oct 07 12:45:06 crc kubenswrapper[4854]: I1007 12:45:06.481892 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/08918bc4-e9c7-4b84-8983-ea3c5a62aa3b-ovsdbserver-nb\") pod \"08918bc4-e9c7-4b84-8983-ea3c5a62aa3b\" (UID: \"08918bc4-e9c7-4b84-8983-ea3c5a62aa3b\") " Oct 07 12:45:06 crc kubenswrapper[4854]: I1007 12:45:06.481945 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/08918bc4-e9c7-4b84-8983-ea3c5a62aa3b-ovsdbserver-sb\") pod \"08918bc4-e9c7-4b84-8983-ea3c5a62aa3b\" (UID: \"08918bc4-e9c7-4b84-8983-ea3c5a62aa3b\") " Oct 07 12:45:06 crc kubenswrapper[4854]: I1007 12:45:06.481970 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/08918bc4-e9c7-4b84-8983-ea3c5a62aa3b-dns-swift-storage-0\") pod \"08918bc4-e9c7-4b84-8983-ea3c5a62aa3b\" (UID: \"08918bc4-e9c7-4b84-8983-ea3c5a62aa3b\") " Oct 07 12:45:06 crc kubenswrapper[4854]: I1007 12:45:06.481996 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/08918bc4-e9c7-4b84-8983-ea3c5a62aa3b-dns-svc\") pod \"08918bc4-e9c7-4b84-8983-ea3c5a62aa3b\" (UID: \"08918bc4-e9c7-4b84-8983-ea3c5a62aa3b\") " Oct 07 12:45:06 crc kubenswrapper[4854]: I1007 12:45:06.482012 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8gbx\" (UniqueName: \"kubernetes.io/projected/08918bc4-e9c7-4b84-8983-ea3c5a62aa3b-kube-api-access-k8gbx\") pod \"08918bc4-e9c7-4b84-8983-ea3c5a62aa3b\" (UID: \"08918bc4-e9c7-4b84-8983-ea3c5a62aa3b\") " Oct 07 12:45:06 crc kubenswrapper[4854]: I1007 12:45:06.482085 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08918bc4-e9c7-4b84-8983-ea3c5a62aa3b-config\") pod \"08918bc4-e9c7-4b84-8983-ea3c5a62aa3b\" (UID: \"08918bc4-e9c7-4b84-8983-ea3c5a62aa3b\") " Oct 07 12:45:06 crc kubenswrapper[4854]: I1007 12:45:06.487928 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08918bc4-e9c7-4b84-8983-ea3c5a62aa3b-kube-api-access-k8gbx" (OuterVolumeSpecName: "kube-api-access-k8gbx") pod "08918bc4-e9c7-4b84-8983-ea3c5a62aa3b" (UID: "08918bc4-e9c7-4b84-8983-ea3c5a62aa3b"). InnerVolumeSpecName "kube-api-access-k8gbx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:45:06 crc kubenswrapper[4854]: I1007 12:45:06.535793 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08918bc4-e9c7-4b84-8983-ea3c5a62aa3b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "08918bc4-e9c7-4b84-8983-ea3c5a62aa3b" (UID: "08918bc4-e9c7-4b84-8983-ea3c5a62aa3b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:45:06 crc kubenswrapper[4854]: I1007 12:45:06.544662 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08918bc4-e9c7-4b84-8983-ea3c5a62aa3b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "08918bc4-e9c7-4b84-8983-ea3c5a62aa3b" (UID: "08918bc4-e9c7-4b84-8983-ea3c5a62aa3b"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:45:06 crc kubenswrapper[4854]: I1007 12:45:06.544697 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08918bc4-e9c7-4b84-8983-ea3c5a62aa3b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "08918bc4-e9c7-4b84-8983-ea3c5a62aa3b" (UID: "08918bc4-e9c7-4b84-8983-ea3c5a62aa3b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:45:06 crc kubenswrapper[4854]: I1007 12:45:06.549864 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08918bc4-e9c7-4b84-8983-ea3c5a62aa3b-config" (OuterVolumeSpecName: "config") pod "08918bc4-e9c7-4b84-8983-ea3c5a62aa3b" (UID: "08918bc4-e9c7-4b84-8983-ea3c5a62aa3b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:45:06 crc kubenswrapper[4854]: I1007 12:45:06.551479 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08918bc4-e9c7-4b84-8983-ea3c5a62aa3b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "08918bc4-e9c7-4b84-8983-ea3c5a62aa3b" (UID: "08918bc4-e9c7-4b84-8983-ea3c5a62aa3b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:45:06 crc kubenswrapper[4854]: I1007 12:45:06.584498 4854 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/08918bc4-e9c7-4b84-8983-ea3c5a62aa3b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 12:45:06 crc kubenswrapper[4854]: I1007 12:45:06.584528 4854 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/08918bc4-e9c7-4b84-8983-ea3c5a62aa3b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 12:45:06 crc kubenswrapper[4854]: I1007 12:45:06.584538 4854 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/08918bc4-e9c7-4b84-8983-ea3c5a62aa3b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 07 12:45:06 crc kubenswrapper[4854]: I1007 12:45:06.584550 4854 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/08918bc4-e9c7-4b84-8983-ea3c5a62aa3b-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 12:45:06 crc kubenswrapper[4854]: I1007 12:45:06.584559 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8gbx\" (UniqueName: \"kubernetes.io/projected/08918bc4-e9c7-4b84-8983-ea3c5a62aa3b-kube-api-access-k8gbx\") on node \"crc\" DevicePath \"\"" Oct 07 12:45:06 crc kubenswrapper[4854]: I1007 12:45:06.584570 4854 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08918bc4-e9c7-4b84-8983-ea3c5a62aa3b-config\") on node \"crc\" DevicePath \"\"" Oct 07 12:45:07 crc kubenswrapper[4854]: I1007 12:45:07.122133 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"47e6a48a-4ef0-4764-a132-50140d86a6b2","Type":"ContainerStarted","Data":"1bb3783752e90f8f65fc57d4fca86c174ad0436acd90b5d1426146067be381e1"} Oct 07 12:45:07 crc kubenswrapper[4854]: I1007 12:45:07.125280 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-qrkmh" 
event={"ID":"08918bc4-e9c7-4b84-8983-ea3c5a62aa3b","Type":"ContainerDied","Data":"8461a64008afbb098d179ebbd8511acd49c252c0dd6c91c1a66d9df5c6d89aa3"} Oct 07 12:45:07 crc kubenswrapper[4854]: I1007 12:45:07.125325 4854 scope.go:117] "RemoveContainer" containerID="bd5c499f0e3e8753c745d12687bdb59d32b4860f35ea257290cecbb1a4600428" Oct 07 12:45:07 crc kubenswrapper[4854]: I1007 12:45:07.125466 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-qrkmh" Oct 07 12:45:07 crc kubenswrapper[4854]: I1007 12:45:07.154940 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-qrkmh"] Oct 07 12:45:07 crc kubenswrapper[4854]: I1007 12:45:07.163960 4854 scope.go:117] "RemoveContainer" containerID="a201fe1b30730ebacc80338598ad0e731d64d92665cd87dafb32a0d8fee2a829" Oct 07 12:45:07 crc kubenswrapper[4854]: I1007 12:45:07.166179 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-qrkmh"] Oct 07 12:45:08 crc kubenswrapper[4854]: I1007 12:45:08.139484 4854 generic.go:334] "Generic (PLEG): container finished" podID="7bf3cb62-a4b1-4252-bbee-41e1ccc47dd5" containerID="9b03a80d2912fe2dddcd14b778e7f6883a6f0136c4cf7cecbaf81439b6bff4fd" exitCode=0 Oct 07 12:45:08 crc kubenswrapper[4854]: I1007 12:45:08.139564 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-prrs8" event={"ID":"7bf3cb62-a4b1-4252-bbee-41e1ccc47dd5","Type":"ContainerDied","Data":"9b03a80d2912fe2dddcd14b778e7f6883a6f0136c4cf7cecbaf81439b6bff4fd"} Oct 07 12:45:08 crc kubenswrapper[4854]: I1007 12:45:08.712752 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08918bc4-e9c7-4b84-8983-ea3c5a62aa3b" path="/var/lib/kubelet/pods/08918bc4-e9c7-4b84-8983-ea3c5a62aa3b/volumes" Oct 07 12:45:09 crc kubenswrapper[4854]: I1007 12:45:09.163117 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"47e6a48a-4ef0-4764-a132-50140d86a6b2","Type":"ContainerStarted","Data":"43911c074518df2d12734bda5be2d0438565e37df58ee0f1bb19e3735feca614"} Oct 07 12:45:09 crc kubenswrapper[4854]: I1007 12:45:09.164791 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 07 12:45:09 crc kubenswrapper[4854]: I1007 12:45:09.527050 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-prrs8" Oct 07 12:45:09 crc kubenswrapper[4854]: I1007 12:45:09.551293 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.565371161 podStartE2EDuration="8.551272735s" podCreationTimestamp="2025-10-07 12:45:01 +0000 UTC" firstStartedPulling="2025-10-07 12:45:02.2919361 +0000 UTC m=+1218.279768355" lastFinishedPulling="2025-10-07 12:45:08.277837664 +0000 UTC m=+1224.265669929" observedRunningTime="2025-10-07 12:45:09.187007221 +0000 UTC m=+1225.174839496" watchObservedRunningTime="2025-10-07 12:45:09.551272735 +0000 UTC m=+1225.539104990" Oct 07 12:45:09 crc kubenswrapper[4854]: I1007 12:45:09.643424 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bf3cb62-a4b1-4252-bbee-41e1ccc47dd5-combined-ca-bundle\") pod \"7bf3cb62-a4b1-4252-bbee-41e1ccc47dd5\" (UID: \"7bf3cb62-a4b1-4252-bbee-41e1ccc47dd5\") " Oct 07 12:45:09 crc kubenswrapper[4854]: I1007 12:45:09.643672 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5pmhj\" (UniqueName: \"kubernetes.io/projected/7bf3cb62-a4b1-4252-bbee-41e1ccc47dd5-kube-api-access-5pmhj\") pod \"7bf3cb62-a4b1-4252-bbee-41e1ccc47dd5\" (UID: \"7bf3cb62-a4b1-4252-bbee-41e1ccc47dd5\") " Oct 07 12:45:09 crc kubenswrapper[4854]: I1007 12:45:09.643766 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bf3cb62-a4b1-4252-bbee-41e1ccc47dd5-config-data\") pod \"7bf3cb62-a4b1-4252-bbee-41e1ccc47dd5\" (UID: \"7bf3cb62-a4b1-4252-bbee-41e1ccc47dd5\") " Oct 07 12:45:09 crc kubenswrapper[4854]: I1007 12:45:09.643930 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7bf3cb62-a4b1-4252-bbee-41e1ccc47dd5-scripts\") pod \"7bf3cb62-a4b1-4252-bbee-41e1ccc47dd5\" (UID: \"7bf3cb62-a4b1-4252-bbee-41e1ccc47dd5\") " Oct 07 12:45:09 crc kubenswrapper[4854]: I1007 12:45:09.649342 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bf3cb62-a4b1-4252-bbee-41e1ccc47dd5-scripts" (OuterVolumeSpecName: "scripts") pod "7bf3cb62-a4b1-4252-bbee-41e1ccc47dd5" (UID: "7bf3cb62-a4b1-4252-bbee-41e1ccc47dd5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:45:09 crc kubenswrapper[4854]: I1007 12:45:09.649397 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bf3cb62-a4b1-4252-bbee-41e1ccc47dd5-kube-api-access-5pmhj" (OuterVolumeSpecName: "kube-api-access-5pmhj") pod "7bf3cb62-a4b1-4252-bbee-41e1ccc47dd5" (UID: "7bf3cb62-a4b1-4252-bbee-41e1ccc47dd5"). InnerVolumeSpecName "kube-api-access-5pmhj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:45:09 crc kubenswrapper[4854]: I1007 12:45:09.670026 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bf3cb62-a4b1-4252-bbee-41e1ccc47dd5-config-data" (OuterVolumeSpecName: "config-data") pod "7bf3cb62-a4b1-4252-bbee-41e1ccc47dd5" (UID: "7bf3cb62-a4b1-4252-bbee-41e1ccc47dd5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:45:09 crc kubenswrapper[4854]: I1007 12:45:09.696216 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bf3cb62-a4b1-4252-bbee-41e1ccc47dd5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7bf3cb62-a4b1-4252-bbee-41e1ccc47dd5" (UID: "7bf3cb62-a4b1-4252-bbee-41e1ccc47dd5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:45:09 crc kubenswrapper[4854]: I1007 12:45:09.747222 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5pmhj\" (UniqueName: \"kubernetes.io/projected/7bf3cb62-a4b1-4252-bbee-41e1ccc47dd5-kube-api-access-5pmhj\") on node \"crc\" DevicePath \"\"" Oct 07 12:45:09 crc kubenswrapper[4854]: I1007 12:45:09.747284 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bf3cb62-a4b1-4252-bbee-41e1ccc47dd5-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 12:45:09 crc kubenswrapper[4854]: I1007 12:45:09.747313 4854 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7bf3cb62-a4b1-4252-bbee-41e1ccc47dd5-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 12:45:09 crc kubenswrapper[4854]: I1007 12:45:09.747337 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bf3cb62-a4b1-4252-bbee-41e1ccc47dd5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:45:10 crc kubenswrapper[4854]: I1007 12:45:10.183774 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-prrs8" Oct 07 12:45:10 crc kubenswrapper[4854]: I1007 12:45:10.185001 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-prrs8" event={"ID":"7bf3cb62-a4b1-4252-bbee-41e1ccc47dd5","Type":"ContainerDied","Data":"189a4e3f6c323eeca51735a8cfa0bfeaf238437dd4f2d0dc3772daa68401ae8a"} Oct 07 12:45:10 crc kubenswrapper[4854]: I1007 12:45:10.185126 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="189a4e3f6c323eeca51735a8cfa0bfeaf238437dd4f2d0dc3772daa68401ae8a" Oct 07 12:45:10 crc kubenswrapper[4854]: I1007 12:45:10.207891 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 07 12:45:10 crc kubenswrapper[4854]: I1007 12:45:10.214070 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 07 12:45:10 crc kubenswrapper[4854]: I1007 12:45:10.230523 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 07 12:45:10 crc kubenswrapper[4854]: I1007 12:45:10.391726 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 07 12:45:10 crc kubenswrapper[4854]: I1007 12:45:10.392243 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="86de1377-2b1d-4938-ac24-8b7d4f48902d" containerName="nova-api-log" containerID="cri-o://307a63eda99192726f7331057f4b16b6d2bce27c85c3a3ed6702a2be2d018470" gracePeriod=30 Oct 07 12:45:10 crc kubenswrapper[4854]: I1007 12:45:10.392481 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="86de1377-2b1d-4938-ac24-8b7d4f48902d" containerName="nova-api-api" 
containerID="cri-o://0a1f3aad25966cd187b8b362761b7844752a34a05083d48eddf28bdf289e3981" gracePeriod=30 Oct 07 12:45:10 crc kubenswrapper[4854]: I1007 12:45:10.405969 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 12:45:10 crc kubenswrapper[4854]: I1007 12:45:10.406296 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="a9b25c4d-8bd7-4135-bf3e-aff177971588" containerName="nova-scheduler-scheduler" containerID="cri-o://39620a647b460de5f35393c3799df63e681c9e03d0decae7a6f3e51d1f325986" gracePeriod=30 Oct 07 12:45:10 crc kubenswrapper[4854]: I1007 12:45:10.417526 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 12:45:10 crc kubenswrapper[4854]: I1007 12:45:10.807952 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 12:45:10 crc kubenswrapper[4854]: I1007 12:45:10.808386 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 12:45:10 crc kubenswrapper[4854]: E1007 12:45:10.982631 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="39620a647b460de5f35393c3799df63e681c9e03d0decae7a6f3e51d1f325986" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 07 12:45:10 crc kubenswrapper[4854]: E1007 12:45:10.986028 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="39620a647b460de5f35393c3799df63e681c9e03d0decae7a6f3e51d1f325986" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 07 12:45:10 crc kubenswrapper[4854]: E1007 12:45:10.989193 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="39620a647b460de5f35393c3799df63e681c9e03d0decae7a6f3e51d1f325986" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 07 12:45:10 crc kubenswrapper[4854]: E1007 12:45:10.989254 4854 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="a9b25c4d-8bd7-4135-bf3e-aff177971588" containerName="nova-scheduler-scheduler" Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.130266 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.191255 4854 generic.go:334] "Generic (PLEG): container finished" podID="86de1377-2b1d-4938-ac24-8b7d4f48902d" containerID="0a1f3aad25966cd187b8b362761b7844752a34a05083d48eddf28bdf289e3981" exitCode=0 Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.192091 4854 generic.go:334] "Generic (PLEG): container finished" podID="86de1377-2b1d-4938-ac24-8b7d4f48902d" containerID="307a63eda99192726f7331057f4b16b6d2bce27c85c3a3ed6702a2be2d018470" exitCode=143 Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.192074 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"86de1377-2b1d-4938-ac24-8b7d4f48902d","Type":"ContainerDied","Data":"0a1f3aad25966cd187b8b362761b7844752a34a05083d48eddf28bdf289e3981"} Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.192298 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"86de1377-2b1d-4938-ac24-8b7d4f48902d","Type":"ContainerDied","Data":"307a63eda99192726f7331057f4b16b6d2bce27c85c3a3ed6702a2be2d018470"} Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.192340 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"86de1377-2b1d-4938-ac24-8b7d4f48902d","Type":"ContainerDied","Data":"162a224bb8a2bd94170be801240a9f973f388c76df776a3ce8959824b8b59a60"} Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.192347 4854 scope.go:117] "RemoveContainer" containerID="0a1f3aad25966cd187b8b362761b7844752a34a05083d48eddf28bdf289e3981" Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.192074 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.208619 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.250391 4854 scope.go:117] "RemoveContainer" containerID="307a63eda99192726f7331057f4b16b6d2bce27c85c3a3ed6702a2be2d018470" Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.270088 4854 scope.go:117] "RemoveContainer" containerID="0a1f3aad25966cd187b8b362761b7844752a34a05083d48eddf28bdf289e3981" Oct 07 12:45:11 crc kubenswrapper[4854]: E1007 12:45:11.275129 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a1f3aad25966cd187b8b362761b7844752a34a05083d48eddf28bdf289e3981\": container with ID starting with 0a1f3aad25966cd187b8b362761b7844752a34a05083d48eddf28bdf289e3981 not found: ID does not exist" containerID="0a1f3aad25966cd187b8b362761b7844752a34a05083d48eddf28bdf289e3981" Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.275200 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a1f3aad25966cd187b8b362761b7844752a34a05083d48eddf28bdf289e3981"} err="failed to get container status \"0a1f3aad25966cd187b8b362761b7844752a34a05083d48eddf28bdf289e3981\": rpc error: code = NotFound desc = could not find container \"0a1f3aad25966cd187b8b362761b7844752a34a05083d48eddf28bdf289e3981\": container with ID starting with 0a1f3aad25966cd187b8b362761b7844752a34a05083d48eddf28bdf289e3981 not found: ID does not exist" Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.275232 4854 scope.go:117] "RemoveContainer" 
containerID="307a63eda99192726f7331057f4b16b6d2bce27c85c3a3ed6702a2be2d018470" Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.275820 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h74rg\" (UniqueName: \"kubernetes.io/projected/86de1377-2b1d-4938-ac24-8b7d4f48902d-kube-api-access-h74rg\") pod \"86de1377-2b1d-4938-ac24-8b7d4f48902d\" (UID: \"86de1377-2b1d-4938-ac24-8b7d4f48902d\") " Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.275875 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86de1377-2b1d-4938-ac24-8b7d4f48902d-logs\") pod \"86de1377-2b1d-4938-ac24-8b7d4f48902d\" (UID: \"86de1377-2b1d-4938-ac24-8b7d4f48902d\") " Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.275971 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86de1377-2b1d-4938-ac24-8b7d4f48902d-config-data\") pod \"86de1377-2b1d-4938-ac24-8b7d4f48902d\" (UID: \"86de1377-2b1d-4938-ac24-8b7d4f48902d\") " Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.276019 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86de1377-2b1d-4938-ac24-8b7d4f48902d-combined-ca-bundle\") pod \"86de1377-2b1d-4938-ac24-8b7d4f48902d\" (UID: \"86de1377-2b1d-4938-ac24-8b7d4f48902d\") " Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.276044 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/86de1377-2b1d-4938-ac24-8b7d4f48902d-internal-tls-certs\") pod \"86de1377-2b1d-4938-ac24-8b7d4f48902d\" (UID: \"86de1377-2b1d-4938-ac24-8b7d4f48902d\") " Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.276122 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/86de1377-2b1d-4938-ac24-8b7d4f48902d-public-tls-certs\") pod \"86de1377-2b1d-4938-ac24-8b7d4f48902d\" (UID: \"86de1377-2b1d-4938-ac24-8b7d4f48902d\") " Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.277331 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86de1377-2b1d-4938-ac24-8b7d4f48902d-logs" (OuterVolumeSpecName: "logs") pod "86de1377-2b1d-4938-ac24-8b7d4f48902d" (UID: "86de1377-2b1d-4938-ac24-8b7d4f48902d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.277518 4854 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86de1377-2b1d-4938-ac24-8b7d4f48902d-logs\") on node \"crc\" DevicePath \"\"" Oct 07 12:45:11 crc kubenswrapper[4854]: E1007 12:45:11.281708 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"307a63eda99192726f7331057f4b16b6d2bce27c85c3a3ed6702a2be2d018470\": container with ID starting with 307a63eda99192726f7331057f4b16b6d2bce27c85c3a3ed6702a2be2d018470 not found: ID does not exist" containerID="307a63eda99192726f7331057f4b16b6d2bce27c85c3a3ed6702a2be2d018470" Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.281841 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"307a63eda99192726f7331057f4b16b6d2bce27c85c3a3ed6702a2be2d018470"} err="failed to get container status \"307a63eda99192726f7331057f4b16b6d2bce27c85c3a3ed6702a2be2d018470\": rpc error: code = NotFound desc = could not find container \"307a63eda99192726f7331057f4b16b6d2bce27c85c3a3ed6702a2be2d018470\": container with ID starting with 307a63eda99192726f7331057f4b16b6d2bce27c85c3a3ed6702a2be2d018470 not found: ID does not exist" Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.281867 4854 scope.go:117] "RemoveContainer" containerID="0a1f3aad25966cd187b8b362761b7844752a34a05083d48eddf28bdf289e3981" Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.282261 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a1f3aad25966cd187b8b362761b7844752a34a05083d48eddf28bdf289e3981"} err="failed to get container status \"0a1f3aad25966cd187b8b362761b7844752a34a05083d48eddf28bdf289e3981\": rpc error: code = NotFound desc = could not find container \"0a1f3aad25966cd187b8b362761b7844752a34a05083d48eddf28bdf289e3981\": container with ID starting with 0a1f3aad25966cd187b8b362761b7844752a34a05083d48eddf28bdf289e3981 not found: ID does not exist" Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.282293 4854 scope.go:117] "RemoveContainer" containerID="307a63eda99192726f7331057f4b16b6d2bce27c85c3a3ed6702a2be2d018470" Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.282724 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"307a63eda99192726f7331057f4b16b6d2bce27c85c3a3ed6702a2be2d018470"} err="failed to get container status \"307a63eda99192726f7331057f4b16b6d2bce27c85c3a3ed6702a2be2d018470\": rpc error: code = NotFound desc = could not find container \"307a63eda99192726f7331057f4b16b6d2bce27c85c3a3ed6702a2be2d018470\": container with ID starting with 307a63eda99192726f7331057f4b16b6d2bce27c85c3a3ed6702a2be2d018470 not found: ID does not exist" Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.289362 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86de1377-2b1d-4938-ac24-8b7d4f48902d-kube-api-access-h74rg" (OuterVolumeSpecName: "kube-api-access-h74rg") pod "86de1377-2b1d-4938-ac24-8b7d4f48902d" (UID: "86de1377-2b1d-4938-ac24-8b7d4f48902d"). InnerVolumeSpecName "kube-api-access-h74rg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.340780 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86de1377-2b1d-4938-ac24-8b7d4f48902d-config-data" (OuterVolumeSpecName: "config-data") pod "86de1377-2b1d-4938-ac24-8b7d4f48902d" (UID: "86de1377-2b1d-4938-ac24-8b7d4f48902d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.343352 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86de1377-2b1d-4938-ac24-8b7d4f48902d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "86de1377-2b1d-4938-ac24-8b7d4f48902d" (UID: "86de1377-2b1d-4938-ac24-8b7d4f48902d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.350321 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86de1377-2b1d-4938-ac24-8b7d4f48902d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "86de1377-2b1d-4938-ac24-8b7d4f48902d" (UID: "86de1377-2b1d-4938-ac24-8b7d4f48902d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.360544 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86de1377-2b1d-4938-ac24-8b7d4f48902d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "86de1377-2b1d-4938-ac24-8b7d4f48902d" (UID: "86de1377-2b1d-4938-ac24-8b7d4f48902d"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.379526 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h74rg\" (UniqueName: \"kubernetes.io/projected/86de1377-2b1d-4938-ac24-8b7d4f48902d-kube-api-access-h74rg\") on node \"crc\" DevicePath \"\"" Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.379560 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86de1377-2b1d-4938-ac24-8b7d4f48902d-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.379572 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86de1377-2b1d-4938-ac24-8b7d4f48902d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.379581 4854 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/86de1377-2b1d-4938-ac24-8b7d4f48902d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.379590 4854 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/86de1377-2b1d-4938-ac24-8b7d4f48902d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.577862 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.585311 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.608096 4854 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 07 12:45:11 crc kubenswrapper[4854]: E1007 12:45:11.626855 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1779af00-05df-4865-b313-cd038772c19f" containerName="collect-profiles" Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.626890 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="1779af00-05df-4865-b313-cd038772c19f" containerName="collect-profiles" Oct 07 12:45:11 crc kubenswrapper[4854]: E1007 12:45:11.626925 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86de1377-2b1d-4938-ac24-8b7d4f48902d" containerName="nova-api-log" Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.626932 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="86de1377-2b1d-4938-ac24-8b7d4f48902d" containerName="nova-api-log" Oct 07 12:45:11 crc kubenswrapper[4854]: E1007 12:45:11.626954 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bf3cb62-a4b1-4252-bbee-41e1ccc47dd5" containerName="nova-manage" Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.626960 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bf3cb62-a4b1-4252-bbee-41e1ccc47dd5" containerName="nova-manage" Oct 07 12:45:11 crc kubenswrapper[4854]: E1007 12:45:11.626973 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08918bc4-e9c7-4b84-8983-ea3c5a62aa3b" containerName="init" Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.626979 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="08918bc4-e9c7-4b84-8983-ea3c5a62aa3b" containerName="init" Oct 07 12:45:11 crc kubenswrapper[4854]: E1007 12:45:11.627018 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08918bc4-e9c7-4b84-8983-ea3c5a62aa3b" containerName="dnsmasq-dns" Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.627025 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="08918bc4-e9c7-4b84-8983-ea3c5a62aa3b" containerName="dnsmasq-dns" Oct 07 12:45:11 crc kubenswrapper[4854]: E1007 12:45:11.627034 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86de1377-2b1d-4938-ac24-8b7d4f48902d" containerName="nova-api-api" Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.627043 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="86de1377-2b1d-4938-ac24-8b7d4f48902d" containerName="nova-api-api" Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.627648 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="86de1377-2b1d-4938-ac24-8b7d4f48902d" containerName="nova-api-api" Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.627711 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="86de1377-2b1d-4938-ac24-8b7d4f48902d" containerName="nova-api-log" Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.627739 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="08918bc4-e9c7-4b84-8983-ea3c5a62aa3b" containerName="dnsmasq-dns" Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.627749 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="1779af00-05df-4865-b313-cd038772c19f" containerName="collect-profiles" Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.627792 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bf3cb62-a4b1-4252-bbee-41e1ccc47dd5" containerName="nova-manage" Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.636908 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.641091 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.641476 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.641656 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.643499 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.684787 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lg69z\" (UniqueName: \"kubernetes.io/projected/37dd5983-0d4d-4097-8657-f408e9bc68c0-kube-api-access-lg69z\") pod \"nova-api-0\" (UID: \"37dd5983-0d4d-4097-8657-f408e9bc68c0\") " pod="openstack/nova-api-0" Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.684865 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37dd5983-0d4d-4097-8657-f408e9bc68c0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"37dd5983-0d4d-4097-8657-f408e9bc68c0\") " pod="openstack/nova-api-0" Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.685329 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37dd5983-0d4d-4097-8657-f408e9bc68c0-logs\") pod \"nova-api-0\" (UID: \"37dd5983-0d4d-4097-8657-f408e9bc68c0\") " pod="openstack/nova-api-0" Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.685379 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/37dd5983-0d4d-4097-8657-f408e9bc68c0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"37dd5983-0d4d-4097-8657-f408e9bc68c0\") " pod="openstack/nova-api-0" Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.685402 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37dd5983-0d4d-4097-8657-f408e9bc68c0-config-data\") pod \"nova-api-0\" (UID: \"37dd5983-0d4d-4097-8657-f408e9bc68c0\") " pod="openstack/nova-api-0" Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.686298 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/37dd5983-0d4d-4097-8657-f408e9bc68c0-public-tls-certs\") pod \"nova-api-0\" (UID: \"37dd5983-0d4d-4097-8657-f408e9bc68c0\") " pod="openstack/nova-api-0" Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.788194 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37dd5983-0d4d-4097-8657-f408e9bc68c0-logs\") pod \"nova-api-0\" (UID: \"37dd5983-0d4d-4097-8657-f408e9bc68c0\") " pod="openstack/nova-api-0" Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.788279 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/37dd5983-0d4d-4097-8657-f408e9bc68c0-internal-tls-certs\") pod \"nova-api-0\" (UID: 
\"37dd5983-0d4d-4097-8657-f408e9bc68c0\") " pod="openstack/nova-api-0" Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.788339 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37dd5983-0d4d-4097-8657-f408e9bc68c0-config-data\") pod \"nova-api-0\" (UID: \"37dd5983-0d4d-4097-8657-f408e9bc68c0\") " pod="openstack/nova-api-0" Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.788549 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/37dd5983-0d4d-4097-8657-f408e9bc68c0-public-tls-certs\") pod \"nova-api-0\" (UID: \"37dd5983-0d4d-4097-8657-f408e9bc68c0\") " pod="openstack/nova-api-0" Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.788651 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lg69z\" (UniqueName: \"kubernetes.io/projected/37dd5983-0d4d-4097-8657-f408e9bc68c0-kube-api-access-lg69z\") pod \"nova-api-0\" (UID: \"37dd5983-0d4d-4097-8657-f408e9bc68c0\") " pod="openstack/nova-api-0" Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.788688 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37dd5983-0d4d-4097-8657-f408e9bc68c0-logs\") pod \"nova-api-0\" (UID: \"37dd5983-0d4d-4097-8657-f408e9bc68c0\") " pod="openstack/nova-api-0" Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.788743 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37dd5983-0d4d-4097-8657-f408e9bc68c0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"37dd5983-0d4d-4097-8657-f408e9bc68c0\") " pod="openstack/nova-api-0" Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.793221 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37dd5983-0d4d-4097-8657-f408e9bc68c0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"37dd5983-0d4d-4097-8657-f408e9bc68c0\") " pod="openstack/nova-api-0" Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.794086 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/37dd5983-0d4d-4097-8657-f408e9bc68c0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"37dd5983-0d4d-4097-8657-f408e9bc68c0\") " pod="openstack/nova-api-0" Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.793387 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37dd5983-0d4d-4097-8657-f408e9bc68c0-config-data\") pod \"nova-api-0\" (UID: \"37dd5983-0d4d-4097-8657-f408e9bc68c0\") " pod="openstack/nova-api-0" Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.802626 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/37dd5983-0d4d-4097-8657-f408e9bc68c0-public-tls-certs\") pod \"nova-api-0\" (UID: \"37dd5983-0d4d-4097-8657-f408e9bc68c0\") " pod="openstack/nova-api-0" Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.826052 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lg69z\" (UniqueName: \"kubernetes.io/projected/37dd5983-0d4d-4097-8657-f408e9bc68c0-kube-api-access-lg69z\") pod \"nova-api-0\" (UID: \"37dd5983-0d4d-4097-8657-f408e9bc68c0\") " 
pod="openstack/nova-api-0" Oct 07 12:45:11 crc kubenswrapper[4854]: I1007 12:45:11.966224 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 07 12:45:12 crc kubenswrapper[4854]: I1007 12:45:12.203484 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4f6c2d92-425d-4406-9436-b98f0f0a313f" containerName="nova-metadata-log" containerID="cri-o://66b39f54008ac95dd36d0bd04bcb71d6007fd479b442082345ebd139fb1623e3" gracePeriod=30 Oct 07 12:45:12 crc kubenswrapper[4854]: I1007 12:45:12.203960 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4f6c2d92-425d-4406-9436-b98f0f0a313f" containerName="nova-metadata-metadata" containerID="cri-o://07698d73044f4030e36d18f8657ce12a43b74bcd1e6b2ca2802cb8dc6179c19d" gracePeriod=30 Oct 07 12:45:12 crc kubenswrapper[4854]: I1007 12:45:12.480881 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 07 12:45:12 crc kubenswrapper[4854]: W1007 12:45:12.486077 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37dd5983_0d4d_4097_8657_f408e9bc68c0.slice/crio-5eb854a7a43acaab354b891bdb18b1efb536f1c9425821ac5965dae2fea77fa6 WatchSource:0}: Error finding container 5eb854a7a43acaab354b891bdb18b1efb536f1c9425821ac5965dae2fea77fa6: Status 404 returned error can't find the container with id 5eb854a7a43acaab354b891bdb18b1efb536f1c9425821ac5965dae2fea77fa6 Oct 07 12:45:12 crc kubenswrapper[4854]: I1007 12:45:12.721456 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86de1377-2b1d-4938-ac24-8b7d4f48902d" path="/var/lib/kubelet/pods/86de1377-2b1d-4938-ac24-8b7d4f48902d/volumes" Oct 07 12:45:13 crc kubenswrapper[4854]: I1007 12:45:13.228353 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"37dd5983-0d4d-4097-8657-f408e9bc68c0","Type":"ContainerStarted","Data":"bd7103524f01d027fb1b2f752223346853e431f9b045e756199823ef3b9a9a3d"} Oct 07 12:45:13 crc kubenswrapper[4854]: I1007 12:45:13.228674 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"37dd5983-0d4d-4097-8657-f408e9bc68c0","Type":"ContainerStarted","Data":"6c3add82afc5a4706f36b57114ee421863a370ea330e776fc1686cfde8841f49"} Oct 07 12:45:13 crc kubenswrapper[4854]: I1007 12:45:13.228692 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"37dd5983-0d4d-4097-8657-f408e9bc68c0","Type":"ContainerStarted","Data":"5eb854a7a43acaab354b891bdb18b1efb536f1c9425821ac5965dae2fea77fa6"} Oct 07 12:45:13 crc kubenswrapper[4854]: I1007 12:45:13.232879 4854 generic.go:334] "Generic (PLEG): container finished" podID="4f6c2d92-425d-4406-9436-b98f0f0a313f" containerID="66b39f54008ac95dd36d0bd04bcb71d6007fd479b442082345ebd139fb1623e3" exitCode=143 Oct 07 12:45:13 crc kubenswrapper[4854]: I1007 12:45:13.232953 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4f6c2d92-425d-4406-9436-b98f0f0a313f","Type":"ContainerDied","Data":"66b39f54008ac95dd36d0bd04bcb71d6007fd479b442082345ebd139fb1623e3"} Oct 07 12:45:13 crc kubenswrapper[4854]: I1007 12:45:13.257886 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.257862471 podStartE2EDuration="2.257862471s" podCreationTimestamp="2025-10-07 
12:45:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:45:13.247770635 +0000 UTC m=+1229.235602880" watchObservedRunningTime="2025-10-07 12:45:13.257862471 +0000 UTC m=+1229.245694726" Oct 07 12:45:15 crc kubenswrapper[4854]: I1007 12:45:15.259223 4854 generic.go:334] "Generic (PLEG): container finished" podID="a9b25c4d-8bd7-4135-bf3e-aff177971588" containerID="39620a647b460de5f35393c3799df63e681c9e03d0decae7a6f3e51d1f325986" exitCode=0 Oct 07 12:45:15 crc kubenswrapper[4854]: I1007 12:45:15.259321 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a9b25c4d-8bd7-4135-bf3e-aff177971588","Type":"ContainerDied","Data":"39620a647b460de5f35393c3799df63e681c9e03d0decae7a6f3e51d1f325986"} Oct 07 12:45:15 crc kubenswrapper[4854]: I1007 12:45:15.259868 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a9b25c4d-8bd7-4135-bf3e-aff177971588","Type":"ContainerDied","Data":"a7931ab57de1e6d0d5b3f61e5524fc1387582f8dca784d689925b1d44198bb8e"} Oct 07 12:45:15 crc kubenswrapper[4854]: I1007 12:45:15.259889 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7931ab57de1e6d0d5b3f61e5524fc1387582f8dca784d689925b1d44198bb8e" Oct 07 12:45:15 crc kubenswrapper[4854]: I1007 12:45:15.332871 4854 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="4f6c2d92-425d-4406-9436-b98f0f0a313f" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.192:8775/\": read tcp 10.217.0.2:51362->10.217.0.192:8775: read: connection reset by peer" Oct 07 12:45:15 crc kubenswrapper[4854]: I1007 12:45:15.332941 4854 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="4f6c2d92-425d-4406-9436-b98f0f0a313f" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.192:8775/\": read tcp 10.217.0.2:51368->10.217.0.192:8775: read: connection reset by peer" Oct 07 12:45:15 crc kubenswrapper[4854]: I1007 12:45:15.411042 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 12:45:15 crc kubenswrapper[4854]: I1007 12:45:15.465306 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9b25c4d-8bd7-4135-bf3e-aff177971588-combined-ca-bundle\") pod \"a9b25c4d-8bd7-4135-bf3e-aff177971588\" (UID: \"a9b25c4d-8bd7-4135-bf3e-aff177971588\") " Oct 07 12:45:15 crc kubenswrapper[4854]: I1007 12:45:15.465508 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjfcf\" (UniqueName: \"kubernetes.io/projected/a9b25c4d-8bd7-4135-bf3e-aff177971588-kube-api-access-gjfcf\") pod \"a9b25c4d-8bd7-4135-bf3e-aff177971588\" (UID: \"a9b25c4d-8bd7-4135-bf3e-aff177971588\") " Oct 07 12:45:15 crc kubenswrapper[4854]: I1007 12:45:15.466495 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9b25c4d-8bd7-4135-bf3e-aff177971588-config-data\") pod \"a9b25c4d-8bd7-4135-bf3e-aff177971588\" (UID: \"a9b25c4d-8bd7-4135-bf3e-aff177971588\") " Oct 07 12:45:15 crc kubenswrapper[4854]: I1007 12:45:15.481331 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9b25c4d-8bd7-4135-bf3e-aff177971588-kube-api-access-gjfcf" (OuterVolumeSpecName: "kube-api-access-gjfcf") pod "a9b25c4d-8bd7-4135-bf3e-aff177971588" (UID: "a9b25c4d-8bd7-4135-bf3e-aff177971588"). InnerVolumeSpecName "kube-api-access-gjfcf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:45:15 crc kubenswrapper[4854]: I1007 12:45:15.495214 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9b25c4d-8bd7-4135-bf3e-aff177971588-config-data" (OuterVolumeSpecName: "config-data") pod "a9b25c4d-8bd7-4135-bf3e-aff177971588" (UID: "a9b25c4d-8bd7-4135-bf3e-aff177971588"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:45:15 crc kubenswrapper[4854]: I1007 12:45:15.507324 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9b25c4d-8bd7-4135-bf3e-aff177971588-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a9b25c4d-8bd7-4135-bf3e-aff177971588" (UID: "a9b25c4d-8bd7-4135-bf3e-aff177971588"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:45:15 crc kubenswrapper[4854]: I1007 12:45:15.568607 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9b25c4d-8bd7-4135-bf3e-aff177971588-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:45:15 crc kubenswrapper[4854]: I1007 12:45:15.568638 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjfcf\" (UniqueName: \"kubernetes.io/projected/a9b25c4d-8bd7-4135-bf3e-aff177971588-kube-api-access-gjfcf\") on node \"crc\" DevicePath \"\"" Oct 07 12:45:15 crc kubenswrapper[4854]: I1007 12:45:15.568650 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9b25c4d-8bd7-4135-bf3e-aff177971588-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 12:45:15 crc kubenswrapper[4854]: I1007 12:45:15.831570 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 12:45:15 crc kubenswrapper[4854]: I1007 12:45:15.975844 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f6c2d92-425d-4406-9436-b98f0f0a313f-combined-ca-bundle\") pod \"4f6c2d92-425d-4406-9436-b98f0f0a313f\" (UID: \"4f6c2d92-425d-4406-9436-b98f0f0a313f\") " Oct 07 12:45:15 crc kubenswrapper[4854]: I1007 12:45:15.975904 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f6c2d92-425d-4406-9436-b98f0f0a313f-logs\") pod \"4f6c2d92-425d-4406-9436-b98f0f0a313f\" (UID: \"4f6c2d92-425d-4406-9436-b98f0f0a313f\") " Oct 07 12:45:15 crc kubenswrapper[4854]: I1007 12:45:15.975955 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f6c2d92-425d-4406-9436-b98f0f0a313f-nova-metadata-tls-certs\") pod \"4f6c2d92-425d-4406-9436-b98f0f0a313f\" (UID: \"4f6c2d92-425d-4406-9436-b98f0f0a313f\") " Oct 07 12:45:15 crc kubenswrapper[4854]: I1007 12:45:15.976031 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqbj7\" (UniqueName: \"kubernetes.io/projected/4f6c2d92-425d-4406-9436-b98f0f0a313f-kube-api-access-nqbj7\") pod \"4f6c2d92-425d-4406-9436-b98f0f0a313f\" (UID: \"4f6c2d92-425d-4406-9436-b98f0f0a313f\") " Oct 07 12:45:15 crc kubenswrapper[4854]: I1007 12:45:15.976229 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f6c2d92-425d-4406-9436-b98f0f0a313f-config-data\") pod \"4f6c2d92-425d-4406-9436-b98f0f0a313f\" (UID: \"4f6c2d92-425d-4406-9436-b98f0f0a313f\") " Oct 07 12:45:15 crc kubenswrapper[4854]: I1007 12:45:15.976560 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f6c2d92-425d-4406-9436-b98f0f0a313f-logs" (OuterVolumeSpecName: "logs") pod "4f6c2d92-425d-4406-9436-b98f0f0a313f" (UID: "4f6c2d92-425d-4406-9436-b98f0f0a313f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:45:15 crc kubenswrapper[4854]: I1007 12:45:15.989836 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f6c2d92-425d-4406-9436-b98f0f0a313f-kube-api-access-nqbj7" (OuterVolumeSpecName: "kube-api-access-nqbj7") pod "4f6c2d92-425d-4406-9436-b98f0f0a313f" (UID: "4f6c2d92-425d-4406-9436-b98f0f0a313f"). InnerVolumeSpecName "kube-api-access-nqbj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:45:16 crc kubenswrapper[4854]: I1007 12:45:16.004119 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f6c2d92-425d-4406-9436-b98f0f0a313f-config-data" (OuterVolumeSpecName: "config-data") pod "4f6c2d92-425d-4406-9436-b98f0f0a313f" (UID: "4f6c2d92-425d-4406-9436-b98f0f0a313f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:45:16 crc kubenswrapper[4854]: I1007 12:45:16.008882 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f6c2d92-425d-4406-9436-b98f0f0a313f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4f6c2d92-425d-4406-9436-b98f0f0a313f" (UID: "4f6c2d92-425d-4406-9436-b98f0f0a313f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:45:16 crc kubenswrapper[4854]: I1007 12:45:16.043018 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f6c2d92-425d-4406-9436-b98f0f0a313f-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "4f6c2d92-425d-4406-9436-b98f0f0a313f" (UID: "4f6c2d92-425d-4406-9436-b98f0f0a313f"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:45:16 crc kubenswrapper[4854]: I1007 12:45:16.079396 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f6c2d92-425d-4406-9436-b98f0f0a313f-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 12:45:16 crc kubenswrapper[4854]: I1007 12:45:16.079447 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f6c2d92-425d-4406-9436-b98f0f0a313f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:45:16 crc kubenswrapper[4854]: I1007 12:45:16.079462 4854 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f6c2d92-425d-4406-9436-b98f0f0a313f-logs\") on node \"crc\" DevicePath \"\"" Oct 07 12:45:16 crc kubenswrapper[4854]: I1007 12:45:16.079474 4854 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f6c2d92-425d-4406-9436-b98f0f0a313f-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 12:45:16 crc kubenswrapper[4854]: I1007 12:45:16.079486 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqbj7\" (UniqueName: \"kubernetes.io/projected/4f6c2d92-425d-4406-9436-b98f0f0a313f-kube-api-access-nqbj7\") on node \"crc\" DevicePath \"\"" Oct 07 12:45:16 crc kubenswrapper[4854]: I1007 12:45:16.273388 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 12:45:16 crc kubenswrapper[4854]: I1007 12:45:16.276556 4854 generic.go:334] "Generic (PLEG): container finished" podID="4f6c2d92-425d-4406-9436-b98f0f0a313f" containerID="07698d73044f4030e36d18f8657ce12a43b74bcd1e6b2ca2802cb8dc6179c19d" exitCode=0 Oct 07 12:45:16 crc kubenswrapper[4854]: I1007 12:45:16.276630 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4f6c2d92-425d-4406-9436-b98f0f0a313f","Type":"ContainerDied","Data":"07698d73044f4030e36d18f8657ce12a43b74bcd1e6b2ca2802cb8dc6179c19d"} Oct 07 12:45:16 crc kubenswrapper[4854]: I1007 12:45:16.276708 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4f6c2d92-425d-4406-9436-b98f0f0a313f","Type":"ContainerDied","Data":"761ad40ba1f8ab89eeb04811a35954452820d329763fa49ec7f5120628991d06"} Oct 07 12:45:16 crc kubenswrapper[4854]: I1007 12:45:16.276710 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 12:45:16 crc kubenswrapper[4854]: I1007 12:45:16.276734 4854 scope.go:117] "RemoveContainer" containerID="07698d73044f4030e36d18f8657ce12a43b74bcd1e6b2ca2802cb8dc6179c19d" Oct 07 12:45:16 crc kubenswrapper[4854]: I1007 12:45:16.314518 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 12:45:16 crc kubenswrapper[4854]: I1007 12:45:16.320316 4854 scope.go:117] "RemoveContainer" containerID="66b39f54008ac95dd36d0bd04bcb71d6007fd479b442082345ebd139fb1623e3" Oct 07 12:45:16 crc kubenswrapper[4854]: I1007 12:45:16.339815 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 12:45:16 crc kubenswrapper[4854]: I1007 12:45:16.345110 4854 scope.go:117] "RemoveContainer" containerID="07698d73044f4030e36d18f8657ce12a43b74bcd1e6b2ca2802cb8dc6179c19d" Oct 07 12:45:16 crc kubenswrapper[4854]: E1007 12:45:16.345542 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07698d73044f4030e36d18f8657ce12a43b74bcd1e6b2ca2802cb8dc6179c19d\": container with ID starting with 07698d73044f4030e36d18f8657ce12a43b74bcd1e6b2ca2802cb8dc6179c19d not found: ID does not exist" containerID="07698d73044f4030e36d18f8657ce12a43b74bcd1e6b2ca2802cb8dc6179c19d" Oct 07 12:45:16 crc kubenswrapper[4854]: I1007 12:45:16.345574 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07698d73044f4030e36d18f8657ce12a43b74bcd1e6b2ca2802cb8dc6179c19d"} err="failed to get container status \"07698d73044f4030e36d18f8657ce12a43b74bcd1e6b2ca2802cb8dc6179c19d\": rpc error: code = NotFound desc = could not find container \"07698d73044f4030e36d18f8657ce12a43b74bcd1e6b2ca2802cb8dc6179c19d\": container with ID starting with 07698d73044f4030e36d18f8657ce12a43b74bcd1e6b2ca2802cb8dc6179c19d not found: ID does not exist" Oct 07 12:45:16 crc kubenswrapper[4854]: I1007 12:45:16.345596 4854 scope.go:117] "RemoveContainer" containerID="66b39f54008ac95dd36d0bd04bcb71d6007fd479b442082345ebd139fb1623e3" Oct 07 12:45:16 crc kubenswrapper[4854]: E1007 12:45:16.345774 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66b39f54008ac95dd36d0bd04bcb71d6007fd479b442082345ebd139fb1623e3\": container with ID starting with 66b39f54008ac95dd36d0bd04bcb71d6007fd479b442082345ebd139fb1623e3 not found: ID does not exist" containerID="66b39f54008ac95dd36d0bd04bcb71d6007fd479b442082345ebd139fb1623e3" Oct 07 12:45:16 crc kubenswrapper[4854]: I1007 12:45:16.345796 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66b39f54008ac95dd36d0bd04bcb71d6007fd479b442082345ebd139fb1623e3"} err="failed to get container status \"66b39f54008ac95dd36d0bd04bcb71d6007fd479b442082345ebd139fb1623e3\": rpc error: code = NotFound desc = could not find container \"66b39f54008ac95dd36d0bd04bcb71d6007fd479b442082345ebd139fb1623e3\": container with ID starting with 66b39f54008ac95dd36d0bd04bcb71d6007fd479b442082345ebd139fb1623e3 not found: ID does not exist" Oct 07 12:45:16 crc kubenswrapper[4854]: I1007 12:45:16.387915 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 12:45:16 crc kubenswrapper[4854]: I1007 12:45:16.396748 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 12:45:16 crc kubenswrapper[4854]: I1007 
12:45:16.405378 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 07 12:45:16 crc kubenswrapper[4854]: E1007 12:45:16.405799 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9b25c4d-8bd7-4135-bf3e-aff177971588" containerName="nova-scheduler-scheduler" Oct 07 12:45:16 crc kubenswrapper[4854]: I1007 12:45:16.405816 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9b25c4d-8bd7-4135-bf3e-aff177971588" containerName="nova-scheduler-scheduler" Oct 07 12:45:16 crc kubenswrapper[4854]: E1007 12:45:16.405839 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f6c2d92-425d-4406-9436-b98f0f0a313f" containerName="nova-metadata-log" Oct 07 12:45:16 crc kubenswrapper[4854]: I1007 12:45:16.405846 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f6c2d92-425d-4406-9436-b98f0f0a313f" containerName="nova-metadata-log" Oct 07 12:45:16 crc kubenswrapper[4854]: E1007 12:45:16.405866 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f6c2d92-425d-4406-9436-b98f0f0a313f" containerName="nova-metadata-metadata" Oct 07 12:45:16 crc kubenswrapper[4854]: I1007 12:45:16.405873 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f6c2d92-425d-4406-9436-b98f0f0a313f" containerName="nova-metadata-metadata" Oct 07 12:45:16 crc kubenswrapper[4854]: I1007 12:45:16.406038 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f6c2d92-425d-4406-9436-b98f0f0a313f" containerName="nova-metadata-log" Oct 07 12:45:16 crc kubenswrapper[4854]: I1007 12:45:16.406056 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f6c2d92-425d-4406-9436-b98f0f0a313f" containerName="nova-metadata-metadata" Oct 07 12:45:16 crc kubenswrapper[4854]: I1007 12:45:16.406075 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9b25c4d-8bd7-4135-bf3e-aff177971588" containerName="nova-scheduler-scheduler" Oct 07 12:45:16 crc kubenswrapper[4854]: I1007 12:45:16.407234 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 12:45:16 crc kubenswrapper[4854]: I1007 12:45:16.409540 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 07 12:45:16 crc kubenswrapper[4854]: I1007 12:45:16.409702 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 07 12:45:16 crc kubenswrapper[4854]: I1007 12:45:16.411945 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 12:45:16 crc kubenswrapper[4854]: I1007 12:45:16.412995 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 12:45:16 crc kubenswrapper[4854]: I1007 12:45:16.422351 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 12:45:16 crc kubenswrapper[4854]: I1007 12:45:16.422434 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 07 12:45:16 crc kubenswrapper[4854]: I1007 12:45:16.427901 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 12:45:16 crc kubenswrapper[4854]: I1007 12:45:16.487864 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttgk7\" (UniqueName: \"kubernetes.io/projected/770ca0a9-4c48-446b-be08-84b06d20d501-kube-api-access-ttgk7\") pod \"nova-metadata-0\" (UID: \"770ca0a9-4c48-446b-be08-84b06d20d501\") " pod="openstack/nova-metadata-0" Oct 07 12:45:16 crc kubenswrapper[4854]: I1007 12:45:16.487931 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/770ca0a9-4c48-446b-be08-84b06d20d501-logs\") pod \"nova-metadata-0\" (UID: \"770ca0a9-4c48-446b-be08-84b06d20d501\") " pod="openstack/nova-metadata-0" Oct 07 12:45:16 crc kubenswrapper[4854]: I1007 12:45:16.488047 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/770ca0a9-4c48-446b-be08-84b06d20d501-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"770ca0a9-4c48-446b-be08-84b06d20d501\") " pod="openstack/nova-metadata-0" Oct 07 12:45:16 crc kubenswrapper[4854]: I1007 12:45:16.488282 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/016c2264-9ba4-48c0-b416-02c468232b6b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"016c2264-9ba4-48c0-b416-02c468232b6b\") " pod="openstack/nova-scheduler-0" Oct 07 12:45:16 crc kubenswrapper[4854]: I1007 12:45:16.488438 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/770ca0a9-4c48-446b-be08-84b06d20d501-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"770ca0a9-4c48-446b-be08-84b06d20d501\") " pod="openstack/nova-metadata-0" Oct 07 12:45:16 crc kubenswrapper[4854]: I1007 12:45:16.488478 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/016c2264-9ba4-48c0-b416-02c468232b6b-config-data\") pod \"nova-scheduler-0\" (UID: \"016c2264-9ba4-48c0-b416-02c468232b6b\") " pod="openstack/nova-scheduler-0" Oct 07 12:45:16 crc kubenswrapper[4854]: I1007 12:45:16.488502 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8hv7\" (UniqueName: \"kubernetes.io/projected/016c2264-9ba4-48c0-b416-02c468232b6b-kube-api-access-c8hv7\") pod \"nova-scheduler-0\" (UID: \"016c2264-9ba4-48c0-b416-02c468232b6b\") " pod="openstack/nova-scheduler-0" Oct 07 12:45:16 crc kubenswrapper[4854]: I1007 12:45:16.488527 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/770ca0a9-4c48-446b-be08-84b06d20d501-config-data\") pod \"nova-metadata-0\" (UID: 
\"770ca0a9-4c48-446b-be08-84b06d20d501\") " pod="openstack/nova-metadata-0" Oct 07 12:45:16 crc kubenswrapper[4854]: I1007 12:45:16.590976 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/770ca0a9-4c48-446b-be08-84b06d20d501-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"770ca0a9-4c48-446b-be08-84b06d20d501\") " pod="openstack/nova-metadata-0" Oct 07 12:45:16 crc kubenswrapper[4854]: I1007 12:45:16.591032 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/016c2264-9ba4-48c0-b416-02c468232b6b-config-data\") pod \"nova-scheduler-0\" (UID: \"016c2264-9ba4-48c0-b416-02c468232b6b\") " pod="openstack/nova-scheduler-0" Oct 07 12:45:16 crc kubenswrapper[4854]: I1007 12:45:16.591060 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8hv7\" (UniqueName: \"kubernetes.io/projected/016c2264-9ba4-48c0-b416-02c468232b6b-kube-api-access-c8hv7\") pod \"nova-scheduler-0\" (UID: \"016c2264-9ba4-48c0-b416-02c468232b6b\") " pod="openstack/nova-scheduler-0" Oct 07 12:45:16 crc kubenswrapper[4854]: I1007 12:45:16.591095 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/770ca0a9-4c48-446b-be08-84b06d20d501-config-data\") pod \"nova-metadata-0\" (UID: \"770ca0a9-4c48-446b-be08-84b06d20d501\") " pod="openstack/nova-metadata-0" Oct 07 12:45:16 crc kubenswrapper[4854]: I1007 12:45:16.591202 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttgk7\" (UniqueName: \"kubernetes.io/projected/770ca0a9-4c48-446b-be08-84b06d20d501-kube-api-access-ttgk7\") pod \"nova-metadata-0\" (UID: \"770ca0a9-4c48-446b-be08-84b06d20d501\") " pod="openstack/nova-metadata-0" Oct 07 12:45:16 crc kubenswrapper[4854]: I1007 12:45:16.591294 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/770ca0a9-4c48-446b-be08-84b06d20d501-logs\") pod \"nova-metadata-0\" (UID: \"770ca0a9-4c48-446b-be08-84b06d20d501\") " pod="openstack/nova-metadata-0" Oct 07 12:45:16 crc kubenswrapper[4854]: I1007 12:45:16.591866 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/770ca0a9-4c48-446b-be08-84b06d20d501-logs\") pod \"nova-metadata-0\" (UID: \"770ca0a9-4c48-446b-be08-84b06d20d501\") " pod="openstack/nova-metadata-0" Oct 07 12:45:16 crc kubenswrapper[4854]: I1007 12:45:16.592215 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/770ca0a9-4c48-446b-be08-84b06d20d501-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"770ca0a9-4c48-446b-be08-84b06d20d501\") " pod="openstack/nova-metadata-0" Oct 07 12:45:16 crc kubenswrapper[4854]: I1007 12:45:16.592935 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/016c2264-9ba4-48c0-b416-02c468232b6b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"016c2264-9ba4-48c0-b416-02c468232b6b\") " pod="openstack/nova-scheduler-0" Oct 07 12:45:16 crc kubenswrapper[4854]: I1007 12:45:16.595421 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/770ca0a9-4c48-446b-be08-84b06d20d501-config-data\") pod \"nova-metadata-0\" (UID: \"770ca0a9-4c48-446b-be08-84b06d20d501\") " pod="openstack/nova-metadata-0" Oct 07 12:45:16 crc kubenswrapper[4854]: I1007 12:45:16.595676 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/770ca0a9-4c48-446b-be08-84b06d20d501-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"770ca0a9-4c48-446b-be08-84b06d20d501\") " pod="openstack/nova-metadata-0" Oct 07 12:45:16 crc kubenswrapper[4854]: I1007 12:45:16.595751 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/770ca0a9-4c48-446b-be08-84b06d20d501-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"770ca0a9-4c48-446b-be08-84b06d20d501\") " pod="openstack/nova-metadata-0" Oct 07 12:45:16 crc kubenswrapper[4854]: I1007 12:45:16.596884 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/016c2264-9ba4-48c0-b416-02c468232b6b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"016c2264-9ba4-48c0-b416-02c468232b6b\") " pod="openstack/nova-scheduler-0" Oct 07 12:45:16 crc kubenswrapper[4854]: I1007 12:45:16.596950 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/016c2264-9ba4-48c0-b416-02c468232b6b-config-data\") pod \"nova-scheduler-0\" (UID: \"016c2264-9ba4-48c0-b416-02c468232b6b\") " pod="openstack/nova-scheduler-0" Oct 07 12:45:16 crc kubenswrapper[4854]: I1007 12:45:16.608379 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttgk7\" (UniqueName: \"kubernetes.io/projected/770ca0a9-4c48-446b-be08-84b06d20d501-kube-api-access-ttgk7\") pod \"nova-metadata-0\" (UID: \"770ca0a9-4c48-446b-be08-84b06d20d501\") " pod="openstack/nova-metadata-0" Oct 07 12:45:16 crc kubenswrapper[4854]: I1007 12:45:16.608478 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8hv7\" (UniqueName: \"kubernetes.io/projected/016c2264-9ba4-48c0-b416-02c468232b6b-kube-api-access-c8hv7\") pod \"nova-scheduler-0\" (UID: \"016c2264-9ba4-48c0-b416-02c468232b6b\") " pod="openstack/nova-scheduler-0" Oct 07 12:45:16 crc kubenswrapper[4854]: I1007 12:45:16.713305 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f6c2d92-425d-4406-9436-b98f0f0a313f" path="/var/lib/kubelet/pods/4f6c2d92-425d-4406-9436-b98f0f0a313f/volumes" Oct 07 12:45:16 crc kubenswrapper[4854]: I1007 12:45:16.713992 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9b25c4d-8bd7-4135-bf3e-aff177971588" path="/var/lib/kubelet/pods/a9b25c4d-8bd7-4135-bf3e-aff177971588/volumes" Oct 07 12:45:16 crc kubenswrapper[4854]: I1007 12:45:16.732395 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 12:45:16 crc kubenswrapper[4854]: I1007 12:45:16.740530 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 12:45:17 crc kubenswrapper[4854]: I1007 12:45:17.188966 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 12:45:17 crc kubenswrapper[4854]: W1007 12:45:17.260655 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod016c2264_9ba4_48c0_b416_02c468232b6b.slice/crio-e6c6a2e60341c34ca76475fd4d69991f224cb18e5de374d907f485fb628b47a0 WatchSource:0}: Error finding container e6c6a2e60341c34ca76475fd4d69991f224cb18e5de374d907f485fb628b47a0: Status 404 returned error can't find the container with id e6c6a2e60341c34ca76475fd4d69991f224cb18e5de374d907f485fb628b47a0 Oct 07 12:45:17 crc kubenswrapper[4854]: I1007 12:45:17.261846 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 12:45:17 crc kubenswrapper[4854]: I1007 12:45:17.298344 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"016c2264-9ba4-48c0-b416-02c468232b6b","Type":"ContainerStarted","Data":"e6c6a2e60341c34ca76475fd4d69991f224cb18e5de374d907f485fb628b47a0"} Oct 07 12:45:17 crc kubenswrapper[4854]: I1007 12:45:17.299610 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"770ca0a9-4c48-446b-be08-84b06d20d501","Type":"ContainerStarted","Data":"f73202056a1e805a5e2c0e483143fd82b536fa11f7554dece33ccf0abc6bbe11"} Oct 07 12:45:18 crc kubenswrapper[4854]: I1007 12:45:18.315104 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"770ca0a9-4c48-446b-be08-84b06d20d501","Type":"ContainerStarted","Data":"16fe06c197602fe7fb28e9bd7f56721c49f40f87574fc4f159cb1d7c6685606e"} Oct 07 12:45:18 crc kubenswrapper[4854]: I1007 12:45:18.315471 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"770ca0a9-4c48-446b-be08-84b06d20d501","Type":"ContainerStarted","Data":"6cbd3121741a98fa970c3ecfd5f8c014c006ce0662866c5090335c8dec86fdc4"} Oct 07 12:45:18 crc kubenswrapper[4854]: I1007 12:45:18.317040 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"016c2264-9ba4-48c0-b416-02c468232b6b","Type":"ContainerStarted","Data":"b33111a1b0f14dc72a4fc34de5d78d44cbf9938a6e3632fa063f8bb7985aeaf3"} Oct 07 12:45:18 crc kubenswrapper[4854]: I1007 12:45:18.346577 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.346556125 podStartE2EDuration="2.346556125s" podCreationTimestamp="2025-10-07 12:45:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:45:18.335021067 +0000 UTC m=+1234.322853342" watchObservedRunningTime="2025-10-07 12:45:18.346556125 +0000 UTC m=+1234.334388380" Oct 07 12:45:18 crc kubenswrapper[4854]: I1007 12:45:18.379368 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.379340744 podStartE2EDuration="2.379340744s" podCreationTimestamp="2025-10-07 12:45:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:45:18.370433014 +0000 UTC m=+1234.358265269" watchObservedRunningTime="2025-10-07 12:45:18.379340744 +0000 UTC m=+1234.367173019" Oct 07 12:45:21 crc 
kubenswrapper[4854]: I1007 12:45:21.733461 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 07 12:45:21 crc kubenswrapper[4854]: I1007 12:45:21.735158 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 07 12:45:21 crc kubenswrapper[4854]: I1007 12:45:21.741138 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 07 12:45:21 crc kubenswrapper[4854]: I1007 12:45:21.968725 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 07 12:45:21 crc kubenswrapper[4854]: I1007 12:45:21.968773 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 07 12:45:23 crc kubenswrapper[4854]: I1007 12:45:23.016407 4854 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="37dd5983-0d4d-4097-8657-f408e9bc68c0" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.198:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 07 12:45:23 crc kubenswrapper[4854]: I1007 12:45:23.016435 4854 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="37dd5983-0d4d-4097-8657-f408e9bc68c0" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.198:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 07 12:45:26 crc kubenswrapper[4854]: I1007 12:45:26.733616 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 07 12:45:26 crc kubenswrapper[4854]: I1007 12:45:26.734119 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 07 12:45:26 crc kubenswrapper[4854]: I1007 12:45:26.740824 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 07 12:45:26 crc kubenswrapper[4854]: I1007 12:45:26.767890 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 07 12:45:27 crc kubenswrapper[4854]: I1007 12:45:27.465247 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 07 12:45:27 crc kubenswrapper[4854]: I1007 12:45:27.747385 4854 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="770ca0a9-4c48-446b-be08-84b06d20d501" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.199:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 07 12:45:27 crc kubenswrapper[4854]: I1007 12:45:27.747388 4854 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="770ca0a9-4c48-446b-be08-84b06d20d501" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.199:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 07 12:45:31 crc kubenswrapper[4854]: I1007 12:45:31.679443 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 07 12:45:31 crc kubenswrapper[4854]: I1007 12:45:31.974654 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 07 12:45:31 crc kubenswrapper[4854]: I1007 12:45:31.975139 4854 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 07 12:45:31 crc kubenswrapper[4854]: I1007 12:45:31.982397 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 07 12:45:31 crc kubenswrapper[4854]: I1007 12:45:31.983077 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 07 12:45:32 crc kubenswrapper[4854]: I1007 12:45:32.465874 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 07 12:45:32 crc kubenswrapper[4854]: I1007 12:45:32.475688 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 07 12:45:36 crc kubenswrapper[4854]: I1007 12:45:36.741865 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 07 12:45:36 crc kubenswrapper[4854]: I1007 12:45:36.747346 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 07 12:45:36 crc kubenswrapper[4854]: I1007 12:45:36.756842 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 07 12:45:37 crc kubenswrapper[4854]: I1007 12:45:37.539246 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 07 12:45:40 crc kubenswrapper[4854]: I1007 12:45:40.807578 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 12:45:40 crc kubenswrapper[4854]: I1007 12:45:40.807924 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 12:45:40 crc kubenswrapper[4854]: I1007 12:45:40.807978 4854 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" Oct 07 12:45:40 crc kubenswrapper[4854]: I1007 12:45:40.809065 4854 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"294ebdf1eb05d7d685f741557c635439a170a1959729e348cac4935535da7799"} pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 12:45:40 crc kubenswrapper[4854]: I1007 12:45:40.809136 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" containerID="cri-o://294ebdf1eb05d7d685f741557c635439a170a1959729e348cac4935535da7799" gracePeriod=600 Oct 07 12:45:41 crc kubenswrapper[4854]: I1007 12:45:41.576788 4854 generic.go:334] "Generic (PLEG): container finished" podID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerID="294ebdf1eb05d7d685f741557c635439a170a1959729e348cac4935535da7799" exitCode=0 Oct 07 12:45:41 crc kubenswrapper[4854]: I1007 12:45:41.576839 4854 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" event={"ID":"40b8b82d-cfd5-41d7-8673-5774db092c85","Type":"ContainerDied","Data":"294ebdf1eb05d7d685f741557c635439a170a1959729e348cac4935535da7799"} Oct 07 12:45:41 crc kubenswrapper[4854]: I1007 12:45:41.577269 4854 scope.go:117] "RemoveContainer" containerID="ce4378e583bfb2c39ca2c2683e3f5d09095f8b3086e8373f80ece7dbb8bdaa5a" Oct 07 12:45:42 crc kubenswrapper[4854]: I1007 12:45:42.587530 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" event={"ID":"40b8b82d-cfd5-41d7-8673-5774db092c85","Type":"ContainerStarted","Data":"00a7905fded3000a5bba04fc59df5ffada6a53acc5865f31e13e38f5ecdf35c6"} Oct 07 12:45:58 crc kubenswrapper[4854]: I1007 12:45:58.783637 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 12:45:58 crc kubenswrapper[4854]: I1007 12:45:58.794548 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="21065050-7bdc-4f4e-9a7b-9dbcc2dab200" containerName="cinder-scheduler" containerID="cri-o://32ad432a8d636a25c21369c163b0875cef6bace27819c32cee125a524960df32" gracePeriod=30 Oct 07 12:45:58 crc kubenswrapper[4854]: I1007 12:45:58.794622 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="21065050-7bdc-4f4e-9a7b-9dbcc2dab200" containerName="probe" containerID="cri-o://3cca971c6b0c47f58de9e946459a6537fae5efe97def4c9f7b9f8c2749fe2fc8" gracePeriod=30 Oct 07 12:45:58 crc kubenswrapper[4854]: I1007 12:45:58.864507 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Oct 07 12:45:58 crc kubenswrapper[4854]: I1007 12:45:58.864752 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="0416b15d-a0a1-4bf2-bd86-4209c14c8e48" containerName="openstackclient" containerID="cri-o://d08d57f3c4a2642e473841c6986c3097f5c785af5a96f4a083b730e4a989d74c" gracePeriod=2 Oct 07 12:45:58 crc kubenswrapper[4854]: I1007 12:45:58.891327 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Oct 07 12:45:58 crc kubenswrapper[4854]: I1007 12:45:58.917253 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 07 12:45:58 crc kubenswrapper[4854]: I1007 12:45:58.917514 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="2bcb9c6a-b0b3-438e-9e00-b3706ea71adf" containerName="cinder-api-log" containerID="cri-o://0775213d263bbb54ab38859826cdc168c18bef5c43a091d586b83ae90efc35b8" gracePeriod=30 Oct 07 12:45:58 crc kubenswrapper[4854]: I1007 12:45:58.917937 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="2bcb9c6a-b0b3-438e-9e00-b3706ea71adf" containerName="cinder-api" containerID="cri-o://66d61a36b5d71d29e4fc833793e7b8e2d512bb92c6cdc2ceee1421b70515026b" gracePeriod=30 Oct 07 12:45:58 crc kubenswrapper[4854]: I1007 12:45:58.949926 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 07 12:45:58 crc kubenswrapper[4854]: I1007 12:45:58.972104 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-75fd88c566-5j4xn"] Oct 07 12:45:58 crc kubenswrapper[4854]: E1007 12:45:58.972638 4854 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0416b15d-a0a1-4bf2-bd86-4209c14c8e48" containerName="openstackclient" Oct 07 12:45:58 crc kubenswrapper[4854]: I1007 12:45:58.972656 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="0416b15d-a0a1-4bf2-bd86-4209c14c8e48" containerName="openstackclient" Oct 07 12:45:58 crc kubenswrapper[4854]: I1007 12:45:58.972872 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="0416b15d-a0a1-4bf2-bd86-4209c14c8e48" containerName="openstackclient" Oct 07 12:45:58 crc kubenswrapper[4854]: I1007 12:45:58.973907 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-75fd88c566-5j4xn" Oct 07 12:45:58 crc kubenswrapper[4854]: I1007 12:45:58.989423 4854 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="2bcb9c6a-b0b3-438e-9e00-b3706ea71adf" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.172:8776/healthcheck\": EOF" Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.016659 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-65b8874fd7-dnnjf"] Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.031233 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-65b8874fd7-dnnjf"] Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.031327 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-65b8874fd7-dnnjf" Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.043078 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-75fd88c566-5j4xn"] Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.052288 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-d4cpx"] Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.052565 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-d4cpx" podUID="47f8159b-e07f-47bd-92e8-a57f3e0c545d" containerName="openstack-network-exporter" containerID="cri-o://3e6647d1edb8724d040f38fe0892860ca4c3bb6eab536177c27f549d3f099144" gracePeriod=30 Oct 07 12:45:59 crc kubenswrapper[4854]: E1007 12:45:59.075497 4854 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 07 12:45:59 crc kubenswrapper[4854]: E1007 12:45:59.075579 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-config-data podName:4c293f13-b2a5-4d4b-9f69-fd118e34eab2 nodeName:}" failed. No retries permitted until 2025-10-07 12:45:59.575556513 +0000 UTC m=+1275.563388768 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-config-data") pod "rabbitmq-server-0" (UID: "4c293f13-b2a5-4d4b-9f69-fd118e34eab2") : configmap "rabbitmq-config-data" not found Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.114662 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance3267-account-delete-cwdxm"] Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.126183 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance3267-account-delete-cwdxm" Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.137512 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance3267-account-delete-cwdxm"] Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.150225 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-hllqq"] Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.167238 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-j5h2b"] Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.180610 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dw68d\" (UniqueName: \"kubernetes.io/projected/04ffb838-3774-402c-9cdf-d11e51fb21e5-kube-api-access-dw68d\") pod \"barbican-keystone-listener-75fd88c566-5j4xn\" (UID: \"04ffb838-3774-402c-9cdf-d11e51fb21e5\") " pod="openstack/barbican-keystone-listener-75fd88c566-5j4xn" Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.180722 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5179f78d-3a8f-4621-95f3-e147ff8da79f-config-data-custom\") pod \"barbican-worker-65b8874fd7-dnnjf\" (UID: \"5179f78d-3a8f-4621-95f3-e147ff8da79f\") " pod="openstack/barbican-worker-65b8874fd7-dnnjf" Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.180796 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/04ffb838-3774-402c-9cdf-d11e51fb21e5-config-data-custom\") pod \"barbican-keystone-listener-75fd88c566-5j4xn\" (UID: \"04ffb838-3774-402c-9cdf-d11e51fb21e5\") " pod="openstack/barbican-keystone-listener-75fd88c566-5j4xn" Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.184017 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5179f78d-3a8f-4621-95f3-e147ff8da79f-logs\") pod \"barbican-worker-65b8874fd7-dnnjf\" (UID: \"5179f78d-3a8f-4621-95f3-e147ff8da79f\") " pod="openstack/barbican-worker-65b8874fd7-dnnjf" Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.185982 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5179f78d-3a8f-4621-95f3-e147ff8da79f-config-data\") pod \"barbican-worker-65b8874fd7-dnnjf\" (UID: \"5179f78d-3a8f-4621-95f3-e147ff8da79f\") " pod="openstack/barbican-worker-65b8874fd7-dnnjf" Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.186421 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04ffb838-3774-402c-9cdf-d11e51fb21e5-config-data\") pod \"barbican-keystone-listener-75fd88c566-5j4xn\" (UID: \"04ffb838-3774-402c-9cdf-d11e51fb21e5\") " pod="openstack/barbican-keystone-listener-75fd88c566-5j4xn" Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.186456 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drm6n\" (UniqueName: \"kubernetes.io/projected/5179f78d-3a8f-4621-95f3-e147ff8da79f-kube-api-access-drm6n\") pod \"barbican-worker-65b8874fd7-dnnjf\" (UID: \"5179f78d-3a8f-4621-95f3-e147ff8da79f\") " pod="openstack/barbican-worker-65b8874fd7-dnnjf" Oct 07 12:45:59 crc 
kubenswrapper[4854]: I1007 12:45:59.186615 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5179f78d-3a8f-4621-95f3-e147ff8da79f-combined-ca-bundle\") pod \"barbican-worker-65b8874fd7-dnnjf\" (UID: \"5179f78d-3a8f-4621-95f3-e147ff8da79f\") " pod="openstack/barbican-worker-65b8874fd7-dnnjf" Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.186745 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7c4598cc6b-tl8vg"] Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.187028 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04ffb838-3774-402c-9cdf-d11e51fb21e5-combined-ca-bundle\") pod \"barbican-keystone-listener-75fd88c566-5j4xn\" (UID: \"04ffb838-3774-402c-9cdf-d11e51fb21e5\") " pod="openstack/barbican-keystone-listener-75fd88c566-5j4xn" Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.187096 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04ffb838-3774-402c-9cdf-d11e51fb21e5-logs\") pod \"barbican-keystone-listener-75fd88c566-5j4xn\" (UID: \"04ffb838-3774-402c-9cdf-d11e51fb21e5\") " pod="openstack/barbican-keystone-listener-75fd88c566-5j4xn" Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.201688 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7c4598cc6b-tl8vg" Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.211910 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7c4598cc6b-tl8vg"] Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.279459 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-4jdkf"] Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.289517 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/04ffb838-3774-402c-9cdf-d11e51fb21e5-config-data-custom\") pod \"barbican-keystone-listener-75fd88c566-5j4xn\" (UID: \"04ffb838-3774-402c-9cdf-d11e51fb21e5\") " pod="openstack/barbican-keystone-listener-75fd88c566-5j4xn" Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.289614 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5179f78d-3a8f-4621-95f3-e147ff8da79f-logs\") pod \"barbican-worker-65b8874fd7-dnnjf\" (UID: \"5179f78d-3a8f-4621-95f3-e147ff8da79f\") " pod="openstack/barbican-worker-65b8874fd7-dnnjf" Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.289655 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5179f78d-3a8f-4621-95f3-e147ff8da79f-config-data\") pod \"barbican-worker-65b8874fd7-dnnjf\" (UID: \"5179f78d-3a8f-4621-95f3-e147ff8da79f\") " pod="openstack/barbican-worker-65b8874fd7-dnnjf" Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.289736 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04ffb838-3774-402c-9cdf-d11e51fb21e5-config-data\") pod \"barbican-keystone-listener-75fd88c566-5j4xn\" (UID: \"04ffb838-3774-402c-9cdf-d11e51fb21e5\") " pod="openstack/barbican-keystone-listener-75fd88c566-5j4xn" Oct 07 12:45:59 crc 
kubenswrapper[4854]: I1007 12:45:59.289764 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drm6n\" (UniqueName: \"kubernetes.io/projected/5179f78d-3a8f-4621-95f3-e147ff8da79f-kube-api-access-drm6n\") pod \"barbican-worker-65b8874fd7-dnnjf\" (UID: \"5179f78d-3a8f-4621-95f3-e147ff8da79f\") " pod="openstack/barbican-worker-65b8874fd7-dnnjf" Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.289795 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kd7dp\" (UniqueName: \"kubernetes.io/projected/e7453b38-f6c3-4fe7-b15d-5bd8112dc687-kube-api-access-kd7dp\") pod \"glance3267-account-delete-cwdxm\" (UID: \"e7453b38-f6c3-4fe7-b15d-5bd8112dc687\") " pod="openstack/glance3267-account-delete-cwdxm" Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.289820 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5179f78d-3a8f-4621-95f3-e147ff8da79f-combined-ca-bundle\") pod \"barbican-worker-65b8874fd7-dnnjf\" (UID: \"5179f78d-3a8f-4621-95f3-e147ff8da79f\") " pod="openstack/barbican-worker-65b8874fd7-dnnjf" Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.289919 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04ffb838-3774-402c-9cdf-d11e51fb21e5-combined-ca-bundle\") pod \"barbican-keystone-listener-75fd88c566-5j4xn\" (UID: \"04ffb838-3774-402c-9cdf-d11e51fb21e5\") " pod="openstack/barbican-keystone-listener-75fd88c566-5j4xn" Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.289948 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04ffb838-3774-402c-9cdf-d11e51fb21e5-logs\") pod \"barbican-keystone-listener-75fd88c566-5j4xn\" (UID: \"04ffb838-3774-402c-9cdf-d11e51fb21e5\") " pod="openstack/barbican-keystone-listener-75fd88c566-5j4xn" Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.289969 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dw68d\" (UniqueName: \"kubernetes.io/projected/04ffb838-3774-402c-9cdf-d11e51fb21e5-kube-api-access-dw68d\") pod \"barbican-keystone-listener-75fd88c566-5j4xn\" (UID: \"04ffb838-3774-402c-9cdf-d11e51fb21e5\") " pod="openstack/barbican-keystone-listener-75fd88c566-5j4xn" Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.290048 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5179f78d-3a8f-4621-95f3-e147ff8da79f-config-data-custom\") pod \"barbican-worker-65b8874fd7-dnnjf\" (UID: \"5179f78d-3a8f-4621-95f3-e147ff8da79f\") " pod="openstack/barbican-worker-65b8874fd7-dnnjf" Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.304822 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5179f78d-3a8f-4621-95f3-e147ff8da79f-logs\") pod \"barbican-worker-65b8874fd7-dnnjf\" (UID: \"5179f78d-3a8f-4621-95f3-e147ff8da79f\") " pod="openstack/barbican-worker-65b8874fd7-dnnjf" Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.307819 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04ffb838-3774-402c-9cdf-d11e51fb21e5-logs\") pod \"barbican-keystone-listener-75fd88c566-5j4xn\" (UID: 
\"04ffb838-3774-402c-9cdf-d11e51fb21e5\") " pod="openstack/barbican-keystone-listener-75fd88c566-5j4xn" Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.308819 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5179f78d-3a8f-4621-95f3-e147ff8da79f-config-data-custom\") pod \"barbican-worker-65b8874fd7-dnnjf\" (UID: \"5179f78d-3a8f-4621-95f3-e147ff8da79f\") " pod="openstack/barbican-worker-65b8874fd7-dnnjf" Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.310103 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04ffb838-3774-402c-9cdf-d11e51fb21e5-config-data\") pod \"barbican-keystone-listener-75fd88c566-5j4xn\" (UID: \"04ffb838-3774-402c-9cdf-d11e51fb21e5\") " pod="openstack/barbican-keystone-listener-75fd88c566-5j4xn" Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.325738 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04ffb838-3774-402c-9cdf-d11e51fb21e5-combined-ca-bundle\") pod \"barbican-keystone-listener-75fd88c566-5j4xn\" (UID: \"04ffb838-3774-402c-9cdf-d11e51fb21e5\") " pod="openstack/barbican-keystone-listener-75fd88c566-5j4xn" Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.327899 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/04ffb838-3774-402c-9cdf-d11e51fb21e5-config-data-custom\") pod \"barbican-keystone-listener-75fd88c566-5j4xn\" (UID: \"04ffb838-3774-402c-9cdf-d11e51fb21e5\") " pod="openstack/barbican-keystone-listener-75fd88c566-5j4xn" Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.329341 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5179f78d-3a8f-4621-95f3-e147ff8da79f-config-data\") pod \"barbican-worker-65b8874fd7-dnnjf\" (UID: \"5179f78d-3a8f-4621-95f3-e147ff8da79f\") " pod="openstack/barbican-worker-65b8874fd7-dnnjf" Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.337890 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5179f78d-3a8f-4621-95f3-e147ff8da79f-combined-ca-bundle\") pod \"barbican-worker-65b8874fd7-dnnjf\" (UID: \"5179f78d-3a8f-4621-95f3-e147ff8da79f\") " pod="openstack/barbican-worker-65b8874fd7-dnnjf" Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.361218 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-4jdkf"] Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.370602 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dw68d\" (UniqueName: \"kubernetes.io/projected/04ffb838-3774-402c-9cdf-d11e51fb21e5-kube-api-access-dw68d\") pod \"barbican-keystone-listener-75fd88c566-5j4xn\" (UID: \"04ffb838-3774-402c-9cdf-d11e51fb21e5\") " pod="openstack/barbican-keystone-listener-75fd88c566-5j4xn" Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.380676 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drm6n\" (UniqueName: \"kubernetes.io/projected/5179f78d-3a8f-4621-95f3-e147ff8da79f-kube-api-access-drm6n\") pod \"barbican-worker-65b8874fd7-dnnjf\" (UID: \"5179f78d-3a8f-4621-95f3-e147ff8da79f\") " pod="openstack/barbican-worker-65b8874fd7-dnnjf" Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.382532 
4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-65b8874fd7-dnnjf" Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.396245 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kd7dp\" (UniqueName: \"kubernetes.io/projected/e7453b38-f6c3-4fe7-b15d-5bd8112dc687-kube-api-access-kd7dp\") pod \"glance3267-account-delete-cwdxm\" (UID: \"e7453b38-f6c3-4fe7-b15d-5bd8112dc687\") " pod="openstack/glance3267-account-delete-cwdxm" Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.396315 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/32501fcc-b225-4b92-8aef-15d69474e7a1-config-data-custom\") pod \"barbican-api-7c4598cc6b-tl8vg\" (UID: \"32501fcc-b225-4b92-8aef-15d69474e7a1\") " pod="openstack/barbican-api-7c4598cc6b-tl8vg" Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.396344 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32501fcc-b225-4b92-8aef-15d69474e7a1-logs\") pod \"barbican-api-7c4598cc6b-tl8vg\" (UID: \"32501fcc-b225-4b92-8aef-15d69474e7a1\") " pod="openstack/barbican-api-7c4598cc6b-tl8vg" Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.396364 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcnng\" (UniqueName: \"kubernetes.io/projected/32501fcc-b225-4b92-8aef-15d69474e7a1-kube-api-access-dcnng\") pod \"barbican-api-7c4598cc6b-tl8vg\" (UID: \"32501fcc-b225-4b92-8aef-15d69474e7a1\") " pod="openstack/barbican-api-7c4598cc6b-tl8vg" Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.396394 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/32501fcc-b225-4b92-8aef-15d69474e7a1-internal-tls-certs\") pod \"barbican-api-7c4598cc6b-tl8vg\" (UID: \"32501fcc-b225-4b92-8aef-15d69474e7a1\") " pod="openstack/barbican-api-7c4598cc6b-tl8vg" Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.396444 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32501fcc-b225-4b92-8aef-15d69474e7a1-config-data\") pod \"barbican-api-7c4598cc6b-tl8vg\" (UID: \"32501fcc-b225-4b92-8aef-15d69474e7a1\") " pod="openstack/barbican-api-7c4598cc6b-tl8vg" Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.396476 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32501fcc-b225-4b92-8aef-15d69474e7a1-combined-ca-bundle\") pod \"barbican-api-7c4598cc6b-tl8vg\" (UID: \"32501fcc-b225-4b92-8aef-15d69474e7a1\") " pod="openstack/barbican-api-7c4598cc6b-tl8vg" Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.396537 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/32501fcc-b225-4b92-8aef-15d69474e7a1-public-tls-certs\") pod \"barbican-api-7c4598cc6b-tl8vg\" (UID: \"32501fcc-b225-4b92-8aef-15d69474e7a1\") " pod="openstack/barbican-api-7c4598cc6b-tl8vg" Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.489219 4854 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/neutronb378-account-delete-jzbpj"] Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.490474 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutronb378-account-delete-jzbpj" Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.492697 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kd7dp\" (UniqueName: \"kubernetes.io/projected/e7453b38-f6c3-4fe7-b15d-5bd8112dc687-kube-api-access-kd7dp\") pod \"glance3267-account-delete-cwdxm\" (UID: \"e7453b38-f6c3-4fe7-b15d-5bd8112dc687\") " pod="openstack/glance3267-account-delete-cwdxm" Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.500508 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/32501fcc-b225-4b92-8aef-15d69474e7a1-internal-tls-certs\") pod \"barbican-api-7c4598cc6b-tl8vg\" (UID: \"32501fcc-b225-4b92-8aef-15d69474e7a1\") " pod="openstack/barbican-api-7c4598cc6b-tl8vg" Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.500589 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32501fcc-b225-4b92-8aef-15d69474e7a1-config-data\") pod \"barbican-api-7c4598cc6b-tl8vg\" (UID: \"32501fcc-b225-4b92-8aef-15d69474e7a1\") " pod="openstack/barbican-api-7c4598cc6b-tl8vg" Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.500640 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32501fcc-b225-4b92-8aef-15d69474e7a1-combined-ca-bundle\") pod \"barbican-api-7c4598cc6b-tl8vg\" (UID: \"32501fcc-b225-4b92-8aef-15d69474e7a1\") " pod="openstack/barbican-api-7c4598cc6b-tl8vg" Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.500709 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/32501fcc-b225-4b92-8aef-15d69474e7a1-public-tls-certs\") pod \"barbican-api-7c4598cc6b-tl8vg\" (UID: \"32501fcc-b225-4b92-8aef-15d69474e7a1\") " pod="openstack/barbican-api-7c4598cc6b-tl8vg" Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.500769 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/32501fcc-b225-4b92-8aef-15d69474e7a1-config-data-custom\") pod \"barbican-api-7c4598cc6b-tl8vg\" (UID: \"32501fcc-b225-4b92-8aef-15d69474e7a1\") " pod="openstack/barbican-api-7c4598cc6b-tl8vg" Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.500794 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32501fcc-b225-4b92-8aef-15d69474e7a1-logs\") pod \"barbican-api-7c4598cc6b-tl8vg\" (UID: \"32501fcc-b225-4b92-8aef-15d69474e7a1\") " pod="openstack/barbican-api-7c4598cc6b-tl8vg" Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.500816 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcnng\" (UniqueName: \"kubernetes.io/projected/32501fcc-b225-4b92-8aef-15d69474e7a1-kube-api-access-dcnng\") pod \"barbican-api-7c4598cc6b-tl8vg\" (UID: \"32501fcc-b225-4b92-8aef-15d69474e7a1\") " pod="openstack/barbican-api-7c4598cc6b-tl8vg" Oct 07 12:45:59 crc kubenswrapper[4854]: E1007 12:45:59.502717 4854 secret.go:188] Couldn't get secret openstack/barbican-config-data: secret "barbican-config-data" not found 
Oct 07 12:45:59 crc kubenswrapper[4854]: E1007 12:45:59.502778 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32501fcc-b225-4b92-8aef-15d69474e7a1-config-data podName:32501fcc-b225-4b92-8aef-15d69474e7a1 nodeName:}" failed. No retries permitted until 2025-10-07 12:46:00.002762225 +0000 UTC m=+1275.990594480 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/32501fcc-b225-4b92-8aef-15d69474e7a1-config-data") pod "barbican-api-7c4598cc6b-tl8vg" (UID: "32501fcc-b225-4b92-8aef-15d69474e7a1") : secret "barbican-config-data" not found Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.504776 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32501fcc-b225-4b92-8aef-15d69474e7a1-logs\") pod \"barbican-api-7c4598cc6b-tl8vg\" (UID: \"32501fcc-b225-4b92-8aef-15d69474e7a1\") " pod="openstack/barbican-api-7c4598cc6b-tl8vg" Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.517117 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/32501fcc-b225-4b92-8aef-15d69474e7a1-internal-tls-certs\") pod \"barbican-api-7c4598cc6b-tl8vg\" (UID: \"32501fcc-b225-4b92-8aef-15d69474e7a1\") " pod="openstack/barbican-api-7c4598cc6b-tl8vg" Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.522727 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/32501fcc-b225-4b92-8aef-15d69474e7a1-public-tls-certs\") pod \"barbican-api-7c4598cc6b-tl8vg\" (UID: \"32501fcc-b225-4b92-8aef-15d69474e7a1\") " pod="openstack/barbican-api-7c4598cc6b-tl8vg" Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.540061 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32501fcc-b225-4b92-8aef-15d69474e7a1-combined-ca-bundle\") pod \"barbican-api-7c4598cc6b-tl8vg\" (UID: \"32501fcc-b225-4b92-8aef-15d69474e7a1\") " pod="openstack/barbican-api-7c4598cc6b-tl8vg" Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.541717 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/32501fcc-b225-4b92-8aef-15d69474e7a1-config-data-custom\") pod \"barbican-api-7c4598cc6b-tl8vg\" (UID: \"32501fcc-b225-4b92-8aef-15d69474e7a1\") " pod="openstack/barbican-api-7c4598cc6b-tl8vg" Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.542910 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcnng\" (UniqueName: \"kubernetes.io/projected/32501fcc-b225-4b92-8aef-15d69474e7a1-kube-api-access-dcnng\") pod \"barbican-api-7c4598cc6b-tl8vg\" (UID: \"32501fcc-b225-4b92-8aef-15d69474e7a1\") " pod="openstack/barbican-api-7c4598cc6b-tl8vg" Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.572046 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutronb378-account-delete-jzbpj"] Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.595386 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance3267-account-delete-cwdxm" Oct 07 12:45:59 crc kubenswrapper[4854]: E1007 12:45:59.617666 4854 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 07 12:45:59 crc kubenswrapper[4854]: E1007 12:45:59.617728 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-config-data podName:4c293f13-b2a5-4d4b-9f69-fd118e34eab2 nodeName:}" failed. No retries permitted until 2025-10-07 12:46:00.617711379 +0000 UTC m=+1276.605543634 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-config-data") pod "rabbitmq-server-0" (UID: "4c293f13-b2a5-4d4b-9f69-fd118e34eab2") : configmap "rabbitmq-config-data" not found Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.618043 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cskfq\" (UniqueName: \"kubernetes.io/projected/ac535972-fa59-4e7f-818b-345da6937c14-kube-api-access-cskfq\") pod \"neutronb378-account-delete-jzbpj\" (UID: \"ac535972-fa59-4e7f-818b-345da6937c14\") " pod="openstack/neutronb378-account-delete-jzbpj" Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.638657 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-75fd88c566-5j4xn" Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.648465 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinderc7c1-account-delete-dz57l"] Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.649854 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinderc7c1-account-delete-dz57l" Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.686624 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.703254 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinderc7c1-account-delete-dz57l"] Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.717103 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placementd4d8-account-delete-fh2n2"] Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.718316 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placementd4d8-account-delete-fh2n2" Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.720185 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cskfq\" (UniqueName: \"kubernetes.io/projected/ac535972-fa59-4e7f-818b-345da6937c14-kube-api-access-cskfq\") pod \"neutronb378-account-delete-jzbpj\" (UID: \"ac535972-fa59-4e7f-818b-345da6937c14\") " pod="openstack/neutronb378-account-delete-jzbpj" Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.737066 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placementd4d8-account-delete-fh2n2"] Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.748828 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.749403 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="6f63842c-f85f-4e07-8221-4ce96b22bf44" containerName="openstack-network-exporter" containerID="cri-o://3b5428679fa70871d5e7daa3b281bec044977353bc58c91be36e9d4e54d19bb6" gracePeriod=300 Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.754049 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-qk4ws"] Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.766346 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cskfq\" (UniqueName: \"kubernetes.io/projected/ac535972-fa59-4e7f-818b-345da6937c14-kube-api-access-cskfq\") pod \"neutronb378-account-delete-jzbpj\" (UID: \"ac535972-fa59-4e7f-818b-345da6937c14\") " pod="openstack/neutronb378-account-delete-jzbpj" Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.792382 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-qk4ws"] Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.870249 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbj7s\" (UniqueName: \"kubernetes.io/projected/787f934e-4f31-4b00-8cf6-380efd34aaad-kube-api-access-vbj7s\") pod \"cinderc7c1-account-delete-dz57l\" (UID: \"787f934e-4f31-4b00-8cf6-380efd34aaad\") " pod="openstack/cinderc7c1-account-delete-dz57l" Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.870404 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzz66\" (UniqueName: \"kubernetes.io/projected/66f80399-ed98-4aba-9db5-759ad2e314fa-kube-api-access-wzz66\") pod \"placementd4d8-account-delete-fh2n2\" (UID: \"66f80399-ed98-4aba-9db5-759ad2e314fa\") " pod="openstack/placementd4d8-account-delete-fh2n2" Oct 07 12:45:59 crc kubenswrapper[4854]: E1007 12:45:59.870826 4854 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 07 12:45:59 crc kubenswrapper[4854]: E1007 12:45:59.870891 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/79513100-48d2-4e7b-ae14-888322cab8f3-config-data podName:79513100-48d2-4e7b-ae14-888322cab8f3 nodeName:}" failed. No retries permitted until 2025-10-07 12:46:00.370871709 +0000 UTC m=+1276.358703964 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/79513100-48d2-4e7b-ae14-888322cab8f3-config-data") pod "rabbitmq-cell1-server-0" (UID: "79513100-48d2-4e7b-ae14-888322cab8f3") : configmap "rabbitmq-cell1-config-data" not found Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.871847 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.872728 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="dbbd823b-2e1d-4901-855a-72cd9a13a6fd" containerName="openstack-network-exporter" containerID="cri-o://cabf2a84de378bf0db16b49e75a56dc1796963bdeaadcbaca2e8f231777ff7d6" gracePeriod=300 Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.906135 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutronb378-account-delete-jzbpj" Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.908189 4854 generic.go:334] "Generic (PLEG): container finished" podID="2bcb9c6a-b0b3-438e-9e00-b3706ea71adf" containerID="0775213d263bbb54ab38859826cdc168c18bef5c43a091d586b83ae90efc35b8" exitCode=143 Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.910098 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2bcb9c6a-b0b3-438e-9e00-b3706ea71adf","Type":"ContainerDied","Data":"0775213d263bbb54ab38859826cdc168c18bef5c43a091d586b83ae90efc35b8"} Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.936980 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-jbv7l"] Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.966971 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-d4cpx_47f8159b-e07f-47bd-92e8-a57f3e0c545d/openstack-network-exporter/0.log" Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.967260 4854 generic.go:334] "Generic (PLEG): container finished" podID="47f8159b-e07f-47bd-92e8-a57f3e0c545d" containerID="3e6647d1edb8724d040f38fe0892860ca4c3bb6eab536177c27f549d3f099144" exitCode=2 Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.967300 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-d4cpx" event={"ID":"47f8159b-e07f-47bd-92e8-a57f3e0c545d","Type":"ContainerDied","Data":"3e6647d1edb8724d040f38fe0892860ca4c3bb6eab536177c27f549d3f099144"} Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.986352 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbj7s\" (UniqueName: \"kubernetes.io/projected/787f934e-4f31-4b00-8cf6-380efd34aaad-kube-api-access-vbj7s\") pod \"cinderc7c1-account-delete-dz57l\" (UID: \"787f934e-4f31-4b00-8cf6-380efd34aaad\") " pod="openstack/cinderc7c1-account-delete-dz57l" Oct 07 12:45:59 crc kubenswrapper[4854]: I1007 12:45:59.986826 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzz66\" (UniqueName: \"kubernetes.io/projected/66f80399-ed98-4aba-9db5-759ad2e314fa-kube-api-access-wzz66\") pod \"placementd4d8-account-delete-fh2n2\" (UID: \"66f80399-ed98-4aba-9db5-759ad2e314fa\") " pod="openstack/placementd4d8-account-delete-fh2n2" Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.000088 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-jbv7l"] Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.022636 4854 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.022901 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="a936b898-5163-4c5e-ac30-64af1533cec7" containerName="glance-log" containerID="cri-o://ab9b63d9a92d2b3c1a80cf5ab47e0bdb681cabb6aa9a7547201bf16b8e4df31e" gracePeriod=30 Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.023426 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="a936b898-5163-4c5e-ac30-64af1533cec7" containerName="glance-httpd" containerID="cri-o://a963e64a5e60796e5674815e7ff05f24676f7da4a8d14514981bf0f0c9a9c739" gracePeriod=30 Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.055923 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzz66\" (UniqueName: \"kubernetes.io/projected/66f80399-ed98-4aba-9db5-759ad2e314fa-kube-api-access-wzz66\") pod \"placementd4d8-account-delete-fh2n2\" (UID: \"66f80399-ed98-4aba-9db5-759ad2e314fa\") " pod="openstack/placementd4d8-account-delete-fh2n2" Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.078098 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-7zmqn"] Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.079852 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbj7s\" (UniqueName: \"kubernetes.io/projected/787f934e-4f31-4b00-8cf6-380efd34aaad-kube-api-access-vbj7s\") pod \"cinderc7c1-account-delete-dz57l\" (UID: \"787f934e-4f31-4b00-8cf6-380efd34aaad\") " pod="openstack/cinderc7c1-account-delete-dz57l" Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.092265 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32501fcc-b225-4b92-8aef-15d69474e7a1-config-data\") pod \"barbican-api-7c4598cc6b-tl8vg\" (UID: \"32501fcc-b225-4b92-8aef-15d69474e7a1\") " pod="openstack/barbican-api-7c4598cc6b-tl8vg" Oct 07 12:46:00 crc kubenswrapper[4854]: E1007 12:46:00.092445 4854 secret.go:188] Couldn't get secret openstack/barbican-config-data: secret "barbican-config-data" not found Oct 07 12:46:00 crc kubenswrapper[4854]: E1007 12:46:00.092507 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32501fcc-b225-4b92-8aef-15d69474e7a1-config-data podName:32501fcc-b225-4b92-8aef-15d69474e7a1 nodeName:}" failed. No retries permitted until 2025-10-07 12:46:01.092488704 +0000 UTC m=+1277.080320969 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/32501fcc-b225-4b92-8aef-15d69474e7a1-config-data") pod "barbican-api-7c4598cc6b-tl8vg" (UID: "32501fcc-b225-4b92-8aef-15d69474e7a1") : secret "barbican-config-data" not found Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.094045 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="6f63842c-f85f-4e07-8221-4ce96b22bf44" containerName="ovsdbserver-sb" containerID="cri-o://746867422f38d038a9f9c789e768db3e7fd483a9e53e306db5812c58e0fc6129" gracePeriod=300 Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.123406 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-7zmqn"] Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.167503 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novacell0e568-account-delete-srq52"] Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.183433 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novacell0e568-account-delete-srq52" Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.207516 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novacell0e568-account-delete-srq52"] Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.232866 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-phsdv"] Oct 07 12:46:00 crc kubenswrapper[4854]: E1007 12:46:00.236715 4854 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.243:41662->38.102.83.243:43445: write tcp 38.102.83.243:41662->38.102.83.243:43445: write: broken pipe Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.236744 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placementd4d8-account-delete-fh2n2" Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.260184 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-phsdv"] Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.286727 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-qshkq"] Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.287075 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-89c5cd4d5-qshkq" podUID="43bd8c48-fc92-4f02-af3e-db78673cacb9" containerName="dnsmasq-dns" containerID="cri-o://222c24d46a308342692e4bc3749f04b6f56b937db2818c8260cd786457220f04" gracePeriod=10 Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.298220 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vt7gn\" (UniqueName: \"kubernetes.io/projected/7c47fe71-3b92-4490-8488-d98d0e25519e-kube-api-access-vt7gn\") pod \"novacell0e568-account-delete-srq52\" (UID: \"7c47fe71-3b92-4490-8488-d98d0e25519e\") " pod="openstack/novacell0e568-account-delete-srq52" Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.305846 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-prrs8"] Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.310669 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinderc7c1-account-delete-dz57l" Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.318616 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="dbbd823b-2e1d-4901-855a-72cd9a13a6fd" containerName="ovsdbserver-nb" containerID="cri-o://f92c3a568292b0a3d969cd5adf77c55dbce4f1e154b300face2ca6a1e33c82df" gracePeriod=300 Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.328450 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-prrs8"] Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.337182 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novaapi643b-account-delete-92r97"] Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.344239 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novaapi643b-account-delete-92r97" Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.384602 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novaapi643b-account-delete-92r97"] Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.401233 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vt7gn\" (UniqueName: \"kubernetes.io/projected/7c47fe71-3b92-4490-8488-d98d0e25519e-kube-api-access-vt7gn\") pod \"novacell0e568-account-delete-srq52\" (UID: \"7c47fe71-3b92-4490-8488-d98d0e25519e\") " pod="openstack/novacell0e568-account-delete-srq52" Oct 07 12:46:00 crc kubenswrapper[4854]: E1007 12:46:00.401631 4854 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 07 12:46:00 crc kubenswrapper[4854]: E1007 12:46:00.401680 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/79513100-48d2-4e7b-ae14-888322cab8f3-config-data podName:79513100-48d2-4e7b-ae14-888322cab8f3 nodeName:}" failed. No retries permitted until 2025-10-07 12:46:01.401665163 +0000 UTC m=+1277.389497418 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/79513100-48d2-4e7b-ae14-888322cab8f3-config-data") pod "rabbitmq-cell1-server-0" (UID: "79513100-48d2-4e7b-ae14-888322cab8f3") : configmap "rabbitmq-cell1-config-data" not found Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.432651 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.433136 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerName="account-server" containerID="cri-o://e7d1be76b597fbb83db1e8cb27aca94cb03bb1ca865e7042ef7e99b025c26823" gracePeriod=30 Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.433542 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerName="swift-recon-cron" containerID="cri-o://bd423040637c8b41729c179d3e2612729ced5992aafd5990c3c9b5c7ca5727c0" gracePeriod=30 Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.433591 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerName="rsync" containerID="cri-o://6e9be7ed32f4b1f0286eb9c12fca3356d25542eaefb68cc3134f42aca219812e" gracePeriod=30 Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.433623 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerName="object-expirer" containerID="cri-o://3ead17d08219606cf249dcd61e3873b2c9b921a7f531b3b5ff7b4f38fcea5a39" gracePeriod=30 Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.433653 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerName="object-updater" containerID="cri-o://16029fd8b03b23770b20c5cb19ee5eac37c556a4d1e730daaeee2955d74d1015" gracePeriod=30 Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.433681 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerName="object-auditor" containerID="cri-o://d139c560fe8a80f0502b14596ea2825172ee972c7730deb571bbc27b9d43e3a4" gracePeriod=30 Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.433708 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerName="object-replicator" containerID="cri-o://ef4e6844e5b6720c32f74f7e722421b7675513872680d1a7f5bcc693b7983ed2" gracePeriod=30 Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.433738 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerName="object-server" containerID="cri-o://41fbd8df61c4cf14f1e960d26b75dfc2bc04a3598dc2d46c94fb7688efb55eb9" gracePeriod=30 Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.433764 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerName="container-updater" containerID="cri-o://c1daeb00bdab450dbfd0e8a2fc796a0cac58f94569069238d66bafbd697e21fe" 
gracePeriod=30 Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.433789 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerName="container-auditor" containerID="cri-o://04a36a15910bb4bfdcb57384d35dbb7269dfced501b76e98351c42047a3a007d" gracePeriod=30 Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.433816 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerName="container-replicator" containerID="cri-o://bf5285e76e2b7c58fdaeb7235c95ba9d504989e3ace943cdfc50b7e56f2b19f6" gracePeriod=30 Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.433862 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerName="container-server" containerID="cri-o://6c57c9247bb283b1cdb97c6d4a8501239230072515b3dfc540b0bb0d946a0d67" gracePeriod=30 Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.433890 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerName="account-reaper" containerID="cri-o://7ec2bd59e5244b757b1dc78381bc28563a41c3f4cb1938a20f0abef176871dca" gracePeriod=30 Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.433924 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerName="account-auditor" containerID="cri-o://46e4da89bda9a00a4ff3b8768a33badf8cfee5c8e4687bcdeb7761033e1a22b5" gracePeriod=30 Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.433953 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerName="account-replicator" containerID="cri-o://64082d363d907b3ab83993dcfe80f76c9b9736fb8b76e7c87b840d99d91af5c8" gracePeriod=30 Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.462547 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vt7gn\" (UniqueName: \"kubernetes.io/projected/7c47fe71-3b92-4490-8488-d98d0e25519e-kube-api-access-vt7gn\") pod \"novacell0e568-account-delete-srq52\" (UID: \"7c47fe71-3b92-4490-8488-d98d0e25519e\") " pod="openstack/novacell0e568-account-delete-srq52" Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.507942 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbwkw\" (UniqueName: \"kubernetes.io/projected/aaa0738a-daf1-479f-9dbd-913806703370-kube-api-access-qbwkw\") pod \"novaapi643b-account-delete-92r97\" (UID: \"aaa0738a-daf1-479f-9dbd-913806703370\") " pod="openstack/novaapi643b-account-delete-92r97" Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.508072 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.508338 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="698aae03-92da-4cc2-a9d2-ecdb5f143439" containerName="ovn-northd" containerID="cri-o://89f0eb49b67715625593c9e8be3b27b5e68c8ad630cb00e0f86cd890329ac748" gracePeriod=30 Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.509991 4854 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="698aae03-92da-4cc2-a9d2-ecdb5f143439" containerName="openstack-network-exporter" containerID="cri-o://b1471691e8926652d5ee413690eb324d89b03c02d93cd735cdce65f268baf71f" gracePeriod=30 Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.539797 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbicana01a-account-delete-8phrr"] Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.541086 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbicana01a-account-delete-8phrr" Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.563954 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7445f79585-rckdn"] Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.564227 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7445f79585-rckdn" podUID="c2c26a76-531a-4a6b-ac0f-6aa23680f903" containerName="neutron-api" containerID="cri-o://82adcb574899325e5c5ccb3b6bb13576d4a4618eac34de03953fb2c28cd68412" gracePeriod=30 Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.564360 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7445f79585-rckdn" podUID="c2c26a76-531a-4a6b-ac0f-6aa23680f903" containerName="neutron-httpd" containerID="cri-o://4442c10882343c29c4d45b4876f4c5029aa8758304578d81129d781083582260" gracePeriod=30 Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.573868 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbicana01a-account-delete-8phrr"] Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.584846 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.585067 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="13453d02-4f55-45a7-98be-1cd41c741a3e" containerName="glance-log" containerID="cri-o://b70abd6d9bcc3a200f109c4995793bf820e1b93bdac5dd9db4c650a6ff934a3d" gracePeriod=30 Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.585746 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="13453d02-4f55-45a7-98be-1cd41c741a3e" containerName="glance-httpd" containerID="cri-o://6ab3625afd9d5554eb97be59fc81dda17398623a3a3702f50c0d684653f13606" gracePeriod=30 Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.595064 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-vsdfv"] Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.605593 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-vsdfv"] Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.614695 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbwkw\" (UniqueName: \"kubernetes.io/projected/aaa0738a-daf1-479f-9dbd-913806703370-kube-api-access-qbwkw\") pod \"novaapi643b-account-delete-92r97\" (UID: \"aaa0738a-daf1-479f-9dbd-913806703370\") " pod="openstack/novaapi643b-account-delete-92r97" Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.617503 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-hnhnd"] Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.635704 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/barbican-db-sync-hnhnd"] Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.643219 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6f54bfd6b4-g5gq4"] Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.643543 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-6f54bfd6b4-g5gq4" podUID="a03c4a0d-6346-43e4-8db1-f653b5dfa420" containerName="placement-log" containerID="cri-o://85e71836a3ed17c4ff8324140a6164289c633f406fdb437654545bf3071cb059" gracePeriod=30 Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.643987 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-6f54bfd6b4-g5gq4" podUID="a03c4a0d-6346-43e4-8db1-f653b5dfa420" containerName="placement-api" containerID="cri-o://dda148e71ce3d2269edc8e05951c34651b383998b2ea4117e0f5773827d60523" gracePeriod=30 Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.658562 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbwkw\" (UniqueName: \"kubernetes.io/projected/aaa0738a-daf1-479f-9dbd-913806703370-kube-api-access-qbwkw\") pod \"novaapi643b-account-delete-92r97\" (UID: \"aaa0738a-daf1-479f-9dbd-913806703370\") " pod="openstack/novaapi643b-account-delete-92r97" Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.689376 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.720498 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptt4r\" (UniqueName: \"kubernetes.io/projected/735fa97d-e751-4957-bdae-3ae0b10635d2-kube-api-access-ptt4r\") pod \"barbicana01a-account-delete-8phrr\" (UID: \"735fa97d-e751-4957-bdae-3ae0b10635d2\") " pod="openstack/barbicana01a-account-delete-8phrr" Oct 07 12:46:00 crc kubenswrapper[4854]: E1007 12:46:00.720660 4854 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 07 12:46:00 crc kubenswrapper[4854]: E1007 12:46:00.720695 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-config-data podName:4c293f13-b2a5-4d4b-9f69-fd118e34eab2 nodeName:}" failed. No retries permitted until 2025-10-07 12:46:02.720683969 +0000 UTC m=+1278.708516214 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-config-data") pod "rabbitmq-server-0" (UID: "4c293f13-b2a5-4d4b-9f69-fd118e34eab2") : configmap "rabbitmq-config-data" not found Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.766915 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06a39602-5206-42d4-a283-31650db9bd54" path="/var/lib/kubelet/pods/06a39602-5206-42d4-a283-31650db9bd54/volumes" Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.767666 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="245510a3-00ce-4faa-9a0e-4e06482e8a0e" path="/var/lib/kubelet/pods/245510a3-00ce-4faa-9a0e-4e06482e8a0e/volumes" Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.791625 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="563fb438-9713-4237-8ad3-16a4614cbcfd" path="/var/lib/kubelet/pods/563fb438-9713-4237-8ad3-16a4614cbcfd/volumes" Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.792271 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57eafbaf-440c-432a-a1ab-55032cd2f54a" path="/var/lib/kubelet/pods/57eafbaf-440c-432a-a1ab-55032cd2f54a/volumes" Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.792788 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74714c8f-dea6-40be-9985-d254729920c9" path="/var/lib/kubelet/pods/74714c8f-dea6-40be-9985-d254729920c9/volumes" Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.804445 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bf3cb62-a4b1-4252-bbee-41e1ccc47dd5" path="/var/lib/kubelet/pods/7bf3cb62-a4b1-4252-bbee-41e1ccc47dd5/volumes" Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.805055 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="933b6600-078b-4555-b765-7a22cf257e7c" path="/var/lib/kubelet/pods/933b6600-078b-4555-b765-7a22cf257e7c/volumes" Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.806133 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c462d02f-dfcd-48f7-b755-fb203afcb213" path="/var/lib/kubelet/pods/c462d02f-dfcd-48f7-b755-fb203afcb213/volumes" Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.807804 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.808003 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="016c2264-9ba4-48c0-b416-02c468232b6b" containerName="nova-scheduler-scheduler" containerID="cri-o://b33111a1b0f14dc72a4fc34de5d78d44cbf9938a6e3632fa063f8bb7985aeaf3" gracePeriod=30 Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.821735 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptt4r\" (UniqueName: \"kubernetes.io/projected/735fa97d-e751-4957-bdae-3ae0b10635d2-kube-api-access-ptt4r\") pod \"barbicana01a-account-delete-8phrr\" (UID: \"735fa97d-e751-4957-bdae-3ae0b10635d2\") " pod="openstack/barbicana01a-account-delete-8phrr" Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.863394 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptt4r\" (UniqueName: \"kubernetes.io/projected/735fa97d-e751-4957-bdae-3ae0b10635d2-kube-api-access-ptt4r\") pod \"barbicana01a-account-delete-8phrr\" (UID: \"735fa97d-e751-4957-bdae-3ae0b10635d2\") " 
pod="openstack/barbicana01a-account-delete-8phrr" Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.934861 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.935115 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="37dd5983-0d4d-4097-8657-f408e9bc68c0" containerName="nova-api-log" containerID="cri-o://6c3add82afc5a4706f36b57114ee421863a370ea330e776fc1686cfde8841f49" gracePeriod=30 Oct 07 12:46:00 crc kubenswrapper[4854]: I1007 12:46:00.935313 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="37dd5983-0d4d-4097-8657-f408e9bc68c0" containerName="nova-api-api" containerID="cri-o://bd7103524f01d027fb1b2f752223346853e431f9b045e756199823ef3b9a9a3d" gracePeriod=30 Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.081196 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="4c293f13-b2a5-4d4b-9f69-fd118e34eab2" containerName="rabbitmq" containerID="cri-o://773b06224b9794e065a5371a7a6969873348f5679927494931f1681bc7245e22" gracePeriod=604800 Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.092464 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6f9410d0-f08a-4288-901b-8c28b54f6d53","Type":"ContainerDied","Data":"3ead17d08219606cf249dcd61e3873b2c9b921a7f531b3b5ff7b4f38fcea5a39"} Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.092526 4854 generic.go:334] "Generic (PLEG): container finished" podID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerID="3ead17d08219606cf249dcd61e3873b2c9b921a7f531b3b5ff7b4f38fcea5a39" exitCode=0 Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.092540 4854 generic.go:334] "Generic (PLEG): container finished" podID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerID="16029fd8b03b23770b20c5cb19ee5eac37c556a4d1e730daaeee2955d74d1015" exitCode=0 Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.092547 4854 generic.go:334] "Generic (PLEG): container finished" podID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerID="d139c560fe8a80f0502b14596ea2825172ee972c7730deb571bbc27b9d43e3a4" exitCode=0 Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.092555 4854 generic.go:334] "Generic (PLEG): container finished" podID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerID="ef4e6844e5b6720c32f74f7e722421b7675513872680d1a7f5bcc693b7983ed2" exitCode=0 Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.092562 4854 generic.go:334] "Generic (PLEG): container finished" podID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerID="c1daeb00bdab450dbfd0e8a2fc796a0cac58f94569069238d66bafbd697e21fe" exitCode=0 Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.092568 4854 generic.go:334] "Generic (PLEG): container finished" podID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerID="04a36a15910bb4bfdcb57384d35dbb7269dfced501b76e98351c42047a3a007d" exitCode=0 Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.092574 4854 generic.go:334] "Generic (PLEG): container finished" podID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerID="bf5285e76e2b7c58fdaeb7235c95ba9d504989e3ace943cdfc50b7e56f2b19f6" exitCode=0 Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.092581 4854 generic.go:334] "Generic (PLEG): container finished" podID="6f9410d0-f08a-4288-901b-8c28b54f6d53" 
containerID="7ec2bd59e5244b757b1dc78381bc28563a41c3f4cb1938a20f0abef176871dca" exitCode=0 Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.092589 4854 generic.go:334] "Generic (PLEG): container finished" podID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerID="46e4da89bda9a00a4ff3b8768a33badf8cfee5c8e4687bcdeb7761033e1a22b5" exitCode=0 Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.092597 4854 generic.go:334] "Generic (PLEG): container finished" podID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerID="64082d363d907b3ab83993dcfe80f76c9b9736fb8b76e7c87b840d99d91af5c8" exitCode=0 Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.092685 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6f9410d0-f08a-4288-901b-8c28b54f6d53","Type":"ContainerDied","Data":"16029fd8b03b23770b20c5cb19ee5eac37c556a4d1e730daaeee2955d74d1015"} Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.092698 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6f9410d0-f08a-4288-901b-8c28b54f6d53","Type":"ContainerDied","Data":"d139c560fe8a80f0502b14596ea2825172ee972c7730deb571bbc27b9d43e3a4"} Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.092708 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6f9410d0-f08a-4288-901b-8c28b54f6d53","Type":"ContainerDied","Data":"ef4e6844e5b6720c32f74f7e722421b7675513872680d1a7f5bcc693b7983ed2"} Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.092717 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6f9410d0-f08a-4288-901b-8c28b54f6d53","Type":"ContainerDied","Data":"c1daeb00bdab450dbfd0e8a2fc796a0cac58f94569069238d66bafbd697e21fe"} Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.092726 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6f9410d0-f08a-4288-901b-8c28b54f6d53","Type":"ContainerDied","Data":"04a36a15910bb4bfdcb57384d35dbb7269dfced501b76e98351c42047a3a007d"} Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.092736 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6f9410d0-f08a-4288-901b-8c28b54f6d53","Type":"ContainerDied","Data":"bf5285e76e2b7c58fdaeb7235c95ba9d504989e3ace943cdfc50b7e56f2b19f6"} Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.092748 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6f9410d0-f08a-4288-901b-8c28b54f6d53","Type":"ContainerDied","Data":"7ec2bd59e5244b757b1dc78381bc28563a41c3f4cb1938a20f0abef176871dca"} Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.092759 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6f9410d0-f08a-4288-901b-8c28b54f6d53","Type":"ContainerDied","Data":"46e4da89bda9a00a4ff3b8768a33badf8cfee5c8e4687bcdeb7761033e1a22b5"} Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.092767 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6f9410d0-f08a-4288-901b-8c28b54f6d53","Type":"ContainerDied","Data":"64082d363d907b3ab83993dcfe80f76c9b9736fb8b76e7c87b840d99d91af5c8"} Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.109412 4854 generic.go:334] "Generic (PLEG): container finished" podID="c2c26a76-531a-4a6b-ac0f-6aa23680f903" containerID="4442c10882343c29c4d45b4876f4c5029aa8758304578d81129d781083582260" exitCode=0 Oct 
07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.109523 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7445f79585-rckdn" event={"ID":"c2c26a76-531a-4a6b-ac0f-6aa23680f903","Type":"ContainerDied","Data":"4442c10882343c29c4d45b4876f4c5029aa8758304578d81129d781083582260"} Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.116471 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_dbbd823b-2e1d-4901-855a-72cd9a13a6fd/ovsdbserver-nb/0.log" Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.116529 4854 generic.go:334] "Generic (PLEG): container finished" podID="dbbd823b-2e1d-4901-855a-72cd9a13a6fd" containerID="cabf2a84de378bf0db16b49e75a56dc1796963bdeaadcbaca2e8f231777ff7d6" exitCode=2 Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.116551 4854 generic.go:334] "Generic (PLEG): container finished" podID="dbbd823b-2e1d-4901-855a-72cd9a13a6fd" containerID="f92c3a568292b0a3d969cd5adf77c55dbce4f1e154b300face2ca6a1e33c82df" exitCode=143 Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.116600 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"dbbd823b-2e1d-4901-855a-72cd9a13a6fd","Type":"ContainerDied","Data":"cabf2a84de378bf0db16b49e75a56dc1796963bdeaadcbaca2e8f231777ff7d6"} Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.116632 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"dbbd823b-2e1d-4901-855a-72cd9a13a6fd","Type":"ContainerDied","Data":"f92c3a568292b0a3d969cd5adf77c55dbce4f1e154b300face2ca6a1e33c82df"} Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.118699 4854 generic.go:334] "Generic (PLEG): container finished" podID="43bd8c48-fc92-4f02-af3e-db78673cacb9" containerID="222c24d46a308342692e4bc3749f04b6f56b937db2818c8260cd786457220f04" exitCode=0 Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.118759 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-qshkq" event={"ID":"43bd8c48-fc92-4f02-af3e-db78673cacb9","Type":"ContainerDied","Data":"222c24d46a308342692e4bc3749f04b6f56b937db2818c8260cd786457220f04"} Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.122266 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.122546 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="770ca0a9-4c48-446b-be08-84b06d20d501" containerName="nova-metadata-log" containerID="cri-o://6cbd3121741a98fa970c3ecfd5f8c014c006ce0662866c5090335c8dec86fdc4" gracePeriod=30 Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.123085 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="770ca0a9-4c48-446b-be08-84b06d20d501" containerName="nova-metadata-metadata" containerID="cri-o://16fe06c197602fe7fb28e9bd7f56721c49f40f87574fc4f159cb1d7c6685606e" gracePeriod=30 Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.124961 4854 generic.go:334] "Generic (PLEG): container finished" podID="13453d02-4f55-45a7-98be-1cd41c741a3e" containerID="b70abd6d9bcc3a200f109c4995793bf820e1b93bdac5dd9db4c650a6ff934a3d" exitCode=143 Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.125015 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"13453d02-4f55-45a7-98be-1cd41c741a3e","Type":"ContainerDied","Data":"b70abd6d9bcc3a200f109c4995793bf820e1b93bdac5dd9db4c650a6ff934a3d"} Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.133088 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32501fcc-b225-4b92-8aef-15d69474e7a1-config-data\") pod \"barbican-api-7c4598cc6b-tl8vg\" (UID: \"32501fcc-b225-4b92-8aef-15d69474e7a1\") " pod="openstack/barbican-api-7c4598cc6b-tl8vg" Oct 07 12:46:01 crc kubenswrapper[4854]: E1007 12:46:01.133407 4854 secret.go:188] Couldn't get secret openstack/barbican-config-data: secret "barbican-config-data" not found Oct 07 12:46:01 crc kubenswrapper[4854]: E1007 12:46:01.133452 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32501fcc-b225-4b92-8aef-15d69474e7a1-config-data podName:32501fcc-b225-4b92-8aef-15d69474e7a1 nodeName:}" failed. No retries permitted until 2025-10-07 12:46:03.133437659 +0000 UTC m=+1279.121269924 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/32501fcc-b225-4b92-8aef-15d69474e7a1-config-data") pod "barbican-api-7c4598cc6b-tl8vg" (UID: "32501fcc-b225-4b92-8aef-15d69474e7a1") : secret "barbican-config-data" not found Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.139922 4854 generic.go:334] "Generic (PLEG): container finished" podID="0416b15d-a0a1-4bf2-bd86-4209c14c8e48" containerID="d08d57f3c4a2642e473841c6986c3097f5c785af5a96f4a083b730e4a989d74c" exitCode=137 Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.144727 4854 generic.go:334] "Generic (PLEG): container finished" podID="698aae03-92da-4cc2-a9d2-ecdb5f143439" containerID="b1471691e8926652d5ee413690eb324d89b03c02d93cd735cdce65f268baf71f" exitCode=2 Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.144795 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"698aae03-92da-4cc2-a9d2-ecdb5f143439","Type":"ContainerDied","Data":"b1471691e8926652d5ee413690eb324d89b03c02d93cd735cdce65f268baf71f"} Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.153596 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_6f63842c-f85f-4e07-8221-4ce96b22bf44/ovsdbserver-sb/0.log" Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.153653 4854 generic.go:334] "Generic (PLEG): container finished" podID="6f63842c-f85f-4e07-8221-4ce96b22bf44" containerID="3b5428679fa70871d5e7daa3b281bec044977353bc58c91be36e9d4e54d19bb6" exitCode=2 Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.153668 4854 generic.go:334] "Generic (PLEG): container finished" podID="6f63842c-f85f-4e07-8221-4ce96b22bf44" containerID="746867422f38d038a9f9c789e768db3e7fd483a9e53e306db5812c58e0fc6129" exitCode=143 Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.153748 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"6f63842c-f85f-4e07-8221-4ce96b22bf44","Type":"ContainerDied","Data":"3b5428679fa70871d5e7daa3b281bec044977353bc58c91be36e9d4e54d19bb6"} Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.153775 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"6f63842c-f85f-4e07-8221-4ce96b22bf44","Type":"ContainerDied","Data":"746867422f38d038a9f9c789e768db3e7fd483a9e53e306db5812c58e0fc6129"} Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.162634 4854 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.165701 4854 generic.go:334] "Generic (PLEG): container finished" podID="a936b898-5163-4c5e-ac30-64af1533cec7" containerID="ab9b63d9a92d2b3c1a80cf5ab47e0bdb681cabb6aa9a7547201bf16b8e4df31e" exitCode=143 Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.165752 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a936b898-5163-4c5e-ac30-64af1533cec7","Type":"ContainerDied","Data":"ab9b63d9a92d2b3c1a80cf5ab47e0bdb681cabb6aa9a7547201bf16b8e4df31e"} Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.172919 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-k7545"] Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.179108 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-k7545"] Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.186505 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-531a-account-create-zzlft"] Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.191217 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-531a-account-create-zzlft"] Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.197874 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-m7b9h"] Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.209841 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-m7b9h"] Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.221691 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.222089 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="22343ee2-64b7-4496-b74c-c9860920e953" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://59e04fd8406ec48a71ffcc20755a831bd6b1ec8e9383f0d8c7df80e46452738a" gracePeriod=30 Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.233362 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-65b8874fd7-dnnjf"] Oct 07 12:46:01 crc kubenswrapper[4854]: E1007 12:46:01.241814 4854 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc2c26a76_531a_4a6b_ac0f_6aa23680f903.slice/crio-4442c10882343c29c4d45b4876f4c5029aa8758304578d81129d781083582260.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f9410d0_f08a_4288_901b_8c28b54f6d53.slice/crio-46e4da89bda9a00a4ff3b8768a33badf8cfee5c8e4687bcdeb7761033e1a22b5.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f9410d0_f08a_4288_901b_8c28b54f6d53.slice/crio-conmon-c1daeb00bdab450dbfd0e8a2fc796a0cac58f94569069238d66bafbd697e21fe.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f9410d0_f08a_4288_901b_8c28b54f6d53.slice/crio-7ec2bd59e5244b757b1dc78381bc28563a41c3f4cb1938a20f0abef176871dca.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod698aae03_92da_4cc2_a9d2_ecdb5f143439.slice/crio-b1471691e8926652d5ee413690eb324d89b03c02d93cd735cdce65f268baf71f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc2c26a76_531a_4a6b_ac0f_6aa23680f903.slice/crio-conmon-4442c10882343c29c4d45b4876f4c5029aa8758304578d81129d781083582260.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13453d02_4f55_45a7_98be_1cd41c741a3e.slice/crio-conmon-b70abd6d9bcc3a200f109c4995793bf820e1b93bdac5dd9db4c650a6ff934a3d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f9410d0_f08a_4288_901b_8c28b54f6d53.slice/crio-conmon-41fbd8df61c4cf14f1e960d26b75dfc2bc04a3598dc2d46c94fb7688efb55eb9.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f9410d0_f08a_4288_901b_8c28b54f6d53.slice/crio-conmon-46e4da89bda9a00a4ff3b8768a33badf8cfee5c8e4687bcdeb7761033e1a22b5.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f9410d0_f08a_4288_901b_8c28b54f6d53.slice/crio-conmon-bf5285e76e2b7c58fdaeb7235c95ba9d504989e3ace943cdfc50b7e56f2b19f6.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f9410d0_f08a_4288_901b_8c28b54f6d53.slice/crio-ef4e6844e5b6720c32f74f7e722421b7675513872680d1a7f5bcc693b7983ed2.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddbbd823b_2e1d_4901_855a_72cd9a13a6fd.slice/crio-conmon-f92c3a568292b0a3d969cd5adf77c55dbce4f1e154b300face2ca6a1e33c82df.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f9410d0_f08a_4288_901b_8c28b54f6d53.slice/crio-16029fd8b03b23770b20c5cb19ee5eac37c556a4d1e730daaeee2955d74d1015.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13453d02_4f55_45a7_98be_1cd41c741a3e.slice/crio-b70abd6d9bcc3a200f109c4995793bf820e1b93bdac5dd9db4c650a6ff934a3d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0416b15d_a0a1_4bf2_bd86_4209c14c8e48.slice/crio-conmon-d08d57f3c4a2642e473841c6986c3097f5c785af5a96f4a083b730e4a989d74c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37dd5983_0d4d_4097_8657_f408e9bc68c0.slice/crio-conmon-6c3add82afc5a4706f36b57114ee421863a370ea330e776fc1686cfde8841f49.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f9410d0_f08a_4288_901b_8c28b54f6d53.slice/crio-conmon-16029fd8b03b23770b20c5cb19ee5eac37c556a4d1e730daaeee2955d74d1015.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod770ca0a9_4c48_446b_be08_84b06d20d501.slice/crio-6cbd3121741a98fa970c3ecfd5f8c014c006ce0662866c5090335c8dec86fdc4.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f9410d0_f08a_4288_901b_8c28b54f6d53.slice/crio-d139c560fe8a80f0502b14596ea2825172ee972c7730deb571bbc27b9d43e3a4.scope\": RecentStats: unable to find data in memory cache]" Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.289567 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-7b8cbbf7f5-8gzrj"] Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.289862 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-7b8cbbf7f5-8gzrj" podUID="77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e" containerName="barbican-worker-log" containerID="cri-o://6a8cb87e42de1baa0fe321962804fbc2f8635887612d2573377887b576e22a6c" gracePeriod=30 Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.290290 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-7b8cbbf7f5-8gzrj" podUID="77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e" containerName="barbican-worker" containerID="cri-o://0b7c87e468f2b8ba32d54b131d0c6194d6f52f79c88bd597eeb0b8875705a1d6" gracePeriod=30 Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.305426 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-77f6984dd6-vxjwr"] Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.306298 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-77f6984dd6-vxjwr" podUID="39b6e215-6643-4003-9c53-d33f6af39494" containerName="barbican-keystone-listener-log" containerID="cri-o://55c2f083da705ca36f1812e92722e74d07fb8806209d2dfd59e204e7c4638f72" gracePeriod=30 Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.307070 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-77f6984dd6-vxjwr" podUID="39b6e215-6643-4003-9c53-d33f6af39494" containerName="barbican-keystone-listener" containerID="cri-o://5e0219a6ee2ee21252dc5506f149e4cf4652bd9388a64c5f4c96dec5453578f6" gracePeriod=30 Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.318748 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-d4d8-account-create-rl9sz"] Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.334702 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-75fd88c566-5j4xn"] Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.345073 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placementd4d8-account-delete-fh2n2"] Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.358313 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-d4d8-account-create-rl9sz"] Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.384680 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-8bbnw"] Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.423417 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7595d98994-smt7c"] Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.423925 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7595d98994-smt7c" podUID="6002f7a4-27d6-4554-a486-87926ebcf57e" containerName="barbican-api-log" containerID="cri-o://ab6b9b6df7a6df7e7a7e15ca4a8d64babc1a2ae8feac7c3dc87e742af29e9944" gracePeriod=30 Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.424074 
4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7595d98994-smt7c" podUID="6002f7a4-27d6-4554-a486-87926ebcf57e" containerName="barbican-api" containerID="cri-o://f0ff8ce507731e5f1fa6188aabef5933b1d02dc7dee5ef5454692c6f625fabed" gracePeriod=30 Oct 07 12:46:01 crc kubenswrapper[4854]: E1007 12:46:01.433189 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f92c3a568292b0a3d969cd5adf77c55dbce4f1e154b300face2ca6a1e33c82df is running failed: container process not found" containerID="f92c3a568292b0a3d969cd5adf77c55dbce4f1e154b300face2ca6a1e33c82df" cmd=["/usr/bin/pidof","ovsdb-server"] Oct 07 12:46:01 crc kubenswrapper[4854]: E1007 12:46:01.435611 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f92c3a568292b0a3d969cd5adf77c55dbce4f1e154b300face2ca6a1e33c82df is running failed: container process not found" containerID="f92c3a568292b0a3d969cd5adf77c55dbce4f1e154b300face2ca6a1e33c82df" cmd=["/usr/bin/pidof","ovsdb-server"] Oct 07 12:46:01 crc kubenswrapper[4854]: E1007 12:46:01.439013 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f92c3a568292b0a3d969cd5adf77c55dbce4f1e154b300face2ca6a1e33c82df is running failed: container process not found" containerID="f92c3a568292b0a3d969cd5adf77c55dbce4f1e154b300face2ca6a1e33c82df" cmd=["/usr/bin/pidof","ovsdb-server"] Oct 07 12:46:01 crc kubenswrapper[4854]: E1007 12:46:01.439079 4854 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f92c3a568292b0a3d969cd5adf77c55dbce4f1e154b300face2ca6a1e33c82df is running failed: container process not found" probeType="Readiness" pod="openstack/ovsdbserver-nb-0" podUID="dbbd823b-2e1d-4901-855a-72cd9a13a6fd" containerName="ovsdbserver-nb" Oct 07 12:46:01 crc kubenswrapper[4854]: E1007 12:46:01.441885 4854 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 07 12:46:01 crc kubenswrapper[4854]: E1007 12:46:01.441946 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/79513100-48d2-4e7b-ae14-888322cab8f3-config-data podName:79513100-48d2-4e7b-ae14-888322cab8f3 nodeName:}" failed. No retries permitted until 2025-10-07 12:46:03.441928237 +0000 UTC m=+1279.429760492 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/79513100-48d2-4e7b-ae14-888322cab8f3-config-data") pod "rabbitmq-cell1-server-0" (UID: "79513100-48d2-4e7b-ae14-888322cab8f3") : configmap "rabbitmq-cell1-config-data" not found Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.442045 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-j5h2b" podUID="847eb385-fc80-4568-813d-638dac11d81a" containerName="ovs-vswitchd" containerID="cri-o://1c6a7b5092dfdf765b8d78edcb5848f5ab195e1b5fbac03359ca96b418f42d38" gracePeriod=28 Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.468410 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-8bbnw"] Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.475398 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7c4598cc6b-tl8vg"] Oct 07 12:46:01 crc kubenswrapper[4854]: E1007 12:46:01.476200 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config-data], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/barbican-api-7c4598cc6b-tl8vg" podUID="32501fcc-b225-4b92-8aef-15d69474e7a1" Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.478315 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="f725ba88-4d40-4eab-890d-e114448fabe9" containerName="galera" containerID="cri-o://11b5b9529231881fc76b4bbf3d8f51ee97af2d3afad7b47b620ba4ffbcd4d4ec" gracePeriod=30 Oct 07 12:46:01 crc kubenswrapper[4854]: E1007 12:46:01.491559 4854 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Oct 07 12:46:01 crc kubenswrapper[4854]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Oct 07 12:46:01 crc kubenswrapper[4854]: + source /usr/local/bin/container-scripts/functions Oct 07 12:46:01 crc kubenswrapper[4854]: ++ OVNBridge=br-int Oct 07 12:46:01 crc kubenswrapper[4854]: ++ OVNRemote=tcp:localhost:6642 Oct 07 12:46:01 crc kubenswrapper[4854]: ++ OVNEncapType=geneve Oct 07 12:46:01 crc kubenswrapper[4854]: ++ OVNAvailabilityZones= Oct 07 12:46:01 crc kubenswrapper[4854]: ++ EnableChassisAsGateway=true Oct 07 12:46:01 crc kubenswrapper[4854]: ++ PhysicalNetworks= Oct 07 12:46:01 crc kubenswrapper[4854]: ++ OVNHostName= Oct 07 12:46:01 crc kubenswrapper[4854]: ++ DB_FILE=/etc/openvswitch/conf.db Oct 07 12:46:01 crc kubenswrapper[4854]: ++ ovs_dir=/var/lib/openvswitch Oct 07 12:46:01 crc kubenswrapper[4854]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Oct 07 12:46:01 crc kubenswrapper[4854]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Oct 07 12:46:01 crc kubenswrapper[4854]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 07 12:46:01 crc kubenswrapper[4854]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 07 12:46:01 crc kubenswrapper[4854]: + sleep 0.5 Oct 07 12:46:01 crc kubenswrapper[4854]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 07 12:46:01 crc kubenswrapper[4854]: + sleep 0.5 Oct 07 12:46:01 crc kubenswrapper[4854]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 07 12:46:01 crc kubenswrapper[4854]: + sleep 0.5 Oct 07 12:46:01 crc kubenswrapper[4854]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 07 12:46:01 crc kubenswrapper[4854]: + cleanup_ovsdb_server_semaphore Oct 07 12:46:01 crc kubenswrapper[4854]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 07 12:46:01 crc kubenswrapper[4854]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Oct 07 12:46:01 crc kubenswrapper[4854]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack/ovn-controller-ovs-j5h2b" message=< Oct 07 12:46:01 crc kubenswrapper[4854]: Exiting ovsdb-server (5) [ OK ] Oct 07 12:46:01 crc kubenswrapper[4854]: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Oct 07 12:46:01 crc kubenswrapper[4854]: + source /usr/local/bin/container-scripts/functions Oct 07 12:46:01 crc kubenswrapper[4854]: ++ OVNBridge=br-int Oct 07 12:46:01 crc kubenswrapper[4854]: ++ OVNRemote=tcp:localhost:6642 Oct 07 12:46:01 crc kubenswrapper[4854]: ++ OVNEncapType=geneve Oct 07 12:46:01 crc kubenswrapper[4854]: ++ OVNAvailabilityZones= Oct 07 12:46:01 crc kubenswrapper[4854]: ++ EnableChassisAsGateway=true Oct 07 12:46:01 crc kubenswrapper[4854]: ++ PhysicalNetworks= Oct 07 12:46:01 crc kubenswrapper[4854]: ++ OVNHostName= Oct 07 12:46:01 crc kubenswrapper[4854]: ++ DB_FILE=/etc/openvswitch/conf.db Oct 07 12:46:01 crc kubenswrapper[4854]: ++ ovs_dir=/var/lib/openvswitch Oct 07 12:46:01 crc kubenswrapper[4854]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Oct 07 12:46:01 crc kubenswrapper[4854]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Oct 07 12:46:01 crc kubenswrapper[4854]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 07 12:46:01 crc kubenswrapper[4854]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 07 12:46:01 crc kubenswrapper[4854]: + sleep 0.5 Oct 07 12:46:01 crc kubenswrapper[4854]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 07 12:46:01 crc kubenswrapper[4854]: + sleep 0.5 Oct 07 12:46:01 crc kubenswrapper[4854]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 07 12:46:01 crc kubenswrapper[4854]: + sleep 0.5 Oct 07 12:46:01 crc kubenswrapper[4854]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 07 12:46:01 crc kubenswrapper[4854]: + cleanup_ovsdb_server_semaphore Oct 07 12:46:01 crc kubenswrapper[4854]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 07 12:46:01 crc kubenswrapper[4854]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Oct 07 12:46:01 crc kubenswrapper[4854]: > Oct 07 12:46:01 crc kubenswrapper[4854]: E1007 12:46:01.491600 4854 kuberuntime_container.go:691] "PreStop hook failed" err=< Oct 07 12:46:01 crc kubenswrapper[4854]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Oct 07 12:46:01 crc kubenswrapper[4854]: + source /usr/local/bin/container-scripts/functions Oct 07 12:46:01 crc kubenswrapper[4854]: ++ OVNBridge=br-int Oct 07 12:46:01 crc kubenswrapper[4854]: ++ OVNRemote=tcp:localhost:6642 Oct 07 12:46:01 crc kubenswrapper[4854]: ++ OVNEncapType=geneve Oct 07 12:46:01 crc kubenswrapper[4854]: ++ OVNAvailabilityZones= Oct 07 12:46:01 crc kubenswrapper[4854]: ++ EnableChassisAsGateway=true Oct 07 12:46:01 crc kubenswrapper[4854]: ++ PhysicalNetworks= Oct 07 12:46:01 crc kubenswrapper[4854]: ++ OVNHostName= Oct 07 12:46:01 crc kubenswrapper[4854]: ++ DB_FILE=/etc/openvswitch/conf.db Oct 07 12:46:01 crc kubenswrapper[4854]: ++ ovs_dir=/var/lib/openvswitch Oct 07 12:46:01 crc kubenswrapper[4854]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Oct 07 12:46:01 crc kubenswrapper[4854]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Oct 07 12:46:01 crc kubenswrapper[4854]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 07 12:46:01 crc kubenswrapper[4854]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 07 12:46:01 crc kubenswrapper[4854]: + sleep 0.5 Oct 07 12:46:01 crc kubenswrapper[4854]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 07 12:46:01 crc kubenswrapper[4854]: + sleep 0.5 Oct 07 12:46:01 crc kubenswrapper[4854]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 07 12:46:01 crc kubenswrapper[4854]: + sleep 0.5 Oct 07 12:46:01 crc kubenswrapper[4854]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 07 12:46:01 crc kubenswrapper[4854]: + cleanup_ovsdb_server_semaphore Oct 07 12:46:01 crc kubenswrapper[4854]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 07 12:46:01 crc kubenswrapper[4854]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Oct 07 12:46:01 crc kubenswrapper[4854]: > pod="openstack/ovn-controller-ovs-j5h2b" podUID="847eb385-fc80-4568-813d-638dac11d81a" containerName="ovsdb-server" containerID="cri-o://49664c4eacd10194cd4eaaf0aca67e1328ae47efa8cae6560f85b2f783b3a2b6" Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.491636 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-j5h2b" podUID="847eb385-fc80-4568-813d-638dac11d81a" containerName="ovsdb-server" containerID="cri-o://49664c4eacd10194cd4eaaf0aca67e1328ae47efa8cae6560f85b2f783b3a2b6" gracePeriod=28 Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.506215 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell0e568-account-delete-srq52"] Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.520278 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-e568-account-create-hjhqj"] Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.528198 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-e568-account-create-hjhqj"] Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.541302 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapi643b-account-delete-92r97"] Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.556890 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-vz2cc"] Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.570536 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-vz2cc"] Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.578252 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-643b-account-create-q5znc"] Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.582416 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-643b-account-create-q5znc"] Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.593205 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-276gh"] Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.596243 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-a01a-account-create-c6crg"] Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.602101 4854 util.go:30] "No sandbox for pod can be found. 
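The xtrace dumped above by handlers.go and kuberuntime_container.go is the ovsdb-server container's PreStop hook, /usr/local/bin/container-scripts/stop-ovsdb-server.sh, failing with status 137 (128 + SIGKILL), which usually just means the runtime killed the exec session, for instance because the grace period ran out before the script finished. A minimal sketch of the wait-and-stop sequence, reconstructed only from the trace (the shipped script and its sourced functions file may differ):

    #!/bin/bash
    # Sketch reconstructed from the xtrace in the log above, not the shipped script.
    source "$(dirname "$0")/functions"   # defines OVNBridge, DB_FILE, the semaphore path, ...
    SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server

    # Poll for the semaphore (the trace shows checks at 0.5 s intervals),
    # then clean it up and stop ovsdb-server while leaving ovs-vswitchd alone.
    while [ ! -f "$SEMAPHORE" ]; do
        sleep 0.5
    done
    rm -f "$SEMAPHORE"                   # cleanup_ovsdb_server_semaphore
    /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd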
Need to start a new one" pod="openstack/novacell0e568-account-delete-srq52" Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.604651 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbicana01a-account-delete-8phrr"] Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.612951 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-276gh"] Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.619358 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-a01a-account-create-c6crg"] Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.627008 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.632692 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novaapi643b-account-delete-92r97" Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.632757 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-65b8874fd7-dnnjf"] Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.643458 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbicana01a-account-delete-8phrr" Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.661240 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-d4cpx_47f8159b-e07f-47bd-92e8-a57f3e0c545d/openstack-network-exporter/0.log" Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.661313 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-d4cpx" Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.684608 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_6f63842c-f85f-4e07-8221-4ce96b22bf44/ovsdbserver-sb/0.log" Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.684675 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.699097 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="79513100-48d2-4e7b-ae14-888322cab8f3" containerName="rabbitmq" containerID="cri-o://6b0f9e765f42e0528d289da2b61440b957c5df88bdf61759d5eef1939df5e18b" gracePeriod=604800 Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.710501 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_dbbd823b-2e1d-4901-855a-72cd9a13a6fd/ovsdbserver-nb/0.log" Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.710569 4854 util.go:48] "No ready sandbox for pod can be found. 
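Note the very different grace periods in the "Killing container with a grace period" lines: ovs-vswitchd and ovsdb-server get 28 s, galera 30 s, while the rabbitmq container of rabbitmq-cell1-server-0 is killed with gracePeriod=604800 (seven days). The kubelet takes this value from the pod's spec.terminationGracePeriodSeconds, or from the grace period passed with the DELETE request, so while the pod object still exists it can be read back, for example:

    kubectl -n openstack get pod rabbitmq-cell1-server-0 \
        -o jsonpath='{.spec.terminationGracePeriodSeconds}{"\n"}'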
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.755157 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47f8159b-e07f-47bd-92e8-a57f3e0c545d-config\") pod \"47f8159b-e07f-47bd-92e8-a57f3e0c545d\" (UID: \"47f8159b-e07f-47bd-92e8-a57f3e0c545d\") " Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.755274 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/47f8159b-e07f-47bd-92e8-a57f3e0c545d-ovn-rundir\") pod \"47f8159b-e07f-47bd-92e8-a57f3e0c545d\" (UID: \"47f8159b-e07f-47bd-92e8-a57f3e0c545d\") " Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.755519 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p28ml\" (UniqueName: \"kubernetes.io/projected/47f8159b-e07f-47bd-92e8-a57f3e0c545d-kube-api-access-p28ml\") pod \"47f8159b-e07f-47bd-92e8-a57f3e0c545d\" (UID: \"47f8159b-e07f-47bd-92e8-a57f3e0c545d\") " Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.755649 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/47f8159b-e07f-47bd-92e8-a57f3e0c545d-ovs-rundir\") pod \"47f8159b-e07f-47bd-92e8-a57f3e0c545d\" (UID: \"47f8159b-e07f-47bd-92e8-a57f3e0c545d\") " Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.755742 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/47f8159b-e07f-47bd-92e8-a57f3e0c545d-metrics-certs-tls-certs\") pod \"47f8159b-e07f-47bd-92e8-a57f3e0c545d\" (UID: \"47f8159b-e07f-47bd-92e8-a57f3e0c545d\") " Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.755793 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47f8159b-e07f-47bd-92e8-a57f3e0c545d-combined-ca-bundle\") pod \"47f8159b-e07f-47bd-92e8-a57f3e0c545d\" (UID: \"47f8159b-e07f-47bd-92e8-a57f3e0c545d\") " Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.763036 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-qshkq" Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.765139 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47f8159b-e07f-47bd-92e8-a57f3e0c545d-config" (OuterVolumeSpecName: "config") pod "47f8159b-e07f-47bd-92e8-a57f3e0c545d" (UID: "47f8159b-e07f-47bd-92e8-a57f3e0c545d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.765513 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/47f8159b-e07f-47bd-92e8-a57f3e0c545d-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "47f8159b-e07f-47bd-92e8-a57f3e0c545d" (UID: "47f8159b-e07f-47bd-92e8-a57f3e0c545d"). InnerVolumeSpecName "ovn-rundir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.765548 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/47f8159b-e07f-47bd-92e8-a57f3e0c545d-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "47f8159b-e07f-47bd-92e8-a57f3e0c545d" (UID: "47f8159b-e07f-47bd-92e8-a57f3e0c545d"). InnerVolumeSpecName "ovs-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.782878 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47f8159b-e07f-47bd-92e8-a57f3e0c545d-kube-api-access-p28ml" (OuterVolumeSpecName: "kube-api-access-p28ml") pod "47f8159b-e07f-47bd-92e8-a57f3e0c545d" (UID: "47f8159b-e07f-47bd-92e8-a57f3e0c545d"). InnerVolumeSpecName "kube-api-access-p28ml". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:46:01 crc kubenswrapper[4854]: E1007 12:46:01.791322 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b33111a1b0f14dc72a4fc34de5d78d44cbf9938a6e3632fa063f8bb7985aeaf3" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 07 12:46:01 crc kubenswrapper[4854]: E1007 12:46:01.823526 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b33111a1b0f14dc72a4fc34de5d78d44cbf9938a6e3632fa063f8bb7985aeaf3" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.863312 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f63842c-f85f-4e07-8221-4ce96b22bf44-config\") pod \"6f63842c-f85f-4e07-8221-4ce96b22bf44\" (UID: \"6f63842c-f85f-4e07-8221-4ce96b22bf44\") " Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.863449 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dbbd823b-2e1d-4901-855a-72cd9a13a6fd-scripts\") pod \"dbbd823b-2e1d-4901-855a-72cd9a13a6fd\" (UID: \"dbbd823b-2e1d-4901-855a-72cd9a13a6fd\") " Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.863518 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/43bd8c48-fc92-4f02-af3e-db78673cacb9-dns-svc\") pod \"43bd8c48-fc92-4f02-af3e-db78673cacb9\" (UID: \"43bd8c48-fc92-4f02-af3e-db78673cacb9\") " Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.863620 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6f63842c-f85f-4e07-8221-4ce96b22bf44-scripts\") pod \"6f63842c-f85f-4e07-8221-4ce96b22bf44\" (UID: \"6f63842c-f85f-4e07-8221-4ce96b22bf44\") " Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.864017 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f63842c-f85f-4e07-8221-4ce96b22bf44-metrics-certs-tls-certs\") pod \"6f63842c-f85f-4e07-8221-4ce96b22bf44\" (UID: \"6f63842c-f85f-4e07-8221-4ce96b22bf44\") " Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.864096 4854 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f63842c-f85f-4e07-8221-4ce96b22bf44-combined-ca-bundle\") pod \"6f63842c-f85f-4e07-8221-4ce96b22bf44\" (UID: \"6f63842c-f85f-4e07-8221-4ce96b22bf44\") " Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.864324 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"dbbd823b-2e1d-4901-855a-72cd9a13a6fd\" (UID: \"dbbd823b-2e1d-4901-855a-72cd9a13a6fd\") " Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.864431 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/43bd8c48-fc92-4f02-af3e-db78673cacb9-ovsdbserver-nb\") pod \"43bd8c48-fc92-4f02-af3e-db78673cacb9\" (UID: \"43bd8c48-fc92-4f02-af3e-db78673cacb9\") " Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.864541 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2lrz\" (UniqueName: \"kubernetes.io/projected/43bd8c48-fc92-4f02-af3e-db78673cacb9-kube-api-access-x2lrz\") pod \"43bd8c48-fc92-4f02-af3e-db78673cacb9\" (UID: \"43bd8c48-fc92-4f02-af3e-db78673cacb9\") " Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.865074 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j59bh\" (UniqueName: \"kubernetes.io/projected/6f63842c-f85f-4e07-8221-4ce96b22bf44-kube-api-access-j59bh\") pod \"6f63842c-f85f-4e07-8221-4ce96b22bf44\" (UID: \"6f63842c-f85f-4e07-8221-4ce96b22bf44\") " Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.865176 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbbd823b-2e1d-4901-855a-72cd9a13a6fd-ovsdbserver-nb-tls-certs\") pod \"dbbd823b-2e1d-4901-855a-72cd9a13a6fd\" (UID: \"dbbd823b-2e1d-4901-855a-72cd9a13a6fd\") " Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.865255 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/43bd8c48-fc92-4f02-af3e-db78673cacb9-dns-swift-storage-0\") pod \"43bd8c48-fc92-4f02-af3e-db78673cacb9\" (UID: \"43bd8c48-fc92-4f02-af3e-db78673cacb9\") " Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.865516 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"6f63842c-f85f-4e07-8221-4ce96b22bf44\" (UID: \"6f63842c-f85f-4e07-8221-4ce96b22bf44\") " Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.865570 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbbd823b-2e1d-4901-855a-72cd9a13a6fd-scripts" (OuterVolumeSpecName: "scripts") pod "dbbd823b-2e1d-4901-855a-72cd9a13a6fd" (UID: "dbbd823b-2e1d-4901-855a-72cd9a13a6fd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.865573 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f63842c-f85f-4e07-8221-4ce96b22bf44-scripts" (OuterVolumeSpecName: "scripts") pod "6f63842c-f85f-4e07-8221-4ce96b22bf44" (UID: "6f63842c-f85f-4e07-8221-4ce96b22bf44"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.865707 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbbd823b-2e1d-4901-855a-72cd9a13a6fd-metrics-certs-tls-certs\") pod \"dbbd823b-2e1d-4901-855a-72cd9a13a6fd\" (UID: \"dbbd823b-2e1d-4901-855a-72cd9a13a6fd\") " Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.865888 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbbd823b-2e1d-4901-855a-72cd9a13a6fd-config\") pod \"dbbd823b-2e1d-4901-855a-72cd9a13a6fd\" (UID: \"dbbd823b-2e1d-4901-855a-72cd9a13a6fd\") " Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.865985 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6f63842c-f85f-4e07-8221-4ce96b22bf44-ovsdb-rundir\") pod \"6f63842c-f85f-4e07-8221-4ce96b22bf44\" (UID: \"6f63842c-f85f-4e07-8221-4ce96b22bf44\") " Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.866058 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7v84\" (UniqueName: \"kubernetes.io/projected/dbbd823b-2e1d-4901-855a-72cd9a13a6fd-kube-api-access-x7v84\") pod \"dbbd823b-2e1d-4901-855a-72cd9a13a6fd\" (UID: \"dbbd823b-2e1d-4901-855a-72cd9a13a6fd\") " Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.866376 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f63842c-f85f-4e07-8221-4ce96b22bf44-config" (OuterVolumeSpecName: "config") pod "6f63842c-f85f-4e07-8221-4ce96b22bf44" (UID: "6f63842c-f85f-4e07-8221-4ce96b22bf44"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.866763 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47f8159b-e07f-47bd-92e8-a57f3e0c545d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "47f8159b-e07f-47bd-92e8-a57f3e0c545d" (UID: "47f8159b-e07f-47bd-92e8-a57f3e0c545d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.866128 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dbbd823b-2e1d-4901-855a-72cd9a13a6fd-ovsdb-rundir\") pod \"dbbd823b-2e1d-4901-855a-72cd9a13a6fd\" (UID: \"dbbd823b-2e1d-4901-855a-72cd9a13a6fd\") " Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.867051 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47f8159b-e07f-47bd-92e8-a57f3e0c545d-combined-ca-bundle\") pod \"47f8159b-e07f-47bd-92e8-a57f3e0c545d\" (UID: \"47f8159b-e07f-47bd-92e8-a57f3e0c545d\") " Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.867210 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/43bd8c48-fc92-4f02-af3e-db78673cacb9-ovsdbserver-sb\") pod \"43bd8c48-fc92-4f02-af3e-db78673cacb9\" (UID: \"43bd8c48-fc92-4f02-af3e-db78673cacb9\") " Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.869591 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbbd823b-2e1d-4901-855a-72cd9a13a6fd-combined-ca-bundle\") pod \"dbbd823b-2e1d-4901-855a-72cd9a13a6fd\" (UID: \"dbbd823b-2e1d-4901-855a-72cd9a13a6fd\") " Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.869754 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f63842c-f85f-4e07-8221-4ce96b22bf44-ovsdbserver-sb-tls-certs\") pod \"6f63842c-f85f-4e07-8221-4ce96b22bf44\" (UID: \"6f63842c-f85f-4e07-8221-4ce96b22bf44\") " Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.869788 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43bd8c48-fc92-4f02-af3e-db78673cacb9-config\") pod \"43bd8c48-fc92-4f02-af3e-db78673cacb9\" (UID: \"43bd8c48-fc92-4f02-af3e-db78673cacb9\") " Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.871113 4854 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47f8159b-e07f-47bd-92e8-a57f3e0c545d-config\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.871135 4854 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f63842c-f85f-4e07-8221-4ce96b22bf44-config\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.871162 4854 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dbbd823b-2e1d-4901-855a-72cd9a13a6fd-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.871171 4854 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/47f8159b-e07f-47bd-92e8-a57f3e0c545d-ovn-rundir\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.871181 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p28ml\" (UniqueName: \"kubernetes.io/projected/47f8159b-e07f-47bd-92e8-a57f3e0c545d-kube-api-access-p28ml\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.871191 4854 
reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6f63842c-f85f-4e07-8221-4ce96b22bf44-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.871201 4854 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/47f8159b-e07f-47bd-92e8-a57f3e0c545d-ovs-rundir\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:01 crc kubenswrapper[4854]: E1007 12:46:01.879258 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b33111a1b0f14dc72a4fc34de5d78d44cbf9938a6e3632fa063f8bb7985aeaf3" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 07 12:46:01 crc kubenswrapper[4854]: E1007 12:46:01.879316 4854 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="016c2264-9ba4-48c0-b416-02c468232b6b" containerName="nova-scheduler-scheduler" Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.879603 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.879813 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="cac73de2-996a-4e04-abde-1153b44058bc" containerName="nova-cell1-conductor-conductor" containerID="cri-o://efd6f20e41e37b885bf213a2da547584c13e4bdab4263b0b4585c26d56600700" gracePeriod=30 Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.882382 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbbd823b-2e1d-4901-855a-72cd9a13a6fd-config" (OuterVolumeSpecName: "config") pod "dbbd823b-2e1d-4901-855a-72cd9a13a6fd" (UID: "dbbd823b-2e1d-4901-855a-72cd9a13a6fd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.882899 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f63842c-f85f-4e07-8221-4ce96b22bf44-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "6f63842c-f85f-4e07-8221-4ce96b22bf44" (UID: "6f63842c-f85f-4e07-8221-4ce96b22bf44"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.897088 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43bd8c48-fc92-4f02-af3e-db78673cacb9-kube-api-access-x2lrz" (OuterVolumeSpecName: "kube-api-access-x2lrz") pod "43bd8c48-fc92-4f02-af3e-db78673cacb9" (UID: "43bd8c48-fc92-4f02-af3e-db78673cacb9"). InnerVolumeSpecName "kube-api-access-x2lrz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.898871 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbbd823b-2e1d-4901-855a-72cd9a13a6fd-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "dbbd823b-2e1d-4901-855a-72cd9a13a6fd" (UID: "dbbd823b-2e1d-4901-855a-72cd9a13a6fd"). InnerVolumeSpecName "ovsdb-rundir". 
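The "ExecSync cmd from runtime service failed" and "Probe errored" lines for openstack/nova-scheduler-0 are the exec readiness probe (/usr/bin/pgrep -r DRST nova-scheduler) colliding with shutdown: once CRI-O marks the container as stopping it refuses to register a new exec PID, so the probe errors out instead of returning a clean pass or fail. Against a still-running scheduler the same check can be run by hand, for example:

    # exits 0 if a nova-scheduler process is in run state D, R, S or T
    kubectl -n openstack exec nova-scheduler-0 -- /usr/bin/pgrep -r DRST nova-scheduler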
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:46:01 crc kubenswrapper[4854]: W1007 12:46:01.898979 4854 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/47f8159b-e07f-47bd-92e8-a57f3e0c545d/volumes/kubernetes.io~secret/combined-ca-bundle Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.898988 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47f8159b-e07f-47bd-92e8-a57f3e0c545d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "47f8159b-e07f-47bd-92e8-a57f3e0c545d" (UID: "47f8159b-e07f-47bd-92e8-a57f3e0c545d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.914726 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zpl49"] Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.923931 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "dbbd823b-2e1d-4901-855a-72cd9a13a6fd" (UID: "dbbd823b-2e1d-4901-855a-72cd9a13a6fd"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.930817 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zpl49"] Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.954335 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "6f63842c-f85f-4e07-8221-4ce96b22bf44" (UID: "6f63842c-f85f-4e07-8221-4ce96b22bf44"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.954774 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f63842c-f85f-4e07-8221-4ce96b22bf44-kube-api-access-j59bh" (OuterVolumeSpecName: "kube-api-access-j59bh") pod "6f63842c-f85f-4e07-8221-4ce96b22bf44" (UID: "6f63842c-f85f-4e07-8221-4ce96b22bf44"). InnerVolumeSpecName "kube-api-access-j59bh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.955777 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbbd823b-2e1d-4901-855a-72cd9a13a6fd-kube-api-access-x7v84" (OuterVolumeSpecName: "kube-api-access-x7v84") pod "dbbd823b-2e1d-4901-855a-72cd9a13a6fd" (UID: "dbbd823b-2e1d-4901-855a-72cd9a13a6fd"). InnerVolumeSpecName "kube-api-access-x7v84". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.979139 4854 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbbd823b-2e1d-4901-855a-72cd9a13a6fd-config\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.979208 4854 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6f63842c-f85f-4e07-8221-4ce96b22bf44-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.979225 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7v84\" (UniqueName: \"kubernetes.io/projected/dbbd823b-2e1d-4901-855a-72cd9a13a6fd-kube-api-access-x7v84\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.979239 4854 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dbbd823b-2e1d-4901-855a-72cd9a13a6fd-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.979252 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47f8159b-e07f-47bd-92e8-a57f3e0c545d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.979285 4854 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.979300 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2lrz\" (UniqueName: \"kubernetes.io/projected/43bd8c48-fc92-4f02-af3e-db78673cacb9-kube-api-access-x2lrz\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.979314 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j59bh\" (UniqueName: \"kubernetes.io/projected/6f63842c-f85f-4e07-8221-4ce96b22bf44-kube-api-access-j59bh\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.979341 4854 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.988308 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 07 12:46:01 crc kubenswrapper[4854]: I1007 12:46:01.988829 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="3565e266-6994-4000-a4f2-2901e22f6682" containerName="nova-cell0-conductor-conductor" containerID="cri-o://6b55461da6eb4b748690750f389a1e88f87574727ef1064972561b0e7d0151ea" gracePeriod=30 Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.018334 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rrljj"] Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.058041 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rrljj"] Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.086925 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance3267-account-delete-cwdxm"] Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 
12:46:02.087293 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47f8159b-e07f-47bd-92e8-a57f3e0c545d-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "47f8159b-e07f-47bd-92e8-a57f3e0c545d" (UID: "47f8159b-e07f-47bd-92e8-a57f3e0c545d"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.096190 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-75fd88c566-5j4xn"] Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.113036 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutronb378-account-delete-jzbpj"] Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.119682 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f63842c-f85f-4e07-8221-4ce96b22bf44-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6f63842c-f85f-4e07-8221-4ce96b22bf44" (UID: "6f63842c-f85f-4e07-8221-4ce96b22bf44"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.119798 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placementd4d8-account-delete-fh2n2"] Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.123837 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinderc7c1-account-delete-dz57l"] Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.136851 4854 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.165449 4854 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.183499 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f63842c-f85f-4e07-8221-4ce96b22bf44-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.183534 4854 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.183543 4854 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.183553 4854 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/47f8159b-e07f-47bd-92e8-a57f3e0c545d-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.186109 4854 util.go:48] "No ready sandbox for pod can be found. 
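local-storage10-crc and local-storage06-crc are the local PersistentVolumes behind the ovndbcluster-nb-etc-ovn and ovndbcluster-sb-etc-ovn claims; "UnmountDevice succeeded" followed by "Volume detached" means their global mount points on the node have been released. Whether those PVs return to Available or stay Released afterwards depends on their reclaim policy, which can be checked with something like:

    kubectl get pv local-storage10-crc local-storage06-crc \
        -o custom-columns=NAME:.metadata.name,PHASE:.status.phase,RECLAIM:.spec.persistentVolumeReclaimPolicy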
Need to start a new one" pod="openstack/openstackclient" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.202320 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance3267-account-delete-cwdxm" event={"ID":"e7453b38-f6c3-4fe7-b15d-5bd8112dc687","Type":"ContainerStarted","Data":"ee5287507b8008495245a090482b288e4e0c9225d2dc2b05e38266512b1a68d0"} Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.203091 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f63842c-f85f-4e07-8221-4ce96b22bf44-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "6f63842c-f85f-4e07-8221-4ce96b22bf44" (UID: "6f63842c-f85f-4e07-8221-4ce96b22bf44"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.205662 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placementd4d8-account-delete-fh2n2" event={"ID":"66f80399-ed98-4aba-9db5-759ad2e314fa","Type":"ContainerStarted","Data":"5d555e41a71b72a5185f547f2f1f330886012a2560d5d06568c69bc6395e5b39"} Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.208710 4854 scope.go:117] "RemoveContainer" containerID="d08d57f3c4a2642e473841c6986c3097f5c785af5a96f4a083b730e4a989d74c" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.208867 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.218249 4854 generic.go:334] "Generic (PLEG): container finished" podID="37dd5983-0d4d-4097-8657-f408e9bc68c0" containerID="6c3add82afc5a4706f36b57114ee421863a370ea330e776fc1686cfde8841f49" exitCode=143 Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.218383 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"37dd5983-0d4d-4097-8657-f408e9bc68c0","Type":"ContainerDied","Data":"6c3add82afc5a4706f36b57114ee421863a370ea330e776fc1686cfde8841f49"} Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.220598 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_6f63842c-f85f-4e07-8221-4ce96b22bf44/ovsdbserver-sb/0.log" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.220699 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"6f63842c-f85f-4e07-8221-4ce96b22bf44","Type":"ContainerDied","Data":"783c7da94febb249bab5774efd43f0f500c8148eb2f5c48b8262c6b322e93806"} Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.220903 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.246047 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutronb378-account-delete-jzbpj" event={"ID":"ac535972-fa59-4e7f-818b-345da6937c14","Type":"ContainerStarted","Data":"5fc8a1cdad97dd2b204027fe3b0edf573d51ce78ed821b165334d96d088c8d5d"} Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.256075 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-qshkq" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.256010 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-qshkq" event={"ID":"43bd8c48-fc92-4f02-af3e-db78673cacb9","Type":"ContainerDied","Data":"0ddbe79b9602fd8d93add2ee7a3ad6de13e93baac76695f4efc390c72e1935f0"} Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.262013 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-75fd88c566-5j4xn" event={"ID":"04ffb838-3774-402c-9cdf-d11e51fb21e5","Type":"ContainerStarted","Data":"8efacd13cb9d6ed637f2722f33cc0eb702b0a0faa85193aa1e09cbcd8d27bb12"} Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.262955 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43bd8c48-fc92-4f02-af3e-db78673cacb9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "43bd8c48-fc92-4f02-af3e-db78673cacb9" (UID: "43bd8c48-fc92-4f02-af3e-db78673cacb9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.269073 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinderc7c1-account-delete-dz57l" event={"ID":"787f934e-4f31-4b00-8cf6-380efd34aaad","Type":"ContainerStarted","Data":"db5889e0b4fdd9ccdc73334e37e2cb3cc8fa447356ffa435225c965d86e57cfd"} Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.276240 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f63842c-f85f-4e07-8221-4ce96b22bf44-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "6f63842c-f85f-4e07-8221-4ce96b22bf44" (UID: "6f63842c-f85f-4e07-8221-4ce96b22bf44"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.278719 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_dbbd823b-2e1d-4901-855a-72cd9a13a6fd/ovsdbserver-nb/0.log" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.278860 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.279633 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"dbbd823b-2e1d-4901-855a-72cd9a13a6fd","Type":"ContainerDied","Data":"bb54c736c2d80891bb8b405ef6a3e05368ba386d2418cd4bc7daf465ad50f464"} Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.289820 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ssz77\" (UniqueName: \"kubernetes.io/projected/0416b15d-a0a1-4bf2-bd86-4209c14c8e48-kube-api-access-ssz77\") pod \"0416b15d-a0a1-4bf2-bd86-4209c14c8e48\" (UID: \"0416b15d-a0a1-4bf2-bd86-4209c14c8e48\") " Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.289893 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0416b15d-a0a1-4bf2-bd86-4209c14c8e48-combined-ca-bundle\") pod \"0416b15d-a0a1-4bf2-bd86-4209c14c8e48\" (UID: \"0416b15d-a0a1-4bf2-bd86-4209c14c8e48\") " Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.289956 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0416b15d-a0a1-4bf2-bd86-4209c14c8e48-openstack-config\") pod \"0416b15d-a0a1-4bf2-bd86-4209c14c8e48\" (UID: \"0416b15d-a0a1-4bf2-bd86-4209c14c8e48\") " Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.289995 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0416b15d-a0a1-4bf2-bd86-4209c14c8e48-openstack-config-secret\") pod \"0416b15d-a0a1-4bf2-bd86-4209c14c8e48\" (UID: \"0416b15d-a0a1-4bf2-bd86-4209c14c8e48\") " Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.292211 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbbd823b-2e1d-4901-855a-72cd9a13a6fd-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "dbbd823b-2e1d-4901-855a-72cd9a13a6fd" (UID: "dbbd823b-2e1d-4901-855a-72cd9a13a6fd"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.304489 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-d4cpx_47f8159b-e07f-47bd-92e8-a57f3e0c545d/openstack-network-exporter/0.log" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.304671 4854 util.go:48] "No ready sandbox for pod can be found. 
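The repeated util.go:48 "No ready sandbox for pod can be found. Need to start a new one" messages for ovsdbserver-nb-0, ovsdbserver-sb-0, ovn-controller-metrics-d4cpx and dnsmasq-dns-89c5cd4d5-qshkq are the pod workers noticing that the pod sandbox has died while those pods are simultaneously being deleted; no new sandbox actually comes up because the DELETE wins. If crictl is available on the node, the sandbox and container state behind these messages can be inspected with, for example:

    crictl pods --name ovsdbserver-sb-0      # sandbox state (Ready / NotReady)
    crictl ps -a | grep ovsdbserver          # containers, including exited ones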
Need to start a new one" pod="openstack/ovn-controller-metrics-d4cpx" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.306321 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-d4cpx" event={"ID":"47f8159b-e07f-47bd-92e8-a57f3e0c545d","Type":"ContainerDied","Data":"70cd323ec7510604eb6a540c69ab38d09f868b8b9b462b34ef8aa05cc1da4a28"} Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.315640 4854 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f63842c-f85f-4e07-8221-4ce96b22bf44-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.315679 4854 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/43bd8c48-fc92-4f02-af3e-db78673cacb9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.315694 4854 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbbd823b-2e1d-4901-855a-72cd9a13a6fd-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.315709 4854 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f63842c-f85f-4e07-8221-4ce96b22bf44-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.317660 4854 generic.go:334] "Generic (PLEG): container finished" podID="847eb385-fc80-4568-813d-638dac11d81a" containerID="49664c4eacd10194cd4eaaf0aca67e1328ae47efa8cae6560f85b2f783b3a2b6" exitCode=0 Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.317720 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-j5h2b" event={"ID":"847eb385-fc80-4568-813d-638dac11d81a","Type":"ContainerDied","Data":"49664c4eacd10194cd4eaaf0aca67e1328ae47efa8cae6560f85b2f783b3a2b6"} Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.318709 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0416b15d-a0a1-4bf2-bd86-4209c14c8e48-kube-api-access-ssz77" (OuterVolumeSpecName: "kube-api-access-ssz77") pod "0416b15d-a0a1-4bf2-bd86-4209c14c8e48" (UID: "0416b15d-a0a1-4bf2-bd86-4209c14c8e48"). InnerVolumeSpecName "kube-api-access-ssz77". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.325624 4854 generic.go:334] "Generic (PLEG): container finished" podID="a03c4a0d-6346-43e4-8db1-f653b5dfa420" containerID="85e71836a3ed17c4ff8324140a6164289c633f406fdb437654545bf3071cb059" exitCode=143 Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.325705 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6f54bfd6b4-g5gq4" event={"ID":"a03c4a0d-6346-43e4-8db1-f653b5dfa420","Type":"ContainerDied","Data":"85e71836a3ed17c4ff8324140a6164289c633f406fdb437654545bf3071cb059"} Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.340664 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43bd8c48-fc92-4f02-af3e-db78673cacb9-config" (OuterVolumeSpecName: "config") pod "43bd8c48-fc92-4f02-af3e-db78673cacb9" (UID: "43bd8c48-fc92-4f02-af3e-db78673cacb9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.341019 4854 generic.go:334] "Generic (PLEG): container finished" podID="39b6e215-6643-4003-9c53-d33f6af39494" containerID="55c2f083da705ca36f1812e92722e74d07fb8806209d2dfd59e204e7c4638f72" exitCode=143 Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.341090 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-77f6984dd6-vxjwr" event={"ID":"39b6e215-6643-4003-9c53-d33f6af39494","Type":"ContainerDied","Data":"55c2f083da705ca36f1812e92722e74d07fb8806209d2dfd59e204e7c4638f72"} Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.360679 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43bd8c48-fc92-4f02-af3e-db78673cacb9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "43bd8c48-fc92-4f02-af3e-db78673cacb9" (UID: "43bd8c48-fc92-4f02-af3e-db78673cacb9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.365929 4854 generic.go:334] "Generic (PLEG): container finished" podID="77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e" containerID="6a8cb87e42de1baa0fe321962804fbc2f8635887612d2573377887b576e22a6c" exitCode=143 Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.365998 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7b8cbbf7f5-8gzrj" event={"ID":"77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e","Type":"ContainerDied","Data":"6a8cb87e42de1baa0fe321962804fbc2f8635887612d2573377887b576e22a6c"} Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.390477 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0416b15d-a0a1-4bf2-bd86-4209c14c8e48-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "0416b15d-a0a1-4bf2-bd86-4209c14c8e48" (UID: "0416b15d-a0a1-4bf2-bd86-4209c14c8e48"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.423203 4854 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43bd8c48-fc92-4f02-af3e-db78673cacb9-config\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.423226 4854 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/43bd8c48-fc92-4f02-af3e-db78673cacb9-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.423234 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ssz77\" (UniqueName: \"kubernetes.io/projected/0416b15d-a0a1-4bf2-bd86-4209c14c8e48-kube-api-access-ssz77\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.423243 4854 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0416b15d-a0a1-4bf2-bd86-4209c14c8e48-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.440633 4854 generic.go:334] "Generic (PLEG): container finished" podID="770ca0a9-4c48-446b-be08-84b06d20d501" containerID="6cbd3121741a98fa970c3ecfd5f8c014c006ce0662866c5090335c8dec86fdc4" exitCode=143 Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.440752 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"770ca0a9-4c48-446b-be08-84b06d20d501","Type":"ContainerDied","Data":"6cbd3121741a98fa970c3ecfd5f8c014c006ce0662866c5090335c8dec86fdc4"} Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.442487 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbbd823b-2e1d-4901-855a-72cd9a13a6fd-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "dbbd823b-2e1d-4901-855a-72cd9a13a6fd" (UID: "dbbd823b-2e1d-4901-855a-72cd9a13a6fd"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.492782 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43bd8c48-fc92-4f02-af3e-db78673cacb9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "43bd8c48-fc92-4f02-af3e-db78673cacb9" (UID: "43bd8c48-fc92-4f02-af3e-db78673cacb9"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.524776 4854 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/43bd8c48-fc92-4f02-af3e-db78673cacb9-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.524810 4854 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbbd823b-2e1d-4901-855a-72cd9a13a6fd-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.527509 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0416b15d-a0a1-4bf2-bd86-4209c14c8e48-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "0416b15d-a0a1-4bf2-bd86-4209c14c8e48" (UID: "0416b15d-a0a1-4bf2-bd86-4209c14c8e48"). 
InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.537921 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbbd823b-2e1d-4901-855a-72cd9a13a6fd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dbbd823b-2e1d-4901-855a-72cd9a13a6fd" (UID: "dbbd823b-2e1d-4901-855a-72cd9a13a6fd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.550744 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43bd8c48-fc92-4f02-af3e-db78673cacb9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "43bd8c48-fc92-4f02-af3e-db78673cacb9" (UID: "43bd8c48-fc92-4f02-af3e-db78673cacb9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.570136 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0416b15d-a0a1-4bf2-bd86-4209c14c8e48-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0416b15d-a0a1-4bf2-bd86-4209c14c8e48" (UID: "0416b15d-a0a1-4bf2-bd86-4209c14c8e48"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.577157 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbicana01a-account-delete-8phrr"] Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.580667 4854 generic.go:334] "Generic (PLEG): container finished" podID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerID="6e9be7ed32f4b1f0286eb9c12fca3356d25542eaefb68cc3134f42aca219812e" exitCode=0 Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.580697 4854 generic.go:334] "Generic (PLEG): container finished" podID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerID="41fbd8df61c4cf14f1e960d26b75dfc2bc04a3598dc2d46c94fb7688efb55eb9" exitCode=0 Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.580706 4854 generic.go:334] "Generic (PLEG): container finished" podID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerID="6c57c9247bb283b1cdb97c6d4a8501239230072515b3dfc540b0bb0d946a0d67" exitCode=0 Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.580713 4854 generic.go:334] "Generic (PLEG): container finished" podID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerID="e7d1be76b597fbb83db1e8cb27aca94cb03bb1ca865e7042ef7e99b025c26823" exitCode=0 Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.580754 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6f9410d0-f08a-4288-901b-8c28b54f6d53","Type":"ContainerDied","Data":"6e9be7ed32f4b1f0286eb9c12fca3356d25542eaefb68cc3134f42aca219812e"} Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.580783 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6f9410d0-f08a-4288-901b-8c28b54f6d53","Type":"ContainerDied","Data":"41fbd8df61c4cf14f1e960d26b75dfc2bc04a3598dc2d46c94fb7688efb55eb9"} Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.580792 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6f9410d0-f08a-4288-901b-8c28b54f6d53","Type":"ContainerDied","Data":"6c57c9247bb283b1cdb97c6d4a8501239230072515b3dfc540b0bb0d946a0d67"} Oct 07 12:46:02 crc 
kubenswrapper[4854]: I1007 12:46:02.580801 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6f9410d0-f08a-4288-901b-8c28b54f6d53","Type":"ContainerDied","Data":"e7d1be76b597fbb83db1e8cb27aca94cb03bb1ca865e7042ef7e99b025c26823"} Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.589262 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell0e568-account-delete-srq52"] Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.602919 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapi643b-account-delete-92r97"] Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.603744 4854 generic.go:334] "Generic (PLEG): container finished" podID="21065050-7bdc-4f4e-9a7b-9dbcc2dab200" containerID="3cca971c6b0c47f58de9e946459a6537fae5efe97def4c9f7b9f8c2749fe2fc8" exitCode=0 Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.606002 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"21065050-7bdc-4f4e-9a7b-9dbcc2dab200","Type":"ContainerDied","Data":"3cca971c6b0c47f58de9e946459a6537fae5efe97def4c9f7b9f8c2749fe2fc8"} Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.618603 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-65b8874fd7-dnnjf" event={"ID":"5179f78d-3a8f-4621-95f3-e147ff8da79f","Type":"ContainerStarted","Data":"02fc41252a41df476e3c3e2c65ab0a5aca662f6c1710215b0dcdfa49eb74b167"} Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.618657 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-65b8874fd7-dnnjf" event={"ID":"5179f78d-3a8f-4621-95f3-e147ff8da79f","Type":"ContainerStarted","Data":"420df5386f6bce64ac480f93e7cefca1f7b6d0c1344eee0f4357340f7631529e"} Oct 07 12:46:02 crc kubenswrapper[4854]: E1007 12:46:02.623684 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6b55461da6eb4b748690750f389a1e88f87574727ef1064972561b0e7d0151ea" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.623894 4854 generic.go:334] "Generic (PLEG): container finished" podID="6002f7a4-27d6-4554-a486-87926ebcf57e" containerID="ab6b9b6df7a6df7e7a7e15ca4a8d64babc1a2ae8feac7c3dc87e742af29e9944" exitCode=143 Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.624000 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7c4598cc6b-tl8vg" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.624179 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7595d98994-smt7c" event={"ID":"6002f7a4-27d6-4554-a486-87926ebcf57e","Type":"ContainerDied","Data":"ab6b9b6df7a6df7e7a7e15ca4a8d64babc1a2ae8feac7c3dc87e742af29e9944"} Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.626869 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0416b15d-a0a1-4bf2-bd86-4209c14c8e48-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.626909 4854 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0416b15d-a0a1-4bf2-bd86-4209c14c8e48-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.626920 4854 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/43bd8c48-fc92-4f02-af3e-db78673cacb9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.626930 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbbd823b-2e1d-4901-855a-72cd9a13a6fd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:02 crc kubenswrapper[4854]: E1007 12:46:02.635105 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6b55461da6eb4b748690750f389a1e88f87574727ef1064972561b0e7d0151ea" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 07 12:46:02 crc kubenswrapper[4854]: E1007 12:46:02.639641 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6b55461da6eb4b748690750f389a1e88f87574727ef1064972561b0e7d0151ea" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 07 12:46:02 crc kubenswrapper[4854]: E1007 12:46:02.639711 4854 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="3565e266-6994-4000-a4f2-2901e22f6682" containerName="nova-cell0-conductor-conductor" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.705956 4854 scope.go:117] "RemoveContainer" containerID="3b5428679fa70871d5e7daa3b281bec044977353bc58c91be36e9d4e54d19bb6" Oct 07 12:46:02 crc kubenswrapper[4854]: E1007 12:46:02.732614 4854 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 07 12:46:02 crc kubenswrapper[4854]: E1007 12:46:02.732715 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-config-data podName:4c293f13-b2a5-4d4b-9f69-fd118e34eab2 nodeName:}" failed. No retries permitted until 2025-10-07 12:46:06.732689112 +0000 UTC m=+1282.720521357 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-config-data") pod "rabbitmq-server-0" (UID: "4c293f13-b2a5-4d4b-9f69-fd118e34eab2") : configmap "rabbitmq-config-data" not found Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.738027 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00fa0fe0-f7e2-4049-9158-248e0d3f1ea6" path="/var/lib/kubelet/pods/00fa0fe0-f7e2-4049-9158-248e0d3f1ea6/volumes" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.738699 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0416b15d-a0a1-4bf2-bd86-4209c14c8e48" path="/var/lib/kubelet/pods/0416b15d-a0a1-4bf2-bd86-4209c14c8e48/volumes" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.739291 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a366a87-147e-465b-80f0-484c351b07c4" path="/var/lib/kubelet/pods/3a366a87-147e-465b-80f0-484c351b07c4/volumes" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.739909 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40931f51-fd4e-40fd-82fc-fd8b92bf09e5" path="/var/lib/kubelet/pods/40931f51-fd4e-40fd-82fc-fd8b92bf09e5/volumes" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.740582 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7c4598cc6b-tl8vg" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.741196 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e76b358-7d43-4d2b-8aec-c90b3e4e1021" path="/var/lib/kubelet/pods/4e76b358-7d43-4d2b-8aec-c90b3e4e1021/volumes" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.742984 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50a7ebb6-06f1-44cf-806a-f824afac8cf9" path="/var/lib/kubelet/pods/50a7ebb6-06f1-44cf-806a-f824afac8cf9/volumes" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.743657 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="757432e0-dbdf-40a7-b6e4-2534d6600bea" path="/var/lib/kubelet/pods/757432e0-dbdf-40a7-b6e4-2534d6600bea/volumes" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.746072 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e143666-1ff7-48c5-b28b-42fc66cd56af" path="/var/lib/kubelet/pods/7e143666-1ff7-48c5-b28b-42fc66cd56af/volumes" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.746998 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85fde367-0bd7-457c-bfa7-6d72cece519e" path="/var/lib/kubelet/pods/85fde367-0bd7-457c-bfa7-6d72cece519e/volumes" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.747767 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96132415-18c4-42a4-bb2c-cec92945602e" path="/var/lib/kubelet/pods/96132415-18c4-42a4-bb2c-cec92945602e/volumes" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.748447 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="982baa82-c0c6-4cc7-8c1e-fd351b582446" path="/var/lib/kubelet/pods/982baa82-c0c6-4cc7-8c1e-fd351b582446/volumes" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.749686 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a356da3f-0fbb-4d65-8b1f-a4fc6a0a881f" path="/var/lib/kubelet/pods/a356da3f-0fbb-4d65-8b1f-a4fc6a0a881f/volumes" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.750232 4854 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="f512fcc4-c7f8-45ae-b165-c7367ee2d7fe" path="/var/lib/kubelet/pods/f512fcc4-c7f8-45ae-b165-c7367ee2d7fe/volumes" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.781747 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.814585 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.832974 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-d4cpx"] Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.842907 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/32501fcc-b225-4b92-8aef-15d69474e7a1-config-data-custom\") pod \"32501fcc-b225-4b92-8aef-15d69474e7a1\" (UID: \"32501fcc-b225-4b92-8aef-15d69474e7a1\") " Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.842966 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/32501fcc-b225-4b92-8aef-15d69474e7a1-internal-tls-certs\") pod \"32501fcc-b225-4b92-8aef-15d69474e7a1\" (UID: \"32501fcc-b225-4b92-8aef-15d69474e7a1\") " Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.843026 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dcnng\" (UniqueName: \"kubernetes.io/projected/32501fcc-b225-4b92-8aef-15d69474e7a1-kube-api-access-dcnng\") pod \"32501fcc-b225-4b92-8aef-15d69474e7a1\" (UID: \"32501fcc-b225-4b92-8aef-15d69474e7a1\") " Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.843113 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/32501fcc-b225-4b92-8aef-15d69474e7a1-public-tls-certs\") pod \"32501fcc-b225-4b92-8aef-15d69474e7a1\" (UID: \"32501fcc-b225-4b92-8aef-15d69474e7a1\") " Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.843139 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32501fcc-b225-4b92-8aef-15d69474e7a1-combined-ca-bundle\") pod \"32501fcc-b225-4b92-8aef-15d69474e7a1\" (UID: \"32501fcc-b225-4b92-8aef-15d69474e7a1\") " Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.843300 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32501fcc-b225-4b92-8aef-15d69474e7a1-logs\") pod \"32501fcc-b225-4b92-8aef-15d69474e7a1\" (UID: \"32501fcc-b225-4b92-8aef-15d69474e7a1\") " Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.846988 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32501fcc-b225-4b92-8aef-15d69474e7a1-logs" (OuterVolumeSpecName: "logs") pod "32501fcc-b225-4b92-8aef-15d69474e7a1" (UID: "32501fcc-b225-4b92-8aef-15d69474e7a1"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.868520 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-d4cpx"] Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.878395 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-667c455579-lnd9l"] Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.878699 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-667c455579-lnd9l" podUID="2c8167ae-8941-4616-bee2-ff0fb5e98c16" containerName="proxy-httpd" containerID="cri-o://364708de26903cb2d2d7224428a3d27a659926ef0d8ef5da26e1579482e5acf8" gracePeriod=30 Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.878941 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-667c455579-lnd9l" podUID="2c8167ae-8941-4616-bee2-ff0fb5e98c16" containerName="proxy-server" containerID="cri-o://dd65a138aac9fa721dada3f006a871724f98f90b0fa13687c508bb7e6049a302" gracePeriod=30 Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.887170 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-qshkq"] Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.897892 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-qshkq"] Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.905474 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.906686 4854 scope.go:117] "RemoveContainer" containerID="746867422f38d038a9f9c789e768db3e7fd483a9e53e306db5812c58e0fc6129" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.913241 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.927945 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32501fcc-b225-4b92-8aef-15d69474e7a1-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "32501fcc-b225-4b92-8aef-15d69474e7a1" (UID: "32501fcc-b225-4b92-8aef-15d69474e7a1"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.927964 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32501fcc-b225-4b92-8aef-15d69474e7a1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "32501fcc-b225-4b92-8aef-15d69474e7a1" (UID: "32501fcc-b225-4b92-8aef-15d69474e7a1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.928209 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32501fcc-b225-4b92-8aef-15d69474e7a1-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "32501fcc-b225-4b92-8aef-15d69474e7a1" (UID: "32501fcc-b225-4b92-8aef-15d69474e7a1"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.928340 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32501fcc-b225-4b92-8aef-15d69474e7a1-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "32501fcc-b225-4b92-8aef-15d69474e7a1" (UID: "32501fcc-b225-4b92-8aef-15d69474e7a1"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.928373 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32501fcc-b225-4b92-8aef-15d69474e7a1-kube-api-access-dcnng" (OuterVolumeSpecName: "kube-api-access-dcnng") pod "32501fcc-b225-4b92-8aef-15d69474e7a1" (UID: "32501fcc-b225-4b92-8aef-15d69474e7a1"). InnerVolumeSpecName "kube-api-access-dcnng". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.945037 4854 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32501fcc-b225-4b92-8aef-15d69474e7a1-logs\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.945058 4854 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/32501fcc-b225-4b92-8aef-15d69474e7a1-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.945068 4854 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/32501fcc-b225-4b92-8aef-15d69474e7a1-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.945078 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dcnng\" (UniqueName: \"kubernetes.io/projected/32501fcc-b225-4b92-8aef-15d69474e7a1-kube-api-access-dcnng\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.945089 4854 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/32501fcc-b225-4b92-8aef-15d69474e7a1-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.945099 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32501fcc-b225-4b92-8aef-15d69474e7a1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:02 crc kubenswrapper[4854]: I1007 12:46:02.980691 4854 scope.go:117] "RemoveContainer" containerID="222c24d46a308342692e4bc3749f04b6f56b937db2818c8260cd786457220f04" Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.010435 4854 scope.go:117] "RemoveContainer" containerID="a50dfeaa00f16b6d847f6b9bed54bcb51e19dccdc1b497636e9040901056764e" Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.085666 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.095636 4854 scope.go:117] "RemoveContainer" containerID="cabf2a84de378bf0db16b49e75a56dc1796963bdeaadcbaca2e8f231777ff7d6" Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.149047 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32501fcc-b225-4b92-8aef-15d69474e7a1-config-data\") pod \"barbican-api-7c4598cc6b-tl8vg\" (UID: \"32501fcc-b225-4b92-8aef-15d69474e7a1\") " pod="openstack/barbican-api-7c4598cc6b-tl8vg" Oct 07 12:46:03 crc kubenswrapper[4854]: E1007 12:46:03.149267 4854 secret.go:188] Couldn't get secret openstack/barbican-config-data: secret "barbican-config-data" not found Oct 07 12:46:03 crc kubenswrapper[4854]: E1007 12:46:03.149363 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32501fcc-b225-4b92-8aef-15d69474e7a1-config-data podName:32501fcc-b225-4b92-8aef-15d69474e7a1 nodeName:}" failed. No retries permitted until 2025-10-07 12:46:07.149340445 +0000 UTC m=+1283.137172760 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/32501fcc-b225-4b92-8aef-15d69474e7a1-config-data") pod "barbican-api-7c4598cc6b-tl8vg" (UID: "32501fcc-b225-4b92-8aef-15d69474e7a1") : secret "barbican-config-data" not found Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.184383 4854 scope.go:117] "RemoveContainer" containerID="f92c3a568292b0a3d969cd5adf77c55dbce4f1e154b300face2ca6a1e33c82df" Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.185921 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-77f6984dd6-vxjwr" Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.251566 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/22343ee2-64b7-4496-b74c-c9860920e953-nova-novncproxy-tls-certs\") pod \"22343ee2-64b7-4496-b74c-c9860920e953\" (UID: \"22343ee2-64b7-4496-b74c-c9860920e953\") " Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.251620 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2m56\" (UniqueName: \"kubernetes.io/projected/22343ee2-64b7-4496-b74c-c9860920e953-kube-api-access-g2m56\") pod \"22343ee2-64b7-4496-b74c-c9860920e953\" (UID: \"22343ee2-64b7-4496-b74c-c9860920e953\") " Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.251698 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22343ee2-64b7-4496-b74c-c9860920e953-config-data\") pod \"22343ee2-64b7-4496-b74c-c9860920e953\" (UID: \"22343ee2-64b7-4496-b74c-c9860920e953\") " Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.251813 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/22343ee2-64b7-4496-b74c-c9860920e953-vencrypt-tls-certs\") pod \"22343ee2-64b7-4496-b74c-c9860920e953\" (UID: \"22343ee2-64b7-4496-b74c-c9860920e953\") " Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.251865 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22343ee2-64b7-4496-b74c-c9860920e953-combined-ca-bundle\") pod 
\"22343ee2-64b7-4496-b74c-c9860920e953\" (UID: \"22343ee2-64b7-4496-b74c-c9860920e953\") " Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.276559 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22343ee2-64b7-4496-b74c-c9860920e953-kube-api-access-g2m56" (OuterVolumeSpecName: "kube-api-access-g2m56") pod "22343ee2-64b7-4496-b74c-c9860920e953" (UID: "22343ee2-64b7-4496-b74c-c9860920e953"). InnerVolumeSpecName "kube-api-access-g2m56". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.315373 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22343ee2-64b7-4496-b74c-c9860920e953-config-data" (OuterVolumeSpecName: "config-data") pod "22343ee2-64b7-4496-b74c-c9860920e953" (UID: "22343ee2-64b7-4496-b74c-c9860920e953"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.353722 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39b6e215-6643-4003-9c53-d33f6af39494-combined-ca-bundle\") pod \"39b6e215-6643-4003-9c53-d33f6af39494\" (UID: \"39b6e215-6643-4003-9c53-d33f6af39494\") " Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.353930 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvmrg\" (UniqueName: \"kubernetes.io/projected/39b6e215-6643-4003-9c53-d33f6af39494-kube-api-access-jvmrg\") pod \"39b6e215-6643-4003-9c53-d33f6af39494\" (UID: \"39b6e215-6643-4003-9c53-d33f6af39494\") " Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.353743 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22343ee2-64b7-4496-b74c-c9860920e953-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "22343ee2-64b7-4496-b74c-c9860920e953" (UID: "22343ee2-64b7-4496-b74c-c9860920e953"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.354043 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/39b6e215-6643-4003-9c53-d33f6af39494-config-data-custom\") pod \"39b6e215-6643-4003-9c53-d33f6af39494\" (UID: \"39b6e215-6643-4003-9c53-d33f6af39494\") " Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.354091 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22343ee2-64b7-4496-b74c-c9860920e953-combined-ca-bundle\") pod \"22343ee2-64b7-4496-b74c-c9860920e953\" (UID: \"22343ee2-64b7-4496-b74c-c9860920e953\") " Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.354201 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39b6e215-6643-4003-9c53-d33f6af39494-config-data\") pod \"39b6e215-6643-4003-9c53-d33f6af39494\" (UID: \"39b6e215-6643-4003-9c53-d33f6af39494\") " Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.354286 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39b6e215-6643-4003-9c53-d33f6af39494-logs\") pod \"39b6e215-6643-4003-9c53-d33f6af39494\" (UID: \"39b6e215-6643-4003-9c53-d33f6af39494\") " Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.354688 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2m56\" (UniqueName: \"kubernetes.io/projected/22343ee2-64b7-4496-b74c-c9860920e953-kube-api-access-g2m56\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.354704 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22343ee2-64b7-4496-b74c-c9860920e953-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.354963 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39b6e215-6643-4003-9c53-d33f6af39494-logs" (OuterVolumeSpecName: "logs") pod "39b6e215-6643-4003-9c53-d33f6af39494" (UID: "39b6e215-6643-4003-9c53-d33f6af39494"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:46:03 crc kubenswrapper[4854]: W1007 12:46:03.356620 4854 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/22343ee2-64b7-4496-b74c-c9860920e953/volumes/kubernetes.io~secret/combined-ca-bundle Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.356636 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22343ee2-64b7-4496-b74c-c9860920e953-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "22343ee2-64b7-4496-b74c-c9860920e953" (UID: "22343ee2-64b7-4496-b74c-c9860920e953"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.381133 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39b6e215-6643-4003-9c53-d33f6af39494-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "39b6e215-6643-4003-9c53-d33f6af39494" (UID: "39b6e215-6643-4003-9c53-d33f6af39494"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.382571 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39b6e215-6643-4003-9c53-d33f6af39494-kube-api-access-jvmrg" (OuterVolumeSpecName: "kube-api-access-jvmrg") pod "39b6e215-6643-4003-9c53-d33f6af39494" (UID: "39b6e215-6643-4003-9c53-d33f6af39494"). InnerVolumeSpecName "kube-api-access-jvmrg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.400310 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22343ee2-64b7-4496-b74c-c9860920e953-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "22343ee2-64b7-4496-b74c-c9860920e953" (UID: "22343ee2-64b7-4496-b74c-c9860920e953"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.458064 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39b6e215-6643-4003-9c53-d33f6af39494-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "39b6e215-6643-4003-9c53-d33f6af39494" (UID: "39b6e215-6643-4003-9c53-d33f6af39494"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.471529 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39b6e215-6643-4003-9c53-d33f6af39494-config-data" (OuterVolumeSpecName: "config-data") pod "39b6e215-6643-4003-9c53-d33f6af39494" (UID: "39b6e215-6643-4003-9c53-d33f6af39494"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:03 crc kubenswrapper[4854]: E1007 12:46:03.477320 4854 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 07 12:46:03 crc kubenswrapper[4854]: E1007 12:46:03.477394 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/79513100-48d2-4e7b-ae14-888322cab8f3-config-data podName:79513100-48d2-4e7b-ae14-888322cab8f3 nodeName:}" failed. No retries permitted until 2025-10-07 12:46:07.477377826 +0000 UTC m=+1283.465210081 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/79513100-48d2-4e7b-ae14-888322cab8f3-config-data") pod "rabbitmq-cell1-server-0" (UID: "79513100-48d2-4e7b-ae14-888322cab8f3") : configmap "rabbitmq-cell1-config-data" not found Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.477611 4854 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/22343ee2-64b7-4496-b74c-c9860920e953-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.477627 4854 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/39b6e215-6643-4003-9c53-d33f6af39494-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.477637 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22343ee2-64b7-4496-b74c-c9860920e953-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.477646 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39b6e215-6643-4003-9c53-d33f6af39494-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.477654 4854 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39b6e215-6643-4003-9c53-d33f6af39494-logs\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.477664 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39b6e215-6643-4003-9c53-d33f6af39494-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.477672 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvmrg\" (UniqueName: \"kubernetes.io/projected/39b6e215-6643-4003-9c53-d33f6af39494-kube-api-access-jvmrg\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.550685 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22343ee2-64b7-4496-b74c-c9860920e953-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "22343ee2-64b7-4496-b74c-c9860920e953" (UID: "22343ee2-64b7-4496-b74c-c9860920e953"). InnerVolumeSpecName "nova-novncproxy-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.579594 4854 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/22343ee2-64b7-4496-b74c-c9860920e953-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.628552 4854 scope.go:117] "RemoveContainer" containerID="3e6647d1edb8724d040f38fe0892860ca4c3bb6eab536177c27f549d3f099144" Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.650353 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapi643b-account-delete-92r97" event={"ID":"aaa0738a-daf1-479f-9dbd-913806703370","Type":"ContainerStarted","Data":"c2f64bd6a502240eba60aec7a5a06461ffdecfdc04beff785aa7c921c7ef988e"} Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.650399 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapi643b-account-delete-92r97" event={"ID":"aaa0738a-daf1-479f-9dbd-913806703370","Type":"ContainerStarted","Data":"85952b1081fd40865652081a84c9d72face986f2075653f2245c849e07426ff2"} Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.651730 4854 generic.go:334] "Generic (PLEG): container finished" podID="22343ee2-64b7-4496-b74c-c9860920e953" containerID="59e04fd8406ec48a71ffcc20755a831bd6b1ec8e9383f0d8c7df80e46452738a" exitCode=0 Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.651771 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"22343ee2-64b7-4496-b74c-c9860920e953","Type":"ContainerDied","Data":"59e04fd8406ec48a71ffcc20755a831bd6b1ec8e9383f0d8c7df80e46452738a"} Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.651789 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"22343ee2-64b7-4496-b74c-c9860920e953","Type":"ContainerDied","Data":"d3a004987a0b14c1a22e75d69824fa876f445af6182a991647fba7625ef4d751"} Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.651854 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.665282 4854 scope.go:117] "RemoveContainer" containerID="59e04fd8406ec48a71ffcc20755a831bd6b1ec8e9383f0d8c7df80e46452738a" Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.666390 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placementd4d8-account-delete-fh2n2" event={"ID":"66f80399-ed98-4aba-9db5-759ad2e314fa","Type":"ContainerStarted","Data":"75c3ef86d0dbd953431e4fcb366d7ac773027294d708d52f2fae09e3f2500930"} Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.666690 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placementd4d8-account-delete-fh2n2" podUID="66f80399-ed98-4aba-9db5-759ad2e314fa" containerName="mariadb-account-delete" containerID="cri-o://75c3ef86d0dbd953431e4fcb366d7ac773027294d708d52f2fae09e3f2500930" gracePeriod=30 Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.674989 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-75fd88c566-5j4xn" event={"ID":"04ffb838-3774-402c-9cdf-d11e51fb21e5","Type":"ContainerStarted","Data":"1ee628a9c0b67d36be2b170a1da5ccab6d647b96fb94a73b6590d405c4039a11"} Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.683405 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placementd4d8-account-delete-fh2n2" podStartSLOduration=4.683061035 podStartE2EDuration="4.683061035s" podCreationTimestamp="2025-10-07 12:45:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:46:03.680298335 +0000 UTC m=+1279.668130580" watchObservedRunningTime="2025-10-07 12:46:03.683061035 +0000 UTC m=+1279.670893290" Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.684086 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell0e568-account-delete-srq52" event={"ID":"7c47fe71-3b92-4490-8488-d98d0e25519e","Type":"ContainerStarted","Data":"960e8011f3e7e734c34ca5881aea3b73d34e5f8e258e4f210770c8d783d53077"} Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.696212 4854 generic.go:334] "Generic (PLEG): container finished" podID="2c8167ae-8941-4616-bee2-ff0fb5e98c16" containerID="364708de26903cb2d2d7224428a3d27a659926ef0d8ef5da26e1579482e5acf8" exitCode=0 Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.696294 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-667c455579-lnd9l" event={"ID":"2c8167ae-8941-4616-bee2-ff0fb5e98c16","Type":"ContainerDied","Data":"364708de26903cb2d2d7224428a3d27a659926ef0d8ef5da26e1579482e5acf8"} Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.711475 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-65b8874fd7-dnnjf" event={"ID":"5179f78d-3a8f-4621-95f3-e147ff8da79f","Type":"ContainerStarted","Data":"6c97ffba80b987fc973726fde0f552970e4fb94bcc265aa80afeaa87e15eaa7e"} Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.712605 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.715252 4854 generic.go:334] "Generic (PLEG): container finished" podID="39b6e215-6643-4003-9c53-d33f6af39494" containerID="5e0219a6ee2ee21252dc5506f149e4cf4652bd9388a64c5f4c96dec5453578f6" exitCode=0 Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.715349 4854 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-77f6984dd6-vxjwr" event={"ID":"39b6e215-6643-4003-9c53-d33f6af39494","Type":"ContainerDied","Data":"5e0219a6ee2ee21252dc5506f149e4cf4652bd9388a64c5f4c96dec5453578f6"} Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.715384 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-77f6984dd6-vxjwr" event={"ID":"39b6e215-6643-4003-9c53-d33f6af39494","Type":"ContainerDied","Data":"f73b792b25607fa6b1c25ff040b16a00fcfb986450169a010234f049614c0558"} Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.715432 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-77f6984dd6-vxjwr" Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.722908 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance3267-account-delete-cwdxm" event={"ID":"e7453b38-f6c3-4fe7-b15d-5bd8112dc687","Type":"ContainerStarted","Data":"88e0e41ebc891694b10be4302198088674ffb09627e1081edead8304fb62bb09"} Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.743005 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.749203 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance3267-account-delete-cwdxm" podStartSLOduration=4.749185821 podStartE2EDuration="4.749185821s" podCreationTimestamp="2025-10-07 12:45:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:46:03.748176451 +0000 UTC m=+1279.736008706" watchObservedRunningTime="2025-10-07 12:46:03.749185821 +0000 UTC m=+1279.737018076" Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.749942 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinderc7c1-account-delete-dz57l" event={"ID":"787f934e-4f31-4b00-8cf6-380efd34aaad","Type":"ContainerStarted","Data":"beb49729e015c29539bad7a67c84a635504e19d4bc244ead20fb12cbae9b8e94"} Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.760322 4854 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="2bcb9c6a-b0b3-438e-9e00-b3706ea71adf" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.172:8776/healthcheck\": dial tcp 10.217.0.172:8776: connect: connection refused" Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.763712 4854 generic.go:334] "Generic (PLEG): container finished" podID="3565e266-6994-4000-a4f2-2901e22f6682" containerID="6b55461da6eb4b748690750f389a1e88f87574727ef1064972561b0e7d0151ea" exitCode=0 Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.763784 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"3565e266-6994-4000-a4f2-2901e22f6682","Type":"ContainerDied","Data":"6b55461da6eb4b748690750f389a1e88f87574727ef1064972561b0e7d0151ea"} Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.784208 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutronb378-account-delete-jzbpj" event={"ID":"ac535972-fa59-4e7f-818b-345da6937c14","Type":"ContainerStarted","Data":"4ab5598f2f1f8e2a8745b5f522f943448e0b454fdb30f12ff064c16195d663a3"} Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.790603 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7c4598cc6b-tl8vg" Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.792437 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbicana01a-account-delete-8phrr" podUID="735fa97d-e751-4957-bdae-3ae0b10635d2" containerName="mariadb-account-delete" containerID="cri-o://c2b608d16398a3c2e79505a49f4bc3d6c058f2350032df3d21f4491f85f50701" gracePeriod=30 Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.792722 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbicana01a-account-delete-8phrr" event={"ID":"735fa97d-e751-4957-bdae-3ae0b10635d2","Type":"ContainerStarted","Data":"c2b608d16398a3c2e79505a49f4bc3d6c058f2350032df3d21f4491f85f50701"} Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.792750 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbicana01a-account-delete-8phrr" event={"ID":"735fa97d-e751-4957-bdae-3ae0b10635d2","Type":"ContainerStarted","Data":"9208b58630a4edbf80b1d83d52ae8466d7c5b4e422216067d1554368305e830e"} Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.809349 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutronb378-account-delete-jzbpj" podStartSLOduration=4.8093317110000005 podStartE2EDuration="4.809331711s" podCreationTimestamp="2025-10-07 12:45:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:46:03.800614216 +0000 UTC m=+1279.788446471" watchObservedRunningTime="2025-10-07 12:46:03.809331711 +0000 UTC m=+1279.797163966" Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.825087 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbicana01a-account-delete-8phrr" podStartSLOduration=4.825071941 podStartE2EDuration="4.825071941s" podCreationTimestamp="2025-10-07 12:45:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:46:03.820198039 +0000 UTC m=+1279.808030294" watchObservedRunningTime="2025-10-07 12:46:03.825071941 +0000 UTC m=+1279.812904196" Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.847758 4854 scope.go:117] "RemoveContainer" containerID="59e04fd8406ec48a71ffcc20755a831bd6b1ec8e9383f0d8c7df80e46452738a" Oct 07 12:46:03 crc kubenswrapper[4854]: E1007 12:46:03.849421 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59e04fd8406ec48a71ffcc20755a831bd6b1ec8e9383f0d8c7df80e46452738a\": container with ID starting with 59e04fd8406ec48a71ffcc20755a831bd6b1ec8e9383f0d8c7df80e46452738a not found: ID does not exist" containerID="59e04fd8406ec48a71ffcc20755a831bd6b1ec8e9383f0d8c7df80e46452738a" Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.849472 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59e04fd8406ec48a71ffcc20755a831bd6b1ec8e9383f0d8c7df80e46452738a"} err="failed to get container status \"59e04fd8406ec48a71ffcc20755a831bd6b1ec8e9383f0d8c7df80e46452738a\": rpc error: code = NotFound desc = could not find container \"59e04fd8406ec48a71ffcc20755a831bd6b1ec8e9383f0d8c7df80e46452738a\": container with ID starting with 59e04fd8406ec48a71ffcc20755a831bd6b1ec8e9383f0d8c7df80e46452738a not found: ID does not exist" Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.849492 4854 
scope.go:117] "RemoveContainer" containerID="5e0219a6ee2ee21252dc5506f149e4cf4652bd9388a64c5f4c96dec5453578f6" Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.906392 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 07 12:46:03 crc kubenswrapper[4854]: E1007 12:46:03.906456 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="89f0eb49b67715625593c9e8be3b27b5e68c8ad630cb00e0f86cd890329ac748" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 07 12:46:03 crc kubenswrapper[4854]: E1007 12:46:03.908691 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="89f0eb49b67715625593c9e8be3b27b5e68c8ad630cb00e0f86cd890329ac748" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 07 12:46:03 crc kubenswrapper[4854]: E1007 12:46:03.911023 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="89f0eb49b67715625593c9e8be3b27b5e68c8ad630cb00e0f86cd890329ac748" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 07 12:46:03 crc kubenswrapper[4854]: E1007 12:46:03.911049 4854 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="698aae03-92da-4cc2-a9d2-ecdb5f143439" containerName="ovn-northd" Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.916366 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-77f6984dd6-vxjwr"] Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.945580 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-77f6984dd6-vxjwr"] Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.949051 4854 scope.go:117] "RemoveContainer" containerID="55c2f083da705ca36f1812e92722e74d07fb8806209d2dfd59e204e7c4638f72" Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.964323 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7c4598cc6b-tl8vg"] Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.974000 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-7c4598cc6b-tl8vg"] Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.996067 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3565e266-6994-4000-a4f2-2901e22f6682-config-data\") pod \"3565e266-6994-4000-a4f2-2901e22f6682\" (UID: \"3565e266-6994-4000-a4f2-2901e22f6682\") " Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.996331 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76jz4\" (UniqueName: \"kubernetes.io/projected/3565e266-6994-4000-a4f2-2901e22f6682-kube-api-access-76jz4\") pod \"3565e266-6994-4000-a4f2-2901e22f6682\" (UID: \"3565e266-6994-4000-a4f2-2901e22f6682\") " Oct 07 12:46:03 crc kubenswrapper[4854]: I1007 12:46:03.996390 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3565e266-6994-4000-a4f2-2901e22f6682-combined-ca-bundle\") pod \"3565e266-6994-4000-a4f2-2901e22f6682\" (UID: \"3565e266-6994-4000-a4f2-2901e22f6682\") " Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.007053 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3565e266-6994-4000-a4f2-2901e22f6682-kube-api-access-76jz4" (OuterVolumeSpecName: "kube-api-access-76jz4") pod "3565e266-6994-4000-a4f2-2901e22f6682" (UID: "3565e266-6994-4000-a4f2-2901e22f6682"). InnerVolumeSpecName "kube-api-access-76jz4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.011573 4854 scope.go:117] "RemoveContainer" containerID="5e0219a6ee2ee21252dc5506f149e4cf4652bd9388a64c5f4c96dec5453578f6" Oct 07 12:46:04 crc kubenswrapper[4854]: E1007 12:46:04.013038 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e0219a6ee2ee21252dc5506f149e4cf4652bd9388a64c5f4c96dec5453578f6\": container with ID starting with 5e0219a6ee2ee21252dc5506f149e4cf4652bd9388a64c5f4c96dec5453578f6 not found: ID does not exist" containerID="5e0219a6ee2ee21252dc5506f149e4cf4652bd9388a64c5f4c96dec5453578f6" Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.013073 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e0219a6ee2ee21252dc5506f149e4cf4652bd9388a64c5f4c96dec5453578f6"} err="failed to get container status \"5e0219a6ee2ee21252dc5506f149e4cf4652bd9388a64c5f4c96dec5453578f6\": rpc error: code = NotFound desc = could not find container \"5e0219a6ee2ee21252dc5506f149e4cf4652bd9388a64c5f4c96dec5453578f6\": container with ID starting with 5e0219a6ee2ee21252dc5506f149e4cf4652bd9388a64c5f4c96dec5453578f6 not found: ID does not exist" Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.013097 4854 scope.go:117] "RemoveContainer" containerID="55c2f083da705ca36f1812e92722e74d07fb8806209d2dfd59e204e7c4638f72" Oct 07 12:46:04 crc kubenswrapper[4854]: E1007 12:46:04.014399 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55c2f083da705ca36f1812e92722e74d07fb8806209d2dfd59e204e7c4638f72\": container with ID starting with 55c2f083da705ca36f1812e92722e74d07fb8806209d2dfd59e204e7c4638f72 not found: ID does not exist" containerID="55c2f083da705ca36f1812e92722e74d07fb8806209d2dfd59e204e7c4638f72" Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.014426 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55c2f083da705ca36f1812e92722e74d07fb8806209d2dfd59e204e7c4638f72"} err="failed to get container status \"55c2f083da705ca36f1812e92722e74d07fb8806209d2dfd59e204e7c4638f72\": rpc error: code = NotFound desc = could not find container \"55c2f083da705ca36f1812e92722e74d07fb8806209d2dfd59e204e7c4638f72\": container with ID starting with 55c2f083da705ca36f1812e92722e74d07fb8806209d2dfd59e204e7c4638f72 not found: ID does not exist" Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.024458 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3565e266-6994-4000-a4f2-2901e22f6682-config-data" (OuterVolumeSpecName: "config-data") pod "3565e266-6994-4000-a4f2-2901e22f6682" (UID: "3565e266-6994-4000-a4f2-2901e22f6682"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.028510 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3565e266-6994-4000-a4f2-2901e22f6682-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3565e266-6994-4000-a4f2-2901e22f6682" (UID: "3565e266-6994-4000-a4f2-2901e22f6682"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.098488 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76jz4\" (UniqueName: \"kubernetes.io/projected/3565e266-6994-4000-a4f2-2901e22f6682-kube-api-access-76jz4\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.098512 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3565e266-6994-4000-a4f2-2901e22f6682-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.098522 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3565e266-6994-4000-a4f2-2901e22f6682-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.098530 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32501fcc-b225-4b92-8aef-15d69474e7a1-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.110575 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-4q2wn"] Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.118371 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-4q2wn"] Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.140818 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance3267-account-delete-cwdxm"] Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.152359 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-3267-account-create-lnz96"] Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.162752 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-3267-account-create-lnz96"] Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.296170 4854 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="770ca0a9-4c48-446b-be08-84b06d20d501" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.199:8775/\": read tcp 10.217.0.2:51862->10.217.0.199:8775: read: connection reset by peer" Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.298863 4854 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="770ca0a9-4c48-446b-be08-84b06d20d501" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.199:8775/\": read tcp 10.217.0.2:51874->10.217.0.199:8775: read: connection reset by peer" Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.360826 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-7plvr"] Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.372205 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-b378-account-create-gnv6t"] Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.392887 4854 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutronb378-account-delete-jzbpj"] Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.402338 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-7plvr"] Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.409033 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-b378-account-create-gnv6t"] Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.415250 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-hfzk2"] Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.421520 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-hfzk2"] Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.435660 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-c7c1-account-create-4vmxx"] Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.441716 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinderc7c1-account-delete-dz57l"] Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.446759 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-c7c1-account-create-4vmxx"] Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.718008 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a87eed9-2ee8-4185-b755-9a12163e51fc" path="/var/lib/kubelet/pods/0a87eed9-2ee8-4185-b755-9a12163e51fc/volumes" Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.724505 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22343ee2-64b7-4496-b74c-c9860920e953" path="/var/lib/kubelet/pods/22343ee2-64b7-4496-b74c-c9860920e953/volumes" Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.725849 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="278d1b2b-3c52-4899-9432-1c1ca85a0f6d" path="/var/lib/kubelet/pods/278d1b2b-3c52-4899-9432-1c1ca85a0f6d/volumes" Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.726631 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32501fcc-b225-4b92-8aef-15d69474e7a1" path="/var/lib/kubelet/pods/32501fcc-b225-4b92-8aef-15d69474e7a1/volumes" Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.727174 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39b6e215-6643-4003-9c53-d33f6af39494" path="/var/lib/kubelet/pods/39b6e215-6643-4003-9c53-d33f6af39494/volumes" Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.728578 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43bd8c48-fc92-4f02-af3e-db78673cacb9" path="/var/lib/kubelet/pods/43bd8c48-fc92-4f02-af3e-db78673cacb9/volumes" Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.729377 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47f8159b-e07f-47bd-92e8-a57f3e0c545d" path="/var/lib/kubelet/pods/47f8159b-e07f-47bd-92e8-a57f3e0c545d/volumes" Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.732869 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f63842c-f85f-4e07-8221-4ce96b22bf44" path="/var/lib/kubelet/pods/6f63842c-f85f-4e07-8221-4ce96b22bf44/volumes" Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.770129 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95bc842d-1bac-4c0e-b2c8-4f0838a630ad" path="/var/lib/kubelet/pods/95bc842d-1bac-4c0e-b2c8-4f0838a630ad/volumes" Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 
12:46:04.770943 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d71ee2f2-10ff-48f3-8eed-e8eab883ad22" path="/var/lib/kubelet/pods/d71ee2f2-10ff-48f3-8eed-e8eab883ad22/volumes" Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.773204 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbbd823b-2e1d-4901-855a-72cd9a13a6fd" path="/var/lib/kubelet/pods/dbbd823b-2e1d-4901-855a-72cd9a13a6fd/volumes" Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.775954 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2a2edca-4c99-45d7-b47e-12d31a7e649f" path="/var/lib/kubelet/pods/e2a2edca-4c99-45d7-b47e-12d31a7e649f/volumes" Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.776672 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3bea35c-bafd-44fc-bd6f-7878a27e1c9b" path="/var/lib/kubelet/pods/f3bea35c-bafd-44fc-bd6f-7878a27e1c9b/volumes" Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.777157 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.787356 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.815344 4854 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-hllqq" podUID="6e6702f3-b113-49f9-b85f-a2d294bac6dc" containerName="ovn-controller" probeResult="failure" output=< Oct 07 12:46:04 crc kubenswrapper[4854]: ERROR - Failed to get connection status from ovn-controller, ovn-appctl exit status: 0 Oct 07 12:46:04 crc kubenswrapper[4854]: > Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.817843 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-667c455579-lnd9l" Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.826990 4854 generic.go:334] "Generic (PLEG): container finished" podID="a936b898-5163-4c5e-ac30-64af1533cec7" containerID="a963e64a5e60796e5674815e7ff05f24676f7da4a8d14514981bf0f0c9a9c739" exitCode=0 Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.827075 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a936b898-5163-4c5e-ac30-64af1533cec7","Type":"ContainerDied","Data":"a963e64a5e60796e5674815e7ff05f24676f7da4a8d14514981bf0f0c9a9c739"} Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.827103 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a936b898-5163-4c5e-ac30-64af1533cec7","Type":"ContainerDied","Data":"178b76a44ed3e91ddc157457a0a57a8f8807036b5e3c023f4860ac6fa7893eb4"} Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.827122 4854 scope.go:117] "RemoveContainer" containerID="a963e64a5e60796e5674815e7ff05f24676f7da4a8d14514981bf0f0c9a9c739" Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.827327 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.834678 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-75fd88c566-5j4xn" event={"ID":"04ffb838-3774-402c-9cdf-d11e51fb21e5","Type":"ContainerStarted","Data":"76a801916c6498bb2e49a91144e0ff94292e20664e61ad28d638eadcee76fb94"} Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.834926 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-75fd88c566-5j4xn" podUID="04ffb838-3774-402c-9cdf-d11e51fb21e5" containerName="barbican-keystone-listener-log" containerID="cri-o://1ee628a9c0b67d36be2b170a1da5ccab6d647b96fb94a73b6590d405c4039a11" gracePeriod=30 Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.835129 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-75fd88c566-5j4xn" podUID="04ffb838-3774-402c-9cdf-d11e51fb21e5" containerName="barbican-keystone-listener" containerID="cri-o://76a801916c6498bb2e49a91144e0ff94292e20664e61ad28d638eadcee76fb94" gracePeriod=30 Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.871987 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"3565e266-6994-4000-a4f2-2901e22f6682","Type":"ContainerDied","Data":"cb1ea59674a5427ac3dffbdb7b70cd2fd85f11c37af0b58c75c5288f3840438a"} Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.874283 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.914410 4854 generic.go:334] "Generic (PLEG): container finished" podID="37dd5983-0d4d-4097-8657-f408e9bc68c0" containerID="bd7103524f01d027fb1b2f752223346853e431f9b045e756199823ef3b9a9a3d" exitCode=0 Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.914510 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"37dd5983-0d4d-4097-8657-f408e9bc68c0","Type":"ContainerDied","Data":"bd7103524f01d027fb1b2f752223346853e431f9b045e756199823ef3b9a9a3d"} Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.924735 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a936b898-5163-4c5e-ac30-64af1533cec7-config-data\") pod \"a936b898-5163-4c5e-ac30-64af1533cec7\" (UID: \"a936b898-5163-4c5e-ac30-64af1533cec7\") " Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.924821 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c8167ae-8941-4616-bee2-ff0fb5e98c16-internal-tls-certs\") pod \"2c8167ae-8941-4616-bee2-ff0fb5e98c16\" (UID: \"2c8167ae-8941-4616-bee2-ff0fb5e98c16\") " Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.924867 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2c8167ae-8941-4616-bee2-ff0fb5e98c16-etc-swift\") pod \"2c8167ae-8941-4616-bee2-ff0fb5e98c16\" (UID: \"2c8167ae-8941-4616-bee2-ff0fb5e98c16\") " Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.924905 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a936b898-5163-4c5e-ac30-64af1533cec7-public-tls-certs\") pod 
\"a936b898-5163-4c5e-ac30-64af1533cec7\" (UID: \"a936b898-5163-4c5e-ac30-64af1533cec7\") " Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.924934 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"a936b898-5163-4c5e-ac30-64af1533cec7\" (UID: \"a936b898-5163-4c5e-ac30-64af1533cec7\") " Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.924958 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c8167ae-8941-4616-bee2-ff0fb5e98c16-public-tls-certs\") pod \"2c8167ae-8941-4616-bee2-ff0fb5e98c16\" (UID: \"2c8167ae-8941-4616-bee2-ff0fb5e98c16\") " Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.924982 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bcb9c6a-b0b3-438e-9e00-b3706ea71adf-config-data\") pod \"2bcb9c6a-b0b3-438e-9e00-b3706ea71adf\" (UID: \"2bcb9c6a-b0b3-438e-9e00-b3706ea71adf\") " Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.925025 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a936b898-5163-4c5e-ac30-64af1533cec7-httpd-run\") pod \"a936b898-5163-4c5e-ac30-64af1533cec7\" (UID: \"a936b898-5163-4c5e-ac30-64af1533cec7\") " Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.925066 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c8167ae-8941-4616-bee2-ff0fb5e98c16-log-httpd\") pod \"2c8167ae-8941-4616-bee2-ff0fb5e98c16\" (UID: \"2c8167ae-8941-4616-bee2-ff0fb5e98c16\") " Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.925101 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7qdc\" (UniqueName: \"kubernetes.io/projected/2c8167ae-8941-4616-bee2-ff0fb5e98c16-kube-api-access-l7qdc\") pod \"2c8167ae-8941-4616-bee2-ff0fb5e98c16\" (UID: \"2c8167ae-8941-4616-bee2-ff0fb5e98c16\") " Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.925170 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c8167ae-8941-4616-bee2-ff0fb5e98c16-config-data\") pod \"2c8167ae-8941-4616-bee2-ff0fb5e98c16\" (UID: \"2c8167ae-8941-4616-bee2-ff0fb5e98c16\") " Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.925198 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tlwq8\" (UniqueName: \"kubernetes.io/projected/a936b898-5163-4c5e-ac30-64af1533cec7-kube-api-access-tlwq8\") pod \"a936b898-5163-4c5e-ac30-64af1533cec7\" (UID: \"a936b898-5163-4c5e-ac30-64af1533cec7\") " Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.925221 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bcb9c6a-b0b3-438e-9e00-b3706ea71adf-internal-tls-certs\") pod \"2bcb9c6a-b0b3-438e-9e00-b3706ea71adf\" (UID: \"2bcb9c6a-b0b3-438e-9e00-b3706ea71adf\") " Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.925249 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a936b898-5163-4c5e-ac30-64af1533cec7-logs\") pod \"a936b898-5163-4c5e-ac30-64af1533cec7\" (UID: 
\"a936b898-5163-4c5e-ac30-64af1533cec7\") " Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.925284 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2bcb9c6a-b0b3-438e-9e00-b3706ea71adf-config-data-custom\") pod \"2bcb9c6a-b0b3-438e-9e00-b3706ea71adf\" (UID: \"2bcb9c6a-b0b3-438e-9e00-b3706ea71adf\") " Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.925330 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a936b898-5163-4c5e-ac30-64af1533cec7-scripts\") pod \"a936b898-5163-4c5e-ac30-64af1533cec7\" (UID: \"a936b898-5163-4c5e-ac30-64af1533cec7\") " Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.925378 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2bcb9c6a-b0b3-438e-9e00-b3706ea71adf-logs\") pod \"2bcb9c6a-b0b3-438e-9e00-b3706ea71adf\" (UID: \"2bcb9c6a-b0b3-438e-9e00-b3706ea71adf\") " Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.925414 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c8167ae-8941-4616-bee2-ff0fb5e98c16-combined-ca-bundle\") pod \"2c8167ae-8941-4616-bee2-ff0fb5e98c16\" (UID: \"2c8167ae-8941-4616-bee2-ff0fb5e98c16\") " Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.925451 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c8167ae-8941-4616-bee2-ff0fb5e98c16-run-httpd\") pod \"2c8167ae-8941-4616-bee2-ff0fb5e98c16\" (UID: \"2c8167ae-8941-4616-bee2-ff0fb5e98c16\") " Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.925475 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2bcb9c6a-b0b3-438e-9e00-b3706ea71adf-etc-machine-id\") pod \"2bcb9c6a-b0b3-438e-9e00-b3706ea71adf\" (UID: \"2bcb9c6a-b0b3-438e-9e00-b3706ea71adf\") " Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.925508 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bcb9c6a-b0b3-438e-9e00-b3706ea71adf-public-tls-certs\") pod \"2bcb9c6a-b0b3-438e-9e00-b3706ea71adf\" (UID: \"2bcb9c6a-b0b3-438e-9e00-b3706ea71adf\") " Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.925534 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bcb9c6a-b0b3-438e-9e00-b3706ea71adf-combined-ca-bundle\") pod \"2bcb9c6a-b0b3-438e-9e00-b3706ea71adf\" (UID: \"2bcb9c6a-b0b3-438e-9e00-b3706ea71adf\") " Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.925574 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bcb9c6a-b0b3-438e-9e00-b3706ea71adf-scripts\") pod \"2bcb9c6a-b0b3-438e-9e00-b3706ea71adf\" (UID: \"2bcb9c6a-b0b3-438e-9e00-b3706ea71adf\") " Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.925594 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a936b898-5163-4c5e-ac30-64af1533cec7-combined-ca-bundle\") pod \"a936b898-5163-4c5e-ac30-64af1533cec7\" (UID: \"a936b898-5163-4c5e-ac30-64af1533cec7\") " Oct 07 
12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.925626 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbfrg\" (UniqueName: \"kubernetes.io/projected/2bcb9c6a-b0b3-438e-9e00-b3706ea71adf-kube-api-access-bbfrg\") pod \"2bcb9c6a-b0b3-438e-9e00-b3706ea71adf\" (UID: \"2bcb9c6a-b0b3-438e-9e00-b3706ea71adf\") " Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.953109 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c8167ae-8941-4616-bee2-ff0fb5e98c16-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2c8167ae-8941-4616-bee2-ff0fb5e98c16" (UID: "2c8167ae-8941-4616-bee2-ff0fb5e98c16"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.972392 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c8167ae-8941-4616-bee2-ff0fb5e98c16-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "2c8167ae-8941-4616-bee2-ff0fb5e98c16" (UID: "2c8167ae-8941-4616-bee2-ff0fb5e98c16"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.975334 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2bcb9c6a-b0b3-438e-9e00-b3706ea71adf-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "2bcb9c6a-b0b3-438e-9e00-b3706ea71adf" (UID: "2bcb9c6a-b0b3-438e-9e00-b3706ea71adf"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.982484 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bcb9c6a-b0b3-438e-9e00-b3706ea71adf-logs" (OuterVolumeSpecName: "logs") pod "2bcb9c6a-b0b3-438e-9e00-b3706ea71adf" (UID: "2bcb9c6a-b0b3-438e-9e00-b3706ea71adf"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.982518 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a936b898-5163-4c5e-ac30-64af1533cec7-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a936b898-5163-4c5e-ac30-64af1533cec7" (UID: "a936b898-5163-4c5e-ac30-64af1533cec7"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.987084 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a936b898-5163-4c5e-ac30-64af1533cec7-logs" (OuterVolumeSpecName: "logs") pod "a936b898-5163-4c5e-ac30-64af1533cec7" (UID: "a936b898-5163-4c5e-ac30-64af1533cec7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.987319 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c8167ae-8941-4616-bee2-ff0fb5e98c16-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2c8167ae-8941-4616-bee2-ff0fb5e98c16" (UID: "2c8167ae-8941-4616-bee2-ff0fb5e98c16"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.987447 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bcb9c6a-b0b3-438e-9e00-b3706ea71adf-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2bcb9c6a-b0b3-438e-9e00-b3706ea71adf" (UID: "2bcb9c6a-b0b3-438e-9e00-b3706ea71adf"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.988114 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bcb9c6a-b0b3-438e-9e00-b3706ea71adf-scripts" (OuterVolumeSpecName: "scripts") pod "2bcb9c6a-b0b3-438e-9e00-b3706ea71adf" (UID: "2bcb9c6a-b0b3-438e-9e00-b3706ea71adf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.988321 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bcb9c6a-b0b3-438e-9e00-b3706ea71adf-kube-api-access-bbfrg" (OuterVolumeSpecName: "kube-api-access-bbfrg") pod "2bcb9c6a-b0b3-438e-9e00-b3706ea71adf" (UID: "2bcb9c6a-b0b3-438e-9e00-b3706ea71adf"). InnerVolumeSpecName "kube-api-access-bbfrg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.989050 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a936b898-5163-4c5e-ac30-64af1533cec7-scripts" (OuterVolumeSpecName: "scripts") pod "a936b898-5163-4c5e-ac30-64af1533cec7" (UID: "a936b898-5163-4c5e-ac30-64af1533cec7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.989218 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c8167ae-8941-4616-bee2-ff0fb5e98c16-kube-api-access-l7qdc" (OuterVolumeSpecName: "kube-api-access-l7qdc") pod "2c8167ae-8941-4616-bee2-ff0fb5e98c16" (UID: "2c8167ae-8941-4616-bee2-ff0fb5e98c16"). InnerVolumeSpecName "kube-api-access-l7qdc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.996433 4854 scope.go:117] "RemoveContainer" containerID="ab9b63d9a92d2b3c1a80cf5ab47e0bdb681cabb6aa9a7547201bf16b8e4df31e" Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.997798 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.998036 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="47e6a48a-4ef0-4764-a132-50140d86a6b2" containerName="ceilometer-central-agent" containerID="cri-o://b9891d843c836c188cc5c0d2b4df4db333aab6e636378878b75cfbd65ecdb10b" gracePeriod=30 Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.998426 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="47e6a48a-4ef0-4764-a132-50140d86a6b2" containerName="proxy-httpd" containerID="cri-o://43911c074518df2d12734bda5be2d0438565e37df58ee0f1bb19e3735feca614" gracePeriod=30 Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.998492 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="47e6a48a-4ef0-4764-a132-50140d86a6b2" containerName="sg-core" containerID="cri-o://1bb3783752e90f8f65fc57d4fca86c174ad0436acd90b5d1426146067be381e1" gracePeriod=30 Oct 07 12:46:04 crc kubenswrapper[4854]: I1007 12:46:04.998535 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="47e6a48a-4ef0-4764-a132-50140d86a6b2" containerName="ceilometer-notification-agent" containerID="cri-o://e2647a1c022554b80ce34c49ff2bbcb7d62b1334b8589db6fa5444a0549bf597" gracePeriod=30 Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:04.999745 4854 generic.go:334] "Generic (PLEG): container finished" podID="2c8167ae-8941-4616-bee2-ff0fb5e98c16" containerID="dd65a138aac9fa721dada3f006a871724f98f90b0fa13687c508bb7e6049a302" exitCode=0 Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:04.999817 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-667c455579-lnd9l" event={"ID":"2c8167ae-8941-4616-bee2-ff0fb5e98c16","Type":"ContainerDied","Data":"dd65a138aac9fa721dada3f006a871724f98f90b0fa13687c508bb7e6049a302"} Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:04.999846 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-667c455579-lnd9l" event={"ID":"2c8167ae-8941-4616-bee2-ff0fb5e98c16","Type":"ContainerDied","Data":"8d1b853dcbf4379c9355c85ed7807d68fc02707946cc9d5d5c9ec4959b8edb49"} Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:04.999899 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-667c455579-lnd9l" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.000353 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "a936b898-5163-4c5e-ac30-64af1533cec7" (UID: "a936b898-5163-4c5e-ac30-64af1533cec7"). InnerVolumeSpecName "local-storage11-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 07 12:46:05 crc kubenswrapper[4854]: E1007 12:46:05.022467 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 49664c4eacd10194cd4eaaf0aca67e1328ae47efa8cae6560f85b2f783b3a2b6 is running failed: container process not found" containerID="49664c4eacd10194cd4eaaf0aca67e1328ae47efa8cae6560f85b2f783b3a2b6" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 07 12:46:05 crc kubenswrapper[4854]: E1007 12:46:05.025271 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 49664c4eacd10194cd4eaaf0aca67e1328ae47efa8cae6560f85b2f783b3a2b6 is running failed: container process not found" containerID="49664c4eacd10194cd4eaaf0aca67e1328ae47efa8cae6560f85b2f783b3a2b6" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.025357 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a936b898-5163-4c5e-ac30-64af1533cec7-kube-api-access-tlwq8" (OuterVolumeSpecName: "kube-api-access-tlwq8") pod "a936b898-5163-4c5e-ac30-64af1533cec7" (UID: "a936b898-5163-4c5e-ac30-64af1533cec7"). InnerVolumeSpecName "kube-api-access-tlwq8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:46:05 crc kubenswrapper[4854]: E1007 12:46:05.028281 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1c6a7b5092dfdf765b8d78edcb5848f5ab195e1b5fbac03359ca96b418f42d38" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.031458 4854 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a936b898-5163-4c5e-ac30-64af1533cec7-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.031491 4854 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c8167ae-8941-4616-bee2-ff0fb5e98c16-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.031501 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7qdc\" (UniqueName: \"kubernetes.io/projected/2c8167ae-8941-4616-bee2-ff0fb5e98c16-kube-api-access-l7qdc\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.031512 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tlwq8\" (UniqueName: \"kubernetes.io/projected/a936b898-5163-4c5e-ac30-64af1533cec7-kube-api-access-tlwq8\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.031521 4854 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a936b898-5163-4c5e-ac30-64af1533cec7-logs\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.031529 4854 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2bcb9c6a-b0b3-438e-9e00-b3706ea71adf-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.031537 4854 
reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a936b898-5163-4c5e-ac30-64af1533cec7-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.031545 4854 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2bcb9c6a-b0b3-438e-9e00-b3706ea71adf-logs\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.031553 4854 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c8167ae-8941-4616-bee2-ff0fb5e98c16-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.031563 4854 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2bcb9c6a-b0b3-438e-9e00-b3706ea71adf-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.031570 4854 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bcb9c6a-b0b3-438e-9e00-b3706ea71adf-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.031578 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbfrg\" (UniqueName: \"kubernetes.io/projected/2bcb9c6a-b0b3-438e-9e00-b3706ea71adf-kube-api-access-bbfrg\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.031586 4854 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2c8167ae-8941-4616-bee2-ff0fb5e98c16-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.031606 4854 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Oct 07 12:46:05 crc kubenswrapper[4854]: E1007 12:46:05.039225 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 49664c4eacd10194cd4eaaf0aca67e1328ae47efa8cae6560f85b2f783b3a2b6 is running failed: container process not found" containerID="49664c4eacd10194cd4eaaf0aca67e1328ae47efa8cae6560f85b2f783b3a2b6" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 07 12:46:05 crc kubenswrapper[4854]: E1007 12:46:05.039209 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1c6a7b5092dfdf765b8d78edcb5848f5ab195e1b5fbac03359ca96b418f42d38" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 07 12:46:05 crc kubenswrapper[4854]: E1007 12:46:05.039295 4854 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 49664c4eacd10194cd4eaaf0aca67e1328ae47efa8cae6560f85b2f783b3a2b6 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-j5h2b" podUID="847eb385-fc80-4568-813d-638dac11d81a" containerName="ovsdb-server" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.039601 4854 generic.go:334] "Generic (PLEG): container finished" podID="2bcb9c6a-b0b3-438e-9e00-b3706ea71adf" 
containerID="66d61a36b5d71d29e4fc833793e7b8e2d512bb92c6cdc2ceee1421b70515026b" exitCode=0 Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.039668 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.039705 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2bcb9c6a-b0b3-438e-9e00-b3706ea71adf","Type":"ContainerDied","Data":"66d61a36b5d71d29e4fc833793e7b8e2d512bb92c6cdc2ceee1421b70515026b"} Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.039731 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2bcb9c6a-b0b3-438e-9e00-b3706ea71adf","Type":"ContainerDied","Data":"27072ab6ef04a6c47adee37687942b234b2555f7c9dbff7def6b6c3eebce4240"} Oct 07 12:46:05 crc kubenswrapper[4854]: E1007 12:46:05.074408 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1c6a7b5092dfdf765b8d78edcb5848f5ab195e1b5fbac03359ca96b418f42d38" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 07 12:46:05 crc kubenswrapper[4854]: E1007 12:46:05.074475 4854 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-j5h2b" podUID="847eb385-fc80-4568-813d-638dac11d81a" containerName="ovs-vswitchd" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.085361 4854 scope.go:117] "RemoveContainer" containerID="a963e64a5e60796e5674815e7ff05f24676f7da4a8d14514981bf0f0c9a9c739" Oct 07 12:46:05 crc kubenswrapper[4854]: E1007 12:46:05.109439 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a963e64a5e60796e5674815e7ff05f24676f7da4a8d14514981bf0f0c9a9c739\": container with ID starting with a963e64a5e60796e5674815e7ff05f24676f7da4a8d14514981bf0f0c9a9c739 not found: ID does not exist" containerID="a963e64a5e60796e5674815e7ff05f24676f7da4a8d14514981bf0f0c9a9c739" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.109481 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a963e64a5e60796e5674815e7ff05f24676f7da4a8d14514981bf0f0c9a9c739"} err="failed to get container status \"a963e64a5e60796e5674815e7ff05f24676f7da4a8d14514981bf0f0c9a9c739\": rpc error: code = NotFound desc = could not find container \"a963e64a5e60796e5674815e7ff05f24676f7da4a8d14514981bf0f0c9a9c739\": container with ID starting with a963e64a5e60796e5674815e7ff05f24676f7da4a8d14514981bf0f0c9a9c739 not found: ID does not exist" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.109504 4854 scope.go:117] "RemoveContainer" containerID="ab9b63d9a92d2b3c1a80cf5ab47e0bdb681cabb6aa9a7547201bf16b8e4df31e" Oct 07 12:46:05 crc kubenswrapper[4854]: E1007 12:46:05.111958 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab9b63d9a92d2b3c1a80cf5ab47e0bdb681cabb6aa9a7547201bf16b8e4df31e\": container with ID starting with ab9b63d9a92d2b3c1a80cf5ab47e0bdb681cabb6aa9a7547201bf16b8e4df31e not found: ID does not exist" containerID="ab9b63d9a92d2b3c1a80cf5ab47e0bdb681cabb6aa9a7547201bf16b8e4df31e" Oct 07 12:46:05 crc 
kubenswrapper[4854]: I1007 12:46:05.111986 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab9b63d9a92d2b3c1a80cf5ab47e0bdb681cabb6aa9a7547201bf16b8e4df31e"} err="failed to get container status \"ab9b63d9a92d2b3c1a80cf5ab47e0bdb681cabb6aa9a7547201bf16b8e4df31e\": rpc error: code = NotFound desc = could not find container \"ab9b63d9a92d2b3c1a80cf5ab47e0bdb681cabb6aa9a7547201bf16b8e4df31e\": container with ID starting with ab9b63d9a92d2b3c1a80cf5ab47e0bdb681cabb6aa9a7547201bf16b8e4df31e not found: ID does not exist" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.112003 4854 scope.go:117] "RemoveContainer" containerID="6b55461da6eb4b748690750f389a1e88f87574727ef1064972561b0e7d0151ea" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.112490 4854 generic.go:334] "Generic (PLEG): container finished" podID="6002f7a4-27d6-4554-a486-87926ebcf57e" containerID="f0ff8ce507731e5f1fa6188aabef5933b1d02dc7dee5ef5454692c6f625fabed" exitCode=0 Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.112543 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7595d98994-smt7c" event={"ID":"6002f7a4-27d6-4554-a486-87926ebcf57e","Type":"ContainerDied","Data":"f0ff8ce507731e5f1fa6188aabef5933b1d02dc7dee5ef5454692c6f625fabed"} Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.141631 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.142223 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="e309be64-7a5a-4156-89a6-d1201eaaff63" containerName="kube-state-metrics" containerID="cri-o://ca038c665c300165c2b08c130cce95306c24d5eee4ccd22ba09577872f53b9c2" gracePeriod=30 Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.150898 4854 generic.go:334] "Generic (PLEG): container finished" podID="13453d02-4f55-45a7-98be-1cd41c741a3e" containerID="6ab3625afd9d5554eb97be59fc81dda17398623a3a3702f50c0d684653f13606" exitCode=0 Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.150978 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"13453d02-4f55-45a7-98be-1cd41c741a3e","Type":"ContainerDied","Data":"6ab3625afd9d5554eb97be59fc81dda17398623a3a3702f50c0d684653f13606"} Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.174121 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a936b898-5163-4c5e-ac30-64af1533cec7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a936b898-5163-4c5e-ac30-64af1533cec7" (UID: "a936b898-5163-4c5e-ac30-64af1533cec7"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.183589 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell0e568-account-delete-srq52" event={"ID":"7c47fe71-3b92-4490-8488-d98d0e25519e","Type":"ContainerStarted","Data":"97fcf4e9b37412fc9921e0d9fdf26f43fba6f08230c7b20f4625c9b0374aebc2"} Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.183729 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/novacell0e568-account-delete-srq52" podUID="7c47fe71-3b92-4490-8488-d98d0e25519e" containerName="mariadb-account-delete" containerID="cri-o://97fcf4e9b37412fc9921e0d9fdf26f43fba6f08230c7b20f4625c9b0374aebc2" gracePeriod=30 Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.214817 4854 generic.go:334] "Generic (PLEG): container finished" podID="770ca0a9-4c48-446b-be08-84b06d20d501" containerID="16fe06c197602fe7fb28e9bd7f56721c49f40f87574fc4f159cb1d7c6685606e" exitCode=0 Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.214937 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"770ca0a9-4c48-446b-be08-84b06d20d501","Type":"ContainerDied","Data":"16fe06c197602fe7fb28e9bd7f56721c49f40f87574fc4f159cb1d7c6685606e"} Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.218984 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.241472 4854 generic.go:334] "Generic (PLEG): container finished" podID="a03c4a0d-6346-43e4-8db1-f653b5dfa420" containerID="dda148e71ce3d2269edc8e05951c34651b383998b2ea4117e0f5773827d60523" exitCode=0 Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.241953 4854 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/neutronb378-account-delete-jzbpj" secret="" err="secret \"galera-openstack-dockercfg-x6thd\" not found" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.242290 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/novaapi643b-account-delete-92r97" podUID="aaa0738a-daf1-479f-9dbd-913806703370" containerName="mariadb-account-delete" containerID="cri-o://c2f64bd6a502240eba60aec7a5a06461ffdecfdc04beff785aa7c921c7ef988e" gracePeriod=30 Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.242405 4854 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack/cinderc7c1-account-delete-dz57l" secret="" err="secret \"galera-openstack-dockercfg-x6thd\" not found" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.242453 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-65b8874fd7-dnnjf" podUID="5179f78d-3a8f-4621-95f3-e147ff8da79f" containerName="barbican-worker-log" containerID="cri-o://02fc41252a41df476e3c3e2c65ab0a5aca662f6c1710215b0dcdfa49eb74b167" gracePeriod=30 Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.242539 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-65b8874fd7-dnnjf" podUID="5179f78d-3a8f-4621-95f3-e147ff8da79f" containerName="barbican-worker" containerID="cri-o://6c97ffba80b987fc973726fde0f552970e4fb94bcc265aa80afeaa87e15eaa7e" gracePeriod=30 Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.242596 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6f54bfd6b4-g5gq4" event={"ID":"a03c4a0d-6346-43e4-8db1-f653b5dfa420","Type":"ContainerDied","Data":"dda148e71ce3d2269edc8e05951c34651b383998b2ea4117e0f5773827d60523"} Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.243499 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttgk7\" (UniqueName: \"kubernetes.io/projected/770ca0a9-4c48-446b-be08-84b06d20d501-kube-api-access-ttgk7\") pod \"770ca0a9-4c48-446b-be08-84b06d20d501\" (UID: \"770ca0a9-4c48-446b-be08-84b06d20d501\") " Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.243549 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/770ca0a9-4c48-446b-be08-84b06d20d501-logs\") pod \"770ca0a9-4c48-446b-be08-84b06d20d501\" (UID: \"770ca0a9-4c48-446b-be08-84b06d20d501\") " Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.243662 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/770ca0a9-4c48-446b-be08-84b06d20d501-config-data\") pod \"770ca0a9-4c48-446b-be08-84b06d20d501\" (UID: \"770ca0a9-4c48-446b-be08-84b06d20d501\") " Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.243780 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/770ca0a9-4c48-446b-be08-84b06d20d501-combined-ca-bundle\") pod \"770ca0a9-4c48-446b-be08-84b06d20d501\" (UID: \"770ca0a9-4c48-446b-be08-84b06d20d501\") " Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.243873 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/770ca0a9-4c48-446b-be08-84b06d20d501-nova-metadata-tls-certs\") pod \"770ca0a9-4c48-446b-be08-84b06d20d501\" (UID: \"770ca0a9-4c48-446b-be08-84b06d20d501\") " Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.244280 4854 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a936b898-5163-4c5e-ac30-64af1533cec7-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.245243 4854 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack/glance3267-account-delete-cwdxm" secret="" err="secret \"galera-openstack-dockercfg-x6thd\" not found" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.247638 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/770ca0a9-4c48-446b-be08-84b06d20d501-logs" (OuterVolumeSpecName: "logs") pod "770ca0a9-4c48-446b-be08-84b06d20d501" (UID: "770ca0a9-4c48-446b-be08-84b06d20d501"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.259477 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-75fd88c566-5j4xn" podStartSLOduration=7.259458481 podStartE2EDuration="7.259458481s" podCreationTimestamp="2025-10-07 12:45:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:46:04.97583571 +0000 UTC m=+1280.963667965" watchObservedRunningTime="2025-10-07 12:46:05.259458481 +0000 UTC m=+1281.247290726" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.269804 4854 scope.go:117] "RemoveContainer" containerID="dd65a138aac9fa721dada3f006a871724f98f90b0fa13687c508bb7e6049a302" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.283476 4854 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.308175 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.315951 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.322399 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.322848 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="c2cebadb-2142-477a-85b3-53e7c73fa6cc" containerName="memcached" containerID="cri-o://452b12795119ff3315cae84f6ce86ad79fa535175852c78bfc7a106938624b75" gracePeriod=30 Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.330496 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-frlpd"] Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.334907 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-f7kxk"] Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.344324 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-frlpd"] Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.347336 4854 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/770ca0a9-4c48-446b-be08-84b06d20d501-logs\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.347366 4854 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.349041 4854 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7595d98994-smt7c" podUID="6002f7a4-27d6-4554-a486-87926ebcf57e" containerName="barbican-api" 
probeResult="failure" output="Get \"https://10.217.0.155:9311/healthcheck\": dial tcp 10.217.0.155:9311: connect: connection refused" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.356325 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-f7kxk"] Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.358674 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-775676c768-frcfp"] Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.358918 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-775676c768-frcfp" podUID="9869a8d4-db8e-4aba-82d1-6d02c3cf988e" containerName="keystone-api" containerID="cri-o://d16e3dc32eb57dea6b6b77bcedd6210c8586d9780a519f65e63564f3af418dfd" gracePeriod=30 Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.362933 4854 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7595d98994-smt7c" podUID="6002f7a4-27d6-4554-a486-87926ebcf57e" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.155:9311/healthcheck\": dial tcp 10.217.0.155:9311: connect: connection refused" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.363217 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a936b898-5163-4c5e-ac30-64af1533cec7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a936b898-5163-4c5e-ac30-64af1533cec7" (UID: "a936b898-5163-4c5e-ac30-64af1533cec7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.368238 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone242e-account-delete-s9qn8"] Oct 07 12:46:05 crc kubenswrapper[4854]: E1007 12:46:05.368667 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39b6e215-6643-4003-9c53-d33f6af39494" containerName="barbican-keystone-listener" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.368679 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="39b6e215-6643-4003-9c53-d33f6af39494" containerName="barbican-keystone-listener" Oct 07 12:46:05 crc kubenswrapper[4854]: E1007 12:46:05.368692 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39b6e215-6643-4003-9c53-d33f6af39494" containerName="barbican-keystone-listener-log" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.368698 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="39b6e215-6643-4003-9c53-d33f6af39494" containerName="barbican-keystone-listener-log" Oct 07 12:46:05 crc kubenswrapper[4854]: E1007 12:46:05.368710 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbbd823b-2e1d-4901-855a-72cd9a13a6fd" containerName="ovsdbserver-nb" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.368716 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbbd823b-2e1d-4901-855a-72cd9a13a6fd" containerName="ovsdbserver-nb" Oct 07 12:46:05 crc kubenswrapper[4854]: E1007 12:46:05.368723 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="770ca0a9-4c48-446b-be08-84b06d20d501" containerName="nova-metadata-metadata" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.368728 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="770ca0a9-4c48-446b-be08-84b06d20d501" containerName="nova-metadata-metadata" Oct 07 12:46:05 crc kubenswrapper[4854]: E1007 12:46:05.368740 4854 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="770ca0a9-4c48-446b-be08-84b06d20d501" containerName="nova-metadata-log" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.368746 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="770ca0a9-4c48-446b-be08-84b06d20d501" containerName="nova-metadata-log" Oct 07 12:46:05 crc kubenswrapper[4854]: E1007 12:46:05.368762 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a936b898-5163-4c5e-ac30-64af1533cec7" containerName="glance-httpd" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.368770 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="a936b898-5163-4c5e-ac30-64af1533cec7" containerName="glance-httpd" Oct 07 12:46:05 crc kubenswrapper[4854]: E1007 12:46:05.368778 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c8167ae-8941-4616-bee2-ff0fb5e98c16" containerName="proxy-server" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.368784 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c8167ae-8941-4616-bee2-ff0fb5e98c16" containerName="proxy-server" Oct 07 12:46:05 crc kubenswrapper[4854]: E1007 12:46:05.368799 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22343ee2-64b7-4496-b74c-c9860920e953" containerName="nova-cell1-novncproxy-novncproxy" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.368806 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="22343ee2-64b7-4496-b74c-c9860920e953" containerName="nova-cell1-novncproxy-novncproxy" Oct 07 12:46:05 crc kubenswrapper[4854]: E1007 12:46:05.368812 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bcb9c6a-b0b3-438e-9e00-b3706ea71adf" containerName="cinder-api-log" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.368818 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bcb9c6a-b0b3-438e-9e00-b3706ea71adf" containerName="cinder-api-log" Oct 07 12:46:05 crc kubenswrapper[4854]: E1007 12:46:05.368829 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f63842c-f85f-4e07-8221-4ce96b22bf44" containerName="ovsdbserver-sb" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.368835 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f63842c-f85f-4e07-8221-4ce96b22bf44" containerName="ovsdbserver-sb" Oct 07 12:46:05 crc kubenswrapper[4854]: E1007 12:46:05.368845 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47f8159b-e07f-47bd-92e8-a57f3e0c545d" containerName="openstack-network-exporter" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.368853 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="47f8159b-e07f-47bd-92e8-a57f3e0c545d" containerName="openstack-network-exporter" Oct 07 12:46:05 crc kubenswrapper[4854]: E1007 12:46:05.368860 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a936b898-5163-4c5e-ac30-64af1533cec7" containerName="glance-log" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.368865 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="a936b898-5163-4c5e-ac30-64af1533cec7" containerName="glance-log" Oct 07 12:46:05 crc kubenswrapper[4854]: E1007 12:46:05.368874 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bcb9c6a-b0b3-438e-9e00-b3706ea71adf" containerName="cinder-api" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.368882 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bcb9c6a-b0b3-438e-9e00-b3706ea71adf" containerName="cinder-api" Oct 07 12:46:05 crc kubenswrapper[4854]: 
E1007 12:46:05.368893 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43bd8c48-fc92-4f02-af3e-db78673cacb9" containerName="dnsmasq-dns" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.368900 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="43bd8c48-fc92-4f02-af3e-db78673cacb9" containerName="dnsmasq-dns" Oct 07 12:46:05 crc kubenswrapper[4854]: E1007 12:46:05.368909 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f63842c-f85f-4e07-8221-4ce96b22bf44" containerName="openstack-network-exporter" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.368914 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f63842c-f85f-4e07-8221-4ce96b22bf44" containerName="openstack-network-exporter" Oct 07 12:46:05 crc kubenswrapper[4854]: E1007 12:46:05.368922 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c8167ae-8941-4616-bee2-ff0fb5e98c16" containerName="proxy-httpd" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.368928 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c8167ae-8941-4616-bee2-ff0fb5e98c16" containerName="proxy-httpd" Oct 07 12:46:05 crc kubenswrapper[4854]: E1007 12:46:05.368939 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43bd8c48-fc92-4f02-af3e-db78673cacb9" containerName="init" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.368944 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="43bd8c48-fc92-4f02-af3e-db78673cacb9" containerName="init" Oct 07 12:46:05 crc kubenswrapper[4854]: E1007 12:46:05.368953 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbbd823b-2e1d-4901-855a-72cd9a13a6fd" containerName="openstack-network-exporter" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.368959 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbbd823b-2e1d-4901-855a-72cd9a13a6fd" containerName="openstack-network-exporter" Oct 07 12:46:05 crc kubenswrapper[4854]: E1007 12:46:05.368973 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3565e266-6994-4000-a4f2-2901e22f6682" containerName="nova-cell0-conductor-conductor" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.368979 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="3565e266-6994-4000-a4f2-2901e22f6682" containerName="nova-cell0-conductor-conductor" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.369127 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f63842c-f85f-4e07-8221-4ce96b22bf44" containerName="openstack-network-exporter" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.369141 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="3565e266-6994-4000-a4f2-2901e22f6682" containerName="nova-cell0-conductor-conductor" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.369238 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="a936b898-5163-4c5e-ac30-64af1533cec7" containerName="glance-log" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.369245 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="47f8159b-e07f-47bd-92e8-a57f3e0c545d" containerName="openstack-network-exporter" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.369253 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="39b6e215-6643-4003-9c53-d33f6af39494" containerName="barbican-keystone-listener" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.369265 4854 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="22343ee2-64b7-4496-b74c-c9860920e953" containerName="nova-cell1-novncproxy-novncproxy" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.369279 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="a936b898-5163-4c5e-ac30-64af1533cec7" containerName="glance-httpd" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.369291 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bcb9c6a-b0b3-438e-9e00-b3706ea71adf" containerName="cinder-api" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.369301 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bcb9c6a-b0b3-438e-9e00-b3706ea71adf" containerName="cinder-api-log" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.369311 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="770ca0a9-4c48-446b-be08-84b06d20d501" containerName="nova-metadata-metadata" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.369321 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c8167ae-8941-4616-bee2-ff0fb5e98c16" containerName="proxy-server" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.369332 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f63842c-f85f-4e07-8221-4ce96b22bf44" containerName="ovsdbserver-sb" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.369342 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbbd823b-2e1d-4901-855a-72cd9a13a6fd" containerName="ovsdbserver-nb" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.369349 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbbd823b-2e1d-4901-855a-72cd9a13a6fd" containerName="openstack-network-exporter" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.369359 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c8167ae-8941-4616-bee2-ff0fb5e98c16" containerName="proxy-httpd" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.369368 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="43bd8c48-fc92-4f02-af3e-db78673cacb9" containerName="dnsmasq-dns" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.369376 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="770ca0a9-4c48-446b-be08-84b06d20d501" containerName="nova-metadata-log" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.369384 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="39b6e215-6643-4003-9c53-d33f6af39494" containerName="barbican-keystone-listener-log" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.369956 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone242e-account-delete-s9qn8" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.380680 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone242e-account-delete-s9qn8"] Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.384457 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.386446 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/770ca0a9-4c48-446b-be08-84b06d20d501-kube-api-access-ttgk7" (OuterVolumeSpecName: "kube-api-access-ttgk7") pod "770ca0a9-4c48-446b-be08-84b06d20d501" (UID: "770ca0a9-4c48-446b-be08-84b06d20d501"). InnerVolumeSpecName "kube-api-access-ttgk7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.386485 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bcb9c6a-b0b3-438e-9e00-b3706ea71adf-config-data" (OuterVolumeSpecName: "config-data") pod "2bcb9c6a-b0b3-438e-9e00-b3706ea71adf" (UID: "2bcb9c6a-b0b3-438e-9e00-b3706ea71adf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.386568 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bcb9c6a-b0b3-438e-9e00-b3706ea71adf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2bcb9c6a-b0b3-438e-9e00-b3706ea71adf" (UID: "2bcb9c6a-b0b3-438e-9e00-b3706ea71adf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.389530 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/novacell0e568-account-delete-srq52" podStartSLOduration=6.389508047 podStartE2EDuration="6.389508047s" podCreationTimestamp="2025-10-07 12:45:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:46:05.225071784 +0000 UTC m=+1281.212904039" watchObservedRunningTime="2025-10-07 12:46:05.389508047 +0000 UTC m=+1281.377340302" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.420051 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-h4dv5"] Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.420118 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-h4dv5"] Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.440840 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone242e-account-delete-s9qn8"] Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.450329 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9v49\" (UniqueName: \"kubernetes.io/projected/6f22a792-7012-4cfb-9442-20ef086f5532-kube-api-access-v9v49\") pod \"keystone242e-account-delete-s9qn8\" (UID: \"6f22a792-7012-4cfb-9442-20ef086f5532\") " pod="openstack/keystone242e-account-delete-s9qn8" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.450542 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bcb9c6a-b0b3-438e-9e00-b3706ea71adf-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.450562 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bcb9c6a-b0b3-438e-9e00-b3706ea71adf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.450574 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a936b898-5163-4c5e-ac30-64af1533cec7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.450585 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttgk7\" (UniqueName: \"kubernetes.io/projected/770ca0a9-4c48-446b-be08-84b06d20d501-kube-api-access-ttgk7\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:05 crc 
kubenswrapper[4854]: I1007 12:46:05.484809 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-242e-account-create-7jrdt"] Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.488871 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-242e-account-create-7jrdt"] Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.504181 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/novaapi643b-account-delete-92r97" podStartSLOduration=6.5041594719999996 podStartE2EDuration="6.504159472s" podCreationTimestamp="2025-10-07 12:45:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:46:05.321441285 +0000 UTC m=+1281.309273540" watchObservedRunningTime="2025-10-07 12:46:05.504159472 +0000 UTC m=+1281.491991727" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.525740 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-65b8874fd7-dnnjf" podStartSLOduration=7.525720383 podStartE2EDuration="7.525720383s" podCreationTimestamp="2025-10-07 12:45:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:46:05.34861524 +0000 UTC m=+1281.336447495" watchObservedRunningTime="2025-10-07 12:46:05.525720383 +0000 UTC m=+1281.513552628" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.541314 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinderc7c1-account-delete-dz57l" podStartSLOduration=6.541292359 podStartE2EDuration="6.541292359s" podCreationTimestamp="2025-10-07 12:45:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 12:46:05.386776547 +0000 UTC m=+1281.374608802" watchObservedRunningTime="2025-10-07 12:46:05.541292359 +0000 UTC m=+1281.529124614" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.553115 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9v49\" (UniqueName: \"kubernetes.io/projected/6f22a792-7012-4cfb-9442-20ef086f5532-kube-api-access-v9v49\") pod \"keystone242e-account-delete-s9qn8\" (UID: \"6f22a792-7012-4cfb-9442-20ef086f5532\") " pod="openstack/keystone242e-account-delete-s9qn8" Oct 07 12:46:05 crc kubenswrapper[4854]: E1007 12:46:05.559202 4854 projected.go:194] Error preparing data for projected volume kube-api-access-v9v49 for pod openstack/keystone242e-account-delete-s9qn8: failed to fetch token: serviceaccounts "galera-openstack" not found Oct 07 12:46:05 crc kubenswrapper[4854]: E1007 12:46:05.559267 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6f22a792-7012-4cfb-9442-20ef086f5532-kube-api-access-v9v49 podName:6f22a792-7012-4cfb-9442-20ef086f5532 nodeName:}" failed. No retries permitted until 2025-10-07 12:46:06.059252104 +0000 UTC m=+1282.047084359 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-v9v49" (UniqueName: "kubernetes.io/projected/6f22a792-7012-4cfb-9442-20ef086f5532-kube-api-access-v9v49") pod "keystone242e-account-delete-s9qn8" (UID: "6f22a792-7012-4cfb-9442-20ef086f5532") : failed to fetch token: serviceaccounts "galera-openstack" not found Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.668418 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bcb9c6a-b0b3-438e-9e00-b3706ea71adf-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2bcb9c6a-b0b3-438e-9e00-b3706ea71adf" (UID: "2bcb9c6a-b0b3-438e-9e00-b3706ea71adf"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.746319 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/770ca0a9-4c48-446b-be08-84b06d20d501-config-data" (OuterVolumeSpecName: "config-data") pod "770ca0a9-4c48-446b-be08-84b06d20d501" (UID: "770ca0a9-4c48-446b-be08-84b06d20d501"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.750502 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c8167ae-8941-4616-bee2-ff0fb5e98c16-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2c8167ae-8941-4616-bee2-ff0fb5e98c16" (UID: "2c8167ae-8941-4616-bee2-ff0fb5e98c16"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.777809 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bcb9c6a-b0b3-438e-9e00-b3706ea71adf-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2bcb9c6a-b0b3-438e-9e00-b3706ea71adf" (UID: "2bcb9c6a-b0b3-438e-9e00-b3706ea71adf"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.783694 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c8167ae-8941-4616-bee2-ff0fb5e98c16-config-data" (OuterVolumeSpecName: "config-data") pod "2c8167ae-8941-4616-bee2-ff0fb5e98c16" (UID: "2c8167ae-8941-4616-bee2-ff0fb5e98c16"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.798700 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="e599e18f-63c0-4756-845c-973257921fd0" containerName="galera" containerID="cri-o://00cf8ccd1bcd63c009deb5be5b2933469ec3d1b0a13db00b6361bb703ec2d7fc" gracePeriod=30 Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.803510 4854 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c8167ae-8941-4616-bee2-ff0fb5e98c16-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.803553 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/770ca0a9-4c48-446b-be08-84b06d20d501-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.803632 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c8167ae-8941-4616-bee2-ff0fb5e98c16-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.804076 4854 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bcb9c6a-b0b3-438e-9e00-b3706ea71adf-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.804094 4854 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bcb9c6a-b0b3-438e-9e00-b3706ea71adf-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.814373 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c8167ae-8941-4616-bee2-ff0fb5e98c16-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2c8167ae-8941-4616-bee2-ff0fb5e98c16" (UID: "2c8167ae-8941-4616-bee2-ff0fb5e98c16"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.834825 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a936b898-5163-4c5e-ac30-64af1533cec7-config-data" (OuterVolumeSpecName: "config-data") pod "a936b898-5163-4c5e-ac30-64af1533cec7" (UID: "a936b898-5163-4c5e-ac30-64af1533cec7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.846270 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/770ca0a9-4c48-446b-be08-84b06d20d501-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "770ca0a9-4c48-446b-be08-84b06d20d501" (UID: "770ca0a9-4c48-446b-be08-84b06d20d501"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.865190 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c8167ae-8941-4616-bee2-ff0fb5e98c16-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2c8167ae-8941-4616-bee2-ff0fb5e98c16" (UID: "2c8167ae-8941-4616-bee2-ff0fb5e98c16"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:05 crc kubenswrapper[4854]: E1007 12:46:05.888658 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="efd6f20e41e37b885bf213a2da547584c13e4bdab4263b0b4585c26d56600700" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.888865 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/770ca0a9-4c48-446b-be08-84b06d20d501-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "770ca0a9-4c48-446b-be08-84b06d20d501" (UID: "770ca0a9-4c48-446b-be08-84b06d20d501"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:05 crc kubenswrapper[4854]: E1007 12:46:05.890201 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="efd6f20e41e37b885bf213a2da547584c13e4bdab4263b0b4585c26d56600700" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.908121 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/770ca0a9-4c48-446b-be08-84b06d20d501-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.908182 4854 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/770ca0a9-4c48-446b-be08-84b06d20d501-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.908197 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c8167ae-8941-4616-bee2-ff0fb5e98c16-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.908209 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a936b898-5163-4c5e-ac30-64af1533cec7-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:05 crc kubenswrapper[4854]: I1007 12:46:05.908219 4854 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c8167ae-8941-4616-bee2-ff0fb5e98c16-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:05 crc kubenswrapper[4854]: E1007 12:46:05.910426 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="efd6f20e41e37b885bf213a2da547584c13e4bdab4263b0b4585c26d56600700" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 07 12:46:05 crc kubenswrapper[4854]: E1007 12:46:05.910497 4854 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="cac73de2-996a-4e04-abde-1153b44058bc" containerName="nova-cell1-conductor-conductor" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.120002 4854 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-v9v49\" (UniqueName: \"kubernetes.io/projected/6f22a792-7012-4cfb-9442-20ef086f5532-kube-api-access-v9v49\") pod \"keystone242e-account-delete-s9qn8\" (UID: \"6f22a792-7012-4cfb-9442-20ef086f5532\") " pod="openstack/keystone242e-account-delete-s9qn8" Oct 07 12:46:06 crc kubenswrapper[4854]: E1007 12:46:06.128241 4854 projected.go:194] Error preparing data for projected volume kube-api-access-v9v49 for pod openstack/keystone242e-account-delete-s9qn8: failed to fetch token: serviceaccounts "galera-openstack" not found Oct 07 12:46:06 crc kubenswrapper[4854]: E1007 12:46:06.128313 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6f22a792-7012-4cfb-9442-20ef086f5532-kube-api-access-v9v49 podName:6f22a792-7012-4cfb-9442-20ef086f5532 nodeName:}" failed. No retries permitted until 2025-10-07 12:46:07.128291017 +0000 UTC m=+1283.116123282 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-v9v49" (UniqueName: "kubernetes.io/projected/6f22a792-7012-4cfb-9442-20ef086f5532-kube-api-access-v9v49") pod "keystone242e-account-delete-s9qn8" (UID: "6f22a792-7012-4cfb-9442-20ef086f5532") : failed to fetch token: serviceaccounts "galera-openstack" not found Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.228926 4854 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="4c293f13-b2a5-4d4b-9f69-fd118e34eab2" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.102:5671: connect: connection refused" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.260367 4854 generic.go:334] "Generic (PLEG): container finished" podID="e309be64-7a5a-4156-89a6-d1201eaaff63" containerID="ca038c665c300165c2b08c130cce95306c24d5eee4ccd22ba09577872f53b9c2" exitCode=2 Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.260495 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e309be64-7a5a-4156-89a6-d1201eaaff63","Type":"ContainerDied","Data":"ca038c665c300165c2b08c130cce95306c24d5eee4ccd22ba09577872f53b9c2"} Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.265512 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7595d98994-smt7c" event={"ID":"6002f7a4-27d6-4554-a486-87926ebcf57e","Type":"ContainerDied","Data":"9aa2233d29f445f6dbb48e7902cbb60c9de1fb72b3cf982ec1c19a1afd9da099"} Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.265562 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9aa2233d29f445f6dbb48e7902cbb60c9de1fb72b3cf982ec1c19a1afd9da099" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.267682 4854 generic.go:334] "Generic (PLEG): container finished" podID="aaa0738a-daf1-479f-9dbd-913806703370" containerID="c2f64bd6a502240eba60aec7a5a06461ffdecfdc04beff785aa7c921c7ef988e" exitCode=1 Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.267758 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapi643b-account-delete-92r97" event={"ID":"aaa0738a-daf1-479f-9dbd-913806703370","Type":"ContainerDied","Data":"c2f64bd6a502240eba60aec7a5a06461ffdecfdc04beff785aa7c921c7ef988e"} Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.269524 4854 generic.go:334] "Generic (PLEG): container finished" podID="7c47fe71-3b92-4490-8488-d98d0e25519e" containerID="97fcf4e9b37412fc9921e0d9fdf26f43fba6f08230c7b20f4625c9b0374aebc2" exitCode=1 Oct 07 12:46:06 crc 
kubenswrapper[4854]: I1007 12:46:06.269602 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell0e568-account-delete-srq52" event={"ID":"7c47fe71-3b92-4490-8488-d98d0e25519e","Type":"ContainerDied","Data":"97fcf4e9b37412fc9921e0d9fdf26f43fba6f08230c7b20f4625c9b0374aebc2"} Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.273894 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"37dd5983-0d4d-4097-8657-f408e9bc68c0","Type":"ContainerDied","Data":"5eb854a7a43acaab354b891bdb18b1efb536f1c9425821ac5965dae2fea77fa6"} Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.273953 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5eb854a7a43acaab354b891bdb18b1efb536f1c9425821ac5965dae2fea77fa6" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.277628 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6f54bfd6b4-g5gq4" event={"ID":"a03c4a0d-6346-43e4-8db1-f653b5dfa420","Type":"ContainerDied","Data":"37ba3c6481940bfbdda3c0521fdf02c60e40bf5891809d93abc2401822c90ae2"} Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.277695 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37ba3c6481940bfbdda3c0521fdf02c60e40bf5891809d93abc2401822c90ae2" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.282862 4854 generic.go:334] "Generic (PLEG): container finished" podID="5179f78d-3a8f-4621-95f3-e147ff8da79f" containerID="02fc41252a41df476e3c3e2c65ab0a5aca662f6c1710215b0dcdfa49eb74b167" exitCode=143 Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.282984 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-65b8874fd7-dnnjf" event={"ID":"5179f78d-3a8f-4621-95f3-e147ff8da79f","Type":"ContainerDied","Data":"02fc41252a41df476e3c3e2c65ab0a5aca662f6c1710215b0dcdfa49eb74b167"} Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.285929 4854 generic.go:334] "Generic (PLEG): container finished" podID="735fa97d-e751-4957-bdae-3ae0b10635d2" containerID="c2b608d16398a3c2e79505a49f4bc3d6c058f2350032df3d21f4491f85f50701" exitCode=1 Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.286017 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbicana01a-account-delete-8phrr" event={"ID":"735fa97d-e751-4957-bdae-3ae0b10635d2","Type":"ContainerDied","Data":"c2b608d16398a3c2e79505a49f4bc3d6c058f2350032df3d21f4491f85f50701"} Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.293589 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"13453d02-4f55-45a7-98be-1cd41c741a3e","Type":"ContainerDied","Data":"491449b30345fa51266807b560fb43c69324ed35091a73d2f9c7b2e839da0060"} Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.293638 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="491449b30345fa51266807b560fb43c69324ed35091a73d2f9c7b2e839da0060" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.300945 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"770ca0a9-4c48-446b-be08-84b06d20d501","Type":"ContainerDied","Data":"f73202056a1e805a5e2c0e483143fd82b536fa11f7554dece33ccf0abc6bbe11"} Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.301055 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.315945 4854 generic.go:334] "Generic (PLEG): container finished" podID="e7453b38-f6c3-4fe7-b15d-5bd8112dc687" containerID="88e0e41ebc891694b10be4302198088674ffb09627e1081edead8304fb62bb09" exitCode=1 Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.316000 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance3267-account-delete-cwdxm" event={"ID":"e7453b38-f6c3-4fe7-b15d-5bd8112dc687","Type":"ContainerDied","Data":"88e0e41ebc891694b10be4302198088674ffb09627e1081edead8304fb62bb09"} Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.329243 4854 generic.go:334] "Generic (PLEG): container finished" podID="66f80399-ed98-4aba-9db5-759ad2e314fa" containerID="75c3ef86d0dbd953431e4fcb366d7ac773027294d708d52f2fae09e3f2500930" exitCode=0 Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.329713 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placementd4d8-account-delete-fh2n2" event={"ID":"66f80399-ed98-4aba-9db5-759ad2e314fa","Type":"ContainerDied","Data":"75c3ef86d0dbd953431e4fcb366d7ac773027294d708d52f2fae09e3f2500930"} Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.343842 4854 generic.go:334] "Generic (PLEG): container finished" podID="ac535972-fa59-4e7f-818b-345da6937c14" containerID="4ab5598f2f1f8e2a8745b5f522f943448e0b454fdb30f12ff064c16195d663a3" exitCode=0 Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.344131 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutronb378-account-delete-jzbpj" event={"ID":"ac535972-fa59-4e7f-818b-345da6937c14","Type":"ContainerDied","Data":"4ab5598f2f1f8e2a8745b5f522f943448e0b454fdb30f12ff064c16195d663a3"} Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.370795 4854 generic.go:334] "Generic (PLEG): container finished" podID="47e6a48a-4ef0-4764-a132-50140d86a6b2" containerID="43911c074518df2d12734bda5be2d0438565e37df58ee0f1bb19e3735feca614" exitCode=0 Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.370829 4854 generic.go:334] "Generic (PLEG): container finished" podID="47e6a48a-4ef0-4764-a132-50140d86a6b2" containerID="1bb3783752e90f8f65fc57d4fca86c174ad0436acd90b5d1426146067be381e1" exitCode=2 Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.370980 4854 generic.go:334] "Generic (PLEG): container finished" podID="47e6a48a-4ef0-4764-a132-50140d86a6b2" containerID="b9891d843c836c188cc5c0d2b4df4db333aab6e636378878b75cfbd65ecdb10b" exitCode=0 Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.371052 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"47e6a48a-4ef0-4764-a132-50140d86a6b2","Type":"ContainerDied","Data":"43911c074518df2d12734bda5be2d0438565e37df58ee0f1bb19e3735feca614"} Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.371082 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"47e6a48a-4ef0-4764-a132-50140d86a6b2","Type":"ContainerDied","Data":"1bb3783752e90f8f65fc57d4fca86c174ad0436acd90b5d1426146067be381e1"} Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.371100 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"47e6a48a-4ef0-4764-a132-50140d86a6b2","Type":"ContainerDied","Data":"b9891d843c836c188cc5c0d2b4df4db333aab6e636378878b75cfbd65ecdb10b"} Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.381781 4854 generic.go:334] "Generic (PLEG): container finished" 
podID="04ffb838-3774-402c-9cdf-d11e51fb21e5" containerID="1ee628a9c0b67d36be2b170a1da5ccab6d647b96fb94a73b6590d405c4039a11" exitCode=143 Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.381860 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-75fd88c566-5j4xn" event={"ID":"04ffb838-3774-402c-9cdf-d11e51fb21e5","Type":"ContainerDied","Data":"1ee628a9c0b67d36be2b170a1da5ccab6d647b96fb94a73b6590d405c4039a11"} Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.386352 4854 generic.go:334] "Generic (PLEG): container finished" podID="787f934e-4f31-4b00-8cf6-380efd34aaad" containerID="beb49729e015c29539bad7a67c84a635504e19d4bc244ead20fb12cbae9b8e94" exitCode=1 Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.386403 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinderc7c1-account-delete-dz57l" event={"ID":"787f934e-4f31-4b00-8cf6-380efd34aaad","Type":"ContainerDied","Data":"beb49729e015c29539bad7a67c84a635504e19d4bc244ead20fb12cbae9b8e94"} Oct 07 12:46:06 crc kubenswrapper[4854]: E1007 12:46:06.526486 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-v9v49], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/keystone242e-account-delete-s9qn8" podUID="6f22a792-7012-4cfb-9442-20ef086f5532" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.536921 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.550606 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.559030 4854 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="79513100-48d2-4e7b-ae14-888322cab8f3" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.103:5671: connect: connection refused" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.560350 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.562406 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.568642 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.578301 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.586378 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.587121 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6f54bfd6b4-g5gq4" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.593213 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-667c455579-lnd9l"] Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.598797 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-667c455579-lnd9l"] Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.618002 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.623884 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7595d98994-smt7c" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.628764 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.636953 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placementd4d8-account-delete-fh2n2" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.659305 4854 scope.go:117] "RemoveContainer" containerID="364708de26903cb2d2d7224428a3d27a659926ef0d8ef5da26e1579482e5acf8" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.710003 4854 scope.go:117] "RemoveContainer" containerID="dd65a138aac9fa721dada3f006a871724f98f90b0fa13687c508bb7e6049a302" Oct 07 12:46:06 crc kubenswrapper[4854]: E1007 12:46:06.710655 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd65a138aac9fa721dada3f006a871724f98f90b0fa13687c508bb7e6049a302\": container with ID starting with dd65a138aac9fa721dada3f006a871724f98f90b0fa13687c508bb7e6049a302 not found: ID does not exist" containerID="dd65a138aac9fa721dada3f006a871724f98f90b0fa13687c508bb7e6049a302" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.710694 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd65a138aac9fa721dada3f006a871724f98f90b0fa13687c508bb7e6049a302"} err="failed to get container status \"dd65a138aac9fa721dada3f006a871724f98f90b0fa13687c508bb7e6049a302\": rpc error: code = NotFound desc = could not find container \"dd65a138aac9fa721dada3f006a871724f98f90b0fa13687c508bb7e6049a302\": container with ID starting with dd65a138aac9fa721dada3f006a871724f98f90b0fa13687c508bb7e6049a302 not found: ID does not exist" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.710721 4854 scope.go:117] "RemoveContainer" containerID="364708de26903cb2d2d7224428a3d27a659926ef0d8ef5da26e1579482e5acf8" Oct 07 12:46:06 crc kubenswrapper[4854]: E1007 12:46:06.711479 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"364708de26903cb2d2d7224428a3d27a659926ef0d8ef5da26e1579482e5acf8\": container with ID starting with 364708de26903cb2d2d7224428a3d27a659926ef0d8ef5da26e1579482e5acf8 not found: ID does not exist" containerID="364708de26903cb2d2d7224428a3d27a659926ef0d8ef5da26e1579482e5acf8" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.711508 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"364708de26903cb2d2d7224428a3d27a659926ef0d8ef5da26e1579482e5acf8"} err="failed to get container status \"364708de26903cb2d2d7224428a3d27a659926ef0d8ef5da26e1579482e5acf8\": rpc error: code = NotFound desc = could not find container \"364708de26903cb2d2d7224428a3d27a659926ef0d8ef5da26e1579482e5acf8\": container with ID starting with 364708de26903cb2d2d7224428a3d27a659926ef0d8ef5da26e1579482e5acf8 not found: ID does not exist" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.711526 4854 scope.go:117] "RemoveContainer" containerID="66d61a36b5d71d29e4fc833793e7b8e2d512bb92c6cdc2ceee1421b70515026b" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.726265 4854 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="0c69757e-7365-47af-a7ef-03c00a3aae33" path="/var/lib/kubelet/pods/0c69757e-7365-47af-a7ef-03c00a3aae33/volumes" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.727435 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bcb9c6a-b0b3-438e-9e00-b3706ea71adf" path="/var/lib/kubelet/pods/2bcb9c6a-b0b3-438e-9e00-b3706ea71adf/volumes" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.729771 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c8167ae-8941-4616-bee2-ff0fb5e98c16" path="/var/lib/kubelet/pods/2c8167ae-8941-4616-bee2-ff0fb5e98c16/volumes" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.730936 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/e309be64-7a5a-4156-89a6-d1201eaaff63-kube-state-metrics-tls-certs\") pod \"e309be64-7a5a-4156-89a6-d1201eaaff63\" (UID: \"e309be64-7a5a-4156-89a6-d1201eaaff63\") " Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.730985 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13453d02-4f55-45a7-98be-1cd41c741a3e-combined-ca-bundle\") pod \"13453d02-4f55-45a7-98be-1cd41c741a3e\" (UID: \"13453d02-4f55-45a7-98be-1cd41c741a3e\") " Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.731010 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/37dd5983-0d4d-4097-8657-f408e9bc68c0-internal-tls-certs\") pod \"37dd5983-0d4d-4097-8657-f408e9bc68c0\" (UID: \"37dd5983-0d4d-4097-8657-f408e9bc68c0\") " Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.731040 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2jwc\" (UniqueName: \"kubernetes.io/projected/6002f7a4-27d6-4554-a486-87926ebcf57e-kube-api-access-q2jwc\") pod \"6002f7a4-27d6-4554-a486-87926ebcf57e\" (UID: \"6002f7a4-27d6-4554-a486-87926ebcf57e\") " Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.731071 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/13453d02-4f55-45a7-98be-1cd41c741a3e-internal-tls-certs\") pod \"13453d02-4f55-45a7-98be-1cd41c741a3e\" (UID: \"13453d02-4f55-45a7-98be-1cd41c741a3e\") " Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.731096 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/e309be64-7a5a-4156-89a6-d1201eaaff63-kube-state-metrics-tls-config\") pod \"e309be64-7a5a-4156-89a6-d1201eaaff63\" (UID: \"e309be64-7a5a-4156-89a6-d1201eaaff63\") " Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.731118 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6002f7a4-27d6-4554-a486-87926ebcf57e-internal-tls-certs\") pod \"6002f7a4-27d6-4554-a486-87926ebcf57e\" (UID: \"6002f7a4-27d6-4554-a486-87926ebcf57e\") " Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.731137 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a03c4a0d-6346-43e4-8db1-f653b5dfa420-combined-ca-bundle\") pod \"a03c4a0d-6346-43e4-8db1-f653b5dfa420\" (UID: 
\"a03c4a0d-6346-43e4-8db1-f653b5dfa420\") " Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.731219 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13453d02-4f55-45a7-98be-1cd41c741a3e-scripts\") pod \"13453d02-4f55-45a7-98be-1cd41c741a3e\" (UID: \"13453d02-4f55-45a7-98be-1cd41c741a3e\") " Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.731257 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37dd5983-0d4d-4097-8657-f408e9bc68c0-logs\") pod \"37dd5983-0d4d-4097-8657-f408e9bc68c0\" (UID: \"37dd5983-0d4d-4097-8657-f408e9bc68c0\") " Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.731283 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6002f7a4-27d6-4554-a486-87926ebcf57e-config-data\") pod \"6002f7a4-27d6-4554-a486-87926ebcf57e\" (UID: \"6002f7a4-27d6-4554-a486-87926ebcf57e\") " Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.731306 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/13453d02-4f55-45a7-98be-1cd41c741a3e-httpd-run\") pod \"13453d02-4f55-45a7-98be-1cd41c741a3e\" (UID: \"13453d02-4f55-45a7-98be-1cd41c741a3e\") " Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.731326 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37dd5983-0d4d-4097-8657-f408e9bc68c0-combined-ca-bundle\") pod \"37dd5983-0d4d-4097-8657-f408e9bc68c0\" (UID: \"37dd5983-0d4d-4097-8657-f408e9bc68c0\") " Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.731354 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a03c4a0d-6346-43e4-8db1-f653b5dfa420-config-data\") pod \"a03c4a0d-6346-43e4-8db1-f653b5dfa420\" (UID: \"a03c4a0d-6346-43e4-8db1-f653b5dfa420\") " Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.731389 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzz66\" (UniqueName: \"kubernetes.io/projected/66f80399-ed98-4aba-9db5-759ad2e314fa-kube-api-access-wzz66\") pod \"66f80399-ed98-4aba-9db5-759ad2e314fa\" (UID: \"66f80399-ed98-4aba-9db5-759ad2e314fa\") " Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.731409 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6002f7a4-27d6-4554-a486-87926ebcf57e-config-data-custom\") pod \"6002f7a4-27d6-4554-a486-87926ebcf57e\" (UID: \"6002f7a4-27d6-4554-a486-87926ebcf57e\") " Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.731429 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a03c4a0d-6346-43e4-8db1-f653b5dfa420-public-tls-certs\") pod \"a03c4a0d-6346-43e4-8db1-f653b5dfa420\" (UID: \"a03c4a0d-6346-43e4-8db1-f653b5dfa420\") " Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.731454 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13453d02-4f55-45a7-98be-1cd41c741a3e-logs\") pod \"13453d02-4f55-45a7-98be-1cd41c741a3e\" (UID: \"13453d02-4f55-45a7-98be-1cd41c741a3e\") " Oct 07 12:46:06 crc 
kubenswrapper[4854]: I1007 12:46:06.731478 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6002f7a4-27d6-4554-a486-87926ebcf57e-public-tls-certs\") pod \"6002f7a4-27d6-4554-a486-87926ebcf57e\" (UID: \"6002f7a4-27d6-4554-a486-87926ebcf57e\") " Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.731497 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a03c4a0d-6346-43e4-8db1-f653b5dfa420-scripts\") pod \"a03c4a0d-6346-43e4-8db1-f653b5dfa420\" (UID: \"a03c4a0d-6346-43e4-8db1-f653b5dfa420\") " Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.731511 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-td7lc\" (UniqueName: \"kubernetes.io/projected/e309be64-7a5a-4156-89a6-d1201eaaff63-kube-api-access-td7lc\") pod \"e309be64-7a5a-4156-89a6-d1201eaaff63\" (UID: \"e309be64-7a5a-4156-89a6-d1201eaaff63\") " Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.731543 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a03c4a0d-6346-43e4-8db1-f653b5dfa420-internal-tls-certs\") pod \"a03c4a0d-6346-43e4-8db1-f653b5dfa420\" (UID: \"a03c4a0d-6346-43e4-8db1-f653b5dfa420\") " Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.731534 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3565e266-6994-4000-a4f2-2901e22f6682" path="/var/lib/kubelet/pods/3565e266-6994-4000-a4f2-2901e22f6682/volumes" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.731563 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/37dd5983-0d4d-4097-8657-f408e9bc68c0-public-tls-certs\") pod \"37dd5983-0d4d-4097-8657-f408e9bc68c0\" (UID: \"37dd5983-0d4d-4097-8657-f408e9bc68c0\") " Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.731597 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"13453d02-4f55-45a7-98be-1cd41c741a3e\" (UID: \"13453d02-4f55-45a7-98be-1cd41c741a3e\") " Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.731618 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e309be64-7a5a-4156-89a6-d1201eaaff63-combined-ca-bundle\") pod \"e309be64-7a5a-4156-89a6-d1201eaaff63\" (UID: \"e309be64-7a5a-4156-89a6-d1201eaaff63\") " Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.731637 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sw9xk\" (UniqueName: \"kubernetes.io/projected/13453d02-4f55-45a7-98be-1cd41c741a3e-kube-api-access-sw9xk\") pod \"13453d02-4f55-45a7-98be-1cd41c741a3e\" (UID: \"13453d02-4f55-45a7-98be-1cd41c741a3e\") " Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.731660 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwgrc\" (UniqueName: \"kubernetes.io/projected/a03c4a0d-6346-43e4-8db1-f653b5dfa420-kube-api-access-vwgrc\") pod \"a03c4a0d-6346-43e4-8db1-f653b5dfa420\" (UID: \"a03c4a0d-6346-43e4-8db1-f653b5dfa420\") " Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.731697 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a03c4a0d-6346-43e4-8db1-f653b5dfa420-logs\") pod \"a03c4a0d-6346-43e4-8db1-f653b5dfa420\" (UID: \"a03c4a0d-6346-43e4-8db1-f653b5dfa420\") " Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.731718 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13453d02-4f55-45a7-98be-1cd41c741a3e-config-data\") pod \"13453d02-4f55-45a7-98be-1cd41c741a3e\" (UID: \"13453d02-4f55-45a7-98be-1cd41c741a3e\") " Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.731738 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6002f7a4-27d6-4554-a486-87926ebcf57e-combined-ca-bundle\") pod \"6002f7a4-27d6-4554-a486-87926ebcf57e\" (UID: \"6002f7a4-27d6-4554-a486-87926ebcf57e\") " Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.731768 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37dd5983-0d4d-4097-8657-f408e9bc68c0-config-data\") pod \"37dd5983-0d4d-4097-8657-f408e9bc68c0\" (UID: \"37dd5983-0d4d-4097-8657-f408e9bc68c0\") " Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.731785 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lg69z\" (UniqueName: \"kubernetes.io/projected/37dd5983-0d4d-4097-8657-f408e9bc68c0-kube-api-access-lg69z\") pod \"37dd5983-0d4d-4097-8657-f408e9bc68c0\" (UID: \"37dd5983-0d4d-4097-8657-f408e9bc68c0\") " Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.731808 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6002f7a4-27d6-4554-a486-87926ebcf57e-logs\") pod \"6002f7a4-27d6-4554-a486-87926ebcf57e\" (UID: \"6002f7a4-27d6-4554-a486-87926ebcf57e\") " Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.732273 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5683d2a3-82a3-4e7b-b5eb-9a2e46c10fc9" path="/var/lib/kubelet/pods/5683d2a3-82a3-4e7b-b5eb-9a2e46c10fc9/volumes" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.733218 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6002f7a4-27d6-4554-a486-87926ebcf57e-logs" (OuterVolumeSpecName: "logs") pod "6002f7a4-27d6-4554-a486-87926ebcf57e" (UID: "6002f7a4-27d6-4554-a486-87926ebcf57e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.734606 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="770ca0a9-4c48-446b-be08-84b06d20d501" path="/var/lib/kubelet/pods/770ca0a9-4c48-446b-be08-84b06d20d501/volumes" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.737304 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13453d02-4f55-45a7-98be-1cd41c741a3e-logs" (OuterVolumeSpecName: "logs") pod "13453d02-4f55-45a7-98be-1cd41c741a3e" (UID: "13453d02-4f55-45a7-98be-1cd41c741a3e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.737571 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a03c4a0d-6346-43e4-8db1-f653b5dfa420-logs" (OuterVolumeSpecName: "logs") pod "a03c4a0d-6346-43e4-8db1-f653b5dfa420" (UID: "a03c4a0d-6346-43e4-8db1-f653b5dfa420"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.740053 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13453d02-4f55-45a7-98be-1cd41c741a3e-kube-api-access-sw9xk" (OuterVolumeSpecName: "kube-api-access-sw9xk") pod "13453d02-4f55-45a7-98be-1cd41c741a3e" (UID: "13453d02-4f55-45a7-98be-1cd41c741a3e"). InnerVolumeSpecName "kube-api-access-sw9xk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.740131 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6002f7a4-27d6-4554-a486-87926ebcf57e-kube-api-access-q2jwc" (OuterVolumeSpecName: "kube-api-access-q2jwc") pod "6002f7a4-27d6-4554-a486-87926ebcf57e" (UID: "6002f7a4-27d6-4554-a486-87926ebcf57e"). InnerVolumeSpecName "kube-api-access-q2jwc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.740405 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13453d02-4f55-45a7-98be-1cd41c741a3e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "13453d02-4f55-45a7-98be-1cd41c741a3e" (UID: "13453d02-4f55-45a7-98be-1cd41c741a3e"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.741349 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b897234-a3ea-40e0-a94c-0f501794c5d4" path="/var/lib/kubelet/pods/7b897234-a3ea-40e0-a94c-0f501794c5d4/volumes" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.742231 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a936b898-5163-4c5e-ac30-64af1533cec7" path="/var/lib/kubelet/pods/a936b898-5163-4c5e-ac30-64af1533cec7/volumes" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.742763 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37dd5983-0d4d-4097-8657-f408e9bc68c0-logs" (OuterVolumeSpecName: "logs") pod "37dd5983-0d4d-4097-8657-f408e9bc68c0" (UID: "37dd5983-0d4d-4097-8657-f408e9bc68c0"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.743287 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db6903f7-f211-44ad-a2d6-cc7b92c1c477" path="/var/lib/kubelet/pods/db6903f7-f211-44ad-a2d6-cc7b92c1c477/volumes" Oct 07 12:46:06 crc kubenswrapper[4854]: E1007 12:46:06.747046 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b33111a1b0f14dc72a4fc34de5d78d44cbf9938a6e3632fa063f8bb7985aeaf3" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.755098 4854 scope.go:117] "RemoveContainer" containerID="0775213d263bbb54ab38859826cdc168c18bef5c43a091d586b83ae90efc35b8" Oct 07 12:46:06 crc kubenswrapper[4854]: E1007 12:46:06.755306 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b33111a1b0f14dc72a4fc34de5d78d44cbf9938a6e3632fa063f8bb7985aeaf3" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 07 12:46:06 crc kubenswrapper[4854]: E1007 12:46:06.757642 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b33111a1b0f14dc72a4fc34de5d78d44cbf9938a6e3632fa063f8bb7985aeaf3" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 07 12:46:06 crc kubenswrapper[4854]: E1007 12:46:06.757693 4854 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="016c2264-9ba4-48c0-b416-02c468232b6b" containerName="nova-scheduler-scheduler" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.767038 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37dd5983-0d4d-4097-8657-f408e9bc68c0-kube-api-access-lg69z" (OuterVolumeSpecName: "kube-api-access-lg69z") pod "37dd5983-0d4d-4097-8657-f408e9bc68c0" (UID: "37dd5983-0d4d-4097-8657-f408e9bc68c0"). InnerVolumeSpecName "kube-api-access-lg69z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.767095 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a03c4a0d-6346-43e4-8db1-f653b5dfa420-kube-api-access-vwgrc" (OuterVolumeSpecName: "kube-api-access-vwgrc") pod "a03c4a0d-6346-43e4-8db1-f653b5dfa420" (UID: "a03c4a0d-6346-43e4-8db1-f653b5dfa420"). InnerVolumeSpecName "kube-api-access-vwgrc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.768368 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13453d02-4f55-45a7-98be-1cd41c741a3e-scripts" (OuterVolumeSpecName: "scripts") pod "13453d02-4f55-45a7-98be-1cd41c741a3e" (UID: "13453d02-4f55-45a7-98be-1cd41c741a3e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.769745 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "13453d02-4f55-45a7-98be-1cd41c741a3e" (UID: "13453d02-4f55-45a7-98be-1cd41c741a3e"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.780553 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6002f7a4-27d6-4554-a486-87926ebcf57e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6002f7a4-27d6-4554-a486-87926ebcf57e" (UID: "6002f7a4-27d6-4554-a486-87926ebcf57e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.785723 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66f80399-ed98-4aba-9db5-759ad2e314fa-kube-api-access-wzz66" (OuterVolumeSpecName: "kube-api-access-wzz66") pod "66f80399-ed98-4aba-9db5-759ad2e314fa" (UID: "66f80399-ed98-4aba-9db5-759ad2e314fa"). InnerVolumeSpecName "kube-api-access-wzz66". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.810994 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a03c4a0d-6346-43e4-8db1-f653b5dfa420-scripts" (OuterVolumeSpecName: "scripts") pod "a03c4a0d-6346-43e4-8db1-f653b5dfa420" (UID: "a03c4a0d-6346-43e4-8db1-f653b5dfa420"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.817757 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e309be64-7a5a-4156-89a6-d1201eaaff63-kube-api-access-td7lc" (OuterVolumeSpecName: "kube-api-access-td7lc") pod "e309be64-7a5a-4156-89a6-d1201eaaff63" (UID: "e309be64-7a5a-4156-89a6-d1201eaaff63"). InnerVolumeSpecName "kube-api-access-td7lc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.820234 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13453d02-4f55-45a7-98be-1cd41c741a3e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "13453d02-4f55-45a7-98be-1cd41c741a3e" (UID: "13453d02-4f55-45a7-98be-1cd41c741a3e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.834561 4854 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.834610 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sw9xk\" (UniqueName: \"kubernetes.io/projected/13453d02-4f55-45a7-98be-1cd41c741a3e-kube-api-access-sw9xk\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.834624 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwgrc\" (UniqueName: \"kubernetes.io/projected/a03c4a0d-6346-43e4-8db1-f653b5dfa420-kube-api-access-vwgrc\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.834636 4854 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a03c4a0d-6346-43e4-8db1-f653b5dfa420-logs\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.834651 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lg69z\" (UniqueName: \"kubernetes.io/projected/37dd5983-0d4d-4097-8657-f408e9bc68c0-kube-api-access-lg69z\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.834665 4854 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6002f7a4-27d6-4554-a486-87926ebcf57e-logs\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.834676 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13453d02-4f55-45a7-98be-1cd41c741a3e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.834688 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2jwc\" (UniqueName: \"kubernetes.io/projected/6002f7a4-27d6-4554-a486-87926ebcf57e-kube-api-access-q2jwc\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.834699 4854 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13453d02-4f55-45a7-98be-1cd41c741a3e-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.834709 4854 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37dd5983-0d4d-4097-8657-f408e9bc68c0-logs\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.834720 4854 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/13453d02-4f55-45a7-98be-1cd41c741a3e-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.834733 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzz66\" (UniqueName: \"kubernetes.io/projected/66f80399-ed98-4aba-9db5-759ad2e314fa-kube-api-access-wzz66\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.834744 4854 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6002f7a4-27d6-4554-a486-87926ebcf57e-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:06 crc 
kubenswrapper[4854]: I1007 12:46:06.834755 4854 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13453d02-4f55-45a7-98be-1cd41c741a3e-logs\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.834765 4854 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a03c4a0d-6346-43e4-8db1-f653b5dfa420-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.834776 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-td7lc\" (UniqueName: \"kubernetes.io/projected/e309be64-7a5a-4156-89a6-d1201eaaff63-kube-api-access-td7lc\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:06 crc kubenswrapper[4854]: E1007 12:46:06.834632 4854 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 07 12:46:06 crc kubenswrapper[4854]: E1007 12:46:06.834857 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-config-data podName:4c293f13-b2a5-4d4b-9f69-fd118e34eab2 nodeName:}" failed. No retries permitted until 2025-10-07 12:46:14.834836865 +0000 UTC m=+1290.822669120 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-config-data") pod "rabbitmq-server-0" (UID: "4c293f13-b2a5-4d4b-9f69-fd118e34eab2") : configmap "rabbitmq-config-data" not found Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.837874 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e309be64-7a5a-4156-89a6-d1201eaaff63-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "e309be64-7a5a-4156-89a6-d1201eaaff63" (UID: "e309be64-7a5a-4156-89a6-d1201eaaff63"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.865689 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37dd5983-0d4d-4097-8657-f408e9bc68c0-config-data" (OuterVolumeSpecName: "config-data") pod "37dd5983-0d4d-4097-8657-f408e9bc68c0" (UID: "37dd5983-0d4d-4097-8657-f408e9bc68c0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.876894 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6002f7a4-27d6-4554-a486-87926ebcf57e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6002f7a4-27d6-4554-a486-87926ebcf57e" (UID: "6002f7a4-27d6-4554-a486-87926ebcf57e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.884806 4854 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.894953 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e309be64-7a5a-4156-89a6-d1201eaaff63-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e309be64-7a5a-4156-89a6-d1201eaaff63" (UID: "e309be64-7a5a-4156-89a6-d1201eaaff63"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.914116 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a03c4a0d-6346-43e4-8db1-f653b5dfa420-config-data" (OuterVolumeSpecName: "config-data") pod "a03c4a0d-6346-43e4-8db1-f653b5dfa420" (UID: "a03c4a0d-6346-43e4-8db1-f653b5dfa420"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.927892 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a03c4a0d-6346-43e4-8db1-f653b5dfa420-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a03c4a0d-6346-43e4-8db1-f653b5dfa420" (UID: "a03c4a0d-6346-43e4-8db1-f653b5dfa420"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.929015 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37dd5983-0d4d-4097-8657-f408e9bc68c0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "37dd5983-0d4d-4097-8657-f408e9bc68c0" (UID: "37dd5983-0d4d-4097-8657-f408e9bc68c0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.936438 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37dd5983-0d4d-4097-8657-f408e9bc68c0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.936467 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a03c4a0d-6346-43e4-8db1-f653b5dfa420-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.936476 4854 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.936485 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e309be64-7a5a-4156-89a6-d1201eaaff63-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.936494 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6002f7a4-27d6-4554-a486-87926ebcf57e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.936502 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37dd5983-0d4d-4097-8657-f408e9bc68c0-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.936510 4854 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/e309be64-7a5a-4156-89a6-d1201eaaff63-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.936520 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a03c4a0d-6346-43e4-8db1-f653b5dfa420-combined-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.938679 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a03c4a0d-6346-43e4-8db1-f653b5dfa420-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a03c4a0d-6346-43e4-8db1-f653b5dfa420" (UID: "a03c4a0d-6346-43e4-8db1-f653b5dfa420"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.938678 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6002f7a4-27d6-4554-a486-87926ebcf57e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6002f7a4-27d6-4554-a486-87926ebcf57e" (UID: "6002f7a4-27d6-4554-a486-87926ebcf57e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.943403 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6002f7a4-27d6-4554-a486-87926ebcf57e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6002f7a4-27d6-4554-a486-87926ebcf57e" (UID: "6002f7a4-27d6-4554-a486-87926ebcf57e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.952805 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37dd5983-0d4d-4097-8657-f408e9bc68c0-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "37dd5983-0d4d-4097-8657-f408e9bc68c0" (UID: "37dd5983-0d4d-4097-8657-f408e9bc68c0"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.972356 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6002f7a4-27d6-4554-a486-87926ebcf57e-config-data" (OuterVolumeSpecName: "config-data") pod "6002f7a4-27d6-4554-a486-87926ebcf57e" (UID: "6002f7a4-27d6-4554-a486-87926ebcf57e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:06 crc kubenswrapper[4854]: I1007 12:46:06.997717 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13453d02-4f55-45a7-98be-1cd41c741a3e-config-data" (OuterVolumeSpecName: "config-data") pod "13453d02-4f55-45a7-98be-1cd41c741a3e" (UID: "13453d02-4f55-45a7-98be-1cd41c741a3e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.005379 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e309be64-7a5a-4156-89a6-d1201eaaff63-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "e309be64-7a5a-4156-89a6-d1201eaaff63" (UID: "e309be64-7a5a-4156-89a6-d1201eaaff63"). InnerVolumeSpecName "kube-state-metrics-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.008831 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a03c4a0d-6346-43e4-8db1-f653b5dfa420-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a03c4a0d-6346-43e4-8db1-f653b5dfa420" (UID: "a03c4a0d-6346-43e4-8db1-f653b5dfa420"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.013287 4854 scope.go:117] "RemoveContainer" containerID="66d61a36b5d71d29e4fc833793e7b8e2d512bb92c6cdc2ceee1421b70515026b" Oct 07 12:46:07 crc kubenswrapper[4854]: E1007 12:46:07.013854 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66d61a36b5d71d29e4fc833793e7b8e2d512bb92c6cdc2ceee1421b70515026b\": container with ID starting with 66d61a36b5d71d29e4fc833793e7b8e2d512bb92c6cdc2ceee1421b70515026b not found: ID does not exist" containerID="66d61a36b5d71d29e4fc833793e7b8e2d512bb92c6cdc2ceee1421b70515026b" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.013897 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66d61a36b5d71d29e4fc833793e7b8e2d512bb92c6cdc2ceee1421b70515026b"} err="failed to get container status \"66d61a36b5d71d29e4fc833793e7b8e2d512bb92c6cdc2ceee1421b70515026b\": rpc error: code = NotFound desc = could not find container \"66d61a36b5d71d29e4fc833793e7b8e2d512bb92c6cdc2ceee1421b70515026b\": container with ID starting with 66d61a36b5d71d29e4fc833793e7b8e2d512bb92c6cdc2ceee1421b70515026b not found: ID does not exist" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.013927 4854 scope.go:117] "RemoveContainer" containerID="0775213d263bbb54ab38859826cdc168c18bef5c43a091d586b83ae90efc35b8" Oct 07 12:46:07 crc kubenswrapper[4854]: E1007 12:46:07.014372 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0775213d263bbb54ab38859826cdc168c18bef5c43a091d586b83ae90efc35b8\": container with ID starting with 0775213d263bbb54ab38859826cdc168c18bef5c43a091d586b83ae90efc35b8 not found: ID does not exist" containerID="0775213d263bbb54ab38859826cdc168c18bef5c43a091d586b83ae90efc35b8" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.014394 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0775213d263bbb54ab38859826cdc168c18bef5c43a091d586b83ae90efc35b8"} err="failed to get container status \"0775213d263bbb54ab38859826cdc168c18bef5c43a091d586b83ae90efc35b8\": rpc error: code = NotFound desc = could not find container \"0775213d263bbb54ab38859826cdc168c18bef5c43a091d586b83ae90efc35b8\": container with ID starting with 0775213d263bbb54ab38859826cdc168c18bef5c43a091d586b83ae90efc35b8 not found: ID does not exist" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.014413 4854 scope.go:117] "RemoveContainer" containerID="16fe06c197602fe7fb28e9bd7f56721c49f40f87574fc4f159cb1d7c6685606e" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.037762 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13453d02-4f55-45a7-98be-1cd41c741a3e-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.037799 4854 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/e309be64-7a5a-4156-89a6-d1201eaaff63-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.037810 4854 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/37dd5983-0d4d-4097-8657-f408e9bc68c0-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:07 crc 
kubenswrapper[4854]: I1007 12:46:07.037819 4854 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6002f7a4-27d6-4554-a486-87926ebcf57e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.037828 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6002f7a4-27d6-4554-a486-87926ebcf57e-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.037836 4854 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a03c4a0d-6346-43e4-8db1-f653b5dfa420-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.037844 4854 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6002f7a4-27d6-4554-a486-87926ebcf57e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.037852 4854 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a03c4a0d-6346-43e4-8db1-f653b5dfa420-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.070028 4854 scope.go:117] "RemoveContainer" containerID="6cbd3121741a98fa970c3ecfd5f8c014c006ce0662866c5090335c8dec86fdc4" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.070179 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13453d02-4f55-45a7-98be-1cd41c741a3e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "13453d02-4f55-45a7-98be-1cd41c741a3e" (UID: "13453d02-4f55-45a7-98be-1cd41c741a3e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.082068 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37dd5983-0d4d-4097-8657-f408e9bc68c0-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "37dd5983-0d4d-4097-8657-f408e9bc68c0" (UID: "37dd5983-0d4d-4097-8657-f408e9bc68c0"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.090786 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novacell0e568-account-delete-srq52" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.136460 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbicana01a-account-delete-8phrr" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.137724 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutronb378-account-delete-jzbpj" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.137867 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinderc7c1-account-delete-dz57l" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.139919 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9v49\" (UniqueName: \"kubernetes.io/projected/6f22a792-7012-4cfb-9442-20ef086f5532-kube-api-access-v9v49\") pod \"keystone242e-account-delete-s9qn8\" (UID: \"6f22a792-7012-4cfb-9442-20ef086f5532\") " pod="openstack/keystone242e-account-delete-s9qn8" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.140452 4854 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/13453d02-4f55-45a7-98be-1cd41c741a3e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.140732 4854 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/37dd5983-0d4d-4097-8657-f408e9bc68c0-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:07 crc kubenswrapper[4854]: E1007 12:46:07.144922 4854 projected.go:194] Error preparing data for projected volume kube-api-access-v9v49 for pod openstack/keystone242e-account-delete-s9qn8: failed to fetch token: serviceaccounts "galera-openstack" not found Oct 07 12:46:07 crc kubenswrapper[4854]: E1007 12:46:07.144993 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6f22a792-7012-4cfb-9442-20ef086f5532-kube-api-access-v9v49 podName:6f22a792-7012-4cfb-9442-20ef086f5532 nodeName:}" failed. No retries permitted until 2025-10-07 12:46:09.144971961 +0000 UTC m=+1285.132804286 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-v9v49" (UniqueName: "kubernetes.io/projected/6f22a792-7012-4cfb-9442-20ef086f5532-kube-api-access-v9v49") pod "keystone242e-account-delete-s9qn8" (UID: "6f22a792-7012-4cfb-9442-20ef086f5532") : failed to fetch token: serviceaccounts "galera-openstack" not found Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.145709 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.194560 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance3267-account-delete-cwdxm" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.202235 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/novaapi643b-account-delete-92r97" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.243352 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nqgj\" (UniqueName: \"kubernetes.io/projected/c2cebadb-2142-477a-85b3-53e7c73fa6cc-kube-api-access-9nqgj\") pod \"c2cebadb-2142-477a-85b3-53e7c73fa6cc\" (UID: \"c2cebadb-2142-477a-85b3-53e7c73fa6cc\") " Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.243401 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt7gn\" (UniqueName: \"kubernetes.io/projected/7c47fe71-3b92-4490-8488-d98d0e25519e-kube-api-access-vt7gn\") pod \"7c47fe71-3b92-4490-8488-d98d0e25519e\" (UID: \"7c47fe71-3b92-4490-8488-d98d0e25519e\") " Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.243436 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c2cebadb-2142-477a-85b3-53e7c73fa6cc-kolla-config\") pod \"c2cebadb-2142-477a-85b3-53e7c73fa6cc\" (UID: \"c2cebadb-2142-477a-85b3-53e7c73fa6cc\") " Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.243507 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbj7s\" (UniqueName: \"kubernetes.io/projected/787f934e-4f31-4b00-8cf6-380efd34aaad-kube-api-access-vbj7s\") pod \"787f934e-4f31-4b00-8cf6-380efd34aaad\" (UID: \"787f934e-4f31-4b00-8cf6-380efd34aaad\") " Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.243532 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2cebadb-2142-477a-85b3-53e7c73fa6cc-memcached-tls-certs\") pod \"c2cebadb-2142-477a-85b3-53e7c73fa6cc\" (UID: \"c2cebadb-2142-477a-85b3-53e7c73fa6cc\") " Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.243626 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2cebadb-2142-477a-85b3-53e7c73fa6cc-combined-ca-bundle\") pod \"c2cebadb-2142-477a-85b3-53e7c73fa6cc\" (UID: \"c2cebadb-2142-477a-85b3-53e7c73fa6cc\") " Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.243683 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cskfq\" (UniqueName: \"kubernetes.io/projected/ac535972-fa59-4e7f-818b-345da6937c14-kube-api-access-cskfq\") pod \"ac535972-fa59-4e7f-818b-345da6937c14\" (UID: \"ac535972-fa59-4e7f-818b-345da6937c14\") " Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.243709 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c2cebadb-2142-477a-85b3-53e7c73fa6cc-config-data\") pod \"c2cebadb-2142-477a-85b3-53e7c73fa6cc\" (UID: \"c2cebadb-2142-477a-85b3-53e7c73fa6cc\") " Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.243745 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptt4r\" (UniqueName: \"kubernetes.io/projected/735fa97d-e751-4957-bdae-3ae0b10635d2-kube-api-access-ptt4r\") pod \"735fa97d-e751-4957-bdae-3ae0b10635d2\" (UID: \"735fa97d-e751-4957-bdae-3ae0b10635d2\") " Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.245958 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2cebadb-2142-477a-85b3-53e7c73fa6cc-kolla-config" 
(OuterVolumeSpecName: "kolla-config") pod "c2cebadb-2142-477a-85b3-53e7c73fa6cc" (UID: "c2cebadb-2142-477a-85b3-53e7c73fa6cc"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.245979 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2cebadb-2142-477a-85b3-53e7c73fa6cc-config-data" (OuterVolumeSpecName: "config-data") pod "c2cebadb-2142-477a-85b3-53e7c73fa6cc" (UID: "c2cebadb-2142-477a-85b3-53e7c73fa6cc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.247822 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2cebadb-2142-477a-85b3-53e7c73fa6cc-kube-api-access-9nqgj" (OuterVolumeSpecName: "kube-api-access-9nqgj") pod "c2cebadb-2142-477a-85b3-53e7c73fa6cc" (UID: "c2cebadb-2142-477a-85b3-53e7c73fa6cc"). InnerVolumeSpecName "kube-api-access-9nqgj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.249003 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/735fa97d-e751-4957-bdae-3ae0b10635d2-kube-api-access-ptt4r" (OuterVolumeSpecName: "kube-api-access-ptt4r") pod "735fa97d-e751-4957-bdae-3ae0b10635d2" (UID: "735fa97d-e751-4957-bdae-3ae0b10635d2"). InnerVolumeSpecName "kube-api-access-ptt4r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.249739 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c47fe71-3b92-4490-8488-d98d0e25519e-kube-api-access-vt7gn" (OuterVolumeSpecName: "kube-api-access-vt7gn") pod "7c47fe71-3b92-4490-8488-d98d0e25519e" (UID: "7c47fe71-3b92-4490-8488-d98d0e25519e"). InnerVolumeSpecName "kube-api-access-vt7gn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.250277 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/787f934e-4f31-4b00-8cf6-380efd34aaad-kube-api-access-vbj7s" (OuterVolumeSpecName: "kube-api-access-vbj7s") pod "787f934e-4f31-4b00-8cf6-380efd34aaad" (UID: "787f934e-4f31-4b00-8cf6-380efd34aaad"). InnerVolumeSpecName "kube-api-access-vbj7s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.255330 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac535972-fa59-4e7f-818b-345da6937c14-kube-api-access-cskfq" (OuterVolumeSpecName: "kube-api-access-cskfq") pod "ac535972-fa59-4e7f-818b-345da6937c14" (UID: "ac535972-fa59-4e7f-818b-345da6937c14"). InnerVolumeSpecName "kube-api-access-cskfq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.276002 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2cebadb-2142-477a-85b3-53e7c73fa6cc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c2cebadb-2142-477a-85b3-53e7c73fa6cc" (UID: "c2cebadb-2142-477a-85b3-53e7c73fa6cc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.322935 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2cebadb-2142-477a-85b3-53e7c73fa6cc-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "c2cebadb-2142-477a-85b3-53e7c73fa6cc" (UID: "c2cebadb-2142-477a-85b3-53e7c73fa6cc"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.345428 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbwkw\" (UniqueName: \"kubernetes.io/projected/aaa0738a-daf1-479f-9dbd-913806703370-kube-api-access-qbwkw\") pod \"aaa0738a-daf1-479f-9dbd-913806703370\" (UID: \"aaa0738a-daf1-479f-9dbd-913806703370\") " Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.345672 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kd7dp\" (UniqueName: \"kubernetes.io/projected/e7453b38-f6c3-4fe7-b15d-5bd8112dc687-kube-api-access-kd7dp\") pod \"e7453b38-f6c3-4fe7-b15d-5bd8112dc687\" (UID: \"e7453b38-f6c3-4fe7-b15d-5bd8112dc687\") " Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.346366 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cskfq\" (UniqueName: \"kubernetes.io/projected/ac535972-fa59-4e7f-818b-345da6937c14-kube-api-access-cskfq\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.346395 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c2cebadb-2142-477a-85b3-53e7c73fa6cc-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.346420 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptt4r\" (UniqueName: \"kubernetes.io/projected/735fa97d-e751-4957-bdae-3ae0b10635d2-kube-api-access-ptt4r\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.346432 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nqgj\" (UniqueName: \"kubernetes.io/projected/c2cebadb-2142-477a-85b3-53e7c73fa6cc-kube-api-access-9nqgj\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.346444 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt7gn\" (UniqueName: \"kubernetes.io/projected/7c47fe71-3b92-4490-8488-d98d0e25519e-kube-api-access-vt7gn\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.346457 4854 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c2cebadb-2142-477a-85b3-53e7c73fa6cc-kolla-config\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.346468 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbj7s\" (UniqueName: \"kubernetes.io/projected/787f934e-4f31-4b00-8cf6-380efd34aaad-kube-api-access-vbj7s\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.346482 4854 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2cebadb-2142-477a-85b3-53e7c73fa6cc-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.346493 4854 reconciler_common.go:293] "Volume 
detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2cebadb-2142-477a-85b3-53e7c73fa6cc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.351216 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aaa0738a-daf1-479f-9dbd-913806703370-kube-api-access-qbwkw" (OuterVolumeSpecName: "kube-api-access-qbwkw") pod "aaa0738a-daf1-479f-9dbd-913806703370" (UID: "aaa0738a-daf1-479f-9dbd-913806703370"). InnerVolumeSpecName "kube-api-access-qbwkw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.354418 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7453b38-f6c3-4fe7-b15d-5bd8112dc687-kube-api-access-kd7dp" (OuterVolumeSpecName: "kube-api-access-kd7dp") pod "e7453b38-f6c3-4fe7-b15d-5bd8112dc687" (UID: "e7453b38-f6c3-4fe7-b15d-5bd8112dc687"). InnerVolumeSpecName "kube-api-access-kd7dp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.435918 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.436215 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e309be64-7a5a-4156-89a6-d1201eaaff63","Type":"ContainerDied","Data":"26680bebc3fd5e6e9b2b69dbb0e21addd670512464a9a360a906df7833a01d0e"} Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.436316 4854 scope.go:117] "RemoveContainer" containerID="ca038c665c300165c2b08c130cce95306c24d5eee4ccd22ba09577872f53b9c2" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.440617 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance3267-account-delete-cwdxm" event={"ID":"e7453b38-f6c3-4fe7-b15d-5bd8112dc687","Type":"ContainerDied","Data":"ee5287507b8008495245a090482b288e4e0c9225d2dc2b05e38266512b1a68d0"} Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.440749 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance3267-account-delete-cwdxm" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.448000 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutronb378-account-delete-jzbpj" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.448032 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutronb378-account-delete-jzbpj" event={"ID":"ac535972-fa59-4e7f-818b-345da6937c14","Type":"ContainerDied","Data":"5fc8a1cdad97dd2b204027fe3b0edf573d51ce78ed821b165334d96d088c8d5d"} Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.449345 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kd7dp\" (UniqueName: \"kubernetes.io/projected/e7453b38-f6c3-4fe7-b15d-5bd8112dc687-kube-api-access-kd7dp\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.449524 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qbwkw\" (UniqueName: \"kubernetes.io/projected/aaa0738a-daf1-479f-9dbd-913806703370-kube-api-access-qbwkw\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.452577 4854 generic.go:334] "Generic (PLEG): container finished" podID="c2cebadb-2142-477a-85b3-53e7c73fa6cc" containerID="452b12795119ff3315cae84f6ce86ad79fa535175852c78bfc7a106938624b75" exitCode=0 Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.452688 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.452826 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"c2cebadb-2142-477a-85b3-53e7c73fa6cc","Type":"ContainerDied","Data":"452b12795119ff3315cae84f6ce86ad79fa535175852c78bfc7a106938624b75"} Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.452912 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"c2cebadb-2142-477a-85b3-53e7c73fa6cc","Type":"ContainerDied","Data":"e8eeecea5a138704147e9142fc2e5ccab80504d6993b3a52faa6fd5a7ac302c7"} Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.458516 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novaapi643b-account-delete-92r97" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.458595 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapi643b-account-delete-92r97" event={"ID":"aaa0738a-daf1-479f-9dbd-913806703370","Type":"ContainerDied","Data":"85952b1081fd40865652081a84c9d72face986f2075653f2245c849e07426ff2"} Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.461531 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placementd4d8-account-delete-fh2n2" event={"ID":"66f80399-ed98-4aba-9db5-759ad2e314fa","Type":"ContainerDied","Data":"5d555e41a71b72a5185f547f2f1f330886012a2560d5d06568c69bc6395e5b39"} Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.462051 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placementd4d8-account-delete-fh2n2" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.466467 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell0e568-account-delete-srq52" event={"ID":"7c47fe71-3b92-4490-8488-d98d0e25519e","Type":"ContainerDied","Data":"960e8011f3e7e734c34ca5881aea3b73d34e5f8e258e4f210770c8d783d53077"} Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.466692 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/novacell0e568-account-delete-srq52" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.467852 4854 scope.go:117] "RemoveContainer" containerID="88e0e41ebc891694b10be4302198088674ffb09627e1081edead8304fb62bb09" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.472687 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.474686 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinderc7c1-account-delete-dz57l" event={"ID":"787f934e-4f31-4b00-8cf6-380efd34aaad","Type":"ContainerDied","Data":"db5889e0b4fdd9ccdc73334e37e2cb3cc8fa447356ffa435225c965d86e57cfd"} Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.474788 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinderc7c1-account-delete-dz57l" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.478936 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.488411 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7595d98994-smt7c" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.489769 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6f54bfd6b4-g5gq4" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.490621 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbicana01a-account-delete-8phrr" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.490787 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbicana01a-account-delete-8phrr" event={"ID":"735fa97d-e751-4957-bdae-3ae0b10635d2","Type":"ContainerDied","Data":"9208b58630a4edbf80b1d83d52ae8466d7c5b4e422216067d1554368305e830e"} Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.490860 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.491472 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone242e-account-delete-s9qn8" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.491918 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.538117 4854 scope.go:117] "RemoveContainer" containerID="4ab5598f2f1f8e2a8745b5f522f943448e0b454fdb30f12ff064c16195d663a3" Oct 07 12:46:07 crc kubenswrapper[4854]: E1007 12:46:07.551750 4854 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 07 12:46:07 crc kubenswrapper[4854]: E1007 12:46:07.551834 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/79513100-48d2-4e7b-ae14-888322cab8f3-config-data podName:79513100-48d2-4e7b-ae14-888322cab8f3 nodeName:}" failed. No retries permitted until 2025-10-07 12:46:15.551814828 +0000 UTC m=+1291.539647083 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/79513100-48d2-4e7b-ae14-888322cab8f3-config-data") pod "rabbitmq-cell1-server-0" (UID: "79513100-48d2-4e7b-ae14-888322cab8f3") : configmap "rabbitmq-cell1-config-data" not found Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.571486 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone242e-account-delete-s9qn8" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.587268 4854 scope.go:117] "RemoveContainer" containerID="452b12795119ff3315cae84f6ce86ad79fa535175852c78bfc7a106938624b75" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.604761 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutronb378-account-delete-jzbpj"] Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.611398 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutronb378-account-delete-jzbpj"] Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.625294 4854 scope.go:117] "RemoveContainer" containerID="452b12795119ff3315cae84f6ce86ad79fa535175852c78bfc7a106938624b75" Oct 07 12:46:07 crc kubenswrapper[4854]: E1007 12:46:07.625848 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"452b12795119ff3315cae84f6ce86ad79fa535175852c78bfc7a106938624b75\": container with ID starting with 452b12795119ff3315cae84f6ce86ad79fa535175852c78bfc7a106938624b75 not found: ID does not exist" containerID="452b12795119ff3315cae84f6ce86ad79fa535175852c78bfc7a106938624b75" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.625928 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"452b12795119ff3315cae84f6ce86ad79fa535175852c78bfc7a106938624b75"} err="failed to get container status \"452b12795119ff3315cae84f6ce86ad79fa535175852c78bfc7a106938624b75\": rpc error: code = NotFound desc = could not find container \"452b12795119ff3315cae84f6ce86ad79fa535175852c78bfc7a106938624b75\": container with ID starting with 452b12795119ff3315cae84f6ce86ad79fa535175852c78bfc7a106938624b75 not found: ID does not exist" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.626017 4854 scope.go:117] "RemoveContainer" containerID="c2f64bd6a502240eba60aec7a5a06461ffdecfdc04beff785aa7c921c7ef988e" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.650009 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7595d98994-smt7c"] Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.655208 4854 scope.go:117] "RemoveContainer" containerID="75c3ef86d0dbd953431e4fcb366d7ac773027294d708d52f2fae09e3f2500930" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.655672 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-7595d98994-smt7c"] Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.666187 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6f54bfd6b4-g5gq4"] Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.677130 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-6f54bfd6b4-g5gq4"] Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.684792 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell0e568-account-delete-srq52"] Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.696040 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/novacell0e568-account-delete-srq52"] Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.706877 4854 scope.go:117] "RemoveContainer" containerID="97fcf4e9b37412fc9921e0d9fdf26f43fba6f08230c7b20f4625c9b0374aebc2" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.710293 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.721251 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.733902 4854 scope.go:117] "RemoveContainer" containerID="beb49729e015c29539bad7a67c84a635504e19d4bc244ead20fb12cbae9b8e94" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.736333 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance3267-account-delete-cwdxm"] Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.752645 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance3267-account-delete-cwdxm"] Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.770417 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinderc7c1-account-delete-dz57l"] Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.783570 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinderc7c1-account-delete-dz57l"] Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.789426 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapi643b-account-delete-92r97"] Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.796415 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novaapi643b-account-delete-92r97"] Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.796501 4854 scope.go:117] "RemoveContainer" containerID="c2b608d16398a3c2e79505a49f4bc3d6c058f2350032df3d21f4491f85f50701" Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.802722 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbicana01a-account-delete-8phrr"] Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.805495 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbicana01a-account-delete-8phrr"] Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.816172 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.816566 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.822200 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placementd4d8-account-delete-fh2n2"] Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.828733 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placementd4d8-account-delete-fh2n2"] Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.833268 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 12:46:07 crc kubenswrapper[4854]: I1007 12:46:07.838582 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 12:46:07 crc kubenswrapper[4854]: E1007 12:46:07.870848 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="00cf8ccd1bcd63c009deb5be5b2933469ec3d1b0a13db00b6361bb703ec2d7fc" 
cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Oct 07 12:46:07 crc kubenswrapper[4854]: E1007 12:46:07.883062 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="00cf8ccd1bcd63c009deb5be5b2933469ec3d1b0a13db00b6361bb703ec2d7fc" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Oct 07 12:46:07 crc kubenswrapper[4854]: E1007 12:46:07.885571 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="00cf8ccd1bcd63c009deb5be5b2933469ec3d1b0a13db00b6361bb703ec2d7fc" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Oct 07 12:46:07 crc kubenswrapper[4854]: E1007 12:46:07.885611 4854 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="e599e18f-63c0-4756-845c-973257921fd0" containerName="galera" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.118519 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.282034 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-pod-info\") pod \"4c293f13-b2a5-4d4b-9f69-fd118e34eab2\" (UID: \"4c293f13-b2a5-4d4b-9f69-fd118e34eab2\") " Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.282109 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-config-data\") pod \"4c293f13-b2a5-4d4b-9f69-fd118e34eab2\" (UID: \"4c293f13-b2a5-4d4b-9f69-fd118e34eab2\") " Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.282173 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-rabbitmq-tls\") pod \"4c293f13-b2a5-4d4b-9f69-fd118e34eab2\" (UID: \"4c293f13-b2a5-4d4b-9f69-fd118e34eab2\") " Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.282208 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-rabbitmq-plugins\") pod \"4c293f13-b2a5-4d4b-9f69-fd118e34eab2\" (UID: \"4c293f13-b2a5-4d4b-9f69-fd118e34eab2\") " Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.282228 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-plugins-conf\") pod \"4c293f13-b2a5-4d4b-9f69-fd118e34eab2\" (UID: \"4c293f13-b2a5-4d4b-9f69-fd118e34eab2\") " Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.282261 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"4c293f13-b2a5-4d4b-9f69-fd118e34eab2\" (UID: \"4c293f13-b2a5-4d4b-9f69-fd118e34eab2\") " Oct 07 12:46:08 crc 
kubenswrapper[4854]: I1007 12:46:08.282291 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6h59t\" (UniqueName: \"kubernetes.io/projected/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-kube-api-access-6h59t\") pod \"4c293f13-b2a5-4d4b-9f69-fd118e34eab2\" (UID: \"4c293f13-b2a5-4d4b-9f69-fd118e34eab2\") " Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.282319 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-rabbitmq-erlang-cookie\") pod \"4c293f13-b2a5-4d4b-9f69-fd118e34eab2\" (UID: \"4c293f13-b2a5-4d4b-9f69-fd118e34eab2\") " Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.282353 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-rabbitmq-confd\") pod \"4c293f13-b2a5-4d4b-9f69-fd118e34eab2\" (UID: \"4c293f13-b2a5-4d4b-9f69-fd118e34eab2\") " Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.282412 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-server-conf\") pod \"4c293f13-b2a5-4d4b-9f69-fd118e34eab2\" (UID: \"4c293f13-b2a5-4d4b-9f69-fd118e34eab2\") " Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.282432 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-erlang-cookie-secret\") pod \"4c293f13-b2a5-4d4b-9f69-fd118e34eab2\" (UID: \"4c293f13-b2a5-4d4b-9f69-fd118e34eab2\") " Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.285591 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "4c293f13-b2a5-4d4b-9f69-fd118e34eab2" (UID: "4c293f13-b2a5-4d4b-9f69-fd118e34eab2"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.286756 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "4c293f13-b2a5-4d4b-9f69-fd118e34eab2" (UID: "4c293f13-b2a5-4d4b-9f69-fd118e34eab2"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.286997 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "4c293f13-b2a5-4d4b-9f69-fd118e34eab2" (UID: "4c293f13-b2a5-4d4b-9f69-fd118e34eab2"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.287014 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "persistence") pod "4c293f13-b2a5-4d4b-9f69-fd118e34eab2" (UID: "4c293f13-b2a5-4d4b-9f69-fd118e34eab2"). InnerVolumeSpecName "local-storage04-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.287125 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "4c293f13-b2a5-4d4b-9f69-fd118e34eab2" (UID: "4c293f13-b2a5-4d4b-9f69-fd118e34eab2"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.298780 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-kube-api-access-6h59t" (OuterVolumeSpecName: "kube-api-access-6h59t") pod "4c293f13-b2a5-4d4b-9f69-fd118e34eab2" (UID: "4c293f13-b2a5-4d4b-9f69-fd118e34eab2"). InnerVolumeSpecName "kube-api-access-6h59t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.298867 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-pod-info" (OuterVolumeSpecName: "pod-info") pod "4c293f13-b2a5-4d4b-9f69-fd118e34eab2" (UID: "4c293f13-b2a5-4d4b-9f69-fd118e34eab2"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.298887 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "4c293f13-b2a5-4d4b-9f69-fd118e34eab2" (UID: "4c293f13-b2a5-4d4b-9f69-fd118e34eab2"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.321243 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-config-data" (OuterVolumeSpecName: "config-data") pod "4c293f13-b2a5-4d4b-9f69-fd118e34eab2" (UID: "4c293f13-b2a5-4d4b-9f69-fd118e34eab2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.357763 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-server-conf" (OuterVolumeSpecName: "server-conf") pod "4c293f13-b2a5-4d4b-9f69-fd118e34eab2" (UID: "4c293f13-b2a5-4d4b-9f69-fd118e34eab2"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.383917 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.383943 4854 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.383953 4854 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.383962 4854 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.383991 4854 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.384000 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6h59t\" (UniqueName: \"kubernetes.io/projected/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-kube-api-access-6h59t\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.384011 4854 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.384018 4854 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.384028 4854 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-server-conf\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.384036 4854 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-pod-info\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.404536 4854 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.439586 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "4c293f13-b2a5-4d4b-9f69-fd118e34eab2" (UID: "4c293f13-b2a5-4d4b-9f69-fd118e34eab2"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.485798 4854 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.485831 4854 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4c293f13-b2a5-4d4b-9f69-fd118e34eab2-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.500252 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.503461 4854 generic.go:334] "Generic (PLEG): container finished" podID="f725ba88-4d40-4eab-890d-e114448fabe9" containerID="11b5b9529231881fc76b4bbf3d8f51ee97af2d3afad7b47b620ba4ffbcd4d4ec" exitCode=0 Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.503526 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.503544 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f725ba88-4d40-4eab-890d-e114448fabe9","Type":"ContainerDied","Data":"11b5b9529231881fc76b4bbf3d8f51ee97af2d3afad7b47b620ba4ffbcd4d4ec"} Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.503579 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f725ba88-4d40-4eab-890d-e114448fabe9","Type":"ContainerDied","Data":"5edd9c8fe76bd7a06cde58a29ab4392ccc9451823d6be61f1cd64ada2c9e90b0"} Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.503605 4854 scope.go:117] "RemoveContainer" containerID="11b5b9529231881fc76b4bbf3d8f51ee97af2d3afad7b47b620ba4ffbcd4d4ec" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.511330 4854 generic.go:334] "Generic (PLEG): container finished" podID="4c293f13-b2a5-4d4b-9f69-fd118e34eab2" containerID="773b06224b9794e065a5371a7a6969873348f5679927494931f1681bc7245e22" exitCode=0 Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.511505 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4c293f13-b2a5-4d4b-9f69-fd118e34eab2","Type":"ContainerDied","Data":"773b06224b9794e065a5371a7a6969873348f5679927494931f1681bc7245e22"} Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.511525 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4c293f13-b2a5-4d4b-9f69-fd118e34eab2","Type":"ContainerDied","Data":"7c0980786f4f741e8ff31c00d722199c81181f404d93d164f100dc2152c83e6c"} Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.511569 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.517704 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone242e-account-delete-s9qn8" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.574446 4854 scope.go:117] "RemoveContainer" containerID="17b9a36db2be4e7a076fcf2c8b6464f79d7b5f4ecc9432981400b860803f26bc" Oct 07 12:46:08 crc kubenswrapper[4854]: E1007 12:46:08.596439 4854 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Oct 07 12:46:08 crc kubenswrapper[4854]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2025-10-07T12:46:01Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Oct 07 12:46:08 crc kubenswrapper[4854]: /etc/init.d/functions: line 589: 445 Alarm clock "$@" Oct 07 12:46:08 crc kubenswrapper[4854]: > execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-hllqq" message=< Oct 07 12:46:08 crc kubenswrapper[4854]: Exiting ovn-controller (1) [FAILED] Oct 07 12:46:08 crc kubenswrapper[4854]: Killing ovn-controller (1) [ OK ] Oct 07 12:46:08 crc kubenswrapper[4854]: Killing ovn-controller (1) with SIGKILL [ OK ] Oct 07 12:46:08 crc kubenswrapper[4854]: 2025-10-07T12:46:01Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Oct 07 12:46:08 crc kubenswrapper[4854]: /etc/init.d/functions: line 589: 445 Alarm clock "$@" Oct 07 12:46:08 crc kubenswrapper[4854]: > Oct 07 12:46:08 crc kubenswrapper[4854]: E1007 12:46:08.596479 4854 kuberuntime_container.go:691] "PreStop hook failed" err=< Oct 07 12:46:08 crc kubenswrapper[4854]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2025-10-07T12:46:01Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Oct 07 12:46:08 crc kubenswrapper[4854]: /etc/init.d/functions: line 589: 445 Alarm clock "$@" Oct 07 12:46:08 crc kubenswrapper[4854]: > pod="openstack/ovn-controller-hllqq" podUID="6e6702f3-b113-49f9-b85f-a2d294bac6dc" containerName="ovn-controller" containerID="cri-o://26e52344aa056f49042bb8a78b4be03e2be3272b704fe00a7d0d27c06de21f1c" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.596528 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-hllqq" podUID="6e6702f3-b113-49f9-b85f-a2d294bac6dc" containerName="ovn-controller" containerID="cri-o://26e52344aa056f49042bb8a78b4be03e2be3272b704fe00a7d0d27c06de21f1c" gracePeriod=21 Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.641452 4854 scope.go:117] "RemoveContainer" containerID="11b5b9529231881fc76b4bbf3d8f51ee97af2d3afad7b47b620ba4ffbcd4d4ec" Oct 07 12:46:08 crc kubenswrapper[4854]: E1007 12:46:08.660961 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11b5b9529231881fc76b4bbf3d8f51ee97af2d3afad7b47b620ba4ffbcd4d4ec\": container with ID starting with 11b5b9529231881fc76b4bbf3d8f51ee97af2d3afad7b47b620ba4ffbcd4d4ec not found: ID does not exist" containerID="11b5b9529231881fc76b4bbf3d8f51ee97af2d3afad7b47b620ba4ffbcd4d4ec" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.661006 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11b5b9529231881fc76b4bbf3d8f51ee97af2d3afad7b47b620ba4ffbcd4d4ec"} err="failed to get container status \"11b5b9529231881fc76b4bbf3d8f51ee97af2d3afad7b47b620ba4ffbcd4d4ec\": rpc error: code = NotFound desc = could not find container \"11b5b9529231881fc76b4bbf3d8f51ee97af2d3afad7b47b620ba4ffbcd4d4ec\": container with ID starting with 
11b5b9529231881fc76b4bbf3d8f51ee97af2d3afad7b47b620ba4ffbcd4d4ec not found: ID does not exist" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.661030 4854 scope.go:117] "RemoveContainer" containerID="17b9a36db2be4e7a076fcf2c8b6464f79d7b5f4ecc9432981400b860803f26bc" Oct 07 12:46:08 crc kubenswrapper[4854]: E1007 12:46:08.665410 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17b9a36db2be4e7a076fcf2c8b6464f79d7b5f4ecc9432981400b860803f26bc\": container with ID starting with 17b9a36db2be4e7a076fcf2c8b6464f79d7b5f4ecc9432981400b860803f26bc not found: ID does not exist" containerID="17b9a36db2be4e7a076fcf2c8b6464f79d7b5f4ecc9432981400b860803f26bc" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.665449 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17b9a36db2be4e7a076fcf2c8b6464f79d7b5f4ecc9432981400b860803f26bc"} err="failed to get container status \"17b9a36db2be4e7a076fcf2c8b6464f79d7b5f4ecc9432981400b860803f26bc\": rpc error: code = NotFound desc = could not find container \"17b9a36db2be4e7a076fcf2c8b6464f79d7b5f4ecc9432981400b860803f26bc\": container with ID starting with 17b9a36db2be4e7a076fcf2c8b6464f79d7b5f4ecc9432981400b860803f26bc not found: ID does not exist" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.665470 4854 scope.go:117] "RemoveContainer" containerID="773b06224b9794e065a5371a7a6969873348f5679927494931f1681bc7245e22" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.682324 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone242e-account-delete-s9qn8"] Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.688396 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f725ba88-4d40-4eab-890d-e114448fabe9-config-data-generated\") pod \"f725ba88-4d40-4eab-890d-e114448fabe9\" (UID: \"f725ba88-4d40-4eab-890d-e114448fabe9\") " Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.688499 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f725ba88-4d40-4eab-890d-e114448fabe9-combined-ca-bundle\") pod \"f725ba88-4d40-4eab-890d-e114448fabe9\" (UID: \"f725ba88-4d40-4eab-890d-e114448fabe9\") " Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.688581 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bfsl\" (UniqueName: \"kubernetes.io/projected/f725ba88-4d40-4eab-890d-e114448fabe9-kube-api-access-6bfsl\") pod \"f725ba88-4d40-4eab-890d-e114448fabe9\" (UID: \"f725ba88-4d40-4eab-890d-e114448fabe9\") " Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.688615 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f725ba88-4d40-4eab-890d-e114448fabe9-galera-tls-certs\") pod \"f725ba88-4d40-4eab-890d-e114448fabe9\" (UID: \"f725ba88-4d40-4eab-890d-e114448fabe9\") " Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.688634 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"f725ba88-4d40-4eab-890d-e114448fabe9\" (UID: \"f725ba88-4d40-4eab-890d-e114448fabe9\") " Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.688685 4854 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f725ba88-4d40-4eab-890d-e114448fabe9-kolla-config\") pod \"f725ba88-4d40-4eab-890d-e114448fabe9\" (UID: \"f725ba88-4d40-4eab-890d-e114448fabe9\") " Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.688777 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f725ba88-4d40-4eab-890d-e114448fabe9-config-data-default\") pod \"f725ba88-4d40-4eab-890d-e114448fabe9\" (UID: \"f725ba88-4d40-4eab-890d-e114448fabe9\") " Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.688808 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f725ba88-4d40-4eab-890d-e114448fabe9-operator-scripts\") pod \"f725ba88-4d40-4eab-890d-e114448fabe9\" (UID: \"f725ba88-4d40-4eab-890d-e114448fabe9\") " Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.688871 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/f725ba88-4d40-4eab-890d-e114448fabe9-secrets\") pod \"f725ba88-4d40-4eab-890d-e114448fabe9\" (UID: \"f725ba88-4d40-4eab-890d-e114448fabe9\") " Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.689432 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f725ba88-4d40-4eab-890d-e114448fabe9-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "f725ba88-4d40-4eab-890d-e114448fabe9" (UID: "f725ba88-4d40-4eab-890d-e114448fabe9"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.689498 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f725ba88-4d40-4eab-890d-e114448fabe9-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "f725ba88-4d40-4eab-890d-e114448fabe9" (UID: "f725ba88-4d40-4eab-890d-e114448fabe9"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.689628 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f725ba88-4d40-4eab-890d-e114448fabe9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f725ba88-4d40-4eab-890d-e114448fabe9" (UID: "f725ba88-4d40-4eab-890d-e114448fabe9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.689836 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f725ba88-4d40-4eab-890d-e114448fabe9-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "f725ba88-4d40-4eab-890d-e114448fabe9" (UID: "f725ba88-4d40-4eab-890d-e114448fabe9"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.692369 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f725ba88-4d40-4eab-890d-e114448fabe9-kube-api-access-6bfsl" (OuterVolumeSpecName: "kube-api-access-6bfsl") pod "f725ba88-4d40-4eab-890d-e114448fabe9" (UID: "f725ba88-4d40-4eab-890d-e114448fabe9"). InnerVolumeSpecName "kube-api-access-6bfsl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.692977 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone242e-account-delete-s9qn8"] Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.693437 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f725ba88-4d40-4eab-890d-e114448fabe9-secrets" (OuterVolumeSpecName: "secrets") pod "f725ba88-4d40-4eab-890d-e114448fabe9" (UID: "f725ba88-4d40-4eab-890d-e114448fabe9"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.721346 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13453d02-4f55-45a7-98be-1cd41c741a3e" path="/var/lib/kubelet/pods/13453d02-4f55-45a7-98be-1cd41c741a3e/volumes" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.722597 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37dd5983-0d4d-4097-8657-f408e9bc68c0" path="/var/lib/kubelet/pods/37dd5983-0d4d-4097-8657-f408e9bc68c0/volumes" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.724199 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6002f7a4-27d6-4554-a486-87926ebcf57e" path="/var/lib/kubelet/pods/6002f7a4-27d6-4554-a486-87926ebcf57e/volumes" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.725006 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66f80399-ed98-4aba-9db5-759ad2e314fa" path="/var/lib/kubelet/pods/66f80399-ed98-4aba-9db5-759ad2e314fa/volumes" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.725532 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f22a792-7012-4cfb-9442-20ef086f5532" path="/var/lib/kubelet/pods/6f22a792-7012-4cfb-9442-20ef086f5532/volumes" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.726368 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="735fa97d-e751-4957-bdae-3ae0b10635d2" path="/var/lib/kubelet/pods/735fa97d-e751-4957-bdae-3ae0b10635d2/volumes" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.727810 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="787f934e-4f31-4b00-8cf6-380efd34aaad" path="/var/lib/kubelet/pods/787f934e-4f31-4b00-8cf6-380efd34aaad/volumes" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.728455 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c47fe71-3b92-4490-8488-d98d0e25519e" path="/var/lib/kubelet/pods/7c47fe71-3b92-4490-8488-d98d0e25519e/volumes" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.729077 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a03c4a0d-6346-43e4-8db1-f653b5dfa420" path="/var/lib/kubelet/pods/a03c4a0d-6346-43e4-8db1-f653b5dfa420/volumes" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.731240 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aaa0738a-daf1-479f-9dbd-913806703370" path="/var/lib/kubelet/pods/aaa0738a-daf1-479f-9dbd-913806703370/volumes" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.731587 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f725ba88-4d40-4eab-890d-e114448fabe9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f725ba88-4d40-4eab-890d-e114448fabe9" (UID: "f725ba88-4d40-4eab-890d-e114448fabe9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.731895 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac535972-fa59-4e7f-818b-345da6937c14" path="/var/lib/kubelet/pods/ac535972-fa59-4e7f-818b-345da6937c14/volumes" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.732566 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2cebadb-2142-477a-85b3-53e7c73fa6cc" path="/var/lib/kubelet/pods/c2cebadb-2142-477a-85b3-53e7c73fa6cc/volumes" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.733179 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e309be64-7a5a-4156-89a6-d1201eaaff63" path="/var/lib/kubelet/pods/e309be64-7a5a-4156-89a6-d1201eaaff63/volumes" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.734482 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7453b38-f6c3-4fe7-b15d-5bd8112dc687" path="/var/lib/kubelet/pods/e7453b38-f6c3-4fe7-b15d-5bd8112dc687/volumes" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.747234 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "mysql-db") pod "f725ba88-4d40-4eab-890d-e114448fabe9" (UID: "f725ba88-4d40-4eab-890d-e114448fabe9"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.764278 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f725ba88-4d40-4eab-890d-e114448fabe9-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "f725ba88-4d40-4eab-890d-e114448fabe9" (UID: "f725ba88-4d40-4eab-890d-e114448fabe9"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.790843 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6bfsl\" (UniqueName: \"kubernetes.io/projected/f725ba88-4d40-4eab-890d-e114448fabe9-kube-api-access-6bfsl\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.790876 4854 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f725ba88-4d40-4eab-890d-e114448fabe9-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.790906 4854 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.790916 4854 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f725ba88-4d40-4eab-890d-e114448fabe9-kolla-config\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.790925 4854 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f725ba88-4d40-4eab-890d-e114448fabe9-config-data-default\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.790933 4854 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f725ba88-4d40-4eab-890d-e114448fabe9-operator-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.790940 4854 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/f725ba88-4d40-4eab-890d-e114448fabe9-secrets\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.790949 4854 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f725ba88-4d40-4eab-890d-e114448fabe9-config-data-generated\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.790957 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9v49\" (UniqueName: \"kubernetes.io/projected/6f22a792-7012-4cfb-9442-20ef086f5532-kube-api-access-v9v49\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.790965 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f725ba88-4d40-4eab-890d-e114448fabe9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.811805 4854 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.821368 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.821526 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.838934 4854 scope.go:117] "RemoveContainer" containerID="e014726a2ceb29d439118ca6c1761fd0aaf31c3c217a1281253c7e737429cda0" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.842381 4854 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.861748 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 07 12:46:08 crc kubenswrapper[4854]: E1007 12:46:08.871086 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="89f0eb49b67715625593c9e8be3b27b5e68c8ad630cb00e0f86cd890329ac748" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 07 12:46:08 crc kubenswrapper[4854]: E1007 12:46:08.887461 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="89f0eb49b67715625593c9e8be3b27b5e68c8ad630cb00e0f86cd890329ac748" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 07 12:46:08 crc kubenswrapper[4854]: E1007 12:46:08.889743 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="89f0eb49b67715625593c9e8be3b27b5e68c8ad630cb00e0f86cd890329ac748" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 07 12:46:08 crc kubenswrapper[4854]: E1007 12:46:08.889783 4854 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="698aae03-92da-4cc2-a9d2-ecdb5f143439" containerName="ovn-northd" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.892030 4854 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.942917 4854 scope.go:117] "RemoveContainer" containerID="773b06224b9794e065a5371a7a6969873348f5679927494931f1681bc7245e22" Oct 07 12:46:08 crc kubenswrapper[4854]: E1007 12:46:08.943527 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"773b06224b9794e065a5371a7a6969873348f5679927494931f1681bc7245e22\": container with ID starting with 773b06224b9794e065a5371a7a6969873348f5679927494931f1681bc7245e22 not found: ID does not exist" containerID="773b06224b9794e065a5371a7a6969873348f5679927494931f1681bc7245e22" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.943567 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"773b06224b9794e065a5371a7a6969873348f5679927494931f1681bc7245e22"} err="failed to get container status \"773b06224b9794e065a5371a7a6969873348f5679927494931f1681bc7245e22\": rpc error: code = NotFound desc = could not find container \"773b06224b9794e065a5371a7a6969873348f5679927494931f1681bc7245e22\": container with ID starting with 773b06224b9794e065a5371a7a6969873348f5679927494931f1681bc7245e22 not found: ID does not exist" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.943594 4854 scope.go:117] "RemoveContainer" containerID="e014726a2ceb29d439118ca6c1761fd0aaf31c3c217a1281253c7e737429cda0" Oct 07 12:46:08 crc kubenswrapper[4854]: E1007 12:46:08.944500 4854 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e014726a2ceb29d439118ca6c1761fd0aaf31c3c217a1281253c7e737429cda0\": container with ID starting with e014726a2ceb29d439118ca6c1761fd0aaf31c3c217a1281253c7e737429cda0 not found: ID does not exist" containerID="e014726a2ceb29d439118ca6c1761fd0aaf31c3c217a1281253c7e737429cda0" Oct 07 12:46:08 crc kubenswrapper[4854]: I1007 12:46:08.944557 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e014726a2ceb29d439118ca6c1761fd0aaf31c3c217a1281253c7e737429cda0"} err="failed to get container status \"e014726a2ceb29d439118ca6c1761fd0aaf31c3c217a1281253c7e737429cda0\": rpc error: code = NotFound desc = could not find container \"e014726a2ceb29d439118ca6c1761fd0aaf31c3c217a1281253c7e737429cda0\": container with ID starting with e014726a2ceb29d439118ca6c1761fd0aaf31c3c217a1281253c7e737429cda0 not found: ID does not exist" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.128271 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-hllqq_6e6702f3-b113-49f9-b85f-a2d294bac6dc/ovn-controller/0.log" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.128630 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hllqq" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.300486 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e6702f3-b113-49f9-b85f-a2d294bac6dc-ovn-controller-tls-certs\") pod \"6e6702f3-b113-49f9-b85f-a2d294bac6dc\" (UID: \"6e6702f3-b113-49f9-b85f-a2d294bac6dc\") " Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.300546 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6e6702f3-b113-49f9-b85f-a2d294bac6dc-var-run\") pod \"6e6702f3-b113-49f9-b85f-a2d294bac6dc\" (UID: \"6e6702f3-b113-49f9-b85f-a2d294bac6dc\") " Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.300596 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7h566\" (UniqueName: \"kubernetes.io/projected/6e6702f3-b113-49f9-b85f-a2d294bac6dc-kube-api-access-7h566\") pod \"6e6702f3-b113-49f9-b85f-a2d294bac6dc\" (UID: \"6e6702f3-b113-49f9-b85f-a2d294bac6dc\") " Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.300859 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e6702f3-b113-49f9-b85f-a2d294bac6dc-combined-ca-bundle\") pod \"6e6702f3-b113-49f9-b85f-a2d294bac6dc\" (UID: \"6e6702f3-b113-49f9-b85f-a2d294bac6dc\") " Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.300883 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6e6702f3-b113-49f9-b85f-a2d294bac6dc-scripts\") pod \"6e6702f3-b113-49f9-b85f-a2d294bac6dc\" (UID: \"6e6702f3-b113-49f9-b85f-a2d294bac6dc\") " Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.300905 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6e6702f3-b113-49f9-b85f-a2d294bac6dc-var-log-ovn\") pod \"6e6702f3-b113-49f9-b85f-a2d294bac6dc\" (UID: \"6e6702f3-b113-49f9-b85f-a2d294bac6dc\") " Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 
12:46:09.300940 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6e6702f3-b113-49f9-b85f-a2d294bac6dc-var-run-ovn\") pod \"6e6702f3-b113-49f9-b85f-a2d294bac6dc\" (UID: \"6e6702f3-b113-49f9-b85f-a2d294bac6dc\") " Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.301321 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6e6702f3-b113-49f9-b85f-a2d294bac6dc-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "6e6702f3-b113-49f9-b85f-a2d294bac6dc" (UID: "6e6702f3-b113-49f9-b85f-a2d294bac6dc"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.315645 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6e6702f3-b113-49f9-b85f-a2d294bac6dc-var-run" (OuterVolumeSpecName: "var-run") pod "6e6702f3-b113-49f9-b85f-a2d294bac6dc" (UID: "6e6702f3-b113-49f9-b85f-a2d294bac6dc"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.315950 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6e6702f3-b113-49f9-b85f-a2d294bac6dc-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "6e6702f3-b113-49f9-b85f-a2d294bac6dc" (UID: "6e6702f3-b113-49f9-b85f-a2d294bac6dc"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.317002 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e6702f3-b113-49f9-b85f-a2d294bac6dc-scripts" (OuterVolumeSpecName: "scripts") pod "6e6702f3-b113-49f9-b85f-a2d294bac6dc" (UID: "6e6702f3-b113-49f9-b85f-a2d294bac6dc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.363527 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e6702f3-b113-49f9-b85f-a2d294bac6dc-kube-api-access-7h566" (OuterVolumeSpecName: "kube-api-access-7h566") pod "6e6702f3-b113-49f9-b85f-a2d294bac6dc" (UID: "6e6702f3-b113-49f9-b85f-a2d294bac6dc"). InnerVolumeSpecName "kube-api-access-7h566". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.391477 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e6702f3-b113-49f9-b85f-a2d294bac6dc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6e6702f3-b113-49f9-b85f-a2d294bac6dc" (UID: "6e6702f3-b113-49f9-b85f-a2d294bac6dc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.402949 4854 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6e6702f3-b113-49f9-b85f-a2d294bac6dc-var-run\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.402989 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7h566\" (UniqueName: \"kubernetes.io/projected/6e6702f3-b113-49f9-b85f-a2d294bac6dc-kube-api-access-7h566\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.403003 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e6702f3-b113-49f9-b85f-a2d294bac6dc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.403013 4854 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6e6702f3-b113-49f9-b85f-a2d294bac6dc-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.403024 4854 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6e6702f3-b113-49f9-b85f-a2d294bac6dc-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.403035 4854 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6e6702f3-b113-49f9-b85f-a2d294bac6dc-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.458319 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e6702f3-b113-49f9-b85f-a2d294bac6dc-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "6e6702f3-b113-49f9-b85f-a2d294bac6dc" (UID: "6e6702f3-b113-49f9-b85f-a2d294bac6dc"). InnerVolumeSpecName "ovn-controller-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.504786 4854 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e6702f3-b113-49f9-b85f-a2d294bac6dc-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.537748 4854 generic.go:334] "Generic (PLEG): container finished" podID="77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e" containerID="0b7c87e468f2b8ba32d54b131d0c6194d6f52f79c88bd597eeb0b8875705a1d6" exitCode=0 Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.538063 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7b8cbbf7f5-8gzrj" event={"ID":"77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e","Type":"ContainerDied","Data":"0b7c87e468f2b8ba32d54b131d0c6194d6f52f79c88bd597eeb0b8875705a1d6"} Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.538090 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7b8cbbf7f5-8gzrj" event={"ID":"77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e","Type":"ContainerDied","Data":"62b231c9792c08f3532e265cd8d38d6096ed1f5c4336b1ec0de88c85e3d1a9a5"} Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.538103 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62b231c9792c08f3532e265cd8d38d6096ed1f5c4336b1ec0de88c85e3d1a9a5" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.541735 4854 generic.go:334] "Generic (PLEG): container finished" podID="9869a8d4-db8e-4aba-82d1-6d02c3cf988e" containerID="d16e3dc32eb57dea6b6b77bcedd6210c8586d9780a519f65e63564f3af418dfd" exitCode=0 Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.541791 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-775676c768-frcfp" event={"ID":"9869a8d4-db8e-4aba-82d1-6d02c3cf988e","Type":"ContainerDied","Data":"d16e3dc32eb57dea6b6b77bcedd6210c8586d9780a519f65e63564f3af418dfd"} Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.541811 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-775676c768-frcfp" event={"ID":"9869a8d4-db8e-4aba-82d1-6d02c3cf988e","Type":"ContainerDied","Data":"f09c0ef913c08c234c9f9000f73fe2841754c0ba4b940bdb749ba13056666d4a"} Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.541822 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f09c0ef913c08c234c9f9000f73fe2841754c0ba4b940bdb749ba13056666d4a" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.544313 4854 generic.go:334] "Generic (PLEG): container finished" podID="016c2264-9ba4-48c0-b416-02c468232b6b" containerID="b33111a1b0f14dc72a4fc34de5d78d44cbf9938a6e3632fa063f8bb7985aeaf3" exitCode=0 Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.544376 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"016c2264-9ba4-48c0-b416-02c468232b6b","Type":"ContainerDied","Data":"b33111a1b0f14dc72a4fc34de5d78d44cbf9938a6e3632fa063f8bb7985aeaf3"} Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.544399 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"016c2264-9ba4-48c0-b416-02c468232b6b","Type":"ContainerDied","Data":"e6c6a2e60341c34ca76475fd4d69991f224cb18e5de374d907f485fb628b47a0"} Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.544413 4854 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="e6c6a2e60341c34ca76475fd4d69991f224cb18e5de374d907f485fb628b47a0" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.550120 4854 generic.go:334] "Generic (PLEG): container finished" podID="79513100-48d2-4e7b-ae14-888322cab8f3" containerID="6b0f9e765f42e0528d289da2b61440b957c5df88bdf61759d5eef1939df5e18b" exitCode=0 Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.550187 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"79513100-48d2-4e7b-ae14-888322cab8f3","Type":"ContainerDied","Data":"6b0f9e765f42e0528d289da2b61440b957c5df88bdf61759d5eef1939df5e18b"} Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.550248 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"79513100-48d2-4e7b-ae14-888322cab8f3","Type":"ContainerDied","Data":"90a6236e4e1a2098e0ad321b015dc2006dca7a42f2f4eb0e997754f1087e31d6"} Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.550266 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90a6236e4e1a2098e0ad321b015dc2006dca7a42f2f4eb0e997754f1087e31d6" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.553100 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_698aae03-92da-4cc2-a9d2-ecdb5f143439/ovn-northd/0.log" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.553141 4854 generic.go:334] "Generic (PLEG): container finished" podID="698aae03-92da-4cc2-a9d2-ecdb5f143439" containerID="89f0eb49b67715625593c9e8be3b27b5e68c8ad630cb00e0f86cd890329ac748" exitCode=139 Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.553253 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"698aae03-92da-4cc2-a9d2-ecdb5f143439","Type":"ContainerDied","Data":"89f0eb49b67715625593c9e8be3b27b5e68c8ad630cb00e0f86cd890329ac748"} Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.554851 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-hllqq_6e6702f3-b113-49f9-b85f-a2d294bac6dc/ovn-controller/0.log" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.554909 4854 generic.go:334] "Generic (PLEG): container finished" podID="6e6702f3-b113-49f9-b85f-a2d294bac6dc" containerID="26e52344aa056f49042bb8a78b4be03e2be3272b704fe00a7d0d27c06de21f1c" exitCode=137 Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.554948 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hllqq" event={"ID":"6e6702f3-b113-49f9-b85f-a2d294bac6dc","Type":"ContainerDied","Data":"26e52344aa056f49042bb8a78b4be03e2be3272b704fe00a7d0d27c06de21f1c"} Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.554967 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hllqq" event={"ID":"6e6702f3-b113-49f9-b85f-a2d294bac6dc","Type":"ContainerDied","Data":"02e5362acc7369cd09a76a5490e1b8a1968c6ee1ced875ad0db909a6d49b1a47"} Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.554985 4854 scope.go:117] "RemoveContainer" containerID="26e52344aa056f49042bb8a78b4be03e2be3272b704fe00a7d0d27c06de21f1c" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.555101 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hllqq" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.601894 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7b8cbbf7f5-8gzrj" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.614514 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.622020 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-hllqq"] Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.623912 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.636385 4854 scope.go:117] "RemoveContainer" containerID="26e52344aa056f49042bb8a78b4be03e2be3272b704fe00a7d0d27c06de21f1c" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.639116 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-hllqq"] Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.641260 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-775676c768-frcfp" Oct 07 12:46:09 crc kubenswrapper[4854]: E1007 12:46:09.642362 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26e52344aa056f49042bb8a78b4be03e2be3272b704fe00a7d0d27c06de21f1c\": container with ID starting with 26e52344aa056f49042bb8a78b4be03e2be3272b704fe00a7d0d27c06de21f1c not found: ID does not exist" containerID="26e52344aa056f49042bb8a78b4be03e2be3272b704fe00a7d0d27c06de21f1c" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.642401 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26e52344aa056f49042bb8a78b4be03e2be3272b704fe00a7d0d27c06de21f1c"} err="failed to get container status \"26e52344aa056f49042bb8a78b4be03e2be3272b704fe00a7d0d27c06de21f1c\": rpc error: code = NotFound desc = could not find container \"26e52344aa056f49042bb8a78b4be03e2be3272b704fe00a7d0d27c06de21f1c\": container with ID starting with 26e52344aa056f49042bb8a78b4be03e2be3272b704fe00a7d0d27c06de21f1c not found: ID does not exist" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.690720 4854 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-667c455579-lnd9l" podUID="2c8167ae-8941-4616-bee2-ff0fb5e98c16" containerName="proxy-server" probeResult="failure" output="Get \"https://10.217.0.165:8080/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.690783 4854 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-667c455579-lnd9l" podUID="2c8167ae-8941-4616-bee2-ff0fb5e98c16" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.165:8080/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.707399 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"79513100-48d2-4e7b-ae14-888322cab8f3\" (UID: \"79513100-48d2-4e7b-ae14-888322cab8f3\") " Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.707512 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/79513100-48d2-4e7b-ae14-888322cab8f3-rabbitmq-tls\") pod \"79513100-48d2-4e7b-ae14-888322cab8f3\" (UID: \"79513100-48d2-4e7b-ae14-888322cab8f3\") " Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.707583 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8hv7\" (UniqueName: \"kubernetes.io/projected/016c2264-9ba4-48c0-b416-02c468232b6b-kube-api-access-c8hv7\") pod \"016c2264-9ba4-48c0-b416-02c468232b6b\" (UID: \"016c2264-9ba4-48c0-b416-02c468232b6b\") " Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.707656 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e-config-data-custom\") pod \"77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e\" (UID: \"77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e\") " Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.707682 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/79513100-48d2-4e7b-ae14-888322cab8f3-rabbitmq-confd\") pod \"79513100-48d2-4e7b-ae14-888322cab8f3\" (UID: \"79513100-48d2-4e7b-ae14-888322cab8f3\") " Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.707745 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2l9x\" (UniqueName: \"kubernetes.io/projected/79513100-48d2-4e7b-ae14-888322cab8f3-kube-api-access-q2l9x\") pod \"79513100-48d2-4e7b-ae14-888322cab8f3\" (UID: \"79513100-48d2-4e7b-ae14-888322cab8f3\") " Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.707817 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e-combined-ca-bundle\") pod \"77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e\" (UID: \"77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e\") " Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.707887 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e-config-data\") pod \"77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e\" (UID: \"77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e\") " Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.708016 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e-logs\") pod \"77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e\" (UID: \"77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e\") " Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.708047 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/79513100-48d2-4e7b-ae14-888322cab8f3-pod-info\") pod \"79513100-48d2-4e7b-ae14-888322cab8f3\" (UID: \"79513100-48d2-4e7b-ae14-888322cab8f3\") " Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.708100 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/79513100-48d2-4e7b-ae14-888322cab8f3-rabbitmq-erlang-cookie\") pod \"79513100-48d2-4e7b-ae14-888322cab8f3\" (UID: \"79513100-48d2-4e7b-ae14-888322cab8f3\") " Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.708134 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g47gl\" 
(UniqueName: \"kubernetes.io/projected/77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e-kube-api-access-g47gl\") pod \"77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e\" (UID: \"77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e\") " Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.708202 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/79513100-48d2-4e7b-ae14-888322cab8f3-rabbitmq-plugins\") pod \"79513100-48d2-4e7b-ae14-888322cab8f3\" (UID: \"79513100-48d2-4e7b-ae14-888322cab8f3\") " Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.708223 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/79513100-48d2-4e7b-ae14-888322cab8f3-erlang-cookie-secret\") pod \"79513100-48d2-4e7b-ae14-888322cab8f3\" (UID: \"79513100-48d2-4e7b-ae14-888322cab8f3\") " Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.708276 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/79513100-48d2-4e7b-ae14-888322cab8f3-config-data\") pod \"79513100-48d2-4e7b-ae14-888322cab8f3\" (UID: \"79513100-48d2-4e7b-ae14-888322cab8f3\") " Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.708306 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/016c2264-9ba4-48c0-b416-02c468232b6b-config-data\") pod \"016c2264-9ba4-48c0-b416-02c468232b6b\" (UID: \"016c2264-9ba4-48c0-b416-02c468232b6b\") " Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.708421 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/016c2264-9ba4-48c0-b416-02c468232b6b-combined-ca-bundle\") pod \"016c2264-9ba4-48c0-b416-02c468232b6b\" (UID: \"016c2264-9ba4-48c0-b416-02c468232b6b\") " Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.708471 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/79513100-48d2-4e7b-ae14-888322cab8f3-server-conf\") pod \"79513100-48d2-4e7b-ae14-888322cab8f3\" (UID: \"79513100-48d2-4e7b-ae14-888322cab8f3\") " Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.708533 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/79513100-48d2-4e7b-ae14-888322cab8f3-plugins-conf\") pod \"79513100-48d2-4e7b-ae14-888322cab8f3\" (UID: \"79513100-48d2-4e7b-ae14-888322cab8f3\") " Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.710500 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79513100-48d2-4e7b-ae14-888322cab8f3-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "79513100-48d2-4e7b-ae14-888322cab8f3" (UID: "79513100-48d2-4e7b-ae14-888322cab8f3"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.711182 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "persistence") pod "79513100-48d2-4e7b-ae14-888322cab8f3" (UID: "79513100-48d2-4e7b-ae14-888322cab8f3"). InnerVolumeSpecName "local-storage05-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.715255 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e-logs" (OuterVolumeSpecName: "logs") pod "77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e" (UID: "77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.720619 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79513100-48d2-4e7b-ae14-888322cab8f3-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "79513100-48d2-4e7b-ae14-888322cab8f3" (UID: "79513100-48d2-4e7b-ae14-888322cab8f3"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.722451 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e-kube-api-access-g47gl" (OuterVolumeSpecName: "kube-api-access-g47gl") pod "77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e" (UID: "77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e"). InnerVolumeSpecName "kube-api-access-g47gl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.724214 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79513100-48d2-4e7b-ae14-888322cab8f3-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "79513100-48d2-4e7b-ae14-888322cab8f3" (UID: "79513100-48d2-4e7b-ae14-888322cab8f3"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.725447 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e" (UID: "77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.727883 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/016c2264-9ba4-48c0-b416-02c468232b6b-kube-api-access-c8hv7" (OuterVolumeSpecName: "kube-api-access-c8hv7") pod "016c2264-9ba4-48c0-b416-02c468232b6b" (UID: "016c2264-9ba4-48c0-b416-02c468232b6b"). InnerVolumeSpecName "kube-api-access-c8hv7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.727972 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79513100-48d2-4e7b-ae14-888322cab8f3-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "79513100-48d2-4e7b-ae14-888322cab8f3" (UID: "79513100-48d2-4e7b-ae14-888322cab8f3"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.735430 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79513100-48d2-4e7b-ae14-888322cab8f3-kube-api-access-q2l9x" (OuterVolumeSpecName: "kube-api-access-q2l9x") pod "79513100-48d2-4e7b-ae14-888322cab8f3" (UID: "79513100-48d2-4e7b-ae14-888322cab8f3"). 
InnerVolumeSpecName "kube-api-access-q2l9x". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.736443 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79513100-48d2-4e7b-ae14-888322cab8f3-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "79513100-48d2-4e7b-ae14-888322cab8f3" (UID: "79513100-48d2-4e7b-ae14-888322cab8f3"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.736735 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_698aae03-92da-4cc2-a9d2-ecdb5f143439/ovn-northd/0.log" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.736818 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.751328 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/79513100-48d2-4e7b-ae14-888322cab8f3-pod-info" (OuterVolumeSpecName: "pod-info") pod "79513100-48d2-4e7b-ae14-888322cab8f3" (UID: "79513100-48d2-4e7b-ae14-888322cab8f3"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.760402 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e" (UID: "77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.779383 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79513100-48d2-4e7b-ae14-888322cab8f3-config-data" (OuterVolumeSpecName: "config-data") pod "79513100-48d2-4e7b-ae14-888322cab8f3" (UID: "79513100-48d2-4e7b-ae14-888322cab8f3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.806347 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/016c2264-9ba4-48c0-b416-02c468232b6b-config-data" (OuterVolumeSpecName: "config-data") pod "016c2264-9ba4-48c0-b416-02c468232b6b" (UID: "016c2264-9ba4-48c0-b416-02c468232b6b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.807119 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79513100-48d2-4e7b-ae14-888322cab8f3-server-conf" (OuterVolumeSpecName: "server-conf") pod "79513100-48d2-4e7b-ae14-888322cab8f3" (UID: "79513100-48d2-4e7b-ae14-888322cab8f3"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.807183 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/016c2264-9ba4-48c0-b416-02c468232b6b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "016c2264-9ba4-48c0-b416-02c468232b6b" (UID: "016c2264-9ba4-48c0-b416-02c468232b6b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.810621 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9869a8d4-db8e-4aba-82d1-6d02c3cf988e-config-data\") pod \"9869a8d4-db8e-4aba-82d1-6d02c3cf988e\" (UID: \"9869a8d4-db8e-4aba-82d1-6d02c3cf988e\") " Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.810735 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9869a8d4-db8e-4aba-82d1-6d02c3cf988e-internal-tls-certs\") pod \"9869a8d4-db8e-4aba-82d1-6d02c3cf988e\" (UID: \"9869a8d4-db8e-4aba-82d1-6d02c3cf988e\") " Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.810767 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9869a8d4-db8e-4aba-82d1-6d02c3cf988e-combined-ca-bundle\") pod \"9869a8d4-db8e-4aba-82d1-6d02c3cf988e\" (UID: \"9869a8d4-db8e-4aba-82d1-6d02c3cf988e\") " Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.810809 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9869a8d4-db8e-4aba-82d1-6d02c3cf988e-public-tls-certs\") pod \"9869a8d4-db8e-4aba-82d1-6d02c3cf988e\" (UID: \"9869a8d4-db8e-4aba-82d1-6d02c3cf988e\") " Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.810878 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9869a8d4-db8e-4aba-82d1-6d02c3cf988e-credential-keys\") pod \"9869a8d4-db8e-4aba-82d1-6d02c3cf988e\" (UID: \"9869a8d4-db8e-4aba-82d1-6d02c3cf988e\") " Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.810983 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9869a8d4-db8e-4aba-82d1-6d02c3cf988e-fernet-keys\") pod \"9869a8d4-db8e-4aba-82d1-6d02c3cf988e\" (UID: \"9869a8d4-db8e-4aba-82d1-6d02c3cf988e\") " Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.811030 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9869a8d4-db8e-4aba-82d1-6d02c3cf988e-scripts\") pod \"9869a8d4-db8e-4aba-82d1-6d02c3cf988e\" (UID: \"9869a8d4-db8e-4aba-82d1-6d02c3cf988e\") " Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.811078 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clfkj\" (UniqueName: \"kubernetes.io/projected/9869a8d4-db8e-4aba-82d1-6d02c3cf988e-kube-api-access-clfkj\") pod \"9869a8d4-db8e-4aba-82d1-6d02c3cf988e\" (UID: \"9869a8d4-db8e-4aba-82d1-6d02c3cf988e\") " Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.811870 4854 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e-logs\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.811893 4854 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/79513100-48d2-4e7b-ae14-888322cab8f3-pod-info\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.811908 4854 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/79513100-48d2-4e7b-ae14-888322cab8f3-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.811923 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g47gl\" (UniqueName: \"kubernetes.io/projected/77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e-kube-api-access-g47gl\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.811936 4854 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/79513100-48d2-4e7b-ae14-888322cab8f3-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.811948 4854 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/79513100-48d2-4e7b-ae14-888322cab8f3-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.811959 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/79513100-48d2-4e7b-ae14-888322cab8f3-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.811971 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/016c2264-9ba4-48c0-b416-02c468232b6b-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.811982 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/016c2264-9ba4-48c0-b416-02c468232b6b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.811992 4854 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/79513100-48d2-4e7b-ae14-888322cab8f3-server-conf\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.812004 4854 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/79513100-48d2-4e7b-ae14-888322cab8f3-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.812029 4854 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.812041 4854 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/79513100-48d2-4e7b-ae14-888322cab8f3-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.812055 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8hv7\" (UniqueName: \"kubernetes.io/projected/016c2264-9ba4-48c0-b416-02c468232b6b-kube-api-access-c8hv7\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.812068 4854 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.812081 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2l9x\" (UniqueName: 
\"kubernetes.io/projected/79513100-48d2-4e7b-ae14-888322cab8f3-kube-api-access-q2l9x\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.812092 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.824408 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9869a8d4-db8e-4aba-82d1-6d02c3cf988e-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "9869a8d4-db8e-4aba-82d1-6d02c3cf988e" (UID: "9869a8d4-db8e-4aba-82d1-6d02c3cf988e"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.826667 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9869a8d4-db8e-4aba-82d1-6d02c3cf988e-kube-api-access-clfkj" (OuterVolumeSpecName: "kube-api-access-clfkj") pod "9869a8d4-db8e-4aba-82d1-6d02c3cf988e" (UID: "9869a8d4-db8e-4aba-82d1-6d02c3cf988e"). InnerVolumeSpecName "kube-api-access-clfkj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.830619 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9869a8d4-db8e-4aba-82d1-6d02c3cf988e-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "9869a8d4-db8e-4aba-82d1-6d02c3cf988e" (UID: "9869a8d4-db8e-4aba-82d1-6d02c3cf988e"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.851662 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9869a8d4-db8e-4aba-82d1-6d02c3cf988e-scripts" (OuterVolumeSpecName: "scripts") pod "9869a8d4-db8e-4aba-82d1-6d02c3cf988e" (UID: "9869a8d4-db8e-4aba-82d1-6d02c3cf988e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.858217 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e-config-data" (OuterVolumeSpecName: "config-data") pod "77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e" (UID: "77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.861690 4854 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.878563 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9869a8d4-db8e-4aba-82d1-6d02c3cf988e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9869a8d4-db8e-4aba-82d1-6d02c3cf988e" (UID: "9869a8d4-db8e-4aba-82d1-6d02c3cf988e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.887021 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9869a8d4-db8e-4aba-82d1-6d02c3cf988e-config-data" (OuterVolumeSpecName: "config-data") pod "9869a8d4-db8e-4aba-82d1-6d02c3cf988e" (UID: "9869a8d4-db8e-4aba-82d1-6d02c3cf988e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.904826 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9869a8d4-db8e-4aba-82d1-6d02c3cf988e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "9869a8d4-db8e-4aba-82d1-6d02c3cf988e" (UID: "9869a8d4-db8e-4aba-82d1-6d02c3cf988e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.907231 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9869a8d4-db8e-4aba-82d1-6d02c3cf988e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "9869a8d4-db8e-4aba-82d1-6d02c3cf988e" (UID: "9869a8d4-db8e-4aba-82d1-6d02c3cf988e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.912901 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/698aae03-92da-4cc2-a9d2-ecdb5f143439-metrics-certs-tls-certs\") pod \"698aae03-92da-4cc2-a9d2-ecdb5f143439\" (UID: \"698aae03-92da-4cc2-a9d2-ecdb5f143439\") " Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.912940 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/698aae03-92da-4cc2-a9d2-ecdb5f143439-combined-ca-bundle\") pod \"698aae03-92da-4cc2-a9d2-ecdb5f143439\" (UID: \"698aae03-92da-4cc2-a9d2-ecdb5f143439\") " Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.913027 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmz8x\" (UniqueName: \"kubernetes.io/projected/698aae03-92da-4cc2-a9d2-ecdb5f143439-kube-api-access-bmz8x\") pod \"698aae03-92da-4cc2-a9d2-ecdb5f143439\" (UID: \"698aae03-92da-4cc2-a9d2-ecdb5f143439\") " Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.913048 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/698aae03-92da-4cc2-a9d2-ecdb5f143439-ovn-northd-tls-certs\") pod \"698aae03-92da-4cc2-a9d2-ecdb5f143439\" (UID: \"698aae03-92da-4cc2-a9d2-ecdb5f143439\") " Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.913074 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/698aae03-92da-4cc2-a9d2-ecdb5f143439-scripts\") pod \"698aae03-92da-4cc2-a9d2-ecdb5f143439\" (UID: \"698aae03-92da-4cc2-a9d2-ecdb5f143439\") " Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.913124 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/698aae03-92da-4cc2-a9d2-ecdb5f143439-ovn-rundir\") pod \"698aae03-92da-4cc2-a9d2-ecdb5f143439\" (UID: \"698aae03-92da-4cc2-a9d2-ecdb5f143439\") " Oct 07 12:46:09 crc 
kubenswrapper[4854]: I1007 12:46:09.913194 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/698aae03-92da-4cc2-a9d2-ecdb5f143439-config\") pod \"698aae03-92da-4cc2-a9d2-ecdb5f143439\" (UID: \"698aae03-92da-4cc2-a9d2-ecdb5f143439\") " Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.913465 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9869a8d4-db8e-4aba-82d1-6d02c3cf988e-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.913483 4854 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9869a8d4-db8e-4aba-82d1-6d02c3cf988e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.913497 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9869a8d4-db8e-4aba-82d1-6d02c3cf988e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.913509 4854 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.913520 4854 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9869a8d4-db8e-4aba-82d1-6d02c3cf988e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.913532 4854 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9869a8d4-db8e-4aba-82d1-6d02c3cf988e-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.913542 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.913552 4854 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9869a8d4-db8e-4aba-82d1-6d02c3cf988e-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.913560 4854 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9869a8d4-db8e-4aba-82d1-6d02c3cf988e-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.913568 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clfkj\" (UniqueName: \"kubernetes.io/projected/9869a8d4-db8e-4aba-82d1-6d02c3cf988e-kube-api-access-clfkj\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.914062 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/698aae03-92da-4cc2-a9d2-ecdb5f143439-config" (OuterVolumeSpecName: "config") pod "698aae03-92da-4cc2-a9d2-ecdb5f143439" (UID: "698aae03-92da-4cc2-a9d2-ecdb5f143439"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.914455 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/698aae03-92da-4cc2-a9d2-ecdb5f143439-scripts" (OuterVolumeSpecName: "scripts") pod "698aae03-92da-4cc2-a9d2-ecdb5f143439" (UID: "698aae03-92da-4cc2-a9d2-ecdb5f143439"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.914694 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/698aae03-92da-4cc2-a9d2-ecdb5f143439-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "698aae03-92da-4cc2-a9d2-ecdb5f143439" (UID: "698aae03-92da-4cc2-a9d2-ecdb5f143439"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.916913 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/698aae03-92da-4cc2-a9d2-ecdb5f143439-kube-api-access-bmz8x" (OuterVolumeSpecName: "kube-api-access-bmz8x") pod "698aae03-92da-4cc2-a9d2-ecdb5f143439" (UID: "698aae03-92da-4cc2-a9d2-ecdb5f143439"). InnerVolumeSpecName "kube-api-access-bmz8x". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.927337 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79513100-48d2-4e7b-ae14-888322cab8f3-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "79513100-48d2-4e7b-ae14-888322cab8f3" (UID: "79513100-48d2-4e7b-ae14-888322cab8f3"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.942191 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/698aae03-92da-4cc2-a9d2-ecdb5f143439-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "698aae03-92da-4cc2-a9d2-ecdb5f143439" (UID: "698aae03-92da-4cc2-a9d2-ecdb5f143439"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.951272 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.982360 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/698aae03-92da-4cc2-a9d2-ecdb5f143439-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "698aae03-92da-4cc2-a9d2-ecdb5f143439" (UID: "698aae03-92da-4cc2-a9d2-ecdb5f143439"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:09 crc kubenswrapper[4854]: I1007 12:46:09.992344 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/698aae03-92da-4cc2-a9d2-ecdb5f143439-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "698aae03-92da-4cc2-a9d2-ecdb5f143439" (UID: "698aae03-92da-4cc2-a9d2-ecdb5f143439"). InnerVolumeSpecName "ovn-northd-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.014950 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/698aae03-92da-4cc2-a9d2-ecdb5f143439-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.015035 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmz8x\" (UniqueName: \"kubernetes.io/projected/698aae03-92da-4cc2-a9d2-ecdb5f143439-kube-api-access-bmz8x\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.015048 4854 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/698aae03-92da-4cc2-a9d2-ecdb5f143439-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.015058 4854 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/698aae03-92da-4cc2-a9d2-ecdb5f143439-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.015066 4854 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/698aae03-92da-4cc2-a9d2-ecdb5f143439-ovn-rundir\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.015074 4854 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/698aae03-92da-4cc2-a9d2-ecdb5f143439-config\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.015082 4854 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/79513100-48d2-4e7b-ae14-888322cab8f3-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.015091 4854 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/698aae03-92da-4cc2-a9d2-ecdb5f143439-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:10 crc kubenswrapper[4854]: E1007 12:46:10.020929 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 49664c4eacd10194cd4eaaf0aca67e1328ae47efa8cae6560f85b2f783b3a2b6 is running failed: container process not found" containerID="49664c4eacd10194cd4eaaf0aca67e1328ae47efa8cae6560f85b2f783b3a2b6" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 07 12:46:10 crc kubenswrapper[4854]: E1007 12:46:10.021272 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 49664c4eacd10194cd4eaaf0aca67e1328ae47efa8cae6560f85b2f783b3a2b6 is running failed: container process not found" containerID="49664c4eacd10194cd4eaaf0aca67e1328ae47efa8cae6560f85b2f783b3a2b6" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 07 12:46:10 crc kubenswrapper[4854]: E1007 12:46:10.021639 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 49664c4eacd10194cd4eaaf0aca67e1328ae47efa8cae6560f85b2f783b3a2b6 is running failed: container process not found" 
containerID="49664c4eacd10194cd4eaaf0aca67e1328ae47efa8cae6560f85b2f783b3a2b6" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 07 12:46:10 crc kubenswrapper[4854]: E1007 12:46:10.021685 4854 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 49664c4eacd10194cd4eaaf0aca67e1328ae47efa8cae6560f85b2f783b3a2b6 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-j5h2b" podUID="847eb385-fc80-4568-813d-638dac11d81a" containerName="ovsdb-server" Oct 07 12:46:10 crc kubenswrapper[4854]: E1007 12:46:10.022539 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1c6a7b5092dfdf765b8d78edcb5848f5ab195e1b5fbac03359ca96b418f42d38" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 07 12:46:10 crc kubenswrapper[4854]: E1007 12:46:10.023920 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1c6a7b5092dfdf765b8d78edcb5848f5ab195e1b5fbac03359ca96b418f42d38" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 07 12:46:10 crc kubenswrapper[4854]: E1007 12:46:10.028458 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1c6a7b5092dfdf765b8d78edcb5848f5ab195e1b5fbac03359ca96b418f42d38" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 07 12:46:10 crc kubenswrapper[4854]: E1007 12:46:10.028511 4854 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-j5h2b" podUID="847eb385-fc80-4568-813d-638dac11d81a" containerName="ovs-vswitchd" Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.115890 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhxlw\" (UniqueName: \"kubernetes.io/projected/cac73de2-996a-4e04-abde-1153b44058bc-kube-api-access-lhxlw\") pod \"cac73de2-996a-4e04-abde-1153b44058bc\" (UID: \"cac73de2-996a-4e04-abde-1153b44058bc\") " Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.115953 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cac73de2-996a-4e04-abde-1153b44058bc-combined-ca-bundle\") pod \"cac73de2-996a-4e04-abde-1153b44058bc\" (UID: \"cac73de2-996a-4e04-abde-1153b44058bc\") " Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.115983 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cac73de2-996a-4e04-abde-1153b44058bc-config-data\") pod \"cac73de2-996a-4e04-abde-1153b44058bc\" (UID: \"cac73de2-996a-4e04-abde-1153b44058bc\") " Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.118208 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.120686 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cac73de2-996a-4e04-abde-1153b44058bc-kube-api-access-lhxlw" (OuterVolumeSpecName: "kube-api-access-lhxlw") pod "cac73de2-996a-4e04-abde-1153b44058bc" (UID: "cac73de2-996a-4e04-abde-1153b44058bc"). InnerVolumeSpecName "kube-api-access-lhxlw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.162762 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cac73de2-996a-4e04-abde-1153b44058bc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cac73de2-996a-4e04-abde-1153b44058bc" (UID: "cac73de2-996a-4e04-abde-1153b44058bc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.165436 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cac73de2-996a-4e04-abde-1153b44058bc-config-data" (OuterVolumeSpecName: "config-data") pod "cac73de2-996a-4e04-abde-1153b44058bc" (UID: "cac73de2-996a-4e04-abde-1153b44058bc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.217164 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e599e18f-63c0-4756-845c-973257921fd0-galera-tls-certs\") pod \"e599e18f-63c0-4756-845c-973257921fd0\" (UID: \"e599e18f-63c0-4756-845c-973257921fd0\") " Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.217210 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"e599e18f-63c0-4756-845c-973257921fd0\" (UID: \"e599e18f-63c0-4756-845c-973257921fd0\") " Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.217290 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e599e18f-63c0-4756-845c-973257921fd0-combined-ca-bundle\") pod \"e599e18f-63c0-4756-845c-973257921fd0\" (UID: \"e599e18f-63c0-4756-845c-973257921fd0\") " Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.217334 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e599e18f-63c0-4756-845c-973257921fd0-config-data-default\") pod \"e599e18f-63c0-4756-845c-973257921fd0\" (UID: \"e599e18f-63c0-4756-845c-973257921fd0\") " Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.217395 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/e599e18f-63c0-4756-845c-973257921fd0-secrets\") pod \"e599e18f-63c0-4756-845c-973257921fd0\" (UID: \"e599e18f-63c0-4756-845c-973257921fd0\") " Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.217424 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e599e18f-63c0-4756-845c-973257921fd0-config-data-generated\") pod \"e599e18f-63c0-4756-845c-973257921fd0\" (UID: \"e599e18f-63c0-4756-845c-973257921fd0\") " Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 
12:46:10.217441 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e599e18f-63c0-4756-845c-973257921fd0-operator-scripts\") pod \"e599e18f-63c0-4756-845c-973257921fd0\" (UID: \"e599e18f-63c0-4756-845c-973257921fd0\") " Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.217472 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6rdc\" (UniqueName: \"kubernetes.io/projected/e599e18f-63c0-4756-845c-973257921fd0-kube-api-access-z6rdc\") pod \"e599e18f-63c0-4756-845c-973257921fd0\" (UID: \"e599e18f-63c0-4756-845c-973257921fd0\") " Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.217492 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e599e18f-63c0-4756-845c-973257921fd0-kolla-config\") pod \"e599e18f-63c0-4756-845c-973257921fd0\" (UID: \"e599e18f-63c0-4756-845c-973257921fd0\") " Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.217757 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhxlw\" (UniqueName: \"kubernetes.io/projected/cac73de2-996a-4e04-abde-1153b44058bc-kube-api-access-lhxlw\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.217776 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cac73de2-996a-4e04-abde-1153b44058bc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.217786 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cac73de2-996a-4e04-abde-1153b44058bc-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.218192 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e599e18f-63c0-4756-845c-973257921fd0-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "e599e18f-63c0-4756-845c-973257921fd0" (UID: "e599e18f-63c0-4756-845c-973257921fd0"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.218448 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e599e18f-63c0-4756-845c-973257921fd0-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "e599e18f-63c0-4756-845c-973257921fd0" (UID: "e599e18f-63c0-4756-845c-973257921fd0"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.218733 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e599e18f-63c0-4756-845c-973257921fd0-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "e599e18f-63c0-4756-845c-973257921fd0" (UID: "e599e18f-63c0-4756-845c-973257921fd0"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.218844 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e599e18f-63c0-4756-845c-973257921fd0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e599e18f-63c0-4756-845c-973257921fd0" (UID: "e599e18f-63c0-4756-845c-973257921fd0"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.223437 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e599e18f-63c0-4756-845c-973257921fd0-secrets" (OuterVolumeSpecName: "secrets") pod "e599e18f-63c0-4756-845c-973257921fd0" (UID: "e599e18f-63c0-4756-845c-973257921fd0"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.224293 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e599e18f-63c0-4756-845c-973257921fd0-kube-api-access-z6rdc" (OuterVolumeSpecName: "kube-api-access-z6rdc") pod "e599e18f-63c0-4756-845c-973257921fd0" (UID: "e599e18f-63c0-4756-845c-973257921fd0"). InnerVolumeSpecName "kube-api-access-z6rdc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.227615 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "mysql-db") pod "e599e18f-63c0-4756-845c-973257921fd0" (UID: "e599e18f-63c0-4756-845c-973257921fd0"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.240285 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e599e18f-63c0-4756-845c-973257921fd0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e599e18f-63c0-4756-845c-973257921fd0" (UID: "e599e18f-63c0-4756-845c-973257921fd0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.266633 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e599e18f-63c0-4756-845c-973257921fd0-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "e599e18f-63c0-4756-845c-973257921fd0" (UID: "e599e18f-63c0-4756-845c-973257921fd0"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.319263 4854 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e599e18f-63c0-4756-845c-973257921fd0-config-data-generated\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.319325 4854 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e599e18f-63c0-4756-845c-973257921fd0-operator-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.319344 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6rdc\" (UniqueName: \"kubernetes.io/projected/e599e18f-63c0-4756-845c-973257921fd0-kube-api-access-z6rdc\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.319363 4854 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e599e18f-63c0-4756-845c-973257921fd0-kolla-config\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.319380 4854 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e599e18f-63c0-4756-845c-973257921fd0-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.319450 4854 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.319473 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e599e18f-63c0-4756-845c-973257921fd0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.327688 4854 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e599e18f-63c0-4756-845c-973257921fd0-config-data-default\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.327714 4854 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/e599e18f-63c0-4756-845c-973257921fd0-secrets\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.352440 4854 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.429161 4854 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.574889 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_698aae03-92da-4cc2-a9d2-ecdb5f143439/ovn-northd/0.log" Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.574976 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"698aae03-92da-4cc2-a9d2-ecdb5f143439","Type":"ContainerDied","Data":"147c22642e55d4e465f5e81792155fda570f16630ec6fc54a0deefbceed4dcaf"} Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.575019 4854 scope.go:117] "RemoveContainer" 
containerID="b1471691e8926652d5ee413690eb324d89b03c02d93cd735cdce65f268baf71f" Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.575238 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.589681 4854 generic.go:334] "Generic (PLEG): container finished" podID="e599e18f-63c0-4756-845c-973257921fd0" containerID="00cf8ccd1bcd63c009deb5be5b2933469ec3d1b0a13db00b6361bb703ec2d7fc" exitCode=0 Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.589773 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e599e18f-63c0-4756-845c-973257921fd0","Type":"ContainerDied","Data":"00cf8ccd1bcd63c009deb5be5b2933469ec3d1b0a13db00b6361bb703ec2d7fc"} Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.589816 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e599e18f-63c0-4756-845c-973257921fd0","Type":"ContainerDied","Data":"5cf807c10f39826889aef602ca0e5723b38bc982f61ccda6d7a95eb756b531b0"} Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.589752 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.592688 4854 generic.go:334] "Generic (PLEG): container finished" podID="cac73de2-996a-4e04-abde-1153b44058bc" containerID="efd6f20e41e37b885bf213a2da547584c13e4bdab4263b0b4585c26d56600700" exitCode=0 Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.592709 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.592766 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"cac73de2-996a-4e04-abde-1153b44058bc","Type":"ContainerDied","Data":"efd6f20e41e37b885bf213a2da547584c13e4bdab4263b0b4585c26d56600700"} Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.592792 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"cac73de2-996a-4e04-abde-1153b44058bc","Type":"ContainerDied","Data":"e57d352e529b217f9eee7228993f4cb047af8376ef7600b4f508594cb288e477"} Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.592856 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.592877 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7b8cbbf7f5-8gzrj" Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.592940 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.592986 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-775676c768-frcfp" Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.637509 4854 scope.go:117] "RemoveContainer" containerID="89f0eb49b67715625593c9e8be3b27b5e68c8ad630cb00e0f86cd890329ac748" Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.666992 4854 scope.go:117] "RemoveContainer" containerID="00cf8ccd1bcd63c009deb5be5b2933469ec3d1b0a13db00b6361bb703ec2d7fc" Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.672294 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.691791 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.701200 4854 scope.go:117] "RemoveContainer" containerID="9c4dd23c8a3639b3bf49f7fd9dbee5833829885f4ce50007c496ab2ed016afff" Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.723421 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c293f13-b2a5-4d4b-9f69-fd118e34eab2" path="/var/lib/kubelet/pods/4c293f13-b2a5-4d4b-9f69-fd118e34eab2/volumes" Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.724215 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="698aae03-92da-4cc2-a9d2-ecdb5f143439" path="/var/lib/kubelet/pods/698aae03-92da-4cc2-a9d2-ecdb5f143439/volumes" Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.725437 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e6702f3-b113-49f9-b85f-a2d294bac6dc" path="/var/lib/kubelet/pods/6e6702f3-b113-49f9-b85f-a2d294bac6dc/volumes" Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.726479 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f725ba88-4d40-4eab-890d-e114448fabe9" path="/var/lib/kubelet/pods/f725ba88-4d40-4eab-890d-e114448fabe9/volumes" Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.726995 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.727023 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"] Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.730369 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-7b8cbbf7f5-8gzrj"] Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.737815 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-7b8cbbf7f5-8gzrj"] Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.743258 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.751025 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.752555 4854 scope.go:117] "RemoveContainer" containerID="00cf8ccd1bcd63c009deb5be5b2933469ec3d1b0a13db00b6361bb703ec2d7fc" Oct 07 12:46:10 crc kubenswrapper[4854]: E1007 12:46:10.754550 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00cf8ccd1bcd63c009deb5be5b2933469ec3d1b0a13db00b6361bb703ec2d7fc\": container with ID starting with 00cf8ccd1bcd63c009deb5be5b2933469ec3d1b0a13db00b6361bb703ec2d7fc not found: ID does not exist" containerID="00cf8ccd1bcd63c009deb5be5b2933469ec3d1b0a13db00b6361bb703ec2d7fc" Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 
12:46:10.754593 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00cf8ccd1bcd63c009deb5be5b2933469ec3d1b0a13db00b6361bb703ec2d7fc"} err="failed to get container status \"00cf8ccd1bcd63c009deb5be5b2933469ec3d1b0a13db00b6361bb703ec2d7fc\": rpc error: code = NotFound desc = could not find container \"00cf8ccd1bcd63c009deb5be5b2933469ec3d1b0a13db00b6361bb703ec2d7fc\": container with ID starting with 00cf8ccd1bcd63c009deb5be5b2933469ec3d1b0a13db00b6361bb703ec2d7fc not found: ID does not exist" Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.754623 4854 scope.go:117] "RemoveContainer" containerID="9c4dd23c8a3639b3bf49f7fd9dbee5833829885f4ce50007c496ab2ed016afff" Oct 07 12:46:10 crc kubenswrapper[4854]: E1007 12:46:10.754930 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c4dd23c8a3639b3bf49f7fd9dbee5833829885f4ce50007c496ab2ed016afff\": container with ID starting with 9c4dd23c8a3639b3bf49f7fd9dbee5833829885f4ce50007c496ab2ed016afff not found: ID does not exist" containerID="9c4dd23c8a3639b3bf49f7fd9dbee5833829885f4ce50007c496ab2ed016afff" Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.754962 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c4dd23c8a3639b3bf49f7fd9dbee5833829885f4ce50007c496ab2ed016afff"} err="failed to get container status \"9c4dd23c8a3639b3bf49f7fd9dbee5833829885f4ce50007c496ab2ed016afff\": rpc error: code = NotFound desc = could not find container \"9c4dd23c8a3639b3bf49f7fd9dbee5833829885f4ce50007c496ab2ed016afff\": container with ID starting with 9c4dd23c8a3639b3bf49f7fd9dbee5833829885f4ce50007c496ab2ed016afff not found: ID does not exist" Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.754982 4854 scope.go:117] "RemoveContainer" containerID="efd6f20e41e37b885bf213a2da547584c13e4bdab4263b0b4585c26d56600700" Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.756428 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.767237 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.772888 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-775676c768-frcfp"] Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.777070 4854 scope.go:117] "RemoveContainer" containerID="efd6f20e41e37b885bf213a2da547584c13e4bdab4263b0b4585c26d56600700" Oct 07 12:46:10 crc kubenswrapper[4854]: E1007 12:46:10.777491 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efd6f20e41e37b885bf213a2da547584c13e4bdab4263b0b4585c26d56600700\": container with ID starting with efd6f20e41e37b885bf213a2da547584c13e4bdab4263b0b4585c26d56600700 not found: ID does not exist" containerID="efd6f20e41e37b885bf213a2da547584c13e4bdab4263b0b4585c26d56600700" Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.777517 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efd6f20e41e37b885bf213a2da547584c13e4bdab4263b0b4585c26d56600700"} err="failed to get container status \"efd6f20e41e37b885bf213a2da547584c13e4bdab4263b0b4585c26d56600700\": rpc error: code = NotFound desc = could not find container \"efd6f20e41e37b885bf213a2da547584c13e4bdab4263b0b4585c26d56600700\": 
container with ID starting with efd6f20e41e37b885bf213a2da547584c13e4bdab4263b0b4585c26d56600700 not found: ID does not exist" Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.777948 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-775676c768-frcfp"] Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.783299 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 12:46:10 crc kubenswrapper[4854]: I1007 12:46:10.789356 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 12:46:12 crc kubenswrapper[4854]: I1007 12:46:12.052360 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 12:46:12 crc kubenswrapper[4854]: I1007 12:46:12.163030 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c42pz\" (UniqueName: \"kubernetes.io/projected/47e6a48a-4ef0-4764-a132-50140d86a6b2-kube-api-access-c42pz\") pod \"47e6a48a-4ef0-4764-a132-50140d86a6b2\" (UID: \"47e6a48a-4ef0-4764-a132-50140d86a6b2\") " Oct 07 12:46:12 crc kubenswrapper[4854]: I1007 12:46:12.163074 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/47e6a48a-4ef0-4764-a132-50140d86a6b2-ceilometer-tls-certs\") pod \"47e6a48a-4ef0-4764-a132-50140d86a6b2\" (UID: \"47e6a48a-4ef0-4764-a132-50140d86a6b2\") " Oct 07 12:46:12 crc kubenswrapper[4854]: I1007 12:46:12.163096 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47e6a48a-4ef0-4764-a132-50140d86a6b2-scripts\") pod \"47e6a48a-4ef0-4764-a132-50140d86a6b2\" (UID: \"47e6a48a-4ef0-4764-a132-50140d86a6b2\") " Oct 07 12:46:12 crc kubenswrapper[4854]: I1007 12:46:12.164177 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/47e6a48a-4ef0-4764-a132-50140d86a6b2-log-httpd\") pod \"47e6a48a-4ef0-4764-a132-50140d86a6b2\" (UID: \"47e6a48a-4ef0-4764-a132-50140d86a6b2\") " Oct 07 12:46:12 crc kubenswrapper[4854]: I1007 12:46:12.164225 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47e6a48a-4ef0-4764-a132-50140d86a6b2-config-data\") pod \"47e6a48a-4ef0-4764-a132-50140d86a6b2\" (UID: \"47e6a48a-4ef0-4764-a132-50140d86a6b2\") " Oct 07 12:46:12 crc kubenswrapper[4854]: I1007 12:46:12.164275 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47e6a48a-4ef0-4764-a132-50140d86a6b2-combined-ca-bundle\") pod \"47e6a48a-4ef0-4764-a132-50140d86a6b2\" (UID: \"47e6a48a-4ef0-4764-a132-50140d86a6b2\") " Oct 07 12:46:12 crc kubenswrapper[4854]: I1007 12:46:12.164366 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/47e6a48a-4ef0-4764-a132-50140d86a6b2-run-httpd\") pod \"47e6a48a-4ef0-4764-a132-50140d86a6b2\" (UID: \"47e6a48a-4ef0-4764-a132-50140d86a6b2\") " Oct 07 12:46:12 crc kubenswrapper[4854]: I1007 12:46:12.164404 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/47e6a48a-4ef0-4764-a132-50140d86a6b2-sg-core-conf-yaml\") pod \"47e6a48a-4ef0-4764-a132-50140d86a6b2\" (UID: 
\"47e6a48a-4ef0-4764-a132-50140d86a6b2\") " Oct 07 12:46:12 crc kubenswrapper[4854]: I1007 12:46:12.164904 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47e6a48a-4ef0-4764-a132-50140d86a6b2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "47e6a48a-4ef0-4764-a132-50140d86a6b2" (UID: "47e6a48a-4ef0-4764-a132-50140d86a6b2"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:46:12 crc kubenswrapper[4854]: I1007 12:46:12.165016 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47e6a48a-4ef0-4764-a132-50140d86a6b2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "47e6a48a-4ef0-4764-a132-50140d86a6b2" (UID: "47e6a48a-4ef0-4764-a132-50140d86a6b2"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:46:12 crc kubenswrapper[4854]: I1007 12:46:12.170963 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47e6a48a-4ef0-4764-a132-50140d86a6b2-kube-api-access-c42pz" (OuterVolumeSpecName: "kube-api-access-c42pz") pod "47e6a48a-4ef0-4764-a132-50140d86a6b2" (UID: "47e6a48a-4ef0-4764-a132-50140d86a6b2"). InnerVolumeSpecName "kube-api-access-c42pz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:46:12 crc kubenswrapper[4854]: I1007 12:46:12.176915 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47e6a48a-4ef0-4764-a132-50140d86a6b2-scripts" (OuterVolumeSpecName: "scripts") pod "47e6a48a-4ef0-4764-a132-50140d86a6b2" (UID: "47e6a48a-4ef0-4764-a132-50140d86a6b2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:12 crc kubenswrapper[4854]: I1007 12:46:12.189802 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47e6a48a-4ef0-4764-a132-50140d86a6b2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "47e6a48a-4ef0-4764-a132-50140d86a6b2" (UID: "47e6a48a-4ef0-4764-a132-50140d86a6b2"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:12 crc kubenswrapper[4854]: I1007 12:46:12.222337 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47e6a48a-4ef0-4764-a132-50140d86a6b2-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "47e6a48a-4ef0-4764-a132-50140d86a6b2" (UID: "47e6a48a-4ef0-4764-a132-50140d86a6b2"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:12 crc kubenswrapper[4854]: I1007 12:46:12.229486 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47e6a48a-4ef0-4764-a132-50140d86a6b2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "47e6a48a-4ef0-4764-a132-50140d86a6b2" (UID: "47e6a48a-4ef0-4764-a132-50140d86a6b2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:12 crc kubenswrapper[4854]: I1007 12:46:12.246529 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47e6a48a-4ef0-4764-a132-50140d86a6b2-config-data" (OuterVolumeSpecName: "config-data") pod "47e6a48a-4ef0-4764-a132-50140d86a6b2" (UID: "47e6a48a-4ef0-4764-a132-50140d86a6b2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:12 crc kubenswrapper[4854]: I1007 12:46:12.265852 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c42pz\" (UniqueName: \"kubernetes.io/projected/47e6a48a-4ef0-4764-a132-50140d86a6b2-kube-api-access-c42pz\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:12 crc kubenswrapper[4854]: I1007 12:46:12.265889 4854 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/47e6a48a-4ef0-4764-a132-50140d86a6b2-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:12 crc kubenswrapper[4854]: I1007 12:46:12.265898 4854 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47e6a48a-4ef0-4764-a132-50140d86a6b2-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:12 crc kubenswrapper[4854]: I1007 12:46:12.265907 4854 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/47e6a48a-4ef0-4764-a132-50140d86a6b2-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:12 crc kubenswrapper[4854]: I1007 12:46:12.265915 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47e6a48a-4ef0-4764-a132-50140d86a6b2-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:12 crc kubenswrapper[4854]: I1007 12:46:12.265923 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47e6a48a-4ef0-4764-a132-50140d86a6b2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:12 crc kubenswrapper[4854]: I1007 12:46:12.265930 4854 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/47e6a48a-4ef0-4764-a132-50140d86a6b2-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:12 crc kubenswrapper[4854]: I1007 12:46:12.265939 4854 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/47e6a48a-4ef0-4764-a132-50140d86a6b2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:12 crc kubenswrapper[4854]: I1007 12:46:12.621247 4854 generic.go:334] "Generic (PLEG): container finished" podID="47e6a48a-4ef0-4764-a132-50140d86a6b2" containerID="e2647a1c022554b80ce34c49ff2bbcb7d62b1334b8589db6fa5444a0549bf597" exitCode=0 Oct 07 12:46:12 crc kubenswrapper[4854]: I1007 12:46:12.621305 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"47e6a48a-4ef0-4764-a132-50140d86a6b2","Type":"ContainerDied","Data":"e2647a1c022554b80ce34c49ff2bbcb7d62b1334b8589db6fa5444a0549bf597"} Oct 07 12:46:12 crc kubenswrapper[4854]: I1007 12:46:12.621334 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"47e6a48a-4ef0-4764-a132-50140d86a6b2","Type":"ContainerDied","Data":"e5ddfb8ca73ff551352cd917d51450d9d9d9a3f562d75cea88f86c7793cfc38e"} Oct 07 12:46:12 crc kubenswrapper[4854]: I1007 12:46:12.621356 4854 scope.go:117] "RemoveContainer" containerID="43911c074518df2d12734bda5be2d0438565e37df58ee0f1bb19e3735feca614" Oct 07 12:46:12 crc kubenswrapper[4854]: I1007 12:46:12.621494 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 12:46:12 crc kubenswrapper[4854]: I1007 12:46:12.645096 4854 scope.go:117] "RemoveContainer" containerID="1bb3783752e90f8f65fc57d4fca86c174ad0436acd90b5d1426146067be381e1" Oct 07 12:46:12 crc kubenswrapper[4854]: I1007 12:46:12.674846 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 12:46:12 crc kubenswrapper[4854]: I1007 12:46:12.680638 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 07 12:46:12 crc kubenswrapper[4854]: I1007 12:46:12.689131 4854 scope.go:117] "RemoveContainer" containerID="e2647a1c022554b80ce34c49ff2bbcb7d62b1334b8589db6fa5444a0549bf597" Oct 07 12:46:12 crc kubenswrapper[4854]: I1007 12:46:12.711583 4854 scope.go:117] "RemoveContainer" containerID="b9891d843c836c188cc5c0d2b4df4db333aab6e636378878b75cfbd65ecdb10b" Oct 07 12:46:12 crc kubenswrapper[4854]: I1007 12:46:12.714399 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="016c2264-9ba4-48c0-b416-02c468232b6b" path="/var/lib/kubelet/pods/016c2264-9ba4-48c0-b416-02c468232b6b/volumes" Oct 07 12:46:12 crc kubenswrapper[4854]: I1007 12:46:12.715228 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47e6a48a-4ef0-4764-a132-50140d86a6b2" path="/var/lib/kubelet/pods/47e6a48a-4ef0-4764-a132-50140d86a6b2/volumes" Oct 07 12:46:12 crc kubenswrapper[4854]: I1007 12:46:12.716301 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e" path="/var/lib/kubelet/pods/77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e/volumes" Oct 07 12:46:12 crc kubenswrapper[4854]: I1007 12:46:12.718572 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79513100-48d2-4e7b-ae14-888322cab8f3" path="/var/lib/kubelet/pods/79513100-48d2-4e7b-ae14-888322cab8f3/volumes" Oct 07 12:46:12 crc kubenswrapper[4854]: I1007 12:46:12.719848 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9869a8d4-db8e-4aba-82d1-6d02c3cf988e" path="/var/lib/kubelet/pods/9869a8d4-db8e-4aba-82d1-6d02c3cf988e/volumes" Oct 07 12:46:12 crc kubenswrapper[4854]: I1007 12:46:12.721406 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cac73de2-996a-4e04-abde-1153b44058bc" path="/var/lib/kubelet/pods/cac73de2-996a-4e04-abde-1153b44058bc/volumes" Oct 07 12:46:12 crc kubenswrapper[4854]: I1007 12:46:12.722502 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e599e18f-63c0-4756-845c-973257921fd0" path="/var/lib/kubelet/pods/e599e18f-63c0-4756-845c-973257921fd0/volumes" Oct 07 12:46:12 crc kubenswrapper[4854]: I1007 12:46:12.756135 4854 scope.go:117] "RemoveContainer" containerID="43911c074518df2d12734bda5be2d0438565e37df58ee0f1bb19e3735feca614" Oct 07 12:46:12 crc kubenswrapper[4854]: E1007 12:46:12.756709 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43911c074518df2d12734bda5be2d0438565e37df58ee0f1bb19e3735feca614\": container with ID starting with 43911c074518df2d12734bda5be2d0438565e37df58ee0f1bb19e3735feca614 not found: ID does not exist" containerID="43911c074518df2d12734bda5be2d0438565e37df58ee0f1bb19e3735feca614" Oct 07 12:46:12 crc kubenswrapper[4854]: I1007 12:46:12.756842 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43911c074518df2d12734bda5be2d0438565e37df58ee0f1bb19e3735feca614"} err="failed to get container status 
\"43911c074518df2d12734bda5be2d0438565e37df58ee0f1bb19e3735feca614\": rpc error: code = NotFound desc = could not find container \"43911c074518df2d12734bda5be2d0438565e37df58ee0f1bb19e3735feca614\": container with ID starting with 43911c074518df2d12734bda5be2d0438565e37df58ee0f1bb19e3735feca614 not found: ID does not exist" Oct 07 12:46:12 crc kubenswrapper[4854]: I1007 12:46:12.756865 4854 scope.go:117] "RemoveContainer" containerID="1bb3783752e90f8f65fc57d4fca86c174ad0436acd90b5d1426146067be381e1" Oct 07 12:46:12 crc kubenswrapper[4854]: E1007 12:46:12.757334 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bb3783752e90f8f65fc57d4fca86c174ad0436acd90b5d1426146067be381e1\": container with ID starting with 1bb3783752e90f8f65fc57d4fca86c174ad0436acd90b5d1426146067be381e1 not found: ID does not exist" containerID="1bb3783752e90f8f65fc57d4fca86c174ad0436acd90b5d1426146067be381e1" Oct 07 12:46:12 crc kubenswrapper[4854]: I1007 12:46:12.757385 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bb3783752e90f8f65fc57d4fca86c174ad0436acd90b5d1426146067be381e1"} err="failed to get container status \"1bb3783752e90f8f65fc57d4fca86c174ad0436acd90b5d1426146067be381e1\": rpc error: code = NotFound desc = could not find container \"1bb3783752e90f8f65fc57d4fca86c174ad0436acd90b5d1426146067be381e1\": container with ID starting with 1bb3783752e90f8f65fc57d4fca86c174ad0436acd90b5d1426146067be381e1 not found: ID does not exist" Oct 07 12:46:12 crc kubenswrapper[4854]: I1007 12:46:12.757411 4854 scope.go:117] "RemoveContainer" containerID="e2647a1c022554b80ce34c49ff2bbcb7d62b1334b8589db6fa5444a0549bf597" Oct 07 12:46:12 crc kubenswrapper[4854]: E1007 12:46:12.757853 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2647a1c022554b80ce34c49ff2bbcb7d62b1334b8589db6fa5444a0549bf597\": container with ID starting with e2647a1c022554b80ce34c49ff2bbcb7d62b1334b8589db6fa5444a0549bf597 not found: ID does not exist" containerID="e2647a1c022554b80ce34c49ff2bbcb7d62b1334b8589db6fa5444a0549bf597" Oct 07 12:46:12 crc kubenswrapper[4854]: I1007 12:46:12.757931 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2647a1c022554b80ce34c49ff2bbcb7d62b1334b8589db6fa5444a0549bf597"} err="failed to get container status \"e2647a1c022554b80ce34c49ff2bbcb7d62b1334b8589db6fa5444a0549bf597\": rpc error: code = NotFound desc = could not find container \"e2647a1c022554b80ce34c49ff2bbcb7d62b1334b8589db6fa5444a0549bf597\": container with ID starting with e2647a1c022554b80ce34c49ff2bbcb7d62b1334b8589db6fa5444a0549bf597 not found: ID does not exist" Oct 07 12:46:12 crc kubenswrapper[4854]: I1007 12:46:12.757962 4854 scope.go:117] "RemoveContainer" containerID="b9891d843c836c188cc5c0d2b4df4db333aab6e636378878b75cfbd65ecdb10b" Oct 07 12:46:12 crc kubenswrapper[4854]: E1007 12:46:12.758345 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9891d843c836c188cc5c0d2b4df4db333aab6e636378878b75cfbd65ecdb10b\": container with ID starting with b9891d843c836c188cc5c0d2b4df4db333aab6e636378878b75cfbd65ecdb10b not found: ID does not exist" containerID="b9891d843c836c188cc5c0d2b4df4db333aab6e636378878b75cfbd65ecdb10b" Oct 07 12:46:12 crc kubenswrapper[4854]: I1007 12:46:12.758376 4854 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9891d843c836c188cc5c0d2b4df4db333aab6e636378878b75cfbd65ecdb10b"} err="failed to get container status \"b9891d843c836c188cc5c0d2b4df4db333aab6e636378878b75cfbd65ecdb10b\": rpc error: code = NotFound desc = could not find container \"b9891d843c836c188cc5c0d2b4df4db333aab6e636378878b75cfbd65ecdb10b\": container with ID starting with b9891d843c836c188cc5c0d2b4df4db333aab6e636378878b75cfbd65ecdb10b not found: ID does not exist" Oct 07 12:46:15 crc kubenswrapper[4854]: E1007 12:46:15.021492 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 49664c4eacd10194cd4eaaf0aca67e1328ae47efa8cae6560f85b2f783b3a2b6 is running failed: container process not found" containerID="49664c4eacd10194cd4eaaf0aca67e1328ae47efa8cae6560f85b2f783b3a2b6" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 07 12:46:15 crc kubenswrapper[4854]: E1007 12:46:15.022366 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 49664c4eacd10194cd4eaaf0aca67e1328ae47efa8cae6560f85b2f783b3a2b6 is running failed: container process not found" containerID="49664c4eacd10194cd4eaaf0aca67e1328ae47efa8cae6560f85b2f783b3a2b6" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 07 12:46:15 crc kubenswrapper[4854]: E1007 12:46:15.022863 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 49664c4eacd10194cd4eaaf0aca67e1328ae47efa8cae6560f85b2f783b3a2b6 is running failed: container process not found" containerID="49664c4eacd10194cd4eaaf0aca67e1328ae47efa8cae6560f85b2f783b3a2b6" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 07 12:46:15 crc kubenswrapper[4854]: E1007 12:46:15.022913 4854 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 49664c4eacd10194cd4eaaf0aca67e1328ae47efa8cae6560f85b2f783b3a2b6 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-j5h2b" podUID="847eb385-fc80-4568-813d-638dac11d81a" containerName="ovsdb-server" Oct 07 12:46:15 crc kubenswrapper[4854]: E1007 12:46:15.023220 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1c6a7b5092dfdf765b8d78edcb5848f5ab195e1b5fbac03359ca96b418f42d38" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 07 12:46:15 crc kubenswrapper[4854]: E1007 12:46:15.024873 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1c6a7b5092dfdf765b8d78edcb5848f5ab195e1b5fbac03359ca96b418f42d38" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 07 12:46:15 crc kubenswrapper[4854]: E1007 12:46:15.026975 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1c6a7b5092dfdf765b8d78edcb5848f5ab195e1b5fbac03359ca96b418f42d38" 
cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 07 12:46:15 crc kubenswrapper[4854]: E1007 12:46:15.027047 4854 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-j5h2b" podUID="847eb385-fc80-4568-813d-638dac11d81a" containerName="ovs-vswitchd" Oct 07 12:46:18 crc kubenswrapper[4854]: I1007 12:46:18.667042 4854 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-7445f79585-rckdn" podUID="c2c26a76-531a-4a6b-ac0f-6aa23680f903" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.162:9696/\": dial tcp 10.217.0.162:9696: connect: connection refused" Oct 07 12:46:20 crc kubenswrapper[4854]: E1007 12:46:20.021118 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 49664c4eacd10194cd4eaaf0aca67e1328ae47efa8cae6560f85b2f783b3a2b6 is running failed: container process not found" containerID="49664c4eacd10194cd4eaaf0aca67e1328ae47efa8cae6560f85b2f783b3a2b6" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 07 12:46:20 crc kubenswrapper[4854]: E1007 12:46:20.023088 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 49664c4eacd10194cd4eaaf0aca67e1328ae47efa8cae6560f85b2f783b3a2b6 is running failed: container process not found" containerID="49664c4eacd10194cd4eaaf0aca67e1328ae47efa8cae6560f85b2f783b3a2b6" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 07 12:46:20 crc kubenswrapper[4854]: E1007 12:46:20.023230 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1c6a7b5092dfdf765b8d78edcb5848f5ab195e1b5fbac03359ca96b418f42d38" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 07 12:46:20 crc kubenswrapper[4854]: E1007 12:46:20.023585 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 49664c4eacd10194cd4eaaf0aca67e1328ae47efa8cae6560f85b2f783b3a2b6 is running failed: container process not found" containerID="49664c4eacd10194cd4eaaf0aca67e1328ae47efa8cae6560f85b2f783b3a2b6" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 07 12:46:20 crc kubenswrapper[4854]: E1007 12:46:20.023623 4854 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 49664c4eacd10194cd4eaaf0aca67e1328ae47efa8cae6560f85b2f783b3a2b6 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-j5h2b" podUID="847eb385-fc80-4568-813d-638dac11d81a" containerName="ovsdb-server" Oct 07 12:46:20 crc kubenswrapper[4854]: E1007 12:46:20.024805 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1c6a7b5092dfdf765b8d78edcb5848f5ab195e1b5fbac03359ca96b418f42d38" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 07 12:46:20 crc kubenswrapper[4854]: E1007 12:46:20.026577 4854 
log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1c6a7b5092dfdf765b8d78edcb5848f5ab195e1b5fbac03359ca96b418f42d38" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 07 12:46:20 crc kubenswrapper[4854]: E1007 12:46:20.026621 4854 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-j5h2b" podUID="847eb385-fc80-4568-813d-638dac11d81a" containerName="ovs-vswitchd" Oct 07 12:46:22 crc kubenswrapper[4854]: I1007 12:46:22.761250 4854 generic.go:334] "Generic (PLEG): container finished" podID="c2c26a76-531a-4a6b-ac0f-6aa23680f903" containerID="82adcb574899325e5c5ccb3b6bb13576d4a4618eac34de03953fb2c28cd68412" exitCode=0 Oct 07 12:46:22 crc kubenswrapper[4854]: I1007 12:46:22.761321 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7445f79585-rckdn" event={"ID":"c2c26a76-531a-4a6b-ac0f-6aa23680f903","Type":"ContainerDied","Data":"82adcb574899325e5c5ccb3b6bb13576d4a4618eac34de03953fb2c28cd68412"} Oct 07 12:46:22 crc kubenswrapper[4854]: I1007 12:46:22.877596 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7445f79585-rckdn" Oct 07 12:46:22 crc kubenswrapper[4854]: I1007 12:46:22.956459 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2c26a76-531a-4a6b-ac0f-6aa23680f903-internal-tls-certs\") pod \"c2c26a76-531a-4a6b-ac0f-6aa23680f903\" (UID: \"c2c26a76-531a-4a6b-ac0f-6aa23680f903\") " Oct 07 12:46:22 crc kubenswrapper[4854]: I1007 12:46:22.956542 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6226q\" (UniqueName: \"kubernetes.io/projected/c2c26a76-531a-4a6b-ac0f-6aa23680f903-kube-api-access-6226q\") pod \"c2c26a76-531a-4a6b-ac0f-6aa23680f903\" (UID: \"c2c26a76-531a-4a6b-ac0f-6aa23680f903\") " Oct 07 12:46:22 crc kubenswrapper[4854]: I1007 12:46:22.956576 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2c26a76-531a-4a6b-ac0f-6aa23680f903-ovndb-tls-certs\") pod \"c2c26a76-531a-4a6b-ac0f-6aa23680f903\" (UID: \"c2c26a76-531a-4a6b-ac0f-6aa23680f903\") " Oct 07 12:46:22 crc kubenswrapper[4854]: I1007 12:46:22.956620 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c2c26a76-531a-4a6b-ac0f-6aa23680f903-config\") pod \"c2c26a76-531a-4a6b-ac0f-6aa23680f903\" (UID: \"c2c26a76-531a-4a6b-ac0f-6aa23680f903\") " Oct 07 12:46:22 crc kubenswrapper[4854]: I1007 12:46:22.956790 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c2c26a76-531a-4a6b-ac0f-6aa23680f903-httpd-config\") pod \"c2c26a76-531a-4a6b-ac0f-6aa23680f903\" (UID: \"c2c26a76-531a-4a6b-ac0f-6aa23680f903\") " Oct 07 12:46:22 crc kubenswrapper[4854]: I1007 12:46:22.957046 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2c26a76-531a-4a6b-ac0f-6aa23680f903-public-tls-certs\") pod \"c2c26a76-531a-4a6b-ac0f-6aa23680f903\" (UID: 
\"c2c26a76-531a-4a6b-ac0f-6aa23680f903\") " Oct 07 12:46:22 crc kubenswrapper[4854]: I1007 12:46:22.957817 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2c26a76-531a-4a6b-ac0f-6aa23680f903-combined-ca-bundle\") pod \"c2c26a76-531a-4a6b-ac0f-6aa23680f903\" (UID: \"c2c26a76-531a-4a6b-ac0f-6aa23680f903\") " Oct 07 12:46:22 crc kubenswrapper[4854]: I1007 12:46:22.962980 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2c26a76-531a-4a6b-ac0f-6aa23680f903-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "c2c26a76-531a-4a6b-ac0f-6aa23680f903" (UID: "c2c26a76-531a-4a6b-ac0f-6aa23680f903"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:22 crc kubenswrapper[4854]: I1007 12:46:22.971019 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2c26a76-531a-4a6b-ac0f-6aa23680f903-kube-api-access-6226q" (OuterVolumeSpecName: "kube-api-access-6226q") pod "c2c26a76-531a-4a6b-ac0f-6aa23680f903" (UID: "c2c26a76-531a-4a6b-ac0f-6aa23680f903"). InnerVolumeSpecName "kube-api-access-6226q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:46:22 crc kubenswrapper[4854]: I1007 12:46:22.971411 4854 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c2c26a76-531a-4a6b-ac0f-6aa23680f903-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:22 crc kubenswrapper[4854]: I1007 12:46:22.971449 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6226q\" (UniqueName: \"kubernetes.io/projected/c2c26a76-531a-4a6b-ac0f-6aa23680f903-kube-api-access-6226q\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:23 crc kubenswrapper[4854]: I1007 12:46:23.003578 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2c26a76-531a-4a6b-ac0f-6aa23680f903-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c2c26a76-531a-4a6b-ac0f-6aa23680f903" (UID: "c2c26a76-531a-4a6b-ac0f-6aa23680f903"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:23 crc kubenswrapper[4854]: I1007 12:46:23.004267 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2c26a76-531a-4a6b-ac0f-6aa23680f903-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c2c26a76-531a-4a6b-ac0f-6aa23680f903" (UID: "c2c26a76-531a-4a6b-ac0f-6aa23680f903"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:23 crc kubenswrapper[4854]: I1007 12:46:23.015343 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2c26a76-531a-4a6b-ac0f-6aa23680f903-config" (OuterVolumeSpecName: "config") pod "c2c26a76-531a-4a6b-ac0f-6aa23680f903" (UID: "c2c26a76-531a-4a6b-ac0f-6aa23680f903"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:23 crc kubenswrapper[4854]: I1007 12:46:23.021503 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2c26a76-531a-4a6b-ac0f-6aa23680f903-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "c2c26a76-531a-4a6b-ac0f-6aa23680f903" (UID: "c2c26a76-531a-4a6b-ac0f-6aa23680f903"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:23 crc kubenswrapper[4854]: I1007 12:46:23.022234 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2c26a76-531a-4a6b-ac0f-6aa23680f903-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c2c26a76-531a-4a6b-ac0f-6aa23680f903" (UID: "c2c26a76-531a-4a6b-ac0f-6aa23680f903"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:23 crc kubenswrapper[4854]: I1007 12:46:23.072600 4854 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2c26a76-531a-4a6b-ac0f-6aa23680f903-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:23 crc kubenswrapper[4854]: I1007 12:46:23.072651 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2c26a76-531a-4a6b-ac0f-6aa23680f903-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:23 crc kubenswrapper[4854]: I1007 12:46:23.072661 4854 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2c26a76-531a-4a6b-ac0f-6aa23680f903-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:23 crc kubenswrapper[4854]: I1007 12:46:23.072669 4854 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2c26a76-531a-4a6b-ac0f-6aa23680f903-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:23 crc kubenswrapper[4854]: I1007 12:46:23.072678 4854 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/c2c26a76-531a-4a6b-ac0f-6aa23680f903-config\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:23 crc kubenswrapper[4854]: I1007 12:46:23.775836 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7445f79585-rckdn" event={"ID":"c2c26a76-531a-4a6b-ac0f-6aa23680f903","Type":"ContainerDied","Data":"30c8c40f65d07566ebbf2ad35ec8aa1ad52ea6bd2d2446546bd98b3e799c22e1"} Oct 07 12:46:23 crc kubenswrapper[4854]: I1007 12:46:23.775917 4854 scope.go:117] "RemoveContainer" containerID="4442c10882343c29c4d45b4876f4c5029aa8758304578d81129d781083582260" Oct 07 12:46:23 crc kubenswrapper[4854]: I1007 12:46:23.776123 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7445f79585-rckdn" Oct 07 12:46:23 crc kubenswrapper[4854]: I1007 12:46:23.800495 4854 scope.go:117] "RemoveContainer" containerID="82adcb574899325e5c5ccb3b6bb13576d4a4618eac34de03953fb2c28cd68412" Oct 07 12:46:23 crc kubenswrapper[4854]: I1007 12:46:23.821716 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7445f79585-rckdn"] Oct 07 12:46:23 crc kubenswrapper[4854]: I1007 12:46:23.826005 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7445f79585-rckdn"] Oct 07 12:46:24 crc kubenswrapper[4854]: I1007 12:46:24.718522 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2c26a76-531a-4a6b-ac0f-6aa23680f903" path="/var/lib/kubelet/pods/c2c26a76-531a-4a6b-ac0f-6aa23680f903/volumes" Oct 07 12:46:25 crc kubenswrapper[4854]: E1007 12:46:25.020820 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 49664c4eacd10194cd4eaaf0aca67e1328ae47efa8cae6560f85b2f783b3a2b6 is running failed: container process not found" containerID="49664c4eacd10194cd4eaaf0aca67e1328ae47efa8cae6560f85b2f783b3a2b6" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 07 12:46:25 crc kubenswrapper[4854]: E1007 12:46:25.021306 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 49664c4eacd10194cd4eaaf0aca67e1328ae47efa8cae6560f85b2f783b3a2b6 is running failed: container process not found" containerID="49664c4eacd10194cd4eaaf0aca67e1328ae47efa8cae6560f85b2f783b3a2b6" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 07 12:46:25 crc kubenswrapper[4854]: E1007 12:46:25.021779 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 49664c4eacd10194cd4eaaf0aca67e1328ae47efa8cae6560f85b2f783b3a2b6 is running failed: container process not found" containerID="49664c4eacd10194cd4eaaf0aca67e1328ae47efa8cae6560f85b2f783b3a2b6" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 07 12:46:25 crc kubenswrapper[4854]: E1007 12:46:25.021830 4854 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 49664c4eacd10194cd4eaaf0aca67e1328ae47efa8cae6560f85b2f783b3a2b6 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-j5h2b" podUID="847eb385-fc80-4568-813d-638dac11d81a" containerName="ovsdb-server" Oct 07 12:46:25 crc kubenswrapper[4854]: E1007 12:46:25.023393 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1c6a7b5092dfdf765b8d78edcb5848f5ab195e1b5fbac03359ca96b418f42d38" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 07 12:46:25 crc kubenswrapper[4854]: E1007 12:46:25.025203 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1c6a7b5092dfdf765b8d78edcb5848f5ab195e1b5fbac03359ca96b418f42d38" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 07 12:46:25 crc kubenswrapper[4854]: E1007 
12:46:25.026823 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1c6a7b5092dfdf765b8d78edcb5848f5ab195e1b5fbac03359ca96b418f42d38" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 07 12:46:25 crc kubenswrapper[4854]: E1007 12:46:25.026937 4854 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-j5h2b" podUID="847eb385-fc80-4568-813d-638dac11d81a" containerName="ovs-vswitchd" Oct 07 12:46:29 crc kubenswrapper[4854]: I1007 12:46:29.358364 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 07 12:46:29 crc kubenswrapper[4854]: I1007 12:46:29.493977 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/21065050-7bdc-4f4e-9a7b-9dbcc2dab200-config-data-custom\") pod \"21065050-7bdc-4f4e-9a7b-9dbcc2dab200\" (UID: \"21065050-7bdc-4f4e-9a7b-9dbcc2dab200\") " Oct 07 12:46:29 crc kubenswrapper[4854]: I1007 12:46:29.494246 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21065050-7bdc-4f4e-9a7b-9dbcc2dab200-config-data\") pod \"21065050-7bdc-4f4e-9a7b-9dbcc2dab200\" (UID: \"21065050-7bdc-4f4e-9a7b-9dbcc2dab200\") " Oct 07 12:46:29 crc kubenswrapper[4854]: I1007 12:46:29.495117 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21065050-7bdc-4f4e-9a7b-9dbcc2dab200-combined-ca-bundle\") pod \"21065050-7bdc-4f4e-9a7b-9dbcc2dab200\" (UID: \"21065050-7bdc-4f4e-9a7b-9dbcc2dab200\") " Oct 07 12:46:29 crc kubenswrapper[4854]: I1007 12:46:29.495275 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21065050-7bdc-4f4e-9a7b-9dbcc2dab200-scripts\") pod \"21065050-7bdc-4f4e-9a7b-9dbcc2dab200\" (UID: \"21065050-7bdc-4f4e-9a7b-9dbcc2dab200\") " Oct 07 12:46:29 crc kubenswrapper[4854]: I1007 12:46:29.495320 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rk5v4\" (UniqueName: \"kubernetes.io/projected/21065050-7bdc-4f4e-9a7b-9dbcc2dab200-kube-api-access-rk5v4\") pod \"21065050-7bdc-4f4e-9a7b-9dbcc2dab200\" (UID: \"21065050-7bdc-4f4e-9a7b-9dbcc2dab200\") " Oct 07 12:46:29 crc kubenswrapper[4854]: I1007 12:46:29.495362 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/21065050-7bdc-4f4e-9a7b-9dbcc2dab200-etc-machine-id\") pod \"21065050-7bdc-4f4e-9a7b-9dbcc2dab200\" (UID: \"21065050-7bdc-4f4e-9a7b-9dbcc2dab200\") " Oct 07 12:46:29 crc kubenswrapper[4854]: I1007 12:46:29.495588 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21065050-7bdc-4f4e-9a7b-9dbcc2dab200-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "21065050-7bdc-4f4e-9a7b-9dbcc2dab200" (UID: "21065050-7bdc-4f4e-9a7b-9dbcc2dab200"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 12:46:29 crc kubenswrapper[4854]: I1007 12:46:29.496233 4854 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/21065050-7bdc-4f4e-9a7b-9dbcc2dab200-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:29 crc kubenswrapper[4854]: I1007 12:46:29.501263 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21065050-7bdc-4f4e-9a7b-9dbcc2dab200-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "21065050-7bdc-4f4e-9a7b-9dbcc2dab200" (UID: "21065050-7bdc-4f4e-9a7b-9dbcc2dab200"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:29 crc kubenswrapper[4854]: I1007 12:46:29.501971 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21065050-7bdc-4f4e-9a7b-9dbcc2dab200-kube-api-access-rk5v4" (OuterVolumeSpecName: "kube-api-access-rk5v4") pod "21065050-7bdc-4f4e-9a7b-9dbcc2dab200" (UID: "21065050-7bdc-4f4e-9a7b-9dbcc2dab200"). InnerVolumeSpecName "kube-api-access-rk5v4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:46:29 crc kubenswrapper[4854]: I1007 12:46:29.502449 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21065050-7bdc-4f4e-9a7b-9dbcc2dab200-scripts" (OuterVolumeSpecName: "scripts") pod "21065050-7bdc-4f4e-9a7b-9dbcc2dab200" (UID: "21065050-7bdc-4f4e-9a7b-9dbcc2dab200"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:29 crc kubenswrapper[4854]: I1007 12:46:29.536908 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21065050-7bdc-4f4e-9a7b-9dbcc2dab200-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "21065050-7bdc-4f4e-9a7b-9dbcc2dab200" (UID: "21065050-7bdc-4f4e-9a7b-9dbcc2dab200"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:29 crc kubenswrapper[4854]: I1007 12:46:29.597966 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21065050-7bdc-4f4e-9a7b-9dbcc2dab200-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:29 crc kubenswrapper[4854]: I1007 12:46:29.598003 4854 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21065050-7bdc-4f4e-9a7b-9dbcc2dab200-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:29 crc kubenswrapper[4854]: I1007 12:46:29.598016 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rk5v4\" (UniqueName: \"kubernetes.io/projected/21065050-7bdc-4f4e-9a7b-9dbcc2dab200-kube-api-access-rk5v4\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:29 crc kubenswrapper[4854]: I1007 12:46:29.598031 4854 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/21065050-7bdc-4f4e-9a7b-9dbcc2dab200-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:29 crc kubenswrapper[4854]: I1007 12:46:29.604713 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21065050-7bdc-4f4e-9a7b-9dbcc2dab200-config-data" (OuterVolumeSpecName: "config-data") pod "21065050-7bdc-4f4e-9a7b-9dbcc2dab200" (UID: "21065050-7bdc-4f4e-9a7b-9dbcc2dab200"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:29 crc kubenswrapper[4854]: I1007 12:46:29.699974 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21065050-7bdc-4f4e-9a7b-9dbcc2dab200-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:29 crc kubenswrapper[4854]: I1007 12:46:29.850115 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-j5h2b_847eb385-fc80-4568-813d-638dac11d81a/ovs-vswitchd/0.log" Oct 07 12:46:29 crc kubenswrapper[4854]: I1007 12:46:29.852237 4854 generic.go:334] "Generic (PLEG): container finished" podID="847eb385-fc80-4568-813d-638dac11d81a" containerID="1c6a7b5092dfdf765b8d78edcb5848f5ab195e1b5fbac03359ca96b418f42d38" exitCode=137 Oct 07 12:46:29 crc kubenswrapper[4854]: I1007 12:46:29.852323 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-j5h2b" event={"ID":"847eb385-fc80-4568-813d-638dac11d81a","Type":"ContainerDied","Data":"1c6a7b5092dfdf765b8d78edcb5848f5ab195e1b5fbac03359ca96b418f42d38"} Oct 07 12:46:29 crc kubenswrapper[4854]: I1007 12:46:29.854937 4854 generic.go:334] "Generic (PLEG): container finished" podID="21065050-7bdc-4f4e-9a7b-9dbcc2dab200" containerID="32ad432a8d636a25c21369c163b0875cef6bace27819c32cee125a524960df32" exitCode=137 Oct 07 12:46:29 crc kubenswrapper[4854]: I1007 12:46:29.855009 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"21065050-7bdc-4f4e-9a7b-9dbcc2dab200","Type":"ContainerDied","Data":"32ad432a8d636a25c21369c163b0875cef6bace27819c32cee125a524960df32"} Oct 07 12:46:29 crc kubenswrapper[4854]: I1007 12:46:29.855055 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"21065050-7bdc-4f4e-9a7b-9dbcc2dab200","Type":"ContainerDied","Data":"ce7b642e4ff9856acd7c51c12f4b1bffb1d2e910fc862dfc77237c8e0e6bacc9"} Oct 07 12:46:29 crc kubenswrapper[4854]: I1007 12:46:29.855083 4854 scope.go:117] "RemoveContainer" containerID="3cca971c6b0c47f58de9e946459a6537fae5efe97def4c9f7b9f8c2749fe2fc8" Oct 07 12:46:29 crc kubenswrapper[4854]: I1007 12:46:29.855321 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 07 12:46:29 crc kubenswrapper[4854]: I1007 12:46:29.882274 4854 scope.go:117] "RemoveContainer" containerID="32ad432a8d636a25c21369c163b0875cef6bace27819c32cee125a524960df32" Oct 07 12:46:29 crc kubenswrapper[4854]: I1007 12:46:29.897877 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 12:46:29 crc kubenswrapper[4854]: I1007 12:46:29.902805 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 12:46:29 crc kubenswrapper[4854]: I1007 12:46:29.906000 4854 scope.go:117] "RemoveContainer" containerID="3cca971c6b0c47f58de9e946459a6537fae5efe97def4c9f7b9f8c2749fe2fc8" Oct 07 12:46:29 crc kubenswrapper[4854]: E1007 12:46:29.906738 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cca971c6b0c47f58de9e946459a6537fae5efe97def4c9f7b9f8c2749fe2fc8\": container with ID starting with 3cca971c6b0c47f58de9e946459a6537fae5efe97def4c9f7b9f8c2749fe2fc8 not found: ID does not exist" containerID="3cca971c6b0c47f58de9e946459a6537fae5efe97def4c9f7b9f8c2749fe2fc8" Oct 07 12:46:29 crc kubenswrapper[4854]: I1007 12:46:29.906788 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cca971c6b0c47f58de9e946459a6537fae5efe97def4c9f7b9f8c2749fe2fc8"} err="failed to get container status \"3cca971c6b0c47f58de9e946459a6537fae5efe97def4c9f7b9f8c2749fe2fc8\": rpc error: code = NotFound desc = could not find container \"3cca971c6b0c47f58de9e946459a6537fae5efe97def4c9f7b9f8c2749fe2fc8\": container with ID starting with 3cca971c6b0c47f58de9e946459a6537fae5efe97def4c9f7b9f8c2749fe2fc8 not found: ID does not exist" Oct 07 12:46:29 crc kubenswrapper[4854]: I1007 12:46:29.906813 4854 scope.go:117] "RemoveContainer" containerID="32ad432a8d636a25c21369c163b0875cef6bace27819c32cee125a524960df32" Oct 07 12:46:29 crc kubenswrapper[4854]: E1007 12:46:29.907107 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32ad432a8d636a25c21369c163b0875cef6bace27819c32cee125a524960df32\": container with ID starting with 32ad432a8d636a25c21369c163b0875cef6bace27819c32cee125a524960df32 not found: ID does not exist" containerID="32ad432a8d636a25c21369c163b0875cef6bace27819c32cee125a524960df32" Oct 07 12:46:29 crc kubenswrapper[4854]: I1007 12:46:29.907173 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32ad432a8d636a25c21369c163b0875cef6bace27819c32cee125a524960df32"} err="failed to get container status \"32ad432a8d636a25c21369c163b0875cef6bace27819c32cee125a524960df32\": rpc error: code = NotFound desc = could not find container \"32ad432a8d636a25c21369c163b0875cef6bace27819c32cee125a524960df32\": container with ID starting with 32ad432a8d636a25c21369c163b0875cef6bace27819c32cee125a524960df32 not found: ID does not exist" Oct 07 12:46:30 crc kubenswrapper[4854]: I1007 12:46:30.022350 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-j5h2b_847eb385-fc80-4568-813d-638dac11d81a/ovs-vswitchd/0.log" Oct 07 12:46:30 crc kubenswrapper[4854]: I1007 12:46:30.023915 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-j5h2b" Oct 07 12:46:30 crc kubenswrapper[4854]: I1007 12:46:30.106117 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/847eb385-fc80-4568-813d-638dac11d81a-etc-ovs\") pod \"847eb385-fc80-4568-813d-638dac11d81a\" (UID: \"847eb385-fc80-4568-813d-638dac11d81a\") " Oct 07 12:46:30 crc kubenswrapper[4854]: I1007 12:46:30.106183 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/847eb385-fc80-4568-813d-638dac11d81a-var-log\") pod \"847eb385-fc80-4568-813d-638dac11d81a\" (UID: \"847eb385-fc80-4568-813d-638dac11d81a\") " Oct 07 12:46:30 crc kubenswrapper[4854]: I1007 12:46:30.106214 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/847eb385-fc80-4568-813d-638dac11d81a-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "847eb385-fc80-4568-813d-638dac11d81a" (UID: "847eb385-fc80-4568-813d-638dac11d81a"). InnerVolumeSpecName "etc-ovs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 12:46:30 crc kubenswrapper[4854]: I1007 12:46:30.106218 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/847eb385-fc80-4568-813d-638dac11d81a-var-log" (OuterVolumeSpecName: "var-log") pod "847eb385-fc80-4568-813d-638dac11d81a" (UID: "847eb385-fc80-4568-813d-638dac11d81a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 12:46:30 crc kubenswrapper[4854]: I1007 12:46:30.106302 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jxd7\" (UniqueName: \"kubernetes.io/projected/847eb385-fc80-4568-813d-638dac11d81a-kube-api-access-4jxd7\") pod \"847eb385-fc80-4568-813d-638dac11d81a\" (UID: \"847eb385-fc80-4568-813d-638dac11d81a\") " Oct 07 12:46:30 crc kubenswrapper[4854]: I1007 12:46:30.106339 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/847eb385-fc80-4568-813d-638dac11d81a-scripts\") pod \"847eb385-fc80-4568-813d-638dac11d81a\" (UID: \"847eb385-fc80-4568-813d-638dac11d81a\") " Oct 07 12:46:30 crc kubenswrapper[4854]: I1007 12:46:30.106374 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/847eb385-fc80-4568-813d-638dac11d81a-var-lib\") pod \"847eb385-fc80-4568-813d-638dac11d81a\" (UID: \"847eb385-fc80-4568-813d-638dac11d81a\") " Oct 07 12:46:30 crc kubenswrapper[4854]: I1007 12:46:30.106629 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/847eb385-fc80-4568-813d-638dac11d81a-var-run\") pod \"847eb385-fc80-4568-813d-638dac11d81a\" (UID: \"847eb385-fc80-4568-813d-638dac11d81a\") " Oct 07 12:46:30 crc kubenswrapper[4854]: I1007 12:46:30.106677 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/847eb385-fc80-4568-813d-638dac11d81a-var-lib" (OuterVolumeSpecName: "var-lib") pod "847eb385-fc80-4568-813d-638dac11d81a" (UID: "847eb385-fc80-4568-813d-638dac11d81a"). InnerVolumeSpecName "var-lib". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 12:46:30 crc kubenswrapper[4854]: I1007 12:46:30.106757 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/847eb385-fc80-4568-813d-638dac11d81a-var-run" (OuterVolumeSpecName: "var-run") pod "847eb385-fc80-4568-813d-638dac11d81a" (UID: "847eb385-fc80-4568-813d-638dac11d81a"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 12:46:30 crc kubenswrapper[4854]: I1007 12:46:30.106964 4854 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/847eb385-fc80-4568-813d-638dac11d81a-var-run\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:30 crc kubenswrapper[4854]: I1007 12:46:30.107007 4854 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/847eb385-fc80-4568-813d-638dac11d81a-etc-ovs\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:30 crc kubenswrapper[4854]: I1007 12:46:30.107019 4854 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/847eb385-fc80-4568-813d-638dac11d81a-var-log\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:30 crc kubenswrapper[4854]: I1007 12:46:30.107026 4854 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/847eb385-fc80-4568-813d-638dac11d81a-var-lib\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:30 crc kubenswrapper[4854]: I1007 12:46:30.108206 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/847eb385-fc80-4568-813d-638dac11d81a-scripts" (OuterVolumeSpecName: "scripts") pod "847eb385-fc80-4568-813d-638dac11d81a" (UID: "847eb385-fc80-4568-813d-638dac11d81a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 12:46:30 crc kubenswrapper[4854]: I1007 12:46:30.110094 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/847eb385-fc80-4568-813d-638dac11d81a-kube-api-access-4jxd7" (OuterVolumeSpecName: "kube-api-access-4jxd7") pod "847eb385-fc80-4568-813d-638dac11d81a" (UID: "847eb385-fc80-4568-813d-638dac11d81a"). InnerVolumeSpecName "kube-api-access-4jxd7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:46:30 crc kubenswrapper[4854]: I1007 12:46:30.207858 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4jxd7\" (UniqueName: \"kubernetes.io/projected/847eb385-fc80-4568-813d-638dac11d81a-kube-api-access-4jxd7\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:30 crc kubenswrapper[4854]: I1007 12:46:30.207895 4854 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/847eb385-fc80-4568-813d-638dac11d81a-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:30 crc kubenswrapper[4854]: I1007 12:46:30.711593 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21065050-7bdc-4f4e-9a7b-9dbcc2dab200" path="/var/lib/kubelet/pods/21065050-7bdc-4f4e-9a7b-9dbcc2dab200/volumes" Oct 07 12:46:30 crc kubenswrapper[4854]: I1007 12:46:30.867542 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-j5h2b_847eb385-fc80-4568-813d-638dac11d81a/ovs-vswitchd/0.log" Oct 07 12:46:30 crc kubenswrapper[4854]: I1007 12:46:30.870577 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-j5h2b" Oct 07 12:46:30 crc kubenswrapper[4854]: I1007 12:46:30.871301 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-j5h2b" event={"ID":"847eb385-fc80-4568-813d-638dac11d81a","Type":"ContainerDied","Data":"8d8bfd0c8e1c01256c77de2d1efaa967dc06074cc98619a0d133d5d16b4687b4"} Oct 07 12:46:30 crc kubenswrapper[4854]: I1007 12:46:30.871366 4854 scope.go:117] "RemoveContainer" containerID="1c6a7b5092dfdf765b8d78edcb5848f5ab195e1b5fbac03359ca96b418f42d38" Oct 07 12:46:30 crc kubenswrapper[4854]: I1007 12:46:30.879578 4854 generic.go:334] "Generic (PLEG): container finished" podID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerID="bd423040637c8b41729c179d3e2612729ced5992aafd5990c3c9b5c7ca5727c0" exitCode=137 Oct 07 12:46:30 crc kubenswrapper[4854]: I1007 12:46:30.879648 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6f9410d0-f08a-4288-901b-8c28b54f6d53","Type":"ContainerDied","Data":"bd423040637c8b41729c179d3e2612729ced5992aafd5990c3c9b5c7ca5727c0"} Oct 07 12:46:30 crc kubenswrapper[4854]: I1007 12:46:30.919195 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 07 12:46:30 crc kubenswrapper[4854]: I1007 12:46:30.922024 4854 scope.go:117] "RemoveContainer" containerID="49664c4eacd10194cd4eaaf0aca67e1328ae47efa8cae6560f85b2f783b3a2b6" Oct 07 12:46:30 crc kubenswrapper[4854]: I1007 12:46:30.926670 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-j5h2b"] Oct 07 12:46:30 crc kubenswrapper[4854]: I1007 12:46:30.931527 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-j5h2b"] Oct 07 12:46:30 crc kubenswrapper[4854]: I1007 12:46:30.942769 4854 scope.go:117] "RemoveContainer" containerID="f75d1ca815bc77a4c9197c11735a40bdd70e1b2d81354ef594fe6844420ca14c" Oct 07 12:46:31 crc kubenswrapper[4854]: I1007 12:46:31.020142 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6f9410d0-f08a-4288-901b-8c28b54f6d53-cache\") pod \"6f9410d0-f08a-4288-901b-8c28b54f6d53\" (UID: \"6f9410d0-f08a-4288-901b-8c28b54f6d53\") " Oct 07 12:46:31 crc kubenswrapper[4854]: I1007 12:46:31.020217 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"6f9410d0-f08a-4288-901b-8c28b54f6d53\" (UID: \"6f9410d0-f08a-4288-901b-8c28b54f6d53\") " Oct 07 12:46:31 crc kubenswrapper[4854]: I1007 12:46:31.020237 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6f9410d0-f08a-4288-901b-8c28b54f6d53-etc-swift\") pod \"6f9410d0-f08a-4288-901b-8c28b54f6d53\" (UID: \"6f9410d0-f08a-4288-901b-8c28b54f6d53\") " Oct 07 12:46:31 crc kubenswrapper[4854]: I1007 12:46:31.020252 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6f9410d0-f08a-4288-901b-8c28b54f6d53-lock\") pod \"6f9410d0-f08a-4288-901b-8c28b54f6d53\" (UID: \"6f9410d0-f08a-4288-901b-8c28b54f6d53\") " Oct 07 12:46:31 crc kubenswrapper[4854]: I1007 12:46:31.020306 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j85zd\" (UniqueName: 
\"kubernetes.io/projected/6f9410d0-f08a-4288-901b-8c28b54f6d53-kube-api-access-j85zd\") pod \"6f9410d0-f08a-4288-901b-8c28b54f6d53\" (UID: \"6f9410d0-f08a-4288-901b-8c28b54f6d53\") " Oct 07 12:46:31 crc kubenswrapper[4854]: I1007 12:46:31.021619 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f9410d0-f08a-4288-901b-8c28b54f6d53-cache" (OuterVolumeSpecName: "cache") pod "6f9410d0-f08a-4288-901b-8c28b54f6d53" (UID: "6f9410d0-f08a-4288-901b-8c28b54f6d53"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:46:31 crc kubenswrapper[4854]: I1007 12:46:31.021695 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f9410d0-f08a-4288-901b-8c28b54f6d53-lock" (OuterVolumeSpecName: "lock") pod "6f9410d0-f08a-4288-901b-8c28b54f6d53" (UID: "6f9410d0-f08a-4288-901b-8c28b54f6d53"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:46:31 crc kubenswrapper[4854]: I1007 12:46:31.024115 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "swift") pod "6f9410d0-f08a-4288-901b-8c28b54f6d53" (UID: "6f9410d0-f08a-4288-901b-8c28b54f6d53"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 07 12:46:31 crc kubenswrapper[4854]: I1007 12:46:31.024173 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f9410d0-f08a-4288-901b-8c28b54f6d53-kube-api-access-j85zd" (OuterVolumeSpecName: "kube-api-access-j85zd") pod "6f9410d0-f08a-4288-901b-8c28b54f6d53" (UID: "6f9410d0-f08a-4288-901b-8c28b54f6d53"). InnerVolumeSpecName "kube-api-access-j85zd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:46:31 crc kubenswrapper[4854]: I1007 12:46:31.024212 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f9410d0-f08a-4288-901b-8c28b54f6d53-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "6f9410d0-f08a-4288-901b-8c28b54f6d53" (UID: "6f9410d0-f08a-4288-901b-8c28b54f6d53"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:46:31 crc kubenswrapper[4854]: I1007 12:46:31.122364 4854 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6f9410d0-f08a-4288-901b-8c28b54f6d53-cache\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:31 crc kubenswrapper[4854]: I1007 12:46:31.122481 4854 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Oct 07 12:46:31 crc kubenswrapper[4854]: I1007 12:46:31.122514 4854 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6f9410d0-f08a-4288-901b-8c28b54f6d53-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:31 crc kubenswrapper[4854]: I1007 12:46:31.122541 4854 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6f9410d0-f08a-4288-901b-8c28b54f6d53-lock\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:31 crc kubenswrapper[4854]: I1007 12:46:31.122568 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j85zd\" (UniqueName: \"kubernetes.io/projected/6f9410d0-f08a-4288-901b-8c28b54f6d53-kube-api-access-j85zd\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:31 crc kubenswrapper[4854]: I1007 12:46:31.136330 4854 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Oct 07 12:46:31 crc kubenswrapper[4854]: I1007 12:46:31.224622 4854 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:31 crc kubenswrapper[4854]: I1007 12:46:31.927606 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6f9410d0-f08a-4288-901b-8c28b54f6d53","Type":"ContainerDied","Data":"dd938a5843b7a1cab14cbbfebb68babf32b91355005f96e744472dfda46a4b27"} Oct 07 12:46:31 crc kubenswrapper[4854]: I1007 12:46:31.927682 4854 scope.go:117] "RemoveContainer" containerID="bd423040637c8b41729c179d3e2612729ced5992aafd5990c3c9b5c7ca5727c0" Oct 07 12:46:31 crc kubenswrapper[4854]: I1007 12:46:31.927807 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Oct 07 12:46:31 crc kubenswrapper[4854]: I1007 12:46:31.969580 4854 scope.go:117] "RemoveContainer" containerID="6e9be7ed32f4b1f0286eb9c12fca3356d25542eaefb68cc3134f42aca219812e" Oct 07 12:46:31 crc kubenswrapper[4854]: I1007 12:46:31.980332 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Oct 07 12:46:31 crc kubenswrapper[4854]: I1007 12:46:31.985843 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Oct 07 12:46:31 crc kubenswrapper[4854]: I1007 12:46:31.993052 4854 scope.go:117] "RemoveContainer" containerID="3ead17d08219606cf249dcd61e3873b2c9b921a7f531b3b5ff7b4f38fcea5a39" Oct 07 12:46:32 crc kubenswrapper[4854]: I1007 12:46:32.010395 4854 scope.go:117] "RemoveContainer" containerID="16029fd8b03b23770b20c5cb19ee5eac37c556a4d1e730daaeee2955d74d1015" Oct 07 12:46:32 crc kubenswrapper[4854]: I1007 12:46:32.026986 4854 scope.go:117] "RemoveContainer" containerID="d139c560fe8a80f0502b14596ea2825172ee972c7730deb571bbc27b9d43e3a4" Oct 07 12:46:32 crc kubenswrapper[4854]: I1007 12:46:32.043182 4854 scope.go:117] "RemoveContainer" containerID="ef4e6844e5b6720c32f74f7e722421b7675513872680d1a7f5bcc693b7983ed2" Oct 07 12:46:32 crc kubenswrapper[4854]: I1007 12:46:32.064690 4854 scope.go:117] "RemoveContainer" containerID="41fbd8df61c4cf14f1e960d26b75dfc2bc04a3598dc2d46c94fb7688efb55eb9" Oct 07 12:46:32 crc kubenswrapper[4854]: I1007 12:46:32.085506 4854 scope.go:117] "RemoveContainer" containerID="c1daeb00bdab450dbfd0e8a2fc796a0cac58f94569069238d66bafbd697e21fe" Oct 07 12:46:32 crc kubenswrapper[4854]: I1007 12:46:32.106918 4854 scope.go:117] "RemoveContainer" containerID="04a36a15910bb4bfdcb57384d35dbb7269dfced501b76e98351c42047a3a007d" Oct 07 12:46:32 crc kubenswrapper[4854]: I1007 12:46:32.125652 4854 scope.go:117] "RemoveContainer" containerID="bf5285e76e2b7c58fdaeb7235c95ba9d504989e3ace943cdfc50b7e56f2b19f6" Oct 07 12:46:32 crc kubenswrapper[4854]: I1007 12:46:32.148132 4854 scope.go:117] "RemoveContainer" containerID="6c57c9247bb283b1cdb97c6d4a8501239230072515b3dfc540b0bb0d946a0d67" Oct 07 12:46:32 crc kubenswrapper[4854]: I1007 12:46:32.167366 4854 scope.go:117] "RemoveContainer" containerID="7ec2bd59e5244b757b1dc78381bc28563a41c3f4cb1938a20f0abef176871dca" Oct 07 12:46:32 crc kubenswrapper[4854]: I1007 12:46:32.184831 4854 scope.go:117] "RemoveContainer" containerID="46e4da89bda9a00a4ff3b8768a33badf8cfee5c8e4687bcdeb7761033e1a22b5" Oct 07 12:46:32 crc kubenswrapper[4854]: I1007 12:46:32.201825 4854 scope.go:117] "RemoveContainer" containerID="64082d363d907b3ab83993dcfe80f76c9b9736fb8b76e7c87b840d99d91af5c8" Oct 07 12:46:32 crc kubenswrapper[4854]: I1007 12:46:32.220596 4854 scope.go:117] "RemoveContainer" containerID="e7d1be76b597fbb83db1e8cb27aca94cb03bb1ca865e7042ef7e99b025c26823" Oct 07 12:46:32 crc kubenswrapper[4854]: I1007 12:46:32.713304 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f9410d0-f08a-4288-901b-8c28b54f6d53" path="/var/lib/kubelet/pods/6f9410d0-f08a-4288-901b-8c28b54f6d53/volumes" Oct 07 12:46:32 crc kubenswrapper[4854]: I1007 12:46:32.715805 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="847eb385-fc80-4568-813d-638dac11d81a" path="/var/lib/kubelet/pods/847eb385-fc80-4568-813d-638dac11d81a/volumes" Oct 07 12:46:34 crc kubenswrapper[4854]: I1007 12:46:34.973313 4854 generic.go:334] "Generic (PLEG): container finished" podID="04ffb838-3774-402c-9cdf-d11e51fb21e5" 
containerID="76a801916c6498bb2e49a91144e0ff94292e20664e61ad28d638eadcee76fb94" exitCode=137 Oct 07 12:46:34 crc kubenswrapper[4854]: I1007 12:46:34.973702 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-75fd88c566-5j4xn" event={"ID":"04ffb838-3774-402c-9cdf-d11e51fb21e5","Type":"ContainerDied","Data":"76a801916c6498bb2e49a91144e0ff94292e20664e61ad28d638eadcee76fb94"} Oct 07 12:46:35 crc kubenswrapper[4854]: I1007 12:46:35.256267 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-75fd88c566-5j4xn" Oct 07 12:46:35 crc kubenswrapper[4854]: I1007 12:46:35.388589 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04ffb838-3774-402c-9cdf-d11e51fb21e5-combined-ca-bundle\") pod \"04ffb838-3774-402c-9cdf-d11e51fb21e5\" (UID: \"04ffb838-3774-402c-9cdf-d11e51fb21e5\") " Oct 07 12:46:35 crc kubenswrapper[4854]: I1007 12:46:35.388700 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04ffb838-3774-402c-9cdf-d11e51fb21e5-config-data\") pod \"04ffb838-3774-402c-9cdf-d11e51fb21e5\" (UID: \"04ffb838-3774-402c-9cdf-d11e51fb21e5\") " Oct 07 12:46:35 crc kubenswrapper[4854]: I1007 12:46:35.388753 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04ffb838-3774-402c-9cdf-d11e51fb21e5-logs\") pod \"04ffb838-3774-402c-9cdf-d11e51fb21e5\" (UID: \"04ffb838-3774-402c-9cdf-d11e51fb21e5\") " Oct 07 12:46:35 crc kubenswrapper[4854]: I1007 12:46:35.388778 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/04ffb838-3774-402c-9cdf-d11e51fb21e5-config-data-custom\") pod \"04ffb838-3774-402c-9cdf-d11e51fb21e5\" (UID: \"04ffb838-3774-402c-9cdf-d11e51fb21e5\") " Oct 07 12:46:35 crc kubenswrapper[4854]: I1007 12:46:35.388895 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dw68d\" (UniqueName: \"kubernetes.io/projected/04ffb838-3774-402c-9cdf-d11e51fb21e5-kube-api-access-dw68d\") pod \"04ffb838-3774-402c-9cdf-d11e51fb21e5\" (UID: \"04ffb838-3774-402c-9cdf-d11e51fb21e5\") " Oct 07 12:46:35 crc kubenswrapper[4854]: I1007 12:46:35.391460 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04ffb838-3774-402c-9cdf-d11e51fb21e5-logs" (OuterVolumeSpecName: "logs") pod "04ffb838-3774-402c-9cdf-d11e51fb21e5" (UID: "04ffb838-3774-402c-9cdf-d11e51fb21e5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:46:35 crc kubenswrapper[4854]: I1007 12:46:35.397545 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04ffb838-3774-402c-9cdf-d11e51fb21e5-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "04ffb838-3774-402c-9cdf-d11e51fb21e5" (UID: "04ffb838-3774-402c-9cdf-d11e51fb21e5"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:35 crc kubenswrapper[4854]: I1007 12:46:35.397570 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04ffb838-3774-402c-9cdf-d11e51fb21e5-kube-api-access-dw68d" (OuterVolumeSpecName: "kube-api-access-dw68d") pod "04ffb838-3774-402c-9cdf-d11e51fb21e5" (UID: "04ffb838-3774-402c-9cdf-d11e51fb21e5"). InnerVolumeSpecName "kube-api-access-dw68d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:46:35 crc kubenswrapper[4854]: I1007 12:46:35.417745 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04ffb838-3774-402c-9cdf-d11e51fb21e5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "04ffb838-3774-402c-9cdf-d11e51fb21e5" (UID: "04ffb838-3774-402c-9cdf-d11e51fb21e5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:35 crc kubenswrapper[4854]: I1007 12:46:35.431406 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04ffb838-3774-402c-9cdf-d11e51fb21e5-config-data" (OuterVolumeSpecName: "config-data") pod "04ffb838-3774-402c-9cdf-d11e51fb21e5" (UID: "04ffb838-3774-402c-9cdf-d11e51fb21e5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:35 crc kubenswrapper[4854]: I1007 12:46:35.491543 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04ffb838-3774-402c-9cdf-d11e51fb21e5-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:35 crc kubenswrapper[4854]: I1007 12:46:35.491591 4854 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04ffb838-3774-402c-9cdf-d11e51fb21e5-logs\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:35 crc kubenswrapper[4854]: I1007 12:46:35.491622 4854 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/04ffb838-3774-402c-9cdf-d11e51fb21e5-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:35 crc kubenswrapper[4854]: I1007 12:46:35.491637 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dw68d\" (UniqueName: \"kubernetes.io/projected/04ffb838-3774-402c-9cdf-d11e51fb21e5-kube-api-access-dw68d\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:35 crc kubenswrapper[4854]: I1007 12:46:35.491648 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04ffb838-3774-402c-9cdf-d11e51fb21e5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:35 crc kubenswrapper[4854]: I1007 12:46:35.617759 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-65b8874fd7-dnnjf" Oct 07 12:46:35 crc kubenswrapper[4854]: I1007 12:46:35.693884 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5179f78d-3a8f-4621-95f3-e147ff8da79f-combined-ca-bundle\") pod \"5179f78d-3a8f-4621-95f3-e147ff8da79f\" (UID: \"5179f78d-3a8f-4621-95f3-e147ff8da79f\") " Oct 07 12:46:35 crc kubenswrapper[4854]: I1007 12:46:35.693972 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5179f78d-3a8f-4621-95f3-e147ff8da79f-logs\") pod \"5179f78d-3a8f-4621-95f3-e147ff8da79f\" (UID: \"5179f78d-3a8f-4621-95f3-e147ff8da79f\") " Oct 07 12:46:35 crc kubenswrapper[4854]: I1007 12:46:35.694042 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5179f78d-3a8f-4621-95f3-e147ff8da79f-config-data-custom\") pod \"5179f78d-3a8f-4621-95f3-e147ff8da79f\" (UID: \"5179f78d-3a8f-4621-95f3-e147ff8da79f\") " Oct 07 12:46:35 crc kubenswrapper[4854]: I1007 12:46:35.694235 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drm6n\" (UniqueName: \"kubernetes.io/projected/5179f78d-3a8f-4621-95f3-e147ff8da79f-kube-api-access-drm6n\") pod \"5179f78d-3a8f-4621-95f3-e147ff8da79f\" (UID: \"5179f78d-3a8f-4621-95f3-e147ff8da79f\") " Oct 07 12:46:35 crc kubenswrapper[4854]: I1007 12:46:35.694282 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5179f78d-3a8f-4621-95f3-e147ff8da79f-config-data\") pod \"5179f78d-3a8f-4621-95f3-e147ff8da79f\" (UID: \"5179f78d-3a8f-4621-95f3-e147ff8da79f\") " Oct 07 12:46:35 crc kubenswrapper[4854]: I1007 12:46:35.696737 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5179f78d-3a8f-4621-95f3-e147ff8da79f-logs" (OuterVolumeSpecName: "logs") pod "5179f78d-3a8f-4621-95f3-e147ff8da79f" (UID: "5179f78d-3a8f-4621-95f3-e147ff8da79f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:46:35 crc kubenswrapper[4854]: I1007 12:46:35.699535 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5179f78d-3a8f-4621-95f3-e147ff8da79f-kube-api-access-drm6n" (OuterVolumeSpecName: "kube-api-access-drm6n") pod "5179f78d-3a8f-4621-95f3-e147ff8da79f" (UID: "5179f78d-3a8f-4621-95f3-e147ff8da79f"). InnerVolumeSpecName "kube-api-access-drm6n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:46:35 crc kubenswrapper[4854]: I1007 12:46:35.700844 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5179f78d-3a8f-4621-95f3-e147ff8da79f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5179f78d-3a8f-4621-95f3-e147ff8da79f" (UID: "5179f78d-3a8f-4621-95f3-e147ff8da79f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:35 crc kubenswrapper[4854]: I1007 12:46:35.717335 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5179f78d-3a8f-4621-95f3-e147ff8da79f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5179f78d-3a8f-4621-95f3-e147ff8da79f" (UID: "5179f78d-3a8f-4621-95f3-e147ff8da79f"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:35 crc kubenswrapper[4854]: I1007 12:46:35.759028 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5179f78d-3a8f-4621-95f3-e147ff8da79f-config-data" (OuterVolumeSpecName: "config-data") pod "5179f78d-3a8f-4621-95f3-e147ff8da79f" (UID: "5179f78d-3a8f-4621-95f3-e147ff8da79f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 12:46:35 crc kubenswrapper[4854]: I1007 12:46:35.796970 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5179f78d-3a8f-4621-95f3-e147ff8da79f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:35 crc kubenswrapper[4854]: I1007 12:46:35.797019 4854 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5179f78d-3a8f-4621-95f3-e147ff8da79f-logs\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:35 crc kubenswrapper[4854]: I1007 12:46:35.797030 4854 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5179f78d-3a8f-4621-95f3-e147ff8da79f-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:35 crc kubenswrapper[4854]: I1007 12:46:35.797040 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drm6n\" (UniqueName: \"kubernetes.io/projected/5179f78d-3a8f-4621-95f3-e147ff8da79f-kube-api-access-drm6n\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:35 crc kubenswrapper[4854]: I1007 12:46:35.797053 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5179f78d-3a8f-4621-95f3-e147ff8da79f-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 12:46:35 crc kubenswrapper[4854]: I1007 12:46:35.988114 4854 generic.go:334] "Generic (PLEG): container finished" podID="5179f78d-3a8f-4621-95f3-e147ff8da79f" containerID="6c97ffba80b987fc973726fde0f552970e4fb94bcc265aa80afeaa87e15eaa7e" exitCode=137 Oct 07 12:46:35 crc kubenswrapper[4854]: I1007 12:46:35.988241 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-65b8874fd7-dnnjf" event={"ID":"5179f78d-3a8f-4621-95f3-e147ff8da79f","Type":"ContainerDied","Data":"6c97ffba80b987fc973726fde0f552970e4fb94bcc265aa80afeaa87e15eaa7e"} Oct 07 12:46:35 crc kubenswrapper[4854]: I1007 12:46:35.988266 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-65b8874fd7-dnnjf" Oct 07 12:46:35 crc kubenswrapper[4854]: I1007 12:46:35.989252 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-65b8874fd7-dnnjf" event={"ID":"5179f78d-3a8f-4621-95f3-e147ff8da79f","Type":"ContainerDied","Data":"420df5386f6bce64ac480f93e7cefca1f7b6d0c1344eee0f4357340f7631529e"} Oct 07 12:46:35 crc kubenswrapper[4854]: I1007 12:46:35.989296 4854 scope.go:117] "RemoveContainer" containerID="6c97ffba80b987fc973726fde0f552970e4fb94bcc265aa80afeaa87e15eaa7e" Oct 07 12:46:35 crc kubenswrapper[4854]: I1007 12:46:35.992543 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-75fd88c566-5j4xn" event={"ID":"04ffb838-3774-402c-9cdf-d11e51fb21e5","Type":"ContainerDied","Data":"8efacd13cb9d6ed637f2722f33cc0eb702b0a0faa85193aa1e09cbcd8d27bb12"} Oct 07 12:46:35 crc kubenswrapper[4854]: I1007 12:46:35.992639 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-75fd88c566-5j4xn" Oct 07 12:46:36 crc kubenswrapper[4854]: I1007 12:46:36.025343 4854 scope.go:117] "RemoveContainer" containerID="02fc41252a41df476e3c3e2c65ab0a5aca662f6c1710215b0dcdfa49eb74b167" Oct 07 12:46:36 crc kubenswrapper[4854]: I1007 12:46:36.050308 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-75fd88c566-5j4xn"] Oct 07 12:46:36 crc kubenswrapper[4854]: I1007 12:46:36.050362 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-75fd88c566-5j4xn"] Oct 07 12:46:36 crc kubenswrapper[4854]: I1007 12:46:36.050379 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-65b8874fd7-dnnjf"] Oct 07 12:46:36 crc kubenswrapper[4854]: I1007 12:46:36.054875 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-65b8874fd7-dnnjf"] Oct 07 12:46:36 crc kubenswrapper[4854]: I1007 12:46:36.095716 4854 scope.go:117] "RemoveContainer" containerID="6c97ffba80b987fc973726fde0f552970e4fb94bcc265aa80afeaa87e15eaa7e" Oct 07 12:46:36 crc kubenswrapper[4854]: E1007 12:46:36.096466 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c97ffba80b987fc973726fde0f552970e4fb94bcc265aa80afeaa87e15eaa7e\": container with ID starting with 6c97ffba80b987fc973726fde0f552970e4fb94bcc265aa80afeaa87e15eaa7e not found: ID does not exist" containerID="6c97ffba80b987fc973726fde0f552970e4fb94bcc265aa80afeaa87e15eaa7e" Oct 07 12:46:36 crc kubenswrapper[4854]: I1007 12:46:36.096536 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c97ffba80b987fc973726fde0f552970e4fb94bcc265aa80afeaa87e15eaa7e"} err="failed to get container status \"6c97ffba80b987fc973726fde0f552970e4fb94bcc265aa80afeaa87e15eaa7e\": rpc error: code = NotFound desc = could not find container \"6c97ffba80b987fc973726fde0f552970e4fb94bcc265aa80afeaa87e15eaa7e\": container with ID starting with 6c97ffba80b987fc973726fde0f552970e4fb94bcc265aa80afeaa87e15eaa7e not found: ID does not exist" Oct 07 12:46:36 crc kubenswrapper[4854]: I1007 12:46:36.096561 4854 scope.go:117] "RemoveContainer" containerID="02fc41252a41df476e3c3e2c65ab0a5aca662f6c1710215b0dcdfa49eb74b167" Oct 07 12:46:36 crc kubenswrapper[4854]: E1007 12:46:36.097380 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"02fc41252a41df476e3c3e2c65ab0a5aca662f6c1710215b0dcdfa49eb74b167\": container with ID starting with 02fc41252a41df476e3c3e2c65ab0a5aca662f6c1710215b0dcdfa49eb74b167 not found: ID does not exist" containerID="02fc41252a41df476e3c3e2c65ab0a5aca662f6c1710215b0dcdfa49eb74b167" Oct 07 12:46:36 crc kubenswrapper[4854]: I1007 12:46:36.097444 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02fc41252a41df476e3c3e2c65ab0a5aca662f6c1710215b0dcdfa49eb74b167"} err="failed to get container status \"02fc41252a41df476e3c3e2c65ab0a5aca662f6c1710215b0dcdfa49eb74b167\": rpc error: code = NotFound desc = could not find container \"02fc41252a41df476e3c3e2c65ab0a5aca662f6c1710215b0dcdfa49eb74b167\": container with ID starting with 02fc41252a41df476e3c3e2c65ab0a5aca662f6c1710215b0dcdfa49eb74b167 not found: ID does not exist" Oct 07 12:46:36 crc kubenswrapper[4854]: I1007 12:46:36.097472 4854 scope.go:117] "RemoveContainer" containerID="76a801916c6498bb2e49a91144e0ff94292e20664e61ad28d638eadcee76fb94" Oct 07 12:46:36 crc kubenswrapper[4854]: I1007 12:46:36.117429 4854 scope.go:117] "RemoveContainer" containerID="1ee628a9c0b67d36be2b170a1da5ccab6d647b96fb94a73b6590d405c4039a11" Oct 07 12:46:36 crc kubenswrapper[4854]: I1007 12:46:36.724749 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04ffb838-3774-402c-9cdf-d11e51fb21e5" path="/var/lib/kubelet/pods/04ffb838-3774-402c-9cdf-d11e51fb21e5/volumes" Oct 07 12:46:36 crc kubenswrapper[4854]: I1007 12:46:36.726306 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5179f78d-3a8f-4621-95f3-e147ff8da79f" path="/var/lib/kubelet/pods/5179f78d-3a8f-4621-95f3-e147ff8da79f/volumes" Oct 07 12:47:16 crc kubenswrapper[4854]: I1007 12:47:16.382440 4854 scope.go:117] "RemoveContainer" containerID="6b0f9e765f42e0528d289da2b61440b957c5df88bdf61759d5eef1939df5e18b" Oct 07 12:47:16 crc kubenswrapper[4854]: I1007 12:47:16.414248 4854 scope.go:117] "RemoveContainer" containerID="60bb9fc490e0c40f6772256cf4a2a1ca8e51250ec86c89bd133d4b4d9de69302" Oct 07 12:47:16 crc kubenswrapper[4854]: I1007 12:47:16.466838 4854 scope.go:117] "RemoveContainer" containerID="b178e7943cef5d513b5b68323ca5b565273085d4c85f96bb739f2a36aa5dbd2d" Oct 07 12:47:16 crc kubenswrapper[4854]: I1007 12:47:16.508991 4854 scope.go:117] "RemoveContainer" containerID="1b0f1975974dee7a4bec144321d0e281dc8c0efd5c6aa89d6f4fd870a7a51e3a" Oct 07 12:47:16 crc kubenswrapper[4854]: I1007 12:47:16.539608 4854 scope.go:117] "RemoveContainer" containerID="9c7ddd3a4c8f213d021b724e370c377203bc8a7c36b48c8171f8d9f35b3f1843" Oct 07 12:47:16 crc kubenswrapper[4854]: I1007 12:47:16.608313 4854 scope.go:117] "RemoveContainer" containerID="f86bf46a9d38e3d5f5310e3c1c0c4e4d1de7f562c6847e508b6bccd1073c8209" Oct 07 12:47:16 crc kubenswrapper[4854]: I1007 12:47:16.630683 4854 scope.go:117] "RemoveContainer" containerID="13cd0930c993fb29506907fbe02b5640f437f2eb9dee0a69b79aa644c500010d" Oct 07 12:47:16 crc kubenswrapper[4854]: I1007 12:47:16.667866 4854 scope.go:117] "RemoveContainer" containerID="3f494071d1ef0d6df80043a7409a0e9096a6b62c0faf1b32f7a34dccf7655466" Oct 07 12:47:16 crc kubenswrapper[4854]: I1007 12:47:16.693925 4854 scope.go:117] "RemoveContainer" containerID="692cc3bb8cbbd18ce414ae4744d581b16427eea3290c632474a8a767fdbd9811" Oct 07 12:47:16 crc kubenswrapper[4854]: I1007 12:47:16.716136 4854 scope.go:117] "RemoveContainer" containerID="15908e800d395af5a8d7f5f01dfdbf58a442b4cdc1794a444f36b3b3f543c0f8" Oct 07 
12:47:16 crc kubenswrapper[4854]: I1007 12:47:16.734647 4854 scope.go:117] "RemoveContainer" containerID="b8ca0301667963e3ec2366a136d0517b008b1b5e00eb6795c50b7ce516616afb" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.061734 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mcv7r"] Oct 07 12:47:20 crc kubenswrapper[4854]: E1007 12:47:20.064699 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f725ba88-4d40-4eab-890d-e114448fabe9" containerName="galera" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.064735 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="f725ba88-4d40-4eab-890d-e114448fabe9" containerName="galera" Oct 07 12:47:20 crc kubenswrapper[4854]: E1007 12:47:20.064749 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="847eb385-fc80-4568-813d-638dac11d81a" containerName="ovs-vswitchd" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.064762 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="847eb385-fc80-4568-813d-638dac11d81a" containerName="ovs-vswitchd" Oct 07 12:47:20 crc kubenswrapper[4854]: E1007 12:47:20.064779 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cac73de2-996a-4e04-abde-1153b44058bc" containerName="nova-cell1-conductor-conductor" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.064792 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="cac73de2-996a-4e04-abde-1153b44058bc" containerName="nova-cell1-conductor-conductor" Oct 07 12:47:20 crc kubenswrapper[4854]: E1007 12:47:20.064809 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5179f78d-3a8f-4621-95f3-e147ff8da79f" containerName="barbican-worker" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.064821 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="5179f78d-3a8f-4621-95f3-e147ff8da79f" containerName="barbican-worker" Oct 07 12:47:20 crc kubenswrapper[4854]: E1007 12:47:20.064848 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerName="account-auditor" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.064860 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerName="account-auditor" Oct 07 12:47:20 crc kubenswrapper[4854]: E1007 12:47:20.064878 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerName="container-auditor" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.064891 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerName="container-auditor" Oct 07 12:47:20 crc kubenswrapper[4854]: E1007 12:47:20.064910 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21065050-7bdc-4f4e-9a7b-9dbcc2dab200" containerName="probe" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.064922 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="21065050-7bdc-4f4e-9a7b-9dbcc2dab200" containerName="probe" Oct 07 12:47:20 crc kubenswrapper[4854]: E1007 12:47:20.064937 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e" containerName="barbican-worker-log" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.064949 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e" containerName="barbican-worker-log" Oct 07 12:47:20 crc 
kubenswrapper[4854]: E1007 12:47:20.064971 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5179f78d-3a8f-4621-95f3-e147ff8da79f" containerName="barbican-worker-log" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.064983 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="5179f78d-3a8f-4621-95f3-e147ff8da79f" containerName="barbican-worker-log" Oct 07 12:47:20 crc kubenswrapper[4854]: E1007 12:47:20.065003 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerName="container-replicator" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.065015 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerName="container-replicator" Oct 07 12:47:20 crc kubenswrapper[4854]: E1007 12:47:20.065044 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21065050-7bdc-4f4e-9a7b-9dbcc2dab200" containerName="cinder-scheduler" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.065060 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="21065050-7bdc-4f4e-9a7b-9dbcc2dab200" containerName="cinder-scheduler" Oct 07 12:47:20 crc kubenswrapper[4854]: E1007 12:47:20.065077 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e599e18f-63c0-4756-845c-973257921fd0" containerName="mysql-bootstrap" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.065089 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="e599e18f-63c0-4756-845c-973257921fd0" containerName="mysql-bootstrap" Oct 07 12:47:20 crc kubenswrapper[4854]: E1007 12:47:20.065104 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a03c4a0d-6346-43e4-8db1-f653b5dfa420" containerName="placement-api" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.065115 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="a03c4a0d-6346-43e4-8db1-f653b5dfa420" containerName="placement-api" Oct 07 12:47:20 crc kubenswrapper[4854]: E1007 12:47:20.065128 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c293f13-b2a5-4d4b-9f69-fd118e34eab2" containerName="rabbitmq" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.065140 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c293f13-b2a5-4d4b-9f69-fd118e34eab2" containerName="rabbitmq" Oct 07 12:47:20 crc kubenswrapper[4854]: E1007 12:47:20.065186 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6002f7a4-27d6-4554-a486-87926ebcf57e" containerName="barbican-api-log" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.065198 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="6002f7a4-27d6-4554-a486-87926ebcf57e" containerName="barbican-api-log" Oct 07 12:47:20 crc kubenswrapper[4854]: E1007 12:47:20.065214 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f725ba88-4d40-4eab-890d-e114448fabe9" containerName="mysql-bootstrap" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.065226 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="f725ba88-4d40-4eab-890d-e114448fabe9" containerName="mysql-bootstrap" Oct 07 12:47:20 crc kubenswrapper[4854]: E1007 12:47:20.065252 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac535972-fa59-4e7f-818b-345da6937c14" containerName="mariadb-account-delete" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.065264 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac535972-fa59-4e7f-818b-345da6937c14" 
containerName="mariadb-account-delete" Oct 07 12:47:20 crc kubenswrapper[4854]: E1007 12:47:20.065278 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="847eb385-fc80-4568-813d-638dac11d81a" containerName="ovsdb-server-init" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.065290 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="847eb385-fc80-4568-813d-638dac11d81a" containerName="ovsdb-server-init" Oct 07 12:47:20 crc kubenswrapper[4854]: E1007 12:47:20.065303 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e309be64-7a5a-4156-89a6-d1201eaaff63" containerName="kube-state-metrics" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.065314 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="e309be64-7a5a-4156-89a6-d1201eaaff63" containerName="kube-state-metrics" Oct 07 12:47:20 crc kubenswrapper[4854]: E1007 12:47:20.065338 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2c26a76-531a-4a6b-ac0f-6aa23680f903" containerName="neutron-httpd" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.065356 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2c26a76-531a-4a6b-ac0f-6aa23680f903" containerName="neutron-httpd" Oct 07 12:47:20 crc kubenswrapper[4854]: E1007 12:47:20.065374 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerName="container-server" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.065387 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerName="container-server" Oct 07 12:47:20 crc kubenswrapper[4854]: E1007 12:47:20.065410 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerName="object-server" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.065422 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerName="object-server" Oct 07 12:47:20 crc kubenswrapper[4854]: E1007 12:47:20.065442 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c293f13-b2a5-4d4b-9f69-fd118e34eab2" containerName="setup-container" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.065454 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c293f13-b2a5-4d4b-9f69-fd118e34eab2" containerName="setup-container" Oct 07 12:47:20 crc kubenswrapper[4854]: E1007 12:47:20.065468 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerName="container-updater" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.065480 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerName="container-updater" Oct 07 12:47:20 crc kubenswrapper[4854]: E1007 12:47:20.065497 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerName="swift-recon-cron" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.065509 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerName="swift-recon-cron" Oct 07 12:47:20 crc kubenswrapper[4854]: E1007 12:47:20.065526 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66f80399-ed98-4aba-9db5-759ad2e314fa" containerName="mariadb-account-delete" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.065539 4854 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="66f80399-ed98-4aba-9db5-759ad2e314fa" containerName="mariadb-account-delete" Oct 07 12:47:20 crc kubenswrapper[4854]: E1007 12:47:20.065560 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2c26a76-531a-4a6b-ac0f-6aa23680f903" containerName="neutron-api" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.065571 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2c26a76-531a-4a6b-ac0f-6aa23680f903" containerName="neutron-api" Oct 07 12:47:20 crc kubenswrapper[4854]: E1007 12:47:20.065587 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="847eb385-fc80-4568-813d-638dac11d81a" containerName="ovsdb-server" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.065599 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="847eb385-fc80-4568-813d-638dac11d81a" containerName="ovsdb-server" Oct 07 12:47:20 crc kubenswrapper[4854]: E1007 12:47:20.065619 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6002f7a4-27d6-4554-a486-87926ebcf57e" containerName="barbican-api" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.065630 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="6002f7a4-27d6-4554-a486-87926ebcf57e" containerName="barbican-api" Oct 07 12:47:20 crc kubenswrapper[4854]: E1007 12:47:20.065646 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9869a8d4-db8e-4aba-82d1-6d02c3cf988e" containerName="keystone-api" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.065658 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="9869a8d4-db8e-4aba-82d1-6d02c3cf988e" containerName="keystone-api" Oct 07 12:47:20 crc kubenswrapper[4854]: E1007 12:47:20.065676 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c47fe71-3b92-4490-8488-d98d0e25519e" containerName="mariadb-account-delete" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.065687 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c47fe71-3b92-4490-8488-d98d0e25519e" containerName="mariadb-account-delete" Oct 07 12:47:20 crc kubenswrapper[4854]: E1007 12:47:20.065706 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="016c2264-9ba4-48c0-b416-02c468232b6b" containerName="nova-scheduler-scheduler" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.065718 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="016c2264-9ba4-48c0-b416-02c468232b6b" containerName="nova-scheduler-scheduler" Oct 07 12:47:20 crc kubenswrapper[4854]: E1007 12:47:20.065741 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79513100-48d2-4e7b-ae14-888322cab8f3" containerName="setup-container" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.065753 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="79513100-48d2-4e7b-ae14-888322cab8f3" containerName="setup-container" Oct 07 12:47:20 crc kubenswrapper[4854]: E1007 12:47:20.065774 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37dd5983-0d4d-4097-8657-f408e9bc68c0" containerName="nova-api-api" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.065786 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="37dd5983-0d4d-4097-8657-f408e9bc68c0" containerName="nova-api-api" Oct 07 12:47:20 crc kubenswrapper[4854]: E1007 12:47:20.065806 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="698aae03-92da-4cc2-a9d2-ecdb5f143439" containerName="ovn-northd" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.065817 4854 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="698aae03-92da-4cc2-a9d2-ecdb5f143439" containerName="ovn-northd" Oct 07 12:47:20 crc kubenswrapper[4854]: E1007 12:47:20.065837 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerName="object-expirer" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.065850 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerName="object-expirer" Oct 07 12:47:20 crc kubenswrapper[4854]: E1007 12:47:20.065867 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerName="account-server" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.065878 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerName="account-server" Oct 07 12:47:20 crc kubenswrapper[4854]: E1007 12:47:20.065895 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13453d02-4f55-45a7-98be-1cd41c741a3e" containerName="glance-httpd" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.065905 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="13453d02-4f55-45a7-98be-1cd41c741a3e" containerName="glance-httpd" Oct 07 12:47:20 crc kubenswrapper[4854]: E1007 12:47:20.065924 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13453d02-4f55-45a7-98be-1cd41c741a3e" containerName="glance-log" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.065936 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="13453d02-4f55-45a7-98be-1cd41c741a3e" containerName="glance-log" Oct 07 12:47:20 crc kubenswrapper[4854]: E1007 12:47:20.065949 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2cebadb-2142-477a-85b3-53e7c73fa6cc" containerName="memcached" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.065961 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2cebadb-2142-477a-85b3-53e7c73fa6cc" containerName="memcached" Oct 07 12:47:20 crc kubenswrapper[4854]: E1007 12:47:20.065978 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerName="account-reaper" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.065990 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerName="account-reaper" Oct 07 12:47:20 crc kubenswrapper[4854]: E1007 12:47:20.066013 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerName="object-auditor" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.066025 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerName="object-auditor" Oct 07 12:47:20 crc kubenswrapper[4854]: E1007 12:47:20.066042 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerName="object-updater" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.066055 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerName="object-updater" Oct 07 12:47:20 crc kubenswrapper[4854]: E1007 12:47:20.066072 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47e6a48a-4ef0-4764-a132-50140d86a6b2" containerName="ceilometer-notification-agent" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.066084 4854 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="47e6a48a-4ef0-4764-a132-50140d86a6b2" containerName="ceilometer-notification-agent" Oct 07 12:47:20 crc kubenswrapper[4854]: E1007 12:47:20.066101 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e6702f3-b113-49f9-b85f-a2d294bac6dc" containerName="ovn-controller" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.066113 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e6702f3-b113-49f9-b85f-a2d294bac6dc" containerName="ovn-controller" Oct 07 12:47:20 crc kubenswrapper[4854]: E1007 12:47:20.066133 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="698aae03-92da-4cc2-a9d2-ecdb5f143439" containerName="openstack-network-exporter" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.066144 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="698aae03-92da-4cc2-a9d2-ecdb5f143439" containerName="openstack-network-exporter" Oct 07 12:47:20 crc kubenswrapper[4854]: E1007 12:47:20.066189 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47e6a48a-4ef0-4764-a132-50140d86a6b2" containerName="ceilometer-central-agent" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.066202 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="47e6a48a-4ef0-4764-a132-50140d86a6b2" containerName="ceilometer-central-agent" Oct 07 12:47:20 crc kubenswrapper[4854]: E1007 12:47:20.066220 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a03c4a0d-6346-43e4-8db1-f653b5dfa420" containerName="placement-log" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.066231 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="a03c4a0d-6346-43e4-8db1-f653b5dfa420" containerName="placement-log" Oct 07 12:47:20 crc kubenswrapper[4854]: E1007 12:47:20.066246 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aaa0738a-daf1-479f-9dbd-913806703370" containerName="mariadb-account-delete" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.066259 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaa0738a-daf1-479f-9dbd-913806703370" containerName="mariadb-account-delete" Oct 07 12:47:20 crc kubenswrapper[4854]: E1007 12:47:20.066282 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerName="account-replicator" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.066294 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerName="account-replicator" Oct 07 12:47:20 crc kubenswrapper[4854]: E1007 12:47:20.066380 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="787f934e-4f31-4b00-8cf6-380efd34aaad" containerName="mariadb-account-delete" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.066396 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="787f934e-4f31-4b00-8cf6-380efd34aaad" containerName="mariadb-account-delete" Oct 07 12:47:20 crc kubenswrapper[4854]: E1007 12:47:20.066413 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerName="rsync" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.066427 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerName="rsync" Oct 07 12:47:20 crc kubenswrapper[4854]: E1007 12:47:20.066450 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="735fa97d-e751-4957-bdae-3ae0b10635d2" containerName="mariadb-account-delete" Oct 07 
12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.066461 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="735fa97d-e751-4957-bdae-3ae0b10635d2" containerName="mariadb-account-delete" Oct 07 12:47:20 crc kubenswrapper[4854]: E1007 12:47:20.066474 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e" containerName="barbican-worker" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.066488 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e" containerName="barbican-worker" Oct 07 12:47:20 crc kubenswrapper[4854]: E1007 12:47:20.066502 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47e6a48a-4ef0-4764-a132-50140d86a6b2" containerName="sg-core" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.066515 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="47e6a48a-4ef0-4764-a132-50140d86a6b2" containerName="sg-core" Oct 07 12:47:20 crc kubenswrapper[4854]: E1007 12:47:20.066534 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04ffb838-3774-402c-9cdf-d11e51fb21e5" containerName="barbican-keystone-listener-log" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.066547 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="04ffb838-3774-402c-9cdf-d11e51fb21e5" containerName="barbican-keystone-listener-log" Oct 07 12:47:20 crc kubenswrapper[4854]: E1007 12:47:20.066566 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04ffb838-3774-402c-9cdf-d11e51fb21e5" containerName="barbican-keystone-listener" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.066578 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="04ffb838-3774-402c-9cdf-d11e51fb21e5" containerName="barbican-keystone-listener" Oct 07 12:47:20 crc kubenswrapper[4854]: E1007 12:47:20.066600 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37dd5983-0d4d-4097-8657-f408e9bc68c0" containerName="nova-api-log" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.066611 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="37dd5983-0d4d-4097-8657-f408e9bc68c0" containerName="nova-api-log" Oct 07 12:47:20 crc kubenswrapper[4854]: E1007 12:47:20.066629 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerName="object-replicator" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.066641 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerName="object-replicator" Oct 07 12:47:20 crc kubenswrapper[4854]: E1007 12:47:20.066661 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7453b38-f6c3-4fe7-b15d-5bd8112dc687" containerName="mariadb-account-delete" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.066673 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7453b38-f6c3-4fe7-b15d-5bd8112dc687" containerName="mariadb-account-delete" Oct 07 12:47:20 crc kubenswrapper[4854]: E1007 12:47:20.066698 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47e6a48a-4ef0-4764-a132-50140d86a6b2" containerName="proxy-httpd" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.066710 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="47e6a48a-4ef0-4764-a132-50140d86a6b2" containerName="proxy-httpd" Oct 07 12:47:20 crc kubenswrapper[4854]: E1007 12:47:20.066726 4854 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e599e18f-63c0-4756-845c-973257921fd0" containerName="galera" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.066737 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="e599e18f-63c0-4756-845c-973257921fd0" containerName="galera" Oct 07 12:47:20 crc kubenswrapper[4854]: E1007 12:47:20.066751 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79513100-48d2-4e7b-ae14-888322cab8f3" containerName="rabbitmq" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.066762 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="79513100-48d2-4e7b-ae14-888322cab8f3" containerName="rabbitmq" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.066997 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="13453d02-4f55-45a7-98be-1cd41c741a3e" containerName="glance-httpd" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.067025 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerName="object-server" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.067042 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="6002f7a4-27d6-4554-a486-87926ebcf57e" containerName="barbican-api" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.067055 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e" containerName="barbican-worker-log" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.067075 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="016c2264-9ba4-48c0-b416-02c468232b6b" containerName="nova-scheduler-scheduler" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.067088 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac535972-fa59-4e7f-818b-345da6937c14" containerName="mariadb-account-delete" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.067103 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="698aae03-92da-4cc2-a9d2-ecdb5f143439" containerName="openstack-network-exporter" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.067215 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerName="account-auditor" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.067233 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="6002f7a4-27d6-4554-a486-87926ebcf57e" containerName="barbican-api-log" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.067248 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="04ffb838-3774-402c-9cdf-d11e51fb21e5" containerName="barbican-keystone-listener-log" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.067268 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="787f934e-4f31-4b00-8cf6-380efd34aaad" containerName="mariadb-account-delete" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.067292 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerName="container-server" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.067311 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="47e6a48a-4ef0-4764-a132-50140d86a6b2" containerName="ceilometer-central-agent" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.067520 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="79513100-48d2-4e7b-ae14-888322cab8f3" containerName="rabbitmq" Oct 07 12:47:20 crc 
kubenswrapper[4854]: I1007 12:47:20.067538 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2cebadb-2142-477a-85b3-53e7c73fa6cc" containerName="memcached" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.067558 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerName="swift-recon-cron" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.067577 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="5179f78d-3a8f-4621-95f3-e147ff8da79f" containerName="barbican-worker-log" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.067595 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="5179f78d-3a8f-4621-95f3-e147ff8da79f" containerName="barbican-worker" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.067613 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="04ffb838-3774-402c-9cdf-d11e51fb21e5" containerName="barbican-keystone-listener" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.067632 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="77bd17d4-00f8-4a1f-ba0a-e57f5cdbea2e" containerName="barbican-worker" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.067650 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="f725ba88-4d40-4eab-890d-e114448fabe9" containerName="galera" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.067667 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="a03c4a0d-6346-43e4-8db1-f653b5dfa420" containerName="placement-log" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.067679 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="47e6a48a-4ef0-4764-a132-50140d86a6b2" containerName="proxy-httpd" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.067694 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerName="container-updater" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.067709 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerName="object-replicator" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.067723 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="47e6a48a-4ef0-4764-a132-50140d86a6b2" containerName="sg-core" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.067740 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="698aae03-92da-4cc2-a9d2-ecdb5f143439" containerName="ovn-northd" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.067753 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerName="object-auditor" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.067771 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="37dd5983-0d4d-4097-8657-f408e9bc68c0" containerName="nova-api-log" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.067784 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="e599e18f-63c0-4756-845c-973257921fd0" containerName="galera" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.067799 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="847eb385-fc80-4568-813d-638dac11d81a" containerName="ovsdb-server" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.067812 4854 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerName="account-server" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.067828 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="9869a8d4-db8e-4aba-82d1-6d02c3cf988e" containerName="keystone-api" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.067850 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e6702f3-b113-49f9-b85f-a2d294bac6dc" containerName="ovn-controller" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.067872 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="a03c4a0d-6346-43e4-8db1-f653b5dfa420" containerName="placement-api" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.067891 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="47e6a48a-4ef0-4764-a132-50140d86a6b2" containerName="ceilometer-notification-agent" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.067951 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="37dd5983-0d4d-4097-8657-f408e9bc68c0" containerName="nova-api-api" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.067971 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerName="rsync" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.067990 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2c26a76-531a-4a6b-ac0f-6aa23680f903" containerName="neutron-httpd" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.068012 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="735fa97d-e751-4957-bdae-3ae0b10635d2" containerName="mariadb-account-delete" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.068029 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7453b38-f6c3-4fe7-b15d-5bd8112dc687" containerName="mariadb-account-delete" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.068044 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c47fe71-3b92-4490-8488-d98d0e25519e" containerName="mariadb-account-delete" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.068065 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerName="account-replicator" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.068080 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="847eb385-fc80-4568-813d-638dac11d81a" containerName="ovs-vswitchd" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.068099 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerName="container-replicator" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.068110 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="e309be64-7a5a-4156-89a6-d1201eaaff63" containerName="kube-state-metrics" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.068124 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerName="object-updater" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.068137 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2c26a76-531a-4a6b-ac0f-6aa23680f903" containerName="neutron-api" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.068189 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="66f80399-ed98-4aba-9db5-759ad2e314fa" containerName="mariadb-account-delete" 
Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.068211 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="13453d02-4f55-45a7-98be-1cd41c741a3e" containerName="glance-log" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.068230 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="aaa0738a-daf1-479f-9dbd-913806703370" containerName="mariadb-account-delete" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.068251 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="21065050-7bdc-4f4e-9a7b-9dbcc2dab200" containerName="cinder-scheduler" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.068265 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerName="account-reaper" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.068293 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerName="container-auditor" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.068313 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c293f13-b2a5-4d4b-9f69-fd118e34eab2" containerName="rabbitmq" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.068324 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="cac73de2-996a-4e04-abde-1153b44058bc" containerName="nova-cell1-conductor-conductor" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.068343 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="21065050-7bdc-4f4e-9a7b-9dbcc2dab200" containerName="probe" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.068362 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f9410d0-f08a-4288-901b-8c28b54f6d53" containerName="object-expirer" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.072081 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mcv7r" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.081332 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mcv7r"] Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.192616 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hplqp\" (UniqueName: \"kubernetes.io/projected/99150e60-b286-431a-8f36-0c22ab89715c-kube-api-access-hplqp\") pod \"redhat-marketplace-mcv7r\" (UID: \"99150e60-b286-431a-8f36-0c22ab89715c\") " pod="openshift-marketplace/redhat-marketplace-mcv7r" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.192674 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99150e60-b286-431a-8f36-0c22ab89715c-catalog-content\") pod \"redhat-marketplace-mcv7r\" (UID: \"99150e60-b286-431a-8f36-0c22ab89715c\") " pod="openshift-marketplace/redhat-marketplace-mcv7r" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.192700 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99150e60-b286-431a-8f36-0c22ab89715c-utilities\") pod \"redhat-marketplace-mcv7r\" (UID: \"99150e60-b286-431a-8f36-0c22ab89715c\") " pod="openshift-marketplace/redhat-marketplace-mcv7r" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.293842 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hplqp\" (UniqueName: \"kubernetes.io/projected/99150e60-b286-431a-8f36-0c22ab89715c-kube-api-access-hplqp\") pod \"redhat-marketplace-mcv7r\" (UID: \"99150e60-b286-431a-8f36-0c22ab89715c\") " pod="openshift-marketplace/redhat-marketplace-mcv7r" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.293922 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99150e60-b286-431a-8f36-0c22ab89715c-catalog-content\") pod \"redhat-marketplace-mcv7r\" (UID: \"99150e60-b286-431a-8f36-0c22ab89715c\") " pod="openshift-marketplace/redhat-marketplace-mcv7r" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.293980 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99150e60-b286-431a-8f36-0c22ab89715c-utilities\") pod \"redhat-marketplace-mcv7r\" (UID: \"99150e60-b286-431a-8f36-0c22ab89715c\") " pod="openshift-marketplace/redhat-marketplace-mcv7r" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.294916 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99150e60-b286-431a-8f36-0c22ab89715c-utilities\") pod \"redhat-marketplace-mcv7r\" (UID: \"99150e60-b286-431a-8f36-0c22ab89715c\") " pod="openshift-marketplace/redhat-marketplace-mcv7r" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.295046 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99150e60-b286-431a-8f36-0c22ab89715c-catalog-content\") pod \"redhat-marketplace-mcv7r\" (UID: \"99150e60-b286-431a-8f36-0c22ab89715c\") " pod="openshift-marketplace/redhat-marketplace-mcv7r" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.327206 4854 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-hplqp\" (UniqueName: \"kubernetes.io/projected/99150e60-b286-431a-8f36-0c22ab89715c-kube-api-access-hplqp\") pod \"redhat-marketplace-mcv7r\" (UID: \"99150e60-b286-431a-8f36-0c22ab89715c\") " pod="openshift-marketplace/redhat-marketplace-mcv7r" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.396861 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mcv7r" Oct 07 12:47:20 crc kubenswrapper[4854]: I1007 12:47:20.660237 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mcv7r"] Oct 07 12:47:21 crc kubenswrapper[4854]: I1007 12:47:21.548992 4854 generic.go:334] "Generic (PLEG): container finished" podID="99150e60-b286-431a-8f36-0c22ab89715c" containerID="4c27d8e8f6a45b70fb1b85b5eceabe620cc1c40392b68d21dc4b56adabe1de21" exitCode=0 Oct 07 12:47:21 crc kubenswrapper[4854]: I1007 12:47:21.549086 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mcv7r" event={"ID":"99150e60-b286-431a-8f36-0c22ab89715c","Type":"ContainerDied","Data":"4c27d8e8f6a45b70fb1b85b5eceabe620cc1c40392b68d21dc4b56adabe1de21"} Oct 07 12:47:21 crc kubenswrapper[4854]: I1007 12:47:21.549567 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mcv7r" event={"ID":"99150e60-b286-431a-8f36-0c22ab89715c","Type":"ContainerStarted","Data":"3cb2cd2a17fe2dc0ce51f77329f54aa1a7226837f3ffadbf7456df95d470e6c8"} Oct 07 12:47:23 crc kubenswrapper[4854]: I1007 12:47:23.576632 4854 generic.go:334] "Generic (PLEG): container finished" podID="99150e60-b286-431a-8f36-0c22ab89715c" containerID="0688583cf48f483af905a32475a52808fd78d26e4007aeead0bfbc2e2c46d8ae" exitCode=0 Oct 07 12:47:23 crc kubenswrapper[4854]: I1007 12:47:23.576753 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mcv7r" event={"ID":"99150e60-b286-431a-8f36-0c22ab89715c","Type":"ContainerDied","Data":"0688583cf48f483af905a32475a52808fd78d26e4007aeead0bfbc2e2c46d8ae"} Oct 07 12:47:24 crc kubenswrapper[4854]: I1007 12:47:24.591767 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mcv7r" event={"ID":"99150e60-b286-431a-8f36-0c22ab89715c","Type":"ContainerStarted","Data":"21b6f882c22977b9f0a0645efa9c6822f245a2bdbbde979fd5171aa4b91668e8"} Oct 07 12:47:24 crc kubenswrapper[4854]: I1007 12:47:24.613400 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mcv7r" podStartSLOduration=1.932867505 podStartE2EDuration="4.613377944s" podCreationTimestamp="2025-10-07 12:47:20 +0000 UTC" firstStartedPulling="2025-10-07 12:47:21.552919451 +0000 UTC m=+1357.540751746" lastFinishedPulling="2025-10-07 12:47:24.23342992 +0000 UTC m=+1360.221262185" observedRunningTime="2025-10-07 12:47:24.61219384 +0000 UTC m=+1360.600026125" watchObservedRunningTime="2025-10-07 12:47:24.613377944 +0000 UTC m=+1360.601210209" Oct 07 12:47:30 crc kubenswrapper[4854]: I1007 12:47:30.397580 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mcv7r" Oct 07 12:47:30 crc kubenswrapper[4854]: I1007 12:47:30.398421 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mcv7r" Oct 07 12:47:30 crc kubenswrapper[4854]: I1007 12:47:30.461872 4854 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mcv7r" Oct 07 12:47:30 crc kubenswrapper[4854]: I1007 12:47:30.725511 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mcv7r" Oct 07 12:47:30 crc kubenswrapper[4854]: I1007 12:47:30.777439 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mcv7r"] Oct 07 12:47:32 crc kubenswrapper[4854]: I1007 12:47:32.687198 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mcv7r" podUID="99150e60-b286-431a-8f36-0c22ab89715c" containerName="registry-server" containerID="cri-o://21b6f882c22977b9f0a0645efa9c6822f245a2bdbbde979fd5171aa4b91668e8" gracePeriod=2 Oct 07 12:47:33 crc kubenswrapper[4854]: I1007 12:47:33.637796 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mcv7r" Oct 07 12:47:33 crc kubenswrapper[4854]: I1007 12:47:33.700032 4854 generic.go:334] "Generic (PLEG): container finished" podID="99150e60-b286-431a-8f36-0c22ab89715c" containerID="21b6f882c22977b9f0a0645efa9c6822f245a2bdbbde979fd5171aa4b91668e8" exitCode=0 Oct 07 12:47:33 crc kubenswrapper[4854]: I1007 12:47:33.700086 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mcv7r" event={"ID":"99150e60-b286-431a-8f36-0c22ab89715c","Type":"ContainerDied","Data":"21b6f882c22977b9f0a0645efa9c6822f245a2bdbbde979fd5171aa4b91668e8"} Oct 07 12:47:33 crc kubenswrapper[4854]: I1007 12:47:33.700102 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mcv7r" Oct 07 12:47:33 crc kubenswrapper[4854]: I1007 12:47:33.700181 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mcv7r" event={"ID":"99150e60-b286-431a-8f36-0c22ab89715c","Type":"ContainerDied","Data":"3cb2cd2a17fe2dc0ce51f77329f54aa1a7226837f3ffadbf7456df95d470e6c8"} Oct 07 12:47:33 crc kubenswrapper[4854]: I1007 12:47:33.700214 4854 scope.go:117] "RemoveContainer" containerID="21b6f882c22977b9f0a0645efa9c6822f245a2bdbbde979fd5171aa4b91668e8" Oct 07 12:47:33 crc kubenswrapper[4854]: I1007 12:47:33.727243 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99150e60-b286-431a-8f36-0c22ab89715c-catalog-content\") pod \"99150e60-b286-431a-8f36-0c22ab89715c\" (UID: \"99150e60-b286-431a-8f36-0c22ab89715c\") " Oct 07 12:47:33 crc kubenswrapper[4854]: I1007 12:47:33.727295 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hplqp\" (UniqueName: \"kubernetes.io/projected/99150e60-b286-431a-8f36-0c22ab89715c-kube-api-access-hplqp\") pod \"99150e60-b286-431a-8f36-0c22ab89715c\" (UID: \"99150e60-b286-431a-8f36-0c22ab89715c\") " Oct 07 12:47:33 crc kubenswrapper[4854]: I1007 12:47:33.727372 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99150e60-b286-431a-8f36-0c22ab89715c-utilities\") pod \"99150e60-b286-431a-8f36-0c22ab89715c\" (UID: \"99150e60-b286-431a-8f36-0c22ab89715c\") " Oct 07 12:47:33 crc kubenswrapper[4854]: I1007 12:47:33.732992 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/99150e60-b286-431a-8f36-0c22ab89715c-utilities" (OuterVolumeSpecName: "utilities") pod "99150e60-b286-431a-8f36-0c22ab89715c" (UID: "99150e60-b286-431a-8f36-0c22ab89715c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:47:33 crc kubenswrapper[4854]: I1007 12:47:33.739386 4854 scope.go:117] "RemoveContainer" containerID="0688583cf48f483af905a32475a52808fd78d26e4007aeead0bfbc2e2c46d8ae" Oct 07 12:47:33 crc kubenswrapper[4854]: I1007 12:47:33.740561 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99150e60-b286-431a-8f36-0c22ab89715c-kube-api-access-hplqp" (OuterVolumeSpecName: "kube-api-access-hplqp") pod "99150e60-b286-431a-8f36-0c22ab89715c" (UID: "99150e60-b286-431a-8f36-0c22ab89715c"). InnerVolumeSpecName "kube-api-access-hplqp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:47:33 crc kubenswrapper[4854]: I1007 12:47:33.773684 4854 scope.go:117] "RemoveContainer" containerID="4c27d8e8f6a45b70fb1b85b5eceabe620cc1c40392b68d21dc4b56adabe1de21" Oct 07 12:47:33 crc kubenswrapper[4854]: I1007 12:47:33.801315 4854 scope.go:117] "RemoveContainer" containerID="21b6f882c22977b9f0a0645efa9c6822f245a2bdbbde979fd5171aa4b91668e8" Oct 07 12:47:33 crc kubenswrapper[4854]: E1007 12:47:33.801769 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21b6f882c22977b9f0a0645efa9c6822f245a2bdbbde979fd5171aa4b91668e8\": container with ID starting with 21b6f882c22977b9f0a0645efa9c6822f245a2bdbbde979fd5171aa4b91668e8 not found: ID does not exist" containerID="21b6f882c22977b9f0a0645efa9c6822f245a2bdbbde979fd5171aa4b91668e8" Oct 07 12:47:33 crc kubenswrapper[4854]: I1007 12:47:33.801806 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21b6f882c22977b9f0a0645efa9c6822f245a2bdbbde979fd5171aa4b91668e8"} err="failed to get container status \"21b6f882c22977b9f0a0645efa9c6822f245a2bdbbde979fd5171aa4b91668e8\": rpc error: code = NotFound desc = could not find container \"21b6f882c22977b9f0a0645efa9c6822f245a2bdbbde979fd5171aa4b91668e8\": container with ID starting with 21b6f882c22977b9f0a0645efa9c6822f245a2bdbbde979fd5171aa4b91668e8 not found: ID does not exist" Oct 07 12:47:33 crc kubenswrapper[4854]: I1007 12:47:33.801825 4854 scope.go:117] "RemoveContainer" containerID="0688583cf48f483af905a32475a52808fd78d26e4007aeead0bfbc2e2c46d8ae" Oct 07 12:47:33 crc kubenswrapper[4854]: E1007 12:47:33.802107 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0688583cf48f483af905a32475a52808fd78d26e4007aeead0bfbc2e2c46d8ae\": container with ID starting with 0688583cf48f483af905a32475a52808fd78d26e4007aeead0bfbc2e2c46d8ae not found: ID does not exist" containerID="0688583cf48f483af905a32475a52808fd78d26e4007aeead0bfbc2e2c46d8ae" Oct 07 12:47:33 crc kubenswrapper[4854]: I1007 12:47:33.802132 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0688583cf48f483af905a32475a52808fd78d26e4007aeead0bfbc2e2c46d8ae"} err="failed to get container status \"0688583cf48f483af905a32475a52808fd78d26e4007aeead0bfbc2e2c46d8ae\": rpc error: code = NotFound desc = could not find container \"0688583cf48f483af905a32475a52808fd78d26e4007aeead0bfbc2e2c46d8ae\": container with ID starting with 0688583cf48f483af905a32475a52808fd78d26e4007aeead0bfbc2e2c46d8ae 
not found: ID does not exist" Oct 07 12:47:33 crc kubenswrapper[4854]: I1007 12:47:33.802162 4854 scope.go:117] "RemoveContainer" containerID="4c27d8e8f6a45b70fb1b85b5eceabe620cc1c40392b68d21dc4b56adabe1de21" Oct 07 12:47:33 crc kubenswrapper[4854]: E1007 12:47:33.802497 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c27d8e8f6a45b70fb1b85b5eceabe620cc1c40392b68d21dc4b56adabe1de21\": container with ID starting with 4c27d8e8f6a45b70fb1b85b5eceabe620cc1c40392b68d21dc4b56adabe1de21 not found: ID does not exist" containerID="4c27d8e8f6a45b70fb1b85b5eceabe620cc1c40392b68d21dc4b56adabe1de21" Oct 07 12:47:33 crc kubenswrapper[4854]: I1007 12:47:33.802555 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c27d8e8f6a45b70fb1b85b5eceabe620cc1c40392b68d21dc4b56adabe1de21"} err="failed to get container status \"4c27d8e8f6a45b70fb1b85b5eceabe620cc1c40392b68d21dc4b56adabe1de21\": rpc error: code = NotFound desc = could not find container \"4c27d8e8f6a45b70fb1b85b5eceabe620cc1c40392b68d21dc4b56adabe1de21\": container with ID starting with 4c27d8e8f6a45b70fb1b85b5eceabe620cc1c40392b68d21dc4b56adabe1de21 not found: ID does not exist" Oct 07 12:47:33 crc kubenswrapper[4854]: I1007 12:47:33.828064 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99150e60-b286-431a-8f36-0c22ab89715c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "99150e60-b286-431a-8f36-0c22ab89715c" (UID: "99150e60-b286-431a-8f36-0c22ab89715c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:47:33 crc kubenswrapper[4854]: I1007 12:47:33.828961 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99150e60-b286-431a-8f36-0c22ab89715c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 12:47:33 crc kubenswrapper[4854]: I1007 12:47:33.828985 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hplqp\" (UniqueName: \"kubernetes.io/projected/99150e60-b286-431a-8f36-0c22ab89715c-kube-api-access-hplqp\") on node \"crc\" DevicePath \"\"" Oct 07 12:47:33 crc kubenswrapper[4854]: I1007 12:47:33.828997 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99150e60-b286-431a-8f36-0c22ab89715c-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 12:47:34 crc kubenswrapper[4854]: I1007 12:47:34.066982 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mcv7r"] Oct 07 12:47:34 crc kubenswrapper[4854]: I1007 12:47:34.076421 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mcv7r"] Oct 07 12:47:34 crc kubenswrapper[4854]: I1007 12:47:34.718821 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99150e60-b286-431a-8f36-0c22ab89715c" path="/var/lib/kubelet/pods/99150e60-b286-431a-8f36-0c22ab89715c/volumes" Oct 07 12:47:57 crc kubenswrapper[4854]: I1007 12:47:57.290698 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rh2d9"] Oct 07 12:47:57 crc kubenswrapper[4854]: E1007 12:47:57.291767 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99150e60-b286-431a-8f36-0c22ab89715c" containerName="extract-utilities" Oct 07 12:47:57 crc kubenswrapper[4854]: I1007 
12:47:57.291788 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="99150e60-b286-431a-8f36-0c22ab89715c" containerName="extract-utilities" Oct 07 12:47:57 crc kubenswrapper[4854]: E1007 12:47:57.291813 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99150e60-b286-431a-8f36-0c22ab89715c" containerName="registry-server" Oct 07 12:47:57 crc kubenswrapper[4854]: I1007 12:47:57.291827 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="99150e60-b286-431a-8f36-0c22ab89715c" containerName="registry-server" Oct 07 12:47:57 crc kubenswrapper[4854]: E1007 12:47:57.291843 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99150e60-b286-431a-8f36-0c22ab89715c" containerName="extract-content" Oct 07 12:47:57 crc kubenswrapper[4854]: I1007 12:47:57.291856 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="99150e60-b286-431a-8f36-0c22ab89715c" containerName="extract-content" Oct 07 12:47:57 crc kubenswrapper[4854]: I1007 12:47:57.292142 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="99150e60-b286-431a-8f36-0c22ab89715c" containerName="registry-server" Oct 07 12:47:57 crc kubenswrapper[4854]: I1007 12:47:57.296233 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rh2d9" Oct 07 12:47:57 crc kubenswrapper[4854]: I1007 12:47:57.300012 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rh2d9"] Oct 07 12:47:57 crc kubenswrapper[4854]: I1007 12:47:57.431381 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwbpv\" (UniqueName: \"kubernetes.io/projected/9f298e6d-9539-454c-90b8-2792b0fd10fc-kube-api-access-vwbpv\") pod \"community-operators-rh2d9\" (UID: \"9f298e6d-9539-454c-90b8-2792b0fd10fc\") " pod="openshift-marketplace/community-operators-rh2d9" Oct 07 12:47:57 crc kubenswrapper[4854]: I1007 12:47:57.431499 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f298e6d-9539-454c-90b8-2792b0fd10fc-catalog-content\") pod \"community-operators-rh2d9\" (UID: \"9f298e6d-9539-454c-90b8-2792b0fd10fc\") " pod="openshift-marketplace/community-operators-rh2d9" Oct 07 12:47:57 crc kubenswrapper[4854]: I1007 12:47:57.431600 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f298e6d-9539-454c-90b8-2792b0fd10fc-utilities\") pod \"community-operators-rh2d9\" (UID: \"9f298e6d-9539-454c-90b8-2792b0fd10fc\") " pod="openshift-marketplace/community-operators-rh2d9" Oct 07 12:47:57 crc kubenswrapper[4854]: I1007 12:47:57.532927 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwbpv\" (UniqueName: \"kubernetes.io/projected/9f298e6d-9539-454c-90b8-2792b0fd10fc-kube-api-access-vwbpv\") pod \"community-operators-rh2d9\" (UID: \"9f298e6d-9539-454c-90b8-2792b0fd10fc\") " pod="openshift-marketplace/community-operators-rh2d9" Oct 07 12:47:57 crc kubenswrapper[4854]: I1007 12:47:57.533268 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f298e6d-9539-454c-90b8-2792b0fd10fc-catalog-content\") pod \"community-operators-rh2d9\" (UID: \"9f298e6d-9539-454c-90b8-2792b0fd10fc\") " 
pod="openshift-marketplace/community-operators-rh2d9" Oct 07 12:47:57 crc kubenswrapper[4854]: I1007 12:47:57.533309 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f298e6d-9539-454c-90b8-2792b0fd10fc-utilities\") pod \"community-operators-rh2d9\" (UID: \"9f298e6d-9539-454c-90b8-2792b0fd10fc\") " pod="openshift-marketplace/community-operators-rh2d9" Oct 07 12:47:57 crc kubenswrapper[4854]: I1007 12:47:57.533817 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f298e6d-9539-454c-90b8-2792b0fd10fc-catalog-content\") pod \"community-operators-rh2d9\" (UID: \"9f298e6d-9539-454c-90b8-2792b0fd10fc\") " pod="openshift-marketplace/community-operators-rh2d9" Oct 07 12:47:57 crc kubenswrapper[4854]: I1007 12:47:57.533875 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f298e6d-9539-454c-90b8-2792b0fd10fc-utilities\") pod \"community-operators-rh2d9\" (UID: \"9f298e6d-9539-454c-90b8-2792b0fd10fc\") " pod="openshift-marketplace/community-operators-rh2d9" Oct 07 12:47:57 crc kubenswrapper[4854]: I1007 12:47:57.552375 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwbpv\" (UniqueName: \"kubernetes.io/projected/9f298e6d-9539-454c-90b8-2792b0fd10fc-kube-api-access-vwbpv\") pod \"community-operators-rh2d9\" (UID: \"9f298e6d-9539-454c-90b8-2792b0fd10fc\") " pod="openshift-marketplace/community-operators-rh2d9" Oct 07 12:47:57 crc kubenswrapper[4854]: I1007 12:47:57.643985 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rh2d9" Oct 07 12:47:58 crc kubenswrapper[4854]: I1007 12:47:58.162913 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rh2d9"] Oct 07 12:47:58 crc kubenswrapper[4854]: I1007 12:47:58.987435 4854 generic.go:334] "Generic (PLEG): container finished" podID="9f298e6d-9539-454c-90b8-2792b0fd10fc" containerID="0f5e14d60f0499d27481534ca9b69b1a59d9210d677c25afa60af3629e3faa37" exitCode=0 Oct 07 12:47:58 crc kubenswrapper[4854]: I1007 12:47:58.987514 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rh2d9" event={"ID":"9f298e6d-9539-454c-90b8-2792b0fd10fc","Type":"ContainerDied","Data":"0f5e14d60f0499d27481534ca9b69b1a59d9210d677c25afa60af3629e3faa37"} Oct 07 12:47:58 crc kubenswrapper[4854]: I1007 12:47:58.987554 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rh2d9" event={"ID":"9f298e6d-9539-454c-90b8-2792b0fd10fc","Type":"ContainerStarted","Data":"0b344fb97be6dba3a24643ba3e4076b2e14d08bb3403b9dd638da443175b12c1"} Oct 07 12:48:01 crc kubenswrapper[4854]: I1007 12:48:01.008065 4854 generic.go:334] "Generic (PLEG): container finished" podID="9f298e6d-9539-454c-90b8-2792b0fd10fc" containerID="6e575e9488ac0070a3bf7ec2f748c30fe38dd139a530e79231f8d3519dcaa973" exitCode=0 Oct 07 12:48:01 crc kubenswrapper[4854]: I1007 12:48:01.008191 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rh2d9" event={"ID":"9f298e6d-9539-454c-90b8-2792b0fd10fc","Type":"ContainerDied","Data":"6e575e9488ac0070a3bf7ec2f748c30fe38dd139a530e79231f8d3519dcaa973"} Oct 07 12:48:02 crc kubenswrapper[4854]: I1007 12:48:02.018212 4854 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-rh2d9" event={"ID":"9f298e6d-9539-454c-90b8-2792b0fd10fc","Type":"ContainerStarted","Data":"98a032ba2843e454f5507265a2ac9372cea464fe703264f7629bd9265bff3aca"} Oct 07 12:48:02 crc kubenswrapper[4854]: I1007 12:48:02.041100 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rh2d9" podStartSLOduration=2.616295977 podStartE2EDuration="5.041083936s" podCreationTimestamp="2025-10-07 12:47:57 +0000 UTC" firstStartedPulling="2025-10-07 12:47:58.989376002 +0000 UTC m=+1394.977208297" lastFinishedPulling="2025-10-07 12:48:01.414164001 +0000 UTC m=+1397.401996256" observedRunningTime="2025-10-07 12:48:02.03777267 +0000 UTC m=+1398.025605005" watchObservedRunningTime="2025-10-07 12:48:02.041083936 +0000 UTC m=+1398.028916191" Oct 07 12:48:07 crc kubenswrapper[4854]: I1007 12:48:07.644609 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rh2d9" Oct 07 12:48:07 crc kubenswrapper[4854]: I1007 12:48:07.645411 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rh2d9" Oct 07 12:48:07 crc kubenswrapper[4854]: I1007 12:48:07.706130 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rh2d9" Oct 07 12:48:08 crc kubenswrapper[4854]: I1007 12:48:08.113705 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rh2d9" Oct 07 12:48:08 crc kubenswrapper[4854]: I1007 12:48:08.171711 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rh2d9"] Oct 07 12:48:10 crc kubenswrapper[4854]: I1007 12:48:10.087293 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rh2d9" podUID="9f298e6d-9539-454c-90b8-2792b0fd10fc" containerName="registry-server" containerID="cri-o://98a032ba2843e454f5507265a2ac9372cea464fe703264f7629bd9265bff3aca" gracePeriod=2 Oct 07 12:48:10 crc kubenswrapper[4854]: I1007 12:48:10.484988 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rh2d9" Oct 07 12:48:10 crc kubenswrapper[4854]: I1007 12:48:10.615281 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f298e6d-9539-454c-90b8-2792b0fd10fc-catalog-content\") pod \"9f298e6d-9539-454c-90b8-2792b0fd10fc\" (UID: \"9f298e6d-9539-454c-90b8-2792b0fd10fc\") " Oct 07 12:48:10 crc kubenswrapper[4854]: I1007 12:48:10.615375 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f298e6d-9539-454c-90b8-2792b0fd10fc-utilities\") pod \"9f298e6d-9539-454c-90b8-2792b0fd10fc\" (UID: \"9f298e6d-9539-454c-90b8-2792b0fd10fc\") " Oct 07 12:48:10 crc kubenswrapper[4854]: I1007 12:48:10.615442 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwbpv\" (UniqueName: \"kubernetes.io/projected/9f298e6d-9539-454c-90b8-2792b0fd10fc-kube-api-access-vwbpv\") pod \"9f298e6d-9539-454c-90b8-2792b0fd10fc\" (UID: \"9f298e6d-9539-454c-90b8-2792b0fd10fc\") " Oct 07 12:48:10 crc kubenswrapper[4854]: I1007 12:48:10.617567 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f298e6d-9539-454c-90b8-2792b0fd10fc-utilities" (OuterVolumeSpecName: "utilities") pod "9f298e6d-9539-454c-90b8-2792b0fd10fc" (UID: "9f298e6d-9539-454c-90b8-2792b0fd10fc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:48:10 crc kubenswrapper[4854]: I1007 12:48:10.627819 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f298e6d-9539-454c-90b8-2792b0fd10fc-kube-api-access-vwbpv" (OuterVolumeSpecName: "kube-api-access-vwbpv") pod "9f298e6d-9539-454c-90b8-2792b0fd10fc" (UID: "9f298e6d-9539-454c-90b8-2792b0fd10fc"). InnerVolumeSpecName "kube-api-access-vwbpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:48:10 crc kubenswrapper[4854]: I1007 12:48:10.716995 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f298e6d-9539-454c-90b8-2792b0fd10fc-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 12:48:10 crc kubenswrapper[4854]: I1007 12:48:10.717047 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwbpv\" (UniqueName: \"kubernetes.io/projected/9f298e6d-9539-454c-90b8-2792b0fd10fc-kube-api-access-vwbpv\") on node \"crc\" DevicePath \"\"" Oct 07 12:48:10 crc kubenswrapper[4854]: I1007 12:48:10.807460 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 12:48:10 crc kubenswrapper[4854]: I1007 12:48:10.807543 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 12:48:10 crc kubenswrapper[4854]: I1007 12:48:10.902922 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f298e6d-9539-454c-90b8-2792b0fd10fc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9f298e6d-9539-454c-90b8-2792b0fd10fc" (UID: "9f298e6d-9539-454c-90b8-2792b0fd10fc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:48:10 crc kubenswrapper[4854]: I1007 12:48:10.921499 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f298e6d-9539-454c-90b8-2792b0fd10fc-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 12:48:11 crc kubenswrapper[4854]: I1007 12:48:11.099857 4854 generic.go:334] "Generic (PLEG): container finished" podID="9f298e6d-9539-454c-90b8-2792b0fd10fc" containerID="98a032ba2843e454f5507265a2ac9372cea464fe703264f7629bd9265bff3aca" exitCode=0 Oct 07 12:48:11 crc kubenswrapper[4854]: I1007 12:48:11.099920 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rh2d9" event={"ID":"9f298e6d-9539-454c-90b8-2792b0fd10fc","Type":"ContainerDied","Data":"98a032ba2843e454f5507265a2ac9372cea464fe703264f7629bd9265bff3aca"} Oct 07 12:48:11 crc kubenswrapper[4854]: I1007 12:48:11.099945 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rh2d9" Oct 07 12:48:11 crc kubenswrapper[4854]: I1007 12:48:11.099971 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rh2d9" event={"ID":"9f298e6d-9539-454c-90b8-2792b0fd10fc","Type":"ContainerDied","Data":"0b344fb97be6dba3a24643ba3e4076b2e14d08bb3403b9dd638da443175b12c1"} Oct 07 12:48:11 crc kubenswrapper[4854]: I1007 12:48:11.100001 4854 scope.go:117] "RemoveContainer" containerID="98a032ba2843e454f5507265a2ac9372cea464fe703264f7629bd9265bff3aca" Oct 07 12:48:11 crc kubenswrapper[4854]: I1007 12:48:11.129068 4854 scope.go:117] "RemoveContainer" containerID="6e575e9488ac0070a3bf7ec2f748c30fe38dd139a530e79231f8d3519dcaa973" Oct 07 12:48:11 crc kubenswrapper[4854]: I1007 12:48:11.153346 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rh2d9"] Oct 07 12:48:11 crc kubenswrapper[4854]: I1007 12:48:11.161646 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rh2d9"] Oct 07 12:48:11 crc kubenswrapper[4854]: I1007 12:48:11.165533 4854 scope.go:117] "RemoveContainer" containerID="0f5e14d60f0499d27481534ca9b69b1a59d9210d677c25afa60af3629e3faa37" Oct 07 12:48:11 crc kubenswrapper[4854]: I1007 12:48:11.200811 4854 scope.go:117] "RemoveContainer" containerID="98a032ba2843e454f5507265a2ac9372cea464fe703264f7629bd9265bff3aca" Oct 07 12:48:11 crc kubenswrapper[4854]: E1007 12:48:11.201306 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98a032ba2843e454f5507265a2ac9372cea464fe703264f7629bd9265bff3aca\": container with ID starting with 98a032ba2843e454f5507265a2ac9372cea464fe703264f7629bd9265bff3aca not found: ID does not exist" containerID="98a032ba2843e454f5507265a2ac9372cea464fe703264f7629bd9265bff3aca" Oct 07 12:48:11 crc kubenswrapper[4854]: I1007 12:48:11.201344 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98a032ba2843e454f5507265a2ac9372cea464fe703264f7629bd9265bff3aca"} err="failed to get container status \"98a032ba2843e454f5507265a2ac9372cea464fe703264f7629bd9265bff3aca\": rpc error: code = NotFound desc = could not find container \"98a032ba2843e454f5507265a2ac9372cea464fe703264f7629bd9265bff3aca\": container with ID starting with 98a032ba2843e454f5507265a2ac9372cea464fe703264f7629bd9265bff3aca not found: ID does not exist" Oct 07 12:48:11 crc kubenswrapper[4854]: I1007 12:48:11.201368 4854 scope.go:117] "RemoveContainer" containerID="6e575e9488ac0070a3bf7ec2f748c30fe38dd139a530e79231f8d3519dcaa973" Oct 07 12:48:11 crc kubenswrapper[4854]: E1007 12:48:11.201809 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e575e9488ac0070a3bf7ec2f748c30fe38dd139a530e79231f8d3519dcaa973\": container with ID starting with 6e575e9488ac0070a3bf7ec2f748c30fe38dd139a530e79231f8d3519dcaa973 not found: ID does not exist" containerID="6e575e9488ac0070a3bf7ec2f748c30fe38dd139a530e79231f8d3519dcaa973" Oct 07 12:48:11 crc kubenswrapper[4854]: I1007 12:48:11.201828 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e575e9488ac0070a3bf7ec2f748c30fe38dd139a530e79231f8d3519dcaa973"} err="failed to get container status \"6e575e9488ac0070a3bf7ec2f748c30fe38dd139a530e79231f8d3519dcaa973\": rpc error: code = NotFound desc = could not find 
container \"6e575e9488ac0070a3bf7ec2f748c30fe38dd139a530e79231f8d3519dcaa973\": container with ID starting with 6e575e9488ac0070a3bf7ec2f748c30fe38dd139a530e79231f8d3519dcaa973 not found: ID does not exist" Oct 07 12:48:11 crc kubenswrapper[4854]: I1007 12:48:11.201843 4854 scope.go:117] "RemoveContainer" containerID="0f5e14d60f0499d27481534ca9b69b1a59d9210d677c25afa60af3629e3faa37" Oct 07 12:48:11 crc kubenswrapper[4854]: E1007 12:48:11.202125 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f5e14d60f0499d27481534ca9b69b1a59d9210d677c25afa60af3629e3faa37\": container with ID starting with 0f5e14d60f0499d27481534ca9b69b1a59d9210d677c25afa60af3629e3faa37 not found: ID does not exist" containerID="0f5e14d60f0499d27481534ca9b69b1a59d9210d677c25afa60af3629e3faa37" Oct 07 12:48:11 crc kubenswrapper[4854]: I1007 12:48:11.202141 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f5e14d60f0499d27481534ca9b69b1a59d9210d677c25afa60af3629e3faa37"} err="failed to get container status \"0f5e14d60f0499d27481534ca9b69b1a59d9210d677c25afa60af3629e3faa37\": rpc error: code = NotFound desc = could not find container \"0f5e14d60f0499d27481534ca9b69b1a59d9210d677c25afa60af3629e3faa37\": container with ID starting with 0f5e14d60f0499d27481534ca9b69b1a59d9210d677c25afa60af3629e3faa37 not found: ID does not exist" Oct 07 12:48:12 crc kubenswrapper[4854]: I1007 12:48:12.717959 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f298e6d-9539-454c-90b8-2792b0fd10fc" path="/var/lib/kubelet/pods/9f298e6d-9539-454c-90b8-2792b0fd10fc/volumes" Oct 07 12:48:17 crc kubenswrapper[4854]: I1007 12:48:17.031113 4854 scope.go:117] "RemoveContainer" containerID="1e97e61af866a59485cfb0388fedb4446c940191621505d289d2386b79d94bba" Oct 07 12:48:17 crc kubenswrapper[4854]: I1007 12:48:17.062091 4854 scope.go:117] "RemoveContainer" containerID="4c83e5b21d149d18964b200c85f6be36b591a3e55657f010be96d19c43e54add" Oct 07 12:48:17 crc kubenswrapper[4854]: I1007 12:48:17.124435 4854 scope.go:117] "RemoveContainer" containerID="cd9b6d596186c3b7a5af7dc88f6515e0c483d24ba77e4029222b5d08fd5ab2b3" Oct 07 12:48:17 crc kubenswrapper[4854]: I1007 12:48:17.145129 4854 scope.go:117] "RemoveContainer" containerID="22df39064c8d3fc8a015bb827b717afaeaa993111ac4bb03c31fa5e53e7438f3" Oct 07 12:48:17 crc kubenswrapper[4854]: I1007 12:48:17.185627 4854 scope.go:117] "RemoveContainer" containerID="9a8de10acfbda038f5f2220e688063f4e5b34be8fcfd2fa26c8f493b2c99adb4" Oct 07 12:48:17 crc kubenswrapper[4854]: I1007 12:48:17.208074 4854 scope.go:117] "RemoveContainer" containerID="ca0db39e60e9d204fd72ee919b6b446ed0dae75bed3ce0f642c58490cabd0159" Oct 07 12:48:17 crc kubenswrapper[4854]: I1007 12:48:17.234499 4854 scope.go:117] "RemoveContainer" containerID="214457d553a0c0b56dcc7fb564ba8c1329e444532bf12a604618bd94f53179fc" Oct 07 12:48:17 crc kubenswrapper[4854]: I1007 12:48:17.279486 4854 scope.go:117] "RemoveContainer" containerID="6f2b3d88892a064ad90556a7c9f95263f2d028c6a003fde0f818fd8da3809ec3" Oct 07 12:48:17 crc kubenswrapper[4854]: I1007 12:48:17.312085 4854 scope.go:117] "RemoveContainer" containerID="71c7c2899fa0ebe6be9173463561bc932028e5a6606d93d0e217bab53577b794" Oct 07 12:48:17 crc kubenswrapper[4854]: I1007 12:48:17.338581 4854 scope.go:117] "RemoveContainer" containerID="86def19086aabe002c256380aee9698cfdc030d6deb57886de5b569a36e128f9" Oct 07 12:48:17 crc kubenswrapper[4854]: I1007 12:48:17.381574 4854 
scope.go:117] "RemoveContainer" containerID="374a7faa09b1f2933202f50219d10eed6a8601aaa44361171ee0ef98440e6c0c" Oct 07 12:48:17 crc kubenswrapper[4854]: I1007 12:48:17.398778 4854 scope.go:117] "RemoveContainer" containerID="9efa40bc71134d86579d5973dade48957f94cbd186d67ecf45f9895e68a1183d" Oct 07 12:48:17 crc kubenswrapper[4854]: I1007 12:48:17.421420 4854 scope.go:117] "RemoveContainer" containerID="a95bd116db9ea390605319534fc00572f56d0206068c040f76d694264158b473" Oct 07 12:48:17 crc kubenswrapper[4854]: I1007 12:48:17.445103 4854 scope.go:117] "RemoveContainer" containerID="d6e466dc031d969ff433a05de16ba57f46bc96a928c3c245b5265fd6a8ec3281" Oct 07 12:48:17 crc kubenswrapper[4854]: I1007 12:48:17.465421 4854 scope.go:117] "RemoveContainer" containerID="033723c2d3fe66822608ed294192bb053c3e6f5a1e3050f8582dbdd9eee28d65" Oct 07 12:48:17 crc kubenswrapper[4854]: I1007 12:48:17.493243 4854 scope.go:117] "RemoveContainer" containerID="1a711adb612268414ef00836cdfdcd54f8b6b0c7ae72e867963de67d1ce4ec42" Oct 07 12:48:40 crc kubenswrapper[4854]: I1007 12:48:40.807235 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 12:48:40 crc kubenswrapper[4854]: I1007 12:48:40.807672 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 12:49:10 crc kubenswrapper[4854]: I1007 12:49:10.807763 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 12:49:10 crc kubenswrapper[4854]: I1007 12:49:10.808485 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 12:49:10 crc kubenswrapper[4854]: I1007 12:49:10.808541 4854 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" Oct 07 12:49:10 crc kubenswrapper[4854]: I1007 12:49:10.809184 4854 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"00a7905fded3000a5bba04fc59df5ffada6a53acc5865f31e13e38f5ecdf35c6"} pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 12:49:10 crc kubenswrapper[4854]: I1007 12:49:10.809256 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" containerID="cri-o://00a7905fded3000a5bba04fc59df5ffada6a53acc5865f31e13e38f5ecdf35c6" gracePeriod=600 Oct 07 12:49:11 crc 
kubenswrapper[4854]: I1007 12:49:11.726526 4854 generic.go:334] "Generic (PLEG): container finished" podID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerID="00a7905fded3000a5bba04fc59df5ffada6a53acc5865f31e13e38f5ecdf35c6" exitCode=0 Oct 07 12:49:11 crc kubenswrapper[4854]: I1007 12:49:11.726612 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" event={"ID":"40b8b82d-cfd5-41d7-8673-5774db092c85","Type":"ContainerDied","Data":"00a7905fded3000a5bba04fc59df5ffada6a53acc5865f31e13e38f5ecdf35c6"} Oct 07 12:49:11 crc kubenswrapper[4854]: I1007 12:49:11.727186 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" event={"ID":"40b8b82d-cfd5-41d7-8673-5774db092c85","Type":"ContainerStarted","Data":"4dd26d55eea17079a4cf92e15f926725e28c06e10fa5f6835df5a38b86923df5"} Oct 07 12:49:11 crc kubenswrapper[4854]: I1007 12:49:11.727222 4854 scope.go:117] "RemoveContainer" containerID="294ebdf1eb05d7d685f741557c635439a170a1959729e348cac4935535da7799" Oct 07 12:49:17 crc kubenswrapper[4854]: I1007 12:49:17.773438 4854 scope.go:117] "RemoveContainer" containerID="b307bfacb066ee957083e0a594099299bdfed229bc968e0f23b8125a6b5c50a4" Oct 07 12:49:17 crc kubenswrapper[4854]: I1007 12:49:17.815363 4854 scope.go:117] "RemoveContainer" containerID="a426a0855633dd84dc768e5b4c01a36c6fac3eb00da8f75a86ee52fe9fa8ecd2" Oct 07 12:49:17 crc kubenswrapper[4854]: I1007 12:49:17.884909 4854 scope.go:117] "RemoveContainer" containerID="d7769a32795acfc7b8c09a637fbe56dba9fefbe79baa9e6e7baa998c4c190434" Oct 07 12:49:17 crc kubenswrapper[4854]: I1007 12:49:17.906031 4854 scope.go:117] "RemoveContainer" containerID="85e71836a3ed17c4ff8324140a6164289c633f406fdb437654545bf3071cb059" Oct 07 12:49:17 crc kubenswrapper[4854]: I1007 12:49:17.942813 4854 scope.go:117] "RemoveContainer" containerID="a9eb89ef4efa661c4083f60a2506bb42ebbedcf85f2dfd9263e2f0e6599f9588" Oct 07 12:49:17 crc kubenswrapper[4854]: I1007 12:49:17.966774 4854 scope.go:117] "RemoveContainer" containerID="4fec556e8b7c7348f34af57027490f7e654cae87c6f4e4c77e42d40f0f3e5c81" Oct 07 12:49:18 crc kubenswrapper[4854]: I1007 12:49:18.034024 4854 scope.go:117] "RemoveContainer" containerID="dda148e71ce3d2269edc8e05951c34651b383998b2ea4117e0f5773827d60523" Oct 07 12:49:18 crc kubenswrapper[4854]: I1007 12:49:18.058281 4854 scope.go:117] "RemoveContainer" containerID="b70abd6d9bcc3a200f109c4995793bf820e1b93bdac5dd9db4c650a6ff934a3d" Oct 07 12:49:18 crc kubenswrapper[4854]: I1007 12:49:18.085941 4854 scope.go:117] "RemoveContainer" containerID="ab6b9b6df7a6df7e7a7e15ca4a8d64babc1a2ae8feac7c3dc87e742af29e9944" Oct 07 12:49:18 crc kubenswrapper[4854]: I1007 12:49:18.113260 4854 scope.go:117] "RemoveContainer" containerID="6a8cb87e42de1baa0fe321962804fbc2f8635887612d2573377887b576e22a6c" Oct 07 12:49:18 crc kubenswrapper[4854]: I1007 12:49:18.141473 4854 scope.go:117] "RemoveContainer" containerID="512023c1348affae988ba42ea4c9852e6519486ab0ddc2e0e6e48fce90d79cf9" Oct 07 12:49:18 crc kubenswrapper[4854]: I1007 12:49:18.168510 4854 scope.go:117] "RemoveContainer" containerID="0a7cb163d43e07e5f8bed43aadca4cd323a0ec917a18fb9d72237c12ae237809" Oct 07 12:49:18 crc kubenswrapper[4854]: I1007 12:49:18.190047 4854 scope.go:117] "RemoveContainer" containerID="f0ff8ce507731e5f1fa6188aabef5933b1d02dc7dee5ef5454692c6f625fabed" Oct 07 12:49:18 crc kubenswrapper[4854]: I1007 12:49:18.242791 4854 scope.go:117] "RemoveContainer" 
containerID="0b7c87e468f2b8ba32d54b131d0c6194d6f52f79c88bd597eeb0b8875705a1d6" Oct 07 12:49:18 crc kubenswrapper[4854]: I1007 12:49:18.271191 4854 scope.go:117] "RemoveContainer" containerID="d16e3dc32eb57dea6b6b77bcedd6210c8586d9780a519f65e63564f3af418dfd" Oct 07 12:50:18 crc kubenswrapper[4854]: I1007 12:50:18.455013 4854 scope.go:117] "RemoveContainer" containerID="f08094d673885fd0314ed4b83c9aab5d37e0546bd66ad1d1040044725431047a" Oct 07 12:50:18 crc kubenswrapper[4854]: I1007 12:50:18.521745 4854 scope.go:117] "RemoveContainer" containerID="82f009346826759cec317a3226ffe23dbce339f0f5f46da29da41f5605075cb6" Oct 07 12:50:18 crc kubenswrapper[4854]: I1007 12:50:18.564352 4854 scope.go:117] "RemoveContainer" containerID="b3abd6b1e0a9375f23c25035816e904a52eb9a5e4ddd0a5fa6d460553c77a7c4" Oct 07 12:50:18 crc kubenswrapper[4854]: I1007 12:50:18.602620 4854 scope.go:117] "RemoveContainer" containerID="3318ca888028ea7d06858c1fbf4206179181c704acdcc3a3ddadfaea72e5b3a0" Oct 07 12:50:18 crc kubenswrapper[4854]: I1007 12:50:18.623227 4854 scope.go:117] "RemoveContainer" containerID="4f650846279a6172ef21f2029a264b0a99efd4af6906193c022b89126c8051db" Oct 07 12:50:18 crc kubenswrapper[4854]: I1007 12:50:18.649328 4854 scope.go:117] "RemoveContainer" containerID="22436301dc51bb07908e34417e3def5bdc13345529116eef4d8a01af67741c06" Oct 07 12:50:18 crc kubenswrapper[4854]: I1007 12:50:18.678716 4854 scope.go:117] "RemoveContainer" containerID="6ab3625afd9d5554eb97be59fc81dda17398623a3a3702f50c0d684653f13606" Oct 07 12:50:18 crc kubenswrapper[4854]: I1007 12:50:18.705512 4854 scope.go:117] "RemoveContainer" containerID="583495e30479442aa4785338cb425c3825f402c3aefd36d65751b6b6adc984b2" Oct 07 12:50:18 crc kubenswrapper[4854]: I1007 12:50:18.729253 4854 scope.go:117] "RemoveContainer" containerID="aed246739a200dc3292b614b7f083808741f3422b08b991f5eeb78b24a7512a7" Oct 07 12:50:18 crc kubenswrapper[4854]: I1007 12:50:18.762547 4854 scope.go:117] "RemoveContainer" containerID="503b644b1615c7eaf8b54360f22e93631c08812f2633b92cb50fd767300a0958" Oct 07 12:50:18 crc kubenswrapper[4854]: I1007 12:50:18.781406 4854 scope.go:117] "RemoveContainer" containerID="ed5fef242c07e5006a5c9e021bb5ac1cf15a88bbaf3824b5c1abeec3b9cc3490" Oct 07 12:50:32 crc kubenswrapper[4854]: I1007 12:50:32.953314 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rqbpx"] Oct 07 12:50:32 crc kubenswrapper[4854]: E1007 12:50:32.954866 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f298e6d-9539-454c-90b8-2792b0fd10fc" containerName="extract-content" Oct 07 12:50:32 crc kubenswrapper[4854]: I1007 12:50:32.954896 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f298e6d-9539-454c-90b8-2792b0fd10fc" containerName="extract-content" Oct 07 12:50:32 crc kubenswrapper[4854]: E1007 12:50:32.954949 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f298e6d-9539-454c-90b8-2792b0fd10fc" containerName="registry-server" Oct 07 12:50:32 crc kubenswrapper[4854]: I1007 12:50:32.954967 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f298e6d-9539-454c-90b8-2792b0fd10fc" containerName="registry-server" Oct 07 12:50:32 crc kubenswrapper[4854]: E1007 12:50:32.955013 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f298e6d-9539-454c-90b8-2792b0fd10fc" containerName="extract-utilities" Oct 07 12:50:32 crc kubenswrapper[4854]: I1007 12:50:32.955032 4854 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="9f298e6d-9539-454c-90b8-2792b0fd10fc" containerName="extract-utilities" Oct 07 12:50:32 crc kubenswrapper[4854]: I1007 12:50:32.955418 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f298e6d-9539-454c-90b8-2792b0fd10fc" containerName="registry-server" Oct 07 12:50:32 crc kubenswrapper[4854]: I1007 12:50:32.957883 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rqbpx" Oct 07 12:50:32 crc kubenswrapper[4854]: I1007 12:50:32.977664 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rqbpx"] Oct 07 12:50:33 crc kubenswrapper[4854]: I1007 12:50:33.036603 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/020f1b29-4e54-4e79-8271-cea1a6e3aec0-catalog-content\") pod \"certified-operators-rqbpx\" (UID: \"020f1b29-4e54-4e79-8271-cea1a6e3aec0\") " pod="openshift-marketplace/certified-operators-rqbpx" Oct 07 12:50:33 crc kubenswrapper[4854]: I1007 12:50:33.036673 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44n97\" (UniqueName: \"kubernetes.io/projected/020f1b29-4e54-4e79-8271-cea1a6e3aec0-kube-api-access-44n97\") pod \"certified-operators-rqbpx\" (UID: \"020f1b29-4e54-4e79-8271-cea1a6e3aec0\") " pod="openshift-marketplace/certified-operators-rqbpx" Oct 07 12:50:33 crc kubenswrapper[4854]: I1007 12:50:33.036742 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/020f1b29-4e54-4e79-8271-cea1a6e3aec0-utilities\") pod \"certified-operators-rqbpx\" (UID: \"020f1b29-4e54-4e79-8271-cea1a6e3aec0\") " pod="openshift-marketplace/certified-operators-rqbpx" Oct 07 12:50:33 crc kubenswrapper[4854]: I1007 12:50:33.138183 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/020f1b29-4e54-4e79-8271-cea1a6e3aec0-utilities\") pod \"certified-operators-rqbpx\" (UID: \"020f1b29-4e54-4e79-8271-cea1a6e3aec0\") " pod="openshift-marketplace/certified-operators-rqbpx" Oct 07 12:50:33 crc kubenswrapper[4854]: I1007 12:50:33.138326 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/020f1b29-4e54-4e79-8271-cea1a6e3aec0-catalog-content\") pod \"certified-operators-rqbpx\" (UID: \"020f1b29-4e54-4e79-8271-cea1a6e3aec0\") " pod="openshift-marketplace/certified-operators-rqbpx" Oct 07 12:50:33 crc kubenswrapper[4854]: I1007 12:50:33.138368 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44n97\" (UniqueName: \"kubernetes.io/projected/020f1b29-4e54-4e79-8271-cea1a6e3aec0-kube-api-access-44n97\") pod \"certified-operators-rqbpx\" (UID: \"020f1b29-4e54-4e79-8271-cea1a6e3aec0\") " pod="openshift-marketplace/certified-operators-rqbpx" Oct 07 12:50:33 crc kubenswrapper[4854]: I1007 12:50:33.138817 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/020f1b29-4e54-4e79-8271-cea1a6e3aec0-utilities\") pod \"certified-operators-rqbpx\" (UID: \"020f1b29-4e54-4e79-8271-cea1a6e3aec0\") " pod="openshift-marketplace/certified-operators-rqbpx" Oct 07 12:50:33 crc kubenswrapper[4854]: I1007 12:50:33.139115 4854 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/020f1b29-4e54-4e79-8271-cea1a6e3aec0-catalog-content\") pod \"certified-operators-rqbpx\" (UID: \"020f1b29-4e54-4e79-8271-cea1a6e3aec0\") " pod="openshift-marketplace/certified-operators-rqbpx" Oct 07 12:50:33 crc kubenswrapper[4854]: I1007 12:50:33.158411 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44n97\" (UniqueName: \"kubernetes.io/projected/020f1b29-4e54-4e79-8271-cea1a6e3aec0-kube-api-access-44n97\") pod \"certified-operators-rqbpx\" (UID: \"020f1b29-4e54-4e79-8271-cea1a6e3aec0\") " pod="openshift-marketplace/certified-operators-rqbpx" Oct 07 12:50:33 crc kubenswrapper[4854]: I1007 12:50:33.300736 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rqbpx" Oct 07 12:50:33 crc kubenswrapper[4854]: I1007 12:50:33.780422 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rqbpx"] Oct 07 12:50:33 crc kubenswrapper[4854]: W1007 12:50:33.791621 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod020f1b29_4e54_4e79_8271_cea1a6e3aec0.slice/crio-8d6667d54cf2501773a93fcda2942426c66d0a8a3bdcfcd0ba212b648ad1f6e2 WatchSource:0}: Error finding container 8d6667d54cf2501773a93fcda2942426c66d0a8a3bdcfcd0ba212b648ad1f6e2: Status 404 returned error can't find the container with id 8d6667d54cf2501773a93fcda2942426c66d0a8a3bdcfcd0ba212b648ad1f6e2 Oct 07 12:50:34 crc kubenswrapper[4854]: I1007 12:50:34.541338 4854 generic.go:334] "Generic (PLEG): container finished" podID="020f1b29-4e54-4e79-8271-cea1a6e3aec0" containerID="963ab04e36c398d455a6a3ae01a0b655a0c0534c542a74511802e6960e6d44d1" exitCode=0 Oct 07 12:50:34 crc kubenswrapper[4854]: I1007 12:50:34.541599 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rqbpx" event={"ID":"020f1b29-4e54-4e79-8271-cea1a6e3aec0","Type":"ContainerDied","Data":"963ab04e36c398d455a6a3ae01a0b655a0c0534c542a74511802e6960e6d44d1"} Oct 07 12:50:34 crc kubenswrapper[4854]: I1007 12:50:34.541656 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rqbpx" event={"ID":"020f1b29-4e54-4e79-8271-cea1a6e3aec0","Type":"ContainerStarted","Data":"8d6667d54cf2501773a93fcda2942426c66d0a8a3bdcfcd0ba212b648ad1f6e2"} Oct 07 12:50:34 crc kubenswrapper[4854]: I1007 12:50:34.546502 4854 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 12:50:35 crc kubenswrapper[4854]: I1007 12:50:35.346455 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-g9mcq"] Oct 07 12:50:35 crc kubenswrapper[4854]: I1007 12:50:35.349220 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-g9mcq" Oct 07 12:50:35 crc kubenswrapper[4854]: I1007 12:50:35.357956 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g9mcq"] Oct 07 12:50:35 crc kubenswrapper[4854]: I1007 12:50:35.469331 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2p4h\" (UniqueName: \"kubernetes.io/projected/68481956-b7a9-4578-bcd5-d216a9a324bb-kube-api-access-c2p4h\") pod \"redhat-operators-g9mcq\" (UID: \"68481956-b7a9-4578-bcd5-d216a9a324bb\") " pod="openshift-marketplace/redhat-operators-g9mcq" Oct 07 12:50:35 crc kubenswrapper[4854]: I1007 12:50:35.469890 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68481956-b7a9-4578-bcd5-d216a9a324bb-catalog-content\") pod \"redhat-operators-g9mcq\" (UID: \"68481956-b7a9-4578-bcd5-d216a9a324bb\") " pod="openshift-marketplace/redhat-operators-g9mcq" Oct 07 12:50:35 crc kubenswrapper[4854]: I1007 12:50:35.470000 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68481956-b7a9-4578-bcd5-d216a9a324bb-utilities\") pod \"redhat-operators-g9mcq\" (UID: \"68481956-b7a9-4578-bcd5-d216a9a324bb\") " pod="openshift-marketplace/redhat-operators-g9mcq" Oct 07 12:50:35 crc kubenswrapper[4854]: I1007 12:50:35.571681 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2p4h\" (UniqueName: \"kubernetes.io/projected/68481956-b7a9-4578-bcd5-d216a9a324bb-kube-api-access-c2p4h\") pod \"redhat-operators-g9mcq\" (UID: \"68481956-b7a9-4578-bcd5-d216a9a324bb\") " pod="openshift-marketplace/redhat-operators-g9mcq" Oct 07 12:50:35 crc kubenswrapper[4854]: I1007 12:50:35.571748 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68481956-b7a9-4578-bcd5-d216a9a324bb-catalog-content\") pod \"redhat-operators-g9mcq\" (UID: \"68481956-b7a9-4578-bcd5-d216a9a324bb\") " pod="openshift-marketplace/redhat-operators-g9mcq" Oct 07 12:50:35 crc kubenswrapper[4854]: I1007 12:50:35.571768 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68481956-b7a9-4578-bcd5-d216a9a324bb-utilities\") pod \"redhat-operators-g9mcq\" (UID: \"68481956-b7a9-4578-bcd5-d216a9a324bb\") " pod="openshift-marketplace/redhat-operators-g9mcq" Oct 07 12:50:35 crc kubenswrapper[4854]: I1007 12:50:35.572207 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68481956-b7a9-4578-bcd5-d216a9a324bb-utilities\") pod \"redhat-operators-g9mcq\" (UID: \"68481956-b7a9-4578-bcd5-d216a9a324bb\") " pod="openshift-marketplace/redhat-operators-g9mcq" Oct 07 12:50:35 crc kubenswrapper[4854]: I1007 12:50:35.572263 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68481956-b7a9-4578-bcd5-d216a9a324bb-catalog-content\") pod \"redhat-operators-g9mcq\" (UID: \"68481956-b7a9-4578-bcd5-d216a9a324bb\") " pod="openshift-marketplace/redhat-operators-g9mcq" Oct 07 12:50:35 crc kubenswrapper[4854]: I1007 12:50:35.605322 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-c2p4h\" (UniqueName: \"kubernetes.io/projected/68481956-b7a9-4578-bcd5-d216a9a324bb-kube-api-access-c2p4h\") pod \"redhat-operators-g9mcq\" (UID: \"68481956-b7a9-4578-bcd5-d216a9a324bb\") " pod="openshift-marketplace/redhat-operators-g9mcq" Oct 07 12:50:35 crc kubenswrapper[4854]: I1007 12:50:35.678752 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g9mcq" Oct 07 12:50:35 crc kubenswrapper[4854]: I1007 12:50:35.909836 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g9mcq"] Oct 07 12:50:35 crc kubenswrapper[4854]: W1007 12:50:35.923939 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68481956_b7a9_4578_bcd5_d216a9a324bb.slice/crio-47c72e3dc730cf4369464c07da73e7089f6177af45feebb1a5ca513cf330ff15 WatchSource:0}: Error finding container 47c72e3dc730cf4369464c07da73e7089f6177af45feebb1a5ca513cf330ff15: Status 404 returned error can't find the container with id 47c72e3dc730cf4369464c07da73e7089f6177af45feebb1a5ca513cf330ff15 Oct 07 12:50:36 crc kubenswrapper[4854]: I1007 12:50:36.554774 4854 generic.go:334] "Generic (PLEG): container finished" podID="68481956-b7a9-4578-bcd5-d216a9a324bb" containerID="04e86231eaea54392c207e01e5dee2b05094907c92e95560976b6c4d5e06c873" exitCode=0 Oct 07 12:50:36 crc kubenswrapper[4854]: I1007 12:50:36.554823 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g9mcq" event={"ID":"68481956-b7a9-4578-bcd5-d216a9a324bb","Type":"ContainerDied","Data":"04e86231eaea54392c207e01e5dee2b05094907c92e95560976b6c4d5e06c873"} Oct 07 12:50:36 crc kubenswrapper[4854]: I1007 12:50:36.554870 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g9mcq" event={"ID":"68481956-b7a9-4578-bcd5-d216a9a324bb","Type":"ContainerStarted","Data":"47c72e3dc730cf4369464c07da73e7089f6177af45feebb1a5ca513cf330ff15"} Oct 07 12:50:36 crc kubenswrapper[4854]: I1007 12:50:36.557536 4854 generic.go:334] "Generic (PLEG): container finished" podID="020f1b29-4e54-4e79-8271-cea1a6e3aec0" containerID="a8144cf5600ec665daed1d04a430274f189f48b74e3bdf89876fdc9ffa71ad17" exitCode=0 Oct 07 12:50:36 crc kubenswrapper[4854]: I1007 12:50:36.557570 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rqbpx" event={"ID":"020f1b29-4e54-4e79-8271-cea1a6e3aec0","Type":"ContainerDied","Data":"a8144cf5600ec665daed1d04a430274f189f48b74e3bdf89876fdc9ffa71ad17"} Oct 07 12:50:37 crc kubenswrapper[4854]: I1007 12:50:37.569020 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rqbpx" event={"ID":"020f1b29-4e54-4e79-8271-cea1a6e3aec0","Type":"ContainerStarted","Data":"e53afc4de56a6ad06f7c6dca3e4af8d455e59ee8fe1294af93e9958aff1a79fb"} Oct 07 12:50:37 crc kubenswrapper[4854]: I1007 12:50:37.597363 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rqbpx" podStartSLOduration=2.922257789 podStartE2EDuration="5.59732657s" podCreationTimestamp="2025-10-07 12:50:32 +0000 UTC" firstStartedPulling="2025-10-07 12:50:34.546299063 +0000 UTC m=+1550.534131318" lastFinishedPulling="2025-10-07 12:50:37.221367834 +0000 UTC m=+1553.209200099" observedRunningTime="2025-10-07 12:50:37.587497897 +0000 UTC m=+1553.575330172" watchObservedRunningTime="2025-10-07 
12:50:37.59732657 +0000 UTC m=+1553.585158865" Oct 07 12:50:38 crc kubenswrapper[4854]: I1007 12:50:38.580691 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g9mcq" event={"ID":"68481956-b7a9-4578-bcd5-d216a9a324bb","Type":"ContainerStarted","Data":"4cefa3f93780643b4c4877580344e206e39b3335e2a95f7d9cef2149190bd7bf"} Oct 07 12:50:43 crc kubenswrapper[4854]: I1007 12:50:43.301372 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rqbpx" Oct 07 12:50:43 crc kubenswrapper[4854]: I1007 12:50:43.301883 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rqbpx" Oct 07 12:50:43 crc kubenswrapper[4854]: I1007 12:50:43.353390 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rqbpx" Oct 07 12:50:43 crc kubenswrapper[4854]: I1007 12:50:43.618088 4854 generic.go:334] "Generic (PLEG): container finished" podID="68481956-b7a9-4578-bcd5-d216a9a324bb" containerID="4cefa3f93780643b4c4877580344e206e39b3335e2a95f7d9cef2149190bd7bf" exitCode=0 Oct 07 12:50:43 crc kubenswrapper[4854]: I1007 12:50:43.618858 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g9mcq" event={"ID":"68481956-b7a9-4578-bcd5-d216a9a324bb","Type":"ContainerDied","Data":"4cefa3f93780643b4c4877580344e206e39b3335e2a95f7d9cef2149190bd7bf"} Oct 07 12:50:43 crc kubenswrapper[4854]: I1007 12:50:43.671071 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rqbpx" Oct 07 12:50:44 crc kubenswrapper[4854]: I1007 12:50:44.628734 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g9mcq" event={"ID":"68481956-b7a9-4578-bcd5-d216a9a324bb","Type":"ContainerStarted","Data":"20afa86e352443d57fb9626992808fe08fbfafe3197f8ab1bd5f73b9f90c97e4"} Oct 07 12:50:44 crc kubenswrapper[4854]: I1007 12:50:44.650041 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-g9mcq" podStartSLOduration=2.139611387 podStartE2EDuration="9.64992675s" podCreationTimestamp="2025-10-07 12:50:35 +0000 UTC" firstStartedPulling="2025-10-07 12:50:36.557476999 +0000 UTC m=+1552.545309254" lastFinishedPulling="2025-10-07 12:50:44.067792362 +0000 UTC m=+1560.055624617" observedRunningTime="2025-10-07 12:50:44.646165592 +0000 UTC m=+1560.633997837" watchObservedRunningTime="2025-10-07 12:50:44.64992675 +0000 UTC m=+1560.637759005" Oct 07 12:50:44 crc kubenswrapper[4854]: I1007 12:50:44.740230 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rqbpx"] Oct 07 12:50:45 crc kubenswrapper[4854]: I1007 12:50:45.634845 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rqbpx" podUID="020f1b29-4e54-4e79-8271-cea1a6e3aec0" containerName="registry-server" containerID="cri-o://e53afc4de56a6ad06f7c6dca3e4af8d455e59ee8fe1294af93e9958aff1a79fb" gracePeriod=2 Oct 07 12:50:45 crc kubenswrapper[4854]: I1007 12:50:45.679816 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-g9mcq" Oct 07 12:50:45 crc kubenswrapper[4854]: I1007 12:50:45.680005 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
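
The pod_startup_latency_tracker entries carry enough data to check the arithmetic: podStartE2EDuration is observedRunningTime minus podCreationTimestamp, and podStartSLOduration is the same interval with the image-pull window (lastFinishedPulling minus firstStartedPulling) subtracted; the m=+... suffixes are monotonic-clock offsets since kubelet start. For certified-operators-rqbpx that is 5.597s end to end, of which 2.675s was pulling, leaving the reported 2.922s. A sketch that reproduces those numbers; the timestamps are copied from the entry above and the parsing layout is an assumption about their printed form:

package main

import (
	"fmt"
	"time"
)

func mustParse(s string) time.Time {
	t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	// Values copied from the certified-operators-rqbpx startup entry.
	created := mustParse("2025-10-07 12:50:32 +0000 UTC")
	firstPull := mustParse("2025-10-07 12:50:34.546299063 +0000 UTC")
	lastPull := mustParse("2025-10-07 12:50:37.221367834 +0000 UTC")
	observedRunning := mustParse("2025-10-07 12:50:37.59732657 +0000 UTC")

	e2e := observedRunning.Sub(created) // podStartE2EDuration
	pulling := lastPull.Sub(firstPull)  // time spent pulling images
	slo := e2e - pulling                // podStartSLOduration: pull time excluded

	fmt.Println("e2e: ", e2e)     // 5.59732657s
	fmt.Println("pull:", pulling) // 2.675068771s
	fmt.Println("slo: ", slo)     // 2.922257799s here; the logged 2.922257789s
	// differs in the last digits because the kubelet uses the monotonic
	// (m=+...) readings rather than the wall-clock timestamps.
}
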
pod="openshift-marketplace/redhat-operators-g9mcq" Oct 07 12:50:46 crc kubenswrapper[4854]: I1007 12:50:46.035571 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rqbpx" Oct 07 12:50:46 crc kubenswrapper[4854]: I1007 12:50:46.219846 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/020f1b29-4e54-4e79-8271-cea1a6e3aec0-catalog-content\") pod \"020f1b29-4e54-4e79-8271-cea1a6e3aec0\" (UID: \"020f1b29-4e54-4e79-8271-cea1a6e3aec0\") " Oct 07 12:50:46 crc kubenswrapper[4854]: I1007 12:50:46.220082 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44n97\" (UniqueName: \"kubernetes.io/projected/020f1b29-4e54-4e79-8271-cea1a6e3aec0-kube-api-access-44n97\") pod \"020f1b29-4e54-4e79-8271-cea1a6e3aec0\" (UID: \"020f1b29-4e54-4e79-8271-cea1a6e3aec0\") " Oct 07 12:50:46 crc kubenswrapper[4854]: I1007 12:50:46.220172 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/020f1b29-4e54-4e79-8271-cea1a6e3aec0-utilities\") pod \"020f1b29-4e54-4e79-8271-cea1a6e3aec0\" (UID: \"020f1b29-4e54-4e79-8271-cea1a6e3aec0\") " Oct 07 12:50:46 crc kubenswrapper[4854]: I1007 12:50:46.221071 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/020f1b29-4e54-4e79-8271-cea1a6e3aec0-utilities" (OuterVolumeSpecName: "utilities") pod "020f1b29-4e54-4e79-8271-cea1a6e3aec0" (UID: "020f1b29-4e54-4e79-8271-cea1a6e3aec0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:50:46 crc kubenswrapper[4854]: I1007 12:50:46.228783 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/020f1b29-4e54-4e79-8271-cea1a6e3aec0-kube-api-access-44n97" (OuterVolumeSpecName: "kube-api-access-44n97") pod "020f1b29-4e54-4e79-8271-cea1a6e3aec0" (UID: "020f1b29-4e54-4e79-8271-cea1a6e3aec0"). InnerVolumeSpecName "kube-api-access-44n97". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:50:46 crc kubenswrapper[4854]: I1007 12:50:46.271931 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/020f1b29-4e54-4e79-8271-cea1a6e3aec0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "020f1b29-4e54-4e79-8271-cea1a6e3aec0" (UID: "020f1b29-4e54-4e79-8271-cea1a6e3aec0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:50:46 crc kubenswrapper[4854]: I1007 12:50:46.321819 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/020f1b29-4e54-4e79-8271-cea1a6e3aec0-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 12:50:46 crc kubenswrapper[4854]: I1007 12:50:46.321864 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44n97\" (UniqueName: \"kubernetes.io/projected/020f1b29-4e54-4e79-8271-cea1a6e3aec0-kube-api-access-44n97\") on node \"crc\" DevicePath \"\"" Oct 07 12:50:46 crc kubenswrapper[4854]: I1007 12:50:46.321880 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/020f1b29-4e54-4e79-8271-cea1a6e3aec0-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 12:50:46 crc kubenswrapper[4854]: I1007 12:50:46.648866 4854 generic.go:334] "Generic (PLEG): container finished" podID="020f1b29-4e54-4e79-8271-cea1a6e3aec0" containerID="e53afc4de56a6ad06f7c6dca3e4af8d455e59ee8fe1294af93e9958aff1a79fb" exitCode=0 Oct 07 12:50:46 crc kubenswrapper[4854]: I1007 12:50:46.649001 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rqbpx" Oct 07 12:50:46 crc kubenswrapper[4854]: I1007 12:50:46.649125 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rqbpx" event={"ID":"020f1b29-4e54-4e79-8271-cea1a6e3aec0","Type":"ContainerDied","Data":"e53afc4de56a6ad06f7c6dca3e4af8d455e59ee8fe1294af93e9958aff1a79fb"} Oct 07 12:50:46 crc kubenswrapper[4854]: I1007 12:50:46.649259 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rqbpx" event={"ID":"020f1b29-4e54-4e79-8271-cea1a6e3aec0","Type":"ContainerDied","Data":"8d6667d54cf2501773a93fcda2942426c66d0a8a3bdcfcd0ba212b648ad1f6e2"} Oct 07 12:50:46 crc kubenswrapper[4854]: I1007 12:50:46.649303 4854 scope.go:117] "RemoveContainer" containerID="e53afc4de56a6ad06f7c6dca3e4af8d455e59ee8fe1294af93e9958aff1a79fb" Oct 07 12:50:46 crc kubenswrapper[4854]: I1007 12:50:46.672673 4854 scope.go:117] "RemoveContainer" containerID="a8144cf5600ec665daed1d04a430274f189f48b74e3bdf89876fdc9ffa71ad17" Oct 07 12:50:46 crc kubenswrapper[4854]: I1007 12:50:46.705400 4854 scope.go:117] "RemoveContainer" containerID="963ab04e36c398d455a6a3ae01a0b655a0c0534c542a74511802e6960e6d44d1" Oct 07 12:50:46 crc kubenswrapper[4854]: I1007 12:50:46.730300 4854 scope.go:117] "RemoveContainer" containerID="e53afc4de56a6ad06f7c6dca3e4af8d455e59ee8fe1294af93e9958aff1a79fb" Oct 07 12:50:46 crc kubenswrapper[4854]: I1007 12:50:46.734256 4854 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-g9mcq" podUID="68481956-b7a9-4578-bcd5-d216a9a324bb" containerName="registry-server" probeResult="failure" output=< Oct 07 12:50:46 crc kubenswrapper[4854]: timeout: failed to connect service ":50051" within 1s Oct 07 12:50:46 crc kubenswrapper[4854]: > Oct 07 12:50:46 crc kubenswrapper[4854]: E1007 12:50:46.745552 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e53afc4de56a6ad06f7c6dca3e4af8d455e59ee8fe1294af93e9958aff1a79fb\": container with ID starting with e53afc4de56a6ad06f7c6dca3e4af8d455e59ee8fe1294af93e9958aff1a79fb not found: ID does not exist" 
containerID="e53afc4de56a6ad06f7c6dca3e4af8d455e59ee8fe1294af93e9958aff1a79fb" Oct 07 12:50:46 crc kubenswrapper[4854]: I1007 12:50:46.745618 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e53afc4de56a6ad06f7c6dca3e4af8d455e59ee8fe1294af93e9958aff1a79fb"} err="failed to get container status \"e53afc4de56a6ad06f7c6dca3e4af8d455e59ee8fe1294af93e9958aff1a79fb\": rpc error: code = NotFound desc = could not find container \"e53afc4de56a6ad06f7c6dca3e4af8d455e59ee8fe1294af93e9958aff1a79fb\": container with ID starting with e53afc4de56a6ad06f7c6dca3e4af8d455e59ee8fe1294af93e9958aff1a79fb not found: ID does not exist" Oct 07 12:50:46 crc kubenswrapper[4854]: I1007 12:50:46.745645 4854 scope.go:117] "RemoveContainer" containerID="a8144cf5600ec665daed1d04a430274f189f48b74e3bdf89876fdc9ffa71ad17" Oct 07 12:50:46 crc kubenswrapper[4854]: I1007 12:50:46.746724 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rqbpx"] Oct 07 12:50:46 crc kubenswrapper[4854]: I1007 12:50:46.746771 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rqbpx"] Oct 07 12:50:46 crc kubenswrapper[4854]: E1007 12:50:46.746899 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8144cf5600ec665daed1d04a430274f189f48b74e3bdf89876fdc9ffa71ad17\": container with ID starting with a8144cf5600ec665daed1d04a430274f189f48b74e3bdf89876fdc9ffa71ad17 not found: ID does not exist" containerID="a8144cf5600ec665daed1d04a430274f189f48b74e3bdf89876fdc9ffa71ad17" Oct 07 12:50:46 crc kubenswrapper[4854]: I1007 12:50:46.746936 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8144cf5600ec665daed1d04a430274f189f48b74e3bdf89876fdc9ffa71ad17"} err="failed to get container status \"a8144cf5600ec665daed1d04a430274f189f48b74e3bdf89876fdc9ffa71ad17\": rpc error: code = NotFound desc = could not find container \"a8144cf5600ec665daed1d04a430274f189f48b74e3bdf89876fdc9ffa71ad17\": container with ID starting with a8144cf5600ec665daed1d04a430274f189f48b74e3bdf89876fdc9ffa71ad17 not found: ID does not exist" Oct 07 12:50:46 crc kubenswrapper[4854]: I1007 12:50:46.746961 4854 scope.go:117] "RemoveContainer" containerID="963ab04e36c398d455a6a3ae01a0b655a0c0534c542a74511802e6960e6d44d1" Oct 07 12:50:46 crc kubenswrapper[4854]: E1007 12:50:46.747447 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"963ab04e36c398d455a6a3ae01a0b655a0c0534c542a74511802e6960e6d44d1\": container with ID starting with 963ab04e36c398d455a6a3ae01a0b655a0c0534c542a74511802e6960e6d44d1 not found: ID does not exist" containerID="963ab04e36c398d455a6a3ae01a0b655a0c0534c542a74511802e6960e6d44d1" Oct 07 12:50:46 crc kubenswrapper[4854]: I1007 12:50:46.747546 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"963ab04e36c398d455a6a3ae01a0b655a0c0534c542a74511802e6960e6d44d1"} err="failed to get container status \"963ab04e36c398d455a6a3ae01a0b655a0c0534c542a74511802e6960e6d44d1\": rpc error: code = NotFound desc = could not find container \"963ab04e36c398d455a6a3ae01a0b655a0c0534c542a74511802e6960e6d44d1\": container with ID starting with 963ab04e36c398d455a6a3ae01a0b655a0c0534c542a74511802e6960e6d44d1 not found: ID does not exist" Oct 07 12:50:48 crc kubenswrapper[4854]: I1007 12:50:48.712273 
4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="020f1b29-4e54-4e79-8271-cea1a6e3aec0" path="/var/lib/kubelet/pods/020f1b29-4e54-4e79-8271-cea1a6e3aec0/volumes" Oct 07 12:50:55 crc kubenswrapper[4854]: I1007 12:50:55.728681 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-g9mcq" Oct 07 12:50:55 crc kubenswrapper[4854]: I1007 12:50:55.797262 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-g9mcq" Oct 07 12:50:55 crc kubenswrapper[4854]: I1007 12:50:55.988186 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-g9mcq"] Oct 07 12:50:57 crc kubenswrapper[4854]: I1007 12:50:57.754635 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-g9mcq" podUID="68481956-b7a9-4578-bcd5-d216a9a324bb" containerName="registry-server" containerID="cri-o://20afa86e352443d57fb9626992808fe08fbfafe3197f8ab1bd5f73b9f90c97e4" gracePeriod=2 Oct 07 12:50:58 crc kubenswrapper[4854]: I1007 12:50:58.759302 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g9mcq" Oct 07 12:50:58 crc kubenswrapper[4854]: I1007 12:50:58.765124 4854 generic.go:334] "Generic (PLEG): container finished" podID="68481956-b7a9-4578-bcd5-d216a9a324bb" containerID="20afa86e352443d57fb9626992808fe08fbfafe3197f8ab1bd5f73b9f90c97e4" exitCode=0 Oct 07 12:50:58 crc kubenswrapper[4854]: I1007 12:50:58.765183 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g9mcq" event={"ID":"68481956-b7a9-4578-bcd5-d216a9a324bb","Type":"ContainerDied","Data":"20afa86e352443d57fb9626992808fe08fbfafe3197f8ab1bd5f73b9f90c97e4"} Oct 07 12:50:58 crc kubenswrapper[4854]: I1007 12:50:58.765216 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g9mcq" event={"ID":"68481956-b7a9-4578-bcd5-d216a9a324bb","Type":"ContainerDied","Data":"47c72e3dc730cf4369464c07da73e7089f6177af45feebb1a5ca513cf330ff15"} Oct 07 12:50:58 crc kubenswrapper[4854]: I1007 12:50:58.765213 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-g9mcq" Oct 07 12:50:58 crc kubenswrapper[4854]: I1007 12:50:58.765313 4854 scope.go:117] "RemoveContainer" containerID="20afa86e352443d57fb9626992808fe08fbfafe3197f8ab1bd5f73b9f90c97e4" Oct 07 12:50:58 crc kubenswrapper[4854]: I1007 12:50:58.799816 4854 scope.go:117] "RemoveContainer" containerID="4cefa3f93780643b4c4877580344e206e39b3335e2a95f7d9cef2149190bd7bf" Oct 07 12:50:58 crc kubenswrapper[4854]: I1007 12:50:58.823722 4854 scope.go:117] "RemoveContainer" containerID="04e86231eaea54392c207e01e5dee2b05094907c92e95560976b6c4d5e06c873" Oct 07 12:50:58 crc kubenswrapper[4854]: I1007 12:50:58.842948 4854 scope.go:117] "RemoveContainer" containerID="20afa86e352443d57fb9626992808fe08fbfafe3197f8ab1bd5f73b9f90c97e4" Oct 07 12:50:58 crc kubenswrapper[4854]: E1007 12:50:58.843439 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20afa86e352443d57fb9626992808fe08fbfafe3197f8ab1bd5f73b9f90c97e4\": container with ID starting with 20afa86e352443d57fb9626992808fe08fbfafe3197f8ab1bd5f73b9f90c97e4 not found: ID does not exist" containerID="20afa86e352443d57fb9626992808fe08fbfafe3197f8ab1bd5f73b9f90c97e4" Oct 07 12:50:58 crc kubenswrapper[4854]: I1007 12:50:58.843486 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20afa86e352443d57fb9626992808fe08fbfafe3197f8ab1bd5f73b9f90c97e4"} err="failed to get container status \"20afa86e352443d57fb9626992808fe08fbfafe3197f8ab1bd5f73b9f90c97e4\": rpc error: code = NotFound desc = could not find container \"20afa86e352443d57fb9626992808fe08fbfafe3197f8ab1bd5f73b9f90c97e4\": container with ID starting with 20afa86e352443d57fb9626992808fe08fbfafe3197f8ab1bd5f73b9f90c97e4 not found: ID does not exist" Oct 07 12:50:58 crc kubenswrapper[4854]: I1007 12:50:58.843516 4854 scope.go:117] "RemoveContainer" containerID="4cefa3f93780643b4c4877580344e206e39b3335e2a95f7d9cef2149190bd7bf" Oct 07 12:50:58 crc kubenswrapper[4854]: E1007 12:50:58.843935 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cefa3f93780643b4c4877580344e206e39b3335e2a95f7d9cef2149190bd7bf\": container with ID starting with 4cefa3f93780643b4c4877580344e206e39b3335e2a95f7d9cef2149190bd7bf not found: ID does not exist" containerID="4cefa3f93780643b4c4877580344e206e39b3335e2a95f7d9cef2149190bd7bf" Oct 07 12:50:58 crc kubenswrapper[4854]: I1007 12:50:58.843984 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cefa3f93780643b4c4877580344e206e39b3335e2a95f7d9cef2149190bd7bf"} err="failed to get container status \"4cefa3f93780643b4c4877580344e206e39b3335e2a95f7d9cef2149190bd7bf\": rpc error: code = NotFound desc = could not find container \"4cefa3f93780643b4c4877580344e206e39b3335e2a95f7d9cef2149190bd7bf\": container with ID starting with 4cefa3f93780643b4c4877580344e206e39b3335e2a95f7d9cef2149190bd7bf not found: ID does not exist" Oct 07 12:50:58 crc kubenswrapper[4854]: I1007 12:50:58.844016 4854 scope.go:117] "RemoveContainer" containerID="04e86231eaea54392c207e01e5dee2b05094907c92e95560976b6c4d5e06c873" Oct 07 12:50:58 crc kubenswrapper[4854]: E1007 12:50:58.844389 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04e86231eaea54392c207e01e5dee2b05094907c92e95560976b6c4d5e06c873\": container with ID starting 
with 04e86231eaea54392c207e01e5dee2b05094907c92e95560976b6c4d5e06c873 not found: ID does not exist" containerID="04e86231eaea54392c207e01e5dee2b05094907c92e95560976b6c4d5e06c873" Oct 07 12:50:58 crc kubenswrapper[4854]: I1007 12:50:58.844472 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04e86231eaea54392c207e01e5dee2b05094907c92e95560976b6c4d5e06c873"} err="failed to get container status \"04e86231eaea54392c207e01e5dee2b05094907c92e95560976b6c4d5e06c873\": rpc error: code = NotFound desc = could not find container \"04e86231eaea54392c207e01e5dee2b05094907c92e95560976b6c4d5e06c873\": container with ID starting with 04e86231eaea54392c207e01e5dee2b05094907c92e95560976b6c4d5e06c873 not found: ID does not exist" Oct 07 12:50:58 crc kubenswrapper[4854]: I1007 12:50:58.922579 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68481956-b7a9-4578-bcd5-d216a9a324bb-catalog-content\") pod \"68481956-b7a9-4578-bcd5-d216a9a324bb\" (UID: \"68481956-b7a9-4578-bcd5-d216a9a324bb\") " Oct 07 12:50:58 crc kubenswrapper[4854]: I1007 12:50:58.922658 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2p4h\" (UniqueName: \"kubernetes.io/projected/68481956-b7a9-4578-bcd5-d216a9a324bb-kube-api-access-c2p4h\") pod \"68481956-b7a9-4578-bcd5-d216a9a324bb\" (UID: \"68481956-b7a9-4578-bcd5-d216a9a324bb\") " Oct 07 12:50:58 crc kubenswrapper[4854]: I1007 12:50:58.922689 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68481956-b7a9-4578-bcd5-d216a9a324bb-utilities\") pod \"68481956-b7a9-4578-bcd5-d216a9a324bb\" (UID: \"68481956-b7a9-4578-bcd5-d216a9a324bb\") " Oct 07 12:50:58 crc kubenswrapper[4854]: I1007 12:50:58.924027 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68481956-b7a9-4578-bcd5-d216a9a324bb-utilities" (OuterVolumeSpecName: "utilities") pod "68481956-b7a9-4578-bcd5-d216a9a324bb" (UID: "68481956-b7a9-4578-bcd5-d216a9a324bb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:50:58 crc kubenswrapper[4854]: I1007 12:50:58.931444 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68481956-b7a9-4578-bcd5-d216a9a324bb-kube-api-access-c2p4h" (OuterVolumeSpecName: "kube-api-access-c2p4h") pod "68481956-b7a9-4578-bcd5-d216a9a324bb" (UID: "68481956-b7a9-4578-bcd5-d216a9a324bb"). InnerVolumeSpecName "kube-api-access-c2p4h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:50:59 crc kubenswrapper[4854]: I1007 12:50:59.025135 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2p4h\" (UniqueName: \"kubernetes.io/projected/68481956-b7a9-4578-bcd5-d216a9a324bb-kube-api-access-c2p4h\") on node \"crc\" DevicePath \"\"" Oct 07 12:50:59 crc kubenswrapper[4854]: I1007 12:50:59.025207 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68481956-b7a9-4578-bcd5-d216a9a324bb-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 12:50:59 crc kubenswrapper[4854]: I1007 12:50:59.028509 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68481956-b7a9-4578-bcd5-d216a9a324bb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "68481956-b7a9-4578-bcd5-d216a9a324bb" (UID: "68481956-b7a9-4578-bcd5-d216a9a324bb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:50:59 crc kubenswrapper[4854]: I1007 12:50:59.106071 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-g9mcq"] Oct 07 12:50:59 crc kubenswrapper[4854]: I1007 12:50:59.112055 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-g9mcq"] Oct 07 12:50:59 crc kubenswrapper[4854]: I1007 12:50:59.126718 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68481956-b7a9-4578-bcd5-d216a9a324bb-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 12:51:00 crc kubenswrapper[4854]: I1007 12:51:00.718853 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68481956-b7a9-4578-bcd5-d216a9a324bb" path="/var/lib/kubelet/pods/68481956-b7a9-4578-bcd5-d216a9a324bb/volumes" Oct 07 12:51:18 crc kubenswrapper[4854]: I1007 12:51:18.995583 4854 scope.go:117] "RemoveContainer" containerID="6c3add82afc5a4706f36b57114ee421863a370ea330e776fc1686cfde8841f49" Oct 07 12:51:19 crc kubenswrapper[4854]: I1007 12:51:19.027479 4854 scope.go:117] "RemoveContainer" containerID="39620a647b460de5f35393c3799df63e681c9e03d0decae7a6f3e51d1f325986" Oct 07 12:51:19 crc kubenswrapper[4854]: I1007 12:51:19.056869 4854 scope.go:117] "RemoveContainer" containerID="bd7103524f01d027fb1b2f752223346853e431f9b045e756199823ef3b9a9a3d" Oct 07 12:51:19 crc kubenswrapper[4854]: I1007 12:51:19.095495 4854 scope.go:117] "RemoveContainer" containerID="9b03a80d2912fe2dddcd14b778e7f6883a6f0136c4cf7cecbaf81439b6bff4fd" Oct 07 12:51:19 crc kubenswrapper[4854]: I1007 12:51:19.133919 4854 scope.go:117] "RemoveContainer" containerID="b33111a1b0f14dc72a4fc34de5d78d44cbf9938a6e3632fa063f8bb7985aeaf3" Oct 07 12:51:40 crc kubenswrapper[4854]: I1007 12:51:40.808175 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 12:51:40 crc kubenswrapper[4854]: I1007 12:51:40.808780 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 
12:52:10 crc kubenswrapper[4854]: I1007 12:52:10.807434 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 12:52:10 crc kubenswrapper[4854]: I1007 12:52:10.808174 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 12:52:40 crc kubenswrapper[4854]: I1007 12:52:40.808180 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 12:52:40 crc kubenswrapper[4854]: I1007 12:52:40.809054 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 12:52:40 crc kubenswrapper[4854]: I1007 12:52:40.809150 4854 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" Oct 07 12:52:40 crc kubenswrapper[4854]: I1007 12:52:40.810112 4854 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4dd26d55eea17079a4cf92e15f926725e28c06e10fa5f6835df5a38b86923df5"} pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 12:52:40 crc kubenswrapper[4854]: I1007 12:52:40.810263 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" containerID="cri-o://4dd26d55eea17079a4cf92e15f926725e28c06e10fa5f6835df5a38b86923df5" gracePeriod=600 Oct 07 12:52:40 crc kubenswrapper[4854]: E1007 12:52:40.958841 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 12:52:41 crc kubenswrapper[4854]: I1007 12:52:41.695735 4854 generic.go:334] "Generic (PLEG): container finished" podID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerID="4dd26d55eea17079a4cf92e15f926725e28c06e10fa5f6835df5a38b86923df5" exitCode=0 Oct 07 12:52:41 crc kubenswrapper[4854]: I1007 12:52:41.695813 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" 
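
machine-config-daemon's liveness probe is a plain HTTP GET against http://127.0.0.1:8798/health. Three consecutive "connection refused" results at 12:51:40, 12:52:10 and 12:52:40 (30 seconds apart, consistent with a failure threshold of 3) push the probe over its threshold, the kubelet marks the container unhealthy and kills it with the pod's 600s grace period, and the restart immediately lands in CrashLoopBackOff because earlier restarts of this container have evidently already built up back-off. A minimal sketch of an HTTP liveness-style check; the URL and 1s timeout mirror the log, while the threshold bookkeeping and the shortened sleep are simplifications:

package main

import (
	"fmt"
	"net/http"
	"time"
)

// check performs one liveness-style probe: any response below 400 within the
// timeout counts as healthy (the kubelet treats 2xx/3xx as success); transport
// errors such as "connection refused" or 4xx/5xx count as failures.
func check(url string, timeout time.Duration) error {
	client := &http.Client{Timeout: timeout}
	resp, err := client.Get(url)
	if err != nil {
		return err // e.g. dial tcp 127.0.0.1:8798: connect: connection refused
	}
	defer resp.Body.Close()
	if resp.StatusCode >= 400 {
		return fmt.Errorf("unhealthy status %s", resp.Status)
	}
	return nil
}

func main() {
	failures := 0
	for i := 0; i < 3; i++ { // failureThreshold of 3 is the common default
		if err := check("http://127.0.0.1:8798/health", time.Second); err != nil {
			failures++
			fmt.Println("probe failed:", err)
		} else {
			failures = 0
			fmt.Println("probe ok")
		}
		time.Sleep(time.Second) // the log's period looks like 30s; shortened here
	}
	if failures >= 3 {
		fmt.Println("over failure threshold: container would be restarted")
	}
}
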
event={"ID":"40b8b82d-cfd5-41d7-8673-5774db092c85","Type":"ContainerDied","Data":"4dd26d55eea17079a4cf92e15f926725e28c06e10fa5f6835df5a38b86923df5"} Oct 07 12:52:41 crc kubenswrapper[4854]: I1007 12:52:41.696419 4854 scope.go:117] "RemoveContainer" containerID="00a7905fded3000a5bba04fc59df5ffada6a53acc5865f31e13e38f5ecdf35c6" Oct 07 12:52:41 crc kubenswrapper[4854]: I1007 12:52:41.697860 4854 scope.go:117] "RemoveContainer" containerID="4dd26d55eea17079a4cf92e15f926725e28c06e10fa5f6835df5a38b86923df5" Oct 07 12:52:41 crc kubenswrapper[4854]: E1007 12:52:41.698877 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 12:52:55 crc kubenswrapper[4854]: I1007 12:52:55.702821 4854 scope.go:117] "RemoveContainer" containerID="4dd26d55eea17079a4cf92e15f926725e28c06e10fa5f6835df5a38b86923df5" Oct 07 12:52:55 crc kubenswrapper[4854]: E1007 12:52:55.703809 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 12:53:09 crc kubenswrapper[4854]: I1007 12:53:09.703240 4854 scope.go:117] "RemoveContainer" containerID="4dd26d55eea17079a4cf92e15f926725e28c06e10fa5f6835df5a38b86923df5" Oct 07 12:53:09 crc kubenswrapper[4854]: E1007 12:53:09.704553 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 12:53:24 crc kubenswrapper[4854]: I1007 12:53:24.708470 4854 scope.go:117] "RemoveContainer" containerID="4dd26d55eea17079a4cf92e15f926725e28c06e10fa5f6835df5a38b86923df5" Oct 07 12:53:24 crc kubenswrapper[4854]: E1007 12:53:24.709651 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 12:53:36 crc kubenswrapper[4854]: I1007 12:53:36.702684 4854 scope.go:117] "RemoveContainer" containerID="4dd26d55eea17079a4cf92e15f926725e28c06e10fa5f6835df5a38b86923df5" Oct 07 12:53:36 crc kubenswrapper[4854]: E1007 12:53:36.703514 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 12:53:50 crc kubenswrapper[4854]: I1007 12:53:50.703313 4854 scope.go:117] "RemoveContainer" containerID="4dd26d55eea17079a4cf92e15f926725e28c06e10fa5f6835df5a38b86923df5" Oct 07 12:53:50 crc kubenswrapper[4854]: E1007 12:53:50.704053 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 12:54:02 crc kubenswrapper[4854]: I1007 12:54:02.703886 4854 scope.go:117] "RemoveContainer" containerID="4dd26d55eea17079a4cf92e15f926725e28c06e10fa5f6835df5a38b86923df5" Oct 07 12:54:02 crc kubenswrapper[4854]: E1007 12:54:02.704793 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 12:54:17 crc kubenswrapper[4854]: I1007 12:54:17.703749 4854 scope.go:117] "RemoveContainer" containerID="4dd26d55eea17079a4cf92e15f926725e28c06e10fa5f6835df5a38b86923df5" Oct 07 12:54:17 crc kubenswrapper[4854]: E1007 12:54:17.704828 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 12:54:32 crc kubenswrapper[4854]: I1007 12:54:32.704450 4854 scope.go:117] "RemoveContainer" containerID="4dd26d55eea17079a4cf92e15f926725e28c06e10fa5f6835df5a38b86923df5" Oct 07 12:54:32 crc kubenswrapper[4854]: E1007 12:54:32.705398 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 12:54:45 crc kubenswrapper[4854]: I1007 12:54:45.702958 4854 scope.go:117] "RemoveContainer" containerID="4dd26d55eea17079a4cf92e15f926725e28c06e10fa5f6835df5a38b86923df5" Oct 07 12:54:45 crc kubenswrapper[4854]: E1007 12:54:45.703888 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" 
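
The long run of paired "RemoveContainer" and "CrashLoopBackOff: back-off 5m0s restarting failed container" entries that follows is not the container crashing every few seconds: each pair is a periodic pod sync being rejected because the restart back-off has not yet expired. The kubelet doubles the back-off from an initial 10s up to the 5m cap quoted in the message, which is why the daemon killed at 12:52:41 is only started again at 12:57:53, roughly five minutes later. A sketch of that doubling-with-cap schedule; the 10s start and 5m cap are the kubelet's crash-loop values, and the rules for resetting the back-off after a stretch of healthy running are deliberately omitted:

package main

import (
	"fmt"
	"time"
)

// backoffSchedule returns successive restart delays: start at base and double
// each step, clamping at maxDelay.
func backoffSchedule(base, maxDelay time.Duration, steps int) []time.Duration {
	out := make([]time.Duration, 0, steps)
	d := base
	for i := 0; i < steps; i++ {
		out = append(out, d)
		d *= 2
		if d > maxDelay {
			d = maxDelay
		}
	}
	return out
}

func main() {
	for i, d := range backoffSchedule(10*time.Second, 5*time.Minute, 8) {
		fmt.Printf("restart %d delayed by %s\n", i+1, d)
	}
	// Prints 10s, 20s, 40s, 1m20s, 2m40s, 5m0s, 5m0s, 5m0s: once at the cap,
	// every sync attempt inside the window just logs the back-off error seen above.
}
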
podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 12:54:56 crc kubenswrapper[4854]: I1007 12:54:56.702227 4854 scope.go:117] "RemoveContainer" containerID="4dd26d55eea17079a4cf92e15f926725e28c06e10fa5f6835df5a38b86923df5" Oct 07 12:54:56 crc kubenswrapper[4854]: E1007 12:54:56.702904 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 12:55:09 crc kubenswrapper[4854]: I1007 12:55:09.702753 4854 scope.go:117] "RemoveContainer" containerID="4dd26d55eea17079a4cf92e15f926725e28c06e10fa5f6835df5a38b86923df5" Oct 07 12:55:09 crc kubenswrapper[4854]: E1007 12:55:09.703381 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 12:55:24 crc kubenswrapper[4854]: I1007 12:55:24.707534 4854 scope.go:117] "RemoveContainer" containerID="4dd26d55eea17079a4cf92e15f926725e28c06e10fa5f6835df5a38b86923df5" Oct 07 12:55:24 crc kubenswrapper[4854]: E1007 12:55:24.709928 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 12:55:35 crc kubenswrapper[4854]: I1007 12:55:35.703193 4854 scope.go:117] "RemoveContainer" containerID="4dd26d55eea17079a4cf92e15f926725e28c06e10fa5f6835df5a38b86923df5" Oct 07 12:55:35 crc kubenswrapper[4854]: E1007 12:55:35.703997 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 12:55:46 crc kubenswrapper[4854]: I1007 12:55:46.702626 4854 scope.go:117] "RemoveContainer" containerID="4dd26d55eea17079a4cf92e15f926725e28c06e10fa5f6835df5a38b86923df5" Oct 07 12:55:46 crc kubenswrapper[4854]: E1007 12:55:46.703447 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 12:55:57 crc kubenswrapper[4854]: I1007 12:55:57.703113 4854 scope.go:117] "RemoveContainer" 
containerID="4dd26d55eea17079a4cf92e15f926725e28c06e10fa5f6835df5a38b86923df5" Oct 07 12:55:57 crc kubenswrapper[4854]: E1007 12:55:57.703890 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 12:56:09 crc kubenswrapper[4854]: I1007 12:56:09.702811 4854 scope.go:117] "RemoveContainer" containerID="4dd26d55eea17079a4cf92e15f926725e28c06e10fa5f6835df5a38b86923df5" Oct 07 12:56:09 crc kubenswrapper[4854]: E1007 12:56:09.704291 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 12:56:22 crc kubenswrapper[4854]: I1007 12:56:22.703215 4854 scope.go:117] "RemoveContainer" containerID="4dd26d55eea17079a4cf92e15f926725e28c06e10fa5f6835df5a38b86923df5" Oct 07 12:56:22 crc kubenswrapper[4854]: E1007 12:56:22.704888 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 12:56:33 crc kubenswrapper[4854]: I1007 12:56:33.702879 4854 scope.go:117] "RemoveContainer" containerID="4dd26d55eea17079a4cf92e15f926725e28c06e10fa5f6835df5a38b86923df5" Oct 07 12:56:33 crc kubenswrapper[4854]: E1007 12:56:33.704796 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 12:56:44 crc kubenswrapper[4854]: I1007 12:56:44.713321 4854 scope.go:117] "RemoveContainer" containerID="4dd26d55eea17079a4cf92e15f926725e28c06e10fa5f6835df5a38b86923df5" Oct 07 12:56:44 crc kubenswrapper[4854]: E1007 12:56:44.714569 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 12:56:57 crc kubenswrapper[4854]: I1007 12:56:57.703413 4854 scope.go:117] "RemoveContainer" containerID="4dd26d55eea17079a4cf92e15f926725e28c06e10fa5f6835df5a38b86923df5" Oct 07 12:56:57 crc kubenswrapper[4854]: E1007 12:56:57.704088 4854 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 12:57:11 crc kubenswrapper[4854]: I1007 12:57:11.702998 4854 scope.go:117] "RemoveContainer" containerID="4dd26d55eea17079a4cf92e15f926725e28c06e10fa5f6835df5a38b86923df5" Oct 07 12:57:11 crc kubenswrapper[4854]: E1007 12:57:11.703933 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 12:57:26 crc kubenswrapper[4854]: I1007 12:57:26.702729 4854 scope.go:117] "RemoveContainer" containerID="4dd26d55eea17079a4cf92e15f926725e28c06e10fa5f6835df5a38b86923df5" Oct 07 12:57:26 crc kubenswrapper[4854]: E1007 12:57:26.703466 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 12:57:39 crc kubenswrapper[4854]: I1007 12:57:39.702733 4854 scope.go:117] "RemoveContainer" containerID="4dd26d55eea17079a4cf92e15f926725e28c06e10fa5f6835df5a38b86923df5" Oct 07 12:57:39 crc kubenswrapper[4854]: E1007 12:57:39.703765 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 12:57:52 crc kubenswrapper[4854]: I1007 12:57:52.702836 4854 scope.go:117] "RemoveContainer" containerID="4dd26d55eea17079a4cf92e15f926725e28c06e10fa5f6835df5a38b86923df5" Oct 07 12:57:53 crc kubenswrapper[4854]: I1007 12:57:53.391422 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" event={"ID":"40b8b82d-cfd5-41d7-8673-5774db092c85","Type":"ContainerStarted","Data":"d2a1a38e20691746e12ccd6dfe6c642f0b09a208db24501927e2ee26a8b722a7"} Oct 07 12:58:13 crc kubenswrapper[4854]: I1007 12:58:13.462391 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-b7clm"] Oct 07 12:58:13 crc kubenswrapper[4854]: E1007 12:58:13.463258 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="020f1b29-4e54-4e79-8271-cea1a6e3aec0" containerName="extract-content" Oct 07 12:58:13 crc kubenswrapper[4854]: I1007 12:58:13.463273 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="020f1b29-4e54-4e79-8271-cea1a6e3aec0" containerName="extract-content" Oct 07 12:58:13 crc kubenswrapper[4854]: E1007 12:58:13.463283 4854 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="020f1b29-4e54-4e79-8271-cea1a6e3aec0" containerName="registry-server" Oct 07 12:58:13 crc kubenswrapper[4854]: I1007 12:58:13.463289 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="020f1b29-4e54-4e79-8271-cea1a6e3aec0" containerName="registry-server" Oct 07 12:58:13 crc kubenswrapper[4854]: E1007 12:58:13.463299 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68481956-b7a9-4578-bcd5-d216a9a324bb" containerName="registry-server" Oct 07 12:58:13 crc kubenswrapper[4854]: I1007 12:58:13.463306 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="68481956-b7a9-4578-bcd5-d216a9a324bb" containerName="registry-server" Oct 07 12:58:13 crc kubenswrapper[4854]: E1007 12:58:13.463320 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68481956-b7a9-4578-bcd5-d216a9a324bb" containerName="extract-utilities" Oct 07 12:58:13 crc kubenswrapper[4854]: I1007 12:58:13.463327 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="68481956-b7a9-4578-bcd5-d216a9a324bb" containerName="extract-utilities" Oct 07 12:58:13 crc kubenswrapper[4854]: E1007 12:58:13.463341 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="020f1b29-4e54-4e79-8271-cea1a6e3aec0" containerName="extract-utilities" Oct 07 12:58:13 crc kubenswrapper[4854]: I1007 12:58:13.463348 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="020f1b29-4e54-4e79-8271-cea1a6e3aec0" containerName="extract-utilities" Oct 07 12:58:13 crc kubenswrapper[4854]: E1007 12:58:13.463360 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68481956-b7a9-4578-bcd5-d216a9a324bb" containerName="extract-content" Oct 07 12:58:13 crc kubenswrapper[4854]: I1007 12:58:13.463375 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="68481956-b7a9-4578-bcd5-d216a9a324bb" containerName="extract-content" Oct 07 12:58:13 crc kubenswrapper[4854]: I1007 12:58:13.463529 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="68481956-b7a9-4578-bcd5-d216a9a324bb" containerName="registry-server" Oct 07 12:58:13 crc kubenswrapper[4854]: I1007 12:58:13.463541 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="020f1b29-4e54-4e79-8271-cea1a6e3aec0" containerName="registry-server" Oct 07 12:58:13 crc kubenswrapper[4854]: I1007 12:58:13.464552 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b7clm" Oct 07 12:58:13 crc kubenswrapper[4854]: I1007 12:58:13.472817 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b7clm"] Oct 07 12:58:13 crc kubenswrapper[4854]: I1007 12:58:13.496137 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrpd8\" (UniqueName: \"kubernetes.io/projected/72394a6d-6967-4a08-9b12-19bf3572fc89-kube-api-access-hrpd8\") pod \"redhat-marketplace-b7clm\" (UID: \"72394a6d-6967-4a08-9b12-19bf3572fc89\") " pod="openshift-marketplace/redhat-marketplace-b7clm" Oct 07 12:58:13 crc kubenswrapper[4854]: I1007 12:58:13.496241 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72394a6d-6967-4a08-9b12-19bf3572fc89-catalog-content\") pod \"redhat-marketplace-b7clm\" (UID: \"72394a6d-6967-4a08-9b12-19bf3572fc89\") " pod="openshift-marketplace/redhat-marketplace-b7clm" Oct 07 12:58:13 crc kubenswrapper[4854]: I1007 12:58:13.496340 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72394a6d-6967-4a08-9b12-19bf3572fc89-utilities\") pod \"redhat-marketplace-b7clm\" (UID: \"72394a6d-6967-4a08-9b12-19bf3572fc89\") " pod="openshift-marketplace/redhat-marketplace-b7clm" Oct 07 12:58:13 crc kubenswrapper[4854]: I1007 12:58:13.597374 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrpd8\" (UniqueName: \"kubernetes.io/projected/72394a6d-6967-4a08-9b12-19bf3572fc89-kube-api-access-hrpd8\") pod \"redhat-marketplace-b7clm\" (UID: \"72394a6d-6967-4a08-9b12-19bf3572fc89\") " pod="openshift-marketplace/redhat-marketplace-b7clm" Oct 07 12:58:13 crc kubenswrapper[4854]: I1007 12:58:13.597474 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72394a6d-6967-4a08-9b12-19bf3572fc89-catalog-content\") pod \"redhat-marketplace-b7clm\" (UID: \"72394a6d-6967-4a08-9b12-19bf3572fc89\") " pod="openshift-marketplace/redhat-marketplace-b7clm" Oct 07 12:58:13 crc kubenswrapper[4854]: I1007 12:58:13.597523 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72394a6d-6967-4a08-9b12-19bf3572fc89-utilities\") pod \"redhat-marketplace-b7clm\" (UID: \"72394a6d-6967-4a08-9b12-19bf3572fc89\") " pod="openshift-marketplace/redhat-marketplace-b7clm" Oct 07 12:58:13 crc kubenswrapper[4854]: I1007 12:58:13.598078 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72394a6d-6967-4a08-9b12-19bf3572fc89-utilities\") pod \"redhat-marketplace-b7clm\" (UID: \"72394a6d-6967-4a08-9b12-19bf3572fc89\") " pod="openshift-marketplace/redhat-marketplace-b7clm" Oct 07 12:58:13 crc kubenswrapper[4854]: I1007 12:58:13.598325 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72394a6d-6967-4a08-9b12-19bf3572fc89-catalog-content\") pod \"redhat-marketplace-b7clm\" (UID: \"72394a6d-6967-4a08-9b12-19bf3572fc89\") " pod="openshift-marketplace/redhat-marketplace-b7clm" Oct 07 12:58:13 crc kubenswrapper[4854]: I1007 12:58:13.626590 4854 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-hrpd8\" (UniqueName: \"kubernetes.io/projected/72394a6d-6967-4a08-9b12-19bf3572fc89-kube-api-access-hrpd8\") pod \"redhat-marketplace-b7clm\" (UID: \"72394a6d-6967-4a08-9b12-19bf3572fc89\") " pod="openshift-marketplace/redhat-marketplace-b7clm" Oct 07 12:58:13 crc kubenswrapper[4854]: I1007 12:58:13.788277 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b7clm" Oct 07 12:58:14 crc kubenswrapper[4854]: I1007 12:58:14.225950 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b7clm"] Oct 07 12:58:14 crc kubenswrapper[4854]: I1007 12:58:14.563752 4854 generic.go:334] "Generic (PLEG): container finished" podID="72394a6d-6967-4a08-9b12-19bf3572fc89" containerID="a0f9d67c07cdce6171a0677873e3266c5ad7aadf65938a31b25f46758ab00336" exitCode=0 Oct 07 12:58:14 crc kubenswrapper[4854]: I1007 12:58:14.564304 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b7clm" event={"ID":"72394a6d-6967-4a08-9b12-19bf3572fc89","Type":"ContainerDied","Data":"a0f9d67c07cdce6171a0677873e3266c5ad7aadf65938a31b25f46758ab00336"} Oct 07 12:58:14 crc kubenswrapper[4854]: I1007 12:58:14.564372 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b7clm" event={"ID":"72394a6d-6967-4a08-9b12-19bf3572fc89","Type":"ContainerStarted","Data":"11c622b8132c8d5a5908a74d1855d35db43a8a88a0cafc46db08125f555c0546"} Oct 07 12:58:14 crc kubenswrapper[4854]: I1007 12:58:14.566332 4854 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 12:58:16 crc kubenswrapper[4854]: I1007 12:58:16.582552 4854 generic.go:334] "Generic (PLEG): container finished" podID="72394a6d-6967-4a08-9b12-19bf3572fc89" containerID="8d188dee906f17a8c8e01c48801ed30a5c5c18a28ddd1a5d2e8234407b5da73b" exitCode=0 Oct 07 12:58:16 crc kubenswrapper[4854]: I1007 12:58:16.582796 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b7clm" event={"ID":"72394a6d-6967-4a08-9b12-19bf3572fc89","Type":"ContainerDied","Data":"8d188dee906f17a8c8e01c48801ed30a5c5c18a28ddd1a5d2e8234407b5da73b"} Oct 07 12:58:17 crc kubenswrapper[4854]: I1007 12:58:17.592609 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b7clm" event={"ID":"72394a6d-6967-4a08-9b12-19bf3572fc89","Type":"ContainerStarted","Data":"f885adb68d62dc45e7e28f8153b90eed0b114df4e0d1ccd429dec4af876c334e"} Oct 07 12:58:17 crc kubenswrapper[4854]: I1007 12:58:17.622704 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-b7clm" podStartSLOduration=2.063234184 podStartE2EDuration="4.622678691s" podCreationTimestamp="2025-10-07 12:58:13 +0000 UTC" firstStartedPulling="2025-10-07 12:58:14.565898886 +0000 UTC m=+2010.553731181" lastFinishedPulling="2025-10-07 12:58:17.125343423 +0000 UTC m=+2013.113175688" observedRunningTime="2025-10-07 12:58:17.614346728 +0000 UTC m=+2013.602178983" watchObservedRunningTime="2025-10-07 12:58:17.622678691 +0000 UTC m=+2013.610510986" Oct 07 12:58:23 crc kubenswrapper[4854]: I1007 12:58:23.789379 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-b7clm" Oct 07 12:58:23 crc kubenswrapper[4854]: I1007 12:58:23.789952 4854 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-b7clm" Oct 07 12:58:23 crc kubenswrapper[4854]: I1007 12:58:23.852815 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-b7clm" Oct 07 12:58:24 crc kubenswrapper[4854]: I1007 12:58:24.730269 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-b7clm" Oct 07 12:58:24 crc kubenswrapper[4854]: I1007 12:58:24.777352 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-b7clm"] Oct 07 12:58:26 crc kubenswrapper[4854]: I1007 12:58:26.681074 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-b7clm" podUID="72394a6d-6967-4a08-9b12-19bf3572fc89" containerName="registry-server" containerID="cri-o://f885adb68d62dc45e7e28f8153b90eed0b114df4e0d1ccd429dec4af876c334e" gracePeriod=2 Oct 07 12:58:27 crc kubenswrapper[4854]: I1007 12:58:27.069759 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b7clm" Oct 07 12:58:27 crc kubenswrapper[4854]: I1007 12:58:27.130885 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72394a6d-6967-4a08-9b12-19bf3572fc89-utilities\") pod \"72394a6d-6967-4a08-9b12-19bf3572fc89\" (UID: \"72394a6d-6967-4a08-9b12-19bf3572fc89\") " Oct 07 12:58:27 crc kubenswrapper[4854]: I1007 12:58:27.131006 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72394a6d-6967-4a08-9b12-19bf3572fc89-catalog-content\") pod \"72394a6d-6967-4a08-9b12-19bf3572fc89\" (UID: \"72394a6d-6967-4a08-9b12-19bf3572fc89\") " Oct 07 12:58:27 crc kubenswrapper[4854]: I1007 12:58:27.131181 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrpd8\" (UniqueName: \"kubernetes.io/projected/72394a6d-6967-4a08-9b12-19bf3572fc89-kube-api-access-hrpd8\") pod \"72394a6d-6967-4a08-9b12-19bf3572fc89\" (UID: \"72394a6d-6967-4a08-9b12-19bf3572fc89\") " Oct 07 12:58:27 crc kubenswrapper[4854]: I1007 12:58:27.132000 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72394a6d-6967-4a08-9b12-19bf3572fc89-utilities" (OuterVolumeSpecName: "utilities") pod "72394a6d-6967-4a08-9b12-19bf3572fc89" (UID: "72394a6d-6967-4a08-9b12-19bf3572fc89"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:58:27 crc kubenswrapper[4854]: I1007 12:58:27.136089 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72394a6d-6967-4a08-9b12-19bf3572fc89-kube-api-access-hrpd8" (OuterVolumeSpecName: "kube-api-access-hrpd8") pod "72394a6d-6967-4a08-9b12-19bf3572fc89" (UID: "72394a6d-6967-4a08-9b12-19bf3572fc89"). InnerVolumeSpecName "kube-api-access-hrpd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:58:27 crc kubenswrapper[4854]: I1007 12:58:27.232666 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hrpd8\" (UniqueName: \"kubernetes.io/projected/72394a6d-6967-4a08-9b12-19bf3572fc89-kube-api-access-hrpd8\") on node \"crc\" DevicePath \"\"" Oct 07 12:58:27 crc kubenswrapper[4854]: I1007 12:58:27.232706 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72394a6d-6967-4a08-9b12-19bf3572fc89-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 12:58:27 crc kubenswrapper[4854]: I1007 12:58:27.262460 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72394a6d-6967-4a08-9b12-19bf3572fc89-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "72394a6d-6967-4a08-9b12-19bf3572fc89" (UID: "72394a6d-6967-4a08-9b12-19bf3572fc89"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:58:27 crc kubenswrapper[4854]: I1007 12:58:27.333195 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72394a6d-6967-4a08-9b12-19bf3572fc89-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 12:58:27 crc kubenswrapper[4854]: I1007 12:58:27.691436 4854 generic.go:334] "Generic (PLEG): container finished" podID="72394a6d-6967-4a08-9b12-19bf3572fc89" containerID="f885adb68d62dc45e7e28f8153b90eed0b114df4e0d1ccd429dec4af876c334e" exitCode=0 Oct 07 12:58:27 crc kubenswrapper[4854]: I1007 12:58:27.691482 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b7clm" Oct 07 12:58:27 crc kubenswrapper[4854]: I1007 12:58:27.691479 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b7clm" event={"ID":"72394a6d-6967-4a08-9b12-19bf3572fc89","Type":"ContainerDied","Data":"f885adb68d62dc45e7e28f8153b90eed0b114df4e0d1ccd429dec4af876c334e"} Oct 07 12:58:27 crc kubenswrapper[4854]: I1007 12:58:27.691619 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b7clm" event={"ID":"72394a6d-6967-4a08-9b12-19bf3572fc89","Type":"ContainerDied","Data":"11c622b8132c8d5a5908a74d1855d35db43a8a88a0cafc46db08125f555c0546"} Oct 07 12:58:27 crc kubenswrapper[4854]: I1007 12:58:27.691653 4854 scope.go:117] "RemoveContainer" containerID="f885adb68d62dc45e7e28f8153b90eed0b114df4e0d1ccd429dec4af876c334e" Oct 07 12:58:27 crc kubenswrapper[4854]: I1007 12:58:27.721739 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-b7clm"] Oct 07 12:58:27 crc kubenswrapper[4854]: I1007 12:58:27.726481 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-b7clm"] Oct 07 12:58:27 crc kubenswrapper[4854]: I1007 12:58:27.726975 4854 scope.go:117] "RemoveContainer" containerID="8d188dee906f17a8c8e01c48801ed30a5c5c18a28ddd1a5d2e8234407b5da73b" Oct 07 12:58:27 crc kubenswrapper[4854]: I1007 12:58:27.754105 4854 scope.go:117] "RemoveContainer" containerID="a0f9d67c07cdce6171a0677873e3266c5ad7aadf65938a31b25f46758ab00336" Oct 07 12:58:27 crc kubenswrapper[4854]: I1007 12:58:27.781802 4854 scope.go:117] "RemoveContainer" containerID="f885adb68d62dc45e7e28f8153b90eed0b114df4e0d1ccd429dec4af876c334e" Oct 07 12:58:27 crc kubenswrapper[4854]: E1007 12:58:27.782268 4854 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f885adb68d62dc45e7e28f8153b90eed0b114df4e0d1ccd429dec4af876c334e\": container with ID starting with f885adb68d62dc45e7e28f8153b90eed0b114df4e0d1ccd429dec4af876c334e not found: ID does not exist" containerID="f885adb68d62dc45e7e28f8153b90eed0b114df4e0d1ccd429dec4af876c334e" Oct 07 12:58:27 crc kubenswrapper[4854]: I1007 12:58:27.782387 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f885adb68d62dc45e7e28f8153b90eed0b114df4e0d1ccd429dec4af876c334e"} err="failed to get container status \"f885adb68d62dc45e7e28f8153b90eed0b114df4e0d1ccd429dec4af876c334e\": rpc error: code = NotFound desc = could not find container \"f885adb68d62dc45e7e28f8153b90eed0b114df4e0d1ccd429dec4af876c334e\": container with ID starting with f885adb68d62dc45e7e28f8153b90eed0b114df4e0d1ccd429dec4af876c334e not found: ID does not exist" Oct 07 12:58:27 crc kubenswrapper[4854]: I1007 12:58:27.782470 4854 scope.go:117] "RemoveContainer" containerID="8d188dee906f17a8c8e01c48801ed30a5c5c18a28ddd1a5d2e8234407b5da73b" Oct 07 12:58:27 crc kubenswrapper[4854]: E1007 12:58:27.783086 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d188dee906f17a8c8e01c48801ed30a5c5c18a28ddd1a5d2e8234407b5da73b\": container with ID starting with 8d188dee906f17a8c8e01c48801ed30a5c5c18a28ddd1a5d2e8234407b5da73b not found: ID does not exist" containerID="8d188dee906f17a8c8e01c48801ed30a5c5c18a28ddd1a5d2e8234407b5da73b" Oct 07 12:58:27 crc kubenswrapper[4854]: I1007 12:58:27.783141 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d188dee906f17a8c8e01c48801ed30a5c5c18a28ddd1a5d2e8234407b5da73b"} err="failed to get container status \"8d188dee906f17a8c8e01c48801ed30a5c5c18a28ddd1a5d2e8234407b5da73b\": rpc error: code = NotFound desc = could not find container \"8d188dee906f17a8c8e01c48801ed30a5c5c18a28ddd1a5d2e8234407b5da73b\": container with ID starting with 8d188dee906f17a8c8e01c48801ed30a5c5c18a28ddd1a5d2e8234407b5da73b not found: ID does not exist" Oct 07 12:58:27 crc kubenswrapper[4854]: I1007 12:58:27.783224 4854 scope.go:117] "RemoveContainer" containerID="a0f9d67c07cdce6171a0677873e3266c5ad7aadf65938a31b25f46758ab00336" Oct 07 12:58:27 crc kubenswrapper[4854]: E1007 12:58:27.783564 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0f9d67c07cdce6171a0677873e3266c5ad7aadf65938a31b25f46758ab00336\": container with ID starting with a0f9d67c07cdce6171a0677873e3266c5ad7aadf65938a31b25f46758ab00336 not found: ID does not exist" containerID="a0f9d67c07cdce6171a0677873e3266c5ad7aadf65938a31b25f46758ab00336" Oct 07 12:58:27 crc kubenswrapper[4854]: I1007 12:58:27.783608 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0f9d67c07cdce6171a0677873e3266c5ad7aadf65938a31b25f46758ab00336"} err="failed to get container status \"a0f9d67c07cdce6171a0677873e3266c5ad7aadf65938a31b25f46758ab00336\": rpc error: code = NotFound desc = could not find container \"a0f9d67c07cdce6171a0677873e3266c5ad7aadf65938a31b25f46758ab00336\": container with ID starting with a0f9d67c07cdce6171a0677873e3266c5ad7aadf65938a31b25f46758ab00336 not found: ID does not exist" Oct 07 12:58:28 crc kubenswrapper[4854]: I1007 12:58:28.712707 4854 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="72394a6d-6967-4a08-9b12-19bf3572fc89" path="/var/lib/kubelet/pods/72394a6d-6967-4a08-9b12-19bf3572fc89/volumes" Oct 07 12:59:07 crc kubenswrapper[4854]: I1007 12:59:07.614702 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hh2vv"] Oct 07 12:59:07 crc kubenswrapper[4854]: E1007 12:59:07.617032 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72394a6d-6967-4a08-9b12-19bf3572fc89" containerName="extract-utilities" Oct 07 12:59:07 crc kubenswrapper[4854]: I1007 12:59:07.617055 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="72394a6d-6967-4a08-9b12-19bf3572fc89" containerName="extract-utilities" Oct 07 12:59:07 crc kubenswrapper[4854]: E1007 12:59:07.617074 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72394a6d-6967-4a08-9b12-19bf3572fc89" containerName="extract-content" Oct 07 12:59:07 crc kubenswrapper[4854]: I1007 12:59:07.617083 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="72394a6d-6967-4a08-9b12-19bf3572fc89" containerName="extract-content" Oct 07 12:59:07 crc kubenswrapper[4854]: E1007 12:59:07.617099 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72394a6d-6967-4a08-9b12-19bf3572fc89" containerName="registry-server" Oct 07 12:59:07 crc kubenswrapper[4854]: I1007 12:59:07.617108 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="72394a6d-6967-4a08-9b12-19bf3572fc89" containerName="registry-server" Oct 07 12:59:07 crc kubenswrapper[4854]: I1007 12:59:07.617364 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="72394a6d-6967-4a08-9b12-19bf3572fc89" containerName="registry-server" Oct 07 12:59:07 crc kubenswrapper[4854]: I1007 12:59:07.618809 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hh2vv" Oct 07 12:59:07 crc kubenswrapper[4854]: I1007 12:59:07.621430 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hh2vv"] Oct 07 12:59:07 crc kubenswrapper[4854]: I1007 12:59:07.738973 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a879d2f-9848-408c-ac63-70933900f0f7-utilities\") pod \"community-operators-hh2vv\" (UID: \"4a879d2f-9848-408c-ac63-70933900f0f7\") " pod="openshift-marketplace/community-operators-hh2vv" Oct 07 12:59:07 crc kubenswrapper[4854]: I1007 12:59:07.739133 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8c98f\" (UniqueName: \"kubernetes.io/projected/4a879d2f-9848-408c-ac63-70933900f0f7-kube-api-access-8c98f\") pod \"community-operators-hh2vv\" (UID: \"4a879d2f-9848-408c-ac63-70933900f0f7\") " pod="openshift-marketplace/community-operators-hh2vv" Oct 07 12:59:07 crc kubenswrapper[4854]: I1007 12:59:07.739197 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a879d2f-9848-408c-ac63-70933900f0f7-catalog-content\") pod \"community-operators-hh2vv\" (UID: \"4a879d2f-9848-408c-ac63-70933900f0f7\") " pod="openshift-marketplace/community-operators-hh2vv" Oct 07 12:59:07 crc kubenswrapper[4854]: I1007 12:59:07.840303 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8c98f\" (UniqueName: \"kubernetes.io/projected/4a879d2f-9848-408c-ac63-70933900f0f7-kube-api-access-8c98f\") pod \"community-operators-hh2vv\" (UID: \"4a879d2f-9848-408c-ac63-70933900f0f7\") " pod="openshift-marketplace/community-operators-hh2vv" Oct 07 12:59:07 crc kubenswrapper[4854]: I1007 12:59:07.840415 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a879d2f-9848-408c-ac63-70933900f0f7-catalog-content\") pod \"community-operators-hh2vv\" (UID: \"4a879d2f-9848-408c-ac63-70933900f0f7\") " pod="openshift-marketplace/community-operators-hh2vv" Oct 07 12:59:07 crc kubenswrapper[4854]: I1007 12:59:07.840572 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a879d2f-9848-408c-ac63-70933900f0f7-utilities\") pod \"community-operators-hh2vv\" (UID: \"4a879d2f-9848-408c-ac63-70933900f0f7\") " pod="openshift-marketplace/community-operators-hh2vv" Oct 07 12:59:07 crc kubenswrapper[4854]: I1007 12:59:07.841163 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a879d2f-9848-408c-ac63-70933900f0f7-utilities\") pod \"community-operators-hh2vv\" (UID: \"4a879d2f-9848-408c-ac63-70933900f0f7\") " pod="openshift-marketplace/community-operators-hh2vv" Oct 07 12:59:07 crc kubenswrapper[4854]: I1007 12:59:07.841174 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a879d2f-9848-408c-ac63-70933900f0f7-catalog-content\") pod \"community-operators-hh2vv\" (UID: \"4a879d2f-9848-408c-ac63-70933900f0f7\") " pod="openshift-marketplace/community-operators-hh2vv" Oct 07 12:59:07 crc kubenswrapper[4854]: I1007 12:59:07.868960 4854 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-8c98f\" (UniqueName: \"kubernetes.io/projected/4a879d2f-9848-408c-ac63-70933900f0f7-kube-api-access-8c98f\") pod \"community-operators-hh2vv\" (UID: \"4a879d2f-9848-408c-ac63-70933900f0f7\") " pod="openshift-marketplace/community-operators-hh2vv" Oct 07 12:59:07 crc kubenswrapper[4854]: I1007 12:59:07.942735 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hh2vv" Oct 07 12:59:08 crc kubenswrapper[4854]: I1007 12:59:08.430194 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hh2vv"] Oct 07 12:59:09 crc kubenswrapper[4854]: I1007 12:59:09.104486 4854 generic.go:334] "Generic (PLEG): container finished" podID="4a879d2f-9848-408c-ac63-70933900f0f7" containerID="b0480b9070a425bdcaf40037367636184d42bc8e2a9bb4636cfcb424f1a6f5a6" exitCode=0 Oct 07 12:59:09 crc kubenswrapper[4854]: I1007 12:59:09.104545 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hh2vv" event={"ID":"4a879d2f-9848-408c-ac63-70933900f0f7","Type":"ContainerDied","Data":"b0480b9070a425bdcaf40037367636184d42bc8e2a9bb4636cfcb424f1a6f5a6"} Oct 07 12:59:09 crc kubenswrapper[4854]: I1007 12:59:09.104773 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hh2vv" event={"ID":"4a879d2f-9848-408c-ac63-70933900f0f7","Type":"ContainerStarted","Data":"1e280ef69559d7d41ffcaad2d1b1519938116d840c254d695bdcbf5587c26c6d"} Oct 07 12:59:11 crc kubenswrapper[4854]: I1007 12:59:11.121044 4854 generic.go:334] "Generic (PLEG): container finished" podID="4a879d2f-9848-408c-ac63-70933900f0f7" containerID="2e8b8ccfee9794f9f24b223af9dd22e887aa4b08eeb2c793f7eeeb9fe90fd02a" exitCode=0 Oct 07 12:59:11 crc kubenswrapper[4854]: I1007 12:59:11.121221 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hh2vv" event={"ID":"4a879d2f-9848-408c-ac63-70933900f0f7","Type":"ContainerDied","Data":"2e8b8ccfee9794f9f24b223af9dd22e887aa4b08eeb2c793f7eeeb9fe90fd02a"} Oct 07 12:59:13 crc kubenswrapper[4854]: I1007 12:59:13.137506 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hh2vv" event={"ID":"4a879d2f-9848-408c-ac63-70933900f0f7","Type":"ContainerStarted","Data":"fa7a708d2bd913b1cd53b9d26b22aef7033240703b44550b8ea7b97745b32990"} Oct 07 12:59:13 crc kubenswrapper[4854]: I1007 12:59:13.156329 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hh2vv" podStartSLOduration=3.478299281 podStartE2EDuration="6.156310659s" podCreationTimestamp="2025-10-07 12:59:07 +0000 UTC" firstStartedPulling="2025-10-07 12:59:09.107674729 +0000 UTC m=+2065.095506984" lastFinishedPulling="2025-10-07 12:59:11.785686107 +0000 UTC m=+2067.773518362" observedRunningTime="2025-10-07 12:59:13.156126483 +0000 UTC m=+2069.143958758" watchObservedRunningTime="2025-10-07 12:59:13.156310659 +0000 UTC m=+2069.144142904" Oct 07 12:59:17 crc kubenswrapper[4854]: I1007 12:59:17.943398 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hh2vv" Oct 07 12:59:17 crc kubenswrapper[4854]: I1007 12:59:17.944700 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hh2vv" Oct 07 12:59:18 crc kubenswrapper[4854]: I1007 12:59:18.036658 4854 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hh2vv" Oct 07 12:59:18 crc kubenswrapper[4854]: I1007 12:59:18.256306 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hh2vv" Oct 07 12:59:18 crc kubenswrapper[4854]: I1007 12:59:18.303163 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hh2vv"] Oct 07 12:59:20 crc kubenswrapper[4854]: I1007 12:59:20.199338 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hh2vv" podUID="4a879d2f-9848-408c-ac63-70933900f0f7" containerName="registry-server" containerID="cri-o://fa7a708d2bd913b1cd53b9d26b22aef7033240703b44550b8ea7b97745b32990" gracePeriod=2 Oct 07 12:59:20 crc kubenswrapper[4854]: I1007 12:59:20.576734 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hh2vv" Oct 07 12:59:20 crc kubenswrapper[4854]: I1007 12:59:20.738810 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a879d2f-9848-408c-ac63-70933900f0f7-utilities\") pod \"4a879d2f-9848-408c-ac63-70933900f0f7\" (UID: \"4a879d2f-9848-408c-ac63-70933900f0f7\") " Oct 07 12:59:20 crc kubenswrapper[4854]: I1007 12:59:20.738892 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a879d2f-9848-408c-ac63-70933900f0f7-catalog-content\") pod \"4a879d2f-9848-408c-ac63-70933900f0f7\" (UID: \"4a879d2f-9848-408c-ac63-70933900f0f7\") " Oct 07 12:59:20 crc kubenswrapper[4854]: I1007 12:59:20.739015 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8c98f\" (UniqueName: \"kubernetes.io/projected/4a879d2f-9848-408c-ac63-70933900f0f7-kube-api-access-8c98f\") pod \"4a879d2f-9848-408c-ac63-70933900f0f7\" (UID: \"4a879d2f-9848-408c-ac63-70933900f0f7\") " Oct 07 12:59:20 crc kubenswrapper[4854]: I1007 12:59:20.741917 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a879d2f-9848-408c-ac63-70933900f0f7-utilities" (OuterVolumeSpecName: "utilities") pod "4a879d2f-9848-408c-ac63-70933900f0f7" (UID: "4a879d2f-9848-408c-ac63-70933900f0f7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:59:20 crc kubenswrapper[4854]: I1007 12:59:20.747440 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a879d2f-9848-408c-ac63-70933900f0f7-kube-api-access-8c98f" (OuterVolumeSpecName: "kube-api-access-8c98f") pod "4a879d2f-9848-408c-ac63-70933900f0f7" (UID: "4a879d2f-9848-408c-ac63-70933900f0f7"). InnerVolumeSpecName "kube-api-access-8c98f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 12:59:20 crc kubenswrapper[4854]: I1007 12:59:20.795283 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a879d2f-9848-408c-ac63-70933900f0f7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4a879d2f-9848-408c-ac63-70933900f0f7" (UID: "4a879d2f-9848-408c-ac63-70933900f0f7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 12:59:20 crc kubenswrapper[4854]: I1007 12:59:20.841250 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a879d2f-9848-408c-ac63-70933900f0f7-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 12:59:20 crc kubenswrapper[4854]: I1007 12:59:20.841284 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a879d2f-9848-408c-ac63-70933900f0f7-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 12:59:20 crc kubenswrapper[4854]: I1007 12:59:20.841295 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8c98f\" (UniqueName: \"kubernetes.io/projected/4a879d2f-9848-408c-ac63-70933900f0f7-kube-api-access-8c98f\") on node \"crc\" DevicePath \"\"" Oct 07 12:59:21 crc kubenswrapper[4854]: I1007 12:59:21.209342 4854 generic.go:334] "Generic (PLEG): container finished" podID="4a879d2f-9848-408c-ac63-70933900f0f7" containerID="fa7a708d2bd913b1cd53b9d26b22aef7033240703b44550b8ea7b97745b32990" exitCode=0 Oct 07 12:59:21 crc kubenswrapper[4854]: I1007 12:59:21.209388 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hh2vv" event={"ID":"4a879d2f-9848-408c-ac63-70933900f0f7","Type":"ContainerDied","Data":"fa7a708d2bd913b1cd53b9d26b22aef7033240703b44550b8ea7b97745b32990"} Oct 07 12:59:21 crc kubenswrapper[4854]: I1007 12:59:21.209407 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hh2vv" Oct 07 12:59:21 crc kubenswrapper[4854]: I1007 12:59:21.209417 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hh2vv" event={"ID":"4a879d2f-9848-408c-ac63-70933900f0f7","Type":"ContainerDied","Data":"1e280ef69559d7d41ffcaad2d1b1519938116d840c254d695bdcbf5587c26c6d"} Oct 07 12:59:21 crc kubenswrapper[4854]: I1007 12:59:21.209440 4854 scope.go:117] "RemoveContainer" containerID="fa7a708d2bd913b1cd53b9d26b22aef7033240703b44550b8ea7b97745b32990" Oct 07 12:59:21 crc kubenswrapper[4854]: I1007 12:59:21.227041 4854 scope.go:117] "RemoveContainer" containerID="2e8b8ccfee9794f9f24b223af9dd22e887aa4b08eeb2c793f7eeeb9fe90fd02a" Oct 07 12:59:21 crc kubenswrapper[4854]: I1007 12:59:21.243088 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hh2vv"] Oct 07 12:59:21 crc kubenswrapper[4854]: I1007 12:59:21.248234 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hh2vv"] Oct 07 12:59:21 crc kubenswrapper[4854]: I1007 12:59:21.257903 4854 scope.go:117] "RemoveContainer" containerID="b0480b9070a425bdcaf40037367636184d42bc8e2a9bb4636cfcb424f1a6f5a6" Oct 07 12:59:21 crc kubenswrapper[4854]: I1007 12:59:21.282382 4854 scope.go:117] "RemoveContainer" containerID="fa7a708d2bd913b1cd53b9d26b22aef7033240703b44550b8ea7b97745b32990" Oct 07 12:59:21 crc kubenswrapper[4854]: E1007 12:59:21.282768 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa7a708d2bd913b1cd53b9d26b22aef7033240703b44550b8ea7b97745b32990\": container with ID starting with fa7a708d2bd913b1cd53b9d26b22aef7033240703b44550b8ea7b97745b32990 not found: ID does not exist" containerID="fa7a708d2bd913b1cd53b9d26b22aef7033240703b44550b8ea7b97745b32990" Oct 07 12:59:21 crc kubenswrapper[4854]: I1007 12:59:21.282801 
4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa7a708d2bd913b1cd53b9d26b22aef7033240703b44550b8ea7b97745b32990"} err="failed to get container status \"fa7a708d2bd913b1cd53b9d26b22aef7033240703b44550b8ea7b97745b32990\": rpc error: code = NotFound desc = could not find container \"fa7a708d2bd913b1cd53b9d26b22aef7033240703b44550b8ea7b97745b32990\": container with ID starting with fa7a708d2bd913b1cd53b9d26b22aef7033240703b44550b8ea7b97745b32990 not found: ID does not exist" Oct 07 12:59:21 crc kubenswrapper[4854]: I1007 12:59:21.282820 4854 scope.go:117] "RemoveContainer" containerID="2e8b8ccfee9794f9f24b223af9dd22e887aa4b08eeb2c793f7eeeb9fe90fd02a" Oct 07 12:59:21 crc kubenswrapper[4854]: E1007 12:59:21.283003 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e8b8ccfee9794f9f24b223af9dd22e887aa4b08eeb2c793f7eeeb9fe90fd02a\": container with ID starting with 2e8b8ccfee9794f9f24b223af9dd22e887aa4b08eeb2c793f7eeeb9fe90fd02a not found: ID does not exist" containerID="2e8b8ccfee9794f9f24b223af9dd22e887aa4b08eeb2c793f7eeeb9fe90fd02a" Oct 07 12:59:21 crc kubenswrapper[4854]: I1007 12:59:21.283018 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e8b8ccfee9794f9f24b223af9dd22e887aa4b08eeb2c793f7eeeb9fe90fd02a"} err="failed to get container status \"2e8b8ccfee9794f9f24b223af9dd22e887aa4b08eeb2c793f7eeeb9fe90fd02a\": rpc error: code = NotFound desc = could not find container \"2e8b8ccfee9794f9f24b223af9dd22e887aa4b08eeb2c793f7eeeb9fe90fd02a\": container with ID starting with 2e8b8ccfee9794f9f24b223af9dd22e887aa4b08eeb2c793f7eeeb9fe90fd02a not found: ID does not exist" Oct 07 12:59:21 crc kubenswrapper[4854]: I1007 12:59:21.283029 4854 scope.go:117] "RemoveContainer" containerID="b0480b9070a425bdcaf40037367636184d42bc8e2a9bb4636cfcb424f1a6f5a6" Oct 07 12:59:21 crc kubenswrapper[4854]: E1007 12:59:21.283421 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0480b9070a425bdcaf40037367636184d42bc8e2a9bb4636cfcb424f1a6f5a6\": container with ID starting with b0480b9070a425bdcaf40037367636184d42bc8e2a9bb4636cfcb424f1a6f5a6 not found: ID does not exist" containerID="b0480b9070a425bdcaf40037367636184d42bc8e2a9bb4636cfcb424f1a6f5a6" Oct 07 12:59:21 crc kubenswrapper[4854]: I1007 12:59:21.283443 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0480b9070a425bdcaf40037367636184d42bc8e2a9bb4636cfcb424f1a6f5a6"} err="failed to get container status \"b0480b9070a425bdcaf40037367636184d42bc8e2a9bb4636cfcb424f1a6f5a6\": rpc error: code = NotFound desc = could not find container \"b0480b9070a425bdcaf40037367636184d42bc8e2a9bb4636cfcb424f1a6f5a6\": container with ID starting with b0480b9070a425bdcaf40037367636184d42bc8e2a9bb4636cfcb424f1a6f5a6 not found: ID does not exist" Oct 07 12:59:22 crc kubenswrapper[4854]: I1007 12:59:22.714605 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a879d2f-9848-408c-ac63-70933900f0f7" path="/var/lib/kubelet/pods/4a879d2f-9848-408c-ac63-70933900f0f7/volumes" Oct 07 13:00:00 crc kubenswrapper[4854]: I1007 13:00:00.155839 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330700-s5w5m"] Oct 07 13:00:00 crc kubenswrapper[4854]: E1007 13:00:00.156839 4854 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="4a879d2f-9848-408c-ac63-70933900f0f7" containerName="extract-content" Oct 07 13:00:00 crc kubenswrapper[4854]: I1007 13:00:00.156854 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a879d2f-9848-408c-ac63-70933900f0f7" containerName="extract-content" Oct 07 13:00:00 crc kubenswrapper[4854]: E1007 13:00:00.156916 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a879d2f-9848-408c-ac63-70933900f0f7" containerName="registry-server" Oct 07 13:00:00 crc kubenswrapper[4854]: I1007 13:00:00.156924 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a879d2f-9848-408c-ac63-70933900f0f7" containerName="registry-server" Oct 07 13:00:00 crc kubenswrapper[4854]: E1007 13:00:00.156949 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a879d2f-9848-408c-ac63-70933900f0f7" containerName="extract-utilities" Oct 07 13:00:00 crc kubenswrapper[4854]: I1007 13:00:00.156957 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a879d2f-9848-408c-ac63-70933900f0f7" containerName="extract-utilities" Oct 07 13:00:00 crc kubenswrapper[4854]: I1007 13:00:00.157218 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a879d2f-9848-408c-ac63-70933900f0f7" containerName="registry-server" Oct 07 13:00:00 crc kubenswrapper[4854]: I1007 13:00:00.157897 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330700-s5w5m" Oct 07 13:00:00 crc kubenswrapper[4854]: I1007 13:00:00.160948 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 07 13:00:00 crc kubenswrapper[4854]: I1007 13:00:00.161089 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 07 13:00:00 crc kubenswrapper[4854]: I1007 13:00:00.169894 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330700-s5w5m"] Oct 07 13:00:00 crc kubenswrapper[4854]: I1007 13:00:00.284947 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjzlx\" (UniqueName: \"kubernetes.io/projected/23a460b3-ea0c-426a-8ab9-86bf29315351-kube-api-access-rjzlx\") pod \"collect-profiles-29330700-s5w5m\" (UID: \"23a460b3-ea0c-426a-8ab9-86bf29315351\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330700-s5w5m" Oct 07 13:00:00 crc kubenswrapper[4854]: I1007 13:00:00.285056 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/23a460b3-ea0c-426a-8ab9-86bf29315351-config-volume\") pod \"collect-profiles-29330700-s5w5m\" (UID: \"23a460b3-ea0c-426a-8ab9-86bf29315351\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330700-s5w5m" Oct 07 13:00:00 crc kubenswrapper[4854]: I1007 13:00:00.285173 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/23a460b3-ea0c-426a-8ab9-86bf29315351-secret-volume\") pod \"collect-profiles-29330700-s5w5m\" (UID: \"23a460b3-ea0c-426a-8ab9-86bf29315351\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330700-s5w5m" Oct 07 13:00:00 crc kubenswrapper[4854]: I1007 13:00:00.386341 4854 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/23a460b3-ea0c-426a-8ab9-86bf29315351-config-volume\") pod \"collect-profiles-29330700-s5w5m\" (UID: \"23a460b3-ea0c-426a-8ab9-86bf29315351\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330700-s5w5m" Oct 07 13:00:00 crc kubenswrapper[4854]: I1007 13:00:00.386397 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/23a460b3-ea0c-426a-8ab9-86bf29315351-secret-volume\") pod \"collect-profiles-29330700-s5w5m\" (UID: \"23a460b3-ea0c-426a-8ab9-86bf29315351\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330700-s5w5m" Oct 07 13:00:00 crc kubenswrapper[4854]: I1007 13:00:00.386503 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjzlx\" (UniqueName: \"kubernetes.io/projected/23a460b3-ea0c-426a-8ab9-86bf29315351-kube-api-access-rjzlx\") pod \"collect-profiles-29330700-s5w5m\" (UID: \"23a460b3-ea0c-426a-8ab9-86bf29315351\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330700-s5w5m" Oct 07 13:00:00 crc kubenswrapper[4854]: I1007 13:00:00.388240 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/23a460b3-ea0c-426a-8ab9-86bf29315351-config-volume\") pod \"collect-profiles-29330700-s5w5m\" (UID: \"23a460b3-ea0c-426a-8ab9-86bf29315351\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330700-s5w5m" Oct 07 13:00:00 crc kubenswrapper[4854]: I1007 13:00:00.398130 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/23a460b3-ea0c-426a-8ab9-86bf29315351-secret-volume\") pod \"collect-profiles-29330700-s5w5m\" (UID: \"23a460b3-ea0c-426a-8ab9-86bf29315351\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330700-s5w5m" Oct 07 13:00:00 crc kubenswrapper[4854]: I1007 13:00:00.404845 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjzlx\" (UniqueName: \"kubernetes.io/projected/23a460b3-ea0c-426a-8ab9-86bf29315351-kube-api-access-rjzlx\") pod \"collect-profiles-29330700-s5w5m\" (UID: \"23a460b3-ea0c-426a-8ab9-86bf29315351\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330700-s5w5m" Oct 07 13:00:00 crc kubenswrapper[4854]: I1007 13:00:00.483088 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330700-s5w5m" Oct 07 13:00:00 crc kubenswrapper[4854]: I1007 13:00:00.980667 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330700-s5w5m"] Oct 07 13:00:01 crc kubenswrapper[4854]: I1007 13:00:01.568427 4854 generic.go:334] "Generic (PLEG): container finished" podID="23a460b3-ea0c-426a-8ab9-86bf29315351" containerID="d768df5b19a76ee6bc9b9fd92b969ed76c41d17b4e012e19191b75658e07916d" exitCode=0 Oct 07 13:00:01 crc kubenswrapper[4854]: I1007 13:00:01.568566 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330700-s5w5m" event={"ID":"23a460b3-ea0c-426a-8ab9-86bf29315351","Type":"ContainerDied","Data":"d768df5b19a76ee6bc9b9fd92b969ed76c41d17b4e012e19191b75658e07916d"} Oct 07 13:00:01 crc kubenswrapper[4854]: I1007 13:00:01.568801 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330700-s5w5m" event={"ID":"23a460b3-ea0c-426a-8ab9-86bf29315351","Type":"ContainerStarted","Data":"9a4f055d68bba8a7bd394b6ddf2764228ba11319c7e03ee9b151a95ead1ff0b1"} Oct 07 13:00:02 crc kubenswrapper[4854]: I1007 13:00:02.863957 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330700-s5w5m" Oct 07 13:00:03 crc kubenswrapper[4854]: I1007 13:00:03.027742 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjzlx\" (UniqueName: \"kubernetes.io/projected/23a460b3-ea0c-426a-8ab9-86bf29315351-kube-api-access-rjzlx\") pod \"23a460b3-ea0c-426a-8ab9-86bf29315351\" (UID: \"23a460b3-ea0c-426a-8ab9-86bf29315351\") " Oct 07 13:00:03 crc kubenswrapper[4854]: I1007 13:00:03.027896 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/23a460b3-ea0c-426a-8ab9-86bf29315351-secret-volume\") pod \"23a460b3-ea0c-426a-8ab9-86bf29315351\" (UID: \"23a460b3-ea0c-426a-8ab9-86bf29315351\") " Oct 07 13:00:03 crc kubenswrapper[4854]: I1007 13:00:03.027923 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/23a460b3-ea0c-426a-8ab9-86bf29315351-config-volume\") pod \"23a460b3-ea0c-426a-8ab9-86bf29315351\" (UID: \"23a460b3-ea0c-426a-8ab9-86bf29315351\") " Oct 07 13:00:03 crc kubenswrapper[4854]: I1007 13:00:03.028621 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23a460b3-ea0c-426a-8ab9-86bf29315351-config-volume" (OuterVolumeSpecName: "config-volume") pod "23a460b3-ea0c-426a-8ab9-86bf29315351" (UID: "23a460b3-ea0c-426a-8ab9-86bf29315351"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:00:03 crc kubenswrapper[4854]: I1007 13:00:03.035532 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23a460b3-ea0c-426a-8ab9-86bf29315351-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "23a460b3-ea0c-426a-8ab9-86bf29315351" (UID: "23a460b3-ea0c-426a-8ab9-86bf29315351"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:00:03 crc kubenswrapper[4854]: I1007 13:00:03.036164 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23a460b3-ea0c-426a-8ab9-86bf29315351-kube-api-access-rjzlx" (OuterVolumeSpecName: "kube-api-access-rjzlx") pod "23a460b3-ea0c-426a-8ab9-86bf29315351" (UID: "23a460b3-ea0c-426a-8ab9-86bf29315351"). InnerVolumeSpecName "kube-api-access-rjzlx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:00:03 crc kubenswrapper[4854]: I1007 13:00:03.129342 4854 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/23a460b3-ea0c-426a-8ab9-86bf29315351-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 07 13:00:03 crc kubenswrapper[4854]: I1007 13:00:03.129388 4854 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/23a460b3-ea0c-426a-8ab9-86bf29315351-config-volume\") on node \"crc\" DevicePath \"\"" Oct 07 13:00:03 crc kubenswrapper[4854]: I1007 13:00:03.129409 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjzlx\" (UniqueName: \"kubernetes.io/projected/23a460b3-ea0c-426a-8ab9-86bf29315351-kube-api-access-rjzlx\") on node \"crc\" DevicePath \"\"" Oct 07 13:00:03 crc kubenswrapper[4854]: I1007 13:00:03.591841 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330700-s5w5m" event={"ID":"23a460b3-ea0c-426a-8ab9-86bf29315351","Type":"ContainerDied","Data":"9a4f055d68bba8a7bd394b6ddf2764228ba11319c7e03ee9b151a95ead1ff0b1"} Oct 07 13:00:03 crc kubenswrapper[4854]: I1007 13:00:03.591893 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a4f055d68bba8a7bd394b6ddf2764228ba11319c7e03ee9b151a95ead1ff0b1" Oct 07 13:00:03 crc kubenswrapper[4854]: I1007 13:00:03.591951 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330700-s5w5m" Oct 07 13:00:03 crc kubenswrapper[4854]: I1007 13:00:03.955981 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330655-nh22s"] Oct 07 13:00:03 crc kubenswrapper[4854]: I1007 13:00:03.967237 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330655-nh22s"] Oct 07 13:00:04 crc kubenswrapper[4854]: I1007 13:00:04.733966 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8636dc3b-78f1-4098-81ca-1ae0ef4c441b" path="/var/lib/kubelet/pods/8636dc3b-78f1-4098-81ca-1ae0ef4c441b/volumes" Oct 07 13:00:10 crc kubenswrapper[4854]: I1007 13:00:10.808228 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:00:10 crc kubenswrapper[4854]: I1007 13:00:10.808979 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:00:19 crc kubenswrapper[4854]: I1007 13:00:19.420108 4854 scope.go:117] "RemoveContainer" containerID="83c7d7761eed2717d67730fef786b60ebccf645f50e1bde8cb11ec56fa3300ef" Oct 07 13:00:40 crc kubenswrapper[4854]: I1007 13:00:40.808294 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:00:40 crc kubenswrapper[4854]: I1007 13:00:40.808918 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:01:03 crc kubenswrapper[4854]: I1007 13:01:03.722825 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4lrxl"] Oct 07 13:01:03 crc kubenswrapper[4854]: E1007 13:01:03.723949 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23a460b3-ea0c-426a-8ab9-86bf29315351" containerName="collect-profiles" Oct 07 13:01:03 crc kubenswrapper[4854]: I1007 13:01:03.723974 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="23a460b3-ea0c-426a-8ab9-86bf29315351" containerName="collect-profiles" Oct 07 13:01:03 crc kubenswrapper[4854]: I1007 13:01:03.724280 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="23a460b3-ea0c-426a-8ab9-86bf29315351" containerName="collect-profiles" Oct 07 13:01:03 crc kubenswrapper[4854]: I1007 13:01:03.726062 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4lrxl" Oct 07 13:01:03 crc kubenswrapper[4854]: I1007 13:01:03.740633 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4lrxl"] Oct 07 13:01:03 crc kubenswrapper[4854]: I1007 13:01:03.888509 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b97s7\" (UniqueName: \"kubernetes.io/projected/54a5d528-9856-424c-87b4-c3ec44960e5a-kube-api-access-b97s7\") pod \"certified-operators-4lrxl\" (UID: \"54a5d528-9856-424c-87b4-c3ec44960e5a\") " pod="openshift-marketplace/certified-operators-4lrxl" Oct 07 13:01:03 crc kubenswrapper[4854]: I1007 13:01:03.888748 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54a5d528-9856-424c-87b4-c3ec44960e5a-catalog-content\") pod \"certified-operators-4lrxl\" (UID: \"54a5d528-9856-424c-87b4-c3ec44960e5a\") " pod="openshift-marketplace/certified-operators-4lrxl" Oct 07 13:01:03 crc kubenswrapper[4854]: I1007 13:01:03.888767 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54a5d528-9856-424c-87b4-c3ec44960e5a-utilities\") pod \"certified-operators-4lrxl\" (UID: \"54a5d528-9856-424c-87b4-c3ec44960e5a\") " pod="openshift-marketplace/certified-operators-4lrxl" Oct 07 13:01:03 crc kubenswrapper[4854]: I1007 13:01:03.990835 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54a5d528-9856-424c-87b4-c3ec44960e5a-catalog-content\") pod \"certified-operators-4lrxl\" (UID: \"54a5d528-9856-424c-87b4-c3ec44960e5a\") " pod="openshift-marketplace/certified-operators-4lrxl" Oct 07 13:01:03 crc kubenswrapper[4854]: I1007 13:01:03.990900 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54a5d528-9856-424c-87b4-c3ec44960e5a-utilities\") pod \"certified-operators-4lrxl\" (UID: \"54a5d528-9856-424c-87b4-c3ec44960e5a\") " pod="openshift-marketplace/certified-operators-4lrxl" Oct 07 13:01:03 crc kubenswrapper[4854]: I1007 13:01:03.990957 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b97s7\" (UniqueName: \"kubernetes.io/projected/54a5d528-9856-424c-87b4-c3ec44960e5a-kube-api-access-b97s7\") pod \"certified-operators-4lrxl\" (UID: \"54a5d528-9856-424c-87b4-c3ec44960e5a\") " pod="openshift-marketplace/certified-operators-4lrxl" Oct 07 13:01:03 crc kubenswrapper[4854]: I1007 13:01:03.991564 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54a5d528-9856-424c-87b4-c3ec44960e5a-utilities\") pod \"certified-operators-4lrxl\" (UID: \"54a5d528-9856-424c-87b4-c3ec44960e5a\") " pod="openshift-marketplace/certified-operators-4lrxl" Oct 07 13:01:03 crc kubenswrapper[4854]: I1007 13:01:03.991907 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54a5d528-9856-424c-87b4-c3ec44960e5a-catalog-content\") pod \"certified-operators-4lrxl\" (UID: \"54a5d528-9856-424c-87b4-c3ec44960e5a\") " pod="openshift-marketplace/certified-operators-4lrxl" Oct 07 13:01:04 crc kubenswrapper[4854]: I1007 13:01:04.031706 4854 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-b97s7\" (UniqueName: \"kubernetes.io/projected/54a5d528-9856-424c-87b4-c3ec44960e5a-kube-api-access-b97s7\") pod \"certified-operators-4lrxl\" (UID: \"54a5d528-9856-424c-87b4-c3ec44960e5a\") " pod="openshift-marketplace/certified-operators-4lrxl" Oct 07 13:01:04 crc kubenswrapper[4854]: I1007 13:01:04.067912 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4lrxl" Oct 07 13:01:04 crc kubenswrapper[4854]: I1007 13:01:04.324111 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4lrxl"] Oct 07 13:01:05 crc kubenswrapper[4854]: I1007 13:01:05.166767 4854 generic.go:334] "Generic (PLEG): container finished" podID="54a5d528-9856-424c-87b4-c3ec44960e5a" containerID="9ace67ef18d51d61b38df390f3c631d171b1cd1956410d4ba30996d42bb6c4c1" exitCode=0 Oct 07 13:01:05 crc kubenswrapper[4854]: I1007 13:01:05.166869 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4lrxl" event={"ID":"54a5d528-9856-424c-87b4-c3ec44960e5a","Type":"ContainerDied","Data":"9ace67ef18d51d61b38df390f3c631d171b1cd1956410d4ba30996d42bb6c4c1"} Oct 07 13:01:05 crc kubenswrapper[4854]: I1007 13:01:05.166928 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4lrxl" event={"ID":"54a5d528-9856-424c-87b4-c3ec44960e5a","Type":"ContainerStarted","Data":"183682d7520f45d434adf612ae1ebc45b60625fa353835cf9f4533e733f4f375"} Oct 07 13:01:07 crc kubenswrapper[4854]: I1007 13:01:07.183999 4854 generic.go:334] "Generic (PLEG): container finished" podID="54a5d528-9856-424c-87b4-c3ec44960e5a" containerID="dab89c1bcdbf66ac8306d15842ef822e43984ef9899024f8cb07804d40f9dfc2" exitCode=0 Oct 07 13:01:07 crc kubenswrapper[4854]: I1007 13:01:07.184054 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4lrxl" event={"ID":"54a5d528-9856-424c-87b4-c3ec44960e5a","Type":"ContainerDied","Data":"dab89c1bcdbf66ac8306d15842ef822e43984ef9899024f8cb07804d40f9dfc2"} Oct 07 13:01:08 crc kubenswrapper[4854]: I1007 13:01:08.197408 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4lrxl" event={"ID":"54a5d528-9856-424c-87b4-c3ec44960e5a","Type":"ContainerStarted","Data":"a400f14773c40d7512b387a8e7e78db22a109926141a89801b7f2e206ceabdf0"} Oct 07 13:01:10 crc kubenswrapper[4854]: I1007 13:01:10.807528 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:01:10 crc kubenswrapper[4854]: I1007 13:01:10.807638 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:01:10 crc kubenswrapper[4854]: I1007 13:01:10.807708 4854 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" Oct 07 13:01:10 crc kubenswrapper[4854]: I1007 13:01:10.808587 4854 kuberuntime_manager.go:1027] "Message for Container of 
pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d2a1a38e20691746e12ccd6dfe6c642f0b09a208db24501927e2ee26a8b722a7"} pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 13:01:10 crc kubenswrapper[4854]: I1007 13:01:10.808679 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" containerID="cri-o://d2a1a38e20691746e12ccd6dfe6c642f0b09a208db24501927e2ee26a8b722a7" gracePeriod=600 Oct 07 13:01:11 crc kubenswrapper[4854]: I1007 13:01:11.240335 4854 generic.go:334] "Generic (PLEG): container finished" podID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerID="d2a1a38e20691746e12ccd6dfe6c642f0b09a208db24501927e2ee26a8b722a7" exitCode=0 Oct 07 13:01:11 crc kubenswrapper[4854]: I1007 13:01:11.240428 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" event={"ID":"40b8b82d-cfd5-41d7-8673-5774db092c85","Type":"ContainerDied","Data":"d2a1a38e20691746e12ccd6dfe6c642f0b09a208db24501927e2ee26a8b722a7"} Oct 07 13:01:11 crc kubenswrapper[4854]: I1007 13:01:11.240777 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" event={"ID":"40b8b82d-cfd5-41d7-8673-5774db092c85","Type":"ContainerStarted","Data":"de7eb6481963900983ac98a60edbefb2fb8bb4b50f644caad50d425b3ee27b98"} Oct 07 13:01:11 crc kubenswrapper[4854]: I1007 13:01:11.240812 4854 scope.go:117] "RemoveContainer" containerID="4dd26d55eea17079a4cf92e15f926725e28c06e10fa5f6835df5a38b86923df5" Oct 07 13:01:11 crc kubenswrapper[4854]: I1007 13:01:11.268732 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4lrxl" podStartSLOduration=5.841195337 podStartE2EDuration="8.268706484s" podCreationTimestamp="2025-10-07 13:01:03 +0000 UTC" firstStartedPulling="2025-10-07 13:01:05.171860781 +0000 UTC m=+2181.159693036" lastFinishedPulling="2025-10-07 13:01:07.599371898 +0000 UTC m=+2183.587204183" observedRunningTime="2025-10-07 13:01:08.23236691 +0000 UTC m=+2184.220199215" watchObservedRunningTime="2025-10-07 13:01:11.268706484 +0000 UTC m=+2187.256538739" Oct 07 13:01:14 crc kubenswrapper[4854]: I1007 13:01:14.068852 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4lrxl" Oct 07 13:01:14 crc kubenswrapper[4854]: I1007 13:01:14.069402 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4lrxl" Oct 07 13:01:14 crc kubenswrapper[4854]: I1007 13:01:14.112440 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4lrxl" Oct 07 13:01:14 crc kubenswrapper[4854]: I1007 13:01:14.314993 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4lrxl" Oct 07 13:01:14 crc kubenswrapper[4854]: I1007 13:01:14.364294 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4lrxl"] Oct 07 13:01:16 crc kubenswrapper[4854]: I1007 13:01:16.288740 4854 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/certified-operators-4lrxl" podUID="54a5d528-9856-424c-87b4-c3ec44960e5a" containerName="registry-server" containerID="cri-o://a400f14773c40d7512b387a8e7e78db22a109926141a89801b7f2e206ceabdf0" gracePeriod=2 Oct 07 13:01:16 crc kubenswrapper[4854]: I1007 13:01:16.766058 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4lrxl" Oct 07 13:01:16 crc kubenswrapper[4854]: I1007 13:01:16.905093 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54a5d528-9856-424c-87b4-c3ec44960e5a-catalog-content\") pod \"54a5d528-9856-424c-87b4-c3ec44960e5a\" (UID: \"54a5d528-9856-424c-87b4-c3ec44960e5a\") " Oct 07 13:01:16 crc kubenswrapper[4854]: I1007 13:01:16.905220 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b97s7\" (UniqueName: \"kubernetes.io/projected/54a5d528-9856-424c-87b4-c3ec44960e5a-kube-api-access-b97s7\") pod \"54a5d528-9856-424c-87b4-c3ec44960e5a\" (UID: \"54a5d528-9856-424c-87b4-c3ec44960e5a\") " Oct 07 13:01:16 crc kubenswrapper[4854]: I1007 13:01:16.905314 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54a5d528-9856-424c-87b4-c3ec44960e5a-utilities\") pod \"54a5d528-9856-424c-87b4-c3ec44960e5a\" (UID: \"54a5d528-9856-424c-87b4-c3ec44960e5a\") " Oct 07 13:01:16 crc kubenswrapper[4854]: I1007 13:01:16.906721 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54a5d528-9856-424c-87b4-c3ec44960e5a-utilities" (OuterVolumeSpecName: "utilities") pod "54a5d528-9856-424c-87b4-c3ec44960e5a" (UID: "54a5d528-9856-424c-87b4-c3ec44960e5a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:01:16 crc kubenswrapper[4854]: I1007 13:01:16.914593 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54a5d528-9856-424c-87b4-c3ec44960e5a-kube-api-access-b97s7" (OuterVolumeSpecName: "kube-api-access-b97s7") pod "54a5d528-9856-424c-87b4-c3ec44960e5a" (UID: "54a5d528-9856-424c-87b4-c3ec44960e5a"). InnerVolumeSpecName "kube-api-access-b97s7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:01:17 crc kubenswrapper[4854]: I1007 13:01:17.007769 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b97s7\" (UniqueName: \"kubernetes.io/projected/54a5d528-9856-424c-87b4-c3ec44960e5a-kube-api-access-b97s7\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:17 crc kubenswrapper[4854]: I1007 13:01:17.007819 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54a5d528-9856-424c-87b4-c3ec44960e5a-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:17 crc kubenswrapper[4854]: I1007 13:01:17.304099 4854 generic.go:334] "Generic (PLEG): container finished" podID="54a5d528-9856-424c-87b4-c3ec44960e5a" containerID="a400f14773c40d7512b387a8e7e78db22a109926141a89801b7f2e206ceabdf0" exitCode=0 Oct 07 13:01:17 crc kubenswrapper[4854]: I1007 13:01:17.304221 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4lrxl" event={"ID":"54a5d528-9856-424c-87b4-c3ec44960e5a","Type":"ContainerDied","Data":"a400f14773c40d7512b387a8e7e78db22a109926141a89801b7f2e206ceabdf0"} Oct 07 13:01:17 crc kubenswrapper[4854]: I1007 13:01:17.304354 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4lrxl" event={"ID":"54a5d528-9856-424c-87b4-c3ec44960e5a","Type":"ContainerDied","Data":"183682d7520f45d434adf612ae1ebc45b60625fa353835cf9f4533e733f4f375"} Oct 07 13:01:17 crc kubenswrapper[4854]: I1007 13:01:17.304399 4854 scope.go:117] "RemoveContainer" containerID="a400f14773c40d7512b387a8e7e78db22a109926141a89801b7f2e206ceabdf0" Oct 07 13:01:17 crc kubenswrapper[4854]: I1007 13:01:17.304246 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4lrxl" Oct 07 13:01:17 crc kubenswrapper[4854]: I1007 13:01:17.333963 4854 scope.go:117] "RemoveContainer" containerID="dab89c1bcdbf66ac8306d15842ef822e43984ef9899024f8cb07804d40f9dfc2" Oct 07 13:01:17 crc kubenswrapper[4854]: I1007 13:01:17.357542 4854 scope.go:117] "RemoveContainer" containerID="9ace67ef18d51d61b38df390f3c631d171b1cd1956410d4ba30996d42bb6c4c1" Oct 07 13:01:17 crc kubenswrapper[4854]: I1007 13:01:17.399813 4854 scope.go:117] "RemoveContainer" containerID="a400f14773c40d7512b387a8e7e78db22a109926141a89801b7f2e206ceabdf0" Oct 07 13:01:17 crc kubenswrapper[4854]: E1007 13:01:17.400417 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a400f14773c40d7512b387a8e7e78db22a109926141a89801b7f2e206ceabdf0\": container with ID starting with a400f14773c40d7512b387a8e7e78db22a109926141a89801b7f2e206ceabdf0 not found: ID does not exist" containerID="a400f14773c40d7512b387a8e7e78db22a109926141a89801b7f2e206ceabdf0" Oct 07 13:01:17 crc kubenswrapper[4854]: I1007 13:01:17.400453 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a400f14773c40d7512b387a8e7e78db22a109926141a89801b7f2e206ceabdf0"} err="failed to get container status \"a400f14773c40d7512b387a8e7e78db22a109926141a89801b7f2e206ceabdf0\": rpc error: code = NotFound desc = could not find container \"a400f14773c40d7512b387a8e7e78db22a109926141a89801b7f2e206ceabdf0\": container with ID starting with a400f14773c40d7512b387a8e7e78db22a109926141a89801b7f2e206ceabdf0 not found: ID does not exist" Oct 07 13:01:17 crc kubenswrapper[4854]: I1007 13:01:17.400472 4854 scope.go:117] "RemoveContainer" containerID="dab89c1bcdbf66ac8306d15842ef822e43984ef9899024f8cb07804d40f9dfc2" Oct 07 13:01:17 crc kubenswrapper[4854]: E1007 13:01:17.407170 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dab89c1bcdbf66ac8306d15842ef822e43984ef9899024f8cb07804d40f9dfc2\": container with ID starting with dab89c1bcdbf66ac8306d15842ef822e43984ef9899024f8cb07804d40f9dfc2 not found: ID does not exist" containerID="dab89c1bcdbf66ac8306d15842ef822e43984ef9899024f8cb07804d40f9dfc2" Oct 07 13:01:17 crc kubenswrapper[4854]: I1007 13:01:17.407202 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dab89c1bcdbf66ac8306d15842ef822e43984ef9899024f8cb07804d40f9dfc2"} err="failed to get container status \"dab89c1bcdbf66ac8306d15842ef822e43984ef9899024f8cb07804d40f9dfc2\": rpc error: code = NotFound desc = could not find container \"dab89c1bcdbf66ac8306d15842ef822e43984ef9899024f8cb07804d40f9dfc2\": container with ID starting with dab89c1bcdbf66ac8306d15842ef822e43984ef9899024f8cb07804d40f9dfc2 not found: ID does not exist" Oct 07 13:01:17 crc kubenswrapper[4854]: I1007 13:01:17.407222 4854 scope.go:117] "RemoveContainer" containerID="9ace67ef18d51d61b38df390f3c631d171b1cd1956410d4ba30996d42bb6c4c1" Oct 07 13:01:17 crc kubenswrapper[4854]: E1007 13:01:17.407706 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ace67ef18d51d61b38df390f3c631d171b1cd1956410d4ba30996d42bb6c4c1\": container with ID starting with 9ace67ef18d51d61b38df390f3c631d171b1cd1956410d4ba30996d42bb6c4c1 not found: ID does not exist" containerID="9ace67ef18d51d61b38df390f3c631d171b1cd1956410d4ba30996d42bb6c4c1" 
Oct 07 13:01:17 crc kubenswrapper[4854]: I1007 13:01:17.407744 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ace67ef18d51d61b38df390f3c631d171b1cd1956410d4ba30996d42bb6c4c1"} err="failed to get container status \"9ace67ef18d51d61b38df390f3c631d171b1cd1956410d4ba30996d42bb6c4c1\": rpc error: code = NotFound desc = could not find container \"9ace67ef18d51d61b38df390f3c631d171b1cd1956410d4ba30996d42bb6c4c1\": container with ID starting with 9ace67ef18d51d61b38df390f3c631d171b1cd1956410d4ba30996d42bb6c4c1 not found: ID does not exist" Oct 07 13:01:17 crc kubenswrapper[4854]: I1007 13:01:17.517862 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54a5d528-9856-424c-87b4-c3ec44960e5a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "54a5d528-9856-424c-87b4-c3ec44960e5a" (UID: "54a5d528-9856-424c-87b4-c3ec44960e5a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:01:17 crc kubenswrapper[4854]: I1007 13:01:17.616373 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54a5d528-9856-424c-87b4-c3ec44960e5a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:01:17 crc kubenswrapper[4854]: I1007 13:01:17.651305 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4lrxl"] Oct 07 13:01:17 crc kubenswrapper[4854]: I1007 13:01:17.658370 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4lrxl"] Oct 07 13:01:18 crc kubenswrapper[4854]: I1007 13:01:18.711873 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54a5d528-9856-424c-87b4-c3ec44960e5a" path="/var/lib/kubelet/pods/54a5d528-9856-424c-87b4-c3ec44960e5a/volumes" Oct 07 13:01:46 crc kubenswrapper[4854]: I1007 13:01:46.883760 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cswqf"] Oct 07 13:01:46 crc kubenswrapper[4854]: E1007 13:01:46.884636 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54a5d528-9856-424c-87b4-c3ec44960e5a" containerName="extract-content" Oct 07 13:01:46 crc kubenswrapper[4854]: I1007 13:01:46.884652 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="54a5d528-9856-424c-87b4-c3ec44960e5a" containerName="extract-content" Oct 07 13:01:46 crc kubenswrapper[4854]: E1007 13:01:46.884679 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54a5d528-9856-424c-87b4-c3ec44960e5a" containerName="extract-utilities" Oct 07 13:01:46 crc kubenswrapper[4854]: I1007 13:01:46.884688 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="54a5d528-9856-424c-87b4-c3ec44960e5a" containerName="extract-utilities" Oct 07 13:01:46 crc kubenswrapper[4854]: E1007 13:01:46.884698 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54a5d528-9856-424c-87b4-c3ec44960e5a" containerName="registry-server" Oct 07 13:01:46 crc kubenswrapper[4854]: I1007 13:01:46.884709 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="54a5d528-9856-424c-87b4-c3ec44960e5a" containerName="registry-server" Oct 07 13:01:46 crc kubenswrapper[4854]: I1007 13:01:46.884908 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="54a5d528-9856-424c-87b4-c3ec44960e5a" containerName="registry-server" Oct 07 13:01:46 crc kubenswrapper[4854]: I1007 13:01:46.886776 4854 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cswqf" Oct 07 13:01:46 crc kubenswrapper[4854]: I1007 13:01:46.893159 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cswqf"] Oct 07 13:01:47 crc kubenswrapper[4854]: I1007 13:01:47.057495 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00b185a8-2646-490c-bea2-68299a788c2d-utilities\") pod \"redhat-operators-cswqf\" (UID: \"00b185a8-2646-490c-bea2-68299a788c2d\") " pod="openshift-marketplace/redhat-operators-cswqf" Oct 07 13:01:47 crc kubenswrapper[4854]: I1007 13:01:47.057594 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5btg\" (UniqueName: \"kubernetes.io/projected/00b185a8-2646-490c-bea2-68299a788c2d-kube-api-access-p5btg\") pod \"redhat-operators-cswqf\" (UID: \"00b185a8-2646-490c-bea2-68299a788c2d\") " pod="openshift-marketplace/redhat-operators-cswqf" Oct 07 13:01:47 crc kubenswrapper[4854]: I1007 13:01:47.057656 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00b185a8-2646-490c-bea2-68299a788c2d-catalog-content\") pod \"redhat-operators-cswqf\" (UID: \"00b185a8-2646-490c-bea2-68299a788c2d\") " pod="openshift-marketplace/redhat-operators-cswqf" Oct 07 13:01:47 crc kubenswrapper[4854]: I1007 13:01:47.160043 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00b185a8-2646-490c-bea2-68299a788c2d-utilities\") pod \"redhat-operators-cswqf\" (UID: \"00b185a8-2646-490c-bea2-68299a788c2d\") " pod="openshift-marketplace/redhat-operators-cswqf" Oct 07 13:01:47 crc kubenswrapper[4854]: I1007 13:01:47.160137 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5btg\" (UniqueName: \"kubernetes.io/projected/00b185a8-2646-490c-bea2-68299a788c2d-kube-api-access-p5btg\") pod \"redhat-operators-cswqf\" (UID: \"00b185a8-2646-490c-bea2-68299a788c2d\") " pod="openshift-marketplace/redhat-operators-cswqf" Oct 07 13:01:47 crc kubenswrapper[4854]: I1007 13:01:47.160189 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00b185a8-2646-490c-bea2-68299a788c2d-catalog-content\") pod \"redhat-operators-cswqf\" (UID: \"00b185a8-2646-490c-bea2-68299a788c2d\") " pod="openshift-marketplace/redhat-operators-cswqf" Oct 07 13:01:47 crc kubenswrapper[4854]: I1007 13:01:47.160563 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00b185a8-2646-490c-bea2-68299a788c2d-utilities\") pod \"redhat-operators-cswqf\" (UID: \"00b185a8-2646-490c-bea2-68299a788c2d\") " pod="openshift-marketplace/redhat-operators-cswqf" Oct 07 13:01:47 crc kubenswrapper[4854]: I1007 13:01:47.160686 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00b185a8-2646-490c-bea2-68299a788c2d-catalog-content\") pod \"redhat-operators-cswqf\" (UID: \"00b185a8-2646-490c-bea2-68299a788c2d\") " pod="openshift-marketplace/redhat-operators-cswqf" Oct 07 13:01:47 crc kubenswrapper[4854]: I1007 13:01:47.188185 4854 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-p5btg\" (UniqueName: \"kubernetes.io/projected/00b185a8-2646-490c-bea2-68299a788c2d-kube-api-access-p5btg\") pod \"redhat-operators-cswqf\" (UID: \"00b185a8-2646-490c-bea2-68299a788c2d\") " pod="openshift-marketplace/redhat-operators-cswqf" Oct 07 13:01:47 crc kubenswrapper[4854]: I1007 13:01:47.217050 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cswqf" Oct 07 13:01:47 crc kubenswrapper[4854]: I1007 13:01:47.632607 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cswqf"] Oct 07 13:01:48 crc kubenswrapper[4854]: I1007 13:01:48.568402 4854 generic.go:334] "Generic (PLEG): container finished" podID="00b185a8-2646-490c-bea2-68299a788c2d" containerID="0c707a396447f475c4664c6c7bf7c5de9eff492fe605999a8804fa54ba3be9f8" exitCode=0 Oct 07 13:01:48 crc kubenswrapper[4854]: I1007 13:01:48.568470 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cswqf" event={"ID":"00b185a8-2646-490c-bea2-68299a788c2d","Type":"ContainerDied","Data":"0c707a396447f475c4664c6c7bf7c5de9eff492fe605999a8804fa54ba3be9f8"} Oct 07 13:01:48 crc kubenswrapper[4854]: I1007 13:01:48.568738 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cswqf" event={"ID":"00b185a8-2646-490c-bea2-68299a788c2d","Type":"ContainerStarted","Data":"7cc065d59da48ee2c78e35a9bf82a1b31d3dcb23db42d6f7b172bf24d2d1553d"} Oct 07 13:01:50 crc kubenswrapper[4854]: I1007 13:01:50.585969 4854 generic.go:334] "Generic (PLEG): container finished" podID="00b185a8-2646-490c-bea2-68299a788c2d" containerID="b870580a32d1f1377552c81e05e9fcc6cb3348759d372d4caad4525a107f24e7" exitCode=0 Oct 07 13:01:50 crc kubenswrapper[4854]: I1007 13:01:50.586086 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cswqf" event={"ID":"00b185a8-2646-490c-bea2-68299a788c2d","Type":"ContainerDied","Data":"b870580a32d1f1377552c81e05e9fcc6cb3348759d372d4caad4525a107f24e7"} Oct 07 13:01:51 crc kubenswrapper[4854]: I1007 13:01:51.600079 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cswqf" event={"ID":"00b185a8-2646-490c-bea2-68299a788c2d","Type":"ContainerStarted","Data":"ab2fcb37bfef3160e0fbcf13dc3e03adbd0c1cafd18867de84ce117f1971f964"} Oct 07 13:01:51 crc kubenswrapper[4854]: I1007 13:01:51.633966 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cswqf" podStartSLOduration=3.158663783 podStartE2EDuration="5.633944411s" podCreationTimestamp="2025-10-07 13:01:46 +0000 UTC" firstStartedPulling="2025-10-07 13:01:48.571578618 +0000 UTC m=+2224.559410913" lastFinishedPulling="2025-10-07 13:01:51.046859276 +0000 UTC m=+2227.034691541" observedRunningTime="2025-10-07 13:01:51.626621877 +0000 UTC m=+2227.614454162" watchObservedRunningTime="2025-10-07 13:01:51.633944411 +0000 UTC m=+2227.621776676" Oct 07 13:01:57 crc kubenswrapper[4854]: I1007 13:01:57.217722 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cswqf" Oct 07 13:01:57 crc kubenswrapper[4854]: I1007 13:01:57.218083 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cswqf" Oct 07 13:01:57 crc kubenswrapper[4854]: I1007 13:01:57.287013 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-marketplace/redhat-operators-cswqf" Oct 07 13:01:57 crc kubenswrapper[4854]: I1007 13:01:57.727467 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cswqf" Oct 07 13:01:57 crc kubenswrapper[4854]: I1007 13:01:57.797485 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cswqf"] Oct 07 13:01:59 crc kubenswrapper[4854]: I1007 13:01:59.672393 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cswqf" podUID="00b185a8-2646-490c-bea2-68299a788c2d" containerName="registry-server" containerID="cri-o://ab2fcb37bfef3160e0fbcf13dc3e03adbd0c1cafd18867de84ce117f1971f964" gracePeriod=2 Oct 07 13:02:00 crc kubenswrapper[4854]: I1007 13:02:00.111649 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cswqf" Oct 07 13:02:00 crc kubenswrapper[4854]: I1007 13:02:00.158066 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5btg\" (UniqueName: \"kubernetes.io/projected/00b185a8-2646-490c-bea2-68299a788c2d-kube-api-access-p5btg\") pod \"00b185a8-2646-490c-bea2-68299a788c2d\" (UID: \"00b185a8-2646-490c-bea2-68299a788c2d\") " Oct 07 13:02:00 crc kubenswrapper[4854]: I1007 13:02:00.158185 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00b185a8-2646-490c-bea2-68299a788c2d-utilities\") pod \"00b185a8-2646-490c-bea2-68299a788c2d\" (UID: \"00b185a8-2646-490c-bea2-68299a788c2d\") " Oct 07 13:02:00 crc kubenswrapper[4854]: I1007 13:02:00.158289 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00b185a8-2646-490c-bea2-68299a788c2d-catalog-content\") pod \"00b185a8-2646-490c-bea2-68299a788c2d\" (UID: \"00b185a8-2646-490c-bea2-68299a788c2d\") " Oct 07 13:02:00 crc kubenswrapper[4854]: I1007 13:02:00.159526 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00b185a8-2646-490c-bea2-68299a788c2d-utilities" (OuterVolumeSpecName: "utilities") pod "00b185a8-2646-490c-bea2-68299a788c2d" (UID: "00b185a8-2646-490c-bea2-68299a788c2d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:02:00 crc kubenswrapper[4854]: I1007 13:02:00.182424 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00b185a8-2646-490c-bea2-68299a788c2d-kube-api-access-p5btg" (OuterVolumeSpecName: "kube-api-access-p5btg") pod "00b185a8-2646-490c-bea2-68299a788c2d" (UID: "00b185a8-2646-490c-bea2-68299a788c2d"). InnerVolumeSpecName "kube-api-access-p5btg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:02:00 crc kubenswrapper[4854]: I1007 13:02:00.259838 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5btg\" (UniqueName: \"kubernetes.io/projected/00b185a8-2646-490c-bea2-68299a788c2d-kube-api-access-p5btg\") on node \"crc\" DevicePath \"\"" Oct 07 13:02:00 crc kubenswrapper[4854]: I1007 13:02:00.259886 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00b185a8-2646-490c-bea2-68299a788c2d-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:02:00 crc kubenswrapper[4854]: I1007 13:02:00.683203 4854 generic.go:334] "Generic (PLEG): container finished" podID="00b185a8-2646-490c-bea2-68299a788c2d" containerID="ab2fcb37bfef3160e0fbcf13dc3e03adbd0c1cafd18867de84ce117f1971f964" exitCode=0 Oct 07 13:02:00 crc kubenswrapper[4854]: I1007 13:02:00.683271 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cswqf" event={"ID":"00b185a8-2646-490c-bea2-68299a788c2d","Type":"ContainerDied","Data":"ab2fcb37bfef3160e0fbcf13dc3e03adbd0c1cafd18867de84ce117f1971f964"} Oct 07 13:02:00 crc kubenswrapper[4854]: I1007 13:02:00.683298 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cswqf" event={"ID":"00b185a8-2646-490c-bea2-68299a788c2d","Type":"ContainerDied","Data":"7cc065d59da48ee2c78e35a9bf82a1b31d3dcb23db42d6f7b172bf24d2d1553d"} Oct 07 13:02:00 crc kubenswrapper[4854]: I1007 13:02:00.683335 4854 scope.go:117] "RemoveContainer" containerID="ab2fcb37bfef3160e0fbcf13dc3e03adbd0c1cafd18867de84ce117f1971f964" Oct 07 13:02:00 crc kubenswrapper[4854]: I1007 13:02:00.683336 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cswqf" Oct 07 13:02:00 crc kubenswrapper[4854]: I1007 13:02:00.709722 4854 scope.go:117] "RemoveContainer" containerID="b870580a32d1f1377552c81e05e9fcc6cb3348759d372d4caad4525a107f24e7" Oct 07 13:02:00 crc kubenswrapper[4854]: I1007 13:02:00.745512 4854 scope.go:117] "RemoveContainer" containerID="0c707a396447f475c4664c6c7bf7c5de9eff492fe605999a8804fa54ba3be9f8" Oct 07 13:02:00 crc kubenswrapper[4854]: I1007 13:02:00.781501 4854 scope.go:117] "RemoveContainer" containerID="ab2fcb37bfef3160e0fbcf13dc3e03adbd0c1cafd18867de84ce117f1971f964" Oct 07 13:02:00 crc kubenswrapper[4854]: E1007 13:02:00.782017 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab2fcb37bfef3160e0fbcf13dc3e03adbd0c1cafd18867de84ce117f1971f964\": container with ID starting with ab2fcb37bfef3160e0fbcf13dc3e03adbd0c1cafd18867de84ce117f1971f964 not found: ID does not exist" containerID="ab2fcb37bfef3160e0fbcf13dc3e03adbd0c1cafd18867de84ce117f1971f964" Oct 07 13:02:00 crc kubenswrapper[4854]: I1007 13:02:00.782079 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab2fcb37bfef3160e0fbcf13dc3e03adbd0c1cafd18867de84ce117f1971f964"} err="failed to get container status \"ab2fcb37bfef3160e0fbcf13dc3e03adbd0c1cafd18867de84ce117f1971f964\": rpc error: code = NotFound desc = could not find container \"ab2fcb37bfef3160e0fbcf13dc3e03adbd0c1cafd18867de84ce117f1971f964\": container with ID starting with ab2fcb37bfef3160e0fbcf13dc3e03adbd0c1cafd18867de84ce117f1971f964 not found: ID does not exist" Oct 07 13:02:00 crc kubenswrapper[4854]: I1007 13:02:00.782120 4854 scope.go:117] "RemoveContainer" containerID="b870580a32d1f1377552c81e05e9fcc6cb3348759d372d4caad4525a107f24e7" Oct 07 13:02:00 crc kubenswrapper[4854]: E1007 13:02:00.782592 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b870580a32d1f1377552c81e05e9fcc6cb3348759d372d4caad4525a107f24e7\": container with ID starting with b870580a32d1f1377552c81e05e9fcc6cb3348759d372d4caad4525a107f24e7 not found: ID does not exist" containerID="b870580a32d1f1377552c81e05e9fcc6cb3348759d372d4caad4525a107f24e7" Oct 07 13:02:00 crc kubenswrapper[4854]: I1007 13:02:00.782646 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b870580a32d1f1377552c81e05e9fcc6cb3348759d372d4caad4525a107f24e7"} err="failed to get container status \"b870580a32d1f1377552c81e05e9fcc6cb3348759d372d4caad4525a107f24e7\": rpc error: code = NotFound desc = could not find container \"b870580a32d1f1377552c81e05e9fcc6cb3348759d372d4caad4525a107f24e7\": container with ID starting with b870580a32d1f1377552c81e05e9fcc6cb3348759d372d4caad4525a107f24e7 not found: ID does not exist" Oct 07 13:02:00 crc kubenswrapper[4854]: I1007 13:02:00.782667 4854 scope.go:117] "RemoveContainer" containerID="0c707a396447f475c4664c6c7bf7c5de9eff492fe605999a8804fa54ba3be9f8" Oct 07 13:02:00 crc kubenswrapper[4854]: E1007 13:02:00.782954 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c707a396447f475c4664c6c7bf7c5de9eff492fe605999a8804fa54ba3be9f8\": container with ID starting with 0c707a396447f475c4664c6c7bf7c5de9eff492fe605999a8804fa54ba3be9f8 not found: ID does not exist" containerID="0c707a396447f475c4664c6c7bf7c5de9eff492fe605999a8804fa54ba3be9f8" 
Oct 07 13:02:00 crc kubenswrapper[4854]: I1007 13:02:00.782974 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c707a396447f475c4664c6c7bf7c5de9eff492fe605999a8804fa54ba3be9f8"} err="failed to get container status \"0c707a396447f475c4664c6c7bf7c5de9eff492fe605999a8804fa54ba3be9f8\": rpc error: code = NotFound desc = could not find container \"0c707a396447f475c4664c6c7bf7c5de9eff492fe605999a8804fa54ba3be9f8\": container with ID starting with 0c707a396447f475c4664c6c7bf7c5de9eff492fe605999a8804fa54ba3be9f8 not found: ID does not exist" Oct 07 13:02:02 crc kubenswrapper[4854]: I1007 13:02:02.004489 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00b185a8-2646-490c-bea2-68299a788c2d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "00b185a8-2646-490c-bea2-68299a788c2d" (UID: "00b185a8-2646-490c-bea2-68299a788c2d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:02:02 crc kubenswrapper[4854]: I1007 13:02:02.088268 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00b185a8-2646-490c-bea2-68299a788c2d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:02:02 crc kubenswrapper[4854]: I1007 13:02:02.220819 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cswqf"] Oct 07 13:02:02 crc kubenswrapper[4854]: I1007 13:02:02.226868 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cswqf"] Oct 07 13:02:02 crc kubenswrapper[4854]: I1007 13:02:02.716583 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00b185a8-2646-490c-bea2-68299a788c2d" path="/var/lib/kubelet/pods/00b185a8-2646-490c-bea2-68299a788c2d/volumes" Oct 07 13:03:40 crc kubenswrapper[4854]: I1007 13:03:40.807640 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:03:40 crc kubenswrapper[4854]: I1007 13:03:40.808327 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:04:10 crc kubenswrapper[4854]: I1007 13:04:10.807344 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:04:10 crc kubenswrapper[4854]: I1007 13:04:10.807994 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:04:40 crc kubenswrapper[4854]: I1007 13:04:40.808229 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:04:40 crc kubenswrapper[4854]: I1007 13:04:40.809411 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:04:40 crc kubenswrapper[4854]: I1007 13:04:40.809478 4854 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" Oct 07 13:04:40 crc kubenswrapper[4854]: I1007 13:04:40.810317 4854 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"de7eb6481963900983ac98a60edbefb2fb8bb4b50f644caad50d425b3ee27b98"} pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 13:04:40 crc kubenswrapper[4854]: I1007 13:04:40.810488 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" containerID="cri-o://de7eb6481963900983ac98a60edbefb2fb8bb4b50f644caad50d425b3ee27b98" gracePeriod=600 Oct 07 13:04:40 crc kubenswrapper[4854]: E1007 13:04:40.952791 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:04:41 crc kubenswrapper[4854]: I1007 13:04:41.260000 4854 generic.go:334] "Generic (PLEG): container finished" podID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerID="de7eb6481963900983ac98a60edbefb2fb8bb4b50f644caad50d425b3ee27b98" exitCode=0 Oct 07 13:04:41 crc kubenswrapper[4854]: I1007 13:04:41.260069 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" event={"ID":"40b8b82d-cfd5-41d7-8673-5774db092c85","Type":"ContainerDied","Data":"de7eb6481963900983ac98a60edbefb2fb8bb4b50f644caad50d425b3ee27b98"} Oct 07 13:04:41 crc kubenswrapper[4854]: I1007 13:04:41.260134 4854 scope.go:117] "RemoveContainer" containerID="d2a1a38e20691746e12ccd6dfe6c642f0b09a208db24501927e2ee26a8b722a7" Oct 07 13:04:41 crc kubenswrapper[4854]: I1007 13:04:41.260934 4854 scope.go:117] "RemoveContainer" containerID="de7eb6481963900983ac98a60edbefb2fb8bb4b50f644caad50d425b3ee27b98" Oct 07 13:04:41 crc kubenswrapper[4854]: E1007 13:04:41.261478 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" 
podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:04:54 crc kubenswrapper[4854]: I1007 13:04:54.709785 4854 scope.go:117] "RemoveContainer" containerID="de7eb6481963900983ac98a60edbefb2fb8bb4b50f644caad50d425b3ee27b98" Oct 07 13:04:54 crc kubenswrapper[4854]: E1007 13:04:54.710693 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:05:09 crc kubenswrapper[4854]: I1007 13:05:09.703001 4854 scope.go:117] "RemoveContainer" containerID="de7eb6481963900983ac98a60edbefb2fb8bb4b50f644caad50d425b3ee27b98" Oct 07 13:05:09 crc kubenswrapper[4854]: E1007 13:05:09.703594 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:05:20 crc kubenswrapper[4854]: I1007 13:05:20.702822 4854 scope.go:117] "RemoveContainer" containerID="de7eb6481963900983ac98a60edbefb2fb8bb4b50f644caad50d425b3ee27b98" Oct 07 13:05:20 crc kubenswrapper[4854]: E1007 13:05:20.703849 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:05:31 crc kubenswrapper[4854]: I1007 13:05:31.703722 4854 scope.go:117] "RemoveContainer" containerID="de7eb6481963900983ac98a60edbefb2fb8bb4b50f644caad50d425b3ee27b98" Oct 07 13:05:31 crc kubenswrapper[4854]: E1007 13:05:31.704714 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:05:44 crc kubenswrapper[4854]: I1007 13:05:44.709482 4854 scope.go:117] "RemoveContainer" containerID="de7eb6481963900983ac98a60edbefb2fb8bb4b50f644caad50d425b3ee27b98" Oct 07 13:05:44 crc kubenswrapper[4854]: E1007 13:05:44.710274 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:05:55 crc kubenswrapper[4854]: I1007 13:05:55.702826 4854 scope.go:117] "RemoveContainer" 
containerID="de7eb6481963900983ac98a60edbefb2fb8bb4b50f644caad50d425b3ee27b98" Oct 07 13:05:55 crc kubenswrapper[4854]: E1007 13:05:55.703578 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:06:06 crc kubenswrapper[4854]: I1007 13:06:06.703174 4854 scope.go:117] "RemoveContainer" containerID="de7eb6481963900983ac98a60edbefb2fb8bb4b50f644caad50d425b3ee27b98" Oct 07 13:06:06 crc kubenswrapper[4854]: E1007 13:06:06.703820 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:06:17 crc kubenswrapper[4854]: I1007 13:06:17.703571 4854 scope.go:117] "RemoveContainer" containerID="de7eb6481963900983ac98a60edbefb2fb8bb4b50f644caad50d425b3ee27b98" Oct 07 13:06:17 crc kubenswrapper[4854]: E1007 13:06:17.704824 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:06:28 crc kubenswrapper[4854]: I1007 13:06:28.703280 4854 scope.go:117] "RemoveContainer" containerID="de7eb6481963900983ac98a60edbefb2fb8bb4b50f644caad50d425b3ee27b98" Oct 07 13:06:28 crc kubenswrapper[4854]: E1007 13:06:28.704302 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:06:43 crc kubenswrapper[4854]: I1007 13:06:43.703358 4854 scope.go:117] "RemoveContainer" containerID="de7eb6481963900983ac98a60edbefb2fb8bb4b50f644caad50d425b3ee27b98" Oct 07 13:06:43 crc kubenswrapper[4854]: E1007 13:06:43.704609 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:06:55 crc kubenswrapper[4854]: I1007 13:06:55.704045 4854 scope.go:117] "RemoveContainer" containerID="de7eb6481963900983ac98a60edbefb2fb8bb4b50f644caad50d425b3ee27b98" Oct 07 13:06:55 crc kubenswrapper[4854]: E1007 13:06:55.706600 4854 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:07:06 crc kubenswrapper[4854]: I1007 13:07:06.702223 4854 scope.go:117] "RemoveContainer" containerID="de7eb6481963900983ac98a60edbefb2fb8bb4b50f644caad50d425b3ee27b98" Oct 07 13:07:06 crc kubenswrapper[4854]: E1007 13:07:06.702975 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:07:18 crc kubenswrapper[4854]: I1007 13:07:18.703504 4854 scope.go:117] "RemoveContainer" containerID="de7eb6481963900983ac98a60edbefb2fb8bb4b50f644caad50d425b3ee27b98" Oct 07 13:07:18 crc kubenswrapper[4854]: E1007 13:07:18.705816 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:07:32 crc kubenswrapper[4854]: I1007 13:07:32.703398 4854 scope.go:117] "RemoveContainer" containerID="de7eb6481963900983ac98a60edbefb2fb8bb4b50f644caad50d425b3ee27b98" Oct 07 13:07:32 crc kubenswrapper[4854]: E1007 13:07:32.704492 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:07:45 crc kubenswrapper[4854]: I1007 13:07:45.702562 4854 scope.go:117] "RemoveContainer" containerID="de7eb6481963900983ac98a60edbefb2fb8bb4b50f644caad50d425b3ee27b98" Oct 07 13:07:45 crc kubenswrapper[4854]: E1007 13:07:45.703184 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:08:00 crc kubenswrapper[4854]: I1007 13:08:00.703133 4854 scope.go:117] "RemoveContainer" containerID="de7eb6481963900983ac98a60edbefb2fb8bb4b50f644caad50d425b3ee27b98" Oct 07 13:08:00 crc kubenswrapper[4854]: E1007 13:08:00.704241 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:08:12 crc kubenswrapper[4854]: I1007 13:08:12.703419 4854 scope.go:117] "RemoveContainer" containerID="de7eb6481963900983ac98a60edbefb2fb8bb4b50f644caad50d425b3ee27b98" Oct 07 13:08:12 crc kubenswrapper[4854]: E1007 13:08:12.704402 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:08:16 crc kubenswrapper[4854]: I1007 13:08:16.563666 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9tlhw"] Oct 07 13:08:16 crc kubenswrapper[4854]: E1007 13:08:16.565927 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00b185a8-2646-490c-bea2-68299a788c2d" containerName="extract-content" Oct 07 13:08:16 crc kubenswrapper[4854]: I1007 13:08:16.566172 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="00b185a8-2646-490c-bea2-68299a788c2d" containerName="extract-content" Oct 07 13:08:16 crc kubenswrapper[4854]: E1007 13:08:16.566356 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00b185a8-2646-490c-bea2-68299a788c2d" containerName="extract-utilities" Oct 07 13:08:16 crc kubenswrapper[4854]: I1007 13:08:16.566503 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="00b185a8-2646-490c-bea2-68299a788c2d" containerName="extract-utilities" Oct 07 13:08:16 crc kubenswrapper[4854]: E1007 13:08:16.566661 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00b185a8-2646-490c-bea2-68299a788c2d" containerName="registry-server" Oct 07 13:08:16 crc kubenswrapper[4854]: I1007 13:08:16.567656 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="00b185a8-2646-490c-bea2-68299a788c2d" containerName="registry-server" Oct 07 13:08:16 crc kubenswrapper[4854]: I1007 13:08:16.568184 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="00b185a8-2646-490c-bea2-68299a788c2d" containerName="registry-server" Oct 07 13:08:16 crc kubenswrapper[4854]: I1007 13:08:16.574043 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9tlhw" Oct 07 13:08:16 crc kubenswrapper[4854]: I1007 13:08:16.576692 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9tlhw"] Oct 07 13:08:16 crc kubenswrapper[4854]: I1007 13:08:16.688223 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7wp4\" (UniqueName: \"kubernetes.io/projected/6328902c-ee05-4265-95dc-ea1cbb9aa397-kube-api-access-j7wp4\") pod \"redhat-marketplace-9tlhw\" (UID: \"6328902c-ee05-4265-95dc-ea1cbb9aa397\") " pod="openshift-marketplace/redhat-marketplace-9tlhw" Oct 07 13:08:16 crc kubenswrapper[4854]: I1007 13:08:16.688291 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6328902c-ee05-4265-95dc-ea1cbb9aa397-catalog-content\") pod \"redhat-marketplace-9tlhw\" (UID: \"6328902c-ee05-4265-95dc-ea1cbb9aa397\") " pod="openshift-marketplace/redhat-marketplace-9tlhw" Oct 07 13:08:16 crc kubenswrapper[4854]: I1007 13:08:16.688317 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6328902c-ee05-4265-95dc-ea1cbb9aa397-utilities\") pod \"redhat-marketplace-9tlhw\" (UID: \"6328902c-ee05-4265-95dc-ea1cbb9aa397\") " pod="openshift-marketplace/redhat-marketplace-9tlhw" Oct 07 13:08:16 crc kubenswrapper[4854]: I1007 13:08:16.789813 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7wp4\" (UniqueName: \"kubernetes.io/projected/6328902c-ee05-4265-95dc-ea1cbb9aa397-kube-api-access-j7wp4\") pod \"redhat-marketplace-9tlhw\" (UID: \"6328902c-ee05-4265-95dc-ea1cbb9aa397\") " pod="openshift-marketplace/redhat-marketplace-9tlhw" Oct 07 13:08:16 crc kubenswrapper[4854]: I1007 13:08:16.789880 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6328902c-ee05-4265-95dc-ea1cbb9aa397-catalog-content\") pod \"redhat-marketplace-9tlhw\" (UID: \"6328902c-ee05-4265-95dc-ea1cbb9aa397\") " pod="openshift-marketplace/redhat-marketplace-9tlhw" Oct 07 13:08:16 crc kubenswrapper[4854]: I1007 13:08:16.789912 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6328902c-ee05-4265-95dc-ea1cbb9aa397-utilities\") pod \"redhat-marketplace-9tlhw\" (UID: \"6328902c-ee05-4265-95dc-ea1cbb9aa397\") " pod="openshift-marketplace/redhat-marketplace-9tlhw" Oct 07 13:08:16 crc kubenswrapper[4854]: I1007 13:08:16.790534 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6328902c-ee05-4265-95dc-ea1cbb9aa397-utilities\") pod \"redhat-marketplace-9tlhw\" (UID: \"6328902c-ee05-4265-95dc-ea1cbb9aa397\") " pod="openshift-marketplace/redhat-marketplace-9tlhw" Oct 07 13:08:16 crc kubenswrapper[4854]: I1007 13:08:16.790766 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6328902c-ee05-4265-95dc-ea1cbb9aa397-catalog-content\") pod \"redhat-marketplace-9tlhw\" (UID: \"6328902c-ee05-4265-95dc-ea1cbb9aa397\") " pod="openshift-marketplace/redhat-marketplace-9tlhw" Oct 07 13:08:16 crc kubenswrapper[4854]: I1007 13:08:16.818464 4854 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-j7wp4\" (UniqueName: \"kubernetes.io/projected/6328902c-ee05-4265-95dc-ea1cbb9aa397-kube-api-access-j7wp4\") pod \"redhat-marketplace-9tlhw\" (UID: \"6328902c-ee05-4265-95dc-ea1cbb9aa397\") " pod="openshift-marketplace/redhat-marketplace-9tlhw" Oct 07 13:08:16 crc kubenswrapper[4854]: I1007 13:08:16.901692 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9tlhw" Oct 07 13:08:17 crc kubenswrapper[4854]: I1007 13:08:17.116676 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9tlhw"] Oct 07 13:08:17 crc kubenswrapper[4854]: I1007 13:08:17.141882 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9tlhw" event={"ID":"6328902c-ee05-4265-95dc-ea1cbb9aa397","Type":"ContainerStarted","Data":"0cd49ab74e70ddba5ca456e17cb8bfab3680d1a71a6ed4ab5502caa24ce79782"} Oct 07 13:08:18 crc kubenswrapper[4854]: I1007 13:08:18.156675 4854 generic.go:334] "Generic (PLEG): container finished" podID="6328902c-ee05-4265-95dc-ea1cbb9aa397" containerID="04a08323d50b248c63e798a9309551f7c7d336fc0f276ff69ab6e282d427ac3f" exitCode=0 Oct 07 13:08:18 crc kubenswrapper[4854]: I1007 13:08:18.156765 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9tlhw" event={"ID":"6328902c-ee05-4265-95dc-ea1cbb9aa397","Type":"ContainerDied","Data":"04a08323d50b248c63e798a9309551f7c7d336fc0f276ff69ab6e282d427ac3f"} Oct 07 13:08:18 crc kubenswrapper[4854]: I1007 13:08:18.159946 4854 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 13:08:20 crc kubenswrapper[4854]: I1007 13:08:20.181125 4854 generic.go:334] "Generic (PLEG): container finished" podID="6328902c-ee05-4265-95dc-ea1cbb9aa397" containerID="2dfe21a26cfa80904a95d94884e7d6199619127b9f036a49a05cc18a3ffe5ea8" exitCode=0 Oct 07 13:08:20 crc kubenswrapper[4854]: I1007 13:08:20.181220 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9tlhw" event={"ID":"6328902c-ee05-4265-95dc-ea1cbb9aa397","Type":"ContainerDied","Data":"2dfe21a26cfa80904a95d94884e7d6199619127b9f036a49a05cc18a3ffe5ea8"} Oct 07 13:08:21 crc kubenswrapper[4854]: I1007 13:08:21.190821 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9tlhw" event={"ID":"6328902c-ee05-4265-95dc-ea1cbb9aa397","Type":"ContainerStarted","Data":"a91d8fa9c48bed737b0c18c39953182934b2b3beed0cf73b81355a20d52ca978"} Oct 07 13:08:23 crc kubenswrapper[4854]: I1007 13:08:23.703037 4854 scope.go:117] "RemoveContainer" containerID="de7eb6481963900983ac98a60edbefb2fb8bb4b50f644caad50d425b3ee27b98" Oct 07 13:08:23 crc kubenswrapper[4854]: E1007 13:08:23.703763 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:08:26 crc kubenswrapper[4854]: I1007 13:08:26.903038 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9tlhw" Oct 07 13:08:26 crc kubenswrapper[4854]: I1007 13:08:26.903554 4854 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9tlhw" Oct 07 13:08:26 crc kubenswrapper[4854]: I1007 13:08:26.962708 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9tlhw" Oct 07 13:08:27 crc kubenswrapper[4854]: I1007 13:08:27.000339 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9tlhw" podStartSLOduration=8.501356844 podStartE2EDuration="11.000311676s" podCreationTimestamp="2025-10-07 13:08:16 +0000 UTC" firstStartedPulling="2025-10-07 13:08:18.159664107 +0000 UTC m=+2614.147496382" lastFinishedPulling="2025-10-07 13:08:20.658618959 +0000 UTC m=+2616.646451214" observedRunningTime="2025-10-07 13:08:21.2135048 +0000 UTC m=+2617.201337055" watchObservedRunningTime="2025-10-07 13:08:27.000311676 +0000 UTC m=+2622.988143961" Oct 07 13:08:27 crc kubenswrapper[4854]: I1007 13:08:27.330954 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9tlhw" Oct 07 13:08:27 crc kubenswrapper[4854]: I1007 13:08:27.386906 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9tlhw"] Oct 07 13:08:29 crc kubenswrapper[4854]: I1007 13:08:29.273848 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9tlhw" podUID="6328902c-ee05-4265-95dc-ea1cbb9aa397" containerName="registry-server" containerID="cri-o://a91d8fa9c48bed737b0c18c39953182934b2b3beed0cf73b81355a20d52ca978" gracePeriod=2 Oct 07 13:08:29 crc kubenswrapper[4854]: E1007 13:08:29.476909 4854 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6328902c_ee05_4265_95dc_ea1cbb9aa397.slice/crio-a91d8fa9c48bed737b0c18c39953182934b2b3beed0cf73b81355a20d52ca978.scope\": RecentStats: unable to find data in memory cache]" Oct 07 13:08:30 crc kubenswrapper[4854]: I1007 13:08:30.284197 4854 generic.go:334] "Generic (PLEG): container finished" podID="6328902c-ee05-4265-95dc-ea1cbb9aa397" containerID="a91d8fa9c48bed737b0c18c39953182934b2b3beed0cf73b81355a20d52ca978" exitCode=0 Oct 07 13:08:30 crc kubenswrapper[4854]: I1007 13:08:30.284286 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9tlhw" event={"ID":"6328902c-ee05-4265-95dc-ea1cbb9aa397","Type":"ContainerDied","Data":"a91d8fa9c48bed737b0c18c39953182934b2b3beed0cf73b81355a20d52ca978"} Oct 07 13:08:30 crc kubenswrapper[4854]: I1007 13:08:30.372938 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9tlhw" Oct 07 13:08:30 crc kubenswrapper[4854]: I1007 13:08:30.531585 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6328902c-ee05-4265-95dc-ea1cbb9aa397-utilities\") pod \"6328902c-ee05-4265-95dc-ea1cbb9aa397\" (UID: \"6328902c-ee05-4265-95dc-ea1cbb9aa397\") " Oct 07 13:08:30 crc kubenswrapper[4854]: I1007 13:08:30.531713 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6328902c-ee05-4265-95dc-ea1cbb9aa397-catalog-content\") pod \"6328902c-ee05-4265-95dc-ea1cbb9aa397\" (UID: \"6328902c-ee05-4265-95dc-ea1cbb9aa397\") " Oct 07 13:08:30 crc kubenswrapper[4854]: I1007 13:08:30.531904 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7wp4\" (UniqueName: \"kubernetes.io/projected/6328902c-ee05-4265-95dc-ea1cbb9aa397-kube-api-access-j7wp4\") pod \"6328902c-ee05-4265-95dc-ea1cbb9aa397\" (UID: \"6328902c-ee05-4265-95dc-ea1cbb9aa397\") " Oct 07 13:08:30 crc kubenswrapper[4854]: I1007 13:08:30.532715 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6328902c-ee05-4265-95dc-ea1cbb9aa397-utilities" (OuterVolumeSpecName: "utilities") pod "6328902c-ee05-4265-95dc-ea1cbb9aa397" (UID: "6328902c-ee05-4265-95dc-ea1cbb9aa397"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:08:30 crc kubenswrapper[4854]: I1007 13:08:30.537852 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6328902c-ee05-4265-95dc-ea1cbb9aa397-kube-api-access-j7wp4" (OuterVolumeSpecName: "kube-api-access-j7wp4") pod "6328902c-ee05-4265-95dc-ea1cbb9aa397" (UID: "6328902c-ee05-4265-95dc-ea1cbb9aa397"). InnerVolumeSpecName "kube-api-access-j7wp4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:08:30 crc kubenswrapper[4854]: I1007 13:08:30.634245 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7wp4\" (UniqueName: \"kubernetes.io/projected/6328902c-ee05-4265-95dc-ea1cbb9aa397-kube-api-access-j7wp4\") on node \"crc\" DevicePath \"\"" Oct 07 13:08:30 crc kubenswrapper[4854]: I1007 13:08:30.634293 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6328902c-ee05-4265-95dc-ea1cbb9aa397-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:08:30 crc kubenswrapper[4854]: I1007 13:08:30.930427 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6328902c-ee05-4265-95dc-ea1cbb9aa397-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6328902c-ee05-4265-95dc-ea1cbb9aa397" (UID: "6328902c-ee05-4265-95dc-ea1cbb9aa397"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:08:30 crc kubenswrapper[4854]: I1007 13:08:30.939006 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6328902c-ee05-4265-95dc-ea1cbb9aa397-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:08:31 crc kubenswrapper[4854]: I1007 13:08:31.296426 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9tlhw" event={"ID":"6328902c-ee05-4265-95dc-ea1cbb9aa397","Type":"ContainerDied","Data":"0cd49ab74e70ddba5ca456e17cb8bfab3680d1a71a6ed4ab5502caa24ce79782"} Oct 07 13:08:31 crc kubenswrapper[4854]: I1007 13:08:31.296479 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9tlhw" Oct 07 13:08:31 crc kubenswrapper[4854]: I1007 13:08:31.296493 4854 scope.go:117] "RemoveContainer" containerID="a91d8fa9c48bed737b0c18c39953182934b2b3beed0cf73b81355a20d52ca978" Oct 07 13:08:31 crc kubenswrapper[4854]: I1007 13:08:31.329386 4854 scope.go:117] "RemoveContainer" containerID="2dfe21a26cfa80904a95d94884e7d6199619127b9f036a49a05cc18a3ffe5ea8" Oct 07 13:08:31 crc kubenswrapper[4854]: I1007 13:08:31.350931 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9tlhw"] Oct 07 13:08:31 crc kubenswrapper[4854]: I1007 13:08:31.350981 4854 scope.go:117] "RemoveContainer" containerID="04a08323d50b248c63e798a9309551f7c7d336fc0f276ff69ab6e282d427ac3f" Oct 07 13:08:31 crc kubenswrapper[4854]: I1007 13:08:31.355589 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9tlhw"] Oct 07 13:08:32 crc kubenswrapper[4854]: I1007 13:08:32.719782 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6328902c-ee05-4265-95dc-ea1cbb9aa397" path="/var/lib/kubelet/pods/6328902c-ee05-4265-95dc-ea1cbb9aa397/volumes" Oct 07 13:08:35 crc kubenswrapper[4854]: I1007 13:08:35.703460 4854 scope.go:117] "RemoveContainer" containerID="de7eb6481963900983ac98a60edbefb2fb8bb4b50f644caad50d425b3ee27b98" Oct 07 13:08:35 crc kubenswrapper[4854]: E1007 13:08:35.704219 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:08:47 crc kubenswrapper[4854]: I1007 13:08:47.703503 4854 scope.go:117] "RemoveContainer" containerID="de7eb6481963900983ac98a60edbefb2fb8bb4b50f644caad50d425b3ee27b98" Oct 07 13:08:47 crc kubenswrapper[4854]: E1007 13:08:47.704527 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:09:02 crc kubenswrapper[4854]: I1007 13:09:02.702201 4854 scope.go:117] "RemoveContainer" containerID="de7eb6481963900983ac98a60edbefb2fb8bb4b50f644caad50d425b3ee27b98" Oct 07 13:09:02 crc kubenswrapper[4854]: E1007 
13:09:02.702790 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:09:14 crc kubenswrapper[4854]: I1007 13:09:14.707682 4854 scope.go:117] "RemoveContainer" containerID="de7eb6481963900983ac98a60edbefb2fb8bb4b50f644caad50d425b3ee27b98" Oct 07 13:09:14 crc kubenswrapper[4854]: E1007 13:09:14.708535 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:09:27 crc kubenswrapper[4854]: I1007 13:09:27.702560 4854 scope.go:117] "RemoveContainer" containerID="de7eb6481963900983ac98a60edbefb2fb8bb4b50f644caad50d425b3ee27b98" Oct 07 13:09:27 crc kubenswrapper[4854]: E1007 13:09:27.703596 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:09:39 crc kubenswrapper[4854]: I1007 13:09:39.703501 4854 scope.go:117] "RemoveContainer" containerID="de7eb6481963900983ac98a60edbefb2fb8bb4b50f644caad50d425b3ee27b98" Oct 07 13:09:39 crc kubenswrapper[4854]: E1007 13:09:39.704520 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:09:47 crc kubenswrapper[4854]: I1007 13:09:47.728323 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7xf7t"] Oct 07 13:09:47 crc kubenswrapper[4854]: E1007 13:09:47.731197 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6328902c-ee05-4265-95dc-ea1cbb9aa397" containerName="extract-utilities" Oct 07 13:09:47 crc kubenswrapper[4854]: I1007 13:09:47.731369 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="6328902c-ee05-4265-95dc-ea1cbb9aa397" containerName="extract-utilities" Oct 07 13:09:47 crc kubenswrapper[4854]: E1007 13:09:47.731533 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6328902c-ee05-4265-95dc-ea1cbb9aa397" containerName="registry-server" Oct 07 13:09:47 crc kubenswrapper[4854]: I1007 13:09:47.731657 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="6328902c-ee05-4265-95dc-ea1cbb9aa397" containerName="registry-server" Oct 07 13:09:47 crc kubenswrapper[4854]: E1007 13:09:47.731798 4854 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="6328902c-ee05-4265-95dc-ea1cbb9aa397" containerName="extract-content" Oct 07 13:09:47 crc kubenswrapper[4854]: I1007 13:09:47.731917 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="6328902c-ee05-4265-95dc-ea1cbb9aa397" containerName="extract-content" Oct 07 13:09:47 crc kubenswrapper[4854]: I1007 13:09:47.732359 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="6328902c-ee05-4265-95dc-ea1cbb9aa397" containerName="registry-server" Oct 07 13:09:47 crc kubenswrapper[4854]: I1007 13:09:47.735305 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7xf7t" Oct 07 13:09:47 crc kubenswrapper[4854]: I1007 13:09:47.752141 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7xf7t"] Oct 07 13:09:47 crc kubenswrapper[4854]: I1007 13:09:47.927851 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a99aa1f8-4bb7-4a66-8664-cf1235def9ef-utilities\") pod \"community-operators-7xf7t\" (UID: \"a99aa1f8-4bb7-4a66-8664-cf1235def9ef\") " pod="openshift-marketplace/community-operators-7xf7t" Oct 07 13:09:47 crc kubenswrapper[4854]: I1007 13:09:47.928439 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a99aa1f8-4bb7-4a66-8664-cf1235def9ef-catalog-content\") pod \"community-operators-7xf7t\" (UID: \"a99aa1f8-4bb7-4a66-8664-cf1235def9ef\") " pod="openshift-marketplace/community-operators-7xf7t" Oct 07 13:09:47 crc kubenswrapper[4854]: I1007 13:09:47.928650 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5f7f\" (UniqueName: \"kubernetes.io/projected/a99aa1f8-4bb7-4a66-8664-cf1235def9ef-kube-api-access-s5f7f\") pod \"community-operators-7xf7t\" (UID: \"a99aa1f8-4bb7-4a66-8664-cf1235def9ef\") " pod="openshift-marketplace/community-operators-7xf7t" Oct 07 13:09:48 crc kubenswrapper[4854]: I1007 13:09:48.030341 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a99aa1f8-4bb7-4a66-8664-cf1235def9ef-utilities\") pod \"community-operators-7xf7t\" (UID: \"a99aa1f8-4bb7-4a66-8664-cf1235def9ef\") " pod="openshift-marketplace/community-operators-7xf7t" Oct 07 13:09:48 crc kubenswrapper[4854]: I1007 13:09:48.030797 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a99aa1f8-4bb7-4a66-8664-cf1235def9ef-catalog-content\") pod \"community-operators-7xf7t\" (UID: \"a99aa1f8-4bb7-4a66-8664-cf1235def9ef\") " pod="openshift-marketplace/community-operators-7xf7t" Oct 07 13:09:48 crc kubenswrapper[4854]: I1007 13:09:48.030997 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5f7f\" (UniqueName: \"kubernetes.io/projected/a99aa1f8-4bb7-4a66-8664-cf1235def9ef-kube-api-access-s5f7f\") pod \"community-operators-7xf7t\" (UID: \"a99aa1f8-4bb7-4a66-8664-cf1235def9ef\") " pod="openshift-marketplace/community-operators-7xf7t" Oct 07 13:09:48 crc kubenswrapper[4854]: I1007 13:09:48.030872 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a99aa1f8-4bb7-4a66-8664-cf1235def9ef-utilities\") pod 
\"community-operators-7xf7t\" (UID: \"a99aa1f8-4bb7-4a66-8664-cf1235def9ef\") " pod="openshift-marketplace/community-operators-7xf7t" Oct 07 13:09:48 crc kubenswrapper[4854]: I1007 13:09:48.031280 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a99aa1f8-4bb7-4a66-8664-cf1235def9ef-catalog-content\") pod \"community-operators-7xf7t\" (UID: \"a99aa1f8-4bb7-4a66-8664-cf1235def9ef\") " pod="openshift-marketplace/community-operators-7xf7t" Oct 07 13:09:48 crc kubenswrapper[4854]: I1007 13:09:48.057271 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5f7f\" (UniqueName: \"kubernetes.io/projected/a99aa1f8-4bb7-4a66-8664-cf1235def9ef-kube-api-access-s5f7f\") pod \"community-operators-7xf7t\" (UID: \"a99aa1f8-4bb7-4a66-8664-cf1235def9ef\") " pod="openshift-marketplace/community-operators-7xf7t" Oct 07 13:09:48 crc kubenswrapper[4854]: I1007 13:09:48.059986 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7xf7t" Oct 07 13:09:48 crc kubenswrapper[4854]: I1007 13:09:48.602699 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7xf7t"] Oct 07 13:09:49 crc kubenswrapper[4854]: I1007 13:09:49.031869 4854 generic.go:334] "Generic (PLEG): container finished" podID="a99aa1f8-4bb7-4a66-8664-cf1235def9ef" containerID="bb41031b87c1eafc6a0a2f45e0fde544b0d4769d5f4860aa53b5085b76f98bb9" exitCode=0 Oct 07 13:09:49 crc kubenswrapper[4854]: I1007 13:09:49.031943 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7xf7t" event={"ID":"a99aa1f8-4bb7-4a66-8664-cf1235def9ef","Type":"ContainerDied","Data":"bb41031b87c1eafc6a0a2f45e0fde544b0d4769d5f4860aa53b5085b76f98bb9"} Oct 07 13:09:49 crc kubenswrapper[4854]: I1007 13:09:49.032009 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7xf7t" event={"ID":"a99aa1f8-4bb7-4a66-8664-cf1235def9ef","Type":"ContainerStarted","Data":"f6dce564f8fab40763cc9ea8102f5197d93814d2caeb13530da185be68a8ef2a"} Oct 07 13:09:51 crc kubenswrapper[4854]: I1007 13:09:51.053050 4854 generic.go:334] "Generic (PLEG): container finished" podID="a99aa1f8-4bb7-4a66-8664-cf1235def9ef" containerID="eef6101f737ad6dcd2acd78d3d503d55e3b79ccadd692c00e0e438b6e7d997b9" exitCode=0 Oct 07 13:09:51 crc kubenswrapper[4854]: I1007 13:09:51.053193 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7xf7t" event={"ID":"a99aa1f8-4bb7-4a66-8664-cf1235def9ef","Type":"ContainerDied","Data":"eef6101f737ad6dcd2acd78d3d503d55e3b79ccadd692c00e0e438b6e7d997b9"} Oct 07 13:09:52 crc kubenswrapper[4854]: I1007 13:09:52.066214 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7xf7t" event={"ID":"a99aa1f8-4bb7-4a66-8664-cf1235def9ef","Type":"ContainerStarted","Data":"b6c4eb531eecd65aff00fb4f9d491e0bb2a0903aab5c07a76e7c29a37c285761"} Oct 07 13:09:52 crc kubenswrapper[4854]: I1007 13:09:52.087369 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7xf7t" podStartSLOduration=2.434030762 podStartE2EDuration="5.087349119s" podCreationTimestamp="2025-10-07 13:09:47 +0000 UTC" firstStartedPulling="2025-10-07 13:09:49.033714462 +0000 UTC m=+2705.021546747" lastFinishedPulling="2025-10-07 13:09:51.687032849 +0000 UTC 
m=+2707.674865104" observedRunningTime="2025-10-07 13:09:52.082116508 +0000 UTC m=+2708.069948773" watchObservedRunningTime="2025-10-07 13:09:52.087349119 +0000 UTC m=+2708.075181384" Oct 07 13:09:52 crc kubenswrapper[4854]: I1007 13:09:52.703279 4854 scope.go:117] "RemoveContainer" containerID="de7eb6481963900983ac98a60edbefb2fb8bb4b50f644caad50d425b3ee27b98" Oct 07 13:09:53 crc kubenswrapper[4854]: I1007 13:09:53.079241 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" event={"ID":"40b8b82d-cfd5-41d7-8673-5774db092c85","Type":"ContainerStarted","Data":"9dd9e16eaf55f4ba69f659bf6e7b7b2660a18997e1ad98c00cf8a756a7e4a562"} Oct 07 13:09:58 crc kubenswrapper[4854]: I1007 13:09:58.061403 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7xf7t" Oct 07 13:09:58 crc kubenswrapper[4854]: I1007 13:09:58.063921 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7xf7t" Oct 07 13:09:58 crc kubenswrapper[4854]: I1007 13:09:58.146700 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7xf7t" Oct 07 13:09:59 crc kubenswrapper[4854]: I1007 13:09:59.200832 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7xf7t" Oct 07 13:09:59 crc kubenswrapper[4854]: I1007 13:09:59.264401 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7xf7t"] Oct 07 13:10:01 crc kubenswrapper[4854]: I1007 13:10:01.153478 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7xf7t" podUID="a99aa1f8-4bb7-4a66-8664-cf1235def9ef" containerName="registry-server" containerID="cri-o://b6c4eb531eecd65aff00fb4f9d491e0bb2a0903aab5c07a76e7c29a37c285761" gracePeriod=2 Oct 07 13:10:01 crc kubenswrapper[4854]: E1007 13:10:01.559949 4854 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda99aa1f8_4bb7_4a66_8664_cf1235def9ef.slice/crio-b6c4eb531eecd65aff00fb4f9d491e0bb2a0903aab5c07a76e7c29a37c285761.scope\": RecentStats: unable to find data in memory cache]" Oct 07 13:10:02 crc kubenswrapper[4854]: I1007 13:10:02.170969 4854 generic.go:334] "Generic (PLEG): container finished" podID="a99aa1f8-4bb7-4a66-8664-cf1235def9ef" containerID="b6c4eb531eecd65aff00fb4f9d491e0bb2a0903aab5c07a76e7c29a37c285761" exitCode=0 Oct 07 13:10:02 crc kubenswrapper[4854]: I1007 13:10:02.171030 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7xf7t" event={"ID":"a99aa1f8-4bb7-4a66-8664-cf1235def9ef","Type":"ContainerDied","Data":"b6c4eb531eecd65aff00fb4f9d491e0bb2a0903aab5c07a76e7c29a37c285761"} Oct 07 13:10:02 crc kubenswrapper[4854]: I1007 13:10:02.390899 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7xf7t" Oct 07 13:10:02 crc kubenswrapper[4854]: I1007 13:10:02.588285 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a99aa1f8-4bb7-4a66-8664-cf1235def9ef-utilities\") pod \"a99aa1f8-4bb7-4a66-8664-cf1235def9ef\" (UID: \"a99aa1f8-4bb7-4a66-8664-cf1235def9ef\") " Oct 07 13:10:02 crc kubenswrapper[4854]: I1007 13:10:02.588449 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a99aa1f8-4bb7-4a66-8664-cf1235def9ef-catalog-content\") pod \"a99aa1f8-4bb7-4a66-8664-cf1235def9ef\" (UID: \"a99aa1f8-4bb7-4a66-8664-cf1235def9ef\") " Oct 07 13:10:02 crc kubenswrapper[4854]: I1007 13:10:02.588544 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5f7f\" (UniqueName: \"kubernetes.io/projected/a99aa1f8-4bb7-4a66-8664-cf1235def9ef-kube-api-access-s5f7f\") pod \"a99aa1f8-4bb7-4a66-8664-cf1235def9ef\" (UID: \"a99aa1f8-4bb7-4a66-8664-cf1235def9ef\") " Oct 07 13:10:02 crc kubenswrapper[4854]: I1007 13:10:02.589648 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a99aa1f8-4bb7-4a66-8664-cf1235def9ef-utilities" (OuterVolumeSpecName: "utilities") pod "a99aa1f8-4bb7-4a66-8664-cf1235def9ef" (UID: "a99aa1f8-4bb7-4a66-8664-cf1235def9ef"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:10:02 crc kubenswrapper[4854]: I1007 13:10:02.598324 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a99aa1f8-4bb7-4a66-8664-cf1235def9ef-kube-api-access-s5f7f" (OuterVolumeSpecName: "kube-api-access-s5f7f") pod "a99aa1f8-4bb7-4a66-8664-cf1235def9ef" (UID: "a99aa1f8-4bb7-4a66-8664-cf1235def9ef"). InnerVolumeSpecName "kube-api-access-s5f7f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:10:02 crc kubenswrapper[4854]: I1007 13:10:02.681861 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a99aa1f8-4bb7-4a66-8664-cf1235def9ef-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a99aa1f8-4bb7-4a66-8664-cf1235def9ef" (UID: "a99aa1f8-4bb7-4a66-8664-cf1235def9ef"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:10:02 crc kubenswrapper[4854]: I1007 13:10:02.690677 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a99aa1f8-4bb7-4a66-8664-cf1235def9ef-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:10:02 crc kubenswrapper[4854]: I1007 13:10:02.690722 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a99aa1f8-4bb7-4a66-8664-cf1235def9ef-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:10:02 crc kubenswrapper[4854]: I1007 13:10:02.690742 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5f7f\" (UniqueName: \"kubernetes.io/projected/a99aa1f8-4bb7-4a66-8664-cf1235def9ef-kube-api-access-s5f7f\") on node \"crc\" DevicePath \"\"" Oct 07 13:10:03 crc kubenswrapper[4854]: I1007 13:10:03.188256 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7xf7t" event={"ID":"a99aa1f8-4bb7-4a66-8664-cf1235def9ef","Type":"ContainerDied","Data":"f6dce564f8fab40763cc9ea8102f5197d93814d2caeb13530da185be68a8ef2a"} Oct 07 13:10:03 crc kubenswrapper[4854]: I1007 13:10:03.188351 4854 scope.go:117] "RemoveContainer" containerID="b6c4eb531eecd65aff00fb4f9d491e0bb2a0903aab5c07a76e7c29a37c285761" Oct 07 13:10:03 crc kubenswrapper[4854]: I1007 13:10:03.188462 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7xf7t" Oct 07 13:10:03 crc kubenswrapper[4854]: I1007 13:10:03.225869 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7xf7t"] Oct 07 13:10:03 crc kubenswrapper[4854]: I1007 13:10:03.227879 4854 scope.go:117] "RemoveContainer" containerID="eef6101f737ad6dcd2acd78d3d503d55e3b79ccadd692c00e0e438b6e7d997b9" Oct 07 13:10:03 crc kubenswrapper[4854]: I1007 13:10:03.244393 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7xf7t"] Oct 07 13:10:03 crc kubenswrapper[4854]: I1007 13:10:03.266361 4854 scope.go:117] "RemoveContainer" containerID="bb41031b87c1eafc6a0a2f45e0fde544b0d4769d5f4860aa53b5085b76f98bb9" Oct 07 13:10:04 crc kubenswrapper[4854]: I1007 13:10:04.726954 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a99aa1f8-4bb7-4a66-8664-cf1235def9ef" path="/var/lib/kubelet/pods/a99aa1f8-4bb7-4a66-8664-cf1235def9ef/volumes" Oct 07 13:11:23 crc kubenswrapper[4854]: I1007 13:11:23.466809 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dc8z7"] Oct 07 13:11:23 crc kubenswrapper[4854]: E1007 13:11:23.467793 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a99aa1f8-4bb7-4a66-8664-cf1235def9ef" containerName="registry-server" Oct 07 13:11:23 crc kubenswrapper[4854]: I1007 13:11:23.467811 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="a99aa1f8-4bb7-4a66-8664-cf1235def9ef" containerName="registry-server" Oct 07 13:11:23 crc kubenswrapper[4854]: E1007 13:11:23.467834 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a99aa1f8-4bb7-4a66-8664-cf1235def9ef" containerName="extract-utilities" Oct 07 13:11:23 crc kubenswrapper[4854]: I1007 13:11:23.467844 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="a99aa1f8-4bb7-4a66-8664-cf1235def9ef" containerName="extract-utilities" Oct 07 13:11:23 crc kubenswrapper[4854]: E1007 13:11:23.467872 4854 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a99aa1f8-4bb7-4a66-8664-cf1235def9ef" containerName="extract-content" Oct 07 13:11:23 crc kubenswrapper[4854]: I1007 13:11:23.467880 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="a99aa1f8-4bb7-4a66-8664-cf1235def9ef" containerName="extract-content" Oct 07 13:11:23 crc kubenswrapper[4854]: I1007 13:11:23.468051 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="a99aa1f8-4bb7-4a66-8664-cf1235def9ef" containerName="registry-server" Oct 07 13:11:23 crc kubenswrapper[4854]: I1007 13:11:23.469714 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dc8z7" Oct 07 13:11:23 crc kubenswrapper[4854]: I1007 13:11:23.475610 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dc8z7"] Oct 07 13:11:23 crc kubenswrapper[4854]: I1007 13:11:23.661480 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc0c2a7c-d842-4860-acbf-fefb26f459be-catalog-content\") pod \"certified-operators-dc8z7\" (UID: \"bc0c2a7c-d842-4860-acbf-fefb26f459be\") " pod="openshift-marketplace/certified-operators-dc8z7" Oct 07 13:11:23 crc kubenswrapper[4854]: I1007 13:11:23.661539 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bv8cl\" (UniqueName: \"kubernetes.io/projected/bc0c2a7c-d842-4860-acbf-fefb26f459be-kube-api-access-bv8cl\") pod \"certified-operators-dc8z7\" (UID: \"bc0c2a7c-d842-4860-acbf-fefb26f459be\") " pod="openshift-marketplace/certified-operators-dc8z7" Oct 07 13:11:23 crc kubenswrapper[4854]: I1007 13:11:23.661914 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc0c2a7c-d842-4860-acbf-fefb26f459be-utilities\") pod \"certified-operators-dc8z7\" (UID: \"bc0c2a7c-d842-4860-acbf-fefb26f459be\") " pod="openshift-marketplace/certified-operators-dc8z7" Oct 07 13:11:23 crc kubenswrapper[4854]: I1007 13:11:23.763234 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc0c2a7c-d842-4860-acbf-fefb26f459be-utilities\") pod \"certified-operators-dc8z7\" (UID: \"bc0c2a7c-d842-4860-acbf-fefb26f459be\") " pod="openshift-marketplace/certified-operators-dc8z7" Oct 07 13:11:23 crc kubenswrapper[4854]: I1007 13:11:23.763310 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc0c2a7c-d842-4860-acbf-fefb26f459be-catalog-content\") pod \"certified-operators-dc8z7\" (UID: \"bc0c2a7c-d842-4860-acbf-fefb26f459be\") " pod="openshift-marketplace/certified-operators-dc8z7" Oct 07 13:11:23 crc kubenswrapper[4854]: I1007 13:11:23.763913 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bv8cl\" (UniqueName: \"kubernetes.io/projected/bc0c2a7c-d842-4860-acbf-fefb26f459be-kube-api-access-bv8cl\") pod \"certified-operators-dc8z7\" (UID: \"bc0c2a7c-d842-4860-acbf-fefb26f459be\") " pod="openshift-marketplace/certified-operators-dc8z7" Oct 07 13:11:23 crc kubenswrapper[4854]: I1007 13:11:23.763916 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/bc0c2a7c-d842-4860-acbf-fefb26f459be-catalog-content\") pod \"certified-operators-dc8z7\" (UID: \"bc0c2a7c-d842-4860-acbf-fefb26f459be\") " pod="openshift-marketplace/certified-operators-dc8z7" Oct 07 13:11:23 crc kubenswrapper[4854]: I1007 13:11:23.764203 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc0c2a7c-d842-4860-acbf-fefb26f459be-utilities\") pod \"certified-operators-dc8z7\" (UID: \"bc0c2a7c-d842-4860-acbf-fefb26f459be\") " pod="openshift-marketplace/certified-operators-dc8z7" Oct 07 13:11:23 crc kubenswrapper[4854]: I1007 13:11:23.794033 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bv8cl\" (UniqueName: \"kubernetes.io/projected/bc0c2a7c-d842-4860-acbf-fefb26f459be-kube-api-access-bv8cl\") pod \"certified-operators-dc8z7\" (UID: \"bc0c2a7c-d842-4860-acbf-fefb26f459be\") " pod="openshift-marketplace/certified-operators-dc8z7" Oct 07 13:11:23 crc kubenswrapper[4854]: I1007 13:11:23.804753 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dc8z7" Oct 07 13:11:24 crc kubenswrapper[4854]: I1007 13:11:24.107294 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dc8z7"] Oct 07 13:11:24 crc kubenswrapper[4854]: W1007 13:11:24.121371 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc0c2a7c_d842_4860_acbf_fefb26f459be.slice/crio-1e6196b3ed9d2a36ddb3744b0c93645cff8ea447625e619efc9b7dcd2990dd80 WatchSource:0}: Error finding container 1e6196b3ed9d2a36ddb3744b0c93645cff8ea447625e619efc9b7dcd2990dd80: Status 404 returned error can't find the container with id 1e6196b3ed9d2a36ddb3744b0c93645cff8ea447625e619efc9b7dcd2990dd80 Oct 07 13:11:24 crc kubenswrapper[4854]: I1007 13:11:24.995827 4854 generic.go:334] "Generic (PLEG): container finished" podID="bc0c2a7c-d842-4860-acbf-fefb26f459be" containerID="b73c32175b83021ba1108c3445f2bdba43475d3768761563b2068c2162a906dc" exitCode=0 Oct 07 13:11:24 crc kubenswrapper[4854]: I1007 13:11:24.996195 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dc8z7" event={"ID":"bc0c2a7c-d842-4860-acbf-fefb26f459be","Type":"ContainerDied","Data":"b73c32175b83021ba1108c3445f2bdba43475d3768761563b2068c2162a906dc"} Oct 07 13:11:24 crc kubenswrapper[4854]: I1007 13:11:24.996224 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dc8z7" event={"ID":"bc0c2a7c-d842-4860-acbf-fefb26f459be","Type":"ContainerStarted","Data":"1e6196b3ed9d2a36ddb3744b0c93645cff8ea447625e619efc9b7dcd2990dd80"} Oct 07 13:11:26 crc kubenswrapper[4854]: I1007 13:11:26.005874 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dc8z7" event={"ID":"bc0c2a7c-d842-4860-acbf-fefb26f459be","Type":"ContainerStarted","Data":"36eac7bcb8c43e9e60d05391037e9e3e3e8689ba8e05fa02969fe6a1a83fc8d3"} Oct 07 13:11:27 crc kubenswrapper[4854]: I1007 13:11:27.022323 4854 generic.go:334] "Generic (PLEG): container finished" podID="bc0c2a7c-d842-4860-acbf-fefb26f459be" containerID="36eac7bcb8c43e9e60d05391037e9e3e3e8689ba8e05fa02969fe6a1a83fc8d3" exitCode=0 Oct 07 13:11:27 crc kubenswrapper[4854]: I1007 13:11:27.022372 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-dc8z7" event={"ID":"bc0c2a7c-d842-4860-acbf-fefb26f459be","Type":"ContainerDied","Data":"36eac7bcb8c43e9e60d05391037e9e3e3e8689ba8e05fa02969fe6a1a83fc8d3"} Oct 07 13:11:28 crc kubenswrapper[4854]: I1007 13:11:28.037224 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dc8z7" event={"ID":"bc0c2a7c-d842-4860-acbf-fefb26f459be","Type":"ContainerStarted","Data":"0a860e8db0f264e33c92c958b0a771c6cbe0f01b4426665e49e33803d47aed55"} Oct 07 13:11:28 crc kubenswrapper[4854]: I1007 13:11:28.063633 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dc8z7" podStartSLOduration=2.612112428 podStartE2EDuration="5.063603442s" podCreationTimestamp="2025-10-07 13:11:23 +0000 UTC" firstStartedPulling="2025-10-07 13:11:24.998466974 +0000 UTC m=+2800.986299239" lastFinishedPulling="2025-10-07 13:11:27.449957968 +0000 UTC m=+2803.437790253" observedRunningTime="2025-10-07 13:11:28.061351397 +0000 UTC m=+2804.049183752" watchObservedRunningTime="2025-10-07 13:11:28.063603442 +0000 UTC m=+2804.051435727" Oct 07 13:11:33 crc kubenswrapper[4854]: I1007 13:11:33.805489 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dc8z7" Oct 07 13:11:33 crc kubenswrapper[4854]: I1007 13:11:33.806238 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dc8z7" Oct 07 13:11:33 crc kubenswrapper[4854]: I1007 13:11:33.875987 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dc8z7" Oct 07 13:11:34 crc kubenswrapper[4854]: I1007 13:11:34.164539 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dc8z7" Oct 07 13:11:34 crc kubenswrapper[4854]: I1007 13:11:34.218700 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dc8z7"] Oct 07 13:11:36 crc kubenswrapper[4854]: I1007 13:11:36.110260 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dc8z7" podUID="bc0c2a7c-d842-4860-acbf-fefb26f459be" containerName="registry-server" containerID="cri-o://0a860e8db0f264e33c92c958b0a771c6cbe0f01b4426665e49e33803d47aed55" gracePeriod=2 Oct 07 13:11:36 crc kubenswrapper[4854]: I1007 13:11:36.584625 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dc8z7" Oct 07 13:11:36 crc kubenswrapper[4854]: I1007 13:11:36.771842 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc0c2a7c-d842-4860-acbf-fefb26f459be-catalog-content\") pod \"bc0c2a7c-d842-4860-acbf-fefb26f459be\" (UID: \"bc0c2a7c-d842-4860-acbf-fefb26f459be\") " Oct 07 13:11:36 crc kubenswrapper[4854]: I1007 13:11:36.772672 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bv8cl\" (UniqueName: \"kubernetes.io/projected/bc0c2a7c-d842-4860-acbf-fefb26f459be-kube-api-access-bv8cl\") pod \"bc0c2a7c-d842-4860-acbf-fefb26f459be\" (UID: \"bc0c2a7c-d842-4860-acbf-fefb26f459be\") " Oct 07 13:11:36 crc kubenswrapper[4854]: I1007 13:11:36.772743 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc0c2a7c-d842-4860-acbf-fefb26f459be-utilities\") pod \"bc0c2a7c-d842-4860-acbf-fefb26f459be\" (UID: \"bc0c2a7c-d842-4860-acbf-fefb26f459be\") " Oct 07 13:11:36 crc kubenswrapper[4854]: I1007 13:11:36.774243 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc0c2a7c-d842-4860-acbf-fefb26f459be-utilities" (OuterVolumeSpecName: "utilities") pod "bc0c2a7c-d842-4860-acbf-fefb26f459be" (UID: "bc0c2a7c-d842-4860-acbf-fefb26f459be"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:11:36 crc kubenswrapper[4854]: I1007 13:11:36.781849 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc0c2a7c-d842-4860-acbf-fefb26f459be-kube-api-access-bv8cl" (OuterVolumeSpecName: "kube-api-access-bv8cl") pod "bc0c2a7c-d842-4860-acbf-fefb26f459be" (UID: "bc0c2a7c-d842-4860-acbf-fefb26f459be"). InnerVolumeSpecName "kube-api-access-bv8cl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:11:36 crc kubenswrapper[4854]: I1007 13:11:36.845352 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc0c2a7c-d842-4860-acbf-fefb26f459be-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bc0c2a7c-d842-4860-acbf-fefb26f459be" (UID: "bc0c2a7c-d842-4860-acbf-fefb26f459be"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:11:36 crc kubenswrapper[4854]: I1007 13:11:36.875139 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc0c2a7c-d842-4860-acbf-fefb26f459be-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:11:36 crc kubenswrapper[4854]: I1007 13:11:36.875277 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bv8cl\" (UniqueName: \"kubernetes.io/projected/bc0c2a7c-d842-4860-acbf-fefb26f459be-kube-api-access-bv8cl\") on node \"crc\" DevicePath \"\"" Oct 07 13:11:36 crc kubenswrapper[4854]: I1007 13:11:36.875316 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc0c2a7c-d842-4860-acbf-fefb26f459be-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:11:37 crc kubenswrapper[4854]: I1007 13:11:37.126947 4854 generic.go:334] "Generic (PLEG): container finished" podID="bc0c2a7c-d842-4860-acbf-fefb26f459be" containerID="0a860e8db0f264e33c92c958b0a771c6cbe0f01b4426665e49e33803d47aed55" exitCode=0 Oct 07 13:11:37 crc kubenswrapper[4854]: I1007 13:11:37.127067 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dc8z7" event={"ID":"bc0c2a7c-d842-4860-acbf-fefb26f459be","Type":"ContainerDied","Data":"0a860e8db0f264e33c92c958b0a771c6cbe0f01b4426665e49e33803d47aed55"} Oct 07 13:11:37 crc kubenswrapper[4854]: I1007 13:11:37.127175 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dc8z7" Oct 07 13:11:37 crc kubenswrapper[4854]: I1007 13:11:37.127291 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dc8z7" event={"ID":"bc0c2a7c-d842-4860-acbf-fefb26f459be","Type":"ContainerDied","Data":"1e6196b3ed9d2a36ddb3744b0c93645cff8ea447625e619efc9b7dcd2990dd80"} Oct 07 13:11:37 crc kubenswrapper[4854]: I1007 13:11:37.127372 4854 scope.go:117] "RemoveContainer" containerID="0a860e8db0f264e33c92c958b0a771c6cbe0f01b4426665e49e33803d47aed55" Oct 07 13:11:37 crc kubenswrapper[4854]: I1007 13:11:37.162297 4854 scope.go:117] "RemoveContainer" containerID="36eac7bcb8c43e9e60d05391037e9e3e3e8689ba8e05fa02969fe6a1a83fc8d3" Oct 07 13:11:37 crc kubenswrapper[4854]: I1007 13:11:37.196015 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dc8z7"] Oct 07 13:11:37 crc kubenswrapper[4854]: I1007 13:11:37.206849 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dc8z7"] Oct 07 13:11:37 crc kubenswrapper[4854]: I1007 13:11:37.209604 4854 scope.go:117] "RemoveContainer" containerID="b73c32175b83021ba1108c3445f2bdba43475d3768761563b2068c2162a906dc" Oct 07 13:11:37 crc kubenswrapper[4854]: I1007 13:11:37.238118 4854 scope.go:117] "RemoveContainer" containerID="0a860e8db0f264e33c92c958b0a771c6cbe0f01b4426665e49e33803d47aed55" Oct 07 13:11:37 crc kubenswrapper[4854]: E1007 13:11:37.238618 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a860e8db0f264e33c92c958b0a771c6cbe0f01b4426665e49e33803d47aed55\": container with ID starting with 0a860e8db0f264e33c92c958b0a771c6cbe0f01b4426665e49e33803d47aed55 not found: ID does not exist" containerID="0a860e8db0f264e33c92c958b0a771c6cbe0f01b4426665e49e33803d47aed55" Oct 07 13:11:37 crc kubenswrapper[4854]: I1007 13:11:37.238671 
4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a860e8db0f264e33c92c958b0a771c6cbe0f01b4426665e49e33803d47aed55"} err="failed to get container status \"0a860e8db0f264e33c92c958b0a771c6cbe0f01b4426665e49e33803d47aed55\": rpc error: code = NotFound desc = could not find container \"0a860e8db0f264e33c92c958b0a771c6cbe0f01b4426665e49e33803d47aed55\": container with ID starting with 0a860e8db0f264e33c92c958b0a771c6cbe0f01b4426665e49e33803d47aed55 not found: ID does not exist" Oct 07 13:11:37 crc kubenswrapper[4854]: I1007 13:11:37.238706 4854 scope.go:117] "RemoveContainer" containerID="36eac7bcb8c43e9e60d05391037e9e3e3e8689ba8e05fa02969fe6a1a83fc8d3" Oct 07 13:11:37 crc kubenswrapper[4854]: E1007 13:11:37.239415 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36eac7bcb8c43e9e60d05391037e9e3e3e8689ba8e05fa02969fe6a1a83fc8d3\": container with ID starting with 36eac7bcb8c43e9e60d05391037e9e3e3e8689ba8e05fa02969fe6a1a83fc8d3 not found: ID does not exist" containerID="36eac7bcb8c43e9e60d05391037e9e3e3e8689ba8e05fa02969fe6a1a83fc8d3" Oct 07 13:11:37 crc kubenswrapper[4854]: I1007 13:11:37.239449 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36eac7bcb8c43e9e60d05391037e9e3e3e8689ba8e05fa02969fe6a1a83fc8d3"} err="failed to get container status \"36eac7bcb8c43e9e60d05391037e9e3e3e8689ba8e05fa02969fe6a1a83fc8d3\": rpc error: code = NotFound desc = could not find container \"36eac7bcb8c43e9e60d05391037e9e3e3e8689ba8e05fa02969fe6a1a83fc8d3\": container with ID starting with 36eac7bcb8c43e9e60d05391037e9e3e3e8689ba8e05fa02969fe6a1a83fc8d3 not found: ID does not exist" Oct 07 13:11:37 crc kubenswrapper[4854]: I1007 13:11:37.239467 4854 scope.go:117] "RemoveContainer" containerID="b73c32175b83021ba1108c3445f2bdba43475d3768761563b2068c2162a906dc" Oct 07 13:11:37 crc kubenswrapper[4854]: E1007 13:11:37.239783 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b73c32175b83021ba1108c3445f2bdba43475d3768761563b2068c2162a906dc\": container with ID starting with b73c32175b83021ba1108c3445f2bdba43475d3768761563b2068c2162a906dc not found: ID does not exist" containerID="b73c32175b83021ba1108c3445f2bdba43475d3768761563b2068c2162a906dc" Oct 07 13:11:37 crc kubenswrapper[4854]: I1007 13:11:37.239804 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b73c32175b83021ba1108c3445f2bdba43475d3768761563b2068c2162a906dc"} err="failed to get container status \"b73c32175b83021ba1108c3445f2bdba43475d3768761563b2068c2162a906dc\": rpc error: code = NotFound desc = could not find container \"b73c32175b83021ba1108c3445f2bdba43475d3768761563b2068c2162a906dc\": container with ID starting with b73c32175b83021ba1108c3445f2bdba43475d3768761563b2068c2162a906dc not found: ID does not exist" Oct 07 13:11:38 crc kubenswrapper[4854]: I1007 13:11:38.714963 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc0c2a7c-d842-4860-acbf-fefb26f459be" path="/var/lib/kubelet/pods/bc0c2a7c-d842-4860-acbf-fefb26f459be/volumes" Oct 07 13:12:10 crc kubenswrapper[4854]: I1007 13:12:10.807393 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:12:10 crc kubenswrapper[4854]: I1007 13:12:10.808535 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:12:19 crc kubenswrapper[4854]: I1007 13:12:19.597676 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nw2q9"] Oct 07 13:12:19 crc kubenswrapper[4854]: E1007 13:12:19.598559 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc0c2a7c-d842-4860-acbf-fefb26f459be" containerName="registry-server" Oct 07 13:12:19 crc kubenswrapper[4854]: I1007 13:12:19.598575 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc0c2a7c-d842-4860-acbf-fefb26f459be" containerName="registry-server" Oct 07 13:12:19 crc kubenswrapper[4854]: E1007 13:12:19.598590 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc0c2a7c-d842-4860-acbf-fefb26f459be" containerName="extract-content" Oct 07 13:12:19 crc kubenswrapper[4854]: I1007 13:12:19.598598 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc0c2a7c-d842-4860-acbf-fefb26f459be" containerName="extract-content" Oct 07 13:12:19 crc kubenswrapper[4854]: E1007 13:12:19.598613 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc0c2a7c-d842-4860-acbf-fefb26f459be" containerName="extract-utilities" Oct 07 13:12:19 crc kubenswrapper[4854]: I1007 13:12:19.598622 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc0c2a7c-d842-4860-acbf-fefb26f459be" containerName="extract-utilities" Oct 07 13:12:19 crc kubenswrapper[4854]: I1007 13:12:19.598775 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc0c2a7c-d842-4860-acbf-fefb26f459be" containerName="registry-server" Oct 07 13:12:19 crc kubenswrapper[4854]: I1007 13:12:19.599934 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nw2q9" Oct 07 13:12:19 crc kubenswrapper[4854]: I1007 13:12:19.609703 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nw2q9"] Oct 07 13:12:19 crc kubenswrapper[4854]: I1007 13:12:19.723416 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5q5dz\" (UniqueName: \"kubernetes.io/projected/1cd66629-6922-4022-a0ef-2087457d37f9-kube-api-access-5q5dz\") pod \"redhat-operators-nw2q9\" (UID: \"1cd66629-6922-4022-a0ef-2087457d37f9\") " pod="openshift-marketplace/redhat-operators-nw2q9" Oct 07 13:12:19 crc kubenswrapper[4854]: I1007 13:12:19.723695 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cd66629-6922-4022-a0ef-2087457d37f9-catalog-content\") pod \"redhat-operators-nw2q9\" (UID: \"1cd66629-6922-4022-a0ef-2087457d37f9\") " pod="openshift-marketplace/redhat-operators-nw2q9" Oct 07 13:12:19 crc kubenswrapper[4854]: I1007 13:12:19.723783 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cd66629-6922-4022-a0ef-2087457d37f9-utilities\") pod \"redhat-operators-nw2q9\" (UID: \"1cd66629-6922-4022-a0ef-2087457d37f9\") " pod="openshift-marketplace/redhat-operators-nw2q9" Oct 07 13:12:19 crc kubenswrapper[4854]: I1007 13:12:19.825138 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cd66629-6922-4022-a0ef-2087457d37f9-utilities\") pod \"redhat-operators-nw2q9\" (UID: \"1cd66629-6922-4022-a0ef-2087457d37f9\") " pod="openshift-marketplace/redhat-operators-nw2q9" Oct 07 13:12:19 crc kubenswrapper[4854]: I1007 13:12:19.825269 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5q5dz\" (UniqueName: \"kubernetes.io/projected/1cd66629-6922-4022-a0ef-2087457d37f9-kube-api-access-5q5dz\") pod \"redhat-operators-nw2q9\" (UID: \"1cd66629-6922-4022-a0ef-2087457d37f9\") " pod="openshift-marketplace/redhat-operators-nw2q9" Oct 07 13:12:19 crc kubenswrapper[4854]: I1007 13:12:19.825291 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cd66629-6922-4022-a0ef-2087457d37f9-catalog-content\") pod \"redhat-operators-nw2q9\" (UID: \"1cd66629-6922-4022-a0ef-2087457d37f9\") " pod="openshift-marketplace/redhat-operators-nw2q9" Oct 07 13:12:19 crc kubenswrapper[4854]: I1007 13:12:19.825744 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cd66629-6922-4022-a0ef-2087457d37f9-utilities\") pod \"redhat-operators-nw2q9\" (UID: \"1cd66629-6922-4022-a0ef-2087457d37f9\") " pod="openshift-marketplace/redhat-operators-nw2q9" Oct 07 13:12:19 crc kubenswrapper[4854]: I1007 13:12:19.826396 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cd66629-6922-4022-a0ef-2087457d37f9-catalog-content\") pod \"redhat-operators-nw2q9\" (UID: \"1cd66629-6922-4022-a0ef-2087457d37f9\") " pod="openshift-marketplace/redhat-operators-nw2q9" Oct 07 13:12:19 crc kubenswrapper[4854]: I1007 13:12:19.846731 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-5q5dz\" (UniqueName: \"kubernetes.io/projected/1cd66629-6922-4022-a0ef-2087457d37f9-kube-api-access-5q5dz\") pod \"redhat-operators-nw2q9\" (UID: \"1cd66629-6922-4022-a0ef-2087457d37f9\") " pod="openshift-marketplace/redhat-operators-nw2q9" Oct 07 13:12:19 crc kubenswrapper[4854]: I1007 13:12:19.918769 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nw2q9" Oct 07 13:12:20 crc kubenswrapper[4854]: I1007 13:12:20.335603 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nw2q9"] Oct 07 13:12:20 crc kubenswrapper[4854]: W1007 13:12:20.348049 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1cd66629_6922_4022_a0ef_2087457d37f9.slice/crio-59995539456e181aa25371c3f8315c2e9ff0203d05e3152dcb26c5261b727857 WatchSource:0}: Error finding container 59995539456e181aa25371c3f8315c2e9ff0203d05e3152dcb26c5261b727857: Status 404 returned error can't find the container with id 59995539456e181aa25371c3f8315c2e9ff0203d05e3152dcb26c5261b727857 Oct 07 13:12:20 crc kubenswrapper[4854]: I1007 13:12:20.523560 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nw2q9" event={"ID":"1cd66629-6922-4022-a0ef-2087457d37f9","Type":"ContainerStarted","Data":"6e3bc6cb71eb95e6e2fa76a6e10e85935ac27a0400ae69eda8225a7f7e56afb6"} Oct 07 13:12:20 crc kubenswrapper[4854]: I1007 13:12:20.523601 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nw2q9" event={"ID":"1cd66629-6922-4022-a0ef-2087457d37f9","Type":"ContainerStarted","Data":"59995539456e181aa25371c3f8315c2e9ff0203d05e3152dcb26c5261b727857"} Oct 07 13:12:21 crc kubenswrapper[4854]: I1007 13:12:21.534222 4854 generic.go:334] "Generic (PLEG): container finished" podID="1cd66629-6922-4022-a0ef-2087457d37f9" containerID="6e3bc6cb71eb95e6e2fa76a6e10e85935ac27a0400ae69eda8225a7f7e56afb6" exitCode=0 Oct 07 13:12:21 crc kubenswrapper[4854]: I1007 13:12:21.534401 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nw2q9" event={"ID":"1cd66629-6922-4022-a0ef-2087457d37f9","Type":"ContainerDied","Data":"6e3bc6cb71eb95e6e2fa76a6e10e85935ac27a0400ae69eda8225a7f7e56afb6"} Oct 07 13:12:23 crc kubenswrapper[4854]: I1007 13:12:23.556485 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nw2q9" event={"ID":"1cd66629-6922-4022-a0ef-2087457d37f9","Type":"ContainerStarted","Data":"07e6e8b3160fd7dbd577ff8ef9d107fda3eb3ebfbd70b5f1dba720cbfdd4a4d2"} Oct 07 13:12:24 crc kubenswrapper[4854]: I1007 13:12:24.567136 4854 generic.go:334] "Generic (PLEG): container finished" podID="1cd66629-6922-4022-a0ef-2087457d37f9" containerID="07e6e8b3160fd7dbd577ff8ef9d107fda3eb3ebfbd70b5f1dba720cbfdd4a4d2" exitCode=0 Oct 07 13:12:24 crc kubenswrapper[4854]: I1007 13:12:24.567243 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nw2q9" event={"ID":"1cd66629-6922-4022-a0ef-2087457d37f9","Type":"ContainerDied","Data":"07e6e8b3160fd7dbd577ff8ef9d107fda3eb3ebfbd70b5f1dba720cbfdd4a4d2"} Oct 07 13:12:25 crc kubenswrapper[4854]: I1007 13:12:25.578780 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nw2q9" 
event={"ID":"1cd66629-6922-4022-a0ef-2087457d37f9","Type":"ContainerStarted","Data":"1e13fc10d67a306a25838871ebeae6f1ab6423a2d320c5e0ba8af0e30f3acb20"} Oct 07 13:12:25 crc kubenswrapper[4854]: I1007 13:12:25.595998 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nw2q9" podStartSLOduration=3.11355984 podStartE2EDuration="6.595980747s" podCreationTimestamp="2025-10-07 13:12:19 +0000 UTC" firstStartedPulling="2025-10-07 13:12:21.537065227 +0000 UTC m=+2857.524897482" lastFinishedPulling="2025-10-07 13:12:25.019486124 +0000 UTC m=+2861.007318389" observedRunningTime="2025-10-07 13:12:25.595622017 +0000 UTC m=+2861.583454302" watchObservedRunningTime="2025-10-07 13:12:25.595980747 +0000 UTC m=+2861.583813002" Oct 07 13:12:29 crc kubenswrapper[4854]: I1007 13:12:29.919338 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nw2q9" Oct 07 13:12:29 crc kubenswrapper[4854]: I1007 13:12:29.919931 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nw2q9" Oct 07 13:12:30 crc kubenswrapper[4854]: I1007 13:12:30.993697 4854 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nw2q9" podUID="1cd66629-6922-4022-a0ef-2087457d37f9" containerName="registry-server" probeResult="failure" output=< Oct 07 13:12:30 crc kubenswrapper[4854]: timeout: failed to connect service ":50051" within 1s Oct 07 13:12:30 crc kubenswrapper[4854]: > Oct 07 13:12:39 crc kubenswrapper[4854]: I1007 13:12:39.990301 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nw2q9" Oct 07 13:12:40 crc kubenswrapper[4854]: I1007 13:12:40.058624 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nw2q9" Oct 07 13:12:40 crc kubenswrapper[4854]: I1007 13:12:40.241663 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nw2q9"] Oct 07 13:12:40 crc kubenswrapper[4854]: I1007 13:12:40.808814 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:12:40 crc kubenswrapper[4854]: I1007 13:12:40.808885 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:12:41 crc kubenswrapper[4854]: I1007 13:12:41.727600 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nw2q9" podUID="1cd66629-6922-4022-a0ef-2087457d37f9" containerName="registry-server" containerID="cri-o://1e13fc10d67a306a25838871ebeae6f1ab6423a2d320c5e0ba8af0e30f3acb20" gracePeriod=2 Oct 07 13:12:42 crc kubenswrapper[4854]: I1007 13:12:42.239630 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nw2q9" Oct 07 13:12:42 crc kubenswrapper[4854]: I1007 13:12:42.275233 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cd66629-6922-4022-a0ef-2087457d37f9-catalog-content\") pod \"1cd66629-6922-4022-a0ef-2087457d37f9\" (UID: \"1cd66629-6922-4022-a0ef-2087457d37f9\") " Oct 07 13:12:42 crc kubenswrapper[4854]: I1007 13:12:42.275427 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5q5dz\" (UniqueName: \"kubernetes.io/projected/1cd66629-6922-4022-a0ef-2087457d37f9-kube-api-access-5q5dz\") pod \"1cd66629-6922-4022-a0ef-2087457d37f9\" (UID: \"1cd66629-6922-4022-a0ef-2087457d37f9\") " Oct 07 13:12:42 crc kubenswrapper[4854]: I1007 13:12:42.275508 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cd66629-6922-4022-a0ef-2087457d37f9-utilities\") pod \"1cd66629-6922-4022-a0ef-2087457d37f9\" (UID: \"1cd66629-6922-4022-a0ef-2087457d37f9\") " Oct 07 13:12:42 crc kubenswrapper[4854]: I1007 13:12:42.278348 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1cd66629-6922-4022-a0ef-2087457d37f9-utilities" (OuterVolumeSpecName: "utilities") pod "1cd66629-6922-4022-a0ef-2087457d37f9" (UID: "1cd66629-6922-4022-a0ef-2087457d37f9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:12:42 crc kubenswrapper[4854]: I1007 13:12:42.287183 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cd66629-6922-4022-a0ef-2087457d37f9-kube-api-access-5q5dz" (OuterVolumeSpecName: "kube-api-access-5q5dz") pod "1cd66629-6922-4022-a0ef-2087457d37f9" (UID: "1cd66629-6922-4022-a0ef-2087457d37f9"). InnerVolumeSpecName "kube-api-access-5q5dz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:12:42 crc kubenswrapper[4854]: I1007 13:12:42.377868 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5q5dz\" (UniqueName: \"kubernetes.io/projected/1cd66629-6922-4022-a0ef-2087457d37f9-kube-api-access-5q5dz\") on node \"crc\" DevicePath \"\"" Oct 07 13:12:42 crc kubenswrapper[4854]: I1007 13:12:42.377909 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cd66629-6922-4022-a0ef-2087457d37f9-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:12:42 crc kubenswrapper[4854]: I1007 13:12:42.386208 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1cd66629-6922-4022-a0ef-2087457d37f9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1cd66629-6922-4022-a0ef-2087457d37f9" (UID: "1cd66629-6922-4022-a0ef-2087457d37f9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:12:42 crc kubenswrapper[4854]: I1007 13:12:42.479611 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cd66629-6922-4022-a0ef-2087457d37f9-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:12:42 crc kubenswrapper[4854]: I1007 13:12:42.743221 4854 generic.go:334] "Generic (PLEG): container finished" podID="1cd66629-6922-4022-a0ef-2087457d37f9" containerID="1e13fc10d67a306a25838871ebeae6f1ab6423a2d320c5e0ba8af0e30f3acb20" exitCode=0 Oct 07 13:12:42 crc kubenswrapper[4854]: I1007 13:12:42.743369 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nw2q9" event={"ID":"1cd66629-6922-4022-a0ef-2087457d37f9","Type":"ContainerDied","Data":"1e13fc10d67a306a25838871ebeae6f1ab6423a2d320c5e0ba8af0e30f3acb20"} Oct 07 13:12:42 crc kubenswrapper[4854]: I1007 13:12:42.743427 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nw2q9" event={"ID":"1cd66629-6922-4022-a0ef-2087457d37f9","Type":"ContainerDied","Data":"59995539456e181aa25371c3f8315c2e9ff0203d05e3152dcb26c5261b727857"} Oct 07 13:12:42 crc kubenswrapper[4854]: I1007 13:12:42.743467 4854 scope.go:117] "RemoveContainer" containerID="1e13fc10d67a306a25838871ebeae6f1ab6423a2d320c5e0ba8af0e30f3acb20" Oct 07 13:12:42 crc kubenswrapper[4854]: I1007 13:12:42.743389 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nw2q9" Oct 07 13:12:42 crc kubenswrapper[4854]: I1007 13:12:42.783264 4854 scope.go:117] "RemoveContainer" containerID="07e6e8b3160fd7dbd577ff8ef9d107fda3eb3ebfbd70b5f1dba720cbfdd4a4d2" Oct 07 13:12:42 crc kubenswrapper[4854]: I1007 13:12:42.794841 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nw2q9"] Oct 07 13:12:42 crc kubenswrapper[4854]: I1007 13:12:42.813468 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nw2q9"] Oct 07 13:12:42 crc kubenswrapper[4854]: I1007 13:12:42.826530 4854 scope.go:117] "RemoveContainer" containerID="6e3bc6cb71eb95e6e2fa76a6e10e85935ac27a0400ae69eda8225a7f7e56afb6" Oct 07 13:12:42 crc kubenswrapper[4854]: I1007 13:12:42.859196 4854 scope.go:117] "RemoveContainer" containerID="1e13fc10d67a306a25838871ebeae6f1ab6423a2d320c5e0ba8af0e30f3acb20" Oct 07 13:12:42 crc kubenswrapper[4854]: E1007 13:12:42.859691 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e13fc10d67a306a25838871ebeae6f1ab6423a2d320c5e0ba8af0e30f3acb20\": container with ID starting with 1e13fc10d67a306a25838871ebeae6f1ab6423a2d320c5e0ba8af0e30f3acb20 not found: ID does not exist" containerID="1e13fc10d67a306a25838871ebeae6f1ab6423a2d320c5e0ba8af0e30f3acb20" Oct 07 13:12:42 crc kubenswrapper[4854]: I1007 13:12:42.859740 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e13fc10d67a306a25838871ebeae6f1ab6423a2d320c5e0ba8af0e30f3acb20"} err="failed to get container status \"1e13fc10d67a306a25838871ebeae6f1ab6423a2d320c5e0ba8af0e30f3acb20\": rpc error: code = NotFound desc = could not find container \"1e13fc10d67a306a25838871ebeae6f1ab6423a2d320c5e0ba8af0e30f3acb20\": container with ID starting with 1e13fc10d67a306a25838871ebeae6f1ab6423a2d320c5e0ba8af0e30f3acb20 not found: ID does not exist" Oct 07 13:12:42 crc 
kubenswrapper[4854]: I1007 13:12:42.859766 4854 scope.go:117] "RemoveContainer" containerID="07e6e8b3160fd7dbd577ff8ef9d107fda3eb3ebfbd70b5f1dba720cbfdd4a4d2" Oct 07 13:12:42 crc kubenswrapper[4854]: E1007 13:12:42.860194 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07e6e8b3160fd7dbd577ff8ef9d107fda3eb3ebfbd70b5f1dba720cbfdd4a4d2\": container with ID starting with 07e6e8b3160fd7dbd577ff8ef9d107fda3eb3ebfbd70b5f1dba720cbfdd4a4d2 not found: ID does not exist" containerID="07e6e8b3160fd7dbd577ff8ef9d107fda3eb3ebfbd70b5f1dba720cbfdd4a4d2" Oct 07 13:12:42 crc kubenswrapper[4854]: I1007 13:12:42.860220 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07e6e8b3160fd7dbd577ff8ef9d107fda3eb3ebfbd70b5f1dba720cbfdd4a4d2"} err="failed to get container status \"07e6e8b3160fd7dbd577ff8ef9d107fda3eb3ebfbd70b5f1dba720cbfdd4a4d2\": rpc error: code = NotFound desc = could not find container \"07e6e8b3160fd7dbd577ff8ef9d107fda3eb3ebfbd70b5f1dba720cbfdd4a4d2\": container with ID starting with 07e6e8b3160fd7dbd577ff8ef9d107fda3eb3ebfbd70b5f1dba720cbfdd4a4d2 not found: ID does not exist" Oct 07 13:12:42 crc kubenswrapper[4854]: I1007 13:12:42.860235 4854 scope.go:117] "RemoveContainer" containerID="6e3bc6cb71eb95e6e2fa76a6e10e85935ac27a0400ae69eda8225a7f7e56afb6" Oct 07 13:12:42 crc kubenswrapper[4854]: E1007 13:12:42.860658 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e3bc6cb71eb95e6e2fa76a6e10e85935ac27a0400ae69eda8225a7f7e56afb6\": container with ID starting with 6e3bc6cb71eb95e6e2fa76a6e10e85935ac27a0400ae69eda8225a7f7e56afb6 not found: ID does not exist" containerID="6e3bc6cb71eb95e6e2fa76a6e10e85935ac27a0400ae69eda8225a7f7e56afb6" Oct 07 13:12:42 crc kubenswrapper[4854]: I1007 13:12:42.860706 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e3bc6cb71eb95e6e2fa76a6e10e85935ac27a0400ae69eda8225a7f7e56afb6"} err="failed to get container status \"6e3bc6cb71eb95e6e2fa76a6e10e85935ac27a0400ae69eda8225a7f7e56afb6\": rpc error: code = NotFound desc = could not find container \"6e3bc6cb71eb95e6e2fa76a6e10e85935ac27a0400ae69eda8225a7f7e56afb6\": container with ID starting with 6e3bc6cb71eb95e6e2fa76a6e10e85935ac27a0400ae69eda8225a7f7e56afb6 not found: ID does not exist" Oct 07 13:12:44 crc kubenswrapper[4854]: I1007 13:12:44.720144 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cd66629-6922-4022-a0ef-2087457d37f9" path="/var/lib/kubelet/pods/1cd66629-6922-4022-a0ef-2087457d37f9/volumes" Oct 07 13:13:10 crc kubenswrapper[4854]: I1007 13:13:10.808206 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:13:10 crc kubenswrapper[4854]: I1007 13:13:10.808906 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:13:10 crc kubenswrapper[4854]: I1007 13:13:10.808980 4854 kubelet.go:2542] "SyncLoop (probe)" 
probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" Oct 07 13:13:10 crc kubenswrapper[4854]: I1007 13:13:10.809931 4854 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9dd9e16eaf55f4ba69f659bf6e7b7b2660a18997e1ad98c00cf8a756a7e4a562"} pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 13:13:10 crc kubenswrapper[4854]: I1007 13:13:10.810034 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" containerID="cri-o://9dd9e16eaf55f4ba69f659bf6e7b7b2660a18997e1ad98c00cf8a756a7e4a562" gracePeriod=600 Oct 07 13:13:11 crc kubenswrapper[4854]: I1007 13:13:11.078875 4854 generic.go:334] "Generic (PLEG): container finished" podID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerID="9dd9e16eaf55f4ba69f659bf6e7b7b2660a18997e1ad98c00cf8a756a7e4a562" exitCode=0 Oct 07 13:13:11 crc kubenswrapper[4854]: I1007 13:13:11.079131 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" event={"ID":"40b8b82d-cfd5-41d7-8673-5774db092c85","Type":"ContainerDied","Data":"9dd9e16eaf55f4ba69f659bf6e7b7b2660a18997e1ad98c00cf8a756a7e4a562"} Oct 07 13:13:11 crc kubenswrapper[4854]: I1007 13:13:11.079269 4854 scope.go:117] "RemoveContainer" containerID="de7eb6481963900983ac98a60edbefb2fb8bb4b50f644caad50d425b3ee27b98" Oct 07 13:13:12 crc kubenswrapper[4854]: I1007 13:13:12.090558 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" event={"ID":"40b8b82d-cfd5-41d7-8673-5774db092c85","Type":"ContainerStarted","Data":"0b41fdf1c73e74ad1e4a2d3aa69bee4aa9ba58b145d2d08b5d4b83efa9072db0"} Oct 07 13:15:00 crc kubenswrapper[4854]: I1007 13:15:00.201373 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330715-6dw87"] Oct 07 13:15:00 crc kubenswrapper[4854]: E1007 13:15:00.202203 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cd66629-6922-4022-a0ef-2087457d37f9" containerName="registry-server" Oct 07 13:15:00 crc kubenswrapper[4854]: I1007 13:15:00.202218 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cd66629-6922-4022-a0ef-2087457d37f9" containerName="registry-server" Oct 07 13:15:00 crc kubenswrapper[4854]: E1007 13:15:00.202234 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cd66629-6922-4022-a0ef-2087457d37f9" containerName="extract-content" Oct 07 13:15:00 crc kubenswrapper[4854]: I1007 13:15:00.202242 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cd66629-6922-4022-a0ef-2087457d37f9" containerName="extract-content" Oct 07 13:15:00 crc kubenswrapper[4854]: E1007 13:15:00.202283 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cd66629-6922-4022-a0ef-2087457d37f9" containerName="extract-utilities" Oct 07 13:15:00 crc kubenswrapper[4854]: I1007 13:15:00.202292 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cd66629-6922-4022-a0ef-2087457d37f9" containerName="extract-utilities" Oct 07 13:15:00 crc kubenswrapper[4854]: I1007 13:15:00.202450 4854 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="1cd66629-6922-4022-a0ef-2087457d37f9" containerName="registry-server" Oct 07 13:15:00 crc kubenswrapper[4854]: I1007 13:15:00.203049 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330715-6dw87" Oct 07 13:15:00 crc kubenswrapper[4854]: I1007 13:15:00.206833 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 07 13:15:00 crc kubenswrapper[4854]: I1007 13:15:00.207313 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 07 13:15:00 crc kubenswrapper[4854]: I1007 13:15:00.232636 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330715-6dw87"] Oct 07 13:15:00 crc kubenswrapper[4854]: I1007 13:15:00.400379 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/42109e34-6105-420f-9841-80ce763db33c-config-volume\") pod \"collect-profiles-29330715-6dw87\" (UID: \"42109e34-6105-420f-9841-80ce763db33c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330715-6dw87" Oct 07 13:15:00 crc kubenswrapper[4854]: I1007 13:15:00.400539 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nmb9\" (UniqueName: \"kubernetes.io/projected/42109e34-6105-420f-9841-80ce763db33c-kube-api-access-4nmb9\") pod \"collect-profiles-29330715-6dw87\" (UID: \"42109e34-6105-420f-9841-80ce763db33c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330715-6dw87" Oct 07 13:15:00 crc kubenswrapper[4854]: I1007 13:15:00.400600 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/42109e34-6105-420f-9841-80ce763db33c-secret-volume\") pod \"collect-profiles-29330715-6dw87\" (UID: \"42109e34-6105-420f-9841-80ce763db33c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330715-6dw87" Oct 07 13:15:00 crc kubenswrapper[4854]: I1007 13:15:00.501807 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nmb9\" (UniqueName: \"kubernetes.io/projected/42109e34-6105-420f-9841-80ce763db33c-kube-api-access-4nmb9\") pod \"collect-profiles-29330715-6dw87\" (UID: \"42109e34-6105-420f-9841-80ce763db33c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330715-6dw87" Oct 07 13:15:00 crc kubenswrapper[4854]: I1007 13:15:00.501860 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/42109e34-6105-420f-9841-80ce763db33c-secret-volume\") pod \"collect-profiles-29330715-6dw87\" (UID: \"42109e34-6105-420f-9841-80ce763db33c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330715-6dw87" Oct 07 13:15:00 crc kubenswrapper[4854]: I1007 13:15:00.501902 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/42109e34-6105-420f-9841-80ce763db33c-config-volume\") pod \"collect-profiles-29330715-6dw87\" (UID: \"42109e34-6105-420f-9841-80ce763db33c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330715-6dw87" Oct 07 13:15:00 crc kubenswrapper[4854]: I1007 13:15:00.502676 
4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/42109e34-6105-420f-9841-80ce763db33c-config-volume\") pod \"collect-profiles-29330715-6dw87\" (UID: \"42109e34-6105-420f-9841-80ce763db33c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330715-6dw87" Oct 07 13:15:00 crc kubenswrapper[4854]: I1007 13:15:00.507808 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/42109e34-6105-420f-9841-80ce763db33c-secret-volume\") pod \"collect-profiles-29330715-6dw87\" (UID: \"42109e34-6105-420f-9841-80ce763db33c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330715-6dw87" Oct 07 13:15:00 crc kubenswrapper[4854]: I1007 13:15:00.517688 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nmb9\" (UniqueName: \"kubernetes.io/projected/42109e34-6105-420f-9841-80ce763db33c-kube-api-access-4nmb9\") pod \"collect-profiles-29330715-6dw87\" (UID: \"42109e34-6105-420f-9841-80ce763db33c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330715-6dw87" Oct 07 13:15:00 crc kubenswrapper[4854]: I1007 13:15:00.523848 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330715-6dw87" Oct 07 13:15:01 crc kubenswrapper[4854]: I1007 13:15:01.003176 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330715-6dw87"] Oct 07 13:15:01 crc kubenswrapper[4854]: W1007 13:15:01.010306 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42109e34_6105_420f_9841_80ce763db33c.slice/crio-4fcf7d19889e37fdc8c7cc911d1139384adefd7d5326b2c0438b9a781b25c3b9 WatchSource:0}: Error finding container 4fcf7d19889e37fdc8c7cc911d1139384adefd7d5326b2c0438b9a781b25c3b9: Status 404 returned error can't find the container with id 4fcf7d19889e37fdc8c7cc911d1139384adefd7d5326b2c0438b9a781b25c3b9 Oct 07 13:15:01 crc kubenswrapper[4854]: I1007 13:15:01.113574 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330715-6dw87" event={"ID":"42109e34-6105-420f-9841-80ce763db33c","Type":"ContainerStarted","Data":"4fcf7d19889e37fdc8c7cc911d1139384adefd7d5326b2c0438b9a781b25c3b9"} Oct 07 13:15:02 crc kubenswrapper[4854]: I1007 13:15:02.126305 4854 generic.go:334] "Generic (PLEG): container finished" podID="42109e34-6105-420f-9841-80ce763db33c" containerID="2d664f6a62b3f053630760e41eabc66518e87658cc380cb1496f3eda655566b3" exitCode=0 Oct 07 13:15:02 crc kubenswrapper[4854]: I1007 13:15:02.126378 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330715-6dw87" event={"ID":"42109e34-6105-420f-9841-80ce763db33c","Type":"ContainerDied","Data":"2d664f6a62b3f053630760e41eabc66518e87658cc380cb1496f3eda655566b3"} Oct 07 13:15:03 crc kubenswrapper[4854]: I1007 13:15:03.484832 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330715-6dw87" Oct 07 13:15:03 crc kubenswrapper[4854]: I1007 13:15:03.574534 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/42109e34-6105-420f-9841-80ce763db33c-config-volume\") pod \"42109e34-6105-420f-9841-80ce763db33c\" (UID: \"42109e34-6105-420f-9841-80ce763db33c\") " Oct 07 13:15:03 crc kubenswrapper[4854]: I1007 13:15:03.574576 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4nmb9\" (UniqueName: \"kubernetes.io/projected/42109e34-6105-420f-9841-80ce763db33c-kube-api-access-4nmb9\") pod \"42109e34-6105-420f-9841-80ce763db33c\" (UID: \"42109e34-6105-420f-9841-80ce763db33c\") " Oct 07 13:15:03 crc kubenswrapper[4854]: I1007 13:15:03.574617 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/42109e34-6105-420f-9841-80ce763db33c-secret-volume\") pod \"42109e34-6105-420f-9841-80ce763db33c\" (UID: \"42109e34-6105-420f-9841-80ce763db33c\") " Oct 07 13:15:03 crc kubenswrapper[4854]: I1007 13:15:03.575368 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42109e34-6105-420f-9841-80ce763db33c-config-volume" (OuterVolumeSpecName: "config-volume") pod "42109e34-6105-420f-9841-80ce763db33c" (UID: "42109e34-6105-420f-9841-80ce763db33c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:15:03 crc kubenswrapper[4854]: I1007 13:15:03.580378 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42109e34-6105-420f-9841-80ce763db33c-kube-api-access-4nmb9" (OuterVolumeSpecName: "kube-api-access-4nmb9") pod "42109e34-6105-420f-9841-80ce763db33c" (UID: "42109e34-6105-420f-9841-80ce763db33c"). InnerVolumeSpecName "kube-api-access-4nmb9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:15:03 crc kubenswrapper[4854]: I1007 13:15:03.581014 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42109e34-6105-420f-9841-80ce763db33c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "42109e34-6105-420f-9841-80ce763db33c" (UID: "42109e34-6105-420f-9841-80ce763db33c"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:15:03 crc kubenswrapper[4854]: I1007 13:15:03.676762 4854 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/42109e34-6105-420f-9841-80ce763db33c-config-volume\") on node \"crc\" DevicePath \"\"" Oct 07 13:15:03 crc kubenswrapper[4854]: I1007 13:15:03.676873 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4nmb9\" (UniqueName: \"kubernetes.io/projected/42109e34-6105-420f-9841-80ce763db33c-kube-api-access-4nmb9\") on node \"crc\" DevicePath \"\"" Oct 07 13:15:03 crc kubenswrapper[4854]: I1007 13:15:03.676955 4854 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/42109e34-6105-420f-9841-80ce763db33c-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 07 13:15:04 crc kubenswrapper[4854]: I1007 13:15:04.147512 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330715-6dw87" event={"ID":"42109e34-6105-420f-9841-80ce763db33c","Type":"ContainerDied","Data":"4fcf7d19889e37fdc8c7cc911d1139384adefd7d5326b2c0438b9a781b25c3b9"} Oct 07 13:15:04 crc kubenswrapper[4854]: I1007 13:15:04.147834 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4fcf7d19889e37fdc8c7cc911d1139384adefd7d5326b2c0438b9a781b25c3b9" Oct 07 13:15:04 crc kubenswrapper[4854]: I1007 13:15:04.147699 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330715-6dw87" Oct 07 13:15:04 crc kubenswrapper[4854]: I1007 13:15:04.587120 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330670-ql624"] Oct 07 13:15:04 crc kubenswrapper[4854]: I1007 13:15:04.596417 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330670-ql624"] Oct 07 13:15:04 crc kubenswrapper[4854]: I1007 13:15:04.720669 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23b08e27-c5ae-4332-80ad-df3db344f0ee" path="/var/lib/kubelet/pods/23b08e27-c5ae-4332-80ad-df3db344f0ee/volumes" Oct 07 13:15:19 crc kubenswrapper[4854]: I1007 13:15:19.908328 4854 scope.go:117] "RemoveContainer" containerID="33d79c6fba47059b7e54f946e88de2dbd8995c0a3f1e10b4e0fdc606546f0061" Oct 07 13:15:40 crc kubenswrapper[4854]: I1007 13:15:40.808357 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:15:40 crc kubenswrapper[4854]: I1007 13:15:40.809340 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:16:10 crc kubenswrapper[4854]: I1007 13:16:10.807695 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body=
Oct 07 13:16:10 crc kubenswrapper[4854]: I1007 13:16:10.808303 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 07 13:16:40 crc kubenswrapper[4854]: I1007 13:16:40.809068 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 07 13:16:40 crc kubenswrapper[4854]: I1007 13:16:40.809944 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 07 13:16:40 crc kubenswrapper[4854]: I1007 13:16:40.810073 4854 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw"
Oct 07 13:16:40 crc kubenswrapper[4854]: I1007 13:16:40.810944 4854 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0b41fdf1c73e74ad1e4a2d3aa69bee4aa9ba58b145d2d08b5d4b83efa9072db0"} pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 07 13:16:40 crc kubenswrapper[4854]: I1007 13:16:40.811936 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" containerID="cri-o://0b41fdf1c73e74ad1e4a2d3aa69bee4aa9ba58b145d2d08b5d4b83efa9072db0" gracePeriod=600
Oct 07 13:16:40 crc kubenswrapper[4854]: E1007 13:16:40.944436 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85"
Oct 07 13:16:41 crc kubenswrapper[4854]: I1007 13:16:41.053130 4854 generic.go:334] "Generic (PLEG): container finished" podID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerID="0b41fdf1c73e74ad1e4a2d3aa69bee4aa9ba58b145d2d08b5d4b83efa9072db0" exitCode=0
Oct 07 13:16:41 crc kubenswrapper[4854]: I1007 13:16:41.053247 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" event={"ID":"40b8b82d-cfd5-41d7-8673-5774db092c85","Type":"ContainerDied","Data":"0b41fdf1c73e74ad1e4a2d3aa69bee4aa9ba58b145d2d08b5d4b83efa9072db0"}
Oct 07 13:16:41 crc kubenswrapper[4854]: I1007 13:16:41.053309 4854 scope.go:117] "RemoveContainer" containerID="9dd9e16eaf55f4ba69f659bf6e7b7b2660a18997e1ad98c00cf8a756a7e4a562"
Oct 07 13:16:41 crc kubenswrapper[4854]: I1007 13:16:41.053900 4854 scope.go:117] "RemoveContainer" containerID="0b41fdf1c73e74ad1e4a2d3aa69bee4aa9ba58b145d2d08b5d4b83efa9072db0"
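At this point machine-config-daemon has failed its liveness probe again, is killed with gracePeriod=600, and the kubelet then declines to restart it immediately, logging CrashLoopBackOff with "back-off 5m0s". Below is a small sketch of that back-off arithmetic, assuming the commonly documented kubelet policy (an initial delay of roughly 10s that doubles per restart and is capped at 5m); the constants are assumptions for illustration, not values read from this cluster.

```go
package main

import (
	"fmt"
	"time"
)

// crashLoopDelay sketches the exponential back-off the kubelet applies between
// restarts of a crash-looping container: start at base, double per failure,
// never exceed maxDelay. The 10s base and 5m cap are the commonly documented
// defaults, assumed here only for illustration.
func crashLoopDelay(restarts int, base, maxDelay time.Duration) time.Duration {
	d := base
	for i := 0; i < restarts; i++ {
		d *= 2
		if d >= maxDelay {
			return maxDelay
		}
	}
	return d
}

func main() {
	for r := 0; r <= 6; r++ {
		fmt.Printf("restart %d -> wait %s\n", r, crashLoopDelay(r, 10*time.Second, 5*time.Minute))
	}
	// The delay pins at 5m0s after a handful of restarts, which is why the log
	// keeps repeating "back-off 5m0s restarting failed container=machine-config-daemon"
	// until the container stays up long enough for the back-off to be reset.
}
```

The repeated "RemoveContainer" / "Error syncing pod" pairs that follow at 13:16:53, 13:17:08, 13:17:23, and so on are the sync loop re-evaluating the pod every few seconds while that capped back-off is still in force.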
containerID="0b41fdf1c73e74ad1e4a2d3aa69bee4aa9ba58b145d2d08b5d4b83efa9072db0" Oct 07 13:16:41 crc kubenswrapper[4854]: E1007 13:16:41.055091 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:16:53 crc kubenswrapper[4854]: I1007 13:16:53.703378 4854 scope.go:117] "RemoveContainer" containerID="0b41fdf1c73e74ad1e4a2d3aa69bee4aa9ba58b145d2d08b5d4b83efa9072db0" Oct 07 13:16:53 crc kubenswrapper[4854]: E1007 13:16:53.704514 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:17:08 crc kubenswrapper[4854]: I1007 13:17:08.703138 4854 scope.go:117] "RemoveContainer" containerID="0b41fdf1c73e74ad1e4a2d3aa69bee4aa9ba58b145d2d08b5d4b83efa9072db0" Oct 07 13:17:08 crc kubenswrapper[4854]: E1007 13:17:08.704123 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:17:23 crc kubenswrapper[4854]: I1007 13:17:23.702749 4854 scope.go:117] "RemoveContainer" containerID="0b41fdf1c73e74ad1e4a2d3aa69bee4aa9ba58b145d2d08b5d4b83efa9072db0" Oct 07 13:17:23 crc kubenswrapper[4854]: E1007 13:17:23.703465 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:17:37 crc kubenswrapper[4854]: I1007 13:17:37.702546 4854 scope.go:117] "RemoveContainer" containerID="0b41fdf1c73e74ad1e4a2d3aa69bee4aa9ba58b145d2d08b5d4b83efa9072db0" Oct 07 13:17:37 crc kubenswrapper[4854]: E1007 13:17:37.703426 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:17:51 crc kubenswrapper[4854]: I1007 13:17:51.703019 4854 scope.go:117] "RemoveContainer" containerID="0b41fdf1c73e74ad1e4a2d3aa69bee4aa9ba58b145d2d08b5d4b83efa9072db0" Oct 07 13:17:51 crc kubenswrapper[4854]: E1007 13:17:51.703778 4854 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:18:05 crc kubenswrapper[4854]: I1007 13:18:05.702990 4854 scope.go:117] "RemoveContainer" containerID="0b41fdf1c73e74ad1e4a2d3aa69bee4aa9ba58b145d2d08b5d4b83efa9072db0" Oct 07 13:18:05 crc kubenswrapper[4854]: E1007 13:18:05.703936 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:18:17 crc kubenswrapper[4854]: I1007 13:18:17.702689 4854 scope.go:117] "RemoveContainer" containerID="0b41fdf1c73e74ad1e4a2d3aa69bee4aa9ba58b145d2d08b5d4b83efa9072db0" Oct 07 13:18:17 crc kubenswrapper[4854]: E1007 13:18:17.703702 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:18:30 crc kubenswrapper[4854]: I1007 13:18:30.703778 4854 scope.go:117] "RemoveContainer" containerID="0b41fdf1c73e74ad1e4a2d3aa69bee4aa9ba58b145d2d08b5d4b83efa9072db0" Oct 07 13:18:30 crc kubenswrapper[4854]: E1007 13:18:30.704820 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:18:43 crc kubenswrapper[4854]: I1007 13:18:43.702167 4854 scope.go:117] "RemoveContainer" containerID="0b41fdf1c73e74ad1e4a2d3aa69bee4aa9ba58b145d2d08b5d4b83efa9072db0" Oct 07 13:18:43 crc kubenswrapper[4854]: E1007 13:18:43.702870 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:18:54 crc kubenswrapper[4854]: I1007 13:18:54.706655 4854 scope.go:117] "RemoveContainer" containerID="0b41fdf1c73e74ad1e4a2d3aa69bee4aa9ba58b145d2d08b5d4b83efa9072db0" Oct 07 13:18:54 crc kubenswrapper[4854]: E1007 13:18:54.707543 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:19:09 crc kubenswrapper[4854]: I1007 13:19:09.703449 4854 scope.go:117] "RemoveContainer" containerID="0b41fdf1c73e74ad1e4a2d3aa69bee4aa9ba58b145d2d08b5d4b83efa9072db0" Oct 07 13:19:09 crc kubenswrapper[4854]: E1007 13:19:09.705639 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:19:21 crc kubenswrapper[4854]: I1007 13:19:21.703397 4854 scope.go:117] "RemoveContainer" containerID="0b41fdf1c73e74ad1e4a2d3aa69bee4aa9ba58b145d2d08b5d4b83efa9072db0" Oct 07 13:19:21 crc kubenswrapper[4854]: E1007 13:19:21.704458 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:19:32 crc kubenswrapper[4854]: I1007 13:19:32.702410 4854 scope.go:117] "RemoveContainer" containerID="0b41fdf1c73e74ad1e4a2d3aa69bee4aa9ba58b145d2d08b5d4b83efa9072db0" Oct 07 13:19:32 crc kubenswrapper[4854]: E1007 13:19:32.703186 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:19:45 crc kubenswrapper[4854]: I1007 13:19:45.702955 4854 scope.go:117] "RemoveContainer" containerID="0b41fdf1c73e74ad1e4a2d3aa69bee4aa9ba58b145d2d08b5d4b83efa9072db0" Oct 07 13:19:45 crc kubenswrapper[4854]: E1007 13:19:45.704087 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:19:56 crc kubenswrapper[4854]: I1007 13:19:56.703337 4854 scope.go:117] "RemoveContainer" containerID="0b41fdf1c73e74ad1e4a2d3aa69bee4aa9ba58b145d2d08b5d4b83efa9072db0" Oct 07 13:19:56 crc kubenswrapper[4854]: E1007 13:19:56.704319 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" 
podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:20:07 crc kubenswrapper[4854]: I1007 13:20:07.702411 4854 scope.go:117] "RemoveContainer" containerID="0b41fdf1c73e74ad1e4a2d3aa69bee4aa9ba58b145d2d08b5d4b83efa9072db0" Oct 07 13:20:07 crc kubenswrapper[4854]: E1007 13:20:07.705427 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:20:19 crc kubenswrapper[4854]: I1007 13:20:19.703207 4854 scope.go:117] "RemoveContainer" containerID="0b41fdf1c73e74ad1e4a2d3aa69bee4aa9ba58b145d2d08b5d4b83efa9072db0" Oct 07 13:20:19 crc kubenswrapper[4854]: E1007 13:20:19.703762 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:20:30 crc kubenswrapper[4854]: I1007 13:20:30.703000 4854 scope.go:117] "RemoveContainer" containerID="0b41fdf1c73e74ad1e4a2d3aa69bee4aa9ba58b145d2d08b5d4b83efa9072db0" Oct 07 13:20:30 crc kubenswrapper[4854]: E1007 13:20:30.704484 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:20:42 crc kubenswrapper[4854]: I1007 13:20:42.702457 4854 scope.go:117] "RemoveContainer" containerID="0b41fdf1c73e74ad1e4a2d3aa69bee4aa9ba58b145d2d08b5d4b83efa9072db0" Oct 07 13:20:42 crc kubenswrapper[4854]: E1007 13:20:42.703032 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:20:52 crc kubenswrapper[4854]: I1007 13:20:52.826685 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9dn9t"] Oct 07 13:20:52 crc kubenswrapper[4854]: E1007 13:20:52.827788 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42109e34-6105-420f-9841-80ce763db33c" containerName="collect-profiles" Oct 07 13:20:52 crc kubenswrapper[4854]: I1007 13:20:52.827811 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="42109e34-6105-420f-9841-80ce763db33c" containerName="collect-profiles" Oct 07 13:20:52 crc kubenswrapper[4854]: I1007 13:20:52.828031 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="42109e34-6105-420f-9841-80ce763db33c" containerName="collect-profiles" Oct 07 13:20:52 crc kubenswrapper[4854]: I1007 
13:20:52.829715 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9dn9t" Oct 07 13:20:52 crc kubenswrapper[4854]: I1007 13:20:52.858175 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9dn9t"] Oct 07 13:20:52 crc kubenswrapper[4854]: I1007 13:20:52.890979 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb7e5461-46d5-40f4-9b11-7d9613cba02e-utilities\") pod \"community-operators-9dn9t\" (UID: \"fb7e5461-46d5-40f4-9b11-7d9613cba02e\") " pod="openshift-marketplace/community-operators-9dn9t" Oct 07 13:20:52 crc kubenswrapper[4854]: I1007 13:20:52.891046 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29lfk\" (UniqueName: \"kubernetes.io/projected/fb7e5461-46d5-40f4-9b11-7d9613cba02e-kube-api-access-29lfk\") pod \"community-operators-9dn9t\" (UID: \"fb7e5461-46d5-40f4-9b11-7d9613cba02e\") " pod="openshift-marketplace/community-operators-9dn9t" Oct 07 13:20:52 crc kubenswrapper[4854]: I1007 13:20:52.891115 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb7e5461-46d5-40f4-9b11-7d9613cba02e-catalog-content\") pod \"community-operators-9dn9t\" (UID: \"fb7e5461-46d5-40f4-9b11-7d9613cba02e\") " pod="openshift-marketplace/community-operators-9dn9t" Oct 07 13:20:52 crc kubenswrapper[4854]: I1007 13:20:52.992074 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29lfk\" (UniqueName: \"kubernetes.io/projected/fb7e5461-46d5-40f4-9b11-7d9613cba02e-kube-api-access-29lfk\") pod \"community-operators-9dn9t\" (UID: \"fb7e5461-46d5-40f4-9b11-7d9613cba02e\") " pod="openshift-marketplace/community-operators-9dn9t" Oct 07 13:20:52 crc kubenswrapper[4854]: I1007 13:20:52.992372 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb7e5461-46d5-40f4-9b11-7d9613cba02e-catalog-content\") pod \"community-operators-9dn9t\" (UID: \"fb7e5461-46d5-40f4-9b11-7d9613cba02e\") " pod="openshift-marketplace/community-operators-9dn9t" Oct 07 13:20:52 crc kubenswrapper[4854]: I1007 13:20:52.992523 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb7e5461-46d5-40f4-9b11-7d9613cba02e-utilities\") pod \"community-operators-9dn9t\" (UID: \"fb7e5461-46d5-40f4-9b11-7d9613cba02e\") " pod="openshift-marketplace/community-operators-9dn9t" Oct 07 13:20:52 crc kubenswrapper[4854]: I1007 13:20:52.992891 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb7e5461-46d5-40f4-9b11-7d9613cba02e-catalog-content\") pod \"community-operators-9dn9t\" (UID: \"fb7e5461-46d5-40f4-9b11-7d9613cba02e\") " pod="openshift-marketplace/community-operators-9dn9t" Oct 07 13:20:52 crc kubenswrapper[4854]: I1007 13:20:52.993033 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb7e5461-46d5-40f4-9b11-7d9613cba02e-utilities\") pod \"community-operators-9dn9t\" (UID: \"fb7e5461-46d5-40f4-9b11-7d9613cba02e\") " pod="openshift-marketplace/community-operators-9dn9t" Oct 07 13:20:53 crc 
kubenswrapper[4854]: I1007 13:20:53.016606 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29lfk\" (UniqueName: \"kubernetes.io/projected/fb7e5461-46d5-40f4-9b11-7d9613cba02e-kube-api-access-29lfk\") pod \"community-operators-9dn9t\" (UID: \"fb7e5461-46d5-40f4-9b11-7d9613cba02e\") " pod="openshift-marketplace/community-operators-9dn9t" Oct 07 13:20:53 crc kubenswrapper[4854]: I1007 13:20:53.151752 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9dn9t" Oct 07 13:20:53 crc kubenswrapper[4854]: I1007 13:20:53.666262 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9dn9t"] Oct 07 13:20:54 crc kubenswrapper[4854]: I1007 13:20:54.407930 4854 generic.go:334] "Generic (PLEG): container finished" podID="fb7e5461-46d5-40f4-9b11-7d9613cba02e" containerID="aa923510d8ed7a4c5cf52fa87b9be2d2d909e51f6ab908aad9f1025fb9c65646" exitCode=0 Oct 07 13:20:54 crc kubenswrapper[4854]: I1007 13:20:54.408029 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9dn9t" event={"ID":"fb7e5461-46d5-40f4-9b11-7d9613cba02e","Type":"ContainerDied","Data":"aa923510d8ed7a4c5cf52fa87b9be2d2d909e51f6ab908aad9f1025fb9c65646"} Oct 07 13:20:54 crc kubenswrapper[4854]: I1007 13:20:54.408433 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9dn9t" event={"ID":"fb7e5461-46d5-40f4-9b11-7d9613cba02e","Type":"ContainerStarted","Data":"d9223bbb8c8aa2a7984d4187a91a3ce7c3ab9042ad585c7969aa1de9ba4abb87"} Oct 07 13:20:54 crc kubenswrapper[4854]: I1007 13:20:54.412416 4854 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 13:20:54 crc kubenswrapper[4854]: I1007 13:20:54.710532 4854 scope.go:117] "RemoveContainer" containerID="0b41fdf1c73e74ad1e4a2d3aa69bee4aa9ba58b145d2d08b5d4b83efa9072db0" Oct 07 13:20:54 crc kubenswrapper[4854]: E1007 13:20:54.710990 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:20:55 crc kubenswrapper[4854]: E1007 13:20:55.945524 4854 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb7e5461_46d5_40f4_9b11_7d9613cba02e.slice/crio-conmon-5ba1830fcc14a4d45467d02a09a13e1ed34a303b02be6dbd155169eb043974ec.scope\": RecentStats: unable to find data in memory cache]" Oct 07 13:20:56 crc kubenswrapper[4854]: I1007 13:20:56.433247 4854 generic.go:334] "Generic (PLEG): container finished" podID="fb7e5461-46d5-40f4-9b11-7d9613cba02e" containerID="5ba1830fcc14a4d45467d02a09a13e1ed34a303b02be6dbd155169eb043974ec" exitCode=0 Oct 07 13:20:56 crc kubenswrapper[4854]: I1007 13:20:56.433325 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9dn9t" event={"ID":"fb7e5461-46d5-40f4-9b11-7d9613cba02e","Type":"ContainerDied","Data":"5ba1830fcc14a4d45467d02a09a13e1ed34a303b02be6dbd155169eb043974ec"} Oct 07 13:20:57 crc kubenswrapper[4854]: I1007 13:20:57.464631 4854 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9dn9t" event={"ID":"fb7e5461-46d5-40f4-9b11-7d9613cba02e","Type":"ContainerStarted","Data":"88b53525365ca8a3176d1bd87b9e33324db2096c955638a0fbc9bb1cf7316463"} Oct 07 13:20:57 crc kubenswrapper[4854]: I1007 13:20:57.494096 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9dn9t" podStartSLOduration=2.972882139 podStartE2EDuration="5.494071457s" podCreationTimestamp="2025-10-07 13:20:52 +0000 UTC" firstStartedPulling="2025-10-07 13:20:54.410106153 +0000 UTC m=+3370.397938458" lastFinishedPulling="2025-10-07 13:20:56.931295481 +0000 UTC m=+3372.919127776" observedRunningTime="2025-10-07 13:20:57.490201915 +0000 UTC m=+3373.478034180" watchObservedRunningTime="2025-10-07 13:20:57.494071457 +0000 UTC m=+3373.481903732" Oct 07 13:21:03 crc kubenswrapper[4854]: I1007 13:21:03.152493 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9dn9t" Oct 07 13:21:03 crc kubenswrapper[4854]: I1007 13:21:03.152820 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9dn9t" Oct 07 13:21:03 crc kubenswrapper[4854]: I1007 13:21:03.220555 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9dn9t" Oct 07 13:21:03 crc kubenswrapper[4854]: I1007 13:21:03.584959 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9dn9t" Oct 07 13:21:03 crc kubenswrapper[4854]: I1007 13:21:03.654590 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9dn9t"] Oct 07 13:21:05 crc kubenswrapper[4854]: I1007 13:21:05.543226 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9dn9t" podUID="fb7e5461-46d5-40f4-9b11-7d9613cba02e" containerName="registry-server" containerID="cri-o://88b53525365ca8a3176d1bd87b9e33324db2096c955638a0fbc9bb1cf7316463" gracePeriod=2 Oct 07 13:21:06 crc kubenswrapper[4854]: I1007 13:21:06.050285 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9dn9t" Oct 07 13:21:06 crc kubenswrapper[4854]: I1007 13:21:06.150163 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb7e5461-46d5-40f4-9b11-7d9613cba02e-utilities\") pod \"fb7e5461-46d5-40f4-9b11-7d9613cba02e\" (UID: \"fb7e5461-46d5-40f4-9b11-7d9613cba02e\") " Oct 07 13:21:06 crc kubenswrapper[4854]: I1007 13:21:06.150264 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb7e5461-46d5-40f4-9b11-7d9613cba02e-catalog-content\") pod \"fb7e5461-46d5-40f4-9b11-7d9613cba02e\" (UID: \"fb7e5461-46d5-40f4-9b11-7d9613cba02e\") " Oct 07 13:21:06 crc kubenswrapper[4854]: I1007 13:21:06.150285 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29lfk\" (UniqueName: \"kubernetes.io/projected/fb7e5461-46d5-40f4-9b11-7d9613cba02e-kube-api-access-29lfk\") pod \"fb7e5461-46d5-40f4-9b11-7d9613cba02e\" (UID: \"fb7e5461-46d5-40f4-9b11-7d9613cba02e\") " Oct 07 13:21:06 crc kubenswrapper[4854]: I1007 13:21:06.152088 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb7e5461-46d5-40f4-9b11-7d9613cba02e-utilities" (OuterVolumeSpecName: "utilities") pod "fb7e5461-46d5-40f4-9b11-7d9613cba02e" (UID: "fb7e5461-46d5-40f4-9b11-7d9613cba02e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:21:06 crc kubenswrapper[4854]: I1007 13:21:06.159209 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb7e5461-46d5-40f4-9b11-7d9613cba02e-kube-api-access-29lfk" (OuterVolumeSpecName: "kube-api-access-29lfk") pod "fb7e5461-46d5-40f4-9b11-7d9613cba02e" (UID: "fb7e5461-46d5-40f4-9b11-7d9613cba02e"). InnerVolumeSpecName "kube-api-access-29lfk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:21:06 crc kubenswrapper[4854]: I1007 13:21:06.223778 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb7e5461-46d5-40f4-9b11-7d9613cba02e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fb7e5461-46d5-40f4-9b11-7d9613cba02e" (UID: "fb7e5461-46d5-40f4-9b11-7d9613cba02e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:21:06 crc kubenswrapper[4854]: I1007 13:21:06.252720 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb7e5461-46d5-40f4-9b11-7d9613cba02e-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:21:06 crc kubenswrapper[4854]: I1007 13:21:06.252768 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb7e5461-46d5-40f4-9b11-7d9613cba02e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:21:06 crc kubenswrapper[4854]: I1007 13:21:06.252789 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29lfk\" (UniqueName: \"kubernetes.io/projected/fb7e5461-46d5-40f4-9b11-7d9613cba02e-kube-api-access-29lfk\") on node \"crc\" DevicePath \"\"" Oct 07 13:21:06 crc kubenswrapper[4854]: I1007 13:21:06.558740 4854 generic.go:334] "Generic (PLEG): container finished" podID="fb7e5461-46d5-40f4-9b11-7d9613cba02e" containerID="88b53525365ca8a3176d1bd87b9e33324db2096c955638a0fbc9bb1cf7316463" exitCode=0 Oct 07 13:21:06 crc kubenswrapper[4854]: I1007 13:21:06.558821 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9dn9t" event={"ID":"fb7e5461-46d5-40f4-9b11-7d9613cba02e","Type":"ContainerDied","Data":"88b53525365ca8a3176d1bd87b9e33324db2096c955638a0fbc9bb1cf7316463"} Oct 07 13:21:06 crc kubenswrapper[4854]: I1007 13:21:06.558872 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9dn9t" Oct 07 13:21:06 crc kubenswrapper[4854]: I1007 13:21:06.558905 4854 scope.go:117] "RemoveContainer" containerID="88b53525365ca8a3176d1bd87b9e33324db2096c955638a0fbc9bb1cf7316463" Oct 07 13:21:06 crc kubenswrapper[4854]: I1007 13:21:06.558881 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9dn9t" event={"ID":"fb7e5461-46d5-40f4-9b11-7d9613cba02e","Type":"ContainerDied","Data":"d9223bbb8c8aa2a7984d4187a91a3ce7c3ab9042ad585c7969aa1de9ba4abb87"} Oct 07 13:21:06 crc kubenswrapper[4854]: I1007 13:21:06.588564 4854 scope.go:117] "RemoveContainer" containerID="5ba1830fcc14a4d45467d02a09a13e1ed34a303b02be6dbd155169eb043974ec" Oct 07 13:21:06 crc kubenswrapper[4854]: I1007 13:21:06.614183 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9dn9t"] Oct 07 13:21:06 crc kubenswrapper[4854]: I1007 13:21:06.620899 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9dn9t"] Oct 07 13:21:06 crc kubenswrapper[4854]: I1007 13:21:06.632173 4854 scope.go:117] "RemoveContainer" containerID="aa923510d8ed7a4c5cf52fa87b9be2d2d909e51f6ab908aad9f1025fb9c65646" Oct 07 13:21:06 crc kubenswrapper[4854]: I1007 13:21:06.663602 4854 scope.go:117] "RemoveContainer" containerID="88b53525365ca8a3176d1bd87b9e33324db2096c955638a0fbc9bb1cf7316463" Oct 07 13:21:06 crc kubenswrapper[4854]: E1007 13:21:06.664023 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88b53525365ca8a3176d1bd87b9e33324db2096c955638a0fbc9bb1cf7316463\": container with ID starting with 88b53525365ca8a3176d1bd87b9e33324db2096c955638a0fbc9bb1cf7316463 not found: ID does not exist" containerID="88b53525365ca8a3176d1bd87b9e33324db2096c955638a0fbc9bb1cf7316463" Oct 07 13:21:06 crc kubenswrapper[4854]: I1007 13:21:06.664060 
4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88b53525365ca8a3176d1bd87b9e33324db2096c955638a0fbc9bb1cf7316463"} err="failed to get container status \"88b53525365ca8a3176d1bd87b9e33324db2096c955638a0fbc9bb1cf7316463\": rpc error: code = NotFound desc = could not find container \"88b53525365ca8a3176d1bd87b9e33324db2096c955638a0fbc9bb1cf7316463\": container with ID starting with 88b53525365ca8a3176d1bd87b9e33324db2096c955638a0fbc9bb1cf7316463 not found: ID does not exist" Oct 07 13:21:06 crc kubenswrapper[4854]: I1007 13:21:06.664086 4854 scope.go:117] "RemoveContainer" containerID="5ba1830fcc14a4d45467d02a09a13e1ed34a303b02be6dbd155169eb043974ec" Oct 07 13:21:06 crc kubenswrapper[4854]: E1007 13:21:06.664416 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ba1830fcc14a4d45467d02a09a13e1ed34a303b02be6dbd155169eb043974ec\": container with ID starting with 5ba1830fcc14a4d45467d02a09a13e1ed34a303b02be6dbd155169eb043974ec not found: ID does not exist" containerID="5ba1830fcc14a4d45467d02a09a13e1ed34a303b02be6dbd155169eb043974ec" Oct 07 13:21:06 crc kubenswrapper[4854]: I1007 13:21:06.664464 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ba1830fcc14a4d45467d02a09a13e1ed34a303b02be6dbd155169eb043974ec"} err="failed to get container status \"5ba1830fcc14a4d45467d02a09a13e1ed34a303b02be6dbd155169eb043974ec\": rpc error: code = NotFound desc = could not find container \"5ba1830fcc14a4d45467d02a09a13e1ed34a303b02be6dbd155169eb043974ec\": container with ID starting with 5ba1830fcc14a4d45467d02a09a13e1ed34a303b02be6dbd155169eb043974ec not found: ID does not exist" Oct 07 13:21:06 crc kubenswrapper[4854]: I1007 13:21:06.664495 4854 scope.go:117] "RemoveContainer" containerID="aa923510d8ed7a4c5cf52fa87b9be2d2d909e51f6ab908aad9f1025fb9c65646" Oct 07 13:21:06 crc kubenswrapper[4854]: E1007 13:21:06.664978 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa923510d8ed7a4c5cf52fa87b9be2d2d909e51f6ab908aad9f1025fb9c65646\": container with ID starting with aa923510d8ed7a4c5cf52fa87b9be2d2d909e51f6ab908aad9f1025fb9c65646 not found: ID does not exist" containerID="aa923510d8ed7a4c5cf52fa87b9be2d2d909e51f6ab908aad9f1025fb9c65646" Oct 07 13:21:06 crc kubenswrapper[4854]: I1007 13:21:06.665013 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa923510d8ed7a4c5cf52fa87b9be2d2d909e51f6ab908aad9f1025fb9c65646"} err="failed to get container status \"aa923510d8ed7a4c5cf52fa87b9be2d2d909e51f6ab908aad9f1025fb9c65646\": rpc error: code = NotFound desc = could not find container \"aa923510d8ed7a4c5cf52fa87b9be2d2d909e51f6ab908aad9f1025fb9c65646\": container with ID starting with aa923510d8ed7a4c5cf52fa87b9be2d2d909e51f6ab908aad9f1025fb9c65646 not found: ID does not exist" Oct 07 13:21:06 crc kubenswrapper[4854]: I1007 13:21:06.718762 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb7e5461-46d5-40f4-9b11-7d9613cba02e" path="/var/lib/kubelet/pods/fb7e5461-46d5-40f4-9b11-7d9613cba02e/volumes" Oct 07 13:21:09 crc kubenswrapper[4854]: I1007 13:21:09.703497 4854 scope.go:117] "RemoveContainer" containerID="0b41fdf1c73e74ad1e4a2d3aa69bee4aa9ba58b145d2d08b5d4b83efa9072db0" Oct 07 13:21:09 crc kubenswrapper[4854]: E1007 13:21:09.704389 4854 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:21:22 crc kubenswrapper[4854]: I1007 13:21:22.704038 4854 scope.go:117] "RemoveContainer" containerID="0b41fdf1c73e74ad1e4a2d3aa69bee4aa9ba58b145d2d08b5d4b83efa9072db0" Oct 07 13:21:22 crc kubenswrapper[4854]: E1007 13:21:22.705850 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:21:23 crc kubenswrapper[4854]: I1007 13:21:23.915524 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tltcs"] Oct 07 13:21:23 crc kubenswrapper[4854]: E1007 13:21:23.915851 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb7e5461-46d5-40f4-9b11-7d9613cba02e" containerName="registry-server" Oct 07 13:21:23 crc kubenswrapper[4854]: I1007 13:21:23.915866 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb7e5461-46d5-40f4-9b11-7d9613cba02e" containerName="registry-server" Oct 07 13:21:23 crc kubenswrapper[4854]: E1007 13:21:23.915881 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb7e5461-46d5-40f4-9b11-7d9613cba02e" containerName="extract-utilities" Oct 07 13:21:23 crc kubenswrapper[4854]: I1007 13:21:23.915890 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb7e5461-46d5-40f4-9b11-7d9613cba02e" containerName="extract-utilities" Oct 07 13:21:23 crc kubenswrapper[4854]: E1007 13:21:23.915911 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb7e5461-46d5-40f4-9b11-7d9613cba02e" containerName="extract-content" Oct 07 13:21:23 crc kubenswrapper[4854]: I1007 13:21:23.915920 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb7e5461-46d5-40f4-9b11-7d9613cba02e" containerName="extract-content" Oct 07 13:21:23 crc kubenswrapper[4854]: I1007 13:21:23.916110 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb7e5461-46d5-40f4-9b11-7d9613cba02e" containerName="registry-server" Oct 07 13:21:23 crc kubenswrapper[4854]: I1007 13:21:23.917381 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tltcs" Oct 07 13:21:23 crc kubenswrapper[4854]: I1007 13:21:23.941268 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tltcs"] Oct 07 13:21:24 crc kubenswrapper[4854]: I1007 13:21:24.034690 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwls7\" (UniqueName: \"kubernetes.io/projected/1ffc839b-6c7a-48f0-a997-4d87e895d379-kube-api-access-jwls7\") pod \"redhat-marketplace-tltcs\" (UID: \"1ffc839b-6c7a-48f0-a997-4d87e895d379\") " pod="openshift-marketplace/redhat-marketplace-tltcs" Oct 07 13:21:24 crc kubenswrapper[4854]: I1007 13:21:24.035118 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ffc839b-6c7a-48f0-a997-4d87e895d379-utilities\") pod \"redhat-marketplace-tltcs\" (UID: \"1ffc839b-6c7a-48f0-a997-4d87e895d379\") " pod="openshift-marketplace/redhat-marketplace-tltcs" Oct 07 13:21:24 crc kubenswrapper[4854]: I1007 13:21:24.035178 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ffc839b-6c7a-48f0-a997-4d87e895d379-catalog-content\") pod \"redhat-marketplace-tltcs\" (UID: \"1ffc839b-6c7a-48f0-a997-4d87e895d379\") " pod="openshift-marketplace/redhat-marketplace-tltcs" Oct 07 13:21:24 crc kubenswrapper[4854]: I1007 13:21:24.136597 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ffc839b-6c7a-48f0-a997-4d87e895d379-catalog-content\") pod \"redhat-marketplace-tltcs\" (UID: \"1ffc839b-6c7a-48f0-a997-4d87e895d379\") " pod="openshift-marketplace/redhat-marketplace-tltcs" Oct 07 13:21:24 crc kubenswrapper[4854]: I1007 13:21:24.136635 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ffc839b-6c7a-48f0-a997-4d87e895d379-utilities\") pod \"redhat-marketplace-tltcs\" (UID: \"1ffc839b-6c7a-48f0-a997-4d87e895d379\") " pod="openshift-marketplace/redhat-marketplace-tltcs" Oct 07 13:21:24 crc kubenswrapper[4854]: I1007 13:21:24.136740 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwls7\" (UniqueName: \"kubernetes.io/projected/1ffc839b-6c7a-48f0-a997-4d87e895d379-kube-api-access-jwls7\") pod \"redhat-marketplace-tltcs\" (UID: \"1ffc839b-6c7a-48f0-a997-4d87e895d379\") " pod="openshift-marketplace/redhat-marketplace-tltcs" Oct 07 13:21:24 crc kubenswrapper[4854]: I1007 13:21:24.137138 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ffc839b-6c7a-48f0-a997-4d87e895d379-catalog-content\") pod \"redhat-marketplace-tltcs\" (UID: \"1ffc839b-6c7a-48f0-a997-4d87e895d379\") " pod="openshift-marketplace/redhat-marketplace-tltcs" Oct 07 13:21:24 crc kubenswrapper[4854]: I1007 13:21:24.137456 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ffc839b-6c7a-48f0-a997-4d87e895d379-utilities\") pod \"redhat-marketplace-tltcs\" (UID: \"1ffc839b-6c7a-48f0-a997-4d87e895d379\") " pod="openshift-marketplace/redhat-marketplace-tltcs" Oct 07 13:21:24 crc kubenswrapper[4854]: I1007 13:21:24.163015 4854 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-jwls7\" (UniqueName: \"kubernetes.io/projected/1ffc839b-6c7a-48f0-a997-4d87e895d379-kube-api-access-jwls7\") pod \"redhat-marketplace-tltcs\" (UID: \"1ffc839b-6c7a-48f0-a997-4d87e895d379\") " pod="openshift-marketplace/redhat-marketplace-tltcs" Oct 07 13:21:24 crc kubenswrapper[4854]: I1007 13:21:24.237829 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tltcs" Oct 07 13:21:24 crc kubenswrapper[4854]: I1007 13:21:24.724053 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tltcs"] Oct 07 13:21:25 crc kubenswrapper[4854]: I1007 13:21:25.741841 4854 generic.go:334] "Generic (PLEG): container finished" podID="1ffc839b-6c7a-48f0-a997-4d87e895d379" containerID="84dc671c2530f1bcd0e266dee6d1492b8e0258b507443f86feb92359cb1cde09" exitCode=0 Oct 07 13:21:25 crc kubenswrapper[4854]: I1007 13:21:25.741961 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tltcs" event={"ID":"1ffc839b-6c7a-48f0-a997-4d87e895d379","Type":"ContainerDied","Data":"84dc671c2530f1bcd0e266dee6d1492b8e0258b507443f86feb92359cb1cde09"} Oct 07 13:21:25 crc kubenswrapper[4854]: I1007 13:21:25.742168 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tltcs" event={"ID":"1ffc839b-6c7a-48f0-a997-4d87e895d379","Type":"ContainerStarted","Data":"37226405195be8c4b6baf76f6d6c6d4c0e96a488c122c2058c9316fe0b208cc8"} Oct 07 13:21:28 crc kubenswrapper[4854]: I1007 13:21:28.769236 4854 generic.go:334] "Generic (PLEG): container finished" podID="1ffc839b-6c7a-48f0-a997-4d87e895d379" containerID="dd8e63eb4506118233726d6e203ff5062b0adfb12f8f75ba0fa749d2560450ce" exitCode=0 Oct 07 13:21:28 crc kubenswrapper[4854]: I1007 13:21:28.769303 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tltcs" event={"ID":"1ffc839b-6c7a-48f0-a997-4d87e895d379","Type":"ContainerDied","Data":"dd8e63eb4506118233726d6e203ff5062b0adfb12f8f75ba0fa749d2560450ce"} Oct 07 13:21:29 crc kubenswrapper[4854]: I1007 13:21:29.784112 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tltcs" event={"ID":"1ffc839b-6c7a-48f0-a997-4d87e895d379","Type":"ContainerStarted","Data":"725ba1dfc8ca02c3da52b476a79b473d2074bfa322b1a87053896b1cf6be85d6"} Oct 07 13:21:29 crc kubenswrapper[4854]: I1007 13:21:29.808354 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tltcs" podStartSLOduration=3.276901025 podStartE2EDuration="6.808297064s" podCreationTimestamp="2025-10-07 13:21:23 +0000 UTC" firstStartedPulling="2025-10-07 13:21:25.744319386 +0000 UTC m=+3401.732151641" lastFinishedPulling="2025-10-07 13:21:29.275715415 +0000 UTC m=+3405.263547680" observedRunningTime="2025-10-07 13:21:29.805940926 +0000 UTC m=+3405.793773191" watchObservedRunningTime="2025-10-07 13:21:29.808297064 +0000 UTC m=+3405.796129319" Oct 07 13:21:33 crc kubenswrapper[4854]: I1007 13:21:33.704283 4854 scope.go:117] "RemoveContainer" containerID="0b41fdf1c73e74ad1e4a2d3aa69bee4aa9ba58b145d2d08b5d4b83efa9072db0" Oct 07 13:21:33 crc kubenswrapper[4854]: E1007 13:21:33.705364 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:21:34 crc kubenswrapper[4854]: I1007 13:21:34.238900 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tltcs" Oct 07 13:21:34 crc kubenswrapper[4854]: I1007 13:21:34.238983 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tltcs" Oct 07 13:21:34 crc kubenswrapper[4854]: I1007 13:21:34.315721 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tltcs" Oct 07 13:21:34 crc kubenswrapper[4854]: I1007 13:21:34.863249 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tltcs" Oct 07 13:21:35 crc kubenswrapper[4854]: I1007 13:21:35.704699 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tltcs"] Oct 07 13:21:36 crc kubenswrapper[4854]: I1007 13:21:36.843279 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tltcs" podUID="1ffc839b-6c7a-48f0-a997-4d87e895d379" containerName="registry-server" containerID="cri-o://725ba1dfc8ca02c3da52b476a79b473d2074bfa322b1a87053896b1cf6be85d6" gracePeriod=2 Oct 07 13:21:37 crc kubenswrapper[4854]: I1007 13:21:37.274080 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tltcs" Oct 07 13:21:37 crc kubenswrapper[4854]: I1007 13:21:37.293417 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ffc839b-6c7a-48f0-a997-4d87e895d379-catalog-content\") pod \"1ffc839b-6c7a-48f0-a997-4d87e895d379\" (UID: \"1ffc839b-6c7a-48f0-a997-4d87e895d379\") " Oct 07 13:21:37 crc kubenswrapper[4854]: I1007 13:21:37.293508 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwls7\" (UniqueName: \"kubernetes.io/projected/1ffc839b-6c7a-48f0-a997-4d87e895d379-kube-api-access-jwls7\") pod \"1ffc839b-6c7a-48f0-a997-4d87e895d379\" (UID: \"1ffc839b-6c7a-48f0-a997-4d87e895d379\") " Oct 07 13:21:37 crc kubenswrapper[4854]: I1007 13:21:37.293563 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ffc839b-6c7a-48f0-a997-4d87e895d379-utilities\") pod \"1ffc839b-6c7a-48f0-a997-4d87e895d379\" (UID: \"1ffc839b-6c7a-48f0-a997-4d87e895d379\") " Oct 07 13:21:37 crc kubenswrapper[4854]: I1007 13:21:37.295114 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ffc839b-6c7a-48f0-a997-4d87e895d379-utilities" (OuterVolumeSpecName: "utilities") pod "1ffc839b-6c7a-48f0-a997-4d87e895d379" (UID: "1ffc839b-6c7a-48f0-a997-4d87e895d379"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:21:37 crc kubenswrapper[4854]: I1007 13:21:37.311024 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ffc839b-6c7a-48f0-a997-4d87e895d379-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1ffc839b-6c7a-48f0-a997-4d87e895d379" (UID: "1ffc839b-6c7a-48f0-a997-4d87e895d379"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:21:37 crc kubenswrapper[4854]: I1007 13:21:37.325366 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ffc839b-6c7a-48f0-a997-4d87e895d379-kube-api-access-jwls7" (OuterVolumeSpecName: "kube-api-access-jwls7") pod "1ffc839b-6c7a-48f0-a997-4d87e895d379" (UID: "1ffc839b-6c7a-48f0-a997-4d87e895d379"). InnerVolumeSpecName "kube-api-access-jwls7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:21:37 crc kubenswrapper[4854]: I1007 13:21:37.394492 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwls7\" (UniqueName: \"kubernetes.io/projected/1ffc839b-6c7a-48f0-a997-4d87e895d379-kube-api-access-jwls7\") on node \"crc\" DevicePath \"\"" Oct 07 13:21:37 crc kubenswrapper[4854]: I1007 13:21:37.394543 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ffc839b-6c7a-48f0-a997-4d87e895d379-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:21:37 crc kubenswrapper[4854]: I1007 13:21:37.394561 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ffc839b-6c7a-48f0-a997-4d87e895d379-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:21:37 crc kubenswrapper[4854]: I1007 13:21:37.857597 4854 generic.go:334] "Generic (PLEG): container finished" podID="1ffc839b-6c7a-48f0-a997-4d87e895d379" containerID="725ba1dfc8ca02c3da52b476a79b473d2074bfa322b1a87053896b1cf6be85d6" exitCode=0 Oct 07 13:21:37 crc kubenswrapper[4854]: I1007 13:21:37.857644 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tltcs" event={"ID":"1ffc839b-6c7a-48f0-a997-4d87e895d379","Type":"ContainerDied","Data":"725ba1dfc8ca02c3da52b476a79b473d2074bfa322b1a87053896b1cf6be85d6"} Oct 07 13:21:37 crc kubenswrapper[4854]: I1007 13:21:37.857679 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tltcs" event={"ID":"1ffc839b-6c7a-48f0-a997-4d87e895d379","Type":"ContainerDied","Data":"37226405195be8c4b6baf76f6d6c6d4c0e96a488c122c2058c9316fe0b208cc8"} Oct 07 13:21:37 crc kubenswrapper[4854]: I1007 13:21:37.857698 4854 scope.go:117] "RemoveContainer" containerID="725ba1dfc8ca02c3da52b476a79b473d2074bfa322b1a87053896b1cf6be85d6" Oct 07 13:21:37 crc kubenswrapper[4854]: I1007 13:21:37.857801 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tltcs" Oct 07 13:21:37 crc kubenswrapper[4854]: I1007 13:21:37.908297 4854 scope.go:117] "RemoveContainer" containerID="dd8e63eb4506118233726d6e203ff5062b0adfb12f8f75ba0fa749d2560450ce" Oct 07 13:21:37 crc kubenswrapper[4854]: I1007 13:21:37.909893 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tltcs"] Oct 07 13:21:37 crc kubenswrapper[4854]: I1007 13:21:37.916682 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tltcs"] Oct 07 13:21:37 crc kubenswrapper[4854]: I1007 13:21:37.956610 4854 scope.go:117] "RemoveContainer" containerID="84dc671c2530f1bcd0e266dee6d1492b8e0258b507443f86feb92359cb1cde09" Oct 07 13:21:37 crc kubenswrapper[4854]: I1007 13:21:37.982873 4854 scope.go:117] "RemoveContainer" containerID="725ba1dfc8ca02c3da52b476a79b473d2074bfa322b1a87053896b1cf6be85d6" Oct 07 13:21:37 crc kubenswrapper[4854]: E1007 13:21:37.983365 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"725ba1dfc8ca02c3da52b476a79b473d2074bfa322b1a87053896b1cf6be85d6\": container with ID starting with 725ba1dfc8ca02c3da52b476a79b473d2074bfa322b1a87053896b1cf6be85d6 not found: ID does not exist" containerID="725ba1dfc8ca02c3da52b476a79b473d2074bfa322b1a87053896b1cf6be85d6" Oct 07 13:21:37 crc kubenswrapper[4854]: I1007 13:21:37.983419 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"725ba1dfc8ca02c3da52b476a79b473d2074bfa322b1a87053896b1cf6be85d6"} err="failed to get container status \"725ba1dfc8ca02c3da52b476a79b473d2074bfa322b1a87053896b1cf6be85d6\": rpc error: code = NotFound desc = could not find container \"725ba1dfc8ca02c3da52b476a79b473d2074bfa322b1a87053896b1cf6be85d6\": container with ID starting with 725ba1dfc8ca02c3da52b476a79b473d2074bfa322b1a87053896b1cf6be85d6 not found: ID does not exist" Oct 07 13:21:37 crc kubenswrapper[4854]: I1007 13:21:37.983462 4854 scope.go:117] "RemoveContainer" containerID="dd8e63eb4506118233726d6e203ff5062b0adfb12f8f75ba0fa749d2560450ce" Oct 07 13:21:37 crc kubenswrapper[4854]: E1007 13:21:37.983828 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd8e63eb4506118233726d6e203ff5062b0adfb12f8f75ba0fa749d2560450ce\": container with ID starting with dd8e63eb4506118233726d6e203ff5062b0adfb12f8f75ba0fa749d2560450ce not found: ID does not exist" containerID="dd8e63eb4506118233726d6e203ff5062b0adfb12f8f75ba0fa749d2560450ce" Oct 07 13:21:37 crc kubenswrapper[4854]: I1007 13:21:37.983867 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd8e63eb4506118233726d6e203ff5062b0adfb12f8f75ba0fa749d2560450ce"} err="failed to get container status \"dd8e63eb4506118233726d6e203ff5062b0adfb12f8f75ba0fa749d2560450ce\": rpc error: code = NotFound desc = could not find container \"dd8e63eb4506118233726d6e203ff5062b0adfb12f8f75ba0fa749d2560450ce\": container with ID starting with dd8e63eb4506118233726d6e203ff5062b0adfb12f8f75ba0fa749d2560450ce not found: ID does not exist" Oct 07 13:21:37 crc kubenswrapper[4854]: I1007 13:21:37.983890 4854 scope.go:117] "RemoveContainer" containerID="84dc671c2530f1bcd0e266dee6d1492b8e0258b507443f86feb92359cb1cde09" Oct 07 13:21:37 crc kubenswrapper[4854]: E1007 13:21:37.984181 4854 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"84dc671c2530f1bcd0e266dee6d1492b8e0258b507443f86feb92359cb1cde09\": container with ID starting with 84dc671c2530f1bcd0e266dee6d1492b8e0258b507443f86feb92359cb1cde09 not found: ID does not exist" containerID="84dc671c2530f1bcd0e266dee6d1492b8e0258b507443f86feb92359cb1cde09" Oct 07 13:21:37 crc kubenswrapper[4854]: I1007 13:21:37.984213 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84dc671c2530f1bcd0e266dee6d1492b8e0258b507443f86feb92359cb1cde09"} err="failed to get container status \"84dc671c2530f1bcd0e266dee6d1492b8e0258b507443f86feb92359cb1cde09\": rpc error: code = NotFound desc = could not find container \"84dc671c2530f1bcd0e266dee6d1492b8e0258b507443f86feb92359cb1cde09\": container with ID starting with 84dc671c2530f1bcd0e266dee6d1492b8e0258b507443f86feb92359cb1cde09 not found: ID does not exist" Oct 07 13:21:38 crc kubenswrapper[4854]: I1007 13:21:38.711609 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ffc839b-6c7a-48f0-a997-4d87e895d379" path="/var/lib/kubelet/pods/1ffc839b-6c7a-48f0-a997-4d87e895d379/volumes" Oct 07 13:21:46 crc kubenswrapper[4854]: I1007 13:21:46.702312 4854 scope.go:117] "RemoveContainer" containerID="0b41fdf1c73e74ad1e4a2d3aa69bee4aa9ba58b145d2d08b5d4b83efa9072db0" Oct 07 13:21:46 crc kubenswrapper[4854]: I1007 13:21:46.950464 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" event={"ID":"40b8b82d-cfd5-41d7-8673-5774db092c85","Type":"ContainerStarted","Data":"7e029ad78439445be27be27e1ab3f083cc771f152fb676d5544eb285938f35b0"} Oct 07 13:22:49 crc kubenswrapper[4854]: I1007 13:22:49.826810 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5d4pz"] Oct 07 13:22:49 crc kubenswrapper[4854]: E1007 13:22:49.828593 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ffc839b-6c7a-48f0-a997-4d87e895d379" containerName="extract-content" Oct 07 13:22:49 crc kubenswrapper[4854]: I1007 13:22:49.828631 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ffc839b-6c7a-48f0-a997-4d87e895d379" containerName="extract-content" Oct 07 13:22:49 crc kubenswrapper[4854]: E1007 13:22:49.828664 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ffc839b-6c7a-48f0-a997-4d87e895d379" containerName="registry-server" Oct 07 13:22:49 crc kubenswrapper[4854]: I1007 13:22:49.828682 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ffc839b-6c7a-48f0-a997-4d87e895d379" containerName="registry-server" Oct 07 13:22:49 crc kubenswrapper[4854]: E1007 13:22:49.828711 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ffc839b-6c7a-48f0-a997-4d87e895d379" containerName="extract-utilities" Oct 07 13:22:49 crc kubenswrapper[4854]: I1007 13:22:49.828731 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ffc839b-6c7a-48f0-a997-4d87e895d379" containerName="extract-utilities" Oct 07 13:22:49 crc kubenswrapper[4854]: I1007 13:22:49.829232 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ffc839b-6c7a-48f0-a997-4d87e895d379" containerName="registry-server" Oct 07 13:22:49 crc kubenswrapper[4854]: I1007 13:22:49.831396 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5d4pz" Oct 07 13:22:49 crc kubenswrapper[4854]: I1007 13:22:49.841584 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5d4pz"] Oct 07 13:22:49 crc kubenswrapper[4854]: I1007 13:22:49.875096 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/388ed2c0-36f7-4c0a-851f-d44a608e1dad-catalog-content\") pod \"redhat-operators-5d4pz\" (UID: \"388ed2c0-36f7-4c0a-851f-d44a608e1dad\") " pod="openshift-marketplace/redhat-operators-5d4pz" Oct 07 13:22:49 crc kubenswrapper[4854]: I1007 13:22:49.875230 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/388ed2c0-36f7-4c0a-851f-d44a608e1dad-utilities\") pod \"redhat-operators-5d4pz\" (UID: \"388ed2c0-36f7-4c0a-851f-d44a608e1dad\") " pod="openshift-marketplace/redhat-operators-5d4pz" Oct 07 13:22:49 crc kubenswrapper[4854]: I1007 13:22:49.875282 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnmxh\" (UniqueName: \"kubernetes.io/projected/388ed2c0-36f7-4c0a-851f-d44a608e1dad-kube-api-access-fnmxh\") pod \"redhat-operators-5d4pz\" (UID: \"388ed2c0-36f7-4c0a-851f-d44a608e1dad\") " pod="openshift-marketplace/redhat-operators-5d4pz" Oct 07 13:22:49 crc kubenswrapper[4854]: I1007 13:22:49.976409 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnmxh\" (UniqueName: \"kubernetes.io/projected/388ed2c0-36f7-4c0a-851f-d44a608e1dad-kube-api-access-fnmxh\") pod \"redhat-operators-5d4pz\" (UID: \"388ed2c0-36f7-4c0a-851f-d44a608e1dad\") " pod="openshift-marketplace/redhat-operators-5d4pz" Oct 07 13:22:49 crc kubenswrapper[4854]: I1007 13:22:49.976564 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/388ed2c0-36f7-4c0a-851f-d44a608e1dad-catalog-content\") pod \"redhat-operators-5d4pz\" (UID: \"388ed2c0-36f7-4c0a-851f-d44a608e1dad\") " pod="openshift-marketplace/redhat-operators-5d4pz" Oct 07 13:22:49 crc kubenswrapper[4854]: I1007 13:22:49.976616 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/388ed2c0-36f7-4c0a-851f-d44a608e1dad-utilities\") pod \"redhat-operators-5d4pz\" (UID: \"388ed2c0-36f7-4c0a-851f-d44a608e1dad\") " pod="openshift-marketplace/redhat-operators-5d4pz" Oct 07 13:22:49 crc kubenswrapper[4854]: I1007 13:22:49.977267 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/388ed2c0-36f7-4c0a-851f-d44a608e1dad-utilities\") pod \"redhat-operators-5d4pz\" (UID: \"388ed2c0-36f7-4c0a-851f-d44a608e1dad\") " pod="openshift-marketplace/redhat-operators-5d4pz" Oct 07 13:22:49 crc kubenswrapper[4854]: I1007 13:22:49.977394 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/388ed2c0-36f7-4c0a-851f-d44a608e1dad-catalog-content\") pod \"redhat-operators-5d4pz\" (UID: \"388ed2c0-36f7-4c0a-851f-d44a608e1dad\") " pod="openshift-marketplace/redhat-operators-5d4pz" Oct 07 13:22:50 crc kubenswrapper[4854]: I1007 13:22:50.005461 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-fnmxh\" (UniqueName: \"kubernetes.io/projected/388ed2c0-36f7-4c0a-851f-d44a608e1dad-kube-api-access-fnmxh\") pod \"redhat-operators-5d4pz\" (UID: \"388ed2c0-36f7-4c0a-851f-d44a608e1dad\") " pod="openshift-marketplace/redhat-operators-5d4pz" Oct 07 13:22:50 crc kubenswrapper[4854]: I1007 13:22:50.158755 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5d4pz" Oct 07 13:22:50 crc kubenswrapper[4854]: I1007 13:22:50.637692 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5d4pz"] Oct 07 13:22:51 crc kubenswrapper[4854]: I1007 13:22:51.554044 4854 generic.go:334] "Generic (PLEG): container finished" podID="388ed2c0-36f7-4c0a-851f-d44a608e1dad" containerID="2ef9b4bd62b2d812af8a23182411f875a2a0c2a5aa88892efebd68f45cc1395a" exitCode=0 Oct 07 13:22:51 crc kubenswrapper[4854]: I1007 13:22:51.554190 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5d4pz" event={"ID":"388ed2c0-36f7-4c0a-851f-d44a608e1dad","Type":"ContainerDied","Data":"2ef9b4bd62b2d812af8a23182411f875a2a0c2a5aa88892efebd68f45cc1395a"} Oct 07 13:22:51 crc kubenswrapper[4854]: I1007 13:22:51.554315 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5d4pz" event={"ID":"388ed2c0-36f7-4c0a-851f-d44a608e1dad","Type":"ContainerStarted","Data":"fa01c69a3d8120b6c163b9507aac7dbdf0f2c0967ff82e26e4e580686dfb97c8"} Oct 07 13:22:53 crc kubenswrapper[4854]: I1007 13:22:53.579041 4854 generic.go:334] "Generic (PLEG): container finished" podID="388ed2c0-36f7-4c0a-851f-d44a608e1dad" containerID="538cbbb12352557212eb5e705a9f20d26f9c421d1bd53c7cf6b7d5c70f02dc53" exitCode=0 Oct 07 13:22:53 crc kubenswrapper[4854]: I1007 13:22:53.579095 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5d4pz" event={"ID":"388ed2c0-36f7-4c0a-851f-d44a608e1dad","Type":"ContainerDied","Data":"538cbbb12352557212eb5e705a9f20d26f9c421d1bd53c7cf6b7d5c70f02dc53"} Oct 07 13:22:54 crc kubenswrapper[4854]: I1007 13:22:54.588640 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5d4pz" event={"ID":"388ed2c0-36f7-4c0a-851f-d44a608e1dad","Type":"ContainerStarted","Data":"b41739071c85088455c21e15721d58fe6ca2a2134c5cec57a9a9f93d6700e61d"} Oct 07 13:22:54 crc kubenswrapper[4854]: I1007 13:22:54.607731 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5d4pz" podStartSLOduration=2.827304172 podStartE2EDuration="5.607706693s" podCreationTimestamp="2025-10-07 13:22:49 +0000 UTC" firstStartedPulling="2025-10-07 13:22:51.558551059 +0000 UTC m=+3487.546383344" lastFinishedPulling="2025-10-07 13:22:54.3389536 +0000 UTC m=+3490.326785865" observedRunningTime="2025-10-07 13:22:54.606527349 +0000 UTC m=+3490.594359644" watchObservedRunningTime="2025-10-07 13:22:54.607706693 +0000 UTC m=+3490.595538988" Oct 07 13:23:00 crc kubenswrapper[4854]: I1007 13:23:00.159605 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5d4pz" Oct 07 13:23:00 crc kubenswrapper[4854]: I1007 13:23:00.160502 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5d4pz" Oct 07 13:23:00 crc kubenswrapper[4854]: I1007 13:23:00.218447 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-5d4pz" Oct 07 13:23:00 crc kubenswrapper[4854]: I1007 13:23:00.681235 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5d4pz" Oct 07 13:23:00 crc kubenswrapper[4854]: I1007 13:23:00.744408 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5d4pz"] Oct 07 13:23:02 crc kubenswrapper[4854]: I1007 13:23:02.651064 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5d4pz" podUID="388ed2c0-36f7-4c0a-851f-d44a608e1dad" containerName="registry-server" containerID="cri-o://b41739071c85088455c21e15721d58fe6ca2a2134c5cec57a9a9f93d6700e61d" gracePeriod=2 Oct 07 13:23:03 crc kubenswrapper[4854]: I1007 13:23:03.079075 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5d4pz" Oct 07 13:23:03 crc kubenswrapper[4854]: I1007 13:23:03.187801 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/388ed2c0-36f7-4c0a-851f-d44a608e1dad-catalog-content\") pod \"388ed2c0-36f7-4c0a-851f-d44a608e1dad\" (UID: \"388ed2c0-36f7-4c0a-851f-d44a608e1dad\") " Oct 07 13:23:03 crc kubenswrapper[4854]: I1007 13:23:03.187927 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/388ed2c0-36f7-4c0a-851f-d44a608e1dad-utilities\") pod \"388ed2c0-36f7-4c0a-851f-d44a608e1dad\" (UID: \"388ed2c0-36f7-4c0a-851f-d44a608e1dad\") " Oct 07 13:23:03 crc kubenswrapper[4854]: I1007 13:23:03.187976 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnmxh\" (UniqueName: \"kubernetes.io/projected/388ed2c0-36f7-4c0a-851f-d44a608e1dad-kube-api-access-fnmxh\") pod \"388ed2c0-36f7-4c0a-851f-d44a608e1dad\" (UID: \"388ed2c0-36f7-4c0a-851f-d44a608e1dad\") " Oct 07 13:23:03 crc kubenswrapper[4854]: I1007 13:23:03.189583 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/388ed2c0-36f7-4c0a-851f-d44a608e1dad-utilities" (OuterVolumeSpecName: "utilities") pod "388ed2c0-36f7-4c0a-851f-d44a608e1dad" (UID: "388ed2c0-36f7-4c0a-851f-d44a608e1dad"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:23:03 crc kubenswrapper[4854]: I1007 13:23:03.196375 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/388ed2c0-36f7-4c0a-851f-d44a608e1dad-kube-api-access-fnmxh" (OuterVolumeSpecName: "kube-api-access-fnmxh") pod "388ed2c0-36f7-4c0a-851f-d44a608e1dad" (UID: "388ed2c0-36f7-4c0a-851f-d44a608e1dad"). InnerVolumeSpecName "kube-api-access-fnmxh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:23:03 crc kubenswrapper[4854]: I1007 13:23:03.290386 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/388ed2c0-36f7-4c0a-851f-d44a608e1dad-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:23:03 crc kubenswrapper[4854]: I1007 13:23:03.290755 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnmxh\" (UniqueName: \"kubernetes.io/projected/388ed2c0-36f7-4c0a-851f-d44a608e1dad-kube-api-access-fnmxh\") on node \"crc\" DevicePath \"\"" Oct 07 13:23:03 crc kubenswrapper[4854]: I1007 13:23:03.309126 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/388ed2c0-36f7-4c0a-851f-d44a608e1dad-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "388ed2c0-36f7-4c0a-851f-d44a608e1dad" (UID: "388ed2c0-36f7-4c0a-851f-d44a608e1dad"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:23:03 crc kubenswrapper[4854]: I1007 13:23:03.392074 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/388ed2c0-36f7-4c0a-851f-d44a608e1dad-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:23:03 crc kubenswrapper[4854]: I1007 13:23:03.663328 4854 generic.go:334] "Generic (PLEG): container finished" podID="388ed2c0-36f7-4c0a-851f-d44a608e1dad" containerID="b41739071c85088455c21e15721d58fe6ca2a2134c5cec57a9a9f93d6700e61d" exitCode=0 Oct 07 13:23:03 crc kubenswrapper[4854]: I1007 13:23:03.663402 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5d4pz" event={"ID":"388ed2c0-36f7-4c0a-851f-d44a608e1dad","Type":"ContainerDied","Data":"b41739071c85088455c21e15721d58fe6ca2a2134c5cec57a9a9f93d6700e61d"} Oct 07 13:23:03 crc kubenswrapper[4854]: I1007 13:23:03.663453 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5d4pz" event={"ID":"388ed2c0-36f7-4c0a-851f-d44a608e1dad","Type":"ContainerDied","Data":"fa01c69a3d8120b6c163b9507aac7dbdf0f2c0967ff82e26e4e580686dfb97c8"} Oct 07 13:23:03 crc kubenswrapper[4854]: I1007 13:23:03.663492 4854 scope.go:117] "RemoveContainer" containerID="b41739071c85088455c21e15721d58fe6ca2a2134c5cec57a9a9f93d6700e61d" Oct 07 13:23:03 crc kubenswrapper[4854]: I1007 13:23:03.663578 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5d4pz" Oct 07 13:23:03 crc kubenswrapper[4854]: I1007 13:23:03.690269 4854 scope.go:117] "RemoveContainer" containerID="538cbbb12352557212eb5e705a9f20d26f9c421d1bd53c7cf6b7d5c70f02dc53" Oct 07 13:23:03 crc kubenswrapper[4854]: I1007 13:23:03.714284 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5d4pz"] Oct 07 13:23:03 crc kubenswrapper[4854]: I1007 13:23:03.721113 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5d4pz"] Oct 07 13:23:03 crc kubenswrapper[4854]: I1007 13:23:03.732350 4854 scope.go:117] "RemoveContainer" containerID="2ef9b4bd62b2d812af8a23182411f875a2a0c2a5aa88892efebd68f45cc1395a" Oct 07 13:23:03 crc kubenswrapper[4854]: I1007 13:23:03.769002 4854 scope.go:117] "RemoveContainer" containerID="b41739071c85088455c21e15721d58fe6ca2a2134c5cec57a9a9f93d6700e61d" Oct 07 13:23:03 crc kubenswrapper[4854]: E1007 13:23:03.769558 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b41739071c85088455c21e15721d58fe6ca2a2134c5cec57a9a9f93d6700e61d\": container with ID starting with b41739071c85088455c21e15721d58fe6ca2a2134c5cec57a9a9f93d6700e61d not found: ID does not exist" containerID="b41739071c85088455c21e15721d58fe6ca2a2134c5cec57a9a9f93d6700e61d" Oct 07 13:23:03 crc kubenswrapper[4854]: I1007 13:23:03.769609 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b41739071c85088455c21e15721d58fe6ca2a2134c5cec57a9a9f93d6700e61d"} err="failed to get container status \"b41739071c85088455c21e15721d58fe6ca2a2134c5cec57a9a9f93d6700e61d\": rpc error: code = NotFound desc = could not find container \"b41739071c85088455c21e15721d58fe6ca2a2134c5cec57a9a9f93d6700e61d\": container with ID starting with b41739071c85088455c21e15721d58fe6ca2a2134c5cec57a9a9f93d6700e61d not found: ID does not exist" Oct 07 13:23:03 crc kubenswrapper[4854]: I1007 13:23:03.769642 4854 scope.go:117] "RemoveContainer" containerID="538cbbb12352557212eb5e705a9f20d26f9c421d1bd53c7cf6b7d5c70f02dc53" Oct 07 13:23:03 crc kubenswrapper[4854]: E1007 13:23:03.770308 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"538cbbb12352557212eb5e705a9f20d26f9c421d1bd53c7cf6b7d5c70f02dc53\": container with ID starting with 538cbbb12352557212eb5e705a9f20d26f9c421d1bd53c7cf6b7d5c70f02dc53 not found: ID does not exist" containerID="538cbbb12352557212eb5e705a9f20d26f9c421d1bd53c7cf6b7d5c70f02dc53" Oct 07 13:23:03 crc kubenswrapper[4854]: I1007 13:23:03.770394 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"538cbbb12352557212eb5e705a9f20d26f9c421d1bd53c7cf6b7d5c70f02dc53"} err="failed to get container status \"538cbbb12352557212eb5e705a9f20d26f9c421d1bd53c7cf6b7d5c70f02dc53\": rpc error: code = NotFound desc = could not find container \"538cbbb12352557212eb5e705a9f20d26f9c421d1bd53c7cf6b7d5c70f02dc53\": container with ID starting with 538cbbb12352557212eb5e705a9f20d26f9c421d1bd53c7cf6b7d5c70f02dc53 not found: ID does not exist" Oct 07 13:23:03 crc kubenswrapper[4854]: I1007 13:23:03.770430 4854 scope.go:117] "RemoveContainer" containerID="2ef9b4bd62b2d812af8a23182411f875a2a0c2a5aa88892efebd68f45cc1395a" Oct 07 13:23:03 crc kubenswrapper[4854]: E1007 13:23:03.770914 4854 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"2ef9b4bd62b2d812af8a23182411f875a2a0c2a5aa88892efebd68f45cc1395a\": container with ID starting with 2ef9b4bd62b2d812af8a23182411f875a2a0c2a5aa88892efebd68f45cc1395a not found: ID does not exist" containerID="2ef9b4bd62b2d812af8a23182411f875a2a0c2a5aa88892efebd68f45cc1395a" Oct 07 13:23:03 crc kubenswrapper[4854]: I1007 13:23:03.770945 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ef9b4bd62b2d812af8a23182411f875a2a0c2a5aa88892efebd68f45cc1395a"} err="failed to get container status \"2ef9b4bd62b2d812af8a23182411f875a2a0c2a5aa88892efebd68f45cc1395a\": rpc error: code = NotFound desc = could not find container \"2ef9b4bd62b2d812af8a23182411f875a2a0c2a5aa88892efebd68f45cc1395a\": container with ID starting with 2ef9b4bd62b2d812af8a23182411f875a2a0c2a5aa88892efebd68f45cc1395a not found: ID does not exist" Oct 07 13:23:04 crc kubenswrapper[4854]: I1007 13:23:04.719450 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="388ed2c0-36f7-4c0a-851f-d44a608e1dad" path="/var/lib/kubelet/pods/388ed2c0-36f7-4c0a-851f-d44a608e1dad/volumes" Oct 07 13:24:10 crc kubenswrapper[4854]: I1007 13:24:10.808103 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:24:10 crc kubenswrapper[4854]: I1007 13:24:10.809252 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:24:40 crc kubenswrapper[4854]: I1007 13:24:40.808262 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:24:40 crc kubenswrapper[4854]: I1007 13:24:40.809338 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:25:10 crc kubenswrapper[4854]: I1007 13:25:10.807611 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:25:10 crc kubenswrapper[4854]: I1007 13:25:10.808494 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:25:10 crc kubenswrapper[4854]: I1007 13:25:10.808574 4854 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" Oct 07 13:25:10 crc kubenswrapper[4854]: I1007 13:25:10.809631 4854 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7e029ad78439445be27be27e1ab3f083cc771f152fb676d5544eb285938f35b0"} pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 13:25:10 crc kubenswrapper[4854]: I1007 13:25:10.809764 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" containerID="cri-o://7e029ad78439445be27be27e1ab3f083cc771f152fb676d5544eb285938f35b0" gracePeriod=600 Oct 07 13:25:11 crc kubenswrapper[4854]: I1007 13:25:11.817953 4854 generic.go:334] "Generic (PLEG): container finished" podID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerID="7e029ad78439445be27be27e1ab3f083cc771f152fb676d5544eb285938f35b0" exitCode=0 Oct 07 13:25:11 crc kubenswrapper[4854]: I1007 13:25:11.818088 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" event={"ID":"40b8b82d-cfd5-41d7-8673-5774db092c85","Type":"ContainerDied","Data":"7e029ad78439445be27be27e1ab3f083cc771f152fb676d5544eb285938f35b0"} Oct 07 13:25:11 crc kubenswrapper[4854]: I1007 13:25:11.818377 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" event={"ID":"40b8b82d-cfd5-41d7-8673-5774db092c85","Type":"ContainerStarted","Data":"2a88a0e668c9802f9aa7f30d5b1139d8c87a4e9bfb222dbe84e769d18df11b96"} Oct 07 13:25:11 crc kubenswrapper[4854]: I1007 13:25:11.818408 4854 scope.go:117] "RemoveContainer" containerID="0b41fdf1c73e74ad1e4a2d3aa69bee4aa9ba58b145d2d08b5d4b83efa9072db0" Oct 07 13:26:42 crc kubenswrapper[4854]: I1007 13:26:42.546139 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bv82r"] Oct 07 13:26:42 crc kubenswrapper[4854]: E1007 13:26:42.547935 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="388ed2c0-36f7-4c0a-851f-d44a608e1dad" containerName="extract-utilities" Oct 07 13:26:42 crc kubenswrapper[4854]: I1007 13:26:42.547969 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="388ed2c0-36f7-4c0a-851f-d44a608e1dad" containerName="extract-utilities" Oct 07 13:26:42 crc kubenswrapper[4854]: E1007 13:26:42.548007 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="388ed2c0-36f7-4c0a-851f-d44a608e1dad" containerName="extract-content" Oct 07 13:26:42 crc kubenswrapper[4854]: I1007 13:26:42.548022 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="388ed2c0-36f7-4c0a-851f-d44a608e1dad" containerName="extract-content" Oct 07 13:26:42 crc kubenswrapper[4854]: E1007 13:26:42.548038 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="388ed2c0-36f7-4c0a-851f-d44a608e1dad" containerName="registry-server" Oct 07 13:26:42 crc kubenswrapper[4854]: I1007 13:26:42.548052 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="388ed2c0-36f7-4c0a-851f-d44a608e1dad" containerName="registry-server" Oct 07 13:26:42 crc kubenswrapper[4854]: I1007 13:26:42.548451 4854 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="388ed2c0-36f7-4c0a-851f-d44a608e1dad" containerName="registry-server" Oct 07 13:26:42 crc kubenswrapper[4854]: I1007 13:26:42.550592 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bv82r" Oct 07 13:26:42 crc kubenswrapper[4854]: I1007 13:26:42.582195 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bv82r"] Oct 07 13:26:42 crc kubenswrapper[4854]: I1007 13:26:42.708579 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwz6l\" (UniqueName: \"kubernetes.io/projected/6b34f7e2-c966-4dd1-acb4-15c0780c40ae-kube-api-access-xwz6l\") pod \"certified-operators-bv82r\" (UID: \"6b34f7e2-c966-4dd1-acb4-15c0780c40ae\") " pod="openshift-marketplace/certified-operators-bv82r" Oct 07 13:26:42 crc kubenswrapper[4854]: I1007 13:26:42.708718 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b34f7e2-c966-4dd1-acb4-15c0780c40ae-utilities\") pod \"certified-operators-bv82r\" (UID: \"6b34f7e2-c966-4dd1-acb4-15c0780c40ae\") " pod="openshift-marketplace/certified-operators-bv82r" Oct 07 13:26:42 crc kubenswrapper[4854]: I1007 13:26:42.708881 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b34f7e2-c966-4dd1-acb4-15c0780c40ae-catalog-content\") pod \"certified-operators-bv82r\" (UID: \"6b34f7e2-c966-4dd1-acb4-15c0780c40ae\") " pod="openshift-marketplace/certified-operators-bv82r" Oct 07 13:26:42 crc kubenswrapper[4854]: I1007 13:26:42.810679 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b34f7e2-c966-4dd1-acb4-15c0780c40ae-catalog-content\") pod \"certified-operators-bv82r\" (UID: \"6b34f7e2-c966-4dd1-acb4-15c0780c40ae\") " pod="openshift-marketplace/certified-operators-bv82r" Oct 07 13:26:42 crc kubenswrapper[4854]: I1007 13:26:42.810845 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwz6l\" (UniqueName: \"kubernetes.io/projected/6b34f7e2-c966-4dd1-acb4-15c0780c40ae-kube-api-access-xwz6l\") pod \"certified-operators-bv82r\" (UID: \"6b34f7e2-c966-4dd1-acb4-15c0780c40ae\") " pod="openshift-marketplace/certified-operators-bv82r" Oct 07 13:26:42 crc kubenswrapper[4854]: I1007 13:26:42.810903 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b34f7e2-c966-4dd1-acb4-15c0780c40ae-utilities\") pod \"certified-operators-bv82r\" (UID: \"6b34f7e2-c966-4dd1-acb4-15c0780c40ae\") " pod="openshift-marketplace/certified-operators-bv82r" Oct 07 13:26:42 crc kubenswrapper[4854]: I1007 13:26:42.811876 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b34f7e2-c966-4dd1-acb4-15c0780c40ae-catalog-content\") pod \"certified-operators-bv82r\" (UID: \"6b34f7e2-c966-4dd1-acb4-15c0780c40ae\") " pod="openshift-marketplace/certified-operators-bv82r" Oct 07 13:26:42 crc kubenswrapper[4854]: I1007 13:26:42.812275 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b34f7e2-c966-4dd1-acb4-15c0780c40ae-utilities\") pod \"certified-operators-bv82r\" (UID: 
\"6b34f7e2-c966-4dd1-acb4-15c0780c40ae\") " pod="openshift-marketplace/certified-operators-bv82r" Oct 07 13:26:42 crc kubenswrapper[4854]: I1007 13:26:42.836712 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwz6l\" (UniqueName: \"kubernetes.io/projected/6b34f7e2-c966-4dd1-acb4-15c0780c40ae-kube-api-access-xwz6l\") pod \"certified-operators-bv82r\" (UID: \"6b34f7e2-c966-4dd1-acb4-15c0780c40ae\") " pod="openshift-marketplace/certified-operators-bv82r" Oct 07 13:26:42 crc kubenswrapper[4854]: I1007 13:26:42.886142 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bv82r" Oct 07 13:26:43 crc kubenswrapper[4854]: I1007 13:26:43.172128 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bv82r"] Oct 07 13:26:43 crc kubenswrapper[4854]: I1007 13:26:43.619642 4854 generic.go:334] "Generic (PLEG): container finished" podID="6b34f7e2-c966-4dd1-acb4-15c0780c40ae" containerID="da873651b21f91f3c195d8311d9523adc02e0130164b0e7c5cb41e48e058c9ac" exitCode=0 Oct 07 13:26:43 crc kubenswrapper[4854]: I1007 13:26:43.619706 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bv82r" event={"ID":"6b34f7e2-c966-4dd1-acb4-15c0780c40ae","Type":"ContainerDied","Data":"da873651b21f91f3c195d8311d9523adc02e0130164b0e7c5cb41e48e058c9ac"} Oct 07 13:26:43 crc kubenswrapper[4854]: I1007 13:26:43.622399 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bv82r" event={"ID":"6b34f7e2-c966-4dd1-acb4-15c0780c40ae","Type":"ContainerStarted","Data":"847545ba7f9f9c19b3b06ba0096b91101bcaf045e7becd6154edf4eac027e21e"} Oct 07 13:26:43 crc kubenswrapper[4854]: I1007 13:26:43.622909 4854 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 13:26:44 crc kubenswrapper[4854]: I1007 13:26:44.632722 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bv82r" event={"ID":"6b34f7e2-c966-4dd1-acb4-15c0780c40ae","Type":"ContainerStarted","Data":"81cb45b6fc09dad475c5d51ec24a9a6e9974912827897c41636315bd624de287"} Oct 07 13:26:45 crc kubenswrapper[4854]: I1007 13:26:45.645323 4854 generic.go:334] "Generic (PLEG): container finished" podID="6b34f7e2-c966-4dd1-acb4-15c0780c40ae" containerID="81cb45b6fc09dad475c5d51ec24a9a6e9974912827897c41636315bd624de287" exitCode=0 Oct 07 13:26:45 crc kubenswrapper[4854]: I1007 13:26:45.645705 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bv82r" event={"ID":"6b34f7e2-c966-4dd1-acb4-15c0780c40ae","Type":"ContainerDied","Data":"81cb45b6fc09dad475c5d51ec24a9a6e9974912827897c41636315bd624de287"} Oct 07 13:26:46 crc kubenswrapper[4854]: I1007 13:26:46.662397 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bv82r" event={"ID":"6b34f7e2-c966-4dd1-acb4-15c0780c40ae","Type":"ContainerStarted","Data":"e580b868433f3a3c91c94fb8aa8041d3fd02c48701ef59abba1beff5b122790d"} Oct 07 13:26:52 crc kubenswrapper[4854]: I1007 13:26:52.887327 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bv82r" Oct 07 13:26:52 crc kubenswrapper[4854]: I1007 13:26:52.887748 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bv82r" Oct 07 13:26:52 crc 
kubenswrapper[4854]: I1007 13:26:52.941305 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bv82r" Oct 07 13:26:52 crc kubenswrapper[4854]: I1007 13:26:52.961717 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bv82r" podStartSLOduration=8.508967109 podStartE2EDuration="10.96168744s" podCreationTimestamp="2025-10-07 13:26:42 +0000 UTC" firstStartedPulling="2025-10-07 13:26:43.622518237 +0000 UTC m=+3719.610350522" lastFinishedPulling="2025-10-07 13:26:46.075238568 +0000 UTC m=+3722.063070853" observedRunningTime="2025-10-07 13:26:46.687588201 +0000 UTC m=+3722.675420496" watchObservedRunningTime="2025-10-07 13:26:52.96168744 +0000 UTC m=+3728.949519695" Oct 07 13:26:53 crc kubenswrapper[4854]: I1007 13:26:53.802292 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bv82r" Oct 07 13:26:53 crc kubenswrapper[4854]: I1007 13:26:53.861833 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bv82r"] Oct 07 13:26:55 crc kubenswrapper[4854]: I1007 13:26:55.743900 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bv82r" podUID="6b34f7e2-c966-4dd1-acb4-15c0780c40ae" containerName="registry-server" containerID="cri-o://e580b868433f3a3c91c94fb8aa8041d3fd02c48701ef59abba1beff5b122790d" gracePeriod=2 Oct 07 13:26:56 crc kubenswrapper[4854]: I1007 13:26:56.196318 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bv82r" Oct 07 13:26:56 crc kubenswrapper[4854]: I1007 13:26:56.341020 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwz6l\" (UniqueName: \"kubernetes.io/projected/6b34f7e2-c966-4dd1-acb4-15c0780c40ae-kube-api-access-xwz6l\") pod \"6b34f7e2-c966-4dd1-acb4-15c0780c40ae\" (UID: \"6b34f7e2-c966-4dd1-acb4-15c0780c40ae\") " Oct 07 13:26:56 crc kubenswrapper[4854]: I1007 13:26:56.341206 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b34f7e2-c966-4dd1-acb4-15c0780c40ae-catalog-content\") pod \"6b34f7e2-c966-4dd1-acb4-15c0780c40ae\" (UID: \"6b34f7e2-c966-4dd1-acb4-15c0780c40ae\") " Oct 07 13:26:56 crc kubenswrapper[4854]: I1007 13:26:56.341274 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b34f7e2-c966-4dd1-acb4-15c0780c40ae-utilities\") pod \"6b34f7e2-c966-4dd1-acb4-15c0780c40ae\" (UID: \"6b34f7e2-c966-4dd1-acb4-15c0780c40ae\") " Oct 07 13:26:56 crc kubenswrapper[4854]: I1007 13:26:56.342521 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b34f7e2-c966-4dd1-acb4-15c0780c40ae-utilities" (OuterVolumeSpecName: "utilities") pod "6b34f7e2-c966-4dd1-acb4-15c0780c40ae" (UID: "6b34f7e2-c966-4dd1-acb4-15c0780c40ae"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:26:56 crc kubenswrapper[4854]: I1007 13:26:56.348087 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b34f7e2-c966-4dd1-acb4-15c0780c40ae-kube-api-access-xwz6l" (OuterVolumeSpecName: "kube-api-access-xwz6l") pod "6b34f7e2-c966-4dd1-acb4-15c0780c40ae" (UID: "6b34f7e2-c966-4dd1-acb4-15c0780c40ae"). InnerVolumeSpecName "kube-api-access-xwz6l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:26:56 crc kubenswrapper[4854]: I1007 13:26:56.383531 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b34f7e2-c966-4dd1-acb4-15c0780c40ae-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6b34f7e2-c966-4dd1-acb4-15c0780c40ae" (UID: "6b34f7e2-c966-4dd1-acb4-15c0780c40ae"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:26:56 crc kubenswrapper[4854]: I1007 13:26:56.443805 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwz6l\" (UniqueName: \"kubernetes.io/projected/6b34f7e2-c966-4dd1-acb4-15c0780c40ae-kube-api-access-xwz6l\") on node \"crc\" DevicePath \"\"" Oct 07 13:26:56 crc kubenswrapper[4854]: I1007 13:26:56.443850 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b34f7e2-c966-4dd1-acb4-15c0780c40ae-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:26:56 crc kubenswrapper[4854]: I1007 13:26:56.443863 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b34f7e2-c966-4dd1-acb4-15c0780c40ae-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:26:56 crc kubenswrapper[4854]: I1007 13:26:56.759115 4854 generic.go:334] "Generic (PLEG): container finished" podID="6b34f7e2-c966-4dd1-acb4-15c0780c40ae" containerID="e580b868433f3a3c91c94fb8aa8041d3fd02c48701ef59abba1beff5b122790d" exitCode=0 Oct 07 13:26:56 crc kubenswrapper[4854]: I1007 13:26:56.759200 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bv82r" event={"ID":"6b34f7e2-c966-4dd1-acb4-15c0780c40ae","Type":"ContainerDied","Data":"e580b868433f3a3c91c94fb8aa8041d3fd02c48701ef59abba1beff5b122790d"} Oct 07 13:26:56 crc kubenswrapper[4854]: I1007 13:26:56.759625 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bv82r" event={"ID":"6b34f7e2-c966-4dd1-acb4-15c0780c40ae","Type":"ContainerDied","Data":"847545ba7f9f9c19b3b06ba0096b91101bcaf045e7becd6154edf4eac027e21e"} Oct 07 13:26:56 crc kubenswrapper[4854]: I1007 13:26:56.759662 4854 scope.go:117] "RemoveContainer" containerID="e580b868433f3a3c91c94fb8aa8041d3fd02c48701ef59abba1beff5b122790d" Oct 07 13:26:56 crc kubenswrapper[4854]: I1007 13:26:56.759288 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bv82r" Oct 07 13:26:56 crc kubenswrapper[4854]: I1007 13:26:56.790810 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bv82r"] Oct 07 13:26:56 crc kubenswrapper[4854]: I1007 13:26:56.797946 4854 scope.go:117] "RemoveContainer" containerID="81cb45b6fc09dad475c5d51ec24a9a6e9974912827897c41636315bd624de287" Oct 07 13:26:56 crc kubenswrapper[4854]: I1007 13:26:56.798107 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bv82r"] Oct 07 13:26:56 crc kubenswrapper[4854]: I1007 13:26:56.826361 4854 scope.go:117] "RemoveContainer" containerID="da873651b21f91f3c195d8311d9523adc02e0130164b0e7c5cb41e48e058c9ac" Oct 07 13:26:56 crc kubenswrapper[4854]: I1007 13:26:56.844247 4854 scope.go:117] "RemoveContainer" containerID="e580b868433f3a3c91c94fb8aa8041d3fd02c48701ef59abba1beff5b122790d" Oct 07 13:26:56 crc kubenswrapper[4854]: E1007 13:26:56.844693 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e580b868433f3a3c91c94fb8aa8041d3fd02c48701ef59abba1beff5b122790d\": container with ID starting with e580b868433f3a3c91c94fb8aa8041d3fd02c48701ef59abba1beff5b122790d not found: ID does not exist" containerID="e580b868433f3a3c91c94fb8aa8041d3fd02c48701ef59abba1beff5b122790d" Oct 07 13:26:56 crc kubenswrapper[4854]: I1007 13:26:56.844729 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e580b868433f3a3c91c94fb8aa8041d3fd02c48701ef59abba1beff5b122790d"} err="failed to get container status \"e580b868433f3a3c91c94fb8aa8041d3fd02c48701ef59abba1beff5b122790d\": rpc error: code = NotFound desc = could not find container \"e580b868433f3a3c91c94fb8aa8041d3fd02c48701ef59abba1beff5b122790d\": container with ID starting with e580b868433f3a3c91c94fb8aa8041d3fd02c48701ef59abba1beff5b122790d not found: ID does not exist" Oct 07 13:26:56 crc kubenswrapper[4854]: I1007 13:26:56.844748 4854 scope.go:117] "RemoveContainer" containerID="81cb45b6fc09dad475c5d51ec24a9a6e9974912827897c41636315bd624de287" Oct 07 13:26:56 crc kubenswrapper[4854]: E1007 13:26:56.845101 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81cb45b6fc09dad475c5d51ec24a9a6e9974912827897c41636315bd624de287\": container with ID starting with 81cb45b6fc09dad475c5d51ec24a9a6e9974912827897c41636315bd624de287 not found: ID does not exist" containerID="81cb45b6fc09dad475c5d51ec24a9a6e9974912827897c41636315bd624de287" Oct 07 13:26:56 crc kubenswrapper[4854]: I1007 13:26:56.845121 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81cb45b6fc09dad475c5d51ec24a9a6e9974912827897c41636315bd624de287"} err="failed to get container status \"81cb45b6fc09dad475c5d51ec24a9a6e9974912827897c41636315bd624de287\": rpc error: code = NotFound desc = could not find container \"81cb45b6fc09dad475c5d51ec24a9a6e9974912827897c41636315bd624de287\": container with ID starting with 81cb45b6fc09dad475c5d51ec24a9a6e9974912827897c41636315bd624de287 not found: ID does not exist" Oct 07 13:26:56 crc kubenswrapper[4854]: I1007 13:26:56.845136 4854 scope.go:117] "RemoveContainer" containerID="da873651b21f91f3c195d8311d9523adc02e0130164b0e7c5cb41e48e058c9ac" Oct 07 13:26:56 crc kubenswrapper[4854]: E1007 13:26:56.845522 4854 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"da873651b21f91f3c195d8311d9523adc02e0130164b0e7c5cb41e48e058c9ac\": container with ID starting with da873651b21f91f3c195d8311d9523adc02e0130164b0e7c5cb41e48e058c9ac not found: ID does not exist" containerID="da873651b21f91f3c195d8311d9523adc02e0130164b0e7c5cb41e48e058c9ac" Oct 07 13:26:56 crc kubenswrapper[4854]: I1007 13:26:56.845544 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da873651b21f91f3c195d8311d9523adc02e0130164b0e7c5cb41e48e058c9ac"} err="failed to get container status \"da873651b21f91f3c195d8311d9523adc02e0130164b0e7c5cb41e48e058c9ac\": rpc error: code = NotFound desc = could not find container \"da873651b21f91f3c195d8311d9523adc02e0130164b0e7c5cb41e48e058c9ac\": container with ID starting with da873651b21f91f3c195d8311d9523adc02e0130164b0e7c5cb41e48e058c9ac not found: ID does not exist" Oct 07 13:26:58 crc kubenswrapper[4854]: I1007 13:26:58.718513 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b34f7e2-c966-4dd1-acb4-15c0780c40ae" path="/var/lib/kubelet/pods/6b34f7e2-c966-4dd1-acb4-15c0780c40ae/volumes" Oct 07 13:27:40 crc kubenswrapper[4854]: I1007 13:27:40.808112 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:27:40 crc kubenswrapper[4854]: I1007 13:27:40.808901 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:28:10 crc kubenswrapper[4854]: I1007 13:28:10.807786 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:28:10 crc kubenswrapper[4854]: I1007 13:28:10.808433 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:28:40 crc kubenswrapper[4854]: I1007 13:28:40.808252 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:28:40 crc kubenswrapper[4854]: I1007 13:28:40.808886 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:28:40 crc kubenswrapper[4854]: I1007 13:28:40.808949 4854 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" Oct 07 13:28:40 crc kubenswrapper[4854]: I1007 13:28:40.809751 4854 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2a88a0e668c9802f9aa7f30d5b1139d8c87a4e9bfb222dbe84e769d18df11b96"} pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 13:28:40 crc kubenswrapper[4854]: I1007 13:28:40.809813 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" containerID="cri-o://2a88a0e668c9802f9aa7f30d5b1139d8c87a4e9bfb222dbe84e769d18df11b96" gracePeriod=600 Oct 07 13:28:40 crc kubenswrapper[4854]: E1007 13:28:40.935120 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:28:41 crc kubenswrapper[4854]: I1007 13:28:41.825759 4854 generic.go:334] "Generic (PLEG): container finished" podID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerID="2a88a0e668c9802f9aa7f30d5b1139d8c87a4e9bfb222dbe84e769d18df11b96" exitCode=0 Oct 07 13:28:41 crc kubenswrapper[4854]: I1007 13:28:41.825854 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" event={"ID":"40b8b82d-cfd5-41d7-8673-5774db092c85","Type":"ContainerDied","Data":"2a88a0e668c9802f9aa7f30d5b1139d8c87a4e9bfb222dbe84e769d18df11b96"} Oct 07 13:28:41 crc kubenswrapper[4854]: I1007 13:28:41.825896 4854 scope.go:117] "RemoveContainer" containerID="7e029ad78439445be27be27e1ab3f083cc771f152fb676d5544eb285938f35b0" Oct 07 13:28:41 crc kubenswrapper[4854]: I1007 13:28:41.826575 4854 scope.go:117] "RemoveContainer" containerID="2a88a0e668c9802f9aa7f30d5b1139d8c87a4e9bfb222dbe84e769d18df11b96" Oct 07 13:28:41 crc kubenswrapper[4854]: E1007 13:28:41.826984 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:28:53 crc kubenswrapper[4854]: I1007 13:28:53.703637 4854 scope.go:117] "RemoveContainer" containerID="2a88a0e668c9802f9aa7f30d5b1139d8c87a4e9bfb222dbe84e769d18df11b96" Oct 07 13:28:53 crc kubenswrapper[4854]: E1007 13:28:53.704974 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:29:08 crc 
kubenswrapper[4854]: I1007 13:29:08.702668 4854 scope.go:117] "RemoveContainer" containerID="2a88a0e668c9802f9aa7f30d5b1139d8c87a4e9bfb222dbe84e769d18df11b96" Oct 07 13:29:08 crc kubenswrapper[4854]: E1007 13:29:08.703857 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:29:23 crc kubenswrapper[4854]: I1007 13:29:23.702810 4854 scope.go:117] "RemoveContainer" containerID="2a88a0e668c9802f9aa7f30d5b1139d8c87a4e9bfb222dbe84e769d18df11b96" Oct 07 13:29:23 crc kubenswrapper[4854]: E1007 13:29:23.703628 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:29:36 crc kubenswrapper[4854]: I1007 13:29:36.703350 4854 scope.go:117] "RemoveContainer" containerID="2a88a0e668c9802f9aa7f30d5b1139d8c87a4e9bfb222dbe84e769d18df11b96" Oct 07 13:29:36 crc kubenswrapper[4854]: E1007 13:29:36.704356 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:29:49 crc kubenswrapper[4854]: I1007 13:29:49.703262 4854 scope.go:117] "RemoveContainer" containerID="2a88a0e668c9802f9aa7f30d5b1139d8c87a4e9bfb222dbe84e769d18df11b96" Oct 07 13:29:49 crc kubenswrapper[4854]: E1007 13:29:49.706627 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:30:00 crc kubenswrapper[4854]: I1007 13:30:00.179354 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330730-hcsm4"] Oct 07 13:30:00 crc kubenswrapper[4854]: E1007 13:30:00.180439 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b34f7e2-c966-4dd1-acb4-15c0780c40ae" containerName="registry-server" Oct 07 13:30:00 crc kubenswrapper[4854]: I1007 13:30:00.180460 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b34f7e2-c966-4dd1-acb4-15c0780c40ae" containerName="registry-server" Oct 07 13:30:00 crc kubenswrapper[4854]: E1007 13:30:00.180495 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b34f7e2-c966-4dd1-acb4-15c0780c40ae" containerName="extract-content" Oct 07 13:30:00 crc kubenswrapper[4854]: I1007 13:30:00.180507 4854 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="6b34f7e2-c966-4dd1-acb4-15c0780c40ae" containerName="extract-content" Oct 07 13:30:00 crc kubenswrapper[4854]: E1007 13:30:00.180547 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b34f7e2-c966-4dd1-acb4-15c0780c40ae" containerName="extract-utilities" Oct 07 13:30:00 crc kubenswrapper[4854]: I1007 13:30:00.180561 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b34f7e2-c966-4dd1-acb4-15c0780c40ae" containerName="extract-utilities" Oct 07 13:30:00 crc kubenswrapper[4854]: I1007 13:30:00.180822 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b34f7e2-c966-4dd1-acb4-15c0780c40ae" containerName="registry-server" Oct 07 13:30:00 crc kubenswrapper[4854]: I1007 13:30:00.181602 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330730-hcsm4" Oct 07 13:30:00 crc kubenswrapper[4854]: I1007 13:30:00.183686 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 07 13:30:00 crc kubenswrapper[4854]: I1007 13:30:00.183885 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 07 13:30:00 crc kubenswrapper[4854]: I1007 13:30:00.191229 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330730-hcsm4"] Oct 07 13:30:00 crc kubenswrapper[4854]: I1007 13:30:00.276194 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nllcg\" (UniqueName: \"kubernetes.io/projected/dd6d7fce-8018-4050-94e3-efbff9794019-kube-api-access-nllcg\") pod \"collect-profiles-29330730-hcsm4\" (UID: \"dd6d7fce-8018-4050-94e3-efbff9794019\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330730-hcsm4" Oct 07 13:30:00 crc kubenswrapper[4854]: I1007 13:30:00.276316 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dd6d7fce-8018-4050-94e3-efbff9794019-secret-volume\") pod \"collect-profiles-29330730-hcsm4\" (UID: \"dd6d7fce-8018-4050-94e3-efbff9794019\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330730-hcsm4" Oct 07 13:30:00 crc kubenswrapper[4854]: I1007 13:30:00.276460 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dd6d7fce-8018-4050-94e3-efbff9794019-config-volume\") pod \"collect-profiles-29330730-hcsm4\" (UID: \"dd6d7fce-8018-4050-94e3-efbff9794019\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330730-hcsm4" Oct 07 13:30:00 crc kubenswrapper[4854]: I1007 13:30:00.378112 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dd6d7fce-8018-4050-94e3-efbff9794019-secret-volume\") pod \"collect-profiles-29330730-hcsm4\" (UID: \"dd6d7fce-8018-4050-94e3-efbff9794019\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330730-hcsm4" Oct 07 13:30:00 crc kubenswrapper[4854]: I1007 13:30:00.378413 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dd6d7fce-8018-4050-94e3-efbff9794019-config-volume\") pod 
\"collect-profiles-29330730-hcsm4\" (UID: \"dd6d7fce-8018-4050-94e3-efbff9794019\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330730-hcsm4" Oct 07 13:30:00 crc kubenswrapper[4854]: I1007 13:30:00.378506 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nllcg\" (UniqueName: \"kubernetes.io/projected/dd6d7fce-8018-4050-94e3-efbff9794019-kube-api-access-nllcg\") pod \"collect-profiles-29330730-hcsm4\" (UID: \"dd6d7fce-8018-4050-94e3-efbff9794019\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330730-hcsm4" Oct 07 13:30:00 crc kubenswrapper[4854]: I1007 13:30:00.380108 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dd6d7fce-8018-4050-94e3-efbff9794019-config-volume\") pod \"collect-profiles-29330730-hcsm4\" (UID: \"dd6d7fce-8018-4050-94e3-efbff9794019\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330730-hcsm4" Oct 07 13:30:00 crc kubenswrapper[4854]: I1007 13:30:00.391832 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dd6d7fce-8018-4050-94e3-efbff9794019-secret-volume\") pod \"collect-profiles-29330730-hcsm4\" (UID: \"dd6d7fce-8018-4050-94e3-efbff9794019\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330730-hcsm4" Oct 07 13:30:00 crc kubenswrapper[4854]: I1007 13:30:00.408463 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nllcg\" (UniqueName: \"kubernetes.io/projected/dd6d7fce-8018-4050-94e3-efbff9794019-kube-api-access-nllcg\") pod \"collect-profiles-29330730-hcsm4\" (UID: \"dd6d7fce-8018-4050-94e3-efbff9794019\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330730-hcsm4" Oct 07 13:30:00 crc kubenswrapper[4854]: I1007 13:30:00.508245 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330730-hcsm4" Oct 07 13:30:00 crc kubenswrapper[4854]: I1007 13:30:00.702886 4854 scope.go:117] "RemoveContainer" containerID="2a88a0e668c9802f9aa7f30d5b1139d8c87a4e9bfb222dbe84e769d18df11b96" Oct 07 13:30:00 crc kubenswrapper[4854]: E1007 13:30:00.703512 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:30:00 crc kubenswrapper[4854]: I1007 13:30:00.817439 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330730-hcsm4"] Oct 07 13:30:01 crc kubenswrapper[4854]: I1007 13:30:01.558198 4854 generic.go:334] "Generic (PLEG): container finished" podID="dd6d7fce-8018-4050-94e3-efbff9794019" containerID="d5f12edd93de68ea6e78de6731e614869c4c7f57e32f0fbe91101f79340683ec" exitCode=0 Oct 07 13:30:01 crc kubenswrapper[4854]: I1007 13:30:01.558364 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330730-hcsm4" event={"ID":"dd6d7fce-8018-4050-94e3-efbff9794019","Type":"ContainerDied","Data":"d5f12edd93de68ea6e78de6731e614869c4c7f57e32f0fbe91101f79340683ec"} Oct 07 13:30:01 crc kubenswrapper[4854]: I1007 13:30:01.558729 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330730-hcsm4" event={"ID":"dd6d7fce-8018-4050-94e3-efbff9794019","Type":"ContainerStarted","Data":"426fc4335db5e68094887ac159a3172f1cc2fb7e4b5abc2c634c96f6291eab61"} Oct 07 13:30:02 crc kubenswrapper[4854]: I1007 13:30:02.896701 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330730-hcsm4" Oct 07 13:30:03 crc kubenswrapper[4854]: I1007 13:30:03.036175 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dd6d7fce-8018-4050-94e3-efbff9794019-config-volume\") pod \"dd6d7fce-8018-4050-94e3-efbff9794019\" (UID: \"dd6d7fce-8018-4050-94e3-efbff9794019\") " Oct 07 13:30:03 crc kubenswrapper[4854]: I1007 13:30:03.036769 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dd6d7fce-8018-4050-94e3-efbff9794019-secret-volume\") pod \"dd6d7fce-8018-4050-94e3-efbff9794019\" (UID: \"dd6d7fce-8018-4050-94e3-efbff9794019\") " Oct 07 13:30:03 crc kubenswrapper[4854]: I1007 13:30:03.036932 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nllcg\" (UniqueName: \"kubernetes.io/projected/dd6d7fce-8018-4050-94e3-efbff9794019-kube-api-access-nllcg\") pod \"dd6d7fce-8018-4050-94e3-efbff9794019\" (UID: \"dd6d7fce-8018-4050-94e3-efbff9794019\") " Oct 07 13:30:03 crc kubenswrapper[4854]: I1007 13:30:03.037196 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd6d7fce-8018-4050-94e3-efbff9794019-config-volume" (OuterVolumeSpecName: "config-volume") pod "dd6d7fce-8018-4050-94e3-efbff9794019" (UID: "dd6d7fce-8018-4050-94e3-efbff9794019"). 
InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:30:03 crc kubenswrapper[4854]: I1007 13:30:03.038187 4854 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dd6d7fce-8018-4050-94e3-efbff9794019-config-volume\") on node \"crc\" DevicePath \"\"" Oct 07 13:30:03 crc kubenswrapper[4854]: I1007 13:30:03.045822 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd6d7fce-8018-4050-94e3-efbff9794019-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "dd6d7fce-8018-4050-94e3-efbff9794019" (UID: "dd6d7fce-8018-4050-94e3-efbff9794019"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:30:03 crc kubenswrapper[4854]: I1007 13:30:03.047960 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd6d7fce-8018-4050-94e3-efbff9794019-kube-api-access-nllcg" (OuterVolumeSpecName: "kube-api-access-nllcg") pod "dd6d7fce-8018-4050-94e3-efbff9794019" (UID: "dd6d7fce-8018-4050-94e3-efbff9794019"). InnerVolumeSpecName "kube-api-access-nllcg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:30:03 crc kubenswrapper[4854]: I1007 13:30:03.140879 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nllcg\" (UniqueName: \"kubernetes.io/projected/dd6d7fce-8018-4050-94e3-efbff9794019-kube-api-access-nllcg\") on node \"crc\" DevicePath \"\"" Oct 07 13:30:03 crc kubenswrapper[4854]: I1007 13:30:03.140957 4854 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dd6d7fce-8018-4050-94e3-efbff9794019-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 07 13:30:03 crc kubenswrapper[4854]: I1007 13:30:03.584383 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330730-hcsm4" event={"ID":"dd6d7fce-8018-4050-94e3-efbff9794019","Type":"ContainerDied","Data":"426fc4335db5e68094887ac159a3172f1cc2fb7e4b5abc2c634c96f6291eab61"} Oct 07 13:30:03 crc kubenswrapper[4854]: I1007 13:30:03.584432 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="426fc4335db5e68094887ac159a3172f1cc2fb7e4b5abc2c634c96f6291eab61" Oct 07 13:30:03 crc kubenswrapper[4854]: I1007 13:30:03.584490 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330730-hcsm4" Oct 07 13:30:04 crc kubenswrapper[4854]: I1007 13:30:04.002779 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330685-qwz9s"] Oct 07 13:30:04 crc kubenswrapper[4854]: I1007 13:30:04.007760 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330685-qwz9s"] Oct 07 13:30:04 crc kubenswrapper[4854]: I1007 13:30:04.722099 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1779af00-05df-4865-b313-cd038772c19f" path="/var/lib/kubelet/pods/1779af00-05df-4865-b313-cd038772c19f/volumes" Oct 07 13:30:12 crc kubenswrapper[4854]: I1007 13:30:12.703901 4854 scope.go:117] "RemoveContainer" containerID="2a88a0e668c9802f9aa7f30d5b1139d8c87a4e9bfb222dbe84e769d18df11b96" Oct 07 13:30:12 crc kubenswrapper[4854]: E1007 13:30:12.705712 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:30:20 crc kubenswrapper[4854]: I1007 13:30:20.299259 4854 scope.go:117] "RemoveContainer" containerID="acb66eb6bd6eb8076f9dddda371d67eb437edfd33d9a97a46ccdb5edca43390f" Oct 07 13:30:25 crc kubenswrapper[4854]: I1007 13:30:25.703270 4854 scope.go:117] "RemoveContainer" containerID="2a88a0e668c9802f9aa7f30d5b1139d8c87a4e9bfb222dbe84e769d18df11b96" Oct 07 13:30:25 crc kubenswrapper[4854]: E1007 13:30:25.704242 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:30:38 crc kubenswrapper[4854]: I1007 13:30:38.702786 4854 scope.go:117] "RemoveContainer" containerID="2a88a0e668c9802f9aa7f30d5b1139d8c87a4e9bfb222dbe84e769d18df11b96" Oct 07 13:30:38 crc kubenswrapper[4854]: E1007 13:30:38.704943 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:30:50 crc kubenswrapper[4854]: I1007 13:30:50.702468 4854 scope.go:117] "RemoveContainer" containerID="2a88a0e668c9802f9aa7f30d5b1139d8c87a4e9bfb222dbe84e769d18df11b96" Oct 07 13:30:50 crc kubenswrapper[4854]: E1007 13:30:50.703261 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:31:02 crc kubenswrapper[4854]: I1007 13:31:02.703040 4854 scope.go:117] "RemoveContainer" containerID="2a88a0e668c9802f9aa7f30d5b1139d8c87a4e9bfb222dbe84e769d18df11b96" Oct 07 13:31:02 crc kubenswrapper[4854]: E1007 13:31:02.704093 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:31:06 crc kubenswrapper[4854]: I1007 13:31:06.546895 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-m2p75"] Oct 07 13:31:06 crc kubenswrapper[4854]: E1007 13:31:06.547923 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd6d7fce-8018-4050-94e3-efbff9794019" containerName="collect-profiles" Oct 07 13:31:06 crc kubenswrapper[4854]: I1007 13:31:06.547960 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd6d7fce-8018-4050-94e3-efbff9794019" containerName="collect-profiles" Oct 07 13:31:06 crc kubenswrapper[4854]: I1007 13:31:06.548343 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd6d7fce-8018-4050-94e3-efbff9794019" containerName="collect-profiles" Oct 07 13:31:06 crc kubenswrapper[4854]: I1007 13:31:06.550685 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m2p75" Oct 07 13:31:06 crc kubenswrapper[4854]: I1007 13:31:06.570383 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m2p75"] Oct 07 13:31:06 crc kubenswrapper[4854]: I1007 13:31:06.702026 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eac6041b-46b0-49ab-b2b1-533a589d9a30-utilities\") pod \"community-operators-m2p75\" (UID: \"eac6041b-46b0-49ab-b2b1-533a589d9a30\") " pod="openshift-marketplace/community-operators-m2p75" Oct 07 13:31:06 crc kubenswrapper[4854]: I1007 13:31:06.702476 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr5wz\" (UniqueName: \"kubernetes.io/projected/eac6041b-46b0-49ab-b2b1-533a589d9a30-kube-api-access-hr5wz\") pod \"community-operators-m2p75\" (UID: \"eac6041b-46b0-49ab-b2b1-533a589d9a30\") " pod="openshift-marketplace/community-operators-m2p75" Oct 07 13:31:06 crc kubenswrapper[4854]: I1007 13:31:06.702579 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eac6041b-46b0-49ab-b2b1-533a589d9a30-catalog-content\") pod \"community-operators-m2p75\" (UID: \"eac6041b-46b0-49ab-b2b1-533a589d9a30\") " pod="openshift-marketplace/community-operators-m2p75" Oct 07 13:31:06 crc kubenswrapper[4854]: I1007 13:31:06.804204 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hr5wz\" (UniqueName: \"kubernetes.io/projected/eac6041b-46b0-49ab-b2b1-533a589d9a30-kube-api-access-hr5wz\") pod \"community-operators-m2p75\" (UID: \"eac6041b-46b0-49ab-b2b1-533a589d9a30\") " 
pod="openshift-marketplace/community-operators-m2p75" Oct 07 13:31:06 crc kubenswrapper[4854]: I1007 13:31:06.804293 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eac6041b-46b0-49ab-b2b1-533a589d9a30-catalog-content\") pod \"community-operators-m2p75\" (UID: \"eac6041b-46b0-49ab-b2b1-533a589d9a30\") " pod="openshift-marketplace/community-operators-m2p75" Oct 07 13:31:06 crc kubenswrapper[4854]: I1007 13:31:06.804342 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eac6041b-46b0-49ab-b2b1-533a589d9a30-utilities\") pod \"community-operators-m2p75\" (UID: \"eac6041b-46b0-49ab-b2b1-533a589d9a30\") " pod="openshift-marketplace/community-operators-m2p75" Oct 07 13:31:06 crc kubenswrapper[4854]: I1007 13:31:06.805116 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eac6041b-46b0-49ab-b2b1-533a589d9a30-utilities\") pod \"community-operators-m2p75\" (UID: \"eac6041b-46b0-49ab-b2b1-533a589d9a30\") " pod="openshift-marketplace/community-operators-m2p75" Oct 07 13:31:06 crc kubenswrapper[4854]: I1007 13:31:06.805503 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eac6041b-46b0-49ab-b2b1-533a589d9a30-catalog-content\") pod \"community-operators-m2p75\" (UID: \"eac6041b-46b0-49ab-b2b1-533a589d9a30\") " pod="openshift-marketplace/community-operators-m2p75" Oct 07 13:31:06 crc kubenswrapper[4854]: I1007 13:31:06.829744 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hr5wz\" (UniqueName: \"kubernetes.io/projected/eac6041b-46b0-49ab-b2b1-533a589d9a30-kube-api-access-hr5wz\") pod \"community-operators-m2p75\" (UID: \"eac6041b-46b0-49ab-b2b1-533a589d9a30\") " pod="openshift-marketplace/community-operators-m2p75" Oct 07 13:31:06 crc kubenswrapper[4854]: I1007 13:31:06.900867 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-m2p75" Oct 07 13:31:07 crc kubenswrapper[4854]: I1007 13:31:07.236844 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m2p75"] Oct 07 13:31:08 crc kubenswrapper[4854]: I1007 13:31:08.183898 4854 generic.go:334] "Generic (PLEG): container finished" podID="eac6041b-46b0-49ab-b2b1-533a589d9a30" containerID="5378bdb7ecd097dcb770453b1f82db40e18741c6f4424eeca00b0850c901e0fb" exitCode=0 Oct 07 13:31:08 crc kubenswrapper[4854]: I1007 13:31:08.184055 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m2p75" event={"ID":"eac6041b-46b0-49ab-b2b1-533a589d9a30","Type":"ContainerDied","Data":"5378bdb7ecd097dcb770453b1f82db40e18741c6f4424eeca00b0850c901e0fb"} Oct 07 13:31:08 crc kubenswrapper[4854]: I1007 13:31:08.184322 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m2p75" event={"ID":"eac6041b-46b0-49ab-b2b1-533a589d9a30","Type":"ContainerStarted","Data":"d7c85fc6edba5e62e47c21e22caa50de330cf388af796a59e0f5643734a5e8c5"} Oct 07 13:31:09 crc kubenswrapper[4854]: I1007 13:31:09.198541 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m2p75" event={"ID":"eac6041b-46b0-49ab-b2b1-533a589d9a30","Type":"ContainerStarted","Data":"2831aea8483f3017eb7c7a45f856ef27dde2272f04cc5fa55123064a8404241b"} Oct 07 13:31:10 crc kubenswrapper[4854]: I1007 13:31:10.209977 4854 generic.go:334] "Generic (PLEG): container finished" podID="eac6041b-46b0-49ab-b2b1-533a589d9a30" containerID="2831aea8483f3017eb7c7a45f856ef27dde2272f04cc5fa55123064a8404241b" exitCode=0 Oct 07 13:31:10 crc kubenswrapper[4854]: I1007 13:31:10.210087 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m2p75" event={"ID":"eac6041b-46b0-49ab-b2b1-533a589d9a30","Type":"ContainerDied","Data":"2831aea8483f3017eb7c7a45f856ef27dde2272f04cc5fa55123064a8404241b"} Oct 07 13:31:11 crc kubenswrapper[4854]: I1007 13:31:11.221116 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m2p75" event={"ID":"eac6041b-46b0-49ab-b2b1-533a589d9a30","Type":"ContainerStarted","Data":"a730b24caac5abbca074bcdd6aae30425dd28e9a6f681f79851907af453fd315"} Oct 07 13:31:11 crc kubenswrapper[4854]: I1007 13:31:11.242280 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-m2p75" podStartSLOduration=2.780090602 podStartE2EDuration="5.242256657s" podCreationTimestamp="2025-10-07 13:31:06 +0000 UTC" firstStartedPulling="2025-10-07 13:31:08.186144445 +0000 UTC m=+3984.173976730" lastFinishedPulling="2025-10-07 13:31:10.64831052 +0000 UTC m=+3986.636142785" observedRunningTime="2025-10-07 13:31:11.240592109 +0000 UTC m=+3987.228424394" watchObservedRunningTime="2025-10-07 13:31:11.242256657 +0000 UTC m=+3987.230088932" Oct 07 13:31:16 crc kubenswrapper[4854]: I1007 13:31:16.703725 4854 scope.go:117] "RemoveContainer" containerID="2a88a0e668c9802f9aa7f30d5b1139d8c87a4e9bfb222dbe84e769d18df11b96" Oct 07 13:31:16 crc kubenswrapper[4854]: E1007 13:31:16.704949 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:31:16 crc kubenswrapper[4854]: I1007 13:31:16.901417 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-m2p75" Oct 07 13:31:16 crc kubenswrapper[4854]: I1007 13:31:16.901499 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-m2p75" Oct 07 13:31:16 crc kubenswrapper[4854]: I1007 13:31:16.978544 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-m2p75" Oct 07 13:31:17 crc kubenswrapper[4854]: I1007 13:31:17.343737 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-m2p75" Oct 07 13:31:17 crc kubenswrapper[4854]: I1007 13:31:17.397501 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m2p75"] Oct 07 13:31:19 crc kubenswrapper[4854]: I1007 13:31:19.301279 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-m2p75" podUID="eac6041b-46b0-49ab-b2b1-533a589d9a30" containerName="registry-server" containerID="cri-o://a730b24caac5abbca074bcdd6aae30425dd28e9a6f681f79851907af453fd315" gracePeriod=2 Oct 07 13:31:19 crc kubenswrapper[4854]: I1007 13:31:19.901481 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m2p75" Oct 07 13:31:20 crc kubenswrapper[4854]: I1007 13:31:20.014453 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eac6041b-46b0-49ab-b2b1-533a589d9a30-catalog-content\") pod \"eac6041b-46b0-49ab-b2b1-533a589d9a30\" (UID: \"eac6041b-46b0-49ab-b2b1-533a589d9a30\") " Oct 07 13:31:20 crc kubenswrapper[4854]: I1007 13:31:20.014826 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eac6041b-46b0-49ab-b2b1-533a589d9a30-utilities\") pod \"eac6041b-46b0-49ab-b2b1-533a589d9a30\" (UID: \"eac6041b-46b0-49ab-b2b1-533a589d9a30\") " Oct 07 13:31:20 crc kubenswrapper[4854]: I1007 13:31:20.014903 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hr5wz\" (UniqueName: \"kubernetes.io/projected/eac6041b-46b0-49ab-b2b1-533a589d9a30-kube-api-access-hr5wz\") pod \"eac6041b-46b0-49ab-b2b1-533a589d9a30\" (UID: \"eac6041b-46b0-49ab-b2b1-533a589d9a30\") " Oct 07 13:31:20 crc kubenswrapper[4854]: I1007 13:31:20.016370 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eac6041b-46b0-49ab-b2b1-533a589d9a30-utilities" (OuterVolumeSpecName: "utilities") pod "eac6041b-46b0-49ab-b2b1-533a589d9a30" (UID: "eac6041b-46b0-49ab-b2b1-533a589d9a30"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:31:20 crc kubenswrapper[4854]: I1007 13:31:20.020595 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eac6041b-46b0-49ab-b2b1-533a589d9a30-kube-api-access-hr5wz" (OuterVolumeSpecName: "kube-api-access-hr5wz") pod "eac6041b-46b0-49ab-b2b1-533a589d9a30" (UID: "eac6041b-46b0-49ab-b2b1-533a589d9a30"). InnerVolumeSpecName "kube-api-access-hr5wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:31:20 crc kubenswrapper[4854]: I1007 13:31:20.061109 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eac6041b-46b0-49ab-b2b1-533a589d9a30-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eac6041b-46b0-49ab-b2b1-533a589d9a30" (UID: "eac6041b-46b0-49ab-b2b1-533a589d9a30"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:31:20 crc kubenswrapper[4854]: I1007 13:31:20.116085 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eac6041b-46b0-49ab-b2b1-533a589d9a30-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:31:20 crc kubenswrapper[4854]: I1007 13:31:20.116125 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hr5wz\" (UniqueName: \"kubernetes.io/projected/eac6041b-46b0-49ab-b2b1-533a589d9a30-kube-api-access-hr5wz\") on node \"crc\" DevicePath \"\"" Oct 07 13:31:20 crc kubenswrapper[4854]: I1007 13:31:20.116139 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eac6041b-46b0-49ab-b2b1-533a589d9a30-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:31:20 crc kubenswrapper[4854]: I1007 13:31:20.313344 4854 generic.go:334] "Generic (PLEG): container finished" podID="eac6041b-46b0-49ab-b2b1-533a589d9a30" containerID="a730b24caac5abbca074bcdd6aae30425dd28e9a6f681f79851907af453fd315" exitCode=0 Oct 07 13:31:20 crc kubenswrapper[4854]: I1007 13:31:20.313405 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-m2p75" Oct 07 13:31:20 crc kubenswrapper[4854]: I1007 13:31:20.313423 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m2p75" event={"ID":"eac6041b-46b0-49ab-b2b1-533a589d9a30","Type":"ContainerDied","Data":"a730b24caac5abbca074bcdd6aae30425dd28e9a6f681f79851907af453fd315"} Oct 07 13:31:20 crc kubenswrapper[4854]: I1007 13:31:20.313478 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m2p75" event={"ID":"eac6041b-46b0-49ab-b2b1-533a589d9a30","Type":"ContainerDied","Data":"d7c85fc6edba5e62e47c21e22caa50de330cf388af796a59e0f5643734a5e8c5"} Oct 07 13:31:20 crc kubenswrapper[4854]: I1007 13:31:20.313506 4854 scope.go:117] "RemoveContainer" containerID="a730b24caac5abbca074bcdd6aae30425dd28e9a6f681f79851907af453fd315" Oct 07 13:31:20 crc kubenswrapper[4854]: I1007 13:31:20.345896 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m2p75"] Oct 07 13:31:20 crc kubenswrapper[4854]: I1007 13:31:20.346119 4854 scope.go:117] "RemoveContainer" containerID="2831aea8483f3017eb7c7a45f856ef27dde2272f04cc5fa55123064a8404241b" Oct 07 13:31:20 crc kubenswrapper[4854]: I1007 13:31:20.354068 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-m2p75"] Oct 07 13:31:20 crc kubenswrapper[4854]: I1007 13:31:20.381653 4854 scope.go:117] "RemoveContainer" containerID="5378bdb7ecd097dcb770453b1f82db40e18741c6f4424eeca00b0850c901e0fb" Oct 07 13:31:20 crc kubenswrapper[4854]: I1007 13:31:20.403705 4854 scope.go:117] "RemoveContainer" containerID="a730b24caac5abbca074bcdd6aae30425dd28e9a6f681f79851907af453fd315" Oct 07 13:31:20 crc kubenswrapper[4854]: E1007 13:31:20.404198 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a730b24caac5abbca074bcdd6aae30425dd28e9a6f681f79851907af453fd315\": container with ID starting with a730b24caac5abbca074bcdd6aae30425dd28e9a6f681f79851907af453fd315 not found: ID does not exist" containerID="a730b24caac5abbca074bcdd6aae30425dd28e9a6f681f79851907af453fd315" Oct 07 13:31:20 crc kubenswrapper[4854]: I1007 13:31:20.404238 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a730b24caac5abbca074bcdd6aae30425dd28e9a6f681f79851907af453fd315"} err="failed to get container status \"a730b24caac5abbca074bcdd6aae30425dd28e9a6f681f79851907af453fd315\": rpc error: code = NotFound desc = could not find container \"a730b24caac5abbca074bcdd6aae30425dd28e9a6f681f79851907af453fd315\": container with ID starting with a730b24caac5abbca074bcdd6aae30425dd28e9a6f681f79851907af453fd315 not found: ID does not exist" Oct 07 13:31:20 crc kubenswrapper[4854]: I1007 13:31:20.404265 4854 scope.go:117] "RemoveContainer" containerID="2831aea8483f3017eb7c7a45f856ef27dde2272f04cc5fa55123064a8404241b" Oct 07 13:31:20 crc kubenswrapper[4854]: E1007 13:31:20.404566 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2831aea8483f3017eb7c7a45f856ef27dde2272f04cc5fa55123064a8404241b\": container with ID starting with 2831aea8483f3017eb7c7a45f856ef27dde2272f04cc5fa55123064a8404241b not found: ID does not exist" containerID="2831aea8483f3017eb7c7a45f856ef27dde2272f04cc5fa55123064a8404241b" Oct 07 13:31:20 crc kubenswrapper[4854]: I1007 13:31:20.404628 4854 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2831aea8483f3017eb7c7a45f856ef27dde2272f04cc5fa55123064a8404241b"} err="failed to get container status \"2831aea8483f3017eb7c7a45f856ef27dde2272f04cc5fa55123064a8404241b\": rpc error: code = NotFound desc = could not find container \"2831aea8483f3017eb7c7a45f856ef27dde2272f04cc5fa55123064a8404241b\": container with ID starting with 2831aea8483f3017eb7c7a45f856ef27dde2272f04cc5fa55123064a8404241b not found: ID does not exist" Oct 07 13:31:20 crc kubenswrapper[4854]: I1007 13:31:20.404673 4854 scope.go:117] "RemoveContainer" containerID="5378bdb7ecd097dcb770453b1f82db40e18741c6f4424eeca00b0850c901e0fb" Oct 07 13:31:20 crc kubenswrapper[4854]: E1007 13:31:20.405428 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5378bdb7ecd097dcb770453b1f82db40e18741c6f4424eeca00b0850c901e0fb\": container with ID starting with 5378bdb7ecd097dcb770453b1f82db40e18741c6f4424eeca00b0850c901e0fb not found: ID does not exist" containerID="5378bdb7ecd097dcb770453b1f82db40e18741c6f4424eeca00b0850c901e0fb" Oct 07 13:31:20 crc kubenswrapper[4854]: I1007 13:31:20.405457 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5378bdb7ecd097dcb770453b1f82db40e18741c6f4424eeca00b0850c901e0fb"} err="failed to get container status \"5378bdb7ecd097dcb770453b1f82db40e18741c6f4424eeca00b0850c901e0fb\": rpc error: code = NotFound desc = could not find container \"5378bdb7ecd097dcb770453b1f82db40e18741c6f4424eeca00b0850c901e0fb\": container with ID starting with 5378bdb7ecd097dcb770453b1f82db40e18741c6f4424eeca00b0850c901e0fb not found: ID does not exist" Oct 07 13:31:20 crc kubenswrapper[4854]: I1007 13:31:20.717410 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eac6041b-46b0-49ab-b2b1-533a589d9a30" path="/var/lib/kubelet/pods/eac6041b-46b0-49ab-b2b1-533a589d9a30/volumes" Oct 07 13:31:29 crc kubenswrapper[4854]: I1007 13:31:29.703368 4854 scope.go:117] "RemoveContainer" containerID="2a88a0e668c9802f9aa7f30d5b1139d8c87a4e9bfb222dbe84e769d18df11b96" Oct 07 13:31:29 crc kubenswrapper[4854]: E1007 13:31:29.704470 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:31:41 crc kubenswrapper[4854]: I1007 13:31:41.703075 4854 scope.go:117] "RemoveContainer" containerID="2a88a0e668c9802f9aa7f30d5b1139d8c87a4e9bfb222dbe84e769d18df11b96" Oct 07 13:31:41 crc kubenswrapper[4854]: E1007 13:31:41.704055 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:31:53 crc kubenswrapper[4854]: I1007 13:31:53.702865 4854 scope.go:117] "RemoveContainer" containerID="2a88a0e668c9802f9aa7f30d5b1139d8c87a4e9bfb222dbe84e769d18df11b96" Oct 07 13:31:53 crc 
kubenswrapper[4854]: E1007 13:31:53.703952 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:32:08 crc kubenswrapper[4854]: I1007 13:32:08.703123 4854 scope.go:117] "RemoveContainer" containerID="2a88a0e668c9802f9aa7f30d5b1139d8c87a4e9bfb222dbe84e769d18df11b96" Oct 07 13:32:08 crc kubenswrapper[4854]: E1007 13:32:08.703936 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:32:22 crc kubenswrapper[4854]: I1007 13:32:22.703708 4854 scope.go:117] "RemoveContainer" containerID="2a88a0e668c9802f9aa7f30d5b1139d8c87a4e9bfb222dbe84e769d18df11b96" Oct 07 13:32:22 crc kubenswrapper[4854]: E1007 13:32:22.705183 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:32:33 crc kubenswrapper[4854]: I1007 13:32:33.702968 4854 scope.go:117] "RemoveContainer" containerID="2a88a0e668c9802f9aa7f30d5b1139d8c87a4e9bfb222dbe84e769d18df11b96" Oct 07 13:32:33 crc kubenswrapper[4854]: E1007 13:32:33.703700 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:32:47 crc kubenswrapper[4854]: I1007 13:32:47.702800 4854 scope.go:117] "RemoveContainer" containerID="2a88a0e668c9802f9aa7f30d5b1139d8c87a4e9bfb222dbe84e769d18df11b96" Oct 07 13:32:47 crc kubenswrapper[4854]: E1007 13:32:47.703926 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:32:58 crc kubenswrapper[4854]: I1007 13:32:58.704476 4854 scope.go:117] "RemoveContainer" containerID="2a88a0e668c9802f9aa7f30d5b1139d8c87a4e9bfb222dbe84e769d18df11b96" Oct 07 13:32:58 crc kubenswrapper[4854]: E1007 13:32:58.705743 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:33:09 crc kubenswrapper[4854]: I1007 13:33:09.702570 4854 scope.go:117] "RemoveContainer" containerID="2a88a0e668c9802f9aa7f30d5b1139d8c87a4e9bfb222dbe84e769d18df11b96" Oct 07 13:33:09 crc kubenswrapper[4854]: E1007 13:33:09.703207 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:33:15 crc kubenswrapper[4854]: I1007 13:33:15.437703 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wdrrf"] Oct 07 13:33:15 crc kubenswrapper[4854]: E1007 13:33:15.438946 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eac6041b-46b0-49ab-b2b1-533a589d9a30" containerName="extract-utilities" Oct 07 13:33:15 crc kubenswrapper[4854]: I1007 13:33:15.438989 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="eac6041b-46b0-49ab-b2b1-533a589d9a30" containerName="extract-utilities" Oct 07 13:33:15 crc kubenswrapper[4854]: E1007 13:33:15.439023 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eac6041b-46b0-49ab-b2b1-533a589d9a30" containerName="registry-server" Oct 07 13:33:15 crc kubenswrapper[4854]: I1007 13:33:15.439034 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="eac6041b-46b0-49ab-b2b1-533a589d9a30" containerName="registry-server" Oct 07 13:33:15 crc kubenswrapper[4854]: E1007 13:33:15.439065 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eac6041b-46b0-49ab-b2b1-533a589d9a30" containerName="extract-content" Oct 07 13:33:15 crc kubenswrapper[4854]: I1007 13:33:15.439075 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="eac6041b-46b0-49ab-b2b1-533a589d9a30" containerName="extract-content" Oct 07 13:33:15 crc kubenswrapper[4854]: I1007 13:33:15.439317 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="eac6041b-46b0-49ab-b2b1-533a589d9a30" containerName="registry-server" Oct 07 13:33:15 crc kubenswrapper[4854]: I1007 13:33:15.440938 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wdrrf" Oct 07 13:33:15 crc kubenswrapper[4854]: I1007 13:33:15.459451 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wdrrf"] Oct 07 13:33:15 crc kubenswrapper[4854]: I1007 13:33:15.470972 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bhfd\" (UniqueName: \"kubernetes.io/projected/643b95db-26ab-40da-8dfe-6dec077cc641-kube-api-access-7bhfd\") pod \"redhat-operators-wdrrf\" (UID: \"643b95db-26ab-40da-8dfe-6dec077cc641\") " pod="openshift-marketplace/redhat-operators-wdrrf" Oct 07 13:33:15 crc kubenswrapper[4854]: I1007 13:33:15.471124 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/643b95db-26ab-40da-8dfe-6dec077cc641-utilities\") pod \"redhat-operators-wdrrf\" (UID: \"643b95db-26ab-40da-8dfe-6dec077cc641\") " pod="openshift-marketplace/redhat-operators-wdrrf" Oct 07 13:33:15 crc kubenswrapper[4854]: I1007 13:33:15.471233 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/643b95db-26ab-40da-8dfe-6dec077cc641-catalog-content\") pod \"redhat-operators-wdrrf\" (UID: \"643b95db-26ab-40da-8dfe-6dec077cc641\") " pod="openshift-marketplace/redhat-operators-wdrrf" Oct 07 13:33:15 crc kubenswrapper[4854]: I1007 13:33:15.572247 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bhfd\" (UniqueName: \"kubernetes.io/projected/643b95db-26ab-40da-8dfe-6dec077cc641-kube-api-access-7bhfd\") pod \"redhat-operators-wdrrf\" (UID: \"643b95db-26ab-40da-8dfe-6dec077cc641\") " pod="openshift-marketplace/redhat-operators-wdrrf" Oct 07 13:33:15 crc kubenswrapper[4854]: I1007 13:33:15.572362 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/643b95db-26ab-40da-8dfe-6dec077cc641-utilities\") pod \"redhat-operators-wdrrf\" (UID: \"643b95db-26ab-40da-8dfe-6dec077cc641\") " pod="openshift-marketplace/redhat-operators-wdrrf" Oct 07 13:33:15 crc kubenswrapper[4854]: I1007 13:33:15.572402 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/643b95db-26ab-40da-8dfe-6dec077cc641-catalog-content\") pod \"redhat-operators-wdrrf\" (UID: \"643b95db-26ab-40da-8dfe-6dec077cc641\") " pod="openshift-marketplace/redhat-operators-wdrrf" Oct 07 13:33:15 crc kubenswrapper[4854]: I1007 13:33:15.573105 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/643b95db-26ab-40da-8dfe-6dec077cc641-catalog-content\") pod \"redhat-operators-wdrrf\" (UID: \"643b95db-26ab-40da-8dfe-6dec077cc641\") " pod="openshift-marketplace/redhat-operators-wdrrf" Oct 07 13:33:15 crc kubenswrapper[4854]: I1007 13:33:15.573202 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/643b95db-26ab-40da-8dfe-6dec077cc641-utilities\") pod \"redhat-operators-wdrrf\" (UID: \"643b95db-26ab-40da-8dfe-6dec077cc641\") " pod="openshift-marketplace/redhat-operators-wdrrf" Oct 07 13:33:15 crc kubenswrapper[4854]: I1007 13:33:15.596395 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-7bhfd\" (UniqueName: \"kubernetes.io/projected/643b95db-26ab-40da-8dfe-6dec077cc641-kube-api-access-7bhfd\") pod \"redhat-operators-wdrrf\" (UID: \"643b95db-26ab-40da-8dfe-6dec077cc641\") " pod="openshift-marketplace/redhat-operators-wdrrf" Oct 07 13:33:15 crc kubenswrapper[4854]: I1007 13:33:15.772189 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wdrrf" Oct 07 13:33:16 crc kubenswrapper[4854]: I1007 13:33:16.250602 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wdrrf"] Oct 07 13:33:16 crc kubenswrapper[4854]: I1007 13:33:16.427169 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wdrrf" event={"ID":"643b95db-26ab-40da-8dfe-6dec077cc641","Type":"ContainerStarted","Data":"5660f882d0868bab81961d6fcc442ee46cc6635d89ee97a9ac3fa9ef2318618c"} Oct 07 13:33:17 crc kubenswrapper[4854]: I1007 13:33:17.437123 4854 generic.go:334] "Generic (PLEG): container finished" podID="643b95db-26ab-40da-8dfe-6dec077cc641" containerID="70a72680f31fbefef9249a39b25e21e835db92f230d6d8ee369a17270a61a343" exitCode=0 Oct 07 13:33:17 crc kubenswrapper[4854]: I1007 13:33:17.437364 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wdrrf" event={"ID":"643b95db-26ab-40da-8dfe-6dec077cc641","Type":"ContainerDied","Data":"70a72680f31fbefef9249a39b25e21e835db92f230d6d8ee369a17270a61a343"} Oct 07 13:33:17 crc kubenswrapper[4854]: I1007 13:33:17.440223 4854 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 13:33:19 crc kubenswrapper[4854]: I1007 13:33:19.457848 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wdrrf" event={"ID":"643b95db-26ab-40da-8dfe-6dec077cc641","Type":"ContainerStarted","Data":"b67467147ce495c7d53f2c6ef9619a98e495b1448e9a18fecdc08197ce984186"} Oct 07 13:33:20 crc kubenswrapper[4854]: I1007 13:33:20.473480 4854 generic.go:334] "Generic (PLEG): container finished" podID="643b95db-26ab-40da-8dfe-6dec077cc641" containerID="b67467147ce495c7d53f2c6ef9619a98e495b1448e9a18fecdc08197ce984186" exitCode=0 Oct 07 13:33:20 crc kubenswrapper[4854]: I1007 13:33:20.473532 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wdrrf" event={"ID":"643b95db-26ab-40da-8dfe-6dec077cc641","Type":"ContainerDied","Data":"b67467147ce495c7d53f2c6ef9619a98e495b1448e9a18fecdc08197ce984186"} Oct 07 13:33:22 crc kubenswrapper[4854]: I1007 13:33:22.497778 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wdrrf" event={"ID":"643b95db-26ab-40da-8dfe-6dec077cc641","Type":"ContainerStarted","Data":"c0a0dd98657cdd3dc89a0618b27b9d35b7e7e7ad01d0b2d633890ad6c0583b64"} Oct 07 13:33:22 crc kubenswrapper[4854]: I1007 13:33:22.543317 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wdrrf" podStartSLOduration=3.6302780329999997 podStartE2EDuration="7.543298702s" podCreationTimestamp="2025-10-07 13:33:15 +0000 UTC" firstStartedPulling="2025-10-07 13:33:17.439773025 +0000 UTC m=+4113.427605320" lastFinishedPulling="2025-10-07 13:33:21.352793714 +0000 UTC m=+4117.340625989" observedRunningTime="2025-10-07 13:33:22.538892925 +0000 UTC m=+4118.526725180" watchObservedRunningTime="2025-10-07 13:33:22.543298702 +0000 UTC m=+4118.531130957" Oct 07 13:33:23 crc 
kubenswrapper[4854]: I1007 13:33:23.702916 4854 scope.go:117] "RemoveContainer" containerID="2a88a0e668c9802f9aa7f30d5b1139d8c87a4e9bfb222dbe84e769d18df11b96" Oct 07 13:33:23 crc kubenswrapper[4854]: E1007 13:33:23.703309 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:33:25 crc kubenswrapper[4854]: I1007 13:33:25.773938 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wdrrf" Oct 07 13:33:25 crc kubenswrapper[4854]: I1007 13:33:25.774289 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wdrrf" Oct 07 13:33:26 crc kubenswrapper[4854]: I1007 13:33:26.818010 4854 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wdrrf" podUID="643b95db-26ab-40da-8dfe-6dec077cc641" containerName="registry-server" probeResult="failure" output=< Oct 07 13:33:26 crc kubenswrapper[4854]: timeout: failed to connect service ":50051" within 1s Oct 07 13:33:26 crc kubenswrapper[4854]: > Oct 07 13:33:35 crc kubenswrapper[4854]: I1007 13:33:35.702823 4854 scope.go:117] "RemoveContainer" containerID="2a88a0e668c9802f9aa7f30d5b1139d8c87a4e9bfb222dbe84e769d18df11b96" Oct 07 13:33:35 crc kubenswrapper[4854]: E1007 13:33:35.703833 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:33:35 crc kubenswrapper[4854]: I1007 13:33:35.853966 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wdrrf" Oct 07 13:33:35 crc kubenswrapper[4854]: I1007 13:33:35.931182 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wdrrf" Oct 07 13:33:36 crc kubenswrapper[4854]: I1007 13:33:36.101628 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wdrrf"] Oct 07 13:33:37 crc kubenswrapper[4854]: I1007 13:33:37.652231 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wdrrf" podUID="643b95db-26ab-40da-8dfe-6dec077cc641" containerName="registry-server" containerID="cri-o://c0a0dd98657cdd3dc89a0618b27b9d35b7e7e7ad01d0b2d633890ad6c0583b64" gracePeriod=2 Oct 07 13:33:38 crc kubenswrapper[4854]: I1007 13:33:38.665595 4854 generic.go:334] "Generic (PLEG): container finished" podID="643b95db-26ab-40da-8dfe-6dec077cc641" containerID="c0a0dd98657cdd3dc89a0618b27b9d35b7e7e7ad01d0b2d633890ad6c0583b64" exitCode=0 Oct 07 13:33:38 crc kubenswrapper[4854]: I1007 13:33:38.666077 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wdrrf" 
event={"ID":"643b95db-26ab-40da-8dfe-6dec077cc641","Type":"ContainerDied","Data":"c0a0dd98657cdd3dc89a0618b27b9d35b7e7e7ad01d0b2d633890ad6c0583b64"} Oct 07 13:33:38 crc kubenswrapper[4854]: I1007 13:33:38.817508 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wdrrf" Oct 07 13:33:38 crc kubenswrapper[4854]: I1007 13:33:38.966718 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bhfd\" (UniqueName: \"kubernetes.io/projected/643b95db-26ab-40da-8dfe-6dec077cc641-kube-api-access-7bhfd\") pod \"643b95db-26ab-40da-8dfe-6dec077cc641\" (UID: \"643b95db-26ab-40da-8dfe-6dec077cc641\") " Oct 07 13:33:38 crc kubenswrapper[4854]: I1007 13:33:38.966867 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/643b95db-26ab-40da-8dfe-6dec077cc641-utilities\") pod \"643b95db-26ab-40da-8dfe-6dec077cc641\" (UID: \"643b95db-26ab-40da-8dfe-6dec077cc641\") " Oct 07 13:33:38 crc kubenswrapper[4854]: I1007 13:33:38.966925 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/643b95db-26ab-40da-8dfe-6dec077cc641-catalog-content\") pod \"643b95db-26ab-40da-8dfe-6dec077cc641\" (UID: \"643b95db-26ab-40da-8dfe-6dec077cc641\") " Oct 07 13:33:38 crc kubenswrapper[4854]: I1007 13:33:38.967790 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/643b95db-26ab-40da-8dfe-6dec077cc641-utilities" (OuterVolumeSpecName: "utilities") pod "643b95db-26ab-40da-8dfe-6dec077cc641" (UID: "643b95db-26ab-40da-8dfe-6dec077cc641"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:33:38 crc kubenswrapper[4854]: I1007 13:33:38.974896 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/643b95db-26ab-40da-8dfe-6dec077cc641-kube-api-access-7bhfd" (OuterVolumeSpecName: "kube-api-access-7bhfd") pod "643b95db-26ab-40da-8dfe-6dec077cc641" (UID: "643b95db-26ab-40da-8dfe-6dec077cc641"). InnerVolumeSpecName "kube-api-access-7bhfd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:33:39 crc kubenswrapper[4854]: I1007 13:33:39.049994 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/643b95db-26ab-40da-8dfe-6dec077cc641-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "643b95db-26ab-40da-8dfe-6dec077cc641" (UID: "643b95db-26ab-40da-8dfe-6dec077cc641"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:33:39 crc kubenswrapper[4854]: I1007 13:33:39.069805 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/643b95db-26ab-40da-8dfe-6dec077cc641-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:33:39 crc kubenswrapper[4854]: I1007 13:33:39.069895 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/643b95db-26ab-40da-8dfe-6dec077cc641-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:33:39 crc kubenswrapper[4854]: I1007 13:33:39.069920 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bhfd\" (UniqueName: \"kubernetes.io/projected/643b95db-26ab-40da-8dfe-6dec077cc641-kube-api-access-7bhfd\") on node \"crc\" DevicePath \"\"" Oct 07 13:33:39 crc kubenswrapper[4854]: I1007 13:33:39.681375 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wdrrf" event={"ID":"643b95db-26ab-40da-8dfe-6dec077cc641","Type":"ContainerDied","Data":"5660f882d0868bab81961d6fcc442ee46cc6635d89ee97a9ac3fa9ef2318618c"} Oct 07 13:33:39 crc kubenswrapper[4854]: I1007 13:33:39.681461 4854 scope.go:117] "RemoveContainer" containerID="c0a0dd98657cdd3dc89a0618b27b9d35b7e7e7ad01d0b2d633890ad6c0583b64" Oct 07 13:33:39 crc kubenswrapper[4854]: I1007 13:33:39.681464 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wdrrf" Oct 07 13:33:39 crc kubenswrapper[4854]: I1007 13:33:39.715042 4854 scope.go:117] "RemoveContainer" containerID="b67467147ce495c7d53f2c6ef9619a98e495b1448e9a18fecdc08197ce984186" Oct 07 13:33:39 crc kubenswrapper[4854]: I1007 13:33:39.742462 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wdrrf"] Oct 07 13:33:39 crc kubenswrapper[4854]: I1007 13:33:39.754714 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wdrrf"] Oct 07 13:33:39 crc kubenswrapper[4854]: I1007 13:33:39.770515 4854 scope.go:117] "RemoveContainer" containerID="70a72680f31fbefef9249a39b25e21e835db92f230d6d8ee369a17270a61a343" Oct 07 13:33:40 crc kubenswrapper[4854]: I1007 13:33:40.719845 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="643b95db-26ab-40da-8dfe-6dec077cc641" path="/var/lib/kubelet/pods/643b95db-26ab-40da-8dfe-6dec077cc641/volumes" Oct 07 13:33:48 crc kubenswrapper[4854]: I1007 13:33:48.704095 4854 scope.go:117] "RemoveContainer" containerID="2a88a0e668c9802f9aa7f30d5b1139d8c87a4e9bfb222dbe84e769d18df11b96" Oct 07 13:33:49 crc kubenswrapper[4854]: I1007 13:33:49.785268 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" event={"ID":"40b8b82d-cfd5-41d7-8673-5774db092c85","Type":"ContainerStarted","Data":"eda8261d4908d65e0e20b4ac93a6888107ad25b338c39cca102080a541d21d95"} Oct 07 13:35:21 crc kubenswrapper[4854]: I1007 13:35:21.800790 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5xnhf"] Oct 07 13:35:21 crc kubenswrapper[4854]: E1007 13:35:21.801690 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="643b95db-26ab-40da-8dfe-6dec077cc641" containerName="extract-utilities" Oct 07 13:35:21 crc kubenswrapper[4854]: I1007 13:35:21.801707 4854 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="643b95db-26ab-40da-8dfe-6dec077cc641" containerName="extract-utilities" Oct 07 13:35:21 crc kubenswrapper[4854]: E1007 13:35:21.801723 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="643b95db-26ab-40da-8dfe-6dec077cc641" containerName="registry-server" Oct 07 13:35:21 crc kubenswrapper[4854]: I1007 13:35:21.801731 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="643b95db-26ab-40da-8dfe-6dec077cc641" containerName="registry-server" Oct 07 13:35:21 crc kubenswrapper[4854]: E1007 13:35:21.801771 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="643b95db-26ab-40da-8dfe-6dec077cc641" containerName="extract-content" Oct 07 13:35:21 crc kubenswrapper[4854]: I1007 13:35:21.801779 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="643b95db-26ab-40da-8dfe-6dec077cc641" containerName="extract-content" Oct 07 13:35:21 crc kubenswrapper[4854]: I1007 13:35:21.801957 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="643b95db-26ab-40da-8dfe-6dec077cc641" containerName="registry-server" Oct 07 13:35:21 crc kubenswrapper[4854]: I1007 13:35:21.803398 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5xnhf" Oct 07 13:35:21 crc kubenswrapper[4854]: I1007 13:35:21.827563 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5xnhf"] Oct 07 13:35:21 crc kubenswrapper[4854]: I1007 13:35:21.961558 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsffz\" (UniqueName: \"kubernetes.io/projected/1edfbb43-8183-443b-9acb-05dd3935431f-kube-api-access-tsffz\") pod \"redhat-marketplace-5xnhf\" (UID: \"1edfbb43-8183-443b-9acb-05dd3935431f\") " pod="openshift-marketplace/redhat-marketplace-5xnhf" Oct 07 13:35:21 crc kubenswrapper[4854]: I1007 13:35:21.961833 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1edfbb43-8183-443b-9acb-05dd3935431f-utilities\") pod \"redhat-marketplace-5xnhf\" (UID: \"1edfbb43-8183-443b-9acb-05dd3935431f\") " pod="openshift-marketplace/redhat-marketplace-5xnhf" Oct 07 13:35:21 crc kubenswrapper[4854]: I1007 13:35:21.961914 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1edfbb43-8183-443b-9acb-05dd3935431f-catalog-content\") pod \"redhat-marketplace-5xnhf\" (UID: \"1edfbb43-8183-443b-9acb-05dd3935431f\") " pod="openshift-marketplace/redhat-marketplace-5xnhf" Oct 07 13:35:22 crc kubenswrapper[4854]: I1007 13:35:22.063513 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1edfbb43-8183-443b-9acb-05dd3935431f-utilities\") pod \"redhat-marketplace-5xnhf\" (UID: \"1edfbb43-8183-443b-9acb-05dd3935431f\") " pod="openshift-marketplace/redhat-marketplace-5xnhf" Oct 07 13:35:22 crc kubenswrapper[4854]: I1007 13:35:22.063594 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1edfbb43-8183-443b-9acb-05dd3935431f-catalog-content\") pod \"redhat-marketplace-5xnhf\" (UID: \"1edfbb43-8183-443b-9acb-05dd3935431f\") " pod="openshift-marketplace/redhat-marketplace-5xnhf" Oct 07 13:35:22 crc kubenswrapper[4854]: I1007 13:35:22.063738 4854 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-tsffz\" (UniqueName: \"kubernetes.io/projected/1edfbb43-8183-443b-9acb-05dd3935431f-kube-api-access-tsffz\") pod \"redhat-marketplace-5xnhf\" (UID: \"1edfbb43-8183-443b-9acb-05dd3935431f\") " pod="openshift-marketplace/redhat-marketplace-5xnhf" Oct 07 13:35:22 crc kubenswrapper[4854]: I1007 13:35:22.064611 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1edfbb43-8183-443b-9acb-05dd3935431f-catalog-content\") pod \"redhat-marketplace-5xnhf\" (UID: \"1edfbb43-8183-443b-9acb-05dd3935431f\") " pod="openshift-marketplace/redhat-marketplace-5xnhf" Oct 07 13:35:22 crc kubenswrapper[4854]: I1007 13:35:22.064691 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1edfbb43-8183-443b-9acb-05dd3935431f-utilities\") pod \"redhat-marketplace-5xnhf\" (UID: \"1edfbb43-8183-443b-9acb-05dd3935431f\") " pod="openshift-marketplace/redhat-marketplace-5xnhf" Oct 07 13:35:22 crc kubenswrapper[4854]: I1007 13:35:22.091864 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsffz\" (UniqueName: \"kubernetes.io/projected/1edfbb43-8183-443b-9acb-05dd3935431f-kube-api-access-tsffz\") pod \"redhat-marketplace-5xnhf\" (UID: \"1edfbb43-8183-443b-9acb-05dd3935431f\") " pod="openshift-marketplace/redhat-marketplace-5xnhf" Oct 07 13:35:22 crc kubenswrapper[4854]: I1007 13:35:22.135599 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5xnhf" Oct 07 13:35:22 crc kubenswrapper[4854]: I1007 13:35:22.377243 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5xnhf"] Oct 07 13:35:22 crc kubenswrapper[4854]: I1007 13:35:22.629938 4854 generic.go:334] "Generic (PLEG): container finished" podID="1edfbb43-8183-443b-9acb-05dd3935431f" containerID="6a234f73a2c747f5fdd4652cf13a1b45c6ec4cbc291a8a79f0ee7166599a0b59" exitCode=0 Oct 07 13:35:22 crc kubenswrapper[4854]: I1007 13:35:22.630039 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5xnhf" event={"ID":"1edfbb43-8183-443b-9acb-05dd3935431f","Type":"ContainerDied","Data":"6a234f73a2c747f5fdd4652cf13a1b45c6ec4cbc291a8a79f0ee7166599a0b59"} Oct 07 13:35:22 crc kubenswrapper[4854]: I1007 13:35:22.630300 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5xnhf" event={"ID":"1edfbb43-8183-443b-9acb-05dd3935431f","Type":"ContainerStarted","Data":"c71f50f346ebba43bb082954f5eb58677fd1faf9ebe88293aa111ea31fef29d2"} Oct 07 13:35:23 crc kubenswrapper[4854]: I1007 13:35:23.642909 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5xnhf" event={"ID":"1edfbb43-8183-443b-9acb-05dd3935431f","Type":"ContainerStarted","Data":"383803c1515437130af10fbb85e4be9c4bad54ae9ab104843ac8ef902eaa3333"} Oct 07 13:35:24 crc kubenswrapper[4854]: I1007 13:35:24.653441 4854 generic.go:334] "Generic (PLEG): container finished" podID="1edfbb43-8183-443b-9acb-05dd3935431f" containerID="383803c1515437130af10fbb85e4be9c4bad54ae9ab104843ac8ef902eaa3333" exitCode=0 Oct 07 13:35:24 crc kubenswrapper[4854]: I1007 13:35:24.653503 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5xnhf" 
event={"ID":"1edfbb43-8183-443b-9acb-05dd3935431f","Type":"ContainerDied","Data":"383803c1515437130af10fbb85e4be9c4bad54ae9ab104843ac8ef902eaa3333"} Oct 07 13:35:24 crc kubenswrapper[4854]: I1007 13:35:24.653893 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5xnhf" event={"ID":"1edfbb43-8183-443b-9acb-05dd3935431f","Type":"ContainerStarted","Data":"183e1f08db33ab9be2c42cea4c0205efbfc8572e7417a3238bccd4f6f1571825"} Oct 07 13:35:24 crc kubenswrapper[4854]: I1007 13:35:24.672021 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5xnhf" podStartSLOduration=2.232804037 podStartE2EDuration="3.671998536s" podCreationTimestamp="2025-10-07 13:35:21 +0000 UTC" firstStartedPulling="2025-10-07 13:35:22.631414966 +0000 UTC m=+4238.619247221" lastFinishedPulling="2025-10-07 13:35:24.070609455 +0000 UTC m=+4240.058441720" observedRunningTime="2025-10-07 13:35:24.671556703 +0000 UTC m=+4240.659388968" watchObservedRunningTime="2025-10-07 13:35:24.671998536 +0000 UTC m=+4240.659830791" Oct 07 13:35:32 crc kubenswrapper[4854]: I1007 13:35:32.136319 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5xnhf" Oct 07 13:35:32 crc kubenswrapper[4854]: I1007 13:35:32.137451 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5xnhf" Oct 07 13:35:32 crc kubenswrapper[4854]: I1007 13:35:32.196095 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5xnhf" Oct 07 13:35:32 crc kubenswrapper[4854]: I1007 13:35:32.804743 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5xnhf" Oct 07 13:35:32 crc kubenswrapper[4854]: I1007 13:35:32.881052 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5xnhf"] Oct 07 13:35:34 crc kubenswrapper[4854]: I1007 13:35:34.751669 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5xnhf" podUID="1edfbb43-8183-443b-9acb-05dd3935431f" containerName="registry-server" containerID="cri-o://183e1f08db33ab9be2c42cea4c0205efbfc8572e7417a3238bccd4f6f1571825" gracePeriod=2 Oct 07 13:35:35 crc kubenswrapper[4854]: I1007 13:35:35.245258 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5xnhf" Oct 07 13:35:35 crc kubenswrapper[4854]: I1007 13:35:35.289679 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tsffz\" (UniqueName: \"kubernetes.io/projected/1edfbb43-8183-443b-9acb-05dd3935431f-kube-api-access-tsffz\") pod \"1edfbb43-8183-443b-9acb-05dd3935431f\" (UID: \"1edfbb43-8183-443b-9acb-05dd3935431f\") " Oct 07 13:35:35 crc kubenswrapper[4854]: I1007 13:35:35.289937 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1edfbb43-8183-443b-9acb-05dd3935431f-catalog-content\") pod \"1edfbb43-8183-443b-9acb-05dd3935431f\" (UID: \"1edfbb43-8183-443b-9acb-05dd3935431f\") " Oct 07 13:35:35 crc kubenswrapper[4854]: I1007 13:35:35.290211 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1edfbb43-8183-443b-9acb-05dd3935431f-utilities\") pod \"1edfbb43-8183-443b-9acb-05dd3935431f\" (UID: \"1edfbb43-8183-443b-9acb-05dd3935431f\") " Oct 07 13:35:35 crc kubenswrapper[4854]: I1007 13:35:35.291887 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1edfbb43-8183-443b-9acb-05dd3935431f-utilities" (OuterVolumeSpecName: "utilities") pod "1edfbb43-8183-443b-9acb-05dd3935431f" (UID: "1edfbb43-8183-443b-9acb-05dd3935431f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:35:35 crc kubenswrapper[4854]: I1007 13:35:35.305842 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1edfbb43-8183-443b-9acb-05dd3935431f-kube-api-access-tsffz" (OuterVolumeSpecName: "kube-api-access-tsffz") pod "1edfbb43-8183-443b-9acb-05dd3935431f" (UID: "1edfbb43-8183-443b-9acb-05dd3935431f"). InnerVolumeSpecName "kube-api-access-tsffz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:35:35 crc kubenswrapper[4854]: I1007 13:35:35.308589 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1edfbb43-8183-443b-9acb-05dd3935431f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1edfbb43-8183-443b-9acb-05dd3935431f" (UID: "1edfbb43-8183-443b-9acb-05dd3935431f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:35:35 crc kubenswrapper[4854]: I1007 13:35:35.391842 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1edfbb43-8183-443b-9acb-05dd3935431f-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:35:35 crc kubenswrapper[4854]: I1007 13:35:35.391880 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tsffz\" (UniqueName: \"kubernetes.io/projected/1edfbb43-8183-443b-9acb-05dd3935431f-kube-api-access-tsffz\") on node \"crc\" DevicePath \"\"" Oct 07 13:35:35 crc kubenswrapper[4854]: I1007 13:35:35.391890 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1edfbb43-8183-443b-9acb-05dd3935431f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:35:35 crc kubenswrapper[4854]: I1007 13:35:35.772422 4854 generic.go:334] "Generic (PLEG): container finished" podID="1edfbb43-8183-443b-9acb-05dd3935431f" containerID="183e1f08db33ab9be2c42cea4c0205efbfc8572e7417a3238bccd4f6f1571825" exitCode=0 Oct 07 13:35:35 crc kubenswrapper[4854]: I1007 13:35:35.772475 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5xnhf" Oct 07 13:35:35 crc kubenswrapper[4854]: I1007 13:35:35.772510 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5xnhf" event={"ID":"1edfbb43-8183-443b-9acb-05dd3935431f","Type":"ContainerDied","Data":"183e1f08db33ab9be2c42cea4c0205efbfc8572e7417a3238bccd4f6f1571825"} Oct 07 13:35:35 crc kubenswrapper[4854]: I1007 13:35:35.772568 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5xnhf" event={"ID":"1edfbb43-8183-443b-9acb-05dd3935431f","Type":"ContainerDied","Data":"c71f50f346ebba43bb082954f5eb58677fd1faf9ebe88293aa111ea31fef29d2"} Oct 07 13:35:35 crc kubenswrapper[4854]: I1007 13:35:35.772605 4854 scope.go:117] "RemoveContainer" containerID="183e1f08db33ab9be2c42cea4c0205efbfc8572e7417a3238bccd4f6f1571825" Oct 07 13:35:35 crc kubenswrapper[4854]: I1007 13:35:35.811873 4854 scope.go:117] "RemoveContainer" containerID="383803c1515437130af10fbb85e4be9c4bad54ae9ab104843ac8ef902eaa3333" Oct 07 13:35:35 crc kubenswrapper[4854]: I1007 13:35:35.812944 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5xnhf"] Oct 07 13:35:35 crc kubenswrapper[4854]: I1007 13:35:35.825508 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5xnhf"] Oct 07 13:35:35 crc kubenswrapper[4854]: I1007 13:35:35.850633 4854 scope.go:117] "RemoveContainer" containerID="6a234f73a2c747f5fdd4652cf13a1b45c6ec4cbc291a8a79f0ee7166599a0b59" Oct 07 13:35:35 crc kubenswrapper[4854]: I1007 13:35:35.886316 4854 scope.go:117] "RemoveContainer" containerID="183e1f08db33ab9be2c42cea4c0205efbfc8572e7417a3238bccd4f6f1571825" Oct 07 13:35:35 crc kubenswrapper[4854]: E1007 13:35:35.887724 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"183e1f08db33ab9be2c42cea4c0205efbfc8572e7417a3238bccd4f6f1571825\": container with ID starting with 183e1f08db33ab9be2c42cea4c0205efbfc8572e7417a3238bccd4f6f1571825 not found: ID does not exist" containerID="183e1f08db33ab9be2c42cea4c0205efbfc8572e7417a3238bccd4f6f1571825" Oct 07 13:35:35 crc kubenswrapper[4854]: I1007 13:35:35.887818 4854 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"183e1f08db33ab9be2c42cea4c0205efbfc8572e7417a3238bccd4f6f1571825"} err="failed to get container status \"183e1f08db33ab9be2c42cea4c0205efbfc8572e7417a3238bccd4f6f1571825\": rpc error: code = NotFound desc = could not find container \"183e1f08db33ab9be2c42cea4c0205efbfc8572e7417a3238bccd4f6f1571825\": container with ID starting with 183e1f08db33ab9be2c42cea4c0205efbfc8572e7417a3238bccd4f6f1571825 not found: ID does not exist" Oct 07 13:35:35 crc kubenswrapper[4854]: I1007 13:35:35.887872 4854 scope.go:117] "RemoveContainer" containerID="383803c1515437130af10fbb85e4be9c4bad54ae9ab104843ac8ef902eaa3333" Oct 07 13:35:35 crc kubenswrapper[4854]: E1007 13:35:35.888594 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"383803c1515437130af10fbb85e4be9c4bad54ae9ab104843ac8ef902eaa3333\": container with ID starting with 383803c1515437130af10fbb85e4be9c4bad54ae9ab104843ac8ef902eaa3333 not found: ID does not exist" containerID="383803c1515437130af10fbb85e4be9c4bad54ae9ab104843ac8ef902eaa3333" Oct 07 13:35:35 crc kubenswrapper[4854]: I1007 13:35:35.888635 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"383803c1515437130af10fbb85e4be9c4bad54ae9ab104843ac8ef902eaa3333"} err="failed to get container status \"383803c1515437130af10fbb85e4be9c4bad54ae9ab104843ac8ef902eaa3333\": rpc error: code = NotFound desc = could not find container \"383803c1515437130af10fbb85e4be9c4bad54ae9ab104843ac8ef902eaa3333\": container with ID starting with 383803c1515437130af10fbb85e4be9c4bad54ae9ab104843ac8ef902eaa3333 not found: ID does not exist" Oct 07 13:35:35 crc kubenswrapper[4854]: I1007 13:35:35.888660 4854 scope.go:117] "RemoveContainer" containerID="6a234f73a2c747f5fdd4652cf13a1b45c6ec4cbc291a8a79f0ee7166599a0b59" Oct 07 13:35:35 crc kubenswrapper[4854]: E1007 13:35:35.889311 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a234f73a2c747f5fdd4652cf13a1b45c6ec4cbc291a8a79f0ee7166599a0b59\": container with ID starting with 6a234f73a2c747f5fdd4652cf13a1b45c6ec4cbc291a8a79f0ee7166599a0b59 not found: ID does not exist" containerID="6a234f73a2c747f5fdd4652cf13a1b45c6ec4cbc291a8a79f0ee7166599a0b59" Oct 07 13:35:35 crc kubenswrapper[4854]: I1007 13:35:35.889368 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a234f73a2c747f5fdd4652cf13a1b45c6ec4cbc291a8a79f0ee7166599a0b59"} err="failed to get container status \"6a234f73a2c747f5fdd4652cf13a1b45c6ec4cbc291a8a79f0ee7166599a0b59\": rpc error: code = NotFound desc = could not find container \"6a234f73a2c747f5fdd4652cf13a1b45c6ec4cbc291a8a79f0ee7166599a0b59\": container with ID starting with 6a234f73a2c747f5fdd4652cf13a1b45c6ec4cbc291a8a79f0ee7166599a0b59 not found: ID does not exist" Oct 07 13:35:36 crc kubenswrapper[4854]: I1007 13:35:36.720606 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1edfbb43-8183-443b-9acb-05dd3935431f" path="/var/lib/kubelet/pods/1edfbb43-8183-443b-9acb-05dd3935431f/volumes" Oct 07 13:36:10 crc kubenswrapper[4854]: I1007 13:36:10.808314 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:36:10 crc kubenswrapper[4854]: I1007 13:36:10.808982 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:36:40 crc kubenswrapper[4854]: I1007 13:36:40.808301 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:36:40 crc kubenswrapper[4854]: I1007 13:36:40.808757 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:36:49 crc kubenswrapper[4854]: I1007 13:36:49.159331 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-b795w"] Oct 07 13:36:49 crc kubenswrapper[4854]: E1007 13:36:49.160183 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1edfbb43-8183-443b-9acb-05dd3935431f" containerName="registry-server" Oct 07 13:36:49 crc kubenswrapper[4854]: I1007 13:36:49.160197 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="1edfbb43-8183-443b-9acb-05dd3935431f" containerName="registry-server" Oct 07 13:36:49 crc kubenswrapper[4854]: E1007 13:36:49.160223 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1edfbb43-8183-443b-9acb-05dd3935431f" containerName="extract-content" Oct 07 13:36:49 crc kubenswrapper[4854]: I1007 13:36:49.160231 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="1edfbb43-8183-443b-9acb-05dd3935431f" containerName="extract-content" Oct 07 13:36:49 crc kubenswrapper[4854]: E1007 13:36:49.160247 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1edfbb43-8183-443b-9acb-05dd3935431f" containerName="extract-utilities" Oct 07 13:36:49 crc kubenswrapper[4854]: I1007 13:36:49.160254 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="1edfbb43-8183-443b-9acb-05dd3935431f" containerName="extract-utilities" Oct 07 13:36:49 crc kubenswrapper[4854]: I1007 13:36:49.160421 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="1edfbb43-8183-443b-9acb-05dd3935431f" containerName="registry-server" Oct 07 13:36:49 crc kubenswrapper[4854]: I1007 13:36:49.161628 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b795w" Oct 07 13:36:49 crc kubenswrapper[4854]: I1007 13:36:49.179424 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b795w"] Oct 07 13:36:49 crc kubenswrapper[4854]: I1007 13:36:49.338068 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/247de917-8ef3-4803-b8bf-9e7f6a915ab0-utilities\") pod \"certified-operators-b795w\" (UID: \"247de917-8ef3-4803-b8bf-9e7f6a915ab0\") " pod="openshift-marketplace/certified-operators-b795w" Oct 07 13:36:49 crc kubenswrapper[4854]: I1007 13:36:49.338134 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/247de917-8ef3-4803-b8bf-9e7f6a915ab0-catalog-content\") pod \"certified-operators-b795w\" (UID: \"247de917-8ef3-4803-b8bf-9e7f6a915ab0\") " pod="openshift-marketplace/certified-operators-b795w" Oct 07 13:36:49 crc kubenswrapper[4854]: I1007 13:36:49.338277 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vlxx\" (UniqueName: \"kubernetes.io/projected/247de917-8ef3-4803-b8bf-9e7f6a915ab0-kube-api-access-6vlxx\") pod \"certified-operators-b795w\" (UID: \"247de917-8ef3-4803-b8bf-9e7f6a915ab0\") " pod="openshift-marketplace/certified-operators-b795w" Oct 07 13:36:49 crc kubenswrapper[4854]: I1007 13:36:49.439863 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vlxx\" (UniqueName: \"kubernetes.io/projected/247de917-8ef3-4803-b8bf-9e7f6a915ab0-kube-api-access-6vlxx\") pod \"certified-operators-b795w\" (UID: \"247de917-8ef3-4803-b8bf-9e7f6a915ab0\") " pod="openshift-marketplace/certified-operators-b795w" Oct 07 13:36:49 crc kubenswrapper[4854]: I1007 13:36:49.440030 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/247de917-8ef3-4803-b8bf-9e7f6a915ab0-utilities\") pod \"certified-operators-b795w\" (UID: \"247de917-8ef3-4803-b8bf-9e7f6a915ab0\") " pod="openshift-marketplace/certified-operators-b795w" Oct 07 13:36:49 crc kubenswrapper[4854]: I1007 13:36:49.440064 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/247de917-8ef3-4803-b8bf-9e7f6a915ab0-catalog-content\") pod \"certified-operators-b795w\" (UID: \"247de917-8ef3-4803-b8bf-9e7f6a915ab0\") " pod="openshift-marketplace/certified-operators-b795w" Oct 07 13:36:49 crc kubenswrapper[4854]: I1007 13:36:49.440643 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/247de917-8ef3-4803-b8bf-9e7f6a915ab0-catalog-content\") pod \"certified-operators-b795w\" (UID: \"247de917-8ef3-4803-b8bf-9e7f6a915ab0\") " pod="openshift-marketplace/certified-operators-b795w" Oct 07 13:36:49 crc kubenswrapper[4854]: I1007 13:36:49.440851 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/247de917-8ef3-4803-b8bf-9e7f6a915ab0-utilities\") pod \"certified-operators-b795w\" (UID: \"247de917-8ef3-4803-b8bf-9e7f6a915ab0\") " pod="openshift-marketplace/certified-operators-b795w" Oct 07 13:36:49 crc kubenswrapper[4854]: I1007 13:36:49.470574 4854 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-6vlxx\" (UniqueName: \"kubernetes.io/projected/247de917-8ef3-4803-b8bf-9e7f6a915ab0-kube-api-access-6vlxx\") pod \"certified-operators-b795w\" (UID: \"247de917-8ef3-4803-b8bf-9e7f6a915ab0\") " pod="openshift-marketplace/certified-operators-b795w" Oct 07 13:36:49 crc kubenswrapper[4854]: I1007 13:36:49.507790 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b795w" Oct 07 13:36:50 crc kubenswrapper[4854]: I1007 13:36:50.017815 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b795w"] Oct 07 13:36:50 crc kubenswrapper[4854]: I1007 13:36:50.511864 4854 generic.go:334] "Generic (PLEG): container finished" podID="247de917-8ef3-4803-b8bf-9e7f6a915ab0" containerID="79b400719e7437aff464c522dd82ff84195b127e371e9f32c2486272eef15692" exitCode=0 Oct 07 13:36:50 crc kubenswrapper[4854]: I1007 13:36:50.511932 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b795w" event={"ID":"247de917-8ef3-4803-b8bf-9e7f6a915ab0","Type":"ContainerDied","Data":"79b400719e7437aff464c522dd82ff84195b127e371e9f32c2486272eef15692"} Oct 07 13:36:50 crc kubenswrapper[4854]: I1007 13:36:50.512244 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b795w" event={"ID":"247de917-8ef3-4803-b8bf-9e7f6a915ab0","Type":"ContainerStarted","Data":"43ad753803fdb3b2d6cf25073251bb32ebf095d6b4d67c167a27724a77032944"} Oct 07 13:36:51 crc kubenswrapper[4854]: I1007 13:36:51.522715 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b795w" event={"ID":"247de917-8ef3-4803-b8bf-9e7f6a915ab0","Type":"ContainerStarted","Data":"6387aa848ab86f0f3afa3cfeed08bf86ca8a1cba7ffb16656ce46af17a1efc48"} Oct 07 13:36:52 crc kubenswrapper[4854]: I1007 13:36:52.537626 4854 generic.go:334] "Generic (PLEG): container finished" podID="247de917-8ef3-4803-b8bf-9e7f6a915ab0" containerID="6387aa848ab86f0f3afa3cfeed08bf86ca8a1cba7ffb16656ce46af17a1efc48" exitCode=0 Oct 07 13:36:52 crc kubenswrapper[4854]: I1007 13:36:52.537687 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b795w" event={"ID":"247de917-8ef3-4803-b8bf-9e7f6a915ab0","Type":"ContainerDied","Data":"6387aa848ab86f0f3afa3cfeed08bf86ca8a1cba7ffb16656ce46af17a1efc48"} Oct 07 13:36:53 crc kubenswrapper[4854]: I1007 13:36:53.550653 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b795w" event={"ID":"247de917-8ef3-4803-b8bf-9e7f6a915ab0","Type":"ContainerStarted","Data":"3ec7fbaca7e0d16afb53fe101ab0d02d53b523a22e91d6cfc82cc26daf1d1ec5"} Oct 07 13:36:53 crc kubenswrapper[4854]: I1007 13:36:53.588434 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-b795w" podStartSLOduration=2.008241912 podStartE2EDuration="4.588401559s" podCreationTimestamp="2025-10-07 13:36:49 +0000 UTC" firstStartedPulling="2025-10-07 13:36:50.514076583 +0000 UTC m=+4326.501908878" lastFinishedPulling="2025-10-07 13:36:53.09423623 +0000 UTC m=+4329.082068525" observedRunningTime="2025-10-07 13:36:53.582955832 +0000 UTC m=+4329.570788127" watchObservedRunningTime="2025-10-07 13:36:53.588401559 +0000 UTC m=+4329.576233854" Oct 07 13:36:59 crc kubenswrapper[4854]: I1007 13:36:59.508568 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/certified-operators-b795w" Oct 07 13:36:59 crc kubenswrapper[4854]: I1007 13:36:59.509606 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-b795w" Oct 07 13:36:59 crc kubenswrapper[4854]: I1007 13:36:59.600727 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-b795w" Oct 07 13:36:59 crc kubenswrapper[4854]: I1007 13:36:59.685464 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-b795w" Oct 07 13:36:59 crc kubenswrapper[4854]: I1007 13:36:59.854677 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b795w"] Oct 07 13:37:01 crc kubenswrapper[4854]: I1007 13:37:01.633315 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-b795w" podUID="247de917-8ef3-4803-b8bf-9e7f6a915ab0" containerName="registry-server" containerID="cri-o://3ec7fbaca7e0d16afb53fe101ab0d02d53b523a22e91d6cfc82cc26daf1d1ec5" gracePeriod=2 Oct 07 13:37:02 crc kubenswrapper[4854]: I1007 13:37:02.095950 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b795w" Oct 07 13:37:02 crc kubenswrapper[4854]: I1007 13:37:02.281650 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vlxx\" (UniqueName: \"kubernetes.io/projected/247de917-8ef3-4803-b8bf-9e7f6a915ab0-kube-api-access-6vlxx\") pod \"247de917-8ef3-4803-b8bf-9e7f6a915ab0\" (UID: \"247de917-8ef3-4803-b8bf-9e7f6a915ab0\") " Oct 07 13:37:02 crc kubenswrapper[4854]: I1007 13:37:02.281715 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/247de917-8ef3-4803-b8bf-9e7f6a915ab0-catalog-content\") pod \"247de917-8ef3-4803-b8bf-9e7f6a915ab0\" (UID: \"247de917-8ef3-4803-b8bf-9e7f6a915ab0\") " Oct 07 13:37:02 crc kubenswrapper[4854]: I1007 13:37:02.281739 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/247de917-8ef3-4803-b8bf-9e7f6a915ab0-utilities\") pod \"247de917-8ef3-4803-b8bf-9e7f6a915ab0\" (UID: \"247de917-8ef3-4803-b8bf-9e7f6a915ab0\") " Oct 07 13:37:02 crc kubenswrapper[4854]: I1007 13:37:02.283014 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/247de917-8ef3-4803-b8bf-9e7f6a915ab0-utilities" (OuterVolumeSpecName: "utilities") pod "247de917-8ef3-4803-b8bf-9e7f6a915ab0" (UID: "247de917-8ef3-4803-b8bf-9e7f6a915ab0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:37:02 crc kubenswrapper[4854]: I1007 13:37:02.290499 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/247de917-8ef3-4803-b8bf-9e7f6a915ab0-kube-api-access-6vlxx" (OuterVolumeSpecName: "kube-api-access-6vlxx") pod "247de917-8ef3-4803-b8bf-9e7f6a915ab0" (UID: "247de917-8ef3-4803-b8bf-9e7f6a915ab0"). InnerVolumeSpecName "kube-api-access-6vlxx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:37:02 crc kubenswrapper[4854]: I1007 13:37:02.338221 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/247de917-8ef3-4803-b8bf-9e7f6a915ab0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "247de917-8ef3-4803-b8bf-9e7f6a915ab0" (UID: "247de917-8ef3-4803-b8bf-9e7f6a915ab0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:37:02 crc kubenswrapper[4854]: I1007 13:37:02.383110 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/247de917-8ef3-4803-b8bf-9e7f6a915ab0-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:37:02 crc kubenswrapper[4854]: I1007 13:37:02.383184 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/247de917-8ef3-4803-b8bf-9e7f6a915ab0-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:37:02 crc kubenswrapper[4854]: I1007 13:37:02.383203 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vlxx\" (UniqueName: \"kubernetes.io/projected/247de917-8ef3-4803-b8bf-9e7f6a915ab0-kube-api-access-6vlxx\") on node \"crc\" DevicePath \"\"" Oct 07 13:37:02 crc kubenswrapper[4854]: I1007 13:37:02.646547 4854 generic.go:334] "Generic (PLEG): container finished" podID="247de917-8ef3-4803-b8bf-9e7f6a915ab0" containerID="3ec7fbaca7e0d16afb53fe101ab0d02d53b523a22e91d6cfc82cc26daf1d1ec5" exitCode=0 Oct 07 13:37:02 crc kubenswrapper[4854]: I1007 13:37:02.646593 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b795w" event={"ID":"247de917-8ef3-4803-b8bf-9e7f6a915ab0","Type":"ContainerDied","Data":"3ec7fbaca7e0d16afb53fe101ab0d02d53b523a22e91d6cfc82cc26daf1d1ec5"} Oct 07 13:37:02 crc kubenswrapper[4854]: I1007 13:37:02.646628 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b795w" event={"ID":"247de917-8ef3-4803-b8bf-9e7f6a915ab0","Type":"ContainerDied","Data":"43ad753803fdb3b2d6cf25073251bb32ebf095d6b4d67c167a27724a77032944"} Oct 07 13:37:02 crc kubenswrapper[4854]: I1007 13:37:02.646648 4854 scope.go:117] "RemoveContainer" containerID="3ec7fbaca7e0d16afb53fe101ab0d02d53b523a22e91d6cfc82cc26daf1d1ec5" Oct 07 13:37:02 crc kubenswrapper[4854]: I1007 13:37:02.646665 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b795w" Oct 07 13:37:02 crc kubenswrapper[4854]: I1007 13:37:02.685881 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b795w"] Oct 07 13:37:02 crc kubenswrapper[4854]: I1007 13:37:02.692727 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-b795w"] Oct 07 13:37:02 crc kubenswrapper[4854]: I1007 13:37:02.693230 4854 scope.go:117] "RemoveContainer" containerID="6387aa848ab86f0f3afa3cfeed08bf86ca8a1cba7ffb16656ce46af17a1efc48" Oct 07 13:37:02 crc kubenswrapper[4854]: I1007 13:37:02.710847 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="247de917-8ef3-4803-b8bf-9e7f6a915ab0" path="/var/lib/kubelet/pods/247de917-8ef3-4803-b8bf-9e7f6a915ab0/volumes" Oct 07 13:37:02 crc kubenswrapper[4854]: I1007 13:37:02.724067 4854 scope.go:117] "RemoveContainer" containerID="79b400719e7437aff464c522dd82ff84195b127e371e9f32c2486272eef15692" Oct 07 13:37:02 crc kubenswrapper[4854]: I1007 13:37:02.750820 4854 scope.go:117] "RemoveContainer" containerID="3ec7fbaca7e0d16afb53fe101ab0d02d53b523a22e91d6cfc82cc26daf1d1ec5" Oct 07 13:37:02 crc kubenswrapper[4854]: E1007 13:37:02.751461 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ec7fbaca7e0d16afb53fe101ab0d02d53b523a22e91d6cfc82cc26daf1d1ec5\": container with ID starting with 3ec7fbaca7e0d16afb53fe101ab0d02d53b523a22e91d6cfc82cc26daf1d1ec5 not found: ID does not exist" containerID="3ec7fbaca7e0d16afb53fe101ab0d02d53b523a22e91d6cfc82cc26daf1d1ec5" Oct 07 13:37:02 crc kubenswrapper[4854]: I1007 13:37:02.751526 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ec7fbaca7e0d16afb53fe101ab0d02d53b523a22e91d6cfc82cc26daf1d1ec5"} err="failed to get container status \"3ec7fbaca7e0d16afb53fe101ab0d02d53b523a22e91d6cfc82cc26daf1d1ec5\": rpc error: code = NotFound desc = could not find container \"3ec7fbaca7e0d16afb53fe101ab0d02d53b523a22e91d6cfc82cc26daf1d1ec5\": container with ID starting with 3ec7fbaca7e0d16afb53fe101ab0d02d53b523a22e91d6cfc82cc26daf1d1ec5 not found: ID does not exist" Oct 07 13:37:02 crc kubenswrapper[4854]: I1007 13:37:02.751563 4854 scope.go:117] "RemoveContainer" containerID="6387aa848ab86f0f3afa3cfeed08bf86ca8a1cba7ffb16656ce46af17a1efc48" Oct 07 13:37:02 crc kubenswrapper[4854]: E1007 13:37:02.752085 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6387aa848ab86f0f3afa3cfeed08bf86ca8a1cba7ffb16656ce46af17a1efc48\": container with ID starting with 6387aa848ab86f0f3afa3cfeed08bf86ca8a1cba7ffb16656ce46af17a1efc48 not found: ID does not exist" containerID="6387aa848ab86f0f3afa3cfeed08bf86ca8a1cba7ffb16656ce46af17a1efc48" Oct 07 13:37:02 crc kubenswrapper[4854]: I1007 13:37:02.752142 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6387aa848ab86f0f3afa3cfeed08bf86ca8a1cba7ffb16656ce46af17a1efc48"} err="failed to get container status \"6387aa848ab86f0f3afa3cfeed08bf86ca8a1cba7ffb16656ce46af17a1efc48\": rpc error: code = NotFound desc = could not find container \"6387aa848ab86f0f3afa3cfeed08bf86ca8a1cba7ffb16656ce46af17a1efc48\": container with ID starting with 6387aa848ab86f0f3afa3cfeed08bf86ca8a1cba7ffb16656ce46af17a1efc48 not found: ID does not exist" Oct 07 13:37:02 crc kubenswrapper[4854]: I1007 
13:37:02.752222 4854 scope.go:117] "RemoveContainer" containerID="79b400719e7437aff464c522dd82ff84195b127e371e9f32c2486272eef15692" Oct 07 13:37:02 crc kubenswrapper[4854]: E1007 13:37:02.752727 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79b400719e7437aff464c522dd82ff84195b127e371e9f32c2486272eef15692\": container with ID starting with 79b400719e7437aff464c522dd82ff84195b127e371e9f32c2486272eef15692 not found: ID does not exist" containerID="79b400719e7437aff464c522dd82ff84195b127e371e9f32c2486272eef15692" Oct 07 13:37:02 crc kubenswrapper[4854]: I1007 13:37:02.752749 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79b400719e7437aff464c522dd82ff84195b127e371e9f32c2486272eef15692"} err="failed to get container status \"79b400719e7437aff464c522dd82ff84195b127e371e9f32c2486272eef15692\": rpc error: code = NotFound desc = could not find container \"79b400719e7437aff464c522dd82ff84195b127e371e9f32c2486272eef15692\": container with ID starting with 79b400719e7437aff464c522dd82ff84195b127e371e9f32c2486272eef15692 not found: ID does not exist" Oct 07 13:37:10 crc kubenswrapper[4854]: I1007 13:37:10.807847 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:37:10 crc kubenswrapper[4854]: I1007 13:37:10.808271 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:37:10 crc kubenswrapper[4854]: I1007 13:37:10.808337 4854 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" Oct 07 13:37:10 crc kubenswrapper[4854]: I1007 13:37:10.809303 4854 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"eda8261d4908d65e0e20b4ac93a6888107ad25b338c39cca102080a541d21d95"} pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 13:37:10 crc kubenswrapper[4854]: I1007 13:37:10.809397 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" containerID="cri-o://eda8261d4908d65e0e20b4ac93a6888107ad25b338c39cca102080a541d21d95" gracePeriod=600 Oct 07 13:37:11 crc kubenswrapper[4854]: I1007 13:37:11.734736 4854 generic.go:334] "Generic (PLEG): container finished" podID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerID="eda8261d4908d65e0e20b4ac93a6888107ad25b338c39cca102080a541d21d95" exitCode=0 Oct 07 13:37:11 crc kubenswrapper[4854]: I1007 13:37:11.734856 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" 
event={"ID":"40b8b82d-cfd5-41d7-8673-5774db092c85","Type":"ContainerDied","Data":"eda8261d4908d65e0e20b4ac93a6888107ad25b338c39cca102080a541d21d95"} Oct 07 13:37:11 crc kubenswrapper[4854]: I1007 13:37:11.735215 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" event={"ID":"40b8b82d-cfd5-41d7-8673-5774db092c85","Type":"ContainerStarted","Data":"e83798b11b7f463b027896c1ec862c4cbd843df2676156232deaf979d790ceea"} Oct 07 13:37:11 crc kubenswrapper[4854]: I1007 13:37:11.735252 4854 scope.go:117] "RemoveContainer" containerID="2a88a0e668c9802f9aa7f30d5b1139d8c87a4e9bfb222dbe84e769d18df11b96" Oct 07 13:38:58 crc kubenswrapper[4854]: I1007 13:38:58.528231 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-bxz4f"] Oct 07 13:38:58 crc kubenswrapper[4854]: I1007 13:38:58.538091 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-bxz4f"] Oct 07 13:38:58 crc kubenswrapper[4854]: I1007 13:38:58.641758 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-k6qn2"] Oct 07 13:38:58 crc kubenswrapper[4854]: E1007 13:38:58.642250 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="247de917-8ef3-4803-b8bf-9e7f6a915ab0" containerName="extract-utilities" Oct 07 13:38:58 crc kubenswrapper[4854]: I1007 13:38:58.642280 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="247de917-8ef3-4803-b8bf-9e7f6a915ab0" containerName="extract-utilities" Oct 07 13:38:58 crc kubenswrapper[4854]: E1007 13:38:58.642322 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="247de917-8ef3-4803-b8bf-9e7f6a915ab0" containerName="extract-content" Oct 07 13:38:58 crc kubenswrapper[4854]: I1007 13:38:58.642337 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="247de917-8ef3-4803-b8bf-9e7f6a915ab0" containerName="extract-content" Oct 07 13:38:58 crc kubenswrapper[4854]: E1007 13:38:58.642365 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="247de917-8ef3-4803-b8bf-9e7f6a915ab0" containerName="registry-server" Oct 07 13:38:58 crc kubenswrapper[4854]: I1007 13:38:58.642377 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="247de917-8ef3-4803-b8bf-9e7f6a915ab0" containerName="registry-server" Oct 07 13:38:58 crc kubenswrapper[4854]: I1007 13:38:58.642626 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="247de917-8ef3-4803-b8bf-9e7f6a915ab0" containerName="registry-server" Oct 07 13:38:58 crc kubenswrapper[4854]: I1007 13:38:58.643375 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-k6qn2" Oct 07 13:38:58 crc kubenswrapper[4854]: I1007 13:38:58.658493 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Oct 07 13:38:58 crc kubenswrapper[4854]: I1007 13:38:58.658784 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Oct 07 13:38:58 crc kubenswrapper[4854]: I1007 13:38:58.658784 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Oct 07 13:38:58 crc kubenswrapper[4854]: I1007 13:38:58.659563 4854 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-glcct" Oct 07 13:38:58 crc kubenswrapper[4854]: I1007 13:38:58.664912 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-k6qn2"] Oct 07 13:38:58 crc kubenswrapper[4854]: I1007 13:38:58.724493 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d63f1f8a-2d04-4ef7-9c24-25bb39809250" path="/var/lib/kubelet/pods/d63f1f8a-2d04-4ef7-9c24-25bb39809250/volumes" Oct 07 13:38:58 crc kubenswrapper[4854]: I1007 13:38:58.760903 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/634b9fa8-9b32-4c25-9ef0-9274f327f3e5-node-mnt\") pod \"crc-storage-crc-k6qn2\" (UID: \"634b9fa8-9b32-4c25-9ef0-9274f327f3e5\") " pod="crc-storage/crc-storage-crc-k6qn2" Oct 07 13:38:58 crc kubenswrapper[4854]: I1007 13:38:58.760969 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zlhw\" (UniqueName: \"kubernetes.io/projected/634b9fa8-9b32-4c25-9ef0-9274f327f3e5-kube-api-access-6zlhw\") pod \"crc-storage-crc-k6qn2\" (UID: \"634b9fa8-9b32-4c25-9ef0-9274f327f3e5\") " pod="crc-storage/crc-storage-crc-k6qn2" Oct 07 13:38:58 crc kubenswrapper[4854]: I1007 13:38:58.761021 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/634b9fa8-9b32-4c25-9ef0-9274f327f3e5-crc-storage\") pod \"crc-storage-crc-k6qn2\" (UID: \"634b9fa8-9b32-4c25-9ef0-9274f327f3e5\") " pod="crc-storage/crc-storage-crc-k6qn2" Oct 07 13:38:58 crc kubenswrapper[4854]: I1007 13:38:58.863103 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/634b9fa8-9b32-4c25-9ef0-9274f327f3e5-node-mnt\") pod \"crc-storage-crc-k6qn2\" (UID: \"634b9fa8-9b32-4c25-9ef0-9274f327f3e5\") " pod="crc-storage/crc-storage-crc-k6qn2" Oct 07 13:38:58 crc kubenswrapper[4854]: I1007 13:38:58.863292 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zlhw\" (UniqueName: \"kubernetes.io/projected/634b9fa8-9b32-4c25-9ef0-9274f327f3e5-kube-api-access-6zlhw\") pod \"crc-storage-crc-k6qn2\" (UID: \"634b9fa8-9b32-4c25-9ef0-9274f327f3e5\") " pod="crc-storage/crc-storage-crc-k6qn2" Oct 07 13:38:58 crc kubenswrapper[4854]: I1007 13:38:58.863418 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/634b9fa8-9b32-4c25-9ef0-9274f327f3e5-crc-storage\") pod \"crc-storage-crc-k6qn2\" (UID: \"634b9fa8-9b32-4c25-9ef0-9274f327f3e5\") " pod="crc-storage/crc-storage-crc-k6qn2" Oct 07 13:38:58 crc kubenswrapper[4854]: I1007 13:38:58.863434 4854 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/634b9fa8-9b32-4c25-9ef0-9274f327f3e5-node-mnt\") pod \"crc-storage-crc-k6qn2\" (UID: \"634b9fa8-9b32-4c25-9ef0-9274f327f3e5\") " pod="crc-storage/crc-storage-crc-k6qn2" Oct 07 13:38:58 crc kubenswrapper[4854]: I1007 13:38:58.864705 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/634b9fa8-9b32-4c25-9ef0-9274f327f3e5-crc-storage\") pod \"crc-storage-crc-k6qn2\" (UID: \"634b9fa8-9b32-4c25-9ef0-9274f327f3e5\") " pod="crc-storage/crc-storage-crc-k6qn2" Oct 07 13:38:58 crc kubenswrapper[4854]: I1007 13:38:58.894495 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zlhw\" (UniqueName: \"kubernetes.io/projected/634b9fa8-9b32-4c25-9ef0-9274f327f3e5-kube-api-access-6zlhw\") pod \"crc-storage-crc-k6qn2\" (UID: \"634b9fa8-9b32-4c25-9ef0-9274f327f3e5\") " pod="crc-storage/crc-storage-crc-k6qn2" Oct 07 13:38:58 crc kubenswrapper[4854]: I1007 13:38:58.987831 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-k6qn2" Oct 07 13:38:59 crc kubenswrapper[4854]: I1007 13:38:59.339773 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-k6qn2"] Oct 07 13:38:59 crc kubenswrapper[4854]: I1007 13:38:59.342685 4854 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 13:38:59 crc kubenswrapper[4854]: I1007 13:38:59.791360 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-k6qn2" event={"ID":"634b9fa8-9b32-4c25-9ef0-9274f327f3e5","Type":"ContainerStarted","Data":"58d821fdc7ce62b9ac45ea858ab9510d15a9451cbeab2e56d1b7ac86938318e2"} Oct 07 13:39:00 crc kubenswrapper[4854]: I1007 13:39:00.803792 4854 generic.go:334] "Generic (PLEG): container finished" podID="634b9fa8-9b32-4c25-9ef0-9274f327f3e5" containerID="be9e942096e7b803f05d8791b4faecf3c78506b6dc83e190bda095657995872a" exitCode=0 Oct 07 13:39:00 crc kubenswrapper[4854]: I1007 13:39:00.803875 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-k6qn2" event={"ID":"634b9fa8-9b32-4c25-9ef0-9274f327f3e5","Type":"ContainerDied","Data":"be9e942096e7b803f05d8791b4faecf3c78506b6dc83e190bda095657995872a"} Oct 07 13:39:02 crc kubenswrapper[4854]: I1007 13:39:02.215784 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-k6qn2" Oct 07 13:39:02 crc kubenswrapper[4854]: I1007 13:39:02.326382 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/634b9fa8-9b32-4c25-9ef0-9274f327f3e5-crc-storage\") pod \"634b9fa8-9b32-4c25-9ef0-9274f327f3e5\" (UID: \"634b9fa8-9b32-4c25-9ef0-9274f327f3e5\") " Oct 07 13:39:02 crc kubenswrapper[4854]: I1007 13:39:02.326464 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/634b9fa8-9b32-4c25-9ef0-9274f327f3e5-node-mnt\") pod \"634b9fa8-9b32-4c25-9ef0-9274f327f3e5\" (UID: \"634b9fa8-9b32-4c25-9ef0-9274f327f3e5\") " Oct 07 13:39:02 crc kubenswrapper[4854]: I1007 13:39:02.326501 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zlhw\" (UniqueName: \"kubernetes.io/projected/634b9fa8-9b32-4c25-9ef0-9274f327f3e5-kube-api-access-6zlhw\") pod \"634b9fa8-9b32-4c25-9ef0-9274f327f3e5\" (UID: \"634b9fa8-9b32-4c25-9ef0-9274f327f3e5\") " Oct 07 13:39:02 crc kubenswrapper[4854]: I1007 13:39:02.326664 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/634b9fa8-9b32-4c25-9ef0-9274f327f3e5-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "634b9fa8-9b32-4c25-9ef0-9274f327f3e5" (UID: "634b9fa8-9b32-4c25-9ef0-9274f327f3e5"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 13:39:02 crc kubenswrapper[4854]: I1007 13:39:02.326973 4854 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/634b9fa8-9b32-4c25-9ef0-9274f327f3e5-node-mnt\") on node \"crc\" DevicePath \"\"" Oct 07 13:39:02 crc kubenswrapper[4854]: I1007 13:39:02.334233 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/634b9fa8-9b32-4c25-9ef0-9274f327f3e5-kube-api-access-6zlhw" (OuterVolumeSpecName: "kube-api-access-6zlhw") pod "634b9fa8-9b32-4c25-9ef0-9274f327f3e5" (UID: "634b9fa8-9b32-4c25-9ef0-9274f327f3e5"). InnerVolumeSpecName "kube-api-access-6zlhw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:39:02 crc kubenswrapper[4854]: I1007 13:39:02.361448 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/634b9fa8-9b32-4c25-9ef0-9274f327f3e5-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "634b9fa8-9b32-4c25-9ef0-9274f327f3e5" (UID: "634b9fa8-9b32-4c25-9ef0-9274f327f3e5"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:39:02 crc kubenswrapper[4854]: I1007 13:39:02.428449 4854 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/634b9fa8-9b32-4c25-9ef0-9274f327f3e5-crc-storage\") on node \"crc\" DevicePath \"\"" Oct 07 13:39:02 crc kubenswrapper[4854]: I1007 13:39:02.428481 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zlhw\" (UniqueName: \"kubernetes.io/projected/634b9fa8-9b32-4c25-9ef0-9274f327f3e5-kube-api-access-6zlhw\") on node \"crc\" DevicePath \"\"" Oct 07 13:39:02 crc kubenswrapper[4854]: I1007 13:39:02.823668 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-k6qn2" event={"ID":"634b9fa8-9b32-4c25-9ef0-9274f327f3e5","Type":"ContainerDied","Data":"58d821fdc7ce62b9ac45ea858ab9510d15a9451cbeab2e56d1b7ac86938318e2"} Oct 07 13:39:02 crc kubenswrapper[4854]: I1007 13:39:02.823740 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="58d821fdc7ce62b9ac45ea858ab9510d15a9451cbeab2e56d1b7ac86938318e2" Oct 07 13:39:02 crc kubenswrapper[4854]: I1007 13:39:02.823764 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-k6qn2" Oct 07 13:39:04 crc kubenswrapper[4854]: I1007 13:39:04.670436 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-k6qn2"] Oct 07 13:39:04 crc kubenswrapper[4854]: I1007 13:39:04.678950 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-k6qn2"] Oct 07 13:39:04 crc kubenswrapper[4854]: I1007 13:39:04.712007 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="634b9fa8-9b32-4c25-9ef0-9274f327f3e5" path="/var/lib/kubelet/pods/634b9fa8-9b32-4c25-9ef0-9274f327f3e5/volumes" Oct 07 13:39:04 crc kubenswrapper[4854]: I1007 13:39:04.802618 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-fwpt9"] Oct 07 13:39:04 crc kubenswrapper[4854]: E1007 13:39:04.803171 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="634b9fa8-9b32-4c25-9ef0-9274f327f3e5" containerName="storage" Oct 07 13:39:04 crc kubenswrapper[4854]: I1007 13:39:04.803192 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="634b9fa8-9b32-4c25-9ef0-9274f327f3e5" containerName="storage" Oct 07 13:39:04 crc kubenswrapper[4854]: I1007 13:39:04.803480 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="634b9fa8-9b32-4c25-9ef0-9274f327f3e5" containerName="storage" Oct 07 13:39:04 crc kubenswrapper[4854]: I1007 13:39:04.804713 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-fwpt9" Oct 07 13:39:04 crc kubenswrapper[4854]: I1007 13:39:04.808428 4854 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-glcct" Oct 07 13:39:04 crc kubenswrapper[4854]: I1007 13:39:04.809682 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Oct 07 13:39:04 crc kubenswrapper[4854]: I1007 13:39:04.809871 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Oct 07 13:39:04 crc kubenswrapper[4854]: I1007 13:39:04.810511 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Oct 07 13:39:04 crc kubenswrapper[4854]: I1007 13:39:04.811214 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-fwpt9"] Oct 07 13:39:04 crc kubenswrapper[4854]: I1007 13:39:04.970023 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/dc10c909-494b-4047-8f3a-3a0832cefbe5-node-mnt\") pod \"crc-storage-crc-fwpt9\" (UID: \"dc10c909-494b-4047-8f3a-3a0832cefbe5\") " pod="crc-storage/crc-storage-crc-fwpt9" Oct 07 13:39:04 crc kubenswrapper[4854]: I1007 13:39:04.971258 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x78s6\" (UniqueName: \"kubernetes.io/projected/dc10c909-494b-4047-8f3a-3a0832cefbe5-kube-api-access-x78s6\") pod \"crc-storage-crc-fwpt9\" (UID: \"dc10c909-494b-4047-8f3a-3a0832cefbe5\") " pod="crc-storage/crc-storage-crc-fwpt9" Oct 07 13:39:04 crc kubenswrapper[4854]: I1007 13:39:04.971347 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/dc10c909-494b-4047-8f3a-3a0832cefbe5-crc-storage\") pod \"crc-storage-crc-fwpt9\" (UID: \"dc10c909-494b-4047-8f3a-3a0832cefbe5\") " pod="crc-storage/crc-storage-crc-fwpt9" Oct 07 13:39:05 crc kubenswrapper[4854]: I1007 13:39:05.072333 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/dc10c909-494b-4047-8f3a-3a0832cefbe5-crc-storage\") pod \"crc-storage-crc-fwpt9\" (UID: \"dc10c909-494b-4047-8f3a-3a0832cefbe5\") " pod="crc-storage/crc-storage-crc-fwpt9" Oct 07 13:39:05 crc kubenswrapper[4854]: I1007 13:39:05.072934 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/dc10c909-494b-4047-8f3a-3a0832cefbe5-node-mnt\") pod \"crc-storage-crc-fwpt9\" (UID: \"dc10c909-494b-4047-8f3a-3a0832cefbe5\") " pod="crc-storage/crc-storage-crc-fwpt9" Oct 07 13:39:05 crc kubenswrapper[4854]: I1007 13:39:05.073197 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x78s6\" (UniqueName: \"kubernetes.io/projected/dc10c909-494b-4047-8f3a-3a0832cefbe5-kube-api-access-x78s6\") pod \"crc-storage-crc-fwpt9\" (UID: \"dc10c909-494b-4047-8f3a-3a0832cefbe5\") " pod="crc-storage/crc-storage-crc-fwpt9" Oct 07 13:39:05 crc kubenswrapper[4854]: I1007 13:39:05.073360 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/dc10c909-494b-4047-8f3a-3a0832cefbe5-node-mnt\") pod \"crc-storage-crc-fwpt9\" (UID: \"dc10c909-494b-4047-8f3a-3a0832cefbe5\") " 
pod="crc-storage/crc-storage-crc-fwpt9" Oct 07 13:39:05 crc kubenswrapper[4854]: I1007 13:39:05.074086 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/dc10c909-494b-4047-8f3a-3a0832cefbe5-crc-storage\") pod \"crc-storage-crc-fwpt9\" (UID: \"dc10c909-494b-4047-8f3a-3a0832cefbe5\") " pod="crc-storage/crc-storage-crc-fwpt9" Oct 07 13:39:05 crc kubenswrapper[4854]: I1007 13:39:05.097236 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x78s6\" (UniqueName: \"kubernetes.io/projected/dc10c909-494b-4047-8f3a-3a0832cefbe5-kube-api-access-x78s6\") pod \"crc-storage-crc-fwpt9\" (UID: \"dc10c909-494b-4047-8f3a-3a0832cefbe5\") " pod="crc-storage/crc-storage-crc-fwpt9" Oct 07 13:39:05 crc kubenswrapper[4854]: I1007 13:39:05.130625 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-fwpt9" Oct 07 13:39:05 crc kubenswrapper[4854]: I1007 13:39:05.410560 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-fwpt9"] Oct 07 13:39:05 crc kubenswrapper[4854]: I1007 13:39:05.849515 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-fwpt9" event={"ID":"dc10c909-494b-4047-8f3a-3a0832cefbe5","Type":"ContainerStarted","Data":"524a0989e5c3745047c7e1f80a897fa94a0a5917e31db4dca747ac072583887e"} Oct 07 13:39:06 crc kubenswrapper[4854]: I1007 13:39:06.862364 4854 generic.go:334] "Generic (PLEG): container finished" podID="dc10c909-494b-4047-8f3a-3a0832cefbe5" containerID="6301bd9bac443a4addabd5060acd995633322cc8c5d35b084d4eac560a5170f9" exitCode=0 Oct 07 13:39:06 crc kubenswrapper[4854]: I1007 13:39:06.862474 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-fwpt9" event={"ID":"dc10c909-494b-4047-8f3a-3a0832cefbe5","Type":"ContainerDied","Data":"6301bd9bac443a4addabd5060acd995633322cc8c5d35b084d4eac560a5170f9"} Oct 07 13:39:08 crc kubenswrapper[4854]: I1007 13:39:08.246353 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-fwpt9" Oct 07 13:39:08 crc kubenswrapper[4854]: I1007 13:39:08.429439 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/dc10c909-494b-4047-8f3a-3a0832cefbe5-crc-storage\") pod \"dc10c909-494b-4047-8f3a-3a0832cefbe5\" (UID: \"dc10c909-494b-4047-8f3a-3a0832cefbe5\") " Oct 07 13:39:08 crc kubenswrapper[4854]: I1007 13:39:08.429618 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x78s6\" (UniqueName: \"kubernetes.io/projected/dc10c909-494b-4047-8f3a-3a0832cefbe5-kube-api-access-x78s6\") pod \"dc10c909-494b-4047-8f3a-3a0832cefbe5\" (UID: \"dc10c909-494b-4047-8f3a-3a0832cefbe5\") " Oct 07 13:39:08 crc kubenswrapper[4854]: I1007 13:39:08.429718 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/dc10c909-494b-4047-8f3a-3a0832cefbe5-node-mnt\") pod \"dc10c909-494b-4047-8f3a-3a0832cefbe5\" (UID: \"dc10c909-494b-4047-8f3a-3a0832cefbe5\") " Oct 07 13:39:08 crc kubenswrapper[4854]: I1007 13:39:08.429993 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dc10c909-494b-4047-8f3a-3a0832cefbe5-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "dc10c909-494b-4047-8f3a-3a0832cefbe5" (UID: "dc10c909-494b-4047-8f3a-3a0832cefbe5"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 13:39:08 crc kubenswrapper[4854]: I1007 13:39:08.430491 4854 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/dc10c909-494b-4047-8f3a-3a0832cefbe5-node-mnt\") on node \"crc\" DevicePath \"\"" Oct 07 13:39:08 crc kubenswrapper[4854]: I1007 13:39:08.435591 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc10c909-494b-4047-8f3a-3a0832cefbe5-kube-api-access-x78s6" (OuterVolumeSpecName: "kube-api-access-x78s6") pod "dc10c909-494b-4047-8f3a-3a0832cefbe5" (UID: "dc10c909-494b-4047-8f3a-3a0832cefbe5"). InnerVolumeSpecName "kube-api-access-x78s6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:39:08 crc kubenswrapper[4854]: I1007 13:39:08.454043 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc10c909-494b-4047-8f3a-3a0832cefbe5-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "dc10c909-494b-4047-8f3a-3a0832cefbe5" (UID: "dc10c909-494b-4047-8f3a-3a0832cefbe5"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:39:08 crc kubenswrapper[4854]: I1007 13:39:08.531877 4854 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/dc10c909-494b-4047-8f3a-3a0832cefbe5-crc-storage\") on node \"crc\" DevicePath \"\"" Oct 07 13:39:08 crc kubenswrapper[4854]: I1007 13:39:08.531924 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x78s6\" (UniqueName: \"kubernetes.io/projected/dc10c909-494b-4047-8f3a-3a0832cefbe5-kube-api-access-x78s6\") on node \"crc\" DevicePath \"\"" Oct 07 13:39:08 crc kubenswrapper[4854]: I1007 13:39:08.882868 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-fwpt9" event={"ID":"dc10c909-494b-4047-8f3a-3a0832cefbe5","Type":"ContainerDied","Data":"524a0989e5c3745047c7e1f80a897fa94a0a5917e31db4dca747ac072583887e"} Oct 07 13:39:08 crc kubenswrapper[4854]: I1007 13:39:08.883271 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="524a0989e5c3745047c7e1f80a897fa94a0a5917e31db4dca747ac072583887e" Oct 07 13:39:08 crc kubenswrapper[4854]: I1007 13:39:08.882962 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-fwpt9" Oct 07 13:39:20 crc kubenswrapper[4854]: I1007 13:39:20.627801 4854 scope.go:117] "RemoveContainer" containerID="2f297ee6c743aa9873760accb9f8555a341df361ace6131af3a2092c7dc90149" Oct 07 13:39:40 crc kubenswrapper[4854]: I1007 13:39:40.807703 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:39:40 crc kubenswrapper[4854]: I1007 13:39:40.808626 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:40:10 crc kubenswrapper[4854]: I1007 13:40:10.807906 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:40:10 crc kubenswrapper[4854]: I1007 13:40:10.808583 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:40:40 crc kubenswrapper[4854]: I1007 13:40:40.807884 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:40:40 crc kubenswrapper[4854]: I1007 13:40:40.808582 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" 
podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:40:40 crc kubenswrapper[4854]: I1007 13:40:40.808643 4854 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" Oct 07 13:40:40 crc kubenswrapper[4854]: I1007 13:40:40.809353 4854 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e83798b11b7f463b027896c1ec862c4cbd843df2676156232deaf979d790ceea"} pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 13:40:40 crc kubenswrapper[4854]: I1007 13:40:40.809417 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" containerID="cri-o://e83798b11b7f463b027896c1ec862c4cbd843df2676156232deaf979d790ceea" gracePeriod=600 Oct 07 13:40:40 crc kubenswrapper[4854]: E1007 13:40:40.953755 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:40:41 crc kubenswrapper[4854]: I1007 13:40:41.753400 4854 generic.go:334] "Generic (PLEG): container finished" podID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerID="e83798b11b7f463b027896c1ec862c4cbd843df2676156232deaf979d790ceea" exitCode=0 Oct 07 13:40:41 crc kubenswrapper[4854]: I1007 13:40:41.753461 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" event={"ID":"40b8b82d-cfd5-41d7-8673-5774db092c85","Type":"ContainerDied","Data":"e83798b11b7f463b027896c1ec862c4cbd843df2676156232deaf979d790ceea"} Oct 07 13:40:41 crc kubenswrapper[4854]: I1007 13:40:41.753850 4854 scope.go:117] "RemoveContainer" containerID="eda8261d4908d65e0e20b4ac93a6888107ad25b338c39cca102080a541d21d95" Oct 07 13:40:41 crc kubenswrapper[4854]: I1007 13:40:41.754530 4854 scope.go:117] "RemoveContainer" containerID="e83798b11b7f463b027896c1ec862c4cbd843df2676156232deaf979d790ceea" Oct 07 13:40:41 crc kubenswrapper[4854]: E1007 13:40:41.754959 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:40:53 crc kubenswrapper[4854]: I1007 13:40:53.703743 4854 scope.go:117] "RemoveContainer" containerID="e83798b11b7f463b027896c1ec862c4cbd843df2676156232deaf979d790ceea" Oct 07 13:40:53 crc kubenswrapper[4854]: E1007 13:40:53.704771 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:41:08 crc kubenswrapper[4854]: I1007 13:41:08.703818 4854 scope.go:117] "RemoveContainer" containerID="e83798b11b7f463b027896c1ec862c4cbd843df2676156232deaf979d790ceea" Oct 07 13:41:08 crc kubenswrapper[4854]: E1007 13:41:08.704810 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:41:21 crc kubenswrapper[4854]: I1007 13:41:21.702958 4854 scope.go:117] "RemoveContainer" containerID="e83798b11b7f463b027896c1ec862c4cbd843df2676156232deaf979d790ceea" Oct 07 13:41:21 crc kubenswrapper[4854]: E1007 13:41:21.703751 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:41:33 crc kubenswrapper[4854]: I1007 13:41:33.703646 4854 scope.go:117] "RemoveContainer" containerID="e83798b11b7f463b027896c1ec862c4cbd843df2676156232deaf979d790ceea" Oct 07 13:41:33 crc kubenswrapper[4854]: E1007 13:41:33.705006 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:41:45 crc kubenswrapper[4854]: I1007 13:41:45.702624 4854 scope.go:117] "RemoveContainer" containerID="e83798b11b7f463b027896c1ec862c4cbd843df2676156232deaf979d790ceea" Oct 07 13:41:45 crc kubenswrapper[4854]: E1007 13:41:45.703246 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:41:59 crc kubenswrapper[4854]: I1007 13:41:59.703081 4854 scope.go:117] "RemoveContainer" containerID="e83798b11b7f463b027896c1ec862c4cbd843df2676156232deaf979d790ceea" Oct 07 13:41:59 crc kubenswrapper[4854]: E1007 13:41:59.704213 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:42:04 crc kubenswrapper[4854]: I1007 13:42:04.626809 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-47hrb"] Oct 07 13:42:04 crc kubenswrapper[4854]: E1007 13:42:04.628188 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc10c909-494b-4047-8f3a-3a0832cefbe5" containerName="storage" Oct 07 13:42:04 crc kubenswrapper[4854]: I1007 13:42:04.628220 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc10c909-494b-4047-8f3a-3a0832cefbe5" containerName="storage" Oct 07 13:42:04 crc kubenswrapper[4854]: I1007 13:42:04.628573 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc10c909-494b-4047-8f3a-3a0832cefbe5" containerName="storage" Oct 07 13:42:04 crc kubenswrapper[4854]: I1007 13:42:04.630703 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-47hrb" Oct 07 13:42:04 crc kubenswrapper[4854]: I1007 13:42:04.648676 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-47hrb"] Oct 07 13:42:04 crc kubenswrapper[4854]: I1007 13:42:04.669598 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2c0c718-8a98-4a91-9deb-ee79445ee1e8-utilities\") pod \"community-operators-47hrb\" (UID: \"d2c0c718-8a98-4a91-9deb-ee79445ee1e8\") " pod="openshift-marketplace/community-operators-47hrb" Oct 07 13:42:04 crc kubenswrapper[4854]: I1007 13:42:04.669979 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vrr2\" (UniqueName: \"kubernetes.io/projected/d2c0c718-8a98-4a91-9deb-ee79445ee1e8-kube-api-access-7vrr2\") pod \"community-operators-47hrb\" (UID: \"d2c0c718-8a98-4a91-9deb-ee79445ee1e8\") " pod="openshift-marketplace/community-operators-47hrb" Oct 07 13:42:04 crc kubenswrapper[4854]: I1007 13:42:04.670064 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2c0c718-8a98-4a91-9deb-ee79445ee1e8-catalog-content\") pod \"community-operators-47hrb\" (UID: \"d2c0c718-8a98-4a91-9deb-ee79445ee1e8\") " pod="openshift-marketplace/community-operators-47hrb" Oct 07 13:42:04 crc kubenswrapper[4854]: I1007 13:42:04.771105 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vrr2\" (UniqueName: \"kubernetes.io/projected/d2c0c718-8a98-4a91-9deb-ee79445ee1e8-kube-api-access-7vrr2\") pod \"community-operators-47hrb\" (UID: \"d2c0c718-8a98-4a91-9deb-ee79445ee1e8\") " pod="openshift-marketplace/community-operators-47hrb" Oct 07 13:42:04 crc kubenswrapper[4854]: I1007 13:42:04.771503 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2c0c718-8a98-4a91-9deb-ee79445ee1e8-catalog-content\") pod \"community-operators-47hrb\" (UID: \"d2c0c718-8a98-4a91-9deb-ee79445ee1e8\") " pod="openshift-marketplace/community-operators-47hrb" Oct 07 13:42:04 crc kubenswrapper[4854]: I1007 13:42:04.771584 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2c0c718-8a98-4a91-9deb-ee79445ee1e8-utilities\") pod 
\"community-operators-47hrb\" (UID: \"d2c0c718-8a98-4a91-9deb-ee79445ee1e8\") " pod="openshift-marketplace/community-operators-47hrb" Oct 07 13:42:04 crc kubenswrapper[4854]: I1007 13:42:04.772010 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2c0c718-8a98-4a91-9deb-ee79445ee1e8-catalog-content\") pod \"community-operators-47hrb\" (UID: \"d2c0c718-8a98-4a91-9deb-ee79445ee1e8\") " pod="openshift-marketplace/community-operators-47hrb" Oct 07 13:42:04 crc kubenswrapper[4854]: I1007 13:42:04.772104 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2c0c718-8a98-4a91-9deb-ee79445ee1e8-utilities\") pod \"community-operators-47hrb\" (UID: \"d2c0c718-8a98-4a91-9deb-ee79445ee1e8\") " pod="openshift-marketplace/community-operators-47hrb" Oct 07 13:42:04 crc kubenswrapper[4854]: I1007 13:42:04.802372 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vrr2\" (UniqueName: \"kubernetes.io/projected/d2c0c718-8a98-4a91-9deb-ee79445ee1e8-kube-api-access-7vrr2\") pod \"community-operators-47hrb\" (UID: \"d2c0c718-8a98-4a91-9deb-ee79445ee1e8\") " pod="openshift-marketplace/community-operators-47hrb" Oct 07 13:42:04 crc kubenswrapper[4854]: I1007 13:42:04.957023 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-47hrb" Oct 07 13:42:05 crc kubenswrapper[4854]: I1007 13:42:05.504999 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-47hrb"] Oct 07 13:42:05 crc kubenswrapper[4854]: I1007 13:42:05.565051 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-47hrb" event={"ID":"d2c0c718-8a98-4a91-9deb-ee79445ee1e8","Type":"ContainerStarted","Data":"8e28b2c53ea62b6348ef03b04491ceb140382c025baa19c3ef3269709e06fb63"} Oct 07 13:42:06 crc kubenswrapper[4854]: I1007 13:42:06.578761 4854 generic.go:334] "Generic (PLEG): container finished" podID="d2c0c718-8a98-4a91-9deb-ee79445ee1e8" containerID="18c22bc2916376529a73f0ecf48105093810df22057abc99d572b5b7c54489f7" exitCode=0 Oct 07 13:42:06 crc kubenswrapper[4854]: I1007 13:42:06.579067 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-47hrb" event={"ID":"d2c0c718-8a98-4a91-9deb-ee79445ee1e8","Type":"ContainerDied","Data":"18c22bc2916376529a73f0ecf48105093810df22057abc99d572b5b7c54489f7"} Oct 07 13:42:08 crc kubenswrapper[4854]: I1007 13:42:08.603842 4854 generic.go:334] "Generic (PLEG): container finished" podID="d2c0c718-8a98-4a91-9deb-ee79445ee1e8" containerID="9d7058f56625b72382ccb3b766a794327bb29db3ea1f0159ffe9b08759b8b650" exitCode=0 Oct 07 13:42:08 crc kubenswrapper[4854]: I1007 13:42:08.603938 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-47hrb" event={"ID":"d2c0c718-8a98-4a91-9deb-ee79445ee1e8","Type":"ContainerDied","Data":"9d7058f56625b72382ccb3b766a794327bb29db3ea1f0159ffe9b08759b8b650"} Oct 07 13:42:10 crc kubenswrapper[4854]: I1007 13:42:10.630955 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-47hrb" event={"ID":"d2c0c718-8a98-4a91-9deb-ee79445ee1e8","Type":"ContainerStarted","Data":"c464c8430e1a2e9d750bcd0c727ee79d33a576c6aca4188462276619df7253e4"} Oct 07 13:42:12 crc kubenswrapper[4854]: I1007 13:42:12.703027 4854 scope.go:117] 
"RemoveContainer" containerID="e83798b11b7f463b027896c1ec862c4cbd843df2676156232deaf979d790ceea" Oct 07 13:42:12 crc kubenswrapper[4854]: E1007 13:42:12.704518 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:42:14 crc kubenswrapper[4854]: I1007 13:42:14.957430 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-47hrb" Oct 07 13:42:14 crc kubenswrapper[4854]: I1007 13:42:14.957796 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-47hrb" Oct 07 13:42:15 crc kubenswrapper[4854]: I1007 13:42:15.018762 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-47hrb" Oct 07 13:42:15 crc kubenswrapper[4854]: I1007 13:42:15.047594 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-47hrb" podStartSLOduration=8.539599643 podStartE2EDuration="11.047561409s" podCreationTimestamp="2025-10-07 13:42:04 +0000 UTC" firstStartedPulling="2025-10-07 13:42:06.583011249 +0000 UTC m=+4642.570843514" lastFinishedPulling="2025-10-07 13:42:09.090972985 +0000 UTC m=+4645.078805280" observedRunningTime="2025-10-07 13:42:10.657934316 +0000 UTC m=+4646.645766571" watchObservedRunningTime="2025-10-07 13:42:15.047561409 +0000 UTC m=+4651.035393704" Oct 07 13:42:15 crc kubenswrapper[4854]: I1007 13:42:15.739713 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-47hrb" Oct 07 13:42:15 crc kubenswrapper[4854]: I1007 13:42:15.784367 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-47hrb"] Oct 07 13:42:17 crc kubenswrapper[4854]: I1007 13:42:17.698387 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-47hrb" podUID="d2c0c718-8a98-4a91-9deb-ee79445ee1e8" containerName="registry-server" containerID="cri-o://c464c8430e1a2e9d750bcd0c727ee79d33a576c6aca4188462276619df7253e4" gracePeriod=2 Oct 07 13:42:18 crc kubenswrapper[4854]: I1007 13:42:18.715312 4854 generic.go:334] "Generic (PLEG): container finished" podID="d2c0c718-8a98-4a91-9deb-ee79445ee1e8" containerID="c464c8430e1a2e9d750bcd0c727ee79d33a576c6aca4188462276619df7253e4" exitCode=0 Oct 07 13:42:18 crc kubenswrapper[4854]: I1007 13:42:18.726175 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-47hrb" event={"ID":"d2c0c718-8a98-4a91-9deb-ee79445ee1e8","Type":"ContainerDied","Data":"c464c8430e1a2e9d750bcd0c727ee79d33a576c6aca4188462276619df7253e4"} Oct 07 13:42:18 crc kubenswrapper[4854]: I1007 13:42:18.726216 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-47hrb" event={"ID":"d2c0c718-8a98-4a91-9deb-ee79445ee1e8","Type":"ContainerDied","Data":"8e28b2c53ea62b6348ef03b04491ceb140382c025baa19c3ef3269709e06fb63"} Oct 07 13:42:18 crc kubenswrapper[4854]: I1007 13:42:18.726227 4854 pod_container_deletor.go:80] "Container not found in 
pod's containers" containerID="8e28b2c53ea62b6348ef03b04491ceb140382c025baa19c3ef3269709e06fb63" Oct 07 13:42:18 crc kubenswrapper[4854]: I1007 13:42:18.731215 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-47hrb" Oct 07 13:42:18 crc kubenswrapper[4854]: I1007 13:42:18.789938 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vrr2\" (UniqueName: \"kubernetes.io/projected/d2c0c718-8a98-4a91-9deb-ee79445ee1e8-kube-api-access-7vrr2\") pod \"d2c0c718-8a98-4a91-9deb-ee79445ee1e8\" (UID: \"d2c0c718-8a98-4a91-9deb-ee79445ee1e8\") " Oct 07 13:42:18 crc kubenswrapper[4854]: I1007 13:42:18.795846 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2c0c718-8a98-4a91-9deb-ee79445ee1e8-kube-api-access-7vrr2" (OuterVolumeSpecName: "kube-api-access-7vrr2") pod "d2c0c718-8a98-4a91-9deb-ee79445ee1e8" (UID: "d2c0c718-8a98-4a91-9deb-ee79445ee1e8"). InnerVolumeSpecName "kube-api-access-7vrr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:42:18 crc kubenswrapper[4854]: I1007 13:42:18.891423 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2c0c718-8a98-4a91-9deb-ee79445ee1e8-catalog-content\") pod \"d2c0c718-8a98-4a91-9deb-ee79445ee1e8\" (UID: \"d2c0c718-8a98-4a91-9deb-ee79445ee1e8\") " Oct 07 13:42:18 crc kubenswrapper[4854]: I1007 13:42:18.891606 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2c0c718-8a98-4a91-9deb-ee79445ee1e8-utilities\") pod \"d2c0c718-8a98-4a91-9deb-ee79445ee1e8\" (UID: \"d2c0c718-8a98-4a91-9deb-ee79445ee1e8\") " Oct 07 13:42:18 crc kubenswrapper[4854]: I1007 13:42:18.892230 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vrr2\" (UniqueName: \"kubernetes.io/projected/d2c0c718-8a98-4a91-9deb-ee79445ee1e8-kube-api-access-7vrr2\") on node \"crc\" DevicePath \"\"" Oct 07 13:42:18 crc kubenswrapper[4854]: I1007 13:42:18.893110 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2c0c718-8a98-4a91-9deb-ee79445ee1e8-utilities" (OuterVolumeSpecName: "utilities") pod "d2c0c718-8a98-4a91-9deb-ee79445ee1e8" (UID: "d2c0c718-8a98-4a91-9deb-ee79445ee1e8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:42:18 crc kubenswrapper[4854]: I1007 13:42:18.993818 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2c0c718-8a98-4a91-9deb-ee79445ee1e8-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:42:19 crc kubenswrapper[4854]: I1007 13:42:19.193402 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2c0c718-8a98-4a91-9deb-ee79445ee1e8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d2c0c718-8a98-4a91-9deb-ee79445ee1e8" (UID: "d2c0c718-8a98-4a91-9deb-ee79445ee1e8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:42:19 crc kubenswrapper[4854]: I1007 13:42:19.197269 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2c0c718-8a98-4a91-9deb-ee79445ee1e8-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:42:19 crc kubenswrapper[4854]: I1007 13:42:19.723952 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-47hrb" Oct 07 13:42:19 crc kubenswrapper[4854]: I1007 13:42:19.775424 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-47hrb"] Oct 07 13:42:19 crc kubenswrapper[4854]: I1007 13:42:19.780951 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-47hrb"] Oct 07 13:42:20 crc kubenswrapper[4854]: I1007 13:42:20.718818 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2c0c718-8a98-4a91-9deb-ee79445ee1e8" path="/var/lib/kubelet/pods/d2c0c718-8a98-4a91-9deb-ee79445ee1e8/volumes" Oct 07 13:42:26 crc kubenswrapper[4854]: I1007 13:42:26.702607 4854 scope.go:117] "RemoveContainer" containerID="e83798b11b7f463b027896c1ec862c4cbd843df2676156232deaf979d790ceea" Oct 07 13:42:26 crc kubenswrapper[4854]: E1007 13:42:26.703183 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:42:28 crc kubenswrapper[4854]: I1007 13:42:28.597093 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-dzrd5"] Oct 07 13:42:28 crc kubenswrapper[4854]: E1007 13:42:28.597387 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2c0c718-8a98-4a91-9deb-ee79445ee1e8" containerName="extract-content" Oct 07 13:42:28 crc kubenswrapper[4854]: I1007 13:42:28.597399 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2c0c718-8a98-4a91-9deb-ee79445ee1e8" containerName="extract-content" Oct 07 13:42:28 crc kubenswrapper[4854]: E1007 13:42:28.597418 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2c0c718-8a98-4a91-9deb-ee79445ee1e8" containerName="registry-server" Oct 07 13:42:28 crc kubenswrapper[4854]: I1007 13:42:28.597424 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2c0c718-8a98-4a91-9deb-ee79445ee1e8" containerName="registry-server" Oct 07 13:42:28 crc kubenswrapper[4854]: E1007 13:42:28.597433 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2c0c718-8a98-4a91-9deb-ee79445ee1e8" containerName="extract-utilities" Oct 07 13:42:28 crc kubenswrapper[4854]: I1007 13:42:28.597440 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2c0c718-8a98-4a91-9deb-ee79445ee1e8" containerName="extract-utilities" Oct 07 13:42:28 crc kubenswrapper[4854]: I1007 13:42:28.597581 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2c0c718-8a98-4a91-9deb-ee79445ee1e8" containerName="registry-server" Oct 07 13:42:28 crc kubenswrapper[4854]: I1007 13:42:28.598272 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-dzrd5" Oct 07 13:42:28 crc kubenswrapper[4854]: I1007 13:42:28.605420 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 07 13:42:28 crc kubenswrapper[4854]: I1007 13:42:28.605485 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 07 13:42:28 crc kubenswrapper[4854]: I1007 13:42:28.605654 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-tpsvk" Oct 07 13:42:28 crc kubenswrapper[4854]: I1007 13:42:28.605743 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Oct 07 13:42:28 crc kubenswrapper[4854]: I1007 13:42:28.607937 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Oct 07 13:42:28 crc kubenswrapper[4854]: I1007 13:42:28.610373 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-dzrd5"] Oct 07 13:42:28 crc kubenswrapper[4854]: I1007 13:42:28.732929 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdcmh\" (UniqueName: \"kubernetes.io/projected/eb3e67ac-bb7e-4028-ab59-697edf4bcaca-kube-api-access-qdcmh\") pod \"dnsmasq-dns-5d7b5456f5-dzrd5\" (UID: \"eb3e67ac-bb7e-4028-ab59-697edf4bcaca\") " pod="openstack/dnsmasq-dns-5d7b5456f5-dzrd5" Oct 07 13:42:28 crc kubenswrapper[4854]: I1007 13:42:28.733655 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb3e67ac-bb7e-4028-ab59-697edf4bcaca-dns-svc\") pod \"dnsmasq-dns-5d7b5456f5-dzrd5\" (UID: \"eb3e67ac-bb7e-4028-ab59-697edf4bcaca\") " pod="openstack/dnsmasq-dns-5d7b5456f5-dzrd5" Oct 07 13:42:28 crc kubenswrapper[4854]: I1007 13:42:28.733793 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb3e67ac-bb7e-4028-ab59-697edf4bcaca-config\") pod \"dnsmasq-dns-5d7b5456f5-dzrd5\" (UID: \"eb3e67ac-bb7e-4028-ab59-697edf4bcaca\") " pod="openstack/dnsmasq-dns-5d7b5456f5-dzrd5" Oct 07 13:42:28 crc kubenswrapper[4854]: I1007 13:42:28.835178 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb3e67ac-bb7e-4028-ab59-697edf4bcaca-config\") pod \"dnsmasq-dns-5d7b5456f5-dzrd5\" (UID: \"eb3e67ac-bb7e-4028-ab59-697edf4bcaca\") " pod="openstack/dnsmasq-dns-5d7b5456f5-dzrd5" Oct 07 13:42:28 crc kubenswrapper[4854]: I1007 13:42:28.835250 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdcmh\" (UniqueName: \"kubernetes.io/projected/eb3e67ac-bb7e-4028-ab59-697edf4bcaca-kube-api-access-qdcmh\") pod \"dnsmasq-dns-5d7b5456f5-dzrd5\" (UID: \"eb3e67ac-bb7e-4028-ab59-697edf4bcaca\") " pod="openstack/dnsmasq-dns-5d7b5456f5-dzrd5" Oct 07 13:42:28 crc kubenswrapper[4854]: I1007 13:42:28.835274 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb3e67ac-bb7e-4028-ab59-697edf4bcaca-dns-svc\") pod \"dnsmasq-dns-5d7b5456f5-dzrd5\" (UID: \"eb3e67ac-bb7e-4028-ab59-697edf4bcaca\") " pod="openstack/dnsmasq-dns-5d7b5456f5-dzrd5" Oct 07 13:42:28 crc kubenswrapper[4854]: I1007 13:42:28.836075 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/eb3e67ac-bb7e-4028-ab59-697edf4bcaca-dns-svc\") pod \"dnsmasq-dns-5d7b5456f5-dzrd5\" (UID: \"eb3e67ac-bb7e-4028-ab59-697edf4bcaca\") " pod="openstack/dnsmasq-dns-5d7b5456f5-dzrd5" Oct 07 13:42:28 crc kubenswrapper[4854]: I1007 13:42:28.836595 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb3e67ac-bb7e-4028-ab59-697edf4bcaca-config\") pod \"dnsmasq-dns-5d7b5456f5-dzrd5\" (UID: \"eb3e67ac-bb7e-4028-ab59-697edf4bcaca\") " pod="openstack/dnsmasq-dns-5d7b5456f5-dzrd5" Oct 07 13:42:28 crc kubenswrapper[4854]: I1007 13:42:28.860686 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-mnlfq"] Oct 07 13:42:28 crc kubenswrapper[4854]: I1007 13:42:28.861859 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-mnlfq" Oct 07 13:42:28 crc kubenswrapper[4854]: I1007 13:42:28.866961 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdcmh\" (UniqueName: \"kubernetes.io/projected/eb3e67ac-bb7e-4028-ab59-697edf4bcaca-kube-api-access-qdcmh\") pod \"dnsmasq-dns-5d7b5456f5-dzrd5\" (UID: \"eb3e67ac-bb7e-4028-ab59-697edf4bcaca\") " pod="openstack/dnsmasq-dns-5d7b5456f5-dzrd5" Oct 07 13:42:28 crc kubenswrapper[4854]: I1007 13:42:28.884105 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-mnlfq"] Oct 07 13:42:28 crc kubenswrapper[4854]: I1007 13:42:28.917777 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-dzrd5" Oct 07 13:42:29 crc kubenswrapper[4854]: I1007 13:42:29.038106 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e2beed9-bee3-40eb-a67f-30e29fc2cae1-config\") pod \"dnsmasq-dns-98ddfc8f-mnlfq\" (UID: \"1e2beed9-bee3-40eb-a67f-30e29fc2cae1\") " pod="openstack/dnsmasq-dns-98ddfc8f-mnlfq" Oct 07 13:42:29 crc kubenswrapper[4854]: I1007 13:42:29.038327 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hsjx\" (UniqueName: \"kubernetes.io/projected/1e2beed9-bee3-40eb-a67f-30e29fc2cae1-kube-api-access-4hsjx\") pod \"dnsmasq-dns-98ddfc8f-mnlfq\" (UID: \"1e2beed9-bee3-40eb-a67f-30e29fc2cae1\") " pod="openstack/dnsmasq-dns-98ddfc8f-mnlfq" Oct 07 13:42:29 crc kubenswrapper[4854]: I1007 13:42:29.038396 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1e2beed9-bee3-40eb-a67f-30e29fc2cae1-dns-svc\") pod \"dnsmasq-dns-98ddfc8f-mnlfq\" (UID: \"1e2beed9-bee3-40eb-a67f-30e29fc2cae1\") " pod="openstack/dnsmasq-dns-98ddfc8f-mnlfq" Oct 07 13:42:29 crc kubenswrapper[4854]: I1007 13:42:29.139994 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e2beed9-bee3-40eb-a67f-30e29fc2cae1-config\") pod \"dnsmasq-dns-98ddfc8f-mnlfq\" (UID: \"1e2beed9-bee3-40eb-a67f-30e29fc2cae1\") " pod="openstack/dnsmasq-dns-98ddfc8f-mnlfq" Oct 07 13:42:29 crc kubenswrapper[4854]: I1007 13:42:29.140060 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hsjx\" (UniqueName: \"kubernetes.io/projected/1e2beed9-bee3-40eb-a67f-30e29fc2cae1-kube-api-access-4hsjx\") pod \"dnsmasq-dns-98ddfc8f-mnlfq\" (UID: 
\"1e2beed9-bee3-40eb-a67f-30e29fc2cae1\") " pod="openstack/dnsmasq-dns-98ddfc8f-mnlfq" Oct 07 13:42:29 crc kubenswrapper[4854]: I1007 13:42:29.140110 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1e2beed9-bee3-40eb-a67f-30e29fc2cae1-dns-svc\") pod \"dnsmasq-dns-98ddfc8f-mnlfq\" (UID: \"1e2beed9-bee3-40eb-a67f-30e29fc2cae1\") " pod="openstack/dnsmasq-dns-98ddfc8f-mnlfq" Oct 07 13:42:29 crc kubenswrapper[4854]: I1007 13:42:29.140898 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e2beed9-bee3-40eb-a67f-30e29fc2cae1-config\") pod \"dnsmasq-dns-98ddfc8f-mnlfq\" (UID: \"1e2beed9-bee3-40eb-a67f-30e29fc2cae1\") " pod="openstack/dnsmasq-dns-98ddfc8f-mnlfq" Oct 07 13:42:29 crc kubenswrapper[4854]: I1007 13:42:29.140971 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1e2beed9-bee3-40eb-a67f-30e29fc2cae1-dns-svc\") pod \"dnsmasq-dns-98ddfc8f-mnlfq\" (UID: \"1e2beed9-bee3-40eb-a67f-30e29fc2cae1\") " pod="openstack/dnsmasq-dns-98ddfc8f-mnlfq" Oct 07 13:42:29 crc kubenswrapper[4854]: I1007 13:42:29.175373 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hsjx\" (UniqueName: \"kubernetes.io/projected/1e2beed9-bee3-40eb-a67f-30e29fc2cae1-kube-api-access-4hsjx\") pod \"dnsmasq-dns-98ddfc8f-mnlfq\" (UID: \"1e2beed9-bee3-40eb-a67f-30e29fc2cae1\") " pod="openstack/dnsmasq-dns-98ddfc8f-mnlfq" Oct 07 13:42:29 crc kubenswrapper[4854]: I1007 13:42:29.206450 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-mnlfq" Oct 07 13:42:29 crc kubenswrapper[4854]: I1007 13:42:29.341446 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-dzrd5"] Oct 07 13:42:29 crc kubenswrapper[4854]: I1007 13:42:29.641198 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-mnlfq"] Oct 07 13:42:29 crc kubenswrapper[4854]: W1007 13:42:29.649260 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e2beed9_bee3_40eb_a67f_30e29fc2cae1.slice/crio-49dba889f339da33ef8be99ddb8c401cba630e84fbf2d13413e07ca3302a8110 WatchSource:0}: Error finding container 49dba889f339da33ef8be99ddb8c401cba630e84fbf2d13413e07ca3302a8110: Status 404 returned error can't find the container with id 49dba889f339da33ef8be99ddb8c401cba630e84fbf2d13413e07ca3302a8110 Oct 07 13:42:29 crc kubenswrapper[4854]: I1007 13:42:29.748619 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 07 13:42:29 crc kubenswrapper[4854]: I1007 13:42:29.750258 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 07 13:42:29 crc kubenswrapper[4854]: I1007 13:42:29.752191 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 07 13:42:29 crc kubenswrapper[4854]: I1007 13:42:29.752197 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 07 13:42:29 crc kubenswrapper[4854]: I1007 13:42:29.752252 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 07 13:42:29 crc kubenswrapper[4854]: I1007 13:42:29.752312 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-khxrd" Oct 07 13:42:29 crc kubenswrapper[4854]: I1007 13:42:29.752333 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 07 13:42:29 crc kubenswrapper[4854]: I1007 13:42:29.768166 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 07 13:42:29 crc kubenswrapper[4854]: I1007 13:42:29.823261 4854 generic.go:334] "Generic (PLEG): container finished" podID="eb3e67ac-bb7e-4028-ab59-697edf4bcaca" containerID="b5fef9720f0cc824ab8c41d2f0fba987b176b5985c2bb741692d1fe055315c4b" exitCode=0 Oct 07 13:42:29 crc kubenswrapper[4854]: I1007 13:42:29.823324 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-dzrd5" event={"ID":"eb3e67ac-bb7e-4028-ab59-697edf4bcaca","Type":"ContainerDied","Data":"b5fef9720f0cc824ab8c41d2f0fba987b176b5985c2bb741692d1fe055315c4b"} Oct 07 13:42:29 crc kubenswrapper[4854]: I1007 13:42:29.823350 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-dzrd5" event={"ID":"eb3e67ac-bb7e-4028-ab59-697edf4bcaca","Type":"ContainerStarted","Data":"486abe6fbf1ff4d80206bbb37a217f5c4d6335be92ca5dc6f411206b8454ec06"} Oct 07 13:42:29 crc kubenswrapper[4854]: I1007 13:42:29.833385 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-mnlfq" event={"ID":"1e2beed9-bee3-40eb-a67f-30e29fc2cae1","Type":"ContainerStarted","Data":"49dba889f339da33ef8be99ddb8c401cba630e84fbf2d13413e07ca3302a8110"} Oct 07 13:42:29 crc kubenswrapper[4854]: I1007 13:42:29.850195 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/289d0100-7bba-49ad-97d5-5da00aff7892-pod-info\") pod \"rabbitmq-server-0\" (UID: \"289d0100-7bba-49ad-97d5-5da00aff7892\") " pod="openstack/rabbitmq-server-0" Oct 07 13:42:29 crc kubenswrapper[4854]: I1007 13:42:29.850265 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/289d0100-7bba-49ad-97d5-5da00aff7892-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"289d0100-7bba-49ad-97d5-5da00aff7892\") " pod="openstack/rabbitmq-server-0" Oct 07 13:42:29 crc kubenswrapper[4854]: I1007 13:42:29.850306 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/289d0100-7bba-49ad-97d5-5da00aff7892-server-conf\") pod \"rabbitmq-server-0\" (UID: \"289d0100-7bba-49ad-97d5-5da00aff7892\") " pod="openstack/rabbitmq-server-0" Oct 07 13:42:29 crc kubenswrapper[4854]: I1007 13:42:29.850335 4854 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/289d0100-7bba-49ad-97d5-5da00aff7892-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"289d0100-7bba-49ad-97d5-5da00aff7892\") " pod="openstack/rabbitmq-server-0" Oct 07 13:42:29 crc kubenswrapper[4854]: I1007 13:42:29.850366 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/289d0100-7bba-49ad-97d5-5da00aff7892-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"289d0100-7bba-49ad-97d5-5da00aff7892\") " pod="openstack/rabbitmq-server-0" Oct 07 13:42:29 crc kubenswrapper[4854]: I1007 13:42:29.850406 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-db9ccd3b-741c-45c5-884e-66033016cffe\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-db9ccd3b-741c-45c5-884e-66033016cffe\") pod \"rabbitmq-server-0\" (UID: \"289d0100-7bba-49ad-97d5-5da00aff7892\") " pod="openstack/rabbitmq-server-0" Oct 07 13:42:29 crc kubenswrapper[4854]: I1007 13:42:29.850434 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/289d0100-7bba-49ad-97d5-5da00aff7892-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"289d0100-7bba-49ad-97d5-5da00aff7892\") " pod="openstack/rabbitmq-server-0" Oct 07 13:42:29 crc kubenswrapper[4854]: I1007 13:42:29.850463 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/289d0100-7bba-49ad-97d5-5da00aff7892-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"289d0100-7bba-49ad-97d5-5da00aff7892\") " pod="openstack/rabbitmq-server-0" Oct 07 13:42:29 crc kubenswrapper[4854]: I1007 13:42:29.850504 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdhrw\" (UniqueName: \"kubernetes.io/projected/289d0100-7bba-49ad-97d5-5da00aff7892-kube-api-access-kdhrw\") pod \"rabbitmq-server-0\" (UID: \"289d0100-7bba-49ad-97d5-5da00aff7892\") " pod="openstack/rabbitmq-server-0" Oct 07 13:42:29 crc kubenswrapper[4854]: I1007 13:42:29.951350 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/289d0100-7bba-49ad-97d5-5da00aff7892-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"289d0100-7bba-49ad-97d5-5da00aff7892\") " pod="openstack/rabbitmq-server-0" Oct 07 13:42:29 crc kubenswrapper[4854]: I1007 13:42:29.951399 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-db9ccd3b-741c-45c5-884e-66033016cffe\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-db9ccd3b-741c-45c5-884e-66033016cffe\") pod \"rabbitmq-server-0\" (UID: \"289d0100-7bba-49ad-97d5-5da00aff7892\") " pod="openstack/rabbitmq-server-0" Oct 07 13:42:29 crc kubenswrapper[4854]: I1007 13:42:29.951426 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/289d0100-7bba-49ad-97d5-5da00aff7892-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"289d0100-7bba-49ad-97d5-5da00aff7892\") " pod="openstack/rabbitmq-server-0" Oct 07 13:42:29 crc kubenswrapper[4854]: I1007 13:42:29.951451 4854 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/289d0100-7bba-49ad-97d5-5da00aff7892-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"289d0100-7bba-49ad-97d5-5da00aff7892\") " pod="openstack/rabbitmq-server-0" Oct 07 13:42:29 crc kubenswrapper[4854]: I1007 13:42:29.951471 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdhrw\" (UniqueName: \"kubernetes.io/projected/289d0100-7bba-49ad-97d5-5da00aff7892-kube-api-access-kdhrw\") pod \"rabbitmq-server-0\" (UID: \"289d0100-7bba-49ad-97d5-5da00aff7892\") " pod="openstack/rabbitmq-server-0" Oct 07 13:42:29 crc kubenswrapper[4854]: I1007 13:42:29.951498 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/289d0100-7bba-49ad-97d5-5da00aff7892-pod-info\") pod \"rabbitmq-server-0\" (UID: \"289d0100-7bba-49ad-97d5-5da00aff7892\") " pod="openstack/rabbitmq-server-0" Oct 07 13:42:29 crc kubenswrapper[4854]: I1007 13:42:29.951527 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/289d0100-7bba-49ad-97d5-5da00aff7892-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"289d0100-7bba-49ad-97d5-5da00aff7892\") " pod="openstack/rabbitmq-server-0" Oct 07 13:42:29 crc kubenswrapper[4854]: I1007 13:42:29.951563 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/289d0100-7bba-49ad-97d5-5da00aff7892-server-conf\") pod \"rabbitmq-server-0\" (UID: \"289d0100-7bba-49ad-97d5-5da00aff7892\") " pod="openstack/rabbitmq-server-0" Oct 07 13:42:29 crc kubenswrapper[4854]: I1007 13:42:29.951591 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/289d0100-7bba-49ad-97d5-5da00aff7892-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"289d0100-7bba-49ad-97d5-5da00aff7892\") " pod="openstack/rabbitmq-server-0" Oct 07 13:42:29 crc kubenswrapper[4854]: I1007 13:42:29.953115 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/289d0100-7bba-49ad-97d5-5da00aff7892-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"289d0100-7bba-49ad-97d5-5da00aff7892\") " pod="openstack/rabbitmq-server-0" Oct 07 13:42:29 crc kubenswrapper[4854]: I1007 13:42:29.953165 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/289d0100-7bba-49ad-97d5-5da00aff7892-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"289d0100-7bba-49ad-97d5-5da00aff7892\") " pod="openstack/rabbitmq-server-0" Oct 07 13:42:29 crc kubenswrapper[4854]: I1007 13:42:29.953399 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/289d0100-7bba-49ad-97d5-5da00aff7892-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"289d0100-7bba-49ad-97d5-5da00aff7892\") " pod="openstack/rabbitmq-server-0" Oct 07 13:42:29 crc kubenswrapper[4854]: I1007 13:42:29.954140 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/289d0100-7bba-49ad-97d5-5da00aff7892-server-conf\") pod \"rabbitmq-server-0\" (UID: \"289d0100-7bba-49ad-97d5-5da00aff7892\") " 
pod="openstack/rabbitmq-server-0" Oct 07 13:42:29 crc kubenswrapper[4854]: I1007 13:42:29.963493 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/289d0100-7bba-49ad-97d5-5da00aff7892-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"289d0100-7bba-49ad-97d5-5da00aff7892\") " pod="openstack/rabbitmq-server-0" Oct 07 13:42:29 crc kubenswrapper[4854]: I1007 13:42:29.972491 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/289d0100-7bba-49ad-97d5-5da00aff7892-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"289d0100-7bba-49ad-97d5-5da00aff7892\") " pod="openstack/rabbitmq-server-0" Oct 07 13:42:29 crc kubenswrapper[4854]: I1007 13:42:29.973328 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/289d0100-7bba-49ad-97d5-5da00aff7892-pod-info\") pod \"rabbitmq-server-0\" (UID: \"289d0100-7bba-49ad-97d5-5da00aff7892\") " pod="openstack/rabbitmq-server-0" Oct 07 13:42:29 crc kubenswrapper[4854]: I1007 13:42:29.975978 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Oct 07 13:42:29 crc kubenswrapper[4854]: I1007 13:42:29.985757 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 07 13:42:29 crc kubenswrapper[4854]: I1007 13:42:29.987812 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 07 13:42:29 crc kubenswrapper[4854]: I1007 13:42:29.988461 4854 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 07 13:42:29 crc kubenswrapper[4854]: I1007 13:42:29.988575 4854 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-db9ccd3b-741c-45c5-884e-66033016cffe\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-db9ccd3b-741c-45c5-884e-66033016cffe\") pod \"rabbitmq-server-0\" (UID: \"289d0100-7bba-49ad-97d5-5da00aff7892\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f8ae3fdce626746353c12a32b28066849916b5f2fe33f78cf5c597724d462143/globalmount\"" pod="openstack/rabbitmq-server-0" Oct 07 13:42:29 crc kubenswrapper[4854]: I1007 13:42:29.989478 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-fflvt" Oct 07 13:42:29 crc kubenswrapper[4854]: I1007 13:42:29.991597 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Oct 07 13:42:29 crc kubenswrapper[4854]: I1007 13:42:29.998311 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdhrw\" (UniqueName: \"kubernetes.io/projected/289d0100-7bba-49ad-97d5-5da00aff7892-kube-api-access-kdhrw\") pod \"rabbitmq-server-0\" (UID: \"289d0100-7bba-49ad-97d5-5da00aff7892\") " pod="openstack/rabbitmq-server-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.001820 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.009366 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.018783 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.019324 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.020002 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-mbbwj" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.019371 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.027367 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.036849 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.078497 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-db9ccd3b-741c-45c5-884e-66033016cffe\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-db9ccd3b-741c-45c5-884e-66033016cffe\") pod \"rabbitmq-server-0\" (UID: \"289d0100-7bba-49ad-97d5-5da00aff7892\") " pod="openstack/rabbitmq-server-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.154751 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/277cd1b6-6ae4-48d5-9a5b-a6c314a11464-config-data\") pod \"memcached-0\" (UID: \"277cd1b6-6ae4-48d5-9a5b-a6c314a11464\") " pod="openstack/memcached-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.154804 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/93f36345-422b-4c6d-ae0e-81df0c77849c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"93f36345-422b-4c6d-ae0e-81df0c77849c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.154830 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/93f36345-422b-4c6d-ae0e-81df0c77849c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"93f36345-422b-4c6d-ae0e-81df0c77849c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.154847 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/93f36345-422b-4c6d-ae0e-81df0c77849c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"93f36345-422b-4c6d-ae0e-81df0c77849c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.154889 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/277cd1b6-6ae4-48d5-9a5b-a6c314a11464-kolla-config\") pod \"memcached-0\" (UID: \"277cd1b6-6ae4-48d5-9a5b-a6c314a11464\") " pod="openstack/memcached-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.154909 4854 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-6eb442d9-449e-433a-b9db-9e95fb0ba6d5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6eb442d9-449e-433a-b9db-9e95fb0ba6d5\") pod \"rabbitmq-cell1-server-0\" (UID: \"93f36345-422b-4c6d-ae0e-81df0c77849c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.154928 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcp7l\" (UniqueName: \"kubernetes.io/projected/277cd1b6-6ae4-48d5-9a5b-a6c314a11464-kube-api-access-bcp7l\") pod \"memcached-0\" (UID: \"277cd1b6-6ae4-48d5-9a5b-a6c314a11464\") " pod="openstack/memcached-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.154971 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/93f36345-422b-4c6d-ae0e-81df0c77849c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"93f36345-422b-4c6d-ae0e-81df0c77849c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.154994 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bmh7\" (UniqueName: \"kubernetes.io/projected/93f36345-422b-4c6d-ae0e-81df0c77849c-kube-api-access-9bmh7\") pod \"rabbitmq-cell1-server-0\" (UID: \"93f36345-422b-4c6d-ae0e-81df0c77849c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.155009 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/93f36345-422b-4c6d-ae0e-81df0c77849c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"93f36345-422b-4c6d-ae0e-81df0c77849c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.155041 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/93f36345-422b-4c6d-ae0e-81df0c77849c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"93f36345-422b-4c6d-ae0e-81df0c77849c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.155063 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/93f36345-422b-4c6d-ae0e-81df0c77849c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"93f36345-422b-4c6d-ae0e-81df0c77849c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.208921 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.257382 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/93f36345-422b-4c6d-ae0e-81df0c77849c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"93f36345-422b-4c6d-ae0e-81df0c77849c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.257425 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bmh7\" (UniqueName: \"kubernetes.io/projected/93f36345-422b-4c6d-ae0e-81df0c77849c-kube-api-access-9bmh7\") pod \"rabbitmq-cell1-server-0\" (UID: \"93f36345-422b-4c6d-ae0e-81df0c77849c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.257445 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/93f36345-422b-4c6d-ae0e-81df0c77849c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"93f36345-422b-4c6d-ae0e-81df0c77849c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.257489 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/93f36345-422b-4c6d-ae0e-81df0c77849c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"93f36345-422b-4c6d-ae0e-81df0c77849c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.257512 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/93f36345-422b-4c6d-ae0e-81df0c77849c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"93f36345-422b-4c6d-ae0e-81df0c77849c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.257530 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/277cd1b6-6ae4-48d5-9a5b-a6c314a11464-config-data\") pod \"memcached-0\" (UID: \"277cd1b6-6ae4-48d5-9a5b-a6c314a11464\") " pod="openstack/memcached-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.257552 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/93f36345-422b-4c6d-ae0e-81df0c77849c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"93f36345-422b-4c6d-ae0e-81df0c77849c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.257574 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/93f36345-422b-4c6d-ae0e-81df0c77849c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"93f36345-422b-4c6d-ae0e-81df0c77849c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.257588 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/93f36345-422b-4c6d-ae0e-81df0c77849c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"93f36345-422b-4c6d-ae0e-81df0c77849c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.257614 4854 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/277cd1b6-6ae4-48d5-9a5b-a6c314a11464-kolla-config\") pod \"memcached-0\" (UID: \"277cd1b6-6ae4-48d5-9a5b-a6c314a11464\") " pod="openstack/memcached-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.257634 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-6eb442d9-449e-433a-b9db-9e95fb0ba6d5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6eb442d9-449e-433a-b9db-9e95fb0ba6d5\") pod \"rabbitmq-cell1-server-0\" (UID: \"93f36345-422b-4c6d-ae0e-81df0c77849c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.257654 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcp7l\" (UniqueName: \"kubernetes.io/projected/277cd1b6-6ae4-48d5-9a5b-a6c314a11464-kube-api-access-bcp7l\") pod \"memcached-0\" (UID: \"277cd1b6-6ae4-48d5-9a5b-a6c314a11464\") " pod="openstack/memcached-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.258635 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/93f36345-422b-4c6d-ae0e-81df0c77849c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"93f36345-422b-4c6d-ae0e-81df0c77849c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.259027 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/277cd1b6-6ae4-48d5-9a5b-a6c314a11464-kolla-config\") pod \"memcached-0\" (UID: \"277cd1b6-6ae4-48d5-9a5b-a6c314a11464\") " pod="openstack/memcached-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.259305 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/93f36345-422b-4c6d-ae0e-81df0c77849c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"93f36345-422b-4c6d-ae0e-81df0c77849c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.259428 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/277cd1b6-6ae4-48d5-9a5b-a6c314a11464-config-data\") pod \"memcached-0\" (UID: \"277cd1b6-6ae4-48d5-9a5b-a6c314a11464\") " pod="openstack/memcached-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.259598 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/93f36345-422b-4c6d-ae0e-81df0c77849c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"93f36345-422b-4c6d-ae0e-81df0c77849c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.260822 4854 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.260860 4854 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-6eb442d9-449e-433a-b9db-9e95fb0ba6d5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6eb442d9-449e-433a-b9db-9e95fb0ba6d5\") pod \"rabbitmq-cell1-server-0\" (UID: \"93f36345-422b-4c6d-ae0e-81df0c77849c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a14a361ed6851d73e900ab2b820f03fca2facbdd6d044258239c17fbcb5c3195/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.260935 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/93f36345-422b-4c6d-ae0e-81df0c77849c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"93f36345-422b-4c6d-ae0e-81df0c77849c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.261842 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/93f36345-422b-4c6d-ae0e-81df0c77849c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"93f36345-422b-4c6d-ae0e-81df0c77849c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.262221 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/93f36345-422b-4c6d-ae0e-81df0c77849c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"93f36345-422b-4c6d-ae0e-81df0c77849c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.262358 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/93f36345-422b-4c6d-ae0e-81df0c77849c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"93f36345-422b-4c6d-ae0e-81df0c77849c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.283498 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-6eb442d9-449e-433a-b9db-9e95fb0ba6d5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6eb442d9-449e-433a-b9db-9e95fb0ba6d5\") pod \"rabbitmq-cell1-server-0\" (UID: \"93f36345-422b-4c6d-ae0e-81df0c77849c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.284371 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcp7l\" (UniqueName: \"kubernetes.io/projected/277cd1b6-6ae4-48d5-9a5b-a6c314a11464-kube-api-access-bcp7l\") pod \"memcached-0\" (UID: \"277cd1b6-6ae4-48d5-9a5b-a6c314a11464\") " pod="openstack/memcached-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.291968 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bmh7\" (UniqueName: \"kubernetes.io/projected/93f36345-422b-4c6d-ae0e-81df0c77849c-kube-api-access-9bmh7\") pod \"rabbitmq-cell1-server-0\" (UID: \"93f36345-422b-4c6d-ae0e-81df0c77849c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.400678 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.409723 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.456566 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.457906 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.462971 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.463063 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.463351 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.463439 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-v42fx" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.468631 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.472881 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.478004 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.563740 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f5e5df6-7e72-4352-bb73-c30a9d3841dc-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"5f5e5df6-7e72-4352-bb73-c30a9d3841dc\") " pod="openstack/openstack-galera-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.563858 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f5e5df6-7e72-4352-bb73-c30a9d3841dc-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"5f5e5df6-7e72-4352-bb73-c30a9d3841dc\") " pod="openstack/openstack-galera-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.563921 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/5f5e5df6-7e72-4352-bb73-c30a9d3841dc-secrets\") pod \"openstack-galera-0\" (UID: \"5f5e5df6-7e72-4352-bb73-c30a9d3841dc\") " pod="openstack/openstack-galera-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.564005 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bszfc\" (UniqueName: \"kubernetes.io/projected/5f5e5df6-7e72-4352-bb73-c30a9d3841dc-kube-api-access-bszfc\") pod \"openstack-galera-0\" (UID: \"5f5e5df6-7e72-4352-bb73-c30a9d3841dc\") " pod="openstack/openstack-galera-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.564071 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5f5e5df6-7e72-4352-bb73-c30a9d3841dc-kolla-config\") pod \"openstack-galera-0\" (UID: \"5f5e5df6-7e72-4352-bb73-c30a9d3841dc\") " pod="openstack/openstack-galera-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.564095 4854 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f5e5df6-7e72-4352-bb73-c30a9d3841dc-operator-scripts\") pod \"openstack-galera-0\" (UID: \"5f5e5df6-7e72-4352-bb73-c30a9d3841dc\") " pod="openstack/openstack-galera-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.564177 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5f5e5df6-7e72-4352-bb73-c30a9d3841dc-config-data-default\") pod \"openstack-galera-0\" (UID: \"5f5e5df6-7e72-4352-bb73-c30a9d3841dc\") " pod="openstack/openstack-galera-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.564219 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b43e4ac1-aba7-43aa-ade9-fab0b984c713\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b43e4ac1-aba7-43aa-ade9-fab0b984c713\") pod \"openstack-galera-0\" (UID: \"5f5e5df6-7e72-4352-bb73-c30a9d3841dc\") " pod="openstack/openstack-galera-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.564262 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5f5e5df6-7e72-4352-bb73-c30a9d3841dc-config-data-generated\") pod \"openstack-galera-0\" (UID: \"5f5e5df6-7e72-4352-bb73-c30a9d3841dc\") " pod="openstack/openstack-galera-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.632994 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 07 13:42:30 crc kubenswrapper[4854]: W1007 13:42:30.636095 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod289d0100_7bba_49ad_97d5_5da00aff7892.slice/crio-f4f0751091e2f1369ea1019490020d3a7ee7bdc74eb80ea3b008375019dbef03 WatchSource:0}: Error finding container f4f0751091e2f1369ea1019490020d3a7ee7bdc74eb80ea3b008375019dbef03: Status 404 returned error can't find the container with id f4f0751091e2f1369ea1019490020d3a7ee7bdc74eb80ea3b008375019dbef03 Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.665310 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f5e5df6-7e72-4352-bb73-c30a9d3841dc-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"5f5e5df6-7e72-4352-bb73-c30a9d3841dc\") " pod="openstack/openstack-galera-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.665351 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/5f5e5df6-7e72-4352-bb73-c30a9d3841dc-secrets\") pod \"openstack-galera-0\" (UID: \"5f5e5df6-7e72-4352-bb73-c30a9d3841dc\") " pod="openstack/openstack-galera-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.665398 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bszfc\" (UniqueName: \"kubernetes.io/projected/5f5e5df6-7e72-4352-bb73-c30a9d3841dc-kube-api-access-bszfc\") pod \"openstack-galera-0\" (UID: \"5f5e5df6-7e72-4352-bb73-c30a9d3841dc\") " pod="openstack/openstack-galera-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.665427 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/5f5e5df6-7e72-4352-bb73-c30a9d3841dc-kolla-config\") pod \"openstack-galera-0\" (UID: \"5f5e5df6-7e72-4352-bb73-c30a9d3841dc\") " pod="openstack/openstack-galera-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.665443 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f5e5df6-7e72-4352-bb73-c30a9d3841dc-operator-scripts\") pod \"openstack-galera-0\" (UID: \"5f5e5df6-7e72-4352-bb73-c30a9d3841dc\") " pod="openstack/openstack-galera-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.665484 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5f5e5df6-7e72-4352-bb73-c30a9d3841dc-config-data-default\") pod \"openstack-galera-0\" (UID: \"5f5e5df6-7e72-4352-bb73-c30a9d3841dc\") " pod="openstack/openstack-galera-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.665509 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b43e4ac1-aba7-43aa-ade9-fab0b984c713\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b43e4ac1-aba7-43aa-ade9-fab0b984c713\") pod \"openstack-galera-0\" (UID: \"5f5e5df6-7e72-4352-bb73-c30a9d3841dc\") " pod="openstack/openstack-galera-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.665525 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5f5e5df6-7e72-4352-bb73-c30a9d3841dc-config-data-generated\") pod \"openstack-galera-0\" (UID: \"5f5e5df6-7e72-4352-bb73-c30a9d3841dc\") " pod="openstack/openstack-galera-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.665561 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f5e5df6-7e72-4352-bb73-c30a9d3841dc-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"5f5e5df6-7e72-4352-bb73-c30a9d3841dc\") " pod="openstack/openstack-galera-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.667088 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5f5e5df6-7e72-4352-bb73-c30a9d3841dc-config-data-default\") pod \"openstack-galera-0\" (UID: \"5f5e5df6-7e72-4352-bb73-c30a9d3841dc\") " pod="openstack/openstack-galera-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.667187 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5f5e5df6-7e72-4352-bb73-c30a9d3841dc-config-data-generated\") pod \"openstack-galera-0\" (UID: \"5f5e5df6-7e72-4352-bb73-c30a9d3841dc\") " pod="openstack/openstack-galera-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.667385 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f5e5df6-7e72-4352-bb73-c30a9d3841dc-operator-scripts\") pod \"openstack-galera-0\" (UID: \"5f5e5df6-7e72-4352-bb73-c30a9d3841dc\") " pod="openstack/openstack-galera-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.667684 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5f5e5df6-7e72-4352-bb73-c30a9d3841dc-kolla-config\") pod \"openstack-galera-0\" (UID: \"5f5e5df6-7e72-4352-bb73-c30a9d3841dc\") " 
pod="openstack/openstack-galera-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.669124 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f5e5df6-7e72-4352-bb73-c30a9d3841dc-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"5f5e5df6-7e72-4352-bb73-c30a9d3841dc\") " pod="openstack/openstack-galera-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.672260 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/5f5e5df6-7e72-4352-bb73-c30a9d3841dc-secrets\") pod \"openstack-galera-0\" (UID: \"5f5e5df6-7e72-4352-bb73-c30a9d3841dc\") " pod="openstack/openstack-galera-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.672744 4854 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.672786 4854 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b43e4ac1-aba7-43aa-ade9-fab0b984c713\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b43e4ac1-aba7-43aa-ade9-fab0b984c713\") pod \"openstack-galera-0\" (UID: \"5f5e5df6-7e72-4352-bb73-c30a9d3841dc\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/c2d9171890cd9b601673b726fcb700920366c29b00abc5e1e8ead7a63b1fb54c/globalmount\"" pod="openstack/openstack-galera-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.686583 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f5e5df6-7e72-4352-bb73-c30a9d3841dc-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"5f5e5df6-7e72-4352-bb73-c30a9d3841dc\") " pod="openstack/openstack-galera-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.696554 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bszfc\" (UniqueName: \"kubernetes.io/projected/5f5e5df6-7e72-4352-bb73-c30a9d3841dc-kube-api-access-bszfc\") pod \"openstack-galera-0\" (UID: \"5f5e5df6-7e72-4352-bb73-c30a9d3841dc\") " pod="openstack/openstack-galera-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.715920 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b43e4ac1-aba7-43aa-ade9-fab0b984c713\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b43e4ac1-aba7-43aa-ade9-fab0b984c713\") pod \"openstack-galera-0\" (UID: \"5f5e5df6-7e72-4352-bb73-c30a9d3841dc\") " pod="openstack/openstack-galera-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.789031 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.848937 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"289d0100-7bba-49ad-97d5-5da00aff7892","Type":"ContainerStarted","Data":"f4f0751091e2f1369ea1019490020d3a7ee7bdc74eb80ea3b008375019dbef03"} Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.853354 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-dzrd5" event={"ID":"eb3e67ac-bb7e-4028-ab59-697edf4bcaca","Type":"ContainerStarted","Data":"dd16f30054201d0c15152d5a5aca5689ea97684ddf0f4515446a8d7197dc53ff"} Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.853417 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d7b5456f5-dzrd5" Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.858119 4854 generic.go:334] "Generic (PLEG): container finished" podID="1e2beed9-bee3-40eb-a67f-30e29fc2cae1" containerID="e2212ccb65b66334f096d3e38479e12df84727641949be9e08824c0e459f4873" exitCode=0 Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.858207 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-mnlfq" event={"ID":"1e2beed9-bee3-40eb-a67f-30e29fc2cae1","Type":"ContainerDied","Data":"e2212ccb65b66334f096d3e38479e12df84727641949be9e08824c0e459f4873"} Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.864191 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.891488 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d7b5456f5-dzrd5" podStartSLOduration=2.891469609 podStartE2EDuration="2.891469609s" podCreationTimestamp="2025-10-07 13:42:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:42:30.886690572 +0000 UTC m=+4666.874522847" watchObservedRunningTime="2025-10-07 13:42:30.891469609 +0000 UTC m=+4666.879301864" Oct 07 13:42:30 crc kubenswrapper[4854]: W1007 13:42:30.932302 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93f36345_422b_4c6d_ae0e_81df0c77849c.slice/crio-2529f8fae7f0f3bb8e187c44be0fed78381127e92124162939082fe3cdcc07e2 WatchSource:0}: Error finding container 2529f8fae7f0f3bb8e187c44be0fed78381127e92124162939082fe3cdcc07e2: Status 404 returned error can't find the container with id 2529f8fae7f0f3bb8e187c44be0fed78381127e92124162939082fe3cdcc07e2 Oct 07 13:42:30 crc kubenswrapper[4854]: I1007 13:42:30.966918 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 07 13:42:31 crc kubenswrapper[4854]: W1007 13:42:31.040076 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod277cd1b6_6ae4_48d5_9a5b_a6c314a11464.slice/crio-8609252a8b886b04b539030fff7389b79787e66f60f564ef4f7d7553bfe17109 WatchSource:0}: Error finding container 8609252a8b886b04b539030fff7389b79787e66f60f564ef4f7d7553bfe17109: Status 404 returned error can't find the container with id 8609252a8b886b04b539030fff7389b79787e66f60f564ef4f7d7553bfe17109 Oct 07 13:42:31 crc kubenswrapper[4854]: I1007 13:42:31.157781 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 07 13:42:31 crc 
kubenswrapper[4854]: I1007 13:42:31.160677 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 07 13:42:31 crc kubenswrapper[4854]: I1007 13:42:31.163394 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Oct 07 13:42:31 crc kubenswrapper[4854]: I1007 13:42:31.165012 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Oct 07 13:42:31 crc kubenswrapper[4854]: I1007 13:42:31.167612 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-6mmsr" Oct 07 13:42:31 crc kubenswrapper[4854]: I1007 13:42:31.167732 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Oct 07 13:42:31 crc kubenswrapper[4854]: I1007 13:42:31.170096 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 07 13:42:31 crc kubenswrapper[4854]: I1007 13:42:31.242129 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 07 13:42:31 crc kubenswrapper[4854]: I1007 13:42:31.274696 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-3724f057-2582-4f07-8663-0ac33e5dde32\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3724f057-2582-4f07-8663-0ac33e5dde32\") pod \"openstack-cell1-galera-0\" (UID: \"c5a86825-ec56-46fe-9e53-98d5d66dc2a2\") " pod="openstack/openstack-cell1-galera-0" Oct 07 13:42:31 crc kubenswrapper[4854]: I1007 13:42:31.274734 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rl8tz\" (UniqueName: \"kubernetes.io/projected/c5a86825-ec56-46fe-9e53-98d5d66dc2a2-kube-api-access-rl8tz\") pod \"openstack-cell1-galera-0\" (UID: \"c5a86825-ec56-46fe-9e53-98d5d66dc2a2\") " pod="openstack/openstack-cell1-galera-0" Oct 07 13:42:31 crc kubenswrapper[4854]: I1007 13:42:31.274765 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5a86825-ec56-46fe-9e53-98d5d66dc2a2-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"c5a86825-ec56-46fe-9e53-98d5d66dc2a2\") " pod="openstack/openstack-cell1-galera-0" Oct 07 13:42:31 crc kubenswrapper[4854]: I1007 13:42:31.274799 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5a86825-ec56-46fe-9e53-98d5d66dc2a2-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"c5a86825-ec56-46fe-9e53-98d5d66dc2a2\") " pod="openstack/openstack-cell1-galera-0" Oct 07 13:42:31 crc kubenswrapper[4854]: I1007 13:42:31.274821 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/c5a86825-ec56-46fe-9e53-98d5d66dc2a2-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"c5a86825-ec56-46fe-9e53-98d5d66dc2a2\") " pod="openstack/openstack-cell1-galera-0" Oct 07 13:42:31 crc kubenswrapper[4854]: I1007 13:42:31.274843 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c5a86825-ec56-46fe-9e53-98d5d66dc2a2-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: 
\"c5a86825-ec56-46fe-9e53-98d5d66dc2a2\") " pod="openstack/openstack-cell1-galera-0" Oct 07 13:42:31 crc kubenswrapper[4854]: I1007 13:42:31.274862 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5a86825-ec56-46fe-9e53-98d5d66dc2a2-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"c5a86825-ec56-46fe-9e53-98d5d66dc2a2\") " pod="openstack/openstack-cell1-galera-0" Oct 07 13:42:31 crc kubenswrapper[4854]: I1007 13:42:31.274987 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c5a86825-ec56-46fe-9e53-98d5d66dc2a2-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"c5a86825-ec56-46fe-9e53-98d5d66dc2a2\") " pod="openstack/openstack-cell1-galera-0" Oct 07 13:42:31 crc kubenswrapper[4854]: I1007 13:42:31.275037 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c5a86825-ec56-46fe-9e53-98d5d66dc2a2-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"c5a86825-ec56-46fe-9e53-98d5d66dc2a2\") " pod="openstack/openstack-cell1-galera-0" Oct 07 13:42:31 crc kubenswrapper[4854]: W1007 13:42:31.291073 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f5e5df6_7e72_4352_bb73_c30a9d3841dc.slice/crio-91c4a75d08beaf48e73ad05ccdd5f37ad54adf86c15ac56bee4223ca87250500 WatchSource:0}: Error finding container 91c4a75d08beaf48e73ad05ccdd5f37ad54adf86c15ac56bee4223ca87250500: Status 404 returned error can't find the container with id 91c4a75d08beaf48e73ad05ccdd5f37ad54adf86c15ac56bee4223ca87250500 Oct 07 13:42:31 crc kubenswrapper[4854]: I1007 13:42:31.377078 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-3724f057-2582-4f07-8663-0ac33e5dde32\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3724f057-2582-4f07-8663-0ac33e5dde32\") pod \"openstack-cell1-galera-0\" (UID: \"c5a86825-ec56-46fe-9e53-98d5d66dc2a2\") " pod="openstack/openstack-cell1-galera-0" Oct 07 13:42:31 crc kubenswrapper[4854]: I1007 13:42:31.377194 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rl8tz\" (UniqueName: \"kubernetes.io/projected/c5a86825-ec56-46fe-9e53-98d5d66dc2a2-kube-api-access-rl8tz\") pod \"openstack-cell1-galera-0\" (UID: \"c5a86825-ec56-46fe-9e53-98d5d66dc2a2\") " pod="openstack/openstack-cell1-galera-0" Oct 07 13:42:31 crc kubenswrapper[4854]: I1007 13:42:31.377259 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5a86825-ec56-46fe-9e53-98d5d66dc2a2-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"c5a86825-ec56-46fe-9e53-98d5d66dc2a2\") " pod="openstack/openstack-cell1-galera-0" Oct 07 13:42:31 crc kubenswrapper[4854]: I1007 13:42:31.377308 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5a86825-ec56-46fe-9e53-98d5d66dc2a2-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"c5a86825-ec56-46fe-9e53-98d5d66dc2a2\") " pod="openstack/openstack-cell1-galera-0" Oct 07 13:42:31 crc kubenswrapper[4854]: I1007 13:42:31.377364 4854 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/c5a86825-ec56-46fe-9e53-98d5d66dc2a2-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"c5a86825-ec56-46fe-9e53-98d5d66dc2a2\") " pod="openstack/openstack-cell1-galera-0" Oct 07 13:42:31 crc kubenswrapper[4854]: I1007 13:42:31.377390 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c5a86825-ec56-46fe-9e53-98d5d66dc2a2-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"c5a86825-ec56-46fe-9e53-98d5d66dc2a2\") " pod="openstack/openstack-cell1-galera-0" Oct 07 13:42:31 crc kubenswrapper[4854]: I1007 13:42:31.377431 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5a86825-ec56-46fe-9e53-98d5d66dc2a2-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"c5a86825-ec56-46fe-9e53-98d5d66dc2a2\") " pod="openstack/openstack-cell1-galera-0" Oct 07 13:42:31 crc kubenswrapper[4854]: I1007 13:42:31.377478 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c5a86825-ec56-46fe-9e53-98d5d66dc2a2-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"c5a86825-ec56-46fe-9e53-98d5d66dc2a2\") " pod="openstack/openstack-cell1-galera-0" Oct 07 13:42:31 crc kubenswrapper[4854]: I1007 13:42:31.377610 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c5a86825-ec56-46fe-9e53-98d5d66dc2a2-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"c5a86825-ec56-46fe-9e53-98d5d66dc2a2\") " pod="openstack/openstack-cell1-galera-0" Oct 07 13:42:31 crc kubenswrapper[4854]: I1007 13:42:31.378343 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c5a86825-ec56-46fe-9e53-98d5d66dc2a2-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"c5a86825-ec56-46fe-9e53-98d5d66dc2a2\") " pod="openstack/openstack-cell1-galera-0" Oct 07 13:42:31 crc kubenswrapper[4854]: I1007 13:42:31.378884 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c5a86825-ec56-46fe-9e53-98d5d66dc2a2-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"c5a86825-ec56-46fe-9e53-98d5d66dc2a2\") " pod="openstack/openstack-cell1-galera-0" Oct 07 13:42:31 crc kubenswrapper[4854]: I1007 13:42:31.379793 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c5a86825-ec56-46fe-9e53-98d5d66dc2a2-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"c5a86825-ec56-46fe-9e53-98d5d66dc2a2\") " pod="openstack/openstack-cell1-galera-0" Oct 07 13:42:31 crc kubenswrapper[4854]: I1007 13:42:31.380127 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5a86825-ec56-46fe-9e53-98d5d66dc2a2-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"c5a86825-ec56-46fe-9e53-98d5d66dc2a2\") " pod="openstack/openstack-cell1-galera-0" Oct 07 13:42:31 crc kubenswrapper[4854]: I1007 13:42:31.382282 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5a86825-ec56-46fe-9e53-98d5d66dc2a2-combined-ca-bundle\") 
pod \"openstack-cell1-galera-0\" (UID: \"c5a86825-ec56-46fe-9e53-98d5d66dc2a2\") " pod="openstack/openstack-cell1-galera-0" Oct 07 13:42:31 crc kubenswrapper[4854]: I1007 13:42:31.382393 4854 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 07 13:42:31 crc kubenswrapper[4854]: I1007 13:42:31.382466 4854 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-3724f057-2582-4f07-8663-0ac33e5dde32\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3724f057-2582-4f07-8663-0ac33e5dde32\") pod \"openstack-cell1-galera-0\" (UID: \"c5a86825-ec56-46fe-9e53-98d5d66dc2a2\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/df28169dc66dceac819cd9040773f73b069071b2ce77ce0405757ea0c8505c50/globalmount\"" pod="openstack/openstack-cell1-galera-0" Oct 07 13:42:31 crc kubenswrapper[4854]: I1007 13:42:31.385577 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/c5a86825-ec56-46fe-9e53-98d5d66dc2a2-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"c5a86825-ec56-46fe-9e53-98d5d66dc2a2\") " pod="openstack/openstack-cell1-galera-0" Oct 07 13:42:31 crc kubenswrapper[4854]: I1007 13:42:31.386270 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5a86825-ec56-46fe-9e53-98d5d66dc2a2-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"c5a86825-ec56-46fe-9e53-98d5d66dc2a2\") " pod="openstack/openstack-cell1-galera-0" Oct 07 13:42:31 crc kubenswrapper[4854]: I1007 13:42:31.395098 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rl8tz\" (UniqueName: \"kubernetes.io/projected/c5a86825-ec56-46fe-9e53-98d5d66dc2a2-kube-api-access-rl8tz\") pod \"openstack-cell1-galera-0\" (UID: \"c5a86825-ec56-46fe-9e53-98d5d66dc2a2\") " pod="openstack/openstack-cell1-galera-0" Oct 07 13:42:31 crc kubenswrapper[4854]: I1007 13:42:31.419007 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-3724f057-2582-4f07-8663-0ac33e5dde32\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3724f057-2582-4f07-8663-0ac33e5dde32\") pod \"openstack-cell1-galera-0\" (UID: \"c5a86825-ec56-46fe-9e53-98d5d66dc2a2\") " pod="openstack/openstack-cell1-galera-0" Oct 07 13:42:31 crc kubenswrapper[4854]: I1007 13:42:31.653267 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 07 13:42:31 crc kubenswrapper[4854]: I1007 13:42:31.869261 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"289d0100-7bba-49ad-97d5-5da00aff7892","Type":"ContainerStarted","Data":"b8aea0ec11206a84658e3be4f69f5848dce3034722f3c71933bf792cc9e532f6"} Oct 07 13:42:31 crc kubenswrapper[4854]: I1007 13:42:31.870862 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"93f36345-422b-4c6d-ae0e-81df0c77849c","Type":"ContainerStarted","Data":"2529f8fae7f0f3bb8e187c44be0fed78381127e92124162939082fe3cdcc07e2"} Oct 07 13:42:31 crc kubenswrapper[4854]: I1007 13:42:31.874414 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"5f5e5df6-7e72-4352-bb73-c30a9d3841dc","Type":"ContainerStarted","Data":"f4d4e655726b8302d38fa75e6e1260409f9ab51263d54fac23ab7aa5c9c7aa6b"} Oct 07 13:42:31 crc kubenswrapper[4854]: I1007 13:42:31.874451 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"5f5e5df6-7e72-4352-bb73-c30a9d3841dc","Type":"ContainerStarted","Data":"91c4a75d08beaf48e73ad05ccdd5f37ad54adf86c15ac56bee4223ca87250500"} Oct 07 13:42:31 crc kubenswrapper[4854]: I1007 13:42:31.877398 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-mnlfq" event={"ID":"1e2beed9-bee3-40eb-a67f-30e29fc2cae1","Type":"ContainerStarted","Data":"a3e92b175c2285941f89b6bdbdcb58819bd1deda0da250c1dacf41ccd33e47ee"} Oct 07 13:42:31 crc kubenswrapper[4854]: I1007 13:42:31.878135 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-98ddfc8f-mnlfq" Oct 07 13:42:31 crc kubenswrapper[4854]: I1007 13:42:31.881628 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"277cd1b6-6ae4-48d5-9a5b-a6c314a11464","Type":"ContainerStarted","Data":"0dcefb7815dd08f0c9663a44da7bf96371f0406a911608328c8c81aedc634a82"} Oct 07 13:42:31 crc kubenswrapper[4854]: I1007 13:42:31.881707 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"277cd1b6-6ae4-48d5-9a5b-a6c314a11464","Type":"ContainerStarted","Data":"8609252a8b886b04b539030fff7389b79787e66f60f564ef4f7d7553bfe17109"} Oct 07 13:42:31 crc kubenswrapper[4854]: I1007 13:42:31.890501 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Oct 07 13:42:31 crc kubenswrapper[4854]: I1007 13:42:31.941898 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 07 13:42:31 crc kubenswrapper[4854]: I1007 13:42:31.942005 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-98ddfc8f-mnlfq" podStartSLOduration=3.94198474 podStartE2EDuration="3.94198474s" podCreationTimestamp="2025-10-07 13:42:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:42:31.917752211 +0000 UTC m=+4667.905584476" watchObservedRunningTime="2025-10-07 13:42:31.94198474 +0000 UTC m=+4667.929817005" Oct 07 13:42:31 crc kubenswrapper[4854]: I1007 13:42:31.951678 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.951654299 podStartE2EDuration="2.951654299s" podCreationTimestamp="2025-10-07 13:42:29 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:42:31.93990095 +0000 UTC m=+4667.927733205" watchObservedRunningTime="2025-10-07 13:42:31.951654299 +0000 UTC m=+4667.939486554" Oct 07 13:42:32 crc kubenswrapper[4854]: I1007 13:42:32.897214 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"c5a86825-ec56-46fe-9e53-98d5d66dc2a2","Type":"ContainerStarted","Data":"95449b2f15d9b89148e70dc6717b54f712a9a6f8d7e1a15e7c0558e73871bcf5"} Oct 07 13:42:32 crc kubenswrapper[4854]: I1007 13:42:32.897735 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"c5a86825-ec56-46fe-9e53-98d5d66dc2a2","Type":"ContainerStarted","Data":"9d65844811041ebe54175bf81e1759b0abb9024a9ccdeebaa83a3915517bdd16"} Oct 07 13:42:33 crc kubenswrapper[4854]: I1007 13:42:33.910497 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"93f36345-422b-4c6d-ae0e-81df0c77849c","Type":"ContainerStarted","Data":"9674bb15cc599e22f842d49fa840811dfb11404d97f48a6e5949fbb4bee13d60"} Oct 07 13:42:35 crc kubenswrapper[4854]: I1007 13:42:35.928729 4854 generic.go:334] "Generic (PLEG): container finished" podID="5f5e5df6-7e72-4352-bb73-c30a9d3841dc" containerID="f4d4e655726b8302d38fa75e6e1260409f9ab51263d54fac23ab7aa5c9c7aa6b" exitCode=0 Oct 07 13:42:35 crc kubenswrapper[4854]: I1007 13:42:35.928790 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"5f5e5df6-7e72-4352-bb73-c30a9d3841dc","Type":"ContainerDied","Data":"f4d4e655726b8302d38fa75e6e1260409f9ab51263d54fac23ab7aa5c9c7aa6b"} Oct 07 13:42:36 crc kubenswrapper[4854]: I1007 13:42:36.962434 4854 generic.go:334] "Generic (PLEG): container finished" podID="c5a86825-ec56-46fe-9e53-98d5d66dc2a2" containerID="95449b2f15d9b89148e70dc6717b54f712a9a6f8d7e1a15e7c0558e73871bcf5" exitCode=0 Oct 07 13:42:36 crc kubenswrapper[4854]: I1007 13:42:36.962508 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"c5a86825-ec56-46fe-9e53-98d5d66dc2a2","Type":"ContainerDied","Data":"95449b2f15d9b89148e70dc6717b54f712a9a6f8d7e1a15e7c0558e73871bcf5"} Oct 07 13:42:36 crc kubenswrapper[4854]: I1007 13:42:36.966530 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"5f5e5df6-7e72-4352-bb73-c30a9d3841dc","Type":"ContainerStarted","Data":"049c0022d84417e009a72334243c551a998be4d36b9218192fe4e205241ca5d7"} Oct 07 13:42:37 crc kubenswrapper[4854]: I1007 13:42:37.023075 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=8.023044439 podStartE2EDuration="8.023044439s" podCreationTimestamp="2025-10-07 13:42:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:42:37.021047642 +0000 UTC m=+4673.008879937" watchObservedRunningTime="2025-10-07 13:42:37.023044439 +0000 UTC m=+4673.010876804" Oct 07 13:42:37 crc kubenswrapper[4854]: I1007 13:42:37.981106 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"c5a86825-ec56-46fe-9e53-98d5d66dc2a2","Type":"ContainerStarted","Data":"5dfa7abc0caaafaba28b3efb3dd6abedfd53569dd7a46ea3f3e282ab40e61871"} Oct 07 13:42:38 crc kubenswrapper[4854]: I1007 
13:42:38.014141 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=8.014099636 podStartE2EDuration="8.014099636s" podCreationTimestamp="2025-10-07 13:42:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:42:38.007462334 +0000 UTC m=+4673.995294629" watchObservedRunningTime="2025-10-07 13:42:38.014099636 +0000 UTC m=+4674.001931931" Oct 07 13:42:38 crc kubenswrapper[4854]: I1007 13:42:38.709901 4854 scope.go:117] "RemoveContainer" containerID="e83798b11b7f463b027896c1ec862c4cbd843df2676156232deaf979d790ceea" Oct 07 13:42:38 crc kubenswrapper[4854]: E1007 13:42:38.710373 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:42:38 crc kubenswrapper[4854]: I1007 13:42:38.920498 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5d7b5456f5-dzrd5" Oct 07 13:42:39 crc kubenswrapper[4854]: I1007 13:42:39.208327 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-98ddfc8f-mnlfq" Oct 07 13:42:39 crc kubenswrapper[4854]: I1007 13:42:39.251582 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-dzrd5"] Oct 07 13:42:39 crc kubenswrapper[4854]: I1007 13:42:39.252359 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5d7b5456f5-dzrd5" podUID="eb3e67ac-bb7e-4028-ab59-697edf4bcaca" containerName="dnsmasq-dns" containerID="cri-o://dd16f30054201d0c15152d5a5aca5689ea97684ddf0f4515446a8d7197dc53ff" gracePeriod=10 Oct 07 13:42:39 crc kubenswrapper[4854]: I1007 13:42:39.736278 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-dzrd5" Oct 07 13:42:39 crc kubenswrapper[4854]: I1007 13:42:39.808193 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdcmh\" (UniqueName: \"kubernetes.io/projected/eb3e67ac-bb7e-4028-ab59-697edf4bcaca-kube-api-access-qdcmh\") pod \"eb3e67ac-bb7e-4028-ab59-697edf4bcaca\" (UID: \"eb3e67ac-bb7e-4028-ab59-697edf4bcaca\") " Oct 07 13:42:39 crc kubenswrapper[4854]: I1007 13:42:39.808268 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb3e67ac-bb7e-4028-ab59-697edf4bcaca-config\") pod \"eb3e67ac-bb7e-4028-ab59-697edf4bcaca\" (UID: \"eb3e67ac-bb7e-4028-ab59-697edf4bcaca\") " Oct 07 13:42:39 crc kubenswrapper[4854]: I1007 13:42:39.808371 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb3e67ac-bb7e-4028-ab59-697edf4bcaca-dns-svc\") pod \"eb3e67ac-bb7e-4028-ab59-697edf4bcaca\" (UID: \"eb3e67ac-bb7e-4028-ab59-697edf4bcaca\") " Oct 07 13:42:39 crc kubenswrapper[4854]: I1007 13:42:39.813964 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb3e67ac-bb7e-4028-ab59-697edf4bcaca-kube-api-access-qdcmh" (OuterVolumeSpecName: "kube-api-access-qdcmh") pod "eb3e67ac-bb7e-4028-ab59-697edf4bcaca" (UID: "eb3e67ac-bb7e-4028-ab59-697edf4bcaca"). InnerVolumeSpecName "kube-api-access-qdcmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:42:39 crc kubenswrapper[4854]: I1007 13:42:39.841403 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb3e67ac-bb7e-4028-ab59-697edf4bcaca-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "eb3e67ac-bb7e-4028-ab59-697edf4bcaca" (UID: "eb3e67ac-bb7e-4028-ab59-697edf4bcaca"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:42:39 crc kubenswrapper[4854]: I1007 13:42:39.846334 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb3e67ac-bb7e-4028-ab59-697edf4bcaca-config" (OuterVolumeSpecName: "config") pod "eb3e67ac-bb7e-4028-ab59-697edf4bcaca" (UID: "eb3e67ac-bb7e-4028-ab59-697edf4bcaca"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:42:39 crc kubenswrapper[4854]: I1007 13:42:39.909823 4854 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb3e67ac-bb7e-4028-ab59-697edf4bcaca-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 13:42:39 crc kubenswrapper[4854]: I1007 13:42:39.909854 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdcmh\" (UniqueName: \"kubernetes.io/projected/eb3e67ac-bb7e-4028-ab59-697edf4bcaca-kube-api-access-qdcmh\") on node \"crc\" DevicePath \"\"" Oct 07 13:42:39 crc kubenswrapper[4854]: I1007 13:42:39.909864 4854 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb3e67ac-bb7e-4028-ab59-697edf4bcaca-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:42:39 crc kubenswrapper[4854]: I1007 13:42:39.997249 4854 generic.go:334] "Generic (PLEG): container finished" podID="eb3e67ac-bb7e-4028-ab59-697edf4bcaca" containerID="dd16f30054201d0c15152d5a5aca5689ea97684ddf0f4515446a8d7197dc53ff" exitCode=0 Oct 07 13:42:39 crc kubenswrapper[4854]: I1007 13:42:39.997309 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-dzrd5" event={"ID":"eb3e67ac-bb7e-4028-ab59-697edf4bcaca","Type":"ContainerDied","Data":"dd16f30054201d0c15152d5a5aca5689ea97684ddf0f4515446a8d7197dc53ff"} Oct 07 13:42:39 crc kubenswrapper[4854]: I1007 13:42:39.997625 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-dzrd5" event={"ID":"eb3e67ac-bb7e-4028-ab59-697edf4bcaca","Type":"ContainerDied","Data":"486abe6fbf1ff4d80206bbb37a217f5c4d6335be92ca5dc6f411206b8454ec06"} Oct 07 13:42:39 crc kubenswrapper[4854]: I1007 13:42:39.997661 4854 scope.go:117] "RemoveContainer" containerID="dd16f30054201d0c15152d5a5aca5689ea97684ddf0f4515446a8d7197dc53ff" Oct 07 13:42:39 crc kubenswrapper[4854]: I1007 13:42:39.997694 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-dzrd5" Oct 07 13:42:40 crc kubenswrapper[4854]: I1007 13:42:40.024683 4854 scope.go:117] "RemoveContainer" containerID="b5fef9720f0cc824ab8c41d2f0fba987b176b5985c2bb741692d1fe055315c4b" Oct 07 13:42:40 crc kubenswrapper[4854]: I1007 13:42:40.042536 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-dzrd5"] Oct 07 13:42:40 crc kubenswrapper[4854]: I1007 13:42:40.047351 4854 scope.go:117] "RemoveContainer" containerID="dd16f30054201d0c15152d5a5aca5689ea97684ddf0f4515446a8d7197dc53ff" Oct 07 13:42:40 crc kubenswrapper[4854]: E1007 13:42:40.047925 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd16f30054201d0c15152d5a5aca5689ea97684ddf0f4515446a8d7197dc53ff\": container with ID starting with dd16f30054201d0c15152d5a5aca5689ea97684ddf0f4515446a8d7197dc53ff not found: ID does not exist" containerID="dd16f30054201d0c15152d5a5aca5689ea97684ddf0f4515446a8d7197dc53ff" Oct 07 13:42:40 crc kubenswrapper[4854]: I1007 13:42:40.047985 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd16f30054201d0c15152d5a5aca5689ea97684ddf0f4515446a8d7197dc53ff"} err="failed to get container status \"dd16f30054201d0c15152d5a5aca5689ea97684ddf0f4515446a8d7197dc53ff\": rpc error: code = NotFound desc = could not find container \"dd16f30054201d0c15152d5a5aca5689ea97684ddf0f4515446a8d7197dc53ff\": container with ID starting with dd16f30054201d0c15152d5a5aca5689ea97684ddf0f4515446a8d7197dc53ff not found: ID does not exist" Oct 07 13:42:40 crc kubenswrapper[4854]: I1007 13:42:40.048023 4854 scope.go:117] "RemoveContainer" containerID="b5fef9720f0cc824ab8c41d2f0fba987b176b5985c2bb741692d1fe055315c4b" Oct 07 13:42:40 crc kubenswrapper[4854]: E1007 13:42:40.048521 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5fef9720f0cc824ab8c41d2f0fba987b176b5985c2bb741692d1fe055315c4b\": container with ID starting with b5fef9720f0cc824ab8c41d2f0fba987b176b5985c2bb741692d1fe055315c4b not found: ID does not exist" containerID="b5fef9720f0cc824ab8c41d2f0fba987b176b5985c2bb741692d1fe055315c4b" Oct 07 13:42:40 crc kubenswrapper[4854]: I1007 13:42:40.048745 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5fef9720f0cc824ab8c41d2f0fba987b176b5985c2bb741692d1fe055315c4b"} err="failed to get container status \"b5fef9720f0cc824ab8c41d2f0fba987b176b5985c2bb741692d1fe055315c4b\": rpc error: code = NotFound desc = could not find container \"b5fef9720f0cc824ab8c41d2f0fba987b176b5985c2bb741692d1fe055315c4b\": container with ID starting with b5fef9720f0cc824ab8c41d2f0fba987b176b5985c2bb741692d1fe055315c4b not found: ID does not exist" Oct 07 13:42:40 crc kubenswrapper[4854]: I1007 13:42:40.050413 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-dzrd5"] Oct 07 13:42:40 crc kubenswrapper[4854]: I1007 13:42:40.401941 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Oct 07 13:42:40 crc kubenswrapper[4854]: I1007 13:42:40.713430 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb3e67ac-bb7e-4028-ab59-697edf4bcaca" path="/var/lib/kubelet/pods/eb3e67ac-bb7e-4028-ab59-697edf4bcaca/volumes" Oct 07 13:42:40 crc kubenswrapper[4854]: I1007 13:42:40.789945 4854 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Oct 07 13:42:40 crc kubenswrapper[4854]: I1007 13:42:40.790257 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Oct 07 13:42:40 crc kubenswrapper[4854]: I1007 13:42:40.842011 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Oct 07 13:42:41 crc kubenswrapper[4854]: I1007 13:42:41.063847 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Oct 07 13:42:41 crc kubenswrapper[4854]: I1007 13:42:41.653715 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Oct 07 13:42:41 crc kubenswrapper[4854]: I1007 13:42:41.654274 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Oct 07 13:42:43 crc kubenswrapper[4854]: I1007 13:42:43.784340 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Oct 07 13:42:43 crc kubenswrapper[4854]: I1007 13:42:43.863995 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Oct 07 13:42:51 crc kubenswrapper[4854]: I1007 13:42:51.702812 4854 scope.go:117] "RemoveContainer" containerID="e83798b11b7f463b027896c1ec862c4cbd843df2676156232deaf979d790ceea" Oct 07 13:42:51 crc kubenswrapper[4854]: E1007 13:42:51.703886 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:43:05 crc kubenswrapper[4854]: I1007 13:43:05.229026 4854 generic.go:334] "Generic (PLEG): container finished" podID="289d0100-7bba-49ad-97d5-5da00aff7892" containerID="b8aea0ec11206a84658e3be4f69f5848dce3034722f3c71933bf792cc9e532f6" exitCode=0 Oct 07 13:43:05 crc kubenswrapper[4854]: I1007 13:43:05.229123 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"289d0100-7bba-49ad-97d5-5da00aff7892","Type":"ContainerDied","Data":"b8aea0ec11206a84658e3be4f69f5848dce3034722f3c71933bf792cc9e532f6"} Oct 07 13:43:05 crc kubenswrapper[4854]: I1007 13:43:05.703529 4854 scope.go:117] "RemoveContainer" containerID="e83798b11b7f463b027896c1ec862c4cbd843df2676156232deaf979d790ceea" Oct 07 13:43:05 crc kubenswrapper[4854]: E1007 13:43:05.704554 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:43:06 crc kubenswrapper[4854]: I1007 13:43:06.242403 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"289d0100-7bba-49ad-97d5-5da00aff7892","Type":"ContainerStarted","Data":"7a00ecb8cf25fdee65d903a782cd5ccf586f3850bb7dbcf79840a3b6967c16fb"} Oct 07 13:43:06 crc kubenswrapper[4854]: 
I1007 13:43:06.242669 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 07 13:43:06 crc kubenswrapper[4854]: I1007 13:43:06.244997 4854 generic.go:334] "Generic (PLEG): container finished" podID="93f36345-422b-4c6d-ae0e-81df0c77849c" containerID="9674bb15cc599e22f842d49fa840811dfb11404d97f48a6e5949fbb4bee13d60" exitCode=0 Oct 07 13:43:06 crc kubenswrapper[4854]: I1007 13:43:06.245032 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"93f36345-422b-4c6d-ae0e-81df0c77849c","Type":"ContainerDied","Data":"9674bb15cc599e22f842d49fa840811dfb11404d97f48a6e5949fbb4bee13d60"} Oct 07 13:43:06 crc kubenswrapper[4854]: I1007 13:43:06.281478 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.281449658 podStartE2EDuration="38.281449658s" podCreationTimestamp="2025-10-07 13:42:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:43:06.266178698 +0000 UTC m=+4702.254010953" watchObservedRunningTime="2025-10-07 13:43:06.281449658 +0000 UTC m=+4702.269281953" Oct 07 13:43:07 crc kubenswrapper[4854]: I1007 13:43:07.254341 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"93f36345-422b-4c6d-ae0e-81df0c77849c","Type":"ContainerStarted","Data":"d23770d8332ddb6faa33926db8af888e57fa814665d848dd0e2dc5baa1789073"} Oct 07 13:43:07 crc kubenswrapper[4854]: I1007 13:43:07.254676 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:43:07 crc kubenswrapper[4854]: I1007 13:43:07.282479 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=39.282455742 podStartE2EDuration="39.282455742s" podCreationTimestamp="2025-10-07 13:42:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:43:07.277911271 +0000 UTC m=+4703.265743536" watchObservedRunningTime="2025-10-07 13:43:07.282455742 +0000 UTC m=+4703.270287997" Oct 07 13:43:18 crc kubenswrapper[4854]: I1007 13:43:18.704228 4854 scope.go:117] "RemoveContainer" containerID="e83798b11b7f463b027896c1ec862c4cbd843df2676156232deaf979d790ceea" Oct 07 13:43:18 crc kubenswrapper[4854]: E1007 13:43:18.706474 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:43:20 crc kubenswrapper[4854]: I1007 13:43:20.213403 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 07 13:43:20 crc kubenswrapper[4854]: I1007 13:43:20.413487 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:43:24 crc kubenswrapper[4854]: I1007 13:43:24.437561 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-pjkqv"] Oct 07 13:43:24 crc kubenswrapper[4854]: E1007 13:43:24.438523 4854 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb3e67ac-bb7e-4028-ab59-697edf4bcaca" containerName="init" Oct 07 13:43:24 crc kubenswrapper[4854]: I1007 13:43:24.438540 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb3e67ac-bb7e-4028-ab59-697edf4bcaca" containerName="init" Oct 07 13:43:24 crc kubenswrapper[4854]: E1007 13:43:24.438563 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb3e67ac-bb7e-4028-ab59-697edf4bcaca" containerName="dnsmasq-dns" Oct 07 13:43:24 crc kubenswrapper[4854]: I1007 13:43:24.438572 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb3e67ac-bb7e-4028-ab59-697edf4bcaca" containerName="dnsmasq-dns" Oct 07 13:43:24 crc kubenswrapper[4854]: I1007 13:43:24.438762 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb3e67ac-bb7e-4028-ab59-697edf4bcaca" containerName="dnsmasq-dns" Oct 07 13:43:24 crc kubenswrapper[4854]: I1007 13:43:24.439737 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-pjkqv" Oct 07 13:43:24 crc kubenswrapper[4854]: I1007 13:43:24.447545 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-pjkqv"] Oct 07 13:43:24 crc kubenswrapper[4854]: I1007 13:43:24.558095 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd0705eb-9aa0-4760-86f2-af1fe2be570b-config\") pod \"dnsmasq-dns-5b7946d7b9-pjkqv\" (UID: \"dd0705eb-9aa0-4760-86f2-af1fe2be570b\") " pod="openstack/dnsmasq-dns-5b7946d7b9-pjkqv" Oct 07 13:43:24 crc kubenswrapper[4854]: I1007 13:43:24.558190 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6zqd\" (UniqueName: \"kubernetes.io/projected/dd0705eb-9aa0-4760-86f2-af1fe2be570b-kube-api-access-b6zqd\") pod \"dnsmasq-dns-5b7946d7b9-pjkqv\" (UID: \"dd0705eb-9aa0-4760-86f2-af1fe2be570b\") " pod="openstack/dnsmasq-dns-5b7946d7b9-pjkqv" Oct 07 13:43:24 crc kubenswrapper[4854]: I1007 13:43:24.558249 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd0705eb-9aa0-4760-86f2-af1fe2be570b-dns-svc\") pod \"dnsmasq-dns-5b7946d7b9-pjkqv\" (UID: \"dd0705eb-9aa0-4760-86f2-af1fe2be570b\") " pod="openstack/dnsmasq-dns-5b7946d7b9-pjkqv" Oct 07 13:43:24 crc kubenswrapper[4854]: I1007 13:43:24.660278 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6zqd\" (UniqueName: \"kubernetes.io/projected/dd0705eb-9aa0-4760-86f2-af1fe2be570b-kube-api-access-b6zqd\") pod \"dnsmasq-dns-5b7946d7b9-pjkqv\" (UID: \"dd0705eb-9aa0-4760-86f2-af1fe2be570b\") " pod="openstack/dnsmasq-dns-5b7946d7b9-pjkqv" Oct 07 13:43:24 crc kubenswrapper[4854]: I1007 13:43:24.660417 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd0705eb-9aa0-4760-86f2-af1fe2be570b-dns-svc\") pod \"dnsmasq-dns-5b7946d7b9-pjkqv\" (UID: \"dd0705eb-9aa0-4760-86f2-af1fe2be570b\") " pod="openstack/dnsmasq-dns-5b7946d7b9-pjkqv" Oct 07 13:43:24 crc kubenswrapper[4854]: I1007 13:43:24.660512 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd0705eb-9aa0-4760-86f2-af1fe2be570b-config\") pod \"dnsmasq-dns-5b7946d7b9-pjkqv\" (UID: \"dd0705eb-9aa0-4760-86f2-af1fe2be570b\") 
" pod="openstack/dnsmasq-dns-5b7946d7b9-pjkqv" Oct 07 13:43:24 crc kubenswrapper[4854]: I1007 13:43:24.661508 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd0705eb-9aa0-4760-86f2-af1fe2be570b-dns-svc\") pod \"dnsmasq-dns-5b7946d7b9-pjkqv\" (UID: \"dd0705eb-9aa0-4760-86f2-af1fe2be570b\") " pod="openstack/dnsmasq-dns-5b7946d7b9-pjkqv" Oct 07 13:43:24 crc kubenswrapper[4854]: I1007 13:43:24.661856 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd0705eb-9aa0-4760-86f2-af1fe2be570b-config\") pod \"dnsmasq-dns-5b7946d7b9-pjkqv\" (UID: \"dd0705eb-9aa0-4760-86f2-af1fe2be570b\") " pod="openstack/dnsmasq-dns-5b7946d7b9-pjkqv" Oct 07 13:43:24 crc kubenswrapper[4854]: I1007 13:43:24.690788 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6zqd\" (UniqueName: \"kubernetes.io/projected/dd0705eb-9aa0-4760-86f2-af1fe2be570b-kube-api-access-b6zqd\") pod \"dnsmasq-dns-5b7946d7b9-pjkqv\" (UID: \"dd0705eb-9aa0-4760-86f2-af1fe2be570b\") " pod="openstack/dnsmasq-dns-5b7946d7b9-pjkqv" Oct 07 13:43:24 crc kubenswrapper[4854]: I1007 13:43:24.767757 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-pjkqv" Oct 07 13:43:25 crc kubenswrapper[4854]: I1007 13:43:25.239270 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-pjkqv"] Oct 07 13:43:25 crc kubenswrapper[4854]: I1007 13:43:25.284326 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 07 13:43:25 crc kubenswrapper[4854]: I1007 13:43:25.447883 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-pjkqv" event={"ID":"dd0705eb-9aa0-4760-86f2-af1fe2be570b","Type":"ContainerStarted","Data":"e2abd5ef03c3f4d762c523d560ffa63262b13bbe8b51753b8fc4c9f5accf941b"} Oct 07 13:43:25 crc kubenswrapper[4854]: I1007 13:43:25.963734 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 07 13:43:26 crc kubenswrapper[4854]: I1007 13:43:26.459235 4854 generic.go:334] "Generic (PLEG): container finished" podID="dd0705eb-9aa0-4760-86f2-af1fe2be570b" containerID="180d1c3cca5b5fb4919d220de5dbc9ba6836d2fd415b58d06ca62ef6f5bd79c7" exitCode=0 Oct 07 13:43:26 crc kubenswrapper[4854]: I1007 13:43:26.459282 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-pjkqv" event={"ID":"dd0705eb-9aa0-4760-86f2-af1fe2be570b","Type":"ContainerDied","Data":"180d1c3cca5b5fb4919d220de5dbc9ba6836d2fd415b58d06ca62ef6f5bd79c7"} Oct 07 13:43:27 crc kubenswrapper[4854]: I1007 13:43:27.197303 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="289d0100-7bba-49ad-97d5-5da00aff7892" containerName="rabbitmq" containerID="cri-o://7a00ecb8cf25fdee65d903a782cd5ccf586f3850bb7dbcf79840a3b6967c16fb" gracePeriod=604799 Oct 07 13:43:27 crc kubenswrapper[4854]: I1007 13:43:27.472266 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-pjkqv" event={"ID":"dd0705eb-9aa0-4760-86f2-af1fe2be570b","Type":"ContainerStarted","Data":"c2612888d742e1ebcf334051c040771ef82f2849b4ab87da7ed074b8bb411678"} Oct 07 13:43:27 crc kubenswrapper[4854]: I1007 13:43:27.472587 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-5b7946d7b9-pjkqv" Oct 07 13:43:27 crc kubenswrapper[4854]: I1007 13:43:27.501821 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b7946d7b9-pjkqv" podStartSLOduration=3.501792264 podStartE2EDuration="3.501792264s" podCreationTimestamp="2025-10-07 13:43:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:43:27.493907997 +0000 UTC m=+4723.481740292" watchObservedRunningTime="2025-10-07 13:43:27.501792264 +0000 UTC m=+4723.489624549" Oct 07 13:43:27 crc kubenswrapper[4854]: I1007 13:43:27.788331 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="93f36345-422b-4c6d-ae0e-81df0c77849c" containerName="rabbitmq" containerID="cri-o://d23770d8332ddb6faa33926db8af888e57fa814665d848dd0e2dc5baa1789073" gracePeriod=604799 Oct 07 13:43:30 crc kubenswrapper[4854]: I1007 13:43:30.210035 4854 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="289d0100-7bba-49ad-97d5-5da00aff7892" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.240:5672: connect: connection refused" Oct 07 13:43:30 crc kubenswrapper[4854]: I1007 13:43:30.411138 4854 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="93f36345-422b-4c6d-ae0e-81df0c77849c" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.242:5672: connect: connection refused" Oct 07 13:43:32 crc kubenswrapper[4854]: I1007 13:43:32.706236 4854 scope.go:117] "RemoveContainer" containerID="e83798b11b7f463b027896c1ec862c4cbd843df2676156232deaf979d790ceea" Oct 07 13:43:32 crc kubenswrapper[4854]: E1007 13:43:32.706543 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:43:33 crc kubenswrapper[4854]: I1007 13:43:33.541919 4854 generic.go:334] "Generic (PLEG): container finished" podID="289d0100-7bba-49ad-97d5-5da00aff7892" containerID="7a00ecb8cf25fdee65d903a782cd5ccf586f3850bb7dbcf79840a3b6967c16fb" exitCode=0 Oct 07 13:43:33 crc kubenswrapper[4854]: I1007 13:43:33.542219 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"289d0100-7bba-49ad-97d5-5da00aff7892","Type":"ContainerDied","Data":"7a00ecb8cf25fdee65d903a782cd5ccf586f3850bb7dbcf79840a3b6967c16fb"} Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.093962 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.180493 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-db9ccd3b-741c-45c5-884e-66033016cffe\") pod \"289d0100-7bba-49ad-97d5-5da00aff7892\" (UID: \"289d0100-7bba-49ad-97d5-5da00aff7892\") " Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.180573 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/289d0100-7bba-49ad-97d5-5da00aff7892-rabbitmq-confd\") pod \"289d0100-7bba-49ad-97d5-5da00aff7892\" (UID: \"289d0100-7bba-49ad-97d5-5da00aff7892\") " Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.180624 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/289d0100-7bba-49ad-97d5-5da00aff7892-erlang-cookie-secret\") pod \"289d0100-7bba-49ad-97d5-5da00aff7892\" (UID: \"289d0100-7bba-49ad-97d5-5da00aff7892\") " Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.180650 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/289d0100-7bba-49ad-97d5-5da00aff7892-rabbitmq-plugins\") pod \"289d0100-7bba-49ad-97d5-5da00aff7892\" (UID: \"289d0100-7bba-49ad-97d5-5da00aff7892\") " Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.180686 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/289d0100-7bba-49ad-97d5-5da00aff7892-rabbitmq-erlang-cookie\") pod \"289d0100-7bba-49ad-97d5-5da00aff7892\" (UID: \"289d0100-7bba-49ad-97d5-5da00aff7892\") " Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.180719 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdhrw\" (UniqueName: \"kubernetes.io/projected/289d0100-7bba-49ad-97d5-5da00aff7892-kube-api-access-kdhrw\") pod \"289d0100-7bba-49ad-97d5-5da00aff7892\" (UID: \"289d0100-7bba-49ad-97d5-5da00aff7892\") " Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.180765 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/289d0100-7bba-49ad-97d5-5da00aff7892-pod-info\") pod \"289d0100-7bba-49ad-97d5-5da00aff7892\" (UID: \"289d0100-7bba-49ad-97d5-5da00aff7892\") " Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.180833 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/289d0100-7bba-49ad-97d5-5da00aff7892-server-conf\") pod \"289d0100-7bba-49ad-97d5-5da00aff7892\" (UID: \"289d0100-7bba-49ad-97d5-5da00aff7892\") " Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.180875 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/289d0100-7bba-49ad-97d5-5da00aff7892-plugins-conf\") pod \"289d0100-7bba-49ad-97d5-5da00aff7892\" (UID: \"289d0100-7bba-49ad-97d5-5da00aff7892\") " Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.181879 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/289d0100-7bba-49ad-97d5-5da00aff7892-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod 
"289d0100-7bba-49ad-97d5-5da00aff7892" (UID: "289d0100-7bba-49ad-97d5-5da00aff7892"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.182265 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/289d0100-7bba-49ad-97d5-5da00aff7892-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "289d0100-7bba-49ad-97d5-5da00aff7892" (UID: "289d0100-7bba-49ad-97d5-5da00aff7892"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.183021 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/289d0100-7bba-49ad-97d5-5da00aff7892-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "289d0100-7bba-49ad-97d5-5da00aff7892" (UID: "289d0100-7bba-49ad-97d5-5da00aff7892"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.191311 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/289d0100-7bba-49ad-97d5-5da00aff7892-pod-info" (OuterVolumeSpecName: "pod-info") pod "289d0100-7bba-49ad-97d5-5da00aff7892" (UID: "289d0100-7bba-49ad-97d5-5da00aff7892"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.204774 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/289d0100-7bba-49ad-97d5-5da00aff7892-kube-api-access-kdhrw" (OuterVolumeSpecName: "kube-api-access-kdhrw") pod "289d0100-7bba-49ad-97d5-5da00aff7892" (UID: "289d0100-7bba-49ad-97d5-5da00aff7892"). InnerVolumeSpecName "kube-api-access-kdhrw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.209301 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/289d0100-7bba-49ad-97d5-5da00aff7892-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "289d0100-7bba-49ad-97d5-5da00aff7892" (UID: "289d0100-7bba-49ad-97d5-5da00aff7892"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.222448 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-db9ccd3b-741c-45c5-884e-66033016cffe" (OuterVolumeSpecName: "persistence") pod "289d0100-7bba-49ad-97d5-5da00aff7892" (UID: "289d0100-7bba-49ad-97d5-5da00aff7892"). InnerVolumeSpecName "pvc-db9ccd3b-741c-45c5-884e-66033016cffe". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.223072 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/289d0100-7bba-49ad-97d5-5da00aff7892-server-conf" (OuterVolumeSpecName: "server-conf") pod "289d0100-7bba-49ad-97d5-5da00aff7892" (UID: "289d0100-7bba-49ad-97d5-5da00aff7892"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.283471 4854 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-db9ccd3b-741c-45c5-884e-66033016cffe\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-db9ccd3b-741c-45c5-884e-66033016cffe\") on node \"crc\" " Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.283772 4854 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/289d0100-7bba-49ad-97d5-5da00aff7892-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.283848 4854 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/289d0100-7bba-49ad-97d5-5da00aff7892-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.283937 4854 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/289d0100-7bba-49ad-97d5-5da00aff7892-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.284012 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdhrw\" (UniqueName: \"kubernetes.io/projected/289d0100-7bba-49ad-97d5-5da00aff7892-kube-api-access-kdhrw\") on node \"crc\" DevicePath \"\"" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.284077 4854 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/289d0100-7bba-49ad-97d5-5da00aff7892-pod-info\") on node \"crc\" DevicePath \"\"" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.284166 4854 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/289d0100-7bba-49ad-97d5-5da00aff7892-server-conf\") on node \"crc\" DevicePath \"\"" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.284237 4854 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/289d0100-7bba-49ad-97d5-5da00aff7892-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.302077 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/289d0100-7bba-49ad-97d5-5da00aff7892-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "289d0100-7bba-49ad-97d5-5da00aff7892" (UID: "289d0100-7bba-49ad-97d5-5da00aff7892"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.302636 4854 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.302879 4854 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-db9ccd3b-741c-45c5-884e-66033016cffe" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-db9ccd3b-741c-45c5-884e-66033016cffe") on node "crc" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.346762 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.384672 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/93f36345-422b-4c6d-ae0e-81df0c77849c-rabbitmq-plugins\") pod \"93f36345-422b-4c6d-ae0e-81df0c77849c\" (UID: \"93f36345-422b-4c6d-ae0e-81df0c77849c\") " Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.384735 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/93f36345-422b-4c6d-ae0e-81df0c77849c-rabbitmq-erlang-cookie\") pod \"93f36345-422b-4c6d-ae0e-81df0c77849c\" (UID: \"93f36345-422b-4c6d-ae0e-81df0c77849c\") " Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.384941 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6eb442d9-449e-433a-b9db-9e95fb0ba6d5\") pod \"93f36345-422b-4c6d-ae0e-81df0c77849c\" (UID: \"93f36345-422b-4c6d-ae0e-81df0c77849c\") " Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.384967 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/93f36345-422b-4c6d-ae0e-81df0c77849c-erlang-cookie-secret\") pod \"93f36345-422b-4c6d-ae0e-81df0c77849c\" (UID: \"93f36345-422b-4c6d-ae0e-81df0c77849c\") " Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.384985 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/93f36345-422b-4c6d-ae0e-81df0c77849c-plugins-conf\") pod \"93f36345-422b-4c6d-ae0e-81df0c77849c\" (UID: \"93f36345-422b-4c6d-ae0e-81df0c77849c\") " Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.385034 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/93f36345-422b-4c6d-ae0e-81df0c77849c-server-conf\") pod \"93f36345-422b-4c6d-ae0e-81df0c77849c\" (UID: \"93f36345-422b-4c6d-ae0e-81df0c77849c\") " Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.385060 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9bmh7\" (UniqueName: \"kubernetes.io/projected/93f36345-422b-4c6d-ae0e-81df0c77849c-kube-api-access-9bmh7\") pod \"93f36345-422b-4c6d-ae0e-81df0c77849c\" (UID: \"93f36345-422b-4c6d-ae0e-81df0c77849c\") " Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.385081 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/93f36345-422b-4c6d-ae0e-81df0c77849c-rabbitmq-confd\") pod \"93f36345-422b-4c6d-ae0e-81df0c77849c\" (UID: \"93f36345-422b-4c6d-ae0e-81df0c77849c\") " Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.385112 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/93f36345-422b-4c6d-ae0e-81df0c77849c-pod-info\") pod \"93f36345-422b-4c6d-ae0e-81df0c77849c\" (UID: \"93f36345-422b-4c6d-ae0e-81df0c77849c\") " Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.385322 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93f36345-422b-4c6d-ae0e-81df0c77849c-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod 
"93f36345-422b-4c6d-ae0e-81df0c77849c" (UID: "93f36345-422b-4c6d-ae0e-81df0c77849c"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.385503 4854 reconciler_common.go:293] "Volume detached for volume \"pvc-db9ccd3b-741c-45c5-884e-66033016cffe\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-db9ccd3b-741c-45c5-884e-66033016cffe\") on node \"crc\" DevicePath \"\"" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.385517 4854 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/289d0100-7bba-49ad-97d5-5da00aff7892-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.385528 4854 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/93f36345-422b-4c6d-ae0e-81df0c77849c-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.385534 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93f36345-422b-4c6d-ae0e-81df0c77849c-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "93f36345-422b-4c6d-ae0e-81df0c77849c" (UID: "93f36345-422b-4c6d-ae0e-81df0c77849c"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.385638 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93f36345-422b-4c6d-ae0e-81df0c77849c-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "93f36345-422b-4c6d-ae0e-81df0c77849c" (UID: "93f36345-422b-4c6d-ae0e-81df0c77849c"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.398662 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/93f36345-422b-4c6d-ae0e-81df0c77849c-pod-info" (OuterVolumeSpecName: "pod-info") pod "93f36345-422b-4c6d-ae0e-81df0c77849c" (UID: "93f36345-422b-4c6d-ae0e-81df0c77849c"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.398799 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93f36345-422b-4c6d-ae0e-81df0c77849c-kube-api-access-9bmh7" (OuterVolumeSpecName: "kube-api-access-9bmh7") pod "93f36345-422b-4c6d-ae0e-81df0c77849c" (UID: "93f36345-422b-4c6d-ae0e-81df0c77849c"). InnerVolumeSpecName "kube-api-access-9bmh7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.399104 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93f36345-422b-4c6d-ae0e-81df0c77849c-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "93f36345-422b-4c6d-ae0e-81df0c77849c" (UID: "93f36345-422b-4c6d-ae0e-81df0c77849c"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.403082 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6eb442d9-449e-433a-b9db-9e95fb0ba6d5" (OuterVolumeSpecName: "persistence") pod "93f36345-422b-4c6d-ae0e-81df0c77849c" (UID: "93f36345-422b-4c6d-ae0e-81df0c77849c"). InnerVolumeSpecName "pvc-6eb442d9-449e-433a-b9db-9e95fb0ba6d5". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.424962 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93f36345-422b-4c6d-ae0e-81df0c77849c-server-conf" (OuterVolumeSpecName: "server-conf") pod "93f36345-422b-4c6d-ae0e-81df0c77849c" (UID: "93f36345-422b-4c6d-ae0e-81df0c77849c"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.480872 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93f36345-422b-4c6d-ae0e-81df0c77849c-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "93f36345-422b-4c6d-ae0e-81df0c77849c" (UID: "93f36345-422b-4c6d-ae0e-81df0c77849c"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.490990 4854 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-6eb442d9-449e-433a-b9db-9e95fb0ba6d5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6eb442d9-449e-433a-b9db-9e95fb0ba6d5\") on node \"crc\" " Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.491028 4854 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/93f36345-422b-4c6d-ae0e-81df0c77849c-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.491039 4854 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/93f36345-422b-4c6d-ae0e-81df0c77849c-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.491049 4854 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/93f36345-422b-4c6d-ae0e-81df0c77849c-server-conf\") on node \"crc\" DevicePath \"\"" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.491058 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9bmh7\" (UniqueName: \"kubernetes.io/projected/93f36345-422b-4c6d-ae0e-81df0c77849c-kube-api-access-9bmh7\") on node \"crc\" DevicePath \"\"" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.491069 4854 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/93f36345-422b-4c6d-ae0e-81df0c77849c-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.491077 4854 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/93f36345-422b-4c6d-ae0e-81df0c77849c-pod-info\") on node \"crc\" DevicePath \"\"" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.491084 4854 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/93f36345-422b-4c6d-ae0e-81df0c77849c-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.518673 4854 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.518937 4854 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-6eb442d9-449e-433a-b9db-9e95fb0ba6d5" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6eb442d9-449e-433a-b9db-9e95fb0ba6d5") on node "crc" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.552426 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"289d0100-7bba-49ad-97d5-5da00aff7892","Type":"ContainerDied","Data":"f4f0751091e2f1369ea1019490020d3a7ee7bdc74eb80ea3b008375019dbef03"} Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.552472 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.552485 4854 scope.go:117] "RemoveContainer" containerID="7a00ecb8cf25fdee65d903a782cd5ccf586f3850bb7dbcf79840a3b6967c16fb" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.558988 4854 generic.go:334] "Generic (PLEG): container finished" podID="93f36345-422b-4c6d-ae0e-81df0c77849c" containerID="d23770d8332ddb6faa33926db8af888e57fa814665d848dd0e2dc5baa1789073" exitCode=0 Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.559020 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"93f36345-422b-4c6d-ae0e-81df0c77849c","Type":"ContainerDied","Data":"d23770d8332ddb6faa33926db8af888e57fa814665d848dd0e2dc5baa1789073"} Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.559041 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"93f36345-422b-4c6d-ae0e-81df0c77849c","Type":"ContainerDied","Data":"2529f8fae7f0f3bb8e187c44be0fed78381127e92124162939082fe3cdcc07e2"} Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.559077 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.578238 4854 scope.go:117] "RemoveContainer" containerID="b8aea0ec11206a84658e3be4f69f5848dce3034722f3c71933bf792cc9e532f6" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.592930 4854 reconciler_common.go:293] "Volume detached for volume \"pvc-6eb442d9-449e-433a-b9db-9e95fb0ba6d5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6eb442d9-449e-433a-b9db-9e95fb0ba6d5\") on node \"crc\" DevicePath \"\"" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.614178 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.619986 4854 scope.go:117] "RemoveContainer" containerID="d23770d8332ddb6faa33926db8af888e57fa814665d848dd0e2dc5baa1789073" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.623567 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.643713 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.663044 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 07 13:43:34 crc kubenswrapper[4854]: E1007 13:43:34.664273 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="289d0100-7bba-49ad-97d5-5da00aff7892" containerName="setup-container" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.664379 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="289d0100-7bba-49ad-97d5-5da00aff7892" containerName="setup-container" Oct 07 13:43:34 crc kubenswrapper[4854]: E1007 13:43:34.664509 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="289d0100-7bba-49ad-97d5-5da00aff7892" containerName="rabbitmq" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.664621 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="289d0100-7bba-49ad-97d5-5da00aff7892" containerName="rabbitmq" Oct 07 13:43:34 crc kubenswrapper[4854]: E1007 13:43:34.664804 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93f36345-422b-4c6d-ae0e-81df0c77849c" containerName="setup-container" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.665183 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="93f36345-422b-4c6d-ae0e-81df0c77849c" containerName="setup-container" Oct 07 13:43:34 crc kubenswrapper[4854]: E1007 13:43:34.665314 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93f36345-422b-4c6d-ae0e-81df0c77849c" containerName="rabbitmq" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.665528 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="93f36345-422b-4c6d-ae0e-81df0c77849c" containerName="rabbitmq" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.665999 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="289d0100-7bba-49ad-97d5-5da00aff7892" containerName="rabbitmq" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.666109 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="93f36345-422b-4c6d-ae0e-81df0c77849c" containerName="rabbitmq" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.668367 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.675805 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.676046 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.676195 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.676328 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-khxrd" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.676471 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.680047 4854 scope.go:117] "RemoveContainer" containerID="9674bb15cc599e22f842d49fa840811dfb11404d97f48a6e5949fbb4bee13d60" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.692478 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.695101 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/463af3df-b7a7-45fe-a892-19b29608505d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"463af3df-b7a7-45fe-a892-19b29608505d\") " pod="openstack/rabbitmq-server-0" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.695169 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/463af3df-b7a7-45fe-a892-19b29608505d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"463af3df-b7a7-45fe-a892-19b29608505d\") " pod="openstack/rabbitmq-server-0" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.695223 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-db9ccd3b-741c-45c5-884e-66033016cffe\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-db9ccd3b-741c-45c5-884e-66033016cffe\") pod \"rabbitmq-server-0\" (UID: \"463af3df-b7a7-45fe-a892-19b29608505d\") " pod="openstack/rabbitmq-server-0" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.695250 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/463af3df-b7a7-45fe-a892-19b29608505d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"463af3df-b7a7-45fe-a892-19b29608505d\") " pod="openstack/rabbitmq-server-0" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.695297 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/463af3df-b7a7-45fe-a892-19b29608505d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"463af3df-b7a7-45fe-a892-19b29608505d\") " pod="openstack/rabbitmq-server-0" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.695440 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fth2d\" (UniqueName: \"kubernetes.io/projected/463af3df-b7a7-45fe-a892-19b29608505d-kube-api-access-fth2d\") pod \"rabbitmq-server-0\" (UID: 
\"463af3df-b7a7-45fe-a892-19b29608505d\") " pod="openstack/rabbitmq-server-0" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.695481 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/463af3df-b7a7-45fe-a892-19b29608505d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"463af3df-b7a7-45fe-a892-19b29608505d\") " pod="openstack/rabbitmq-server-0" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.695623 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/463af3df-b7a7-45fe-a892-19b29608505d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"463af3df-b7a7-45fe-a892-19b29608505d\") " pod="openstack/rabbitmq-server-0" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.695660 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/463af3df-b7a7-45fe-a892-19b29608505d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"463af3df-b7a7-45fe-a892-19b29608505d\") " pod="openstack/rabbitmq-server-0" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.707991 4854 scope.go:117] "RemoveContainer" containerID="d23770d8332ddb6faa33926db8af888e57fa814665d848dd0e2dc5baa1789073" Oct 07 13:43:34 crc kubenswrapper[4854]: E1007 13:43:34.713674 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d23770d8332ddb6faa33926db8af888e57fa814665d848dd0e2dc5baa1789073\": container with ID starting with d23770d8332ddb6faa33926db8af888e57fa814665d848dd0e2dc5baa1789073 not found: ID does not exist" containerID="d23770d8332ddb6faa33926db8af888e57fa814665d848dd0e2dc5baa1789073" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.713734 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d23770d8332ddb6faa33926db8af888e57fa814665d848dd0e2dc5baa1789073"} err="failed to get container status \"d23770d8332ddb6faa33926db8af888e57fa814665d848dd0e2dc5baa1789073\": rpc error: code = NotFound desc = could not find container \"d23770d8332ddb6faa33926db8af888e57fa814665d848dd0e2dc5baa1789073\": container with ID starting with d23770d8332ddb6faa33926db8af888e57fa814665d848dd0e2dc5baa1789073 not found: ID does not exist" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.713767 4854 scope.go:117] "RemoveContainer" containerID="9674bb15cc599e22f842d49fa840811dfb11404d97f48a6e5949fbb4bee13d60" Oct 07 13:43:34 crc kubenswrapper[4854]: E1007 13:43:34.715299 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9674bb15cc599e22f842d49fa840811dfb11404d97f48a6e5949fbb4bee13d60\": container with ID starting with 9674bb15cc599e22f842d49fa840811dfb11404d97f48a6e5949fbb4bee13d60 not found: ID does not exist" containerID="9674bb15cc599e22f842d49fa840811dfb11404d97f48a6e5949fbb4bee13d60" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.715362 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9674bb15cc599e22f842d49fa840811dfb11404d97f48a6e5949fbb4bee13d60"} err="failed to get container status \"9674bb15cc599e22f842d49fa840811dfb11404d97f48a6e5949fbb4bee13d60\": rpc error: code = NotFound desc = could not find container 
\"9674bb15cc599e22f842d49fa840811dfb11404d97f48a6e5949fbb4bee13d60\": container with ID starting with 9674bb15cc599e22f842d49fa840811dfb11404d97f48a6e5949fbb4bee13d60 not found: ID does not exist" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.733346 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="289d0100-7bba-49ad-97d5-5da00aff7892" path="/var/lib/kubelet/pods/289d0100-7bba-49ad-97d5-5da00aff7892/volumes" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.734197 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93f36345-422b-4c6d-ae0e-81df0c77849c" path="/var/lib/kubelet/pods/93f36345-422b-4c6d-ae0e-81df0c77849c/volumes" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.734768 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.735630 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.739091 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.742696 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.742733 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.742808 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.742901 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.743014 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-mbbwj" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.744229 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.769321 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b7946d7b9-pjkqv" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.799138 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d2c1bc67-9713-443b-90af-57a362c1d358-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d2c1bc67-9713-443b-90af-57a362c1d358\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.799678 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d2c1bc67-9713-443b-90af-57a362c1d358-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"d2c1bc67-9713-443b-90af-57a362c1d358\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.799723 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/463af3df-b7a7-45fe-a892-19b29608505d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"463af3df-b7a7-45fe-a892-19b29608505d\") " 
pod="openstack/rabbitmq-server-0" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.799751 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d2c1bc67-9713-443b-90af-57a362c1d358-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"d2c1bc67-9713-443b-90af-57a362c1d358\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.799774 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d2c1bc67-9713-443b-90af-57a362c1d358-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"d2c1bc67-9713-443b-90af-57a362c1d358\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.799881 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/463af3df-b7a7-45fe-a892-19b29608505d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"463af3df-b7a7-45fe-a892-19b29608505d\") " pod="openstack/rabbitmq-server-0" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.799914 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvm92\" (UniqueName: \"kubernetes.io/projected/d2c1bc67-9713-443b-90af-57a362c1d358-kube-api-access-nvm92\") pod \"rabbitmq-cell1-server-0\" (UID: \"d2c1bc67-9713-443b-90af-57a362c1d358\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.799943 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/463af3df-b7a7-45fe-a892-19b29608505d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"463af3df-b7a7-45fe-a892-19b29608505d\") " pod="openstack/rabbitmq-server-0" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.800019 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/463af3df-b7a7-45fe-a892-19b29608505d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"463af3df-b7a7-45fe-a892-19b29608505d\") " pod="openstack/rabbitmq-server-0" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.800040 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/463af3df-b7a7-45fe-a892-19b29608505d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"463af3df-b7a7-45fe-a892-19b29608505d\") " pod="openstack/rabbitmq-server-0" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.800084 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d2c1bc67-9713-443b-90af-57a362c1d358-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"d2c1bc67-9713-443b-90af-57a362c1d358\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.800213 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d2c1bc67-9713-443b-90af-57a362c1d358-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"d2c1bc67-9713-443b-90af-57a362c1d358\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:43:34 crc 
kubenswrapper[4854]: I1007 13:43:34.800241 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-db9ccd3b-741c-45c5-884e-66033016cffe\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-db9ccd3b-741c-45c5-884e-66033016cffe\") pod \"rabbitmq-server-0\" (UID: \"463af3df-b7a7-45fe-a892-19b29608505d\") " pod="openstack/rabbitmq-server-0" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.800277 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/463af3df-b7a7-45fe-a892-19b29608505d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"463af3df-b7a7-45fe-a892-19b29608505d\") " pod="openstack/rabbitmq-server-0" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.800325 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/463af3df-b7a7-45fe-a892-19b29608505d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"463af3df-b7a7-45fe-a892-19b29608505d\") " pod="openstack/rabbitmq-server-0" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.800344 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-6eb442d9-449e-433a-b9db-9e95fb0ba6d5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6eb442d9-449e-433a-b9db-9e95fb0ba6d5\") pod \"rabbitmq-cell1-server-0\" (UID: \"d2c1bc67-9713-443b-90af-57a362c1d358\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.800830 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/463af3df-b7a7-45fe-a892-19b29608505d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"463af3df-b7a7-45fe-a892-19b29608505d\") " pod="openstack/rabbitmq-server-0" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.800870 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/463af3df-b7a7-45fe-a892-19b29608505d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"463af3df-b7a7-45fe-a892-19b29608505d\") " pod="openstack/rabbitmq-server-0" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.804472 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d2c1bc67-9713-443b-90af-57a362c1d358-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d2c1bc67-9713-443b-90af-57a362c1d358\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.804760 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/463af3df-b7a7-45fe-a892-19b29608505d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"463af3df-b7a7-45fe-a892-19b29608505d\") " pod="openstack/rabbitmq-server-0" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.804882 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fth2d\" (UniqueName: \"kubernetes.io/projected/463af3df-b7a7-45fe-a892-19b29608505d-kube-api-access-fth2d\") pod \"rabbitmq-server-0\" (UID: \"463af3df-b7a7-45fe-a892-19b29608505d\") " pod="openstack/rabbitmq-server-0" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.804931 4854 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/463af3df-b7a7-45fe-a892-19b29608505d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"463af3df-b7a7-45fe-a892-19b29608505d\") " pod="openstack/rabbitmq-server-0" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.805970 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/463af3df-b7a7-45fe-a892-19b29608505d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"463af3df-b7a7-45fe-a892-19b29608505d\") " pod="openstack/rabbitmq-server-0" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.806032 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/463af3df-b7a7-45fe-a892-19b29608505d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"463af3df-b7a7-45fe-a892-19b29608505d\") " pod="openstack/rabbitmq-server-0" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.810453 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/463af3df-b7a7-45fe-a892-19b29608505d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"463af3df-b7a7-45fe-a892-19b29608505d\") " pod="openstack/rabbitmq-server-0" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.813228 4854 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.813282 4854 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-db9ccd3b-741c-45c5-884e-66033016cffe\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-db9ccd3b-741c-45c5-884e-66033016cffe\") pod \"rabbitmq-server-0\" (UID: \"463af3df-b7a7-45fe-a892-19b29608505d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f8ae3fdce626746353c12a32b28066849916b5f2fe33f78cf5c597724d462143/globalmount\"" pod="openstack/rabbitmq-server-0" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.825774 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-mnlfq"] Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.826058 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-98ddfc8f-mnlfq" podUID="1e2beed9-bee3-40eb-a67f-30e29fc2cae1" containerName="dnsmasq-dns" containerID="cri-o://a3e92b175c2285941f89b6bdbdcb58819bd1deda0da250c1dacf41ccd33e47ee" gracePeriod=10 Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.827477 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fth2d\" (UniqueName: \"kubernetes.io/projected/463af3df-b7a7-45fe-a892-19b29608505d-kube-api-access-fth2d\") pod \"rabbitmq-server-0\" (UID: \"463af3df-b7a7-45fe-a892-19b29608505d\") " pod="openstack/rabbitmq-server-0" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.868675 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-db9ccd3b-741c-45c5-884e-66033016cffe\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-db9ccd3b-741c-45c5-884e-66033016cffe\") pod \"rabbitmq-server-0\" (UID: \"463af3df-b7a7-45fe-a892-19b29608505d\") " pod="openstack/rabbitmq-server-0" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.906985 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-6eb442d9-449e-433a-b9db-9e95fb0ba6d5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6eb442d9-449e-433a-b9db-9e95fb0ba6d5\") pod \"rabbitmq-cell1-server-0\" (UID: \"d2c1bc67-9713-443b-90af-57a362c1d358\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.907053 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d2c1bc67-9713-443b-90af-57a362c1d358-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d2c1bc67-9713-443b-90af-57a362c1d358\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.907097 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d2c1bc67-9713-443b-90af-57a362c1d358-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"d2c1bc67-9713-443b-90af-57a362c1d358\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.907123 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d2c1bc67-9713-443b-90af-57a362c1d358-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d2c1bc67-9713-443b-90af-57a362c1d358\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.907170 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d2c1bc67-9713-443b-90af-57a362c1d358-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"d2c1bc67-9713-443b-90af-57a362c1d358\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.907191 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d2c1bc67-9713-443b-90af-57a362c1d358-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"d2c1bc67-9713-443b-90af-57a362c1d358\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.907247 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvm92\" (UniqueName: \"kubernetes.io/projected/d2c1bc67-9713-443b-90af-57a362c1d358-kube-api-access-nvm92\") pod \"rabbitmq-cell1-server-0\" (UID: \"d2c1bc67-9713-443b-90af-57a362c1d358\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.907303 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d2c1bc67-9713-443b-90af-57a362c1d358-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"d2c1bc67-9713-443b-90af-57a362c1d358\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.907334 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d2c1bc67-9713-443b-90af-57a362c1d358-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"d2c1bc67-9713-443b-90af-57a362c1d358\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.907880 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/d2c1bc67-9713-443b-90af-57a362c1d358-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"d2c1bc67-9713-443b-90af-57a362c1d358\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.908313 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d2c1bc67-9713-443b-90af-57a362c1d358-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d2c1bc67-9713-443b-90af-57a362c1d358\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.908383 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d2c1bc67-9713-443b-90af-57a362c1d358-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"d2c1bc67-9713-443b-90af-57a362c1d358\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.908511 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d2c1bc67-9713-443b-90af-57a362c1d358-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d2c1bc67-9713-443b-90af-57a362c1d358\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.909650 4854 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.909678 4854 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-6eb442d9-449e-433a-b9db-9e95fb0ba6d5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6eb442d9-449e-433a-b9db-9e95fb0ba6d5\") pod \"rabbitmq-cell1-server-0\" (UID: \"d2c1bc67-9713-443b-90af-57a362c1d358\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a14a361ed6851d73e900ab2b820f03fca2facbdd6d044258239c17fbcb5c3195/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.911416 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d2c1bc67-9713-443b-90af-57a362c1d358-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"d2c1bc67-9713-443b-90af-57a362c1d358\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.911666 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d2c1bc67-9713-443b-90af-57a362c1d358-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"d2c1bc67-9713-443b-90af-57a362c1d358\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.911890 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d2c1bc67-9713-443b-90af-57a362c1d358-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"d2c1bc67-9713-443b-90af-57a362c1d358\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.928141 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvm92\" (UniqueName: \"kubernetes.io/projected/d2c1bc67-9713-443b-90af-57a362c1d358-kube-api-access-nvm92\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"d2c1bc67-9713-443b-90af-57a362c1d358\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:43:34 crc kubenswrapper[4854]: I1007 13:43:34.942248 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-6eb442d9-449e-433a-b9db-9e95fb0ba6d5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6eb442d9-449e-433a-b9db-9e95fb0ba6d5\") pod \"rabbitmq-cell1-server-0\" (UID: \"d2c1bc67-9713-443b-90af-57a362c1d358\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:43:35 crc kubenswrapper[4854]: I1007 13:43:35.033968 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 07 13:43:35 crc kubenswrapper[4854]: I1007 13:43:35.058749 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:43:35 crc kubenswrapper[4854]: I1007 13:43:35.189868 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-mnlfq" Oct 07 13:43:35 crc kubenswrapper[4854]: I1007 13:43:35.229367 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hsjx\" (UniqueName: \"kubernetes.io/projected/1e2beed9-bee3-40eb-a67f-30e29fc2cae1-kube-api-access-4hsjx\") pod \"1e2beed9-bee3-40eb-a67f-30e29fc2cae1\" (UID: \"1e2beed9-bee3-40eb-a67f-30e29fc2cae1\") " Oct 07 13:43:35 crc kubenswrapper[4854]: I1007 13:43:35.231240 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1e2beed9-bee3-40eb-a67f-30e29fc2cae1-dns-svc\") pod \"1e2beed9-bee3-40eb-a67f-30e29fc2cae1\" (UID: \"1e2beed9-bee3-40eb-a67f-30e29fc2cae1\") " Oct 07 13:43:35 crc kubenswrapper[4854]: I1007 13:43:35.231329 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e2beed9-bee3-40eb-a67f-30e29fc2cae1-config\") pod \"1e2beed9-bee3-40eb-a67f-30e29fc2cae1\" (UID: \"1e2beed9-bee3-40eb-a67f-30e29fc2cae1\") " Oct 07 13:43:35 crc kubenswrapper[4854]: I1007 13:43:35.240440 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e2beed9-bee3-40eb-a67f-30e29fc2cae1-kube-api-access-4hsjx" (OuterVolumeSpecName: "kube-api-access-4hsjx") pod "1e2beed9-bee3-40eb-a67f-30e29fc2cae1" (UID: "1e2beed9-bee3-40eb-a67f-30e29fc2cae1"). InnerVolumeSpecName "kube-api-access-4hsjx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:43:35 crc kubenswrapper[4854]: I1007 13:43:35.276339 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e2beed9-bee3-40eb-a67f-30e29fc2cae1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1e2beed9-bee3-40eb-a67f-30e29fc2cae1" (UID: "1e2beed9-bee3-40eb-a67f-30e29fc2cae1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:43:35 crc kubenswrapper[4854]: I1007 13:43:35.281277 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e2beed9-bee3-40eb-a67f-30e29fc2cae1-config" (OuterVolumeSpecName: "config") pod "1e2beed9-bee3-40eb-a67f-30e29fc2cae1" (UID: "1e2beed9-bee3-40eb-a67f-30e29fc2cae1"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:43:35 crc kubenswrapper[4854]: I1007 13:43:35.332654 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hsjx\" (UniqueName: \"kubernetes.io/projected/1e2beed9-bee3-40eb-a67f-30e29fc2cae1-kube-api-access-4hsjx\") on node \"crc\" DevicePath \"\"" Oct 07 13:43:35 crc kubenswrapper[4854]: I1007 13:43:35.332682 4854 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1e2beed9-bee3-40eb-a67f-30e29fc2cae1-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 13:43:35 crc kubenswrapper[4854]: I1007 13:43:35.332691 4854 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e2beed9-bee3-40eb-a67f-30e29fc2cae1-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:43:35 crc kubenswrapper[4854]: I1007 13:43:35.576745 4854 generic.go:334] "Generic (PLEG): container finished" podID="1e2beed9-bee3-40eb-a67f-30e29fc2cae1" containerID="a3e92b175c2285941f89b6bdbdcb58819bd1deda0da250c1dacf41ccd33e47ee" exitCode=0 Oct 07 13:43:35 crc kubenswrapper[4854]: I1007 13:43:35.576826 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-mnlfq" event={"ID":"1e2beed9-bee3-40eb-a67f-30e29fc2cae1","Type":"ContainerDied","Data":"a3e92b175c2285941f89b6bdbdcb58819bd1deda0da250c1dacf41ccd33e47ee"} Oct 07 13:43:35 crc kubenswrapper[4854]: I1007 13:43:35.576860 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-mnlfq" event={"ID":"1e2beed9-bee3-40eb-a67f-30e29fc2cae1","Type":"ContainerDied","Data":"49dba889f339da33ef8be99ddb8c401cba630e84fbf2d13413e07ca3302a8110"} Oct 07 13:43:35 crc kubenswrapper[4854]: I1007 13:43:35.576882 4854 scope.go:117] "RemoveContainer" containerID="a3e92b175c2285941f89b6bdbdcb58819bd1deda0da250c1dacf41ccd33e47ee" Oct 07 13:43:35 crc kubenswrapper[4854]: I1007 13:43:35.577001 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-mnlfq" Oct 07 13:43:35 crc kubenswrapper[4854]: I1007 13:43:35.615830 4854 scope.go:117] "RemoveContainer" containerID="e2212ccb65b66334f096d3e38479e12df84727641949be9e08824c0e459f4873" Oct 07 13:43:35 crc kubenswrapper[4854]: I1007 13:43:35.628557 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-mnlfq"] Oct 07 13:43:35 crc kubenswrapper[4854]: I1007 13:43:35.632640 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-mnlfq"] Oct 07 13:43:35 crc kubenswrapper[4854]: I1007 13:43:35.653009 4854 scope.go:117] "RemoveContainer" containerID="a3e92b175c2285941f89b6bdbdcb58819bd1deda0da250c1dacf41ccd33e47ee" Oct 07 13:43:35 crc kubenswrapper[4854]: E1007 13:43:35.655673 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3e92b175c2285941f89b6bdbdcb58819bd1deda0da250c1dacf41ccd33e47ee\": container with ID starting with a3e92b175c2285941f89b6bdbdcb58819bd1deda0da250c1dacf41ccd33e47ee not found: ID does not exist" containerID="a3e92b175c2285941f89b6bdbdcb58819bd1deda0da250c1dacf41ccd33e47ee" Oct 07 13:43:35 crc kubenswrapper[4854]: I1007 13:43:35.655731 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3e92b175c2285941f89b6bdbdcb58819bd1deda0da250c1dacf41ccd33e47ee"} err="failed to get container status \"a3e92b175c2285941f89b6bdbdcb58819bd1deda0da250c1dacf41ccd33e47ee\": rpc error: code = NotFound desc = could not find container \"a3e92b175c2285941f89b6bdbdcb58819bd1deda0da250c1dacf41ccd33e47ee\": container with ID starting with a3e92b175c2285941f89b6bdbdcb58819bd1deda0da250c1dacf41ccd33e47ee not found: ID does not exist" Oct 07 13:43:35 crc kubenswrapper[4854]: I1007 13:43:35.655772 4854 scope.go:117] "RemoveContainer" containerID="e2212ccb65b66334f096d3e38479e12df84727641949be9e08824c0e459f4873" Oct 07 13:43:35 crc kubenswrapper[4854]: E1007 13:43:35.656453 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2212ccb65b66334f096d3e38479e12df84727641949be9e08824c0e459f4873\": container with ID starting with e2212ccb65b66334f096d3e38479e12df84727641949be9e08824c0e459f4873 not found: ID does not exist" containerID="e2212ccb65b66334f096d3e38479e12df84727641949be9e08824c0e459f4873" Oct 07 13:43:35 crc kubenswrapper[4854]: I1007 13:43:35.656526 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2212ccb65b66334f096d3e38479e12df84727641949be9e08824c0e459f4873"} err="failed to get container status \"e2212ccb65b66334f096d3e38479e12df84727641949be9e08824c0e459f4873\": rpc error: code = NotFound desc = could not find container \"e2212ccb65b66334f096d3e38479e12df84727641949be9e08824c0e459f4873\": container with ID starting with e2212ccb65b66334f096d3e38479e12df84727641949be9e08824c0e459f4873 not found: ID does not exist" Oct 07 13:43:36 crc kubenswrapper[4854]: I1007 13:43:36.047810 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 07 13:43:36 crc kubenswrapper[4854]: W1007 13:43:36.068433 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2c1bc67_9713_443b_90af_57a362c1d358.slice/crio-8971f61a30c2e45a18d7778bc57c0893de4cf15bf63a6c664e550b7fefac968e WatchSource:0}: Error finding container 
8971f61a30c2e45a18d7778bc57c0893de4cf15bf63a6c664e550b7fefac968e: Status 404 returned error can't find the container with id 8971f61a30c2e45a18d7778bc57c0893de4cf15bf63a6c664e550b7fefac968e Oct 07 13:43:36 crc kubenswrapper[4854]: I1007 13:43:36.069640 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 07 13:43:36 crc kubenswrapper[4854]: I1007 13:43:36.618790 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d2c1bc67-9713-443b-90af-57a362c1d358","Type":"ContainerStarted","Data":"8971f61a30c2e45a18d7778bc57c0893de4cf15bf63a6c664e550b7fefac968e"} Oct 07 13:43:36 crc kubenswrapper[4854]: I1007 13:43:36.622712 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"463af3df-b7a7-45fe-a892-19b29608505d","Type":"ContainerStarted","Data":"48dcd9158102f4ae6bc7abdcec2b9ed282628b05955ad8c2cdc109fe663fb536"} Oct 07 13:43:36 crc kubenswrapper[4854]: I1007 13:43:36.724782 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e2beed9-bee3-40eb-a67f-30e29fc2cae1" path="/var/lib/kubelet/pods/1e2beed9-bee3-40eb-a67f-30e29fc2cae1/volumes" Oct 07 13:43:37 crc kubenswrapper[4854]: I1007 13:43:37.639135 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d2c1bc67-9713-443b-90af-57a362c1d358","Type":"ContainerStarted","Data":"42c351fde87c5d1b09734f15e979b77172bc276ab7e883dd5a5231f07538e648"} Oct 07 13:43:37 crc kubenswrapper[4854]: I1007 13:43:37.642032 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"463af3df-b7a7-45fe-a892-19b29608505d","Type":"ContainerStarted","Data":"f0f9c1b23405bb928db576ce0a544223cc0f3f87108739ea3078e76ad945f970"} Oct 07 13:43:44 crc kubenswrapper[4854]: I1007 13:43:44.705856 4854 scope.go:117] "RemoveContainer" containerID="e83798b11b7f463b027896c1ec862c4cbd843df2676156232deaf979d790ceea" Oct 07 13:43:44 crc kubenswrapper[4854]: E1007 13:43:44.706626 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:43:55 crc kubenswrapper[4854]: I1007 13:43:55.703430 4854 scope.go:117] "RemoveContainer" containerID="e83798b11b7f463b027896c1ec862c4cbd843df2676156232deaf979d790ceea" Oct 07 13:43:55 crc kubenswrapper[4854]: E1007 13:43:55.704621 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:44:09 crc kubenswrapper[4854]: I1007 13:44:09.703437 4854 scope.go:117] "RemoveContainer" containerID="e83798b11b7f463b027896c1ec862c4cbd843df2676156232deaf979d790ceea" Oct 07 13:44:09 crc kubenswrapper[4854]: E1007 13:44:09.704477 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:44:10 crc kubenswrapper[4854]: I1007 13:44:10.980189 4854 generic.go:334] "Generic (PLEG): container finished" podID="d2c1bc67-9713-443b-90af-57a362c1d358" containerID="42c351fde87c5d1b09734f15e979b77172bc276ab7e883dd5a5231f07538e648" exitCode=0 Oct 07 13:44:10 crc kubenswrapper[4854]: I1007 13:44:10.980317 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d2c1bc67-9713-443b-90af-57a362c1d358","Type":"ContainerDied","Data":"42c351fde87c5d1b09734f15e979b77172bc276ab7e883dd5a5231f07538e648"} Oct 07 13:44:10 crc kubenswrapper[4854]: I1007 13:44:10.983266 4854 generic.go:334] "Generic (PLEG): container finished" podID="463af3df-b7a7-45fe-a892-19b29608505d" containerID="f0f9c1b23405bb928db576ce0a544223cc0f3f87108739ea3078e76ad945f970" exitCode=0 Oct 07 13:44:10 crc kubenswrapper[4854]: I1007 13:44:10.983312 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"463af3df-b7a7-45fe-a892-19b29608505d","Type":"ContainerDied","Data":"f0f9c1b23405bb928db576ce0a544223cc0f3f87108739ea3078e76ad945f970"} Oct 07 13:44:11 crc kubenswrapper[4854]: I1007 13:44:11.997832 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"463af3df-b7a7-45fe-a892-19b29608505d","Type":"ContainerStarted","Data":"eee5ef94c584fd1da9673bbd64d194d82a37be9454b7d94e3eb4d9465b16e6cf"} Oct 07 13:44:11 crc kubenswrapper[4854]: I1007 13:44:11.998888 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 07 13:44:12 crc kubenswrapper[4854]: I1007 13:44:12.000303 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d2c1bc67-9713-443b-90af-57a362c1d358","Type":"ContainerStarted","Data":"4ee8fba2873e959d3b8dc1a3cb6a5269889fec16ab1899a9ff275c2647d4ea29"} Oct 07 13:44:12 crc kubenswrapper[4854]: I1007 13:44:12.000514 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:44:12 crc kubenswrapper[4854]: I1007 13:44:12.022036 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.022019228 podStartE2EDuration="38.022019228s" podCreationTimestamp="2025-10-07 13:43:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:44:12.019942549 +0000 UTC m=+4768.007774804" watchObservedRunningTime="2025-10-07 13:44:12.022019228 +0000 UTC m=+4768.009851483" Oct 07 13:44:12 crc kubenswrapper[4854]: I1007 13:44:12.053271 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=38.053246019 podStartE2EDuration="38.053246019s" podCreationTimestamp="2025-10-07 13:43:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:44:12.049446889 +0000 UTC m=+4768.037279144" watchObservedRunningTime="2025-10-07 13:44:12.053246019 +0000 UTC m=+4768.041078274" Oct 07 13:44:21 crc kubenswrapper[4854]: I1007 
13:44:21.702501 4854 scope.go:117] "RemoveContainer" containerID="e83798b11b7f463b027896c1ec862c4cbd843df2676156232deaf979d790ceea" Oct 07 13:44:21 crc kubenswrapper[4854]: E1007 13:44:21.703579 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:44:25 crc kubenswrapper[4854]: I1007 13:44:25.038139 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 07 13:44:25 crc kubenswrapper[4854]: I1007 13:44:25.062542 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 07 13:44:33 crc kubenswrapper[4854]: I1007 13:44:33.703941 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-1-default"] Oct 07 13:44:33 crc kubenswrapper[4854]: E1007 13:44:33.706550 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e2beed9-bee3-40eb-a67f-30e29fc2cae1" containerName="init" Oct 07 13:44:33 crc kubenswrapper[4854]: I1007 13:44:33.706592 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e2beed9-bee3-40eb-a67f-30e29fc2cae1" containerName="init" Oct 07 13:44:33 crc kubenswrapper[4854]: E1007 13:44:33.706637 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e2beed9-bee3-40eb-a67f-30e29fc2cae1" containerName="dnsmasq-dns" Oct 07 13:44:33 crc kubenswrapper[4854]: I1007 13:44:33.706647 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e2beed9-bee3-40eb-a67f-30e29fc2cae1" containerName="dnsmasq-dns" Oct 07 13:44:33 crc kubenswrapper[4854]: I1007 13:44:33.707687 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e2beed9-bee3-40eb-a67f-30e29fc2cae1" containerName="dnsmasq-dns" Oct 07 13:44:33 crc kubenswrapper[4854]: I1007 13:44:33.710097 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1-default" Oct 07 13:44:33 crc kubenswrapper[4854]: I1007 13:44:33.715231 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-9pznn" Oct 07 13:44:33 crc kubenswrapper[4854]: I1007 13:44:33.743927 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1-default"] Oct 07 13:44:33 crc kubenswrapper[4854]: I1007 13:44:33.814910 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8xlk\" (UniqueName: \"kubernetes.io/projected/666c1de5-519f-4b3d-a27c-18ecb7773e21-kube-api-access-w8xlk\") pod \"mariadb-client-1-default\" (UID: \"666c1de5-519f-4b3d-a27c-18ecb7773e21\") " pod="openstack/mariadb-client-1-default" Oct 07 13:44:33 crc kubenswrapper[4854]: I1007 13:44:33.916517 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8xlk\" (UniqueName: \"kubernetes.io/projected/666c1de5-519f-4b3d-a27c-18ecb7773e21-kube-api-access-w8xlk\") pod \"mariadb-client-1-default\" (UID: \"666c1de5-519f-4b3d-a27c-18ecb7773e21\") " pod="openstack/mariadb-client-1-default" Oct 07 13:44:33 crc kubenswrapper[4854]: I1007 13:44:33.960878 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8xlk\" (UniqueName: \"kubernetes.io/projected/666c1de5-519f-4b3d-a27c-18ecb7773e21-kube-api-access-w8xlk\") pod \"mariadb-client-1-default\" (UID: \"666c1de5-519f-4b3d-a27c-18ecb7773e21\") " pod="openstack/mariadb-client-1-default" Oct 07 13:44:34 crc kubenswrapper[4854]: I1007 13:44:34.039474 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default" Oct 07 13:44:34 crc kubenswrapper[4854]: I1007 13:44:34.611569 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1-default"] Oct 07 13:44:34 crc kubenswrapper[4854]: W1007 13:44:34.621029 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod666c1de5_519f_4b3d_a27c_18ecb7773e21.slice/crio-0de15d172bf64c1d4742a51a349d204689644d112c96c25e9872c7bbb3be2499 WatchSource:0}: Error finding container 0de15d172bf64c1d4742a51a349d204689644d112c96c25e9872c7bbb3be2499: Status 404 returned error can't find the container with id 0de15d172bf64c1d4742a51a349d204689644d112c96c25e9872c7bbb3be2499 Oct 07 13:44:35 crc kubenswrapper[4854]: I1007 13:44:35.243856 4854 generic.go:334] "Generic (PLEG): container finished" podID="666c1de5-519f-4b3d-a27c-18ecb7773e21" containerID="52c3487975b2637d7ec55f68cb5ac54713939ac47e5fe9dcfff9c0774f785a49" exitCode=0 Oct 07 13:44:35 crc kubenswrapper[4854]: I1007 13:44:35.244327 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1-default" event={"ID":"666c1de5-519f-4b3d-a27c-18ecb7773e21","Type":"ContainerDied","Data":"52c3487975b2637d7ec55f68cb5ac54713939ac47e5fe9dcfff9c0774f785a49"} Oct 07 13:44:35 crc kubenswrapper[4854]: I1007 13:44:35.244370 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1-default" event={"ID":"666c1de5-519f-4b3d-a27c-18ecb7773e21","Type":"ContainerStarted","Data":"0de15d172bf64c1d4742a51a349d204689644d112c96c25e9872c7bbb3be2499"} Oct 07 13:44:36 crc kubenswrapper[4854]: I1007 13:44:36.620417 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1-default" Oct 07 13:44:36 crc kubenswrapper[4854]: I1007 13:44:36.653800 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-1-default_666c1de5-519f-4b3d-a27c-18ecb7773e21/mariadb-client-1-default/0.log" Oct 07 13:44:36 crc kubenswrapper[4854]: I1007 13:44:36.687695 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-1-default"] Oct 07 13:44:36 crc kubenswrapper[4854]: I1007 13:44:36.696772 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-1-default"] Oct 07 13:44:36 crc kubenswrapper[4854]: I1007 13:44:36.702705 4854 scope.go:117] "RemoveContainer" containerID="e83798b11b7f463b027896c1ec862c4cbd843df2676156232deaf979d790ceea" Oct 07 13:44:36 crc kubenswrapper[4854]: E1007 13:44:36.703254 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:44:36 crc kubenswrapper[4854]: I1007 13:44:36.764535 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8xlk\" (UniqueName: \"kubernetes.io/projected/666c1de5-519f-4b3d-a27c-18ecb7773e21-kube-api-access-w8xlk\") pod \"666c1de5-519f-4b3d-a27c-18ecb7773e21\" (UID: \"666c1de5-519f-4b3d-a27c-18ecb7773e21\") " Oct 07 13:44:36 crc kubenswrapper[4854]: I1007 13:44:36.771484 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/666c1de5-519f-4b3d-a27c-18ecb7773e21-kube-api-access-w8xlk" (OuterVolumeSpecName: "kube-api-access-w8xlk") pod "666c1de5-519f-4b3d-a27c-18ecb7773e21" (UID: "666c1de5-519f-4b3d-a27c-18ecb7773e21"). InnerVolumeSpecName "kube-api-access-w8xlk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:44:36 crc kubenswrapper[4854]: I1007 13:44:36.868177 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8xlk\" (UniqueName: \"kubernetes.io/projected/666c1de5-519f-4b3d-a27c-18ecb7773e21-kube-api-access-w8xlk\") on node \"crc\" DevicePath \"\"" Oct 07 13:44:37 crc kubenswrapper[4854]: I1007 13:44:37.174324 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-2-default"] Oct 07 13:44:37 crc kubenswrapper[4854]: E1007 13:44:37.174689 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="666c1de5-519f-4b3d-a27c-18ecb7773e21" containerName="mariadb-client-1-default" Oct 07 13:44:37 crc kubenswrapper[4854]: I1007 13:44:37.174704 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="666c1de5-519f-4b3d-a27c-18ecb7773e21" containerName="mariadb-client-1-default" Oct 07 13:44:37 crc kubenswrapper[4854]: I1007 13:44:37.174895 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="666c1de5-519f-4b3d-a27c-18ecb7773e21" containerName="mariadb-client-1-default" Oct 07 13:44:37 crc kubenswrapper[4854]: I1007 13:44:37.175530 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2-default" Oct 07 13:44:37 crc kubenswrapper[4854]: I1007 13:44:37.182548 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2-default"] Oct 07 13:44:37 crc kubenswrapper[4854]: I1007 13:44:37.264567 4854 scope.go:117] "RemoveContainer" containerID="52c3487975b2637d7ec55f68cb5ac54713939ac47e5fe9dcfff9c0774f785a49" Oct 07 13:44:37 crc kubenswrapper[4854]: I1007 13:44:37.264786 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default" Oct 07 13:44:37 crc kubenswrapper[4854]: I1007 13:44:37.275974 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5trxg\" (UniqueName: \"kubernetes.io/projected/f5ef4831-fb02-457b-a2d6-c8366177aee4-kube-api-access-5trxg\") pod \"mariadb-client-2-default\" (UID: \"f5ef4831-fb02-457b-a2d6-c8366177aee4\") " pod="openstack/mariadb-client-2-default" Oct 07 13:44:37 crc kubenswrapper[4854]: I1007 13:44:37.377443 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5trxg\" (UniqueName: \"kubernetes.io/projected/f5ef4831-fb02-457b-a2d6-c8366177aee4-kube-api-access-5trxg\") pod \"mariadb-client-2-default\" (UID: \"f5ef4831-fb02-457b-a2d6-c8366177aee4\") " pod="openstack/mariadb-client-2-default" Oct 07 13:44:37 crc kubenswrapper[4854]: I1007 13:44:37.404714 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5trxg\" (UniqueName: \"kubernetes.io/projected/f5ef4831-fb02-457b-a2d6-c8366177aee4-kube-api-access-5trxg\") pod \"mariadb-client-2-default\" (UID: \"f5ef4831-fb02-457b-a2d6-c8366177aee4\") " pod="openstack/mariadb-client-2-default" Oct 07 13:44:37 crc kubenswrapper[4854]: I1007 13:44:37.497236 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2-default" Oct 07 13:44:37 crc kubenswrapper[4854]: E1007 13:44:37.516129 4854 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod666c1de5_519f_4b3d_a27c_18ecb7773e21.slice\": RecentStats: unable to find data in memory cache]" Oct 07 13:44:37 crc kubenswrapper[4854]: I1007 13:44:37.970544 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2-default"] Oct 07 13:44:37 crc kubenswrapper[4854]: W1007 13:44:37.982547 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5ef4831_fb02_457b_a2d6_c8366177aee4.slice/crio-71b0f01d5b47de42f79ac6c2102db4afc4a0fe71a5581e23701a38f4442777e4 WatchSource:0}: Error finding container 71b0f01d5b47de42f79ac6c2102db4afc4a0fe71a5581e23701a38f4442777e4: Status 404 returned error can't find the container with id 71b0f01d5b47de42f79ac6c2102db4afc4a0fe71a5581e23701a38f4442777e4 Oct 07 13:44:38 crc kubenswrapper[4854]: I1007 13:44:38.276790 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"f5ef4831-fb02-457b-a2d6-c8366177aee4","Type":"ContainerStarted","Data":"71b0f01d5b47de42f79ac6c2102db4afc4a0fe71a5581e23701a38f4442777e4"} Oct 07 13:44:38 crc kubenswrapper[4854]: I1007 13:44:38.723578 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="666c1de5-519f-4b3d-a27c-18ecb7773e21" path="/var/lib/kubelet/pods/666c1de5-519f-4b3d-a27c-18ecb7773e21/volumes" Oct 07 13:44:39 crc kubenswrapper[4854]: I1007 13:44:39.291869 4854 generic.go:334] "Generic (PLEG): container finished" podID="f5ef4831-fb02-457b-a2d6-c8366177aee4" containerID="9a8a7a345e3ad86e7194a48c26f089d9964d099de6188ff54c7c31044ad55383" exitCode=0 Oct 07 13:44:39 crc kubenswrapper[4854]: I1007 13:44:39.291952 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"f5ef4831-fb02-457b-a2d6-c8366177aee4","Type":"ContainerDied","Data":"9a8a7a345e3ad86e7194a48c26f089d9964d099de6188ff54c7c31044ad55383"} Oct 07 13:44:40 crc kubenswrapper[4854]: I1007 13:44:40.821049 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2-default" Oct 07 13:44:40 crc kubenswrapper[4854]: I1007 13:44:40.874016 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-2-default_f5ef4831-fb02-457b-a2d6-c8366177aee4/mariadb-client-2-default/0.log" Oct 07 13:44:40 crc kubenswrapper[4854]: I1007 13:44:40.899055 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-2-default"] Oct 07 13:44:40 crc kubenswrapper[4854]: I1007 13:44:40.904621 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-2-default"] Oct 07 13:44:40 crc kubenswrapper[4854]: I1007 13:44:40.940083 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5trxg\" (UniqueName: \"kubernetes.io/projected/f5ef4831-fb02-457b-a2d6-c8366177aee4-kube-api-access-5trxg\") pod \"f5ef4831-fb02-457b-a2d6-c8366177aee4\" (UID: \"f5ef4831-fb02-457b-a2d6-c8366177aee4\") " Oct 07 13:44:40 crc kubenswrapper[4854]: I1007 13:44:40.952469 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5ef4831-fb02-457b-a2d6-c8366177aee4-kube-api-access-5trxg" (OuterVolumeSpecName: "kube-api-access-5trxg") pod "f5ef4831-fb02-457b-a2d6-c8366177aee4" (UID: "f5ef4831-fb02-457b-a2d6-c8366177aee4"). InnerVolumeSpecName "kube-api-access-5trxg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:44:41 crc kubenswrapper[4854]: I1007 13:44:41.042226 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5trxg\" (UniqueName: \"kubernetes.io/projected/f5ef4831-fb02-457b-a2d6-c8366177aee4-kube-api-access-5trxg\") on node \"crc\" DevicePath \"\"" Oct 07 13:44:41 crc kubenswrapper[4854]: I1007 13:44:41.317938 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71b0f01d5b47de42f79ac6c2102db4afc4a0fe71a5581e23701a38f4442777e4" Oct 07 13:44:41 crc kubenswrapper[4854]: I1007 13:44:41.318044 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2-default" Oct 07 13:44:41 crc kubenswrapper[4854]: I1007 13:44:41.368404 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-1"] Oct 07 13:44:41 crc kubenswrapper[4854]: E1007 13:44:41.368915 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5ef4831-fb02-457b-a2d6-c8366177aee4" containerName="mariadb-client-2-default" Oct 07 13:44:41 crc kubenswrapper[4854]: I1007 13:44:41.368945 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5ef4831-fb02-457b-a2d6-c8366177aee4" containerName="mariadb-client-2-default" Oct 07 13:44:41 crc kubenswrapper[4854]: I1007 13:44:41.369262 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5ef4831-fb02-457b-a2d6-c8366177aee4" containerName="mariadb-client-2-default" Oct 07 13:44:41 crc kubenswrapper[4854]: I1007 13:44:41.370026 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1" Oct 07 13:44:41 crc kubenswrapper[4854]: I1007 13:44:41.374495 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-9pznn" Oct 07 13:44:41 crc kubenswrapper[4854]: I1007 13:44:41.383122 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1"] Oct 07 13:44:41 crc kubenswrapper[4854]: I1007 13:44:41.452031 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzprm\" (UniqueName: \"kubernetes.io/projected/1d9c72a9-2434-41e3-bbca-f91507eb3f9d-kube-api-access-qzprm\") pod \"mariadb-client-1\" (UID: \"1d9c72a9-2434-41e3-bbca-f91507eb3f9d\") " pod="openstack/mariadb-client-1" Oct 07 13:44:41 crc kubenswrapper[4854]: I1007 13:44:41.554231 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzprm\" (UniqueName: \"kubernetes.io/projected/1d9c72a9-2434-41e3-bbca-f91507eb3f9d-kube-api-access-qzprm\") pod \"mariadb-client-1\" (UID: \"1d9c72a9-2434-41e3-bbca-f91507eb3f9d\") " pod="openstack/mariadb-client-1" Oct 07 13:44:41 crc kubenswrapper[4854]: I1007 13:44:41.578753 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzprm\" (UniqueName: \"kubernetes.io/projected/1d9c72a9-2434-41e3-bbca-f91507eb3f9d-kube-api-access-qzprm\") pod \"mariadb-client-1\" (UID: \"1d9c72a9-2434-41e3-bbca-f91507eb3f9d\") " pod="openstack/mariadb-client-1" Oct 07 13:44:41 crc kubenswrapper[4854]: I1007 13:44:41.724670 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1" Oct 07 13:44:42 crc kubenswrapper[4854]: I1007 13:44:42.290899 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1"] Oct 07 13:44:42 crc kubenswrapper[4854]: W1007 13:44:42.301437 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d9c72a9_2434_41e3_bbca_f91507eb3f9d.slice/crio-468db1122cbf07ba0e30bb02427a74984014e5ad162864b9c4693db3635872ba WatchSource:0}: Error finding container 468db1122cbf07ba0e30bb02427a74984014e5ad162864b9c4693db3635872ba: Status 404 returned error can't find the container with id 468db1122cbf07ba0e30bb02427a74984014e5ad162864b9c4693db3635872ba Oct 07 13:44:42 crc kubenswrapper[4854]: I1007 13:44:42.330056 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1" event={"ID":"1d9c72a9-2434-41e3-bbca-f91507eb3f9d","Type":"ContainerStarted","Data":"468db1122cbf07ba0e30bb02427a74984014e5ad162864b9c4693db3635872ba"} Oct 07 13:44:42 crc kubenswrapper[4854]: I1007 13:44:42.715709 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5ef4831-fb02-457b-a2d6-c8366177aee4" path="/var/lib/kubelet/pods/f5ef4831-fb02-457b-a2d6-c8366177aee4/volumes" Oct 07 13:44:43 crc kubenswrapper[4854]: I1007 13:44:43.341637 4854 generic.go:334] "Generic (PLEG): container finished" podID="1d9c72a9-2434-41e3-bbca-f91507eb3f9d" containerID="c9c277661c4c241994c2a178c0bcd561c74a72a971d800d23051227389780cf4" exitCode=0 Oct 07 13:44:43 crc kubenswrapper[4854]: I1007 13:44:43.341695 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1" event={"ID":"1d9c72a9-2434-41e3-bbca-f91507eb3f9d","Type":"ContainerDied","Data":"c9c277661c4c241994c2a178c0bcd561c74a72a971d800d23051227389780cf4"} Oct 07 13:44:44 crc kubenswrapper[4854]: I1007 
13:44:44.823353 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1" Oct 07 13:44:44 crc kubenswrapper[4854]: I1007 13:44:44.844840 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-1_1d9c72a9-2434-41e3-bbca-f91507eb3f9d/mariadb-client-1/0.log" Oct 07 13:44:44 crc kubenswrapper[4854]: I1007 13:44:44.874425 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-1"] Oct 07 13:44:44 crc kubenswrapper[4854]: I1007 13:44:44.881336 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-1"] Oct 07 13:44:44 crc kubenswrapper[4854]: I1007 13:44:44.907416 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzprm\" (UniqueName: \"kubernetes.io/projected/1d9c72a9-2434-41e3-bbca-f91507eb3f9d-kube-api-access-qzprm\") pod \"1d9c72a9-2434-41e3-bbca-f91507eb3f9d\" (UID: \"1d9c72a9-2434-41e3-bbca-f91507eb3f9d\") " Oct 07 13:44:44 crc kubenswrapper[4854]: I1007 13:44:44.916050 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d9c72a9-2434-41e3-bbca-f91507eb3f9d-kube-api-access-qzprm" (OuterVolumeSpecName: "kube-api-access-qzprm") pod "1d9c72a9-2434-41e3-bbca-f91507eb3f9d" (UID: "1d9c72a9-2434-41e3-bbca-f91507eb3f9d"). InnerVolumeSpecName "kube-api-access-qzprm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:44:45 crc kubenswrapper[4854]: I1007 13:44:45.008638 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzprm\" (UniqueName: \"kubernetes.io/projected/1d9c72a9-2434-41e3-bbca-f91507eb3f9d-kube-api-access-qzprm\") on node \"crc\" DevicePath \"\"" Oct 07 13:44:45 crc kubenswrapper[4854]: I1007 13:44:45.366548 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="468db1122cbf07ba0e30bb02427a74984014e5ad162864b9c4693db3635872ba" Oct 07 13:44:45 crc kubenswrapper[4854]: I1007 13:44:45.366643 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1" Oct 07 13:44:45 crc kubenswrapper[4854]: I1007 13:44:45.439947 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-4-default"] Oct 07 13:44:45 crc kubenswrapper[4854]: E1007 13:44:45.440819 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d9c72a9-2434-41e3-bbca-f91507eb3f9d" containerName="mariadb-client-1" Oct 07 13:44:45 crc kubenswrapper[4854]: I1007 13:44:45.440842 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d9c72a9-2434-41e3-bbca-f91507eb3f9d" containerName="mariadb-client-1" Oct 07 13:44:45 crc kubenswrapper[4854]: I1007 13:44:45.441209 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d9c72a9-2434-41e3-bbca-f91507eb3f9d" containerName="mariadb-client-1" Oct 07 13:44:45 crc kubenswrapper[4854]: I1007 13:44:45.442176 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-4-default" Oct 07 13:44:45 crc kubenswrapper[4854]: I1007 13:44:45.446267 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-9pznn" Oct 07 13:44:45 crc kubenswrapper[4854]: I1007 13:44:45.495021 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-4-default"] Oct 07 13:44:45 crc kubenswrapper[4854]: I1007 13:44:45.518679 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqmsl\" (UniqueName: \"kubernetes.io/projected/d0610381-0cba-4993-9c1d-cd1f6fb36c36-kube-api-access-bqmsl\") pod \"mariadb-client-4-default\" (UID: \"d0610381-0cba-4993-9c1d-cd1f6fb36c36\") " pod="openstack/mariadb-client-4-default" Oct 07 13:44:45 crc kubenswrapper[4854]: I1007 13:44:45.620576 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqmsl\" (UniqueName: \"kubernetes.io/projected/d0610381-0cba-4993-9c1d-cd1f6fb36c36-kube-api-access-bqmsl\") pod \"mariadb-client-4-default\" (UID: \"d0610381-0cba-4993-9c1d-cd1f6fb36c36\") " pod="openstack/mariadb-client-4-default" Oct 07 13:44:45 crc kubenswrapper[4854]: I1007 13:44:45.648578 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqmsl\" (UniqueName: \"kubernetes.io/projected/d0610381-0cba-4993-9c1d-cd1f6fb36c36-kube-api-access-bqmsl\") pod \"mariadb-client-4-default\" (UID: \"d0610381-0cba-4993-9c1d-cd1f6fb36c36\") " pod="openstack/mariadb-client-4-default" Oct 07 13:44:45 crc kubenswrapper[4854]: I1007 13:44:45.821593 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-default" Oct 07 13:44:46 crc kubenswrapper[4854]: I1007 13:44:46.269314 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-4-default"] Oct 07 13:44:46 crc kubenswrapper[4854]: W1007 13:44:46.275299 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0610381_0cba_4993_9c1d_cd1f6fb36c36.slice/crio-af0b13b5a53135ba4b0826b0d6f35a5a36a44e2aebde697cbfcd2c930622a8de WatchSource:0}: Error finding container af0b13b5a53135ba4b0826b0d6f35a5a36a44e2aebde697cbfcd2c930622a8de: Status 404 returned error can't find the container with id af0b13b5a53135ba4b0826b0d6f35a5a36a44e2aebde697cbfcd2c930622a8de Oct 07 13:44:46 crc kubenswrapper[4854]: I1007 13:44:46.377755 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-4-default" event={"ID":"d0610381-0cba-4993-9c1d-cd1f6fb36c36","Type":"ContainerStarted","Data":"af0b13b5a53135ba4b0826b0d6f35a5a36a44e2aebde697cbfcd2c930622a8de"} Oct 07 13:44:46 crc kubenswrapper[4854]: I1007 13:44:46.720320 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d9c72a9-2434-41e3-bbca-f91507eb3f9d" path="/var/lib/kubelet/pods/1d9c72a9-2434-41e3-bbca-f91507eb3f9d/volumes" Oct 07 13:44:47 crc kubenswrapper[4854]: I1007 13:44:47.391424 4854 generic.go:334] "Generic (PLEG): container finished" podID="d0610381-0cba-4993-9c1d-cd1f6fb36c36" containerID="c47c93bffe35beb85bb46619b63814b0763e0aadad854e5621f7607081674ac8" exitCode=0 Oct 07 13:44:47 crc kubenswrapper[4854]: I1007 13:44:47.391527 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-4-default" 
event={"ID":"d0610381-0cba-4993-9c1d-cd1f6fb36c36","Type":"ContainerDied","Data":"c47c93bffe35beb85bb46619b63814b0763e0aadad854e5621f7607081674ac8"} Oct 07 13:44:48 crc kubenswrapper[4854]: I1007 13:44:48.875014 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-default" Oct 07 13:44:48 crc kubenswrapper[4854]: I1007 13:44:48.917308 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-4-default_d0610381-0cba-4993-9c1d-cd1f6fb36c36/mariadb-client-4-default/0.log" Oct 07 13:44:48 crc kubenswrapper[4854]: I1007 13:44:48.952905 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-4-default"] Oct 07 13:44:48 crc kubenswrapper[4854]: I1007 13:44:48.959676 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-4-default"] Oct 07 13:44:48 crc kubenswrapper[4854]: I1007 13:44:48.973177 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqmsl\" (UniqueName: \"kubernetes.io/projected/d0610381-0cba-4993-9c1d-cd1f6fb36c36-kube-api-access-bqmsl\") pod \"d0610381-0cba-4993-9c1d-cd1f6fb36c36\" (UID: \"d0610381-0cba-4993-9c1d-cd1f6fb36c36\") " Oct 07 13:44:48 crc kubenswrapper[4854]: I1007 13:44:48.981654 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0610381-0cba-4993-9c1d-cd1f6fb36c36-kube-api-access-bqmsl" (OuterVolumeSpecName: "kube-api-access-bqmsl") pod "d0610381-0cba-4993-9c1d-cd1f6fb36c36" (UID: "d0610381-0cba-4993-9c1d-cd1f6fb36c36"). InnerVolumeSpecName "kube-api-access-bqmsl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:44:49 crc kubenswrapper[4854]: I1007 13:44:49.075798 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqmsl\" (UniqueName: \"kubernetes.io/projected/d0610381-0cba-4993-9c1d-cd1f6fb36c36-kube-api-access-bqmsl\") on node \"crc\" DevicePath \"\"" Oct 07 13:44:49 crc kubenswrapper[4854]: I1007 13:44:49.415464 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af0b13b5a53135ba4b0826b0d6f35a5a36a44e2aebde697cbfcd2c930622a8de" Oct 07 13:44:49 crc kubenswrapper[4854]: I1007 13:44:49.415558 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-4-default" Oct 07 13:44:50 crc kubenswrapper[4854]: I1007 13:44:50.703336 4854 scope.go:117] "RemoveContainer" containerID="e83798b11b7f463b027896c1ec862c4cbd843df2676156232deaf979d790ceea" Oct 07 13:44:50 crc kubenswrapper[4854]: E1007 13:44:50.704433 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:44:50 crc kubenswrapper[4854]: I1007 13:44:50.717914 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0610381-0cba-4993-9c1d-cd1f6fb36c36" path="/var/lib/kubelet/pods/d0610381-0cba-4993-9c1d-cd1f6fb36c36/volumes" Oct 07 13:44:53 crc kubenswrapper[4854]: I1007 13:44:53.887189 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-5-default"] Oct 07 13:44:53 crc kubenswrapper[4854]: E1007 13:44:53.888935 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0610381-0cba-4993-9c1d-cd1f6fb36c36" containerName="mariadb-client-4-default" Oct 07 13:44:53 crc kubenswrapper[4854]: I1007 13:44:53.888961 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0610381-0cba-4993-9c1d-cd1f6fb36c36" containerName="mariadb-client-4-default" Oct 07 13:44:53 crc kubenswrapper[4854]: I1007 13:44:53.889285 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0610381-0cba-4993-9c1d-cd1f6fb36c36" containerName="mariadb-client-4-default" Oct 07 13:44:53 crc kubenswrapper[4854]: I1007 13:44:53.890143 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Oct 07 13:44:53 crc kubenswrapper[4854]: I1007 13:44:53.893557 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-9pznn" Oct 07 13:44:53 crc kubenswrapper[4854]: I1007 13:44:53.900246 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-5-default"] Oct 07 13:44:53 crc kubenswrapper[4854]: I1007 13:44:53.960663 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rd2t\" (UniqueName: \"kubernetes.io/projected/4fbb045c-108a-4b96-ace4-3ea13f97fd1e-kube-api-access-4rd2t\") pod \"mariadb-client-5-default\" (UID: \"4fbb045c-108a-4b96-ace4-3ea13f97fd1e\") " pod="openstack/mariadb-client-5-default" Oct 07 13:44:54 crc kubenswrapper[4854]: I1007 13:44:54.062614 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rd2t\" (UniqueName: \"kubernetes.io/projected/4fbb045c-108a-4b96-ace4-3ea13f97fd1e-kube-api-access-4rd2t\") pod \"mariadb-client-5-default\" (UID: \"4fbb045c-108a-4b96-ace4-3ea13f97fd1e\") " pod="openstack/mariadb-client-5-default" Oct 07 13:44:54 crc kubenswrapper[4854]: I1007 13:44:54.091732 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rd2t\" (UniqueName: \"kubernetes.io/projected/4fbb045c-108a-4b96-ace4-3ea13f97fd1e-kube-api-access-4rd2t\") pod \"mariadb-client-5-default\" (UID: \"4fbb045c-108a-4b96-ace4-3ea13f97fd1e\") " pod="openstack/mariadb-client-5-default" Oct 07 13:44:54 crc kubenswrapper[4854]: I1007 13:44:54.222935 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-5-default" Oct 07 13:44:54 crc kubenswrapper[4854]: I1007 13:44:54.849135 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-5-default"] Oct 07 13:44:55 crc kubenswrapper[4854]: I1007 13:44:55.480881 4854 generic.go:334] "Generic (PLEG): container finished" podID="4fbb045c-108a-4b96-ace4-3ea13f97fd1e" containerID="f71286733b12d0ccc72887cfbca56225b404a500c1af4dc2a966b93beefed5ca" exitCode=0 Oct 07 13:44:55 crc kubenswrapper[4854]: I1007 13:44:55.480996 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-5-default" event={"ID":"4fbb045c-108a-4b96-ace4-3ea13f97fd1e","Type":"ContainerDied","Data":"f71286733b12d0ccc72887cfbca56225b404a500c1af4dc2a966b93beefed5ca"} Oct 07 13:44:55 crc kubenswrapper[4854]: I1007 13:44:55.481320 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-5-default" event={"ID":"4fbb045c-108a-4b96-ace4-3ea13f97fd1e","Type":"ContainerStarted","Data":"c29ee64e94359fcd2fca694da044536cda62d75c0fcb2650520d2969e7495c6b"} Oct 07 13:44:56 crc kubenswrapper[4854]: I1007 13:44:56.954207 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Oct 07 13:44:56 crc kubenswrapper[4854]: I1007 13:44:56.978936 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-5-default_4fbb045c-108a-4b96-ace4-3ea13f97fd1e/mariadb-client-5-default/0.log" Oct 07 13:44:57 crc kubenswrapper[4854]: I1007 13:44:57.009525 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-5-default"] Oct 07 13:44:57 crc kubenswrapper[4854]: I1007 13:44:57.016466 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-5-default"] Oct 07 13:44:57 crc kubenswrapper[4854]: I1007 13:44:57.026665 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rd2t\" (UniqueName: \"kubernetes.io/projected/4fbb045c-108a-4b96-ace4-3ea13f97fd1e-kube-api-access-4rd2t\") pod \"4fbb045c-108a-4b96-ace4-3ea13f97fd1e\" (UID: \"4fbb045c-108a-4b96-ace4-3ea13f97fd1e\") " Oct 07 13:44:57 crc kubenswrapper[4854]: I1007 13:44:57.033989 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fbb045c-108a-4b96-ace4-3ea13f97fd1e-kube-api-access-4rd2t" (OuterVolumeSpecName: "kube-api-access-4rd2t") pod "4fbb045c-108a-4b96-ace4-3ea13f97fd1e" (UID: "4fbb045c-108a-4b96-ace4-3ea13f97fd1e"). InnerVolumeSpecName "kube-api-access-4rd2t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:44:57 crc kubenswrapper[4854]: I1007 13:44:57.128817 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rd2t\" (UniqueName: \"kubernetes.io/projected/4fbb045c-108a-4b96-ace4-3ea13f97fd1e-kube-api-access-4rd2t\") on node \"crc\" DevicePath \"\"" Oct 07 13:44:57 crc kubenswrapper[4854]: I1007 13:44:57.181597 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-6-default"] Oct 07 13:44:57 crc kubenswrapper[4854]: E1007 13:44:57.182008 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fbb045c-108a-4b96-ace4-3ea13f97fd1e" containerName="mariadb-client-5-default" Oct 07 13:44:57 crc kubenswrapper[4854]: I1007 13:44:57.182027 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fbb045c-108a-4b96-ace4-3ea13f97fd1e" containerName="mariadb-client-5-default" Oct 07 13:44:57 crc kubenswrapper[4854]: I1007 13:44:57.182274 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fbb045c-108a-4b96-ace4-3ea13f97fd1e" containerName="mariadb-client-5-default" Oct 07 13:44:57 crc kubenswrapper[4854]: I1007 13:44:57.182858 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-6-default" Oct 07 13:44:57 crc kubenswrapper[4854]: I1007 13:44:57.186532 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-6-default"] Oct 07 13:44:57 crc kubenswrapper[4854]: I1007 13:44:57.333027 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6m44j\" (UniqueName: \"kubernetes.io/projected/65b83727-5b7f-4322-bdef-8835e0d2220e-kube-api-access-6m44j\") pod \"mariadb-client-6-default\" (UID: \"65b83727-5b7f-4322-bdef-8835e0d2220e\") " pod="openstack/mariadb-client-6-default" Oct 07 13:44:57 crc kubenswrapper[4854]: I1007 13:44:57.435687 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6m44j\" (UniqueName: \"kubernetes.io/projected/65b83727-5b7f-4322-bdef-8835e0d2220e-kube-api-access-6m44j\") pod \"mariadb-client-6-default\" (UID: \"65b83727-5b7f-4322-bdef-8835e0d2220e\") " pod="openstack/mariadb-client-6-default" Oct 07 13:44:57 crc kubenswrapper[4854]: I1007 13:44:57.473020 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6m44j\" (UniqueName: \"kubernetes.io/projected/65b83727-5b7f-4322-bdef-8835e0d2220e-kube-api-access-6m44j\") pod \"mariadb-client-6-default\" (UID: \"65b83727-5b7f-4322-bdef-8835e0d2220e\") " pod="openstack/mariadb-client-6-default" Oct 07 13:44:57 crc kubenswrapper[4854]: I1007 13:44:57.504801 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c29ee64e94359fcd2fca694da044536cda62d75c0fcb2650520d2969e7495c6b" Oct 07 13:44:57 crc kubenswrapper[4854]: I1007 13:44:57.505243 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-5-default" Oct 07 13:44:57 crc kubenswrapper[4854]: I1007 13:44:57.565395 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-6-default" Oct 07 13:44:58 crc kubenswrapper[4854]: W1007 13:44:58.016326 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65b83727_5b7f_4322_bdef_8835e0d2220e.slice/crio-1f5113229033a94b5b607a79924815448a2d8c6471452638a1a8a2230b245efd WatchSource:0}: Error finding container 1f5113229033a94b5b607a79924815448a2d8c6471452638a1a8a2230b245efd: Status 404 returned error can't find the container with id 1f5113229033a94b5b607a79924815448a2d8c6471452638a1a8a2230b245efd Oct 07 13:44:58 crc kubenswrapper[4854]: I1007 13:44:58.016669 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-6-default"] Oct 07 13:44:58 crc kubenswrapper[4854]: I1007 13:44:58.515376 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"65b83727-5b7f-4322-bdef-8835e0d2220e","Type":"ContainerStarted","Data":"c97da054a70551a5a5af8d805d868fa7a027f32be0bad5b5e0e603957efd1385"} Oct 07 13:44:58 crc kubenswrapper[4854]: I1007 13:44:58.515439 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"65b83727-5b7f-4322-bdef-8835e0d2220e","Type":"ContainerStarted","Data":"1f5113229033a94b5b607a79924815448a2d8c6471452638a1a8a2230b245efd"} Oct 07 13:44:58 crc kubenswrapper[4854]: I1007 13:44:58.739532 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fbb045c-108a-4b96-ace4-3ea13f97fd1e" path="/var/lib/kubelet/pods/4fbb045c-108a-4b96-ace4-3ea13f97fd1e/volumes" Oct 07 13:44:59 crc kubenswrapper[4854]: I1007 13:44:59.531283 4854 generic.go:334] "Generic (PLEG): container finished" podID="65b83727-5b7f-4322-bdef-8835e0d2220e" containerID="c97da054a70551a5a5af8d805d868fa7a027f32be0bad5b5e0e603957efd1385" exitCode=0 Oct 07 13:44:59 crc kubenswrapper[4854]: I1007 13:44:59.531803 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"65b83727-5b7f-4322-bdef-8835e0d2220e","Type":"ContainerDied","Data":"c97da054a70551a5a5af8d805d868fa7a027f32be0bad5b5e0e603957efd1385"} Oct 07 13:45:00 crc kubenswrapper[4854]: I1007 13:45:00.164927 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330745-prhbf"] Oct 07 13:45:00 crc kubenswrapper[4854]: I1007 13:45:00.169662 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330745-prhbf" Oct 07 13:45:00 crc kubenswrapper[4854]: I1007 13:45:00.174627 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 07 13:45:00 crc kubenswrapper[4854]: I1007 13:45:00.175442 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 07 13:45:00 crc kubenswrapper[4854]: I1007 13:45:00.183395 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330745-prhbf"] Oct 07 13:45:00 crc kubenswrapper[4854]: I1007 13:45:00.294041 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/512d8fce-7696-41dc-9600-f1b94e06db5c-config-volume\") pod \"collect-profiles-29330745-prhbf\" (UID: \"512d8fce-7696-41dc-9600-f1b94e06db5c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330745-prhbf" Oct 07 13:45:00 crc kubenswrapper[4854]: I1007 13:45:00.294110 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4pqm\" (UniqueName: \"kubernetes.io/projected/512d8fce-7696-41dc-9600-f1b94e06db5c-kube-api-access-w4pqm\") pod \"collect-profiles-29330745-prhbf\" (UID: \"512d8fce-7696-41dc-9600-f1b94e06db5c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330745-prhbf" Oct 07 13:45:00 crc kubenswrapper[4854]: I1007 13:45:00.294316 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/512d8fce-7696-41dc-9600-f1b94e06db5c-secret-volume\") pod \"collect-profiles-29330745-prhbf\" (UID: \"512d8fce-7696-41dc-9600-f1b94e06db5c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330745-prhbf" Oct 07 13:45:00 crc kubenswrapper[4854]: I1007 13:45:00.395499 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/512d8fce-7696-41dc-9600-f1b94e06db5c-config-volume\") pod \"collect-profiles-29330745-prhbf\" (UID: \"512d8fce-7696-41dc-9600-f1b94e06db5c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330745-prhbf" Oct 07 13:45:00 crc kubenswrapper[4854]: I1007 13:45:00.395574 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4pqm\" (UniqueName: \"kubernetes.io/projected/512d8fce-7696-41dc-9600-f1b94e06db5c-kube-api-access-w4pqm\") pod \"collect-profiles-29330745-prhbf\" (UID: \"512d8fce-7696-41dc-9600-f1b94e06db5c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330745-prhbf" Oct 07 13:45:00 crc kubenswrapper[4854]: I1007 13:45:00.395662 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/512d8fce-7696-41dc-9600-f1b94e06db5c-secret-volume\") pod \"collect-profiles-29330745-prhbf\" (UID: \"512d8fce-7696-41dc-9600-f1b94e06db5c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330745-prhbf" Oct 07 13:45:00 crc kubenswrapper[4854]: I1007 13:45:00.398066 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/512d8fce-7696-41dc-9600-f1b94e06db5c-config-volume\") pod 
\"collect-profiles-29330745-prhbf\" (UID: \"512d8fce-7696-41dc-9600-f1b94e06db5c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330745-prhbf" Oct 07 13:45:00 crc kubenswrapper[4854]: I1007 13:45:00.406266 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/512d8fce-7696-41dc-9600-f1b94e06db5c-secret-volume\") pod \"collect-profiles-29330745-prhbf\" (UID: \"512d8fce-7696-41dc-9600-f1b94e06db5c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330745-prhbf" Oct 07 13:45:00 crc kubenswrapper[4854]: I1007 13:45:00.427396 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4pqm\" (UniqueName: \"kubernetes.io/projected/512d8fce-7696-41dc-9600-f1b94e06db5c-kube-api-access-w4pqm\") pod \"collect-profiles-29330745-prhbf\" (UID: \"512d8fce-7696-41dc-9600-f1b94e06db5c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330745-prhbf" Oct 07 13:45:00 crc kubenswrapper[4854]: I1007 13:45:00.501251 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330745-prhbf" Oct 07 13:45:00 crc kubenswrapper[4854]: I1007 13:45:00.994366 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330745-prhbf"] Oct 07 13:45:01 crc kubenswrapper[4854]: I1007 13:45:01.116128 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-default" Oct 07 13:45:01 crc kubenswrapper[4854]: I1007 13:45:01.184604 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-6-default_65b83727-5b7f-4322-bdef-8835e0d2220e/mariadb-client-6-default/0.log" Oct 07 13:45:01 crc kubenswrapper[4854]: I1007 13:45:01.211093 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6m44j\" (UniqueName: \"kubernetes.io/projected/65b83727-5b7f-4322-bdef-8835e0d2220e-kube-api-access-6m44j\") pod \"65b83727-5b7f-4322-bdef-8835e0d2220e\" (UID: \"65b83727-5b7f-4322-bdef-8835e0d2220e\") " Oct 07 13:45:01 crc kubenswrapper[4854]: I1007 13:45:01.221816 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65b83727-5b7f-4322-bdef-8835e0d2220e-kube-api-access-6m44j" (OuterVolumeSpecName: "kube-api-access-6m44j") pod "65b83727-5b7f-4322-bdef-8835e0d2220e" (UID: "65b83727-5b7f-4322-bdef-8835e0d2220e"). InnerVolumeSpecName "kube-api-access-6m44j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:45:01 crc kubenswrapper[4854]: I1007 13:45:01.223456 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-6-default"] Oct 07 13:45:01 crc kubenswrapper[4854]: I1007 13:45:01.229639 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-6-default"] Oct 07 13:45:01 crc kubenswrapper[4854]: I1007 13:45:01.313549 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6m44j\" (UniqueName: \"kubernetes.io/projected/65b83727-5b7f-4322-bdef-8835e0d2220e-kube-api-access-6m44j\") on node \"crc\" DevicePath \"\"" Oct 07 13:45:01 crc kubenswrapper[4854]: I1007 13:45:01.444591 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-7-default"] Oct 07 13:45:01 crc kubenswrapper[4854]: E1007 13:45:01.444984 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65b83727-5b7f-4322-bdef-8835e0d2220e" containerName="mariadb-client-6-default" Oct 07 13:45:01 crc kubenswrapper[4854]: I1007 13:45:01.445004 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="65b83727-5b7f-4322-bdef-8835e0d2220e" containerName="mariadb-client-6-default" Oct 07 13:45:01 crc kubenswrapper[4854]: I1007 13:45:01.445216 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="65b83727-5b7f-4322-bdef-8835e0d2220e" containerName="mariadb-client-6-default" Oct 07 13:45:01 crc kubenswrapper[4854]: I1007 13:45:01.445875 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-7-default" Oct 07 13:45:01 crc kubenswrapper[4854]: I1007 13:45:01.460560 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-7-default"] Oct 07 13:45:01 crc kubenswrapper[4854]: I1007 13:45:01.551834 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330745-prhbf" event={"ID":"512d8fce-7696-41dc-9600-f1b94e06db5c","Type":"ContainerStarted","Data":"65cc0d6dce84e751b028673ddfc3e069ced4950816cf0ca0cb5b49a18da32ef7"} Oct 07 13:45:01 crc kubenswrapper[4854]: I1007 13:45:01.551909 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330745-prhbf" event={"ID":"512d8fce-7696-41dc-9600-f1b94e06db5c","Type":"ContainerStarted","Data":"705af723c7e5061e548ff77b171ce2351d096dce9ffff0292a47ec77442a9e58"} Oct 07 13:45:01 crc kubenswrapper[4854]: I1007 13:45:01.554226 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f5113229033a94b5b607a79924815448a2d8c6471452638a1a8a2230b245efd" Oct 07 13:45:01 crc kubenswrapper[4854]: I1007 13:45:01.554317 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-6-default" Oct 07 13:45:01 crc kubenswrapper[4854]: I1007 13:45:01.567915 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29330745-prhbf" podStartSLOduration=1.567893751 podStartE2EDuration="1.567893751s" podCreationTimestamp="2025-10-07 13:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:45:01.566568333 +0000 UTC m=+4817.554400588" watchObservedRunningTime="2025-10-07 13:45:01.567893751 +0000 UTC m=+4817.555726026" Oct 07 13:45:01 crc kubenswrapper[4854]: I1007 13:45:01.619228 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzfbx\" (UniqueName: \"kubernetes.io/projected/8aa5ac33-48ea-4607-aded-f88953c780d5-kube-api-access-mzfbx\") pod \"mariadb-client-7-default\" (UID: \"8aa5ac33-48ea-4607-aded-f88953c780d5\") " pod="openstack/mariadb-client-7-default" Oct 07 13:45:01 crc kubenswrapper[4854]: I1007 13:45:01.720583 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzfbx\" (UniqueName: \"kubernetes.io/projected/8aa5ac33-48ea-4607-aded-f88953c780d5-kube-api-access-mzfbx\") pod \"mariadb-client-7-default\" (UID: \"8aa5ac33-48ea-4607-aded-f88953c780d5\") " pod="openstack/mariadb-client-7-default" Oct 07 13:45:02 crc kubenswrapper[4854]: I1007 13:45:02.040575 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzfbx\" (UniqueName: \"kubernetes.io/projected/8aa5ac33-48ea-4607-aded-f88953c780d5-kube-api-access-mzfbx\") pod \"mariadb-client-7-default\" (UID: \"8aa5ac33-48ea-4607-aded-f88953c780d5\") " pod="openstack/mariadb-client-7-default" Oct 07 13:45:02 crc kubenswrapper[4854]: I1007 13:45:02.078737 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-7-default" Oct 07 13:45:02 crc kubenswrapper[4854]: W1007 13:45:02.510816 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8aa5ac33_48ea_4607_aded_f88953c780d5.slice/crio-c024aad3e0edd7c889955f1c623f748dd22c632cab93eb6fa99885e5199f79e8 WatchSource:0}: Error finding container c024aad3e0edd7c889955f1c623f748dd22c632cab93eb6fa99885e5199f79e8: Status 404 returned error can't find the container with id c024aad3e0edd7c889955f1c623f748dd22c632cab93eb6fa99885e5199f79e8 Oct 07 13:45:02 crc kubenswrapper[4854]: I1007 13:45:02.511003 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-7-default"] Oct 07 13:45:02 crc kubenswrapper[4854]: I1007 13:45:02.570703 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-7-default" event={"ID":"8aa5ac33-48ea-4607-aded-f88953c780d5","Type":"ContainerStarted","Data":"c024aad3e0edd7c889955f1c623f748dd22c632cab93eb6fa99885e5199f79e8"} Oct 07 13:45:02 crc kubenswrapper[4854]: I1007 13:45:02.573494 4854 generic.go:334] "Generic (PLEG): container finished" podID="512d8fce-7696-41dc-9600-f1b94e06db5c" containerID="65cc0d6dce84e751b028673ddfc3e069ced4950816cf0ca0cb5b49a18da32ef7" exitCode=0 Oct 07 13:45:02 crc kubenswrapper[4854]: I1007 13:45:02.573538 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330745-prhbf" event={"ID":"512d8fce-7696-41dc-9600-f1b94e06db5c","Type":"ContainerDied","Data":"65cc0d6dce84e751b028673ddfc3e069ced4950816cf0ca0cb5b49a18da32ef7"} Oct 07 13:45:02 crc kubenswrapper[4854]: I1007 13:45:02.703326 4854 scope.go:117] "RemoveContainer" containerID="e83798b11b7f463b027896c1ec862c4cbd843df2676156232deaf979d790ceea" Oct 07 13:45:02 crc kubenswrapper[4854]: E1007 13:45:02.703801 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:45:02 crc kubenswrapper[4854]: I1007 13:45:02.731667 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65b83727-5b7f-4322-bdef-8835e0d2220e" path="/var/lib/kubelet/pods/65b83727-5b7f-4322-bdef-8835e0d2220e/volumes" Oct 07 13:45:03 crc kubenswrapper[4854]: I1007 13:45:03.590724 4854 generic.go:334] "Generic (PLEG): container finished" podID="8aa5ac33-48ea-4607-aded-f88953c780d5" containerID="83a414b7ec949942f79be6d1dbf0616e11b4720a9ebf75f080cb6c96a5cc8d52" exitCode=0 Oct 07 13:45:03 crc kubenswrapper[4854]: I1007 13:45:03.590826 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-7-default" event={"ID":"8aa5ac33-48ea-4607-aded-f88953c780d5","Type":"ContainerDied","Data":"83a414b7ec949942f79be6d1dbf0616e11b4720a9ebf75f080cb6c96a5cc8d52"} Oct 07 13:45:03 crc kubenswrapper[4854]: I1007 13:45:03.977031 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330745-prhbf" Oct 07 13:45:04 crc kubenswrapper[4854]: I1007 13:45:04.071630 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/512d8fce-7696-41dc-9600-f1b94e06db5c-config-volume\") pod \"512d8fce-7696-41dc-9600-f1b94e06db5c\" (UID: \"512d8fce-7696-41dc-9600-f1b94e06db5c\") " Oct 07 13:45:04 crc kubenswrapper[4854]: I1007 13:45:04.071798 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/512d8fce-7696-41dc-9600-f1b94e06db5c-secret-volume\") pod \"512d8fce-7696-41dc-9600-f1b94e06db5c\" (UID: \"512d8fce-7696-41dc-9600-f1b94e06db5c\") " Oct 07 13:45:04 crc kubenswrapper[4854]: I1007 13:45:04.071843 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4pqm\" (UniqueName: \"kubernetes.io/projected/512d8fce-7696-41dc-9600-f1b94e06db5c-kube-api-access-w4pqm\") pod \"512d8fce-7696-41dc-9600-f1b94e06db5c\" (UID: \"512d8fce-7696-41dc-9600-f1b94e06db5c\") " Oct 07 13:45:04 crc kubenswrapper[4854]: I1007 13:45:04.072777 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/512d8fce-7696-41dc-9600-f1b94e06db5c-config-volume" (OuterVolumeSpecName: "config-volume") pod "512d8fce-7696-41dc-9600-f1b94e06db5c" (UID: "512d8fce-7696-41dc-9600-f1b94e06db5c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:45:04 crc kubenswrapper[4854]: I1007 13:45:04.081491 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/512d8fce-7696-41dc-9600-f1b94e06db5c-kube-api-access-w4pqm" (OuterVolumeSpecName: "kube-api-access-w4pqm") pod "512d8fce-7696-41dc-9600-f1b94e06db5c" (UID: "512d8fce-7696-41dc-9600-f1b94e06db5c"). InnerVolumeSpecName "kube-api-access-w4pqm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:45:04 crc kubenswrapper[4854]: I1007 13:45:04.081479 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/512d8fce-7696-41dc-9600-f1b94e06db5c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "512d8fce-7696-41dc-9600-f1b94e06db5c" (UID: "512d8fce-7696-41dc-9600-f1b94e06db5c"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:45:04 crc kubenswrapper[4854]: I1007 13:45:04.173242 4854 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/512d8fce-7696-41dc-9600-f1b94e06db5c-config-volume\") on node \"crc\" DevicePath \"\"" Oct 07 13:45:04 crc kubenswrapper[4854]: I1007 13:45:04.173570 4854 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/512d8fce-7696-41dc-9600-f1b94e06db5c-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 07 13:45:04 crc kubenswrapper[4854]: I1007 13:45:04.173580 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4pqm\" (UniqueName: \"kubernetes.io/projected/512d8fce-7696-41dc-9600-f1b94e06db5c-kube-api-access-w4pqm\") on node \"crc\" DevicePath \"\"" Oct 07 13:45:04 crc kubenswrapper[4854]: I1007 13:45:04.604628 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330745-prhbf" event={"ID":"512d8fce-7696-41dc-9600-f1b94e06db5c","Type":"ContainerDied","Data":"705af723c7e5061e548ff77b171ce2351d096dce9ffff0292a47ec77442a9e58"} Oct 07 13:45:04 crc kubenswrapper[4854]: I1007 13:45:04.604690 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="705af723c7e5061e548ff77b171ce2351d096dce9ffff0292a47ec77442a9e58" Oct 07 13:45:04 crc kubenswrapper[4854]: I1007 13:45:04.604916 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330745-prhbf" Oct 07 13:45:04 crc kubenswrapper[4854]: I1007 13:45:04.674271 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330700-s5w5m"] Oct 07 13:45:04 crc kubenswrapper[4854]: I1007 13:45:04.681771 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330700-s5w5m"] Oct 07 13:45:04 crc kubenswrapper[4854]: I1007 13:45:04.713738 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23a460b3-ea0c-426a-8ab9-86bf29315351" path="/var/lib/kubelet/pods/23a460b3-ea0c-426a-8ab9-86bf29315351/volumes" Oct 07 13:45:04 crc kubenswrapper[4854]: I1007 13:45:04.989757 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-7-default" Oct 07 13:45:05 crc kubenswrapper[4854]: I1007 13:45:05.016336 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-7-default_8aa5ac33-48ea-4607-aded-f88953c780d5/mariadb-client-7-default/0.log" Oct 07 13:45:05 crc kubenswrapper[4854]: I1007 13:45:05.049083 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-7-default"] Oct 07 13:45:05 crc kubenswrapper[4854]: I1007 13:45:05.053885 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-7-default"] Oct 07 13:45:05 crc kubenswrapper[4854]: I1007 13:45:05.091256 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzfbx\" (UniqueName: \"kubernetes.io/projected/8aa5ac33-48ea-4607-aded-f88953c780d5-kube-api-access-mzfbx\") pod \"8aa5ac33-48ea-4607-aded-f88953c780d5\" (UID: \"8aa5ac33-48ea-4607-aded-f88953c780d5\") " Oct 07 13:45:05 crc kubenswrapper[4854]: I1007 13:45:05.099352 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8aa5ac33-48ea-4607-aded-f88953c780d5-kube-api-access-mzfbx" (OuterVolumeSpecName: "kube-api-access-mzfbx") pod "8aa5ac33-48ea-4607-aded-f88953c780d5" (UID: "8aa5ac33-48ea-4607-aded-f88953c780d5"). InnerVolumeSpecName "kube-api-access-mzfbx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:45:05 crc kubenswrapper[4854]: I1007 13:45:05.193937 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzfbx\" (UniqueName: \"kubernetes.io/projected/8aa5ac33-48ea-4607-aded-f88953c780d5-kube-api-access-mzfbx\") on node \"crc\" DevicePath \"\"" Oct 07 13:45:05 crc kubenswrapper[4854]: I1007 13:45:05.255342 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-2"] Oct 07 13:45:05 crc kubenswrapper[4854]: E1007 13:45:05.255732 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="512d8fce-7696-41dc-9600-f1b94e06db5c" containerName="collect-profiles" Oct 07 13:45:05 crc kubenswrapper[4854]: I1007 13:45:05.255749 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="512d8fce-7696-41dc-9600-f1b94e06db5c" containerName="collect-profiles" Oct 07 13:45:05 crc kubenswrapper[4854]: E1007 13:45:05.255766 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8aa5ac33-48ea-4607-aded-f88953c780d5" containerName="mariadb-client-7-default" Oct 07 13:45:05 crc kubenswrapper[4854]: I1007 13:45:05.255777 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="8aa5ac33-48ea-4607-aded-f88953c780d5" containerName="mariadb-client-7-default" Oct 07 13:45:05 crc kubenswrapper[4854]: I1007 13:45:05.256015 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="512d8fce-7696-41dc-9600-f1b94e06db5c" containerName="collect-profiles" Oct 07 13:45:05 crc kubenswrapper[4854]: I1007 13:45:05.256033 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="8aa5ac33-48ea-4607-aded-f88953c780d5" containerName="mariadb-client-7-default" Oct 07 13:45:05 crc kubenswrapper[4854]: I1007 13:45:05.256759 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2" Oct 07 13:45:05 crc kubenswrapper[4854]: I1007 13:45:05.271989 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2"] Oct 07 13:45:05 crc kubenswrapper[4854]: I1007 13:45:05.398719 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvm6w\" (UniqueName: \"kubernetes.io/projected/cc3584ab-2f62-46ce-be06-30ed2763940d-kube-api-access-zvm6w\") pod \"mariadb-client-2\" (UID: \"cc3584ab-2f62-46ce-be06-30ed2763940d\") " pod="openstack/mariadb-client-2" Oct 07 13:45:05 crc kubenswrapper[4854]: I1007 13:45:05.500461 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvm6w\" (UniqueName: \"kubernetes.io/projected/cc3584ab-2f62-46ce-be06-30ed2763940d-kube-api-access-zvm6w\") pod \"mariadb-client-2\" (UID: \"cc3584ab-2f62-46ce-be06-30ed2763940d\") " pod="openstack/mariadb-client-2" Oct 07 13:45:05 crc kubenswrapper[4854]: I1007 13:45:05.533455 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvm6w\" (UniqueName: \"kubernetes.io/projected/cc3584ab-2f62-46ce-be06-30ed2763940d-kube-api-access-zvm6w\") pod \"mariadb-client-2\" (UID: \"cc3584ab-2f62-46ce-be06-30ed2763940d\") " pod="openstack/mariadb-client-2" Oct 07 13:45:05 crc kubenswrapper[4854]: I1007 13:45:05.585796 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Oct 07 13:45:05 crc kubenswrapper[4854]: I1007 13:45:05.626717 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c024aad3e0edd7c889955f1c623f748dd22c632cab93eb6fa99885e5199f79e8" Oct 07 13:45:05 crc kubenswrapper[4854]: I1007 13:45:05.626828 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-7-default" Oct 07 13:45:06 crc kubenswrapper[4854]: I1007 13:45:06.168068 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2"] Oct 07 13:45:06 crc kubenswrapper[4854]: I1007 13:45:06.641118 4854 generic.go:334] "Generic (PLEG): container finished" podID="cc3584ab-2f62-46ce-be06-30ed2763940d" containerID="7a10764ed191d5690ef344096502c56ee095f17da21ceeca6c4c9aa959627a2d" exitCode=0 Oct 07 13:45:06 crc kubenswrapper[4854]: I1007 13:45:06.641232 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2" event={"ID":"cc3584ab-2f62-46ce-be06-30ed2763940d","Type":"ContainerDied","Data":"7a10764ed191d5690ef344096502c56ee095f17da21ceeca6c4c9aa959627a2d"} Oct 07 13:45:06 crc kubenswrapper[4854]: I1007 13:45:06.641305 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2" event={"ID":"cc3584ab-2f62-46ce-be06-30ed2763940d","Type":"ContainerStarted","Data":"1426f52d5f71ca81d681588560ab1c3dd574aca3b3ac33bcda305cd7c880e9e9"} Oct 07 13:45:06 crc kubenswrapper[4854]: I1007 13:45:06.714776 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8aa5ac33-48ea-4607-aded-f88953c780d5" path="/var/lib/kubelet/pods/8aa5ac33-48ea-4607-aded-f88953c780d5/volumes" Oct 07 13:45:08 crc kubenswrapper[4854]: I1007 13:45:08.159600 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2" Oct 07 13:45:08 crc kubenswrapper[4854]: I1007 13:45:08.181954 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-2_cc3584ab-2f62-46ce-be06-30ed2763940d/mariadb-client-2/0.log" Oct 07 13:45:08 crc kubenswrapper[4854]: I1007 13:45:08.208050 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-2"] Oct 07 13:45:08 crc kubenswrapper[4854]: I1007 13:45:08.212912 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-2"] Oct 07 13:45:08 crc kubenswrapper[4854]: I1007 13:45:08.249020 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvm6w\" (UniqueName: \"kubernetes.io/projected/cc3584ab-2f62-46ce-be06-30ed2763940d-kube-api-access-zvm6w\") pod \"cc3584ab-2f62-46ce-be06-30ed2763940d\" (UID: \"cc3584ab-2f62-46ce-be06-30ed2763940d\") " Oct 07 13:45:08 crc kubenswrapper[4854]: I1007 13:45:08.255409 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc3584ab-2f62-46ce-be06-30ed2763940d-kube-api-access-zvm6w" (OuterVolumeSpecName: "kube-api-access-zvm6w") pod "cc3584ab-2f62-46ce-be06-30ed2763940d" (UID: "cc3584ab-2f62-46ce-be06-30ed2763940d"). InnerVolumeSpecName "kube-api-access-zvm6w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:45:08 crc kubenswrapper[4854]: I1007 13:45:08.351030 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvm6w\" (UniqueName: \"kubernetes.io/projected/cc3584ab-2f62-46ce-be06-30ed2763940d-kube-api-access-zvm6w\") on node \"crc\" DevicePath \"\"" Oct 07 13:45:08 crc kubenswrapper[4854]: I1007 13:45:08.662402 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1426f52d5f71ca81d681588560ab1c3dd574aca3b3ac33bcda305cd7c880e9e9" Oct 07 13:45:08 crc kubenswrapper[4854]: I1007 13:45:08.662500 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2" Oct 07 13:45:08 crc kubenswrapper[4854]: I1007 13:45:08.714344 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc3584ab-2f62-46ce-be06-30ed2763940d" path="/var/lib/kubelet/pods/cc3584ab-2f62-46ce-be06-30ed2763940d/volumes" Oct 07 13:45:17 crc kubenswrapper[4854]: I1007 13:45:17.703354 4854 scope.go:117] "RemoveContainer" containerID="e83798b11b7f463b027896c1ec862c4cbd843df2676156232deaf979d790ceea" Oct 07 13:45:17 crc kubenswrapper[4854]: E1007 13:45:17.703933 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:45:20 crc kubenswrapper[4854]: I1007 13:45:20.883393 4854 scope.go:117] "RemoveContainer" containerID="be9e942096e7b803f05d8791b4faecf3c78506b6dc83e190bda095657995872a" Oct 07 13:45:20 crc kubenswrapper[4854]: I1007 13:45:20.920637 4854 scope.go:117] "RemoveContainer" containerID="d768df5b19a76ee6bc9b9fd92b969ed76c41d17b4e012e19191b75658e07916d" Oct 07 13:45:28 crc kubenswrapper[4854]: I1007 13:45:28.702690 4854 scope.go:117] "RemoveContainer" containerID="e83798b11b7f463b027896c1ec862c4cbd843df2676156232deaf979d790ceea" Oct 07 13:45:28 crc kubenswrapper[4854]: E1007 13:45:28.703671 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:45:39 crc kubenswrapper[4854]: I1007 13:45:39.703193 4854 scope.go:117] "RemoveContainer" containerID="e83798b11b7f463b027896c1ec862c4cbd843df2676156232deaf979d790ceea" Oct 07 13:45:39 crc kubenswrapper[4854]: E1007 13:45:39.704190 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:45:53 crc kubenswrapper[4854]: I1007 13:45:53.703692 4854 scope.go:117] "RemoveContainer" containerID="e83798b11b7f463b027896c1ec862c4cbd843df2676156232deaf979d790ceea" Oct 07 13:45:54 crc kubenswrapper[4854]: I1007 13:45:54.118413 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" event={"ID":"40b8b82d-cfd5-41d7-8673-5774db092c85","Type":"ContainerStarted","Data":"2fcfadcb8d4fbb46c6073e004d4af8599758985c036e716d488d12dd6011cd13"} Oct 07 13:46:26 crc kubenswrapper[4854]: I1007 13:46:26.570684 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-m986d"] Oct 07 13:46:26 crc kubenswrapper[4854]: E1007 13:46:26.572250 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc3584ab-2f62-46ce-be06-30ed2763940d" 
containerName="mariadb-client-2" Oct 07 13:46:26 crc kubenswrapper[4854]: I1007 13:46:26.572281 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc3584ab-2f62-46ce-be06-30ed2763940d" containerName="mariadb-client-2" Oct 07 13:46:26 crc kubenswrapper[4854]: I1007 13:46:26.572592 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc3584ab-2f62-46ce-be06-30ed2763940d" containerName="mariadb-client-2" Oct 07 13:46:26 crc kubenswrapper[4854]: I1007 13:46:26.574587 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m986d" Oct 07 13:46:26 crc kubenswrapper[4854]: I1007 13:46:26.602702 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m986d"] Oct 07 13:46:26 crc kubenswrapper[4854]: I1007 13:46:26.717411 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpc24\" (UniqueName: \"kubernetes.io/projected/07b8b36e-6b18-453b-a065-6e0a0bab4a35-kube-api-access-rpc24\") pod \"redhat-marketplace-m986d\" (UID: \"07b8b36e-6b18-453b-a065-6e0a0bab4a35\") " pod="openshift-marketplace/redhat-marketplace-m986d" Oct 07 13:46:26 crc kubenswrapper[4854]: I1007 13:46:26.717517 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07b8b36e-6b18-453b-a065-6e0a0bab4a35-utilities\") pod \"redhat-marketplace-m986d\" (UID: \"07b8b36e-6b18-453b-a065-6e0a0bab4a35\") " pod="openshift-marketplace/redhat-marketplace-m986d" Oct 07 13:46:26 crc kubenswrapper[4854]: I1007 13:46:26.717560 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07b8b36e-6b18-453b-a065-6e0a0bab4a35-catalog-content\") pod \"redhat-marketplace-m986d\" (UID: \"07b8b36e-6b18-453b-a065-6e0a0bab4a35\") " pod="openshift-marketplace/redhat-marketplace-m986d" Oct 07 13:46:26 crc kubenswrapper[4854]: I1007 13:46:26.819727 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpc24\" (UniqueName: \"kubernetes.io/projected/07b8b36e-6b18-453b-a065-6e0a0bab4a35-kube-api-access-rpc24\") pod \"redhat-marketplace-m986d\" (UID: \"07b8b36e-6b18-453b-a065-6e0a0bab4a35\") " pod="openshift-marketplace/redhat-marketplace-m986d" Oct 07 13:46:26 crc kubenswrapper[4854]: I1007 13:46:26.819827 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07b8b36e-6b18-453b-a065-6e0a0bab4a35-utilities\") pod \"redhat-marketplace-m986d\" (UID: \"07b8b36e-6b18-453b-a065-6e0a0bab4a35\") " pod="openshift-marketplace/redhat-marketplace-m986d" Oct 07 13:46:26 crc kubenswrapper[4854]: I1007 13:46:26.819865 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07b8b36e-6b18-453b-a065-6e0a0bab4a35-catalog-content\") pod \"redhat-marketplace-m986d\" (UID: \"07b8b36e-6b18-453b-a065-6e0a0bab4a35\") " pod="openshift-marketplace/redhat-marketplace-m986d" Oct 07 13:46:26 crc kubenswrapper[4854]: I1007 13:46:26.820632 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07b8b36e-6b18-453b-a065-6e0a0bab4a35-utilities\") pod \"redhat-marketplace-m986d\" (UID: \"07b8b36e-6b18-453b-a065-6e0a0bab4a35\") " 
pod="openshift-marketplace/redhat-marketplace-m986d" Oct 07 13:46:26 crc kubenswrapper[4854]: I1007 13:46:26.821091 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07b8b36e-6b18-453b-a065-6e0a0bab4a35-catalog-content\") pod \"redhat-marketplace-m986d\" (UID: \"07b8b36e-6b18-453b-a065-6e0a0bab4a35\") " pod="openshift-marketplace/redhat-marketplace-m986d" Oct 07 13:46:26 crc kubenswrapper[4854]: I1007 13:46:26.850063 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpc24\" (UniqueName: \"kubernetes.io/projected/07b8b36e-6b18-453b-a065-6e0a0bab4a35-kube-api-access-rpc24\") pod \"redhat-marketplace-m986d\" (UID: \"07b8b36e-6b18-453b-a065-6e0a0bab4a35\") " pod="openshift-marketplace/redhat-marketplace-m986d" Oct 07 13:46:26 crc kubenswrapper[4854]: I1007 13:46:26.901199 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m986d" Oct 07 13:46:27 crc kubenswrapper[4854]: I1007 13:46:27.181358 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m986d"] Oct 07 13:46:27 crc kubenswrapper[4854]: I1007 13:46:27.477658 4854 generic.go:334] "Generic (PLEG): container finished" podID="07b8b36e-6b18-453b-a065-6e0a0bab4a35" containerID="15d986fe0283899f84895b9fb42260056e51fbdfac6fde7f39e6f8cc4db3b4e4" exitCode=0 Oct 07 13:46:27 crc kubenswrapper[4854]: I1007 13:46:27.477745 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m986d" event={"ID":"07b8b36e-6b18-453b-a065-6e0a0bab4a35","Type":"ContainerDied","Data":"15d986fe0283899f84895b9fb42260056e51fbdfac6fde7f39e6f8cc4db3b4e4"} Oct 07 13:46:27 crc kubenswrapper[4854]: I1007 13:46:27.477991 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m986d" event={"ID":"07b8b36e-6b18-453b-a065-6e0a0bab4a35","Type":"ContainerStarted","Data":"5bd3c0d39391dca57c3bf98c299a16820de647e676e3c307d9d9d12d6a988dd0"} Oct 07 13:46:27 crc kubenswrapper[4854]: I1007 13:46:27.479777 4854 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 13:46:29 crc kubenswrapper[4854]: I1007 13:46:29.511817 4854 generic.go:334] "Generic (PLEG): container finished" podID="07b8b36e-6b18-453b-a065-6e0a0bab4a35" containerID="99bc7d081251a132e3baecef5269486f91f86790f6d6adf049033be5e2bbfa02" exitCode=0 Oct 07 13:46:29 crc kubenswrapper[4854]: I1007 13:46:29.512002 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m986d" event={"ID":"07b8b36e-6b18-453b-a065-6e0a0bab4a35","Type":"ContainerDied","Data":"99bc7d081251a132e3baecef5269486f91f86790f6d6adf049033be5e2bbfa02"} Oct 07 13:46:30 crc kubenswrapper[4854]: I1007 13:46:30.531681 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m986d" event={"ID":"07b8b36e-6b18-453b-a065-6e0a0bab4a35","Type":"ContainerStarted","Data":"c69a8eeb499314275bd0a37ad55bfb09948c2e51bcb1e918368e68f301532ebf"} Oct 07 13:46:30 crc kubenswrapper[4854]: I1007 13:46:30.564686 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-m986d" podStartSLOduration=1.858339843 podStartE2EDuration="4.564652448s" podCreationTimestamp="2025-10-07 13:46:26 +0000 UTC" firstStartedPulling="2025-10-07 13:46:27.479405897 +0000 UTC m=+4903.467238162" 
lastFinishedPulling="2025-10-07 13:46:30.185718502 +0000 UTC m=+4906.173550767" observedRunningTime="2025-10-07 13:46:30.553486516 +0000 UTC m=+4906.541318851" watchObservedRunningTime="2025-10-07 13:46:30.564652448 +0000 UTC m=+4906.552484743" Oct 07 13:46:36 crc kubenswrapper[4854]: I1007 13:46:36.901851 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-m986d" Oct 07 13:46:36 crc kubenswrapper[4854]: I1007 13:46:36.902456 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-m986d" Oct 07 13:46:36 crc kubenswrapper[4854]: I1007 13:46:36.972027 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-m986d" Oct 07 13:46:37 crc kubenswrapper[4854]: I1007 13:46:37.656102 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-m986d" Oct 07 13:46:37 crc kubenswrapper[4854]: I1007 13:46:37.722496 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m986d"] Oct 07 13:46:39 crc kubenswrapper[4854]: I1007 13:46:39.614143 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-m986d" podUID="07b8b36e-6b18-453b-a065-6e0a0bab4a35" containerName="registry-server" containerID="cri-o://c69a8eeb499314275bd0a37ad55bfb09948c2e51bcb1e918368e68f301532ebf" gracePeriod=2 Oct 07 13:46:40 crc kubenswrapper[4854]: I1007 13:46:40.102671 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m986d" Oct 07 13:46:40 crc kubenswrapper[4854]: I1007 13:46:40.272948 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07b8b36e-6b18-453b-a065-6e0a0bab4a35-catalog-content\") pod \"07b8b36e-6b18-453b-a065-6e0a0bab4a35\" (UID: \"07b8b36e-6b18-453b-a065-6e0a0bab4a35\") " Oct 07 13:46:40 crc kubenswrapper[4854]: I1007 13:46:40.273070 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rpc24\" (UniqueName: \"kubernetes.io/projected/07b8b36e-6b18-453b-a065-6e0a0bab4a35-kube-api-access-rpc24\") pod \"07b8b36e-6b18-453b-a065-6e0a0bab4a35\" (UID: \"07b8b36e-6b18-453b-a065-6e0a0bab4a35\") " Oct 07 13:46:40 crc kubenswrapper[4854]: I1007 13:46:40.273116 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07b8b36e-6b18-453b-a065-6e0a0bab4a35-utilities\") pod \"07b8b36e-6b18-453b-a065-6e0a0bab4a35\" (UID: \"07b8b36e-6b18-453b-a065-6e0a0bab4a35\") " Oct 07 13:46:40 crc kubenswrapper[4854]: I1007 13:46:40.274354 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07b8b36e-6b18-453b-a065-6e0a0bab4a35-utilities" (OuterVolumeSpecName: "utilities") pod "07b8b36e-6b18-453b-a065-6e0a0bab4a35" (UID: "07b8b36e-6b18-453b-a065-6e0a0bab4a35"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:46:40 crc kubenswrapper[4854]: I1007 13:46:40.286423 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07b8b36e-6b18-453b-a065-6e0a0bab4a35-kube-api-access-rpc24" (OuterVolumeSpecName: "kube-api-access-rpc24") pod "07b8b36e-6b18-453b-a065-6e0a0bab4a35" (UID: "07b8b36e-6b18-453b-a065-6e0a0bab4a35"). InnerVolumeSpecName "kube-api-access-rpc24". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:46:40 crc kubenswrapper[4854]: I1007 13:46:40.299616 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07b8b36e-6b18-453b-a065-6e0a0bab4a35-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "07b8b36e-6b18-453b-a065-6e0a0bab4a35" (UID: "07b8b36e-6b18-453b-a065-6e0a0bab4a35"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:46:40 crc kubenswrapper[4854]: I1007 13:46:40.375367 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07b8b36e-6b18-453b-a065-6e0a0bab4a35-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:46:40 crc kubenswrapper[4854]: I1007 13:46:40.375405 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rpc24\" (UniqueName: \"kubernetes.io/projected/07b8b36e-6b18-453b-a065-6e0a0bab4a35-kube-api-access-rpc24\") on node \"crc\" DevicePath \"\"" Oct 07 13:46:40 crc kubenswrapper[4854]: I1007 13:46:40.375421 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07b8b36e-6b18-453b-a065-6e0a0bab4a35-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:46:40 crc kubenswrapper[4854]: I1007 13:46:40.625228 4854 generic.go:334] "Generic (PLEG): container finished" podID="07b8b36e-6b18-453b-a065-6e0a0bab4a35" containerID="c69a8eeb499314275bd0a37ad55bfb09948c2e51bcb1e918368e68f301532ebf" exitCode=0 Oct 07 13:46:40 crc kubenswrapper[4854]: I1007 13:46:40.625260 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m986d" Oct 07 13:46:40 crc kubenswrapper[4854]: I1007 13:46:40.625285 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m986d" event={"ID":"07b8b36e-6b18-453b-a065-6e0a0bab4a35","Type":"ContainerDied","Data":"c69a8eeb499314275bd0a37ad55bfb09948c2e51bcb1e918368e68f301532ebf"} Oct 07 13:46:40 crc kubenswrapper[4854]: I1007 13:46:40.625321 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m986d" event={"ID":"07b8b36e-6b18-453b-a065-6e0a0bab4a35","Type":"ContainerDied","Data":"5bd3c0d39391dca57c3bf98c299a16820de647e676e3c307d9d9d12d6a988dd0"} Oct 07 13:46:40 crc kubenswrapper[4854]: I1007 13:46:40.625342 4854 scope.go:117] "RemoveContainer" containerID="c69a8eeb499314275bd0a37ad55bfb09948c2e51bcb1e918368e68f301532ebf" Oct 07 13:46:40 crc kubenswrapper[4854]: I1007 13:46:40.663414 4854 scope.go:117] "RemoveContainer" containerID="99bc7d081251a132e3baecef5269486f91f86790f6d6adf049033be5e2bbfa02" Oct 07 13:46:40 crc kubenswrapper[4854]: I1007 13:46:40.672432 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m986d"] Oct 07 13:46:40 crc kubenswrapper[4854]: I1007 13:46:40.680746 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-m986d"] Oct 07 13:46:40 crc kubenswrapper[4854]: I1007 13:46:40.695675 4854 scope.go:117] "RemoveContainer" containerID="15d986fe0283899f84895b9fb42260056e51fbdfac6fde7f39e6f8cc4db3b4e4" Oct 07 13:46:40 crc kubenswrapper[4854]: I1007 13:46:40.714756 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07b8b36e-6b18-453b-a065-6e0a0bab4a35" path="/var/lib/kubelet/pods/07b8b36e-6b18-453b-a065-6e0a0bab4a35/volumes" Oct 07 13:46:40 crc kubenswrapper[4854]: I1007 13:46:40.740465 4854 scope.go:117] "RemoveContainer" containerID="c69a8eeb499314275bd0a37ad55bfb09948c2e51bcb1e918368e68f301532ebf" Oct 07 13:46:40 crc kubenswrapper[4854]: E1007 13:46:40.741182 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c69a8eeb499314275bd0a37ad55bfb09948c2e51bcb1e918368e68f301532ebf\": container with ID starting with c69a8eeb499314275bd0a37ad55bfb09948c2e51bcb1e918368e68f301532ebf not found: ID does not exist" containerID="c69a8eeb499314275bd0a37ad55bfb09948c2e51bcb1e918368e68f301532ebf" Oct 07 13:46:40 crc kubenswrapper[4854]: I1007 13:46:40.741274 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c69a8eeb499314275bd0a37ad55bfb09948c2e51bcb1e918368e68f301532ebf"} err="failed to get container status \"c69a8eeb499314275bd0a37ad55bfb09948c2e51bcb1e918368e68f301532ebf\": rpc error: code = NotFound desc = could not find container \"c69a8eeb499314275bd0a37ad55bfb09948c2e51bcb1e918368e68f301532ebf\": container with ID starting with c69a8eeb499314275bd0a37ad55bfb09948c2e51bcb1e918368e68f301532ebf not found: ID does not exist" Oct 07 13:46:40 crc kubenswrapper[4854]: I1007 13:46:40.741308 4854 scope.go:117] "RemoveContainer" containerID="99bc7d081251a132e3baecef5269486f91f86790f6d6adf049033be5e2bbfa02" Oct 07 13:46:40 crc kubenswrapper[4854]: E1007 13:46:40.741687 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99bc7d081251a132e3baecef5269486f91f86790f6d6adf049033be5e2bbfa02\": container with ID starting 
with 99bc7d081251a132e3baecef5269486f91f86790f6d6adf049033be5e2bbfa02 not found: ID does not exist" containerID="99bc7d081251a132e3baecef5269486f91f86790f6d6adf049033be5e2bbfa02" Oct 07 13:46:40 crc kubenswrapper[4854]: I1007 13:46:40.741732 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99bc7d081251a132e3baecef5269486f91f86790f6d6adf049033be5e2bbfa02"} err="failed to get container status \"99bc7d081251a132e3baecef5269486f91f86790f6d6adf049033be5e2bbfa02\": rpc error: code = NotFound desc = could not find container \"99bc7d081251a132e3baecef5269486f91f86790f6d6adf049033be5e2bbfa02\": container with ID starting with 99bc7d081251a132e3baecef5269486f91f86790f6d6adf049033be5e2bbfa02 not found: ID does not exist" Oct 07 13:46:40 crc kubenswrapper[4854]: I1007 13:46:40.741765 4854 scope.go:117] "RemoveContainer" containerID="15d986fe0283899f84895b9fb42260056e51fbdfac6fde7f39e6f8cc4db3b4e4" Oct 07 13:46:40 crc kubenswrapper[4854]: E1007 13:46:40.742177 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15d986fe0283899f84895b9fb42260056e51fbdfac6fde7f39e6f8cc4db3b4e4\": container with ID starting with 15d986fe0283899f84895b9fb42260056e51fbdfac6fde7f39e6f8cc4db3b4e4 not found: ID does not exist" containerID="15d986fe0283899f84895b9fb42260056e51fbdfac6fde7f39e6f8cc4db3b4e4" Oct 07 13:46:40 crc kubenswrapper[4854]: I1007 13:46:40.742271 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15d986fe0283899f84895b9fb42260056e51fbdfac6fde7f39e6f8cc4db3b4e4"} err="failed to get container status \"15d986fe0283899f84895b9fb42260056e51fbdfac6fde7f39e6f8cc4db3b4e4\": rpc error: code = NotFound desc = could not find container \"15d986fe0283899f84895b9fb42260056e51fbdfac6fde7f39e6f8cc4db3b4e4\": container with ID starting with 15d986fe0283899f84895b9fb42260056e51fbdfac6fde7f39e6f8cc4db3b4e4 not found: ID does not exist" Oct 07 13:46:59 crc kubenswrapper[4854]: I1007 13:46:59.448079 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-65cbx"] Oct 07 13:46:59 crc kubenswrapper[4854]: E1007 13:46:59.455630 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07b8b36e-6b18-453b-a065-6e0a0bab4a35" containerName="registry-server" Oct 07 13:46:59 crc kubenswrapper[4854]: I1007 13:46:59.455925 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="07b8b36e-6b18-453b-a065-6e0a0bab4a35" containerName="registry-server" Oct 07 13:46:59 crc kubenswrapper[4854]: E1007 13:46:59.456176 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07b8b36e-6b18-453b-a065-6e0a0bab4a35" containerName="extract-content" Oct 07 13:46:59 crc kubenswrapper[4854]: I1007 13:46:59.456401 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="07b8b36e-6b18-453b-a065-6e0a0bab4a35" containerName="extract-content" Oct 07 13:46:59 crc kubenswrapper[4854]: E1007 13:46:59.456600 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07b8b36e-6b18-453b-a065-6e0a0bab4a35" containerName="extract-utilities" Oct 07 13:46:59 crc kubenswrapper[4854]: I1007 13:46:59.456760 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="07b8b36e-6b18-453b-a065-6e0a0bab4a35" containerName="extract-utilities" Oct 07 13:46:59 crc kubenswrapper[4854]: I1007 13:46:59.457388 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="07b8b36e-6b18-453b-a065-6e0a0bab4a35" 
containerName="registry-server" Oct 07 13:46:59 crc kubenswrapper[4854]: I1007 13:46:59.460262 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-65cbx" Oct 07 13:46:59 crc kubenswrapper[4854]: I1007 13:46:59.464983 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-65cbx"] Oct 07 13:46:59 crc kubenswrapper[4854]: I1007 13:46:59.556923 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qsls\" (UniqueName: \"kubernetes.io/projected/b4d0fee4-497f-4faf-8f90-ba64e27923a1-kube-api-access-2qsls\") pod \"certified-operators-65cbx\" (UID: \"b4d0fee4-497f-4faf-8f90-ba64e27923a1\") " pod="openshift-marketplace/certified-operators-65cbx" Oct 07 13:46:59 crc kubenswrapper[4854]: I1007 13:46:59.557448 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4d0fee4-497f-4faf-8f90-ba64e27923a1-catalog-content\") pod \"certified-operators-65cbx\" (UID: \"b4d0fee4-497f-4faf-8f90-ba64e27923a1\") " pod="openshift-marketplace/certified-operators-65cbx" Oct 07 13:46:59 crc kubenswrapper[4854]: I1007 13:46:59.557496 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4d0fee4-497f-4faf-8f90-ba64e27923a1-utilities\") pod \"certified-operators-65cbx\" (UID: \"b4d0fee4-497f-4faf-8f90-ba64e27923a1\") " pod="openshift-marketplace/certified-operators-65cbx" Oct 07 13:46:59 crc kubenswrapper[4854]: I1007 13:46:59.658752 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qsls\" (UniqueName: \"kubernetes.io/projected/b4d0fee4-497f-4faf-8f90-ba64e27923a1-kube-api-access-2qsls\") pod \"certified-operators-65cbx\" (UID: \"b4d0fee4-497f-4faf-8f90-ba64e27923a1\") " pod="openshift-marketplace/certified-operators-65cbx" Oct 07 13:46:59 crc kubenswrapper[4854]: I1007 13:46:59.659498 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4d0fee4-497f-4faf-8f90-ba64e27923a1-catalog-content\") pod \"certified-operators-65cbx\" (UID: \"b4d0fee4-497f-4faf-8f90-ba64e27923a1\") " pod="openshift-marketplace/certified-operators-65cbx" Oct 07 13:46:59 crc kubenswrapper[4854]: I1007 13:46:59.659638 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4d0fee4-497f-4faf-8f90-ba64e27923a1-utilities\") pod \"certified-operators-65cbx\" (UID: \"b4d0fee4-497f-4faf-8f90-ba64e27923a1\") " pod="openshift-marketplace/certified-operators-65cbx" Oct 07 13:46:59 crc kubenswrapper[4854]: I1007 13:46:59.660183 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4d0fee4-497f-4faf-8f90-ba64e27923a1-catalog-content\") pod \"certified-operators-65cbx\" (UID: \"b4d0fee4-497f-4faf-8f90-ba64e27923a1\") " pod="openshift-marketplace/certified-operators-65cbx" Oct 07 13:46:59 crc kubenswrapper[4854]: I1007 13:46:59.660477 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4d0fee4-497f-4faf-8f90-ba64e27923a1-utilities\") pod \"certified-operators-65cbx\" (UID: \"b4d0fee4-497f-4faf-8f90-ba64e27923a1\") " 
pod="openshift-marketplace/certified-operators-65cbx" Oct 07 13:46:59 crc kubenswrapper[4854]: I1007 13:46:59.685133 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qsls\" (UniqueName: \"kubernetes.io/projected/b4d0fee4-497f-4faf-8f90-ba64e27923a1-kube-api-access-2qsls\") pod \"certified-operators-65cbx\" (UID: \"b4d0fee4-497f-4faf-8f90-ba64e27923a1\") " pod="openshift-marketplace/certified-operators-65cbx" Oct 07 13:46:59 crc kubenswrapper[4854]: I1007 13:46:59.796436 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-65cbx" Oct 07 13:47:00 crc kubenswrapper[4854]: I1007 13:47:00.269333 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-65cbx"] Oct 07 13:47:00 crc kubenswrapper[4854]: I1007 13:47:00.842543 4854 generic.go:334] "Generic (PLEG): container finished" podID="b4d0fee4-497f-4faf-8f90-ba64e27923a1" containerID="8e33642dba12269cb0f76e38b391a4faf36c39c6ff6a5508499b861956975168" exitCode=0 Oct 07 13:47:00 crc kubenswrapper[4854]: I1007 13:47:00.842635 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-65cbx" event={"ID":"b4d0fee4-497f-4faf-8f90-ba64e27923a1","Type":"ContainerDied","Data":"8e33642dba12269cb0f76e38b391a4faf36c39c6ff6a5508499b861956975168"} Oct 07 13:47:00 crc kubenswrapper[4854]: I1007 13:47:00.843054 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-65cbx" event={"ID":"b4d0fee4-497f-4faf-8f90-ba64e27923a1","Type":"ContainerStarted","Data":"ff6bc8bcac234d1a1cc845dd50ceca271f7ffad5c114cb7b687ad0768f378d32"} Oct 07 13:47:02 crc kubenswrapper[4854]: I1007 13:47:02.871529 4854 generic.go:334] "Generic (PLEG): container finished" podID="b4d0fee4-497f-4faf-8f90-ba64e27923a1" containerID="30ac02544cb414cb0ada33b606d8a2cbece66ab2404c115f717567c6f62634dd" exitCode=0 Oct 07 13:47:02 crc kubenswrapper[4854]: I1007 13:47:02.871602 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-65cbx" event={"ID":"b4d0fee4-497f-4faf-8f90-ba64e27923a1","Type":"ContainerDied","Data":"30ac02544cb414cb0ada33b606d8a2cbece66ab2404c115f717567c6f62634dd"} Oct 07 13:47:03 crc kubenswrapper[4854]: I1007 13:47:03.886928 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-65cbx" event={"ID":"b4d0fee4-497f-4faf-8f90-ba64e27923a1","Type":"ContainerStarted","Data":"a52e728f96dfd3e562e995ddf814fdb69a6c6719446e39d14066a8016a760f73"} Oct 07 13:47:03 crc kubenswrapper[4854]: I1007 13:47:03.959326 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-65cbx" podStartSLOduration=2.406176462 podStartE2EDuration="4.959308511s" podCreationTimestamp="2025-10-07 13:46:59 +0000 UTC" firstStartedPulling="2025-10-07 13:47:00.845060704 +0000 UTC m=+4936.832892969" lastFinishedPulling="2025-10-07 13:47:03.398192723 +0000 UTC m=+4939.386025018" observedRunningTime="2025-10-07 13:47:03.934508106 +0000 UTC m=+4939.922340401" watchObservedRunningTime="2025-10-07 13:47:03.959308511 +0000 UTC m=+4939.947140766" Oct 07 13:47:09 crc kubenswrapper[4854]: I1007 13:47:09.797259 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-65cbx" Oct 07 13:47:09 crc kubenswrapper[4854]: I1007 13:47:09.798014 4854 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-65cbx" Oct 07 13:47:09 crc kubenswrapper[4854]: I1007 13:47:09.875980 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-65cbx" Oct 07 13:47:10 crc kubenswrapper[4854]: I1007 13:47:10.023322 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-65cbx" Oct 07 13:47:10 crc kubenswrapper[4854]: I1007 13:47:10.125268 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-65cbx"] Oct 07 13:47:11 crc kubenswrapper[4854]: I1007 13:47:11.968769 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-65cbx" podUID="b4d0fee4-497f-4faf-8f90-ba64e27923a1" containerName="registry-server" containerID="cri-o://a52e728f96dfd3e562e995ddf814fdb69a6c6719446e39d14066a8016a760f73" gracePeriod=2 Oct 07 13:47:12 crc kubenswrapper[4854]: I1007 13:47:12.457511 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-65cbx" Oct 07 13:47:12 crc kubenswrapper[4854]: I1007 13:47:12.619819 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4d0fee4-497f-4faf-8f90-ba64e27923a1-utilities\") pod \"b4d0fee4-497f-4faf-8f90-ba64e27923a1\" (UID: \"b4d0fee4-497f-4faf-8f90-ba64e27923a1\") " Oct 07 13:47:12 crc kubenswrapper[4854]: I1007 13:47:12.620015 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4d0fee4-497f-4faf-8f90-ba64e27923a1-catalog-content\") pod \"b4d0fee4-497f-4faf-8f90-ba64e27923a1\" (UID: \"b4d0fee4-497f-4faf-8f90-ba64e27923a1\") " Oct 07 13:47:12 crc kubenswrapper[4854]: I1007 13:47:12.620125 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qsls\" (UniqueName: \"kubernetes.io/projected/b4d0fee4-497f-4faf-8f90-ba64e27923a1-kube-api-access-2qsls\") pod \"b4d0fee4-497f-4faf-8f90-ba64e27923a1\" (UID: \"b4d0fee4-497f-4faf-8f90-ba64e27923a1\") " Oct 07 13:47:12 crc kubenswrapper[4854]: I1007 13:47:12.621058 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4d0fee4-497f-4faf-8f90-ba64e27923a1-utilities" (OuterVolumeSpecName: "utilities") pod "b4d0fee4-497f-4faf-8f90-ba64e27923a1" (UID: "b4d0fee4-497f-4faf-8f90-ba64e27923a1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:47:12 crc kubenswrapper[4854]: I1007 13:47:12.626133 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4d0fee4-497f-4faf-8f90-ba64e27923a1-kube-api-access-2qsls" (OuterVolumeSpecName: "kube-api-access-2qsls") pod "b4d0fee4-497f-4faf-8f90-ba64e27923a1" (UID: "b4d0fee4-497f-4faf-8f90-ba64e27923a1"). InnerVolumeSpecName "kube-api-access-2qsls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:47:12 crc kubenswrapper[4854]: I1007 13:47:12.721563 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4d0fee4-497f-4faf-8f90-ba64e27923a1-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:47:12 crc kubenswrapper[4854]: I1007 13:47:12.721793 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qsls\" (UniqueName: \"kubernetes.io/projected/b4d0fee4-497f-4faf-8f90-ba64e27923a1-kube-api-access-2qsls\") on node \"crc\" DevicePath \"\"" Oct 07 13:47:12 crc kubenswrapper[4854]: I1007 13:47:12.979550 4854 generic.go:334] "Generic (PLEG): container finished" podID="b4d0fee4-497f-4faf-8f90-ba64e27923a1" containerID="a52e728f96dfd3e562e995ddf814fdb69a6c6719446e39d14066a8016a760f73" exitCode=0 Oct 07 13:47:12 crc kubenswrapper[4854]: I1007 13:47:12.979594 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-65cbx" event={"ID":"b4d0fee4-497f-4faf-8f90-ba64e27923a1","Type":"ContainerDied","Data":"a52e728f96dfd3e562e995ddf814fdb69a6c6719446e39d14066a8016a760f73"} Oct 07 13:47:12 crc kubenswrapper[4854]: I1007 13:47:12.979628 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-65cbx" event={"ID":"b4d0fee4-497f-4faf-8f90-ba64e27923a1","Type":"ContainerDied","Data":"ff6bc8bcac234d1a1cc845dd50ceca271f7ffad5c114cb7b687ad0768f378d32"} Oct 07 13:47:12 crc kubenswrapper[4854]: I1007 13:47:12.979648 4854 scope.go:117] "RemoveContainer" containerID="a52e728f96dfd3e562e995ddf814fdb69a6c6719446e39d14066a8016a760f73" Oct 07 13:47:12 crc kubenswrapper[4854]: I1007 13:47:12.979669 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-65cbx" Oct 07 13:47:13 crc kubenswrapper[4854]: I1007 13:47:13.001761 4854 scope.go:117] "RemoveContainer" containerID="30ac02544cb414cb0ada33b606d8a2cbece66ab2404c115f717567c6f62634dd" Oct 07 13:47:13 crc kubenswrapper[4854]: I1007 13:47:13.030044 4854 scope.go:117] "RemoveContainer" containerID="8e33642dba12269cb0f76e38b391a4faf36c39c6ff6a5508499b861956975168" Oct 07 13:47:13 crc kubenswrapper[4854]: I1007 13:47:13.060717 4854 scope.go:117] "RemoveContainer" containerID="a52e728f96dfd3e562e995ddf814fdb69a6c6719446e39d14066a8016a760f73" Oct 07 13:47:13 crc kubenswrapper[4854]: E1007 13:47:13.061301 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a52e728f96dfd3e562e995ddf814fdb69a6c6719446e39d14066a8016a760f73\": container with ID starting with a52e728f96dfd3e562e995ddf814fdb69a6c6719446e39d14066a8016a760f73 not found: ID does not exist" containerID="a52e728f96dfd3e562e995ddf814fdb69a6c6719446e39d14066a8016a760f73" Oct 07 13:47:13 crc kubenswrapper[4854]: I1007 13:47:13.061343 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a52e728f96dfd3e562e995ddf814fdb69a6c6719446e39d14066a8016a760f73"} err="failed to get container status \"a52e728f96dfd3e562e995ddf814fdb69a6c6719446e39d14066a8016a760f73\": rpc error: code = NotFound desc = could not find container \"a52e728f96dfd3e562e995ddf814fdb69a6c6719446e39d14066a8016a760f73\": container with ID starting with a52e728f96dfd3e562e995ddf814fdb69a6c6719446e39d14066a8016a760f73 not found: ID does not exist" Oct 07 13:47:13 crc kubenswrapper[4854]: I1007 13:47:13.061370 4854 scope.go:117] "RemoveContainer" containerID="30ac02544cb414cb0ada33b606d8a2cbece66ab2404c115f717567c6f62634dd" Oct 07 13:47:13 crc kubenswrapper[4854]: E1007 13:47:13.061762 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30ac02544cb414cb0ada33b606d8a2cbece66ab2404c115f717567c6f62634dd\": container with ID starting with 30ac02544cb414cb0ada33b606d8a2cbece66ab2404c115f717567c6f62634dd not found: ID does not exist" containerID="30ac02544cb414cb0ada33b606d8a2cbece66ab2404c115f717567c6f62634dd" Oct 07 13:47:13 crc kubenswrapper[4854]: I1007 13:47:13.061784 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30ac02544cb414cb0ada33b606d8a2cbece66ab2404c115f717567c6f62634dd"} err="failed to get container status \"30ac02544cb414cb0ada33b606d8a2cbece66ab2404c115f717567c6f62634dd\": rpc error: code = NotFound desc = could not find container \"30ac02544cb414cb0ada33b606d8a2cbece66ab2404c115f717567c6f62634dd\": container with ID starting with 30ac02544cb414cb0ada33b606d8a2cbece66ab2404c115f717567c6f62634dd not found: ID does not exist" Oct 07 13:47:13 crc kubenswrapper[4854]: I1007 13:47:13.061795 4854 scope.go:117] "RemoveContainer" containerID="8e33642dba12269cb0f76e38b391a4faf36c39c6ff6a5508499b861956975168" Oct 07 13:47:13 crc kubenswrapper[4854]: E1007 13:47:13.062159 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e33642dba12269cb0f76e38b391a4faf36c39c6ff6a5508499b861956975168\": container with ID starting with 8e33642dba12269cb0f76e38b391a4faf36c39c6ff6a5508499b861956975168 not found: ID does not exist" containerID="8e33642dba12269cb0f76e38b391a4faf36c39c6ff6a5508499b861956975168" 
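The entries above trace the certified-operators-65cbx catalog pod through a complete lifecycle: a fresh sandbox, the extract-utilities and extract-content containers finishing with exit code 0, the registry-server container starting and passing its probes, and then, once the pod is deleted, ContainerDied events followed by RemoveContainer calls that CRI-O answers with NotFound because the container IDs no longer exist. As an illustrative sketch only (none of this appears in the captured journal), the per-pod PLEG events could be pulled out of a saved copy of this log with a few lines of Python; the kubelet.log filename is an assumption, and the regular expression simply mirrors the "SyncLoop (PLEG): event for pod" message format visible in this excerpt.

import re
from collections import defaultdict

# Matches kubelet PLEG entries as they appear in this journal excerpt, e.g.:
#   kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="ns/name"
#   event={"ID":"<pod uid>","Type":"ContainerDied","Data":"<container or sandbox id>"}
PLEG_RE = re.compile(
    r'"SyncLoop \(PLEG\): event for pod"\s+pod="(?P<pod>[^"]+)"\s+'
    r'event=\{"ID":"(?P<uid>[^"]+)","Type":"(?P<type>[^"]+)","Data":"(?P<data>[^"]+)"\}'
)

def pleg_events(path):
    """Yield (pod, event type, container/sandbox id) tuples from a saved journal excerpt."""
    with open(path, encoding="utf-8") as fh:
        text = fh.read()  # read the whole capture, since entries may wrap across physical lines
    for m in PLEG_RE.finditer(text):
        yield m.group("pod"), m.group("type"), m.group("data")

if __name__ == "__main__":
    per_pod = defaultdict(list)  # kubelet.log: assumed local copy of this journal output
    for pod, etype, data in pleg_events("kubelet.log"):
        per_pod[pod].append(f"{etype}:{data[:12]}")
    for pod, events in sorted(per_pod.items()):
        print(pod)
        for entry in events:
            print(f"  {entry}")

Run against the lines above, this would list, for example, the ContainerStarted and ContainerDied events for openshift-marketplace/certified-operators-65cbx in the order they were observed.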
Oct 07 13:47:13 crc kubenswrapper[4854]: I1007 13:47:13.062180 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e33642dba12269cb0f76e38b391a4faf36c39c6ff6a5508499b861956975168"} err="failed to get container status \"8e33642dba12269cb0f76e38b391a4faf36c39c6ff6a5508499b861956975168\": rpc error: code = NotFound desc = could not find container \"8e33642dba12269cb0f76e38b391a4faf36c39c6ff6a5508499b861956975168\": container with ID starting with 8e33642dba12269cb0f76e38b391a4faf36c39c6ff6a5508499b861956975168 not found: ID does not exist" Oct 07 13:47:13 crc kubenswrapper[4854]: I1007 13:47:13.183485 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4d0fee4-497f-4faf-8f90-ba64e27923a1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b4d0fee4-497f-4faf-8f90-ba64e27923a1" (UID: "b4d0fee4-497f-4faf-8f90-ba64e27923a1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:47:13 crc kubenswrapper[4854]: I1007 13:47:13.229784 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4d0fee4-497f-4faf-8f90-ba64e27923a1-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:47:13 crc kubenswrapper[4854]: I1007 13:47:13.336614 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-65cbx"] Oct 07 13:47:13 crc kubenswrapper[4854]: I1007 13:47:13.350666 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-65cbx"] Oct 07 13:47:14 crc kubenswrapper[4854]: I1007 13:47:14.719444 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4d0fee4-497f-4faf-8f90-ba64e27923a1" path="/var/lib/kubelet/pods/b4d0fee4-497f-4faf-8f90-ba64e27923a1/volumes" Oct 07 13:48:10 crc kubenswrapper[4854]: I1007 13:48:10.807661 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:48:10 crc kubenswrapper[4854]: I1007 13:48:10.808377 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:48:21 crc kubenswrapper[4854]: I1007 13:48:21.109734 4854 scope.go:117] "RemoveContainer" containerID="c464c8430e1a2e9d750bcd0c727ee79d33a576c6aca4188462276619df7253e4" Oct 07 13:48:21 crc kubenswrapper[4854]: I1007 13:48:21.134911 4854 scope.go:117] "RemoveContainer" containerID="9d7058f56625b72382ccb3b766a794327bb29db3ea1f0159ffe9b08759b8b650" Oct 07 13:48:21 crc kubenswrapper[4854]: I1007 13:48:21.178188 4854 scope.go:117] "RemoveContainer" containerID="18c22bc2916376529a73f0ecf48105093810df22057abc99d572b5b7c54489f7" Oct 07 13:48:40 crc kubenswrapper[4854]: I1007 13:48:40.808005 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 
13:48:40 crc kubenswrapper[4854]: I1007 13:48:40.808970 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:49:10 crc kubenswrapper[4854]: I1007 13:49:10.807540 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:49:10 crc kubenswrapper[4854]: I1007 13:49:10.808434 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:49:10 crc kubenswrapper[4854]: I1007 13:49:10.808527 4854 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" Oct 07 13:49:10 crc kubenswrapper[4854]: I1007 13:49:10.809800 4854 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2fcfadcb8d4fbb46c6073e004d4af8599758985c036e716d488d12dd6011cd13"} pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 13:49:10 crc kubenswrapper[4854]: I1007 13:49:10.809910 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" containerID="cri-o://2fcfadcb8d4fbb46c6073e004d4af8599758985c036e716d488d12dd6011cd13" gracePeriod=600 Oct 07 13:49:11 crc kubenswrapper[4854]: I1007 13:49:11.167331 4854 generic.go:334] "Generic (PLEG): container finished" podID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerID="2fcfadcb8d4fbb46c6073e004d4af8599758985c036e716d488d12dd6011cd13" exitCode=0 Oct 07 13:49:11 crc kubenswrapper[4854]: I1007 13:49:11.167451 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" event={"ID":"40b8b82d-cfd5-41d7-8673-5774db092c85","Type":"ContainerDied","Data":"2fcfadcb8d4fbb46c6073e004d4af8599758985c036e716d488d12dd6011cd13"} Oct 07 13:49:11 crc kubenswrapper[4854]: I1007 13:49:11.167521 4854 scope.go:117] "RemoveContainer" containerID="e83798b11b7f463b027896c1ec862c4cbd843df2676156232deaf979d790ceea" Oct 07 13:49:12 crc kubenswrapper[4854]: I1007 13:49:12.181717 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" event={"ID":"40b8b82d-cfd5-41d7-8673-5774db092c85","Type":"ContainerStarted","Data":"8f3095d8e8fcb8dde419577c13f52762f3d4a6c040f7266414571999ecd4046e"} Oct 07 13:49:15 crc kubenswrapper[4854]: I1007 13:49:15.684917 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ttzdl"] Oct 07 13:49:15 crc kubenswrapper[4854]: E1007 13:49:15.685814 4854 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="b4d0fee4-497f-4faf-8f90-ba64e27923a1" containerName="extract-content" Oct 07 13:49:15 crc kubenswrapper[4854]: I1007 13:49:15.685826 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4d0fee4-497f-4faf-8f90-ba64e27923a1" containerName="extract-content" Oct 07 13:49:15 crc kubenswrapper[4854]: E1007 13:49:15.685833 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4d0fee4-497f-4faf-8f90-ba64e27923a1" containerName="registry-server" Oct 07 13:49:15 crc kubenswrapper[4854]: I1007 13:49:15.685839 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4d0fee4-497f-4faf-8f90-ba64e27923a1" containerName="registry-server" Oct 07 13:49:15 crc kubenswrapper[4854]: E1007 13:49:15.685855 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4d0fee4-497f-4faf-8f90-ba64e27923a1" containerName="extract-utilities" Oct 07 13:49:15 crc kubenswrapper[4854]: I1007 13:49:15.685861 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4d0fee4-497f-4faf-8f90-ba64e27923a1" containerName="extract-utilities" Oct 07 13:49:15 crc kubenswrapper[4854]: I1007 13:49:15.686005 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4d0fee4-497f-4faf-8f90-ba64e27923a1" containerName="registry-server" Oct 07 13:49:15 crc kubenswrapper[4854]: I1007 13:49:15.687273 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ttzdl" Oct 07 13:49:15 crc kubenswrapper[4854]: I1007 13:49:15.708725 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ttzdl"] Oct 07 13:49:15 crc kubenswrapper[4854]: I1007 13:49:15.788586 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/434856cf-3f9d-4116-bb83-a80ec60c3498-catalog-content\") pod \"redhat-operators-ttzdl\" (UID: \"434856cf-3f9d-4116-bb83-a80ec60c3498\") " pod="openshift-marketplace/redhat-operators-ttzdl" Oct 07 13:49:15 crc kubenswrapper[4854]: I1007 13:49:15.788658 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppgg5\" (UniqueName: \"kubernetes.io/projected/434856cf-3f9d-4116-bb83-a80ec60c3498-kube-api-access-ppgg5\") pod \"redhat-operators-ttzdl\" (UID: \"434856cf-3f9d-4116-bb83-a80ec60c3498\") " pod="openshift-marketplace/redhat-operators-ttzdl" Oct 07 13:49:15 crc kubenswrapper[4854]: I1007 13:49:15.788748 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/434856cf-3f9d-4116-bb83-a80ec60c3498-utilities\") pod \"redhat-operators-ttzdl\" (UID: \"434856cf-3f9d-4116-bb83-a80ec60c3498\") " pod="openshift-marketplace/redhat-operators-ttzdl" Oct 07 13:49:15 crc kubenswrapper[4854]: I1007 13:49:15.890310 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/434856cf-3f9d-4116-bb83-a80ec60c3498-catalog-content\") pod \"redhat-operators-ttzdl\" (UID: \"434856cf-3f9d-4116-bb83-a80ec60c3498\") " pod="openshift-marketplace/redhat-operators-ttzdl" Oct 07 13:49:15 crc kubenswrapper[4854]: I1007 13:49:15.890375 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppgg5\" (UniqueName: \"kubernetes.io/projected/434856cf-3f9d-4116-bb83-a80ec60c3498-kube-api-access-ppgg5\") pod 
\"redhat-operators-ttzdl\" (UID: \"434856cf-3f9d-4116-bb83-a80ec60c3498\") " pod="openshift-marketplace/redhat-operators-ttzdl" Oct 07 13:49:15 crc kubenswrapper[4854]: I1007 13:49:15.890429 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/434856cf-3f9d-4116-bb83-a80ec60c3498-utilities\") pod \"redhat-operators-ttzdl\" (UID: \"434856cf-3f9d-4116-bb83-a80ec60c3498\") " pod="openshift-marketplace/redhat-operators-ttzdl" Oct 07 13:49:15 crc kubenswrapper[4854]: I1007 13:49:15.891505 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/434856cf-3f9d-4116-bb83-a80ec60c3498-utilities\") pod \"redhat-operators-ttzdl\" (UID: \"434856cf-3f9d-4116-bb83-a80ec60c3498\") " pod="openshift-marketplace/redhat-operators-ttzdl" Oct 07 13:49:15 crc kubenswrapper[4854]: I1007 13:49:15.891708 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/434856cf-3f9d-4116-bb83-a80ec60c3498-catalog-content\") pod \"redhat-operators-ttzdl\" (UID: \"434856cf-3f9d-4116-bb83-a80ec60c3498\") " pod="openshift-marketplace/redhat-operators-ttzdl" Oct 07 13:49:16 crc kubenswrapper[4854]: I1007 13:49:16.140746 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppgg5\" (UniqueName: \"kubernetes.io/projected/434856cf-3f9d-4116-bb83-a80ec60c3498-kube-api-access-ppgg5\") pod \"redhat-operators-ttzdl\" (UID: \"434856cf-3f9d-4116-bb83-a80ec60c3498\") " pod="openshift-marketplace/redhat-operators-ttzdl" Oct 07 13:49:16 crc kubenswrapper[4854]: I1007 13:49:16.309491 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ttzdl" Oct 07 13:49:16 crc kubenswrapper[4854]: I1007 13:49:16.596379 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ttzdl"] Oct 07 13:49:17 crc kubenswrapper[4854]: I1007 13:49:17.232290 4854 generic.go:334] "Generic (PLEG): container finished" podID="434856cf-3f9d-4116-bb83-a80ec60c3498" containerID="db11b31687c61c180611e04056b0dc6a40c5a86779e8365bcd1c9d7d89e339f8" exitCode=0 Oct 07 13:49:17 crc kubenswrapper[4854]: I1007 13:49:17.232353 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ttzdl" event={"ID":"434856cf-3f9d-4116-bb83-a80ec60c3498","Type":"ContainerDied","Data":"db11b31687c61c180611e04056b0dc6a40c5a86779e8365bcd1c9d7d89e339f8"} Oct 07 13:49:17 crc kubenswrapper[4854]: I1007 13:49:17.232401 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ttzdl" event={"ID":"434856cf-3f9d-4116-bb83-a80ec60c3498","Type":"ContainerStarted","Data":"3b8eabaaa60205b00e14b5c1510af0f94997ff454f4fbb68c16f0ba890b95e7b"} Oct 07 13:49:19 crc kubenswrapper[4854]: I1007 13:49:19.191932 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-copy-data"] Oct 07 13:49:19 crc kubenswrapper[4854]: I1007 13:49:19.194404 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Oct 07 13:49:19 crc kubenswrapper[4854]: I1007 13:49:19.197098 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-9pznn" Oct 07 13:49:19 crc kubenswrapper[4854]: I1007 13:49:19.207340 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Oct 07 13:49:19 crc kubenswrapper[4854]: I1007 13:49:19.256882 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ttzdl" event={"ID":"434856cf-3f9d-4116-bb83-a80ec60c3498","Type":"ContainerStarted","Data":"1751e5586b97bbb83fed659eab40de026174a15c136108d56b1f0c7df5ddc139"} Oct 07 13:49:19 crc kubenswrapper[4854]: I1007 13:49:19.362843 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-93790b84-c815-49ab-af91-702fce8e9c29\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-93790b84-c815-49ab-af91-702fce8e9c29\") pod \"mariadb-copy-data\" (UID: \"991f7c53-8762-472e-b968-3b1a8fc55d8c\") " pod="openstack/mariadb-copy-data" Oct 07 13:49:19 crc kubenswrapper[4854]: I1007 13:49:19.364193 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkv2n\" (UniqueName: \"kubernetes.io/projected/991f7c53-8762-472e-b968-3b1a8fc55d8c-kube-api-access-hkv2n\") pod \"mariadb-copy-data\" (UID: \"991f7c53-8762-472e-b968-3b1a8fc55d8c\") " pod="openstack/mariadb-copy-data" Oct 07 13:49:19 crc kubenswrapper[4854]: I1007 13:49:19.465628 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkv2n\" (UniqueName: \"kubernetes.io/projected/991f7c53-8762-472e-b968-3b1a8fc55d8c-kube-api-access-hkv2n\") pod \"mariadb-copy-data\" (UID: \"991f7c53-8762-472e-b968-3b1a8fc55d8c\") " pod="openstack/mariadb-copy-data" Oct 07 13:49:19 crc kubenswrapper[4854]: I1007 13:49:19.465764 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-93790b84-c815-49ab-af91-702fce8e9c29\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-93790b84-c815-49ab-af91-702fce8e9c29\") pod \"mariadb-copy-data\" (UID: \"991f7c53-8762-472e-b968-3b1a8fc55d8c\") " pod="openstack/mariadb-copy-data" Oct 07 13:49:19 crc kubenswrapper[4854]: I1007 13:49:19.469502 4854 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 07 13:49:19 crc kubenswrapper[4854]: I1007 13:49:19.469659 4854 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-93790b84-c815-49ab-af91-702fce8e9c29\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-93790b84-c815-49ab-af91-702fce8e9c29\") pod \"mariadb-copy-data\" (UID: \"991f7c53-8762-472e-b968-3b1a8fc55d8c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/944496601aee2b367bdd594833436f2c9e9f27427c07b084154fa3ecab6de669/globalmount\"" pod="openstack/mariadb-copy-data" Oct 07 13:49:19 crc kubenswrapper[4854]: I1007 13:49:19.497343 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkv2n\" (UniqueName: \"kubernetes.io/projected/991f7c53-8762-472e-b968-3b1a8fc55d8c-kube-api-access-hkv2n\") pod \"mariadb-copy-data\" (UID: \"991f7c53-8762-472e-b968-3b1a8fc55d8c\") " pod="openstack/mariadb-copy-data" Oct 07 13:49:19 crc kubenswrapper[4854]: I1007 13:49:19.506979 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-93790b84-c815-49ab-af91-702fce8e9c29\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-93790b84-c815-49ab-af91-702fce8e9c29\") pod \"mariadb-copy-data\" (UID: \"991f7c53-8762-472e-b968-3b1a8fc55d8c\") " pod="openstack/mariadb-copy-data" Oct 07 13:49:19 crc kubenswrapper[4854]: I1007 13:49:19.614113 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Oct 07 13:49:20 crc kubenswrapper[4854]: I1007 13:49:20.173738 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Oct 07 13:49:20 crc kubenswrapper[4854]: I1007 13:49:20.271287 4854 generic.go:334] "Generic (PLEG): container finished" podID="434856cf-3f9d-4116-bb83-a80ec60c3498" containerID="1751e5586b97bbb83fed659eab40de026174a15c136108d56b1f0c7df5ddc139" exitCode=0 Oct 07 13:49:20 crc kubenswrapper[4854]: I1007 13:49:20.271382 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ttzdl" event={"ID":"434856cf-3f9d-4116-bb83-a80ec60c3498","Type":"ContainerDied","Data":"1751e5586b97bbb83fed659eab40de026174a15c136108d56b1f0c7df5ddc139"} Oct 07 13:49:20 crc kubenswrapper[4854]: I1007 13:49:20.273157 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ttzdl" event={"ID":"434856cf-3f9d-4116-bb83-a80ec60c3498","Type":"ContainerStarted","Data":"4e194ef24c528945917611974791549c53d634ab020f3000d76aba69fd8128aa"} Oct 07 13:49:20 crc kubenswrapper[4854]: I1007 13:49:20.274330 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"991f7c53-8762-472e-b968-3b1a8fc55d8c","Type":"ContainerStarted","Data":"a4dea8d8593bc85cd434fb2f97a8a2bb8b56a15cca8712e3a5ddb03b305de61a"} Oct 07 13:49:21 crc kubenswrapper[4854]: I1007 13:49:21.284071 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"991f7c53-8762-472e-b968-3b1a8fc55d8c","Type":"ContainerStarted","Data":"34e522654c778249f204c8da92ebad0e16f8044902aa8a08e86adaf9c718f1ca"} Oct 07 13:49:21 crc kubenswrapper[4854]: I1007 13:49:21.305970 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ttzdl" podStartSLOduration=3.843959042 podStartE2EDuration="6.305937394s" podCreationTimestamp="2025-10-07 13:49:15 +0000 UTC" firstStartedPulling="2025-10-07 13:49:17.233809784 +0000 UTC 
m=+5073.221642039" lastFinishedPulling="2025-10-07 13:49:19.695788136 +0000 UTC m=+5075.683620391" observedRunningTime="2025-10-07 13:49:20.302457085 +0000 UTC m=+5076.290289340" watchObservedRunningTime="2025-10-07 13:49:21.305937394 +0000 UTC m=+5077.293769699" Oct 07 13:49:21 crc kubenswrapper[4854]: I1007 13:49:21.314513 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-copy-data" podStartSLOduration=3.314479498 podStartE2EDuration="3.314479498s" podCreationTimestamp="2025-10-07 13:49:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:49:21.30439814 +0000 UTC m=+5077.292230435" watchObservedRunningTime="2025-10-07 13:49:21.314479498 +0000 UTC m=+5077.302311793" Oct 07 13:49:23 crc kubenswrapper[4854]: I1007 13:49:23.235519 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Oct 07 13:49:23 crc kubenswrapper[4854]: I1007 13:49:23.237657 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Oct 07 13:49:23 crc kubenswrapper[4854]: I1007 13:49:23.251640 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Oct 07 13:49:23 crc kubenswrapper[4854]: I1007 13:49:23.334554 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8fdj\" (UniqueName: \"kubernetes.io/projected/475e0e33-ef4d-4177-93e3-3cc4baed1b55-kube-api-access-v8fdj\") pod \"mariadb-client\" (UID: \"475e0e33-ef4d-4177-93e3-3cc4baed1b55\") " pod="openstack/mariadb-client" Oct 07 13:49:23 crc kubenswrapper[4854]: I1007 13:49:23.436782 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8fdj\" (UniqueName: \"kubernetes.io/projected/475e0e33-ef4d-4177-93e3-3cc4baed1b55-kube-api-access-v8fdj\") pod \"mariadb-client\" (UID: \"475e0e33-ef4d-4177-93e3-3cc4baed1b55\") " pod="openstack/mariadb-client" Oct 07 13:49:23 crc kubenswrapper[4854]: I1007 13:49:23.455408 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8fdj\" (UniqueName: \"kubernetes.io/projected/475e0e33-ef4d-4177-93e3-3cc4baed1b55-kube-api-access-v8fdj\") pod \"mariadb-client\" (UID: \"475e0e33-ef4d-4177-93e3-3cc4baed1b55\") " pod="openstack/mariadb-client" Oct 07 13:49:23 crc kubenswrapper[4854]: I1007 13:49:23.569068 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Oct 07 13:49:24 crc kubenswrapper[4854]: I1007 13:49:24.025548 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Oct 07 13:49:24 crc kubenswrapper[4854]: W1007 13:49:24.034466 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod475e0e33_ef4d_4177_93e3_3cc4baed1b55.slice/crio-e953d171ff023bd5b77239256fc8d53d53a99f22e1fd6e7a74b458756ab34b5a WatchSource:0}: Error finding container e953d171ff023bd5b77239256fc8d53d53a99f22e1fd6e7a74b458756ab34b5a: Status 404 returned error can't find the container with id e953d171ff023bd5b77239256fc8d53d53a99f22e1fd6e7a74b458756ab34b5a Oct 07 13:49:24 crc kubenswrapper[4854]: I1007 13:49:24.311785 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"475e0e33-ef4d-4177-93e3-3cc4baed1b55","Type":"ContainerStarted","Data":"e953d171ff023bd5b77239256fc8d53d53a99f22e1fd6e7a74b458756ab34b5a"} Oct 07 13:49:25 crc kubenswrapper[4854]: I1007 13:49:25.325555 4854 generic.go:334] "Generic (PLEG): container finished" podID="475e0e33-ef4d-4177-93e3-3cc4baed1b55" containerID="5d7c695acd025dac0a728ba21da2df3779cd37281da99052676fc8850d8306e6" exitCode=0 Oct 07 13:49:25 crc kubenswrapper[4854]: I1007 13:49:25.325646 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"475e0e33-ef4d-4177-93e3-3cc4baed1b55","Type":"ContainerDied","Data":"5d7c695acd025dac0a728ba21da2df3779cd37281da99052676fc8850d8306e6"} Oct 07 13:49:26 crc kubenswrapper[4854]: I1007 13:49:26.310107 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ttzdl" Oct 07 13:49:26 crc kubenswrapper[4854]: I1007 13:49:26.312169 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ttzdl" Oct 07 13:49:26 crc kubenswrapper[4854]: I1007 13:49:26.392918 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ttzdl" Oct 07 13:49:26 crc kubenswrapper[4854]: I1007 13:49:26.465798 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ttzdl" Oct 07 13:49:27 crc kubenswrapper[4854]: I1007 13:49:27.215814 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Oct 07 13:49:27 crc kubenswrapper[4854]: I1007 13:49:27.239722 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_475e0e33-ef4d-4177-93e3-3cc4baed1b55/mariadb-client/0.log" Oct 07 13:49:27 crc kubenswrapper[4854]: I1007 13:49:27.263362 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Oct 07 13:49:27 crc kubenswrapper[4854]: I1007 13:49:27.268849 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Oct 07 13:49:27 crc kubenswrapper[4854]: I1007 13:49:27.301216 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8fdj\" (UniqueName: \"kubernetes.io/projected/475e0e33-ef4d-4177-93e3-3cc4baed1b55-kube-api-access-v8fdj\") pod \"475e0e33-ef4d-4177-93e3-3cc4baed1b55\" (UID: \"475e0e33-ef4d-4177-93e3-3cc4baed1b55\") " Oct 07 13:49:27 crc kubenswrapper[4854]: I1007 13:49:27.316545 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/475e0e33-ef4d-4177-93e3-3cc4baed1b55-kube-api-access-v8fdj" (OuterVolumeSpecName: "kube-api-access-v8fdj") pod "475e0e33-ef4d-4177-93e3-3cc4baed1b55" (UID: "475e0e33-ef4d-4177-93e3-3cc4baed1b55"). InnerVolumeSpecName "kube-api-access-v8fdj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:49:27 crc kubenswrapper[4854]: I1007 13:49:27.347028 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Oct 07 13:49:27 crc kubenswrapper[4854]: I1007 13:49:27.347088 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e953d171ff023bd5b77239256fc8d53d53a99f22e1fd6e7a74b458756ab34b5a" Oct 07 13:49:27 crc kubenswrapper[4854]: I1007 13:49:27.402851 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8fdj\" (UniqueName: \"kubernetes.io/projected/475e0e33-ef4d-4177-93e3-3cc4baed1b55-kube-api-access-v8fdj\") on node \"crc\" DevicePath \"\"" Oct 07 13:49:27 crc kubenswrapper[4854]: I1007 13:49:27.450249 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Oct 07 13:49:27 crc kubenswrapper[4854]: E1007 13:49:27.450955 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="475e0e33-ef4d-4177-93e3-3cc4baed1b55" containerName="mariadb-client" Oct 07 13:49:27 crc kubenswrapper[4854]: I1007 13:49:27.451042 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="475e0e33-ef4d-4177-93e3-3cc4baed1b55" containerName="mariadb-client" Oct 07 13:49:27 crc kubenswrapper[4854]: I1007 13:49:27.451365 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="475e0e33-ef4d-4177-93e3-3cc4baed1b55" containerName="mariadb-client" Oct 07 13:49:27 crc kubenswrapper[4854]: I1007 13:49:27.452440 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Oct 07 13:49:27 crc kubenswrapper[4854]: I1007 13:49:27.460469 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Oct 07 13:49:27 crc kubenswrapper[4854]: I1007 13:49:27.481812 4854 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/mariadb-client" oldPodUID="475e0e33-ef4d-4177-93e3-3cc4baed1b55" podUID="e188c677-83fc-47c3-b475-25da4a70b85d" Oct 07 13:49:27 crc kubenswrapper[4854]: I1007 13:49:27.606093 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmksm\" (UniqueName: \"kubernetes.io/projected/e188c677-83fc-47c3-b475-25da4a70b85d-kube-api-access-bmksm\") pod \"mariadb-client\" (UID: \"e188c677-83fc-47c3-b475-25da4a70b85d\") " pod="openstack/mariadb-client" Oct 07 13:49:27 crc kubenswrapper[4854]: I1007 13:49:27.711109 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmksm\" (UniqueName: \"kubernetes.io/projected/e188c677-83fc-47c3-b475-25da4a70b85d-kube-api-access-bmksm\") pod \"mariadb-client\" (UID: \"e188c677-83fc-47c3-b475-25da4a70b85d\") " pod="openstack/mariadb-client" Oct 07 13:49:27 crc kubenswrapper[4854]: I1007 13:49:27.730362 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmksm\" (UniqueName: \"kubernetes.io/projected/e188c677-83fc-47c3-b475-25da4a70b85d-kube-api-access-bmksm\") pod \"mariadb-client\" (UID: \"e188c677-83fc-47c3-b475-25da4a70b85d\") " pod="openstack/mariadb-client" Oct 07 13:49:27 crc kubenswrapper[4854]: I1007 13:49:27.802226 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Oct 07 13:49:28 crc kubenswrapper[4854]: I1007 13:49:28.055389 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Oct 07 13:49:28 crc kubenswrapper[4854]: I1007 13:49:28.358224 4854 generic.go:334] "Generic (PLEG): container finished" podID="e188c677-83fc-47c3-b475-25da4a70b85d" containerID="f64584e039f9e3bb3a5480e3f5d3e5fa7f8ee5cf1043284bdd5f2a0ae210f71f" exitCode=0 Oct 07 13:49:28 crc kubenswrapper[4854]: I1007 13:49:28.358301 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"e188c677-83fc-47c3-b475-25da4a70b85d","Type":"ContainerDied","Data":"f64584e039f9e3bb3a5480e3f5d3e5fa7f8ee5cf1043284bdd5f2a0ae210f71f"} Oct 07 13:49:28 crc kubenswrapper[4854]: I1007 13:49:28.358386 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"e188c677-83fc-47c3-b475-25da4a70b85d","Type":"ContainerStarted","Data":"0f35f8434c24598908429a847873a5a92225aecb8489c99eb5dc9f104afd70d8"} Oct 07 13:49:28 crc kubenswrapper[4854]: I1007 13:49:28.720106 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="475e0e33-ef4d-4177-93e3-3cc4baed1b55" path="/var/lib/kubelet/pods/475e0e33-ef4d-4177-93e3-3cc4baed1b55/volumes" Oct 07 13:49:29 crc kubenswrapper[4854]: I1007 13:49:29.080033 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ttzdl"] Oct 07 13:49:29 crc kubenswrapper[4854]: I1007 13:49:29.080541 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ttzdl" podUID="434856cf-3f9d-4116-bb83-a80ec60c3498" containerName="registry-server" 
containerID="cri-o://4e194ef24c528945917611974791549c53d634ab020f3000d76aba69fd8128aa" gracePeriod=2 Oct 07 13:49:29 crc kubenswrapper[4854]: I1007 13:49:29.806613 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Oct 07 13:49:29 crc kubenswrapper[4854]: I1007 13:49:29.839589 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_e188c677-83fc-47c3-b475-25da4a70b85d/mariadb-client/0.log" Oct 07 13:49:29 crc kubenswrapper[4854]: I1007 13:49:29.873247 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Oct 07 13:49:29 crc kubenswrapper[4854]: I1007 13:49:29.879669 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Oct 07 13:49:29 crc kubenswrapper[4854]: I1007 13:49:29.958444 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmksm\" (UniqueName: \"kubernetes.io/projected/e188c677-83fc-47c3-b475-25da4a70b85d-kube-api-access-bmksm\") pod \"e188c677-83fc-47c3-b475-25da4a70b85d\" (UID: \"e188c677-83fc-47c3-b475-25da4a70b85d\") " Oct 07 13:49:29 crc kubenswrapper[4854]: I1007 13:49:29.965000 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e188c677-83fc-47c3-b475-25da4a70b85d-kube-api-access-bmksm" (OuterVolumeSpecName: "kube-api-access-bmksm") pod "e188c677-83fc-47c3-b475-25da4a70b85d" (UID: "e188c677-83fc-47c3-b475-25da4a70b85d"). InnerVolumeSpecName "kube-api-access-bmksm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:49:30 crc kubenswrapper[4854]: I1007 13:49:30.060325 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmksm\" (UniqueName: \"kubernetes.io/projected/e188c677-83fc-47c3-b475-25da4a70b85d-kube-api-access-bmksm\") on node \"crc\" DevicePath \"\"" Oct 07 13:49:30 crc kubenswrapper[4854]: I1007 13:49:30.157087 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ttzdl" Oct 07 13:49:30 crc kubenswrapper[4854]: I1007 13:49:30.262619 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/434856cf-3f9d-4116-bb83-a80ec60c3498-catalog-content\") pod \"434856cf-3f9d-4116-bb83-a80ec60c3498\" (UID: \"434856cf-3f9d-4116-bb83-a80ec60c3498\") " Oct 07 13:49:30 crc kubenswrapper[4854]: I1007 13:49:30.263093 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/434856cf-3f9d-4116-bb83-a80ec60c3498-utilities\") pod \"434856cf-3f9d-4116-bb83-a80ec60c3498\" (UID: \"434856cf-3f9d-4116-bb83-a80ec60c3498\") " Oct 07 13:49:30 crc kubenswrapper[4854]: I1007 13:49:30.263366 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppgg5\" (UniqueName: \"kubernetes.io/projected/434856cf-3f9d-4116-bb83-a80ec60c3498-kube-api-access-ppgg5\") pod \"434856cf-3f9d-4116-bb83-a80ec60c3498\" (UID: \"434856cf-3f9d-4116-bb83-a80ec60c3498\") " Oct 07 13:49:30 crc kubenswrapper[4854]: I1007 13:49:30.264320 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/434856cf-3f9d-4116-bb83-a80ec60c3498-utilities" (OuterVolumeSpecName: "utilities") pod "434856cf-3f9d-4116-bb83-a80ec60c3498" (UID: "434856cf-3f9d-4116-bb83-a80ec60c3498"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:49:30 crc kubenswrapper[4854]: I1007 13:49:30.264784 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/434856cf-3f9d-4116-bb83-a80ec60c3498-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:49:30 crc kubenswrapper[4854]: I1007 13:49:30.266787 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/434856cf-3f9d-4116-bb83-a80ec60c3498-kube-api-access-ppgg5" (OuterVolumeSpecName: "kube-api-access-ppgg5") pod "434856cf-3f9d-4116-bb83-a80ec60c3498" (UID: "434856cf-3f9d-4116-bb83-a80ec60c3498"). InnerVolumeSpecName "kube-api-access-ppgg5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:49:30 crc kubenswrapper[4854]: I1007 13:49:30.359669 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/434856cf-3f9d-4116-bb83-a80ec60c3498-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "434856cf-3f9d-4116-bb83-a80ec60c3498" (UID: "434856cf-3f9d-4116-bb83-a80ec60c3498"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:49:30 crc kubenswrapper[4854]: I1007 13:49:30.368553 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppgg5\" (UniqueName: \"kubernetes.io/projected/434856cf-3f9d-4116-bb83-a80ec60c3498-kube-api-access-ppgg5\") on node \"crc\" DevicePath \"\"" Oct 07 13:49:30 crc kubenswrapper[4854]: I1007 13:49:30.368610 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/434856cf-3f9d-4116-bb83-a80ec60c3498-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:49:30 crc kubenswrapper[4854]: I1007 13:49:30.383590 4854 generic.go:334] "Generic (PLEG): container finished" podID="434856cf-3f9d-4116-bb83-a80ec60c3498" containerID="4e194ef24c528945917611974791549c53d634ab020f3000d76aba69fd8128aa" exitCode=0 Oct 07 13:49:30 crc kubenswrapper[4854]: I1007 13:49:30.383727 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ttzdl" event={"ID":"434856cf-3f9d-4116-bb83-a80ec60c3498","Type":"ContainerDied","Data":"4e194ef24c528945917611974791549c53d634ab020f3000d76aba69fd8128aa"} Oct 07 13:49:30 crc kubenswrapper[4854]: I1007 13:49:30.384258 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ttzdl" event={"ID":"434856cf-3f9d-4116-bb83-a80ec60c3498","Type":"ContainerDied","Data":"3b8eabaaa60205b00e14b5c1510af0f94997ff454f4fbb68c16f0ba890b95e7b"} Oct 07 13:49:30 crc kubenswrapper[4854]: I1007 13:49:30.383840 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ttzdl" Oct 07 13:49:30 crc kubenswrapper[4854]: I1007 13:49:30.384459 4854 scope.go:117] "RemoveContainer" containerID="4e194ef24c528945917611974791549c53d634ab020f3000d76aba69fd8128aa" Oct 07 13:49:30 crc kubenswrapper[4854]: I1007 13:49:30.390113 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f35f8434c24598908429a847873a5a92225aecb8489c99eb5dc9f104afd70d8" Oct 07 13:49:30 crc kubenswrapper[4854]: I1007 13:49:30.390205 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Oct 07 13:49:30 crc kubenswrapper[4854]: I1007 13:49:30.417126 4854 scope.go:117] "RemoveContainer" containerID="1751e5586b97bbb83fed659eab40de026174a15c136108d56b1f0c7df5ddc139" Oct 07 13:49:30 crc kubenswrapper[4854]: I1007 13:49:30.443055 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ttzdl"] Oct 07 13:49:30 crc kubenswrapper[4854]: I1007 13:49:30.449602 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ttzdl"] Oct 07 13:49:30 crc kubenswrapper[4854]: I1007 13:49:30.464388 4854 scope.go:117] "RemoveContainer" containerID="db11b31687c61c180611e04056b0dc6a40c5a86779e8365bcd1c9d7d89e339f8" Oct 07 13:49:30 crc kubenswrapper[4854]: I1007 13:49:30.492573 4854 scope.go:117] "RemoveContainer" containerID="4e194ef24c528945917611974791549c53d634ab020f3000d76aba69fd8128aa" Oct 07 13:49:30 crc kubenswrapper[4854]: E1007 13:49:30.492976 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e194ef24c528945917611974791549c53d634ab020f3000d76aba69fd8128aa\": container with ID starting with 4e194ef24c528945917611974791549c53d634ab020f3000d76aba69fd8128aa not found: ID does not exist" containerID="4e194ef24c528945917611974791549c53d634ab020f3000d76aba69fd8128aa" Oct 07 13:49:30 crc kubenswrapper[4854]: I1007 13:49:30.493009 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e194ef24c528945917611974791549c53d634ab020f3000d76aba69fd8128aa"} err="failed to get container status \"4e194ef24c528945917611974791549c53d634ab020f3000d76aba69fd8128aa\": rpc error: code = NotFound desc = could not find container \"4e194ef24c528945917611974791549c53d634ab020f3000d76aba69fd8128aa\": container with ID starting with 4e194ef24c528945917611974791549c53d634ab020f3000d76aba69fd8128aa not found: ID does not exist" Oct 07 13:49:30 crc kubenswrapper[4854]: I1007 13:49:30.493034 4854 scope.go:117] "RemoveContainer" containerID="1751e5586b97bbb83fed659eab40de026174a15c136108d56b1f0c7df5ddc139" Oct 07 13:49:30 crc kubenswrapper[4854]: E1007 13:49:30.493328 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1751e5586b97bbb83fed659eab40de026174a15c136108d56b1f0c7df5ddc139\": container with ID starting with 1751e5586b97bbb83fed659eab40de026174a15c136108d56b1f0c7df5ddc139 not found: ID does not exist" containerID="1751e5586b97bbb83fed659eab40de026174a15c136108d56b1f0c7df5ddc139" Oct 07 13:49:30 crc kubenswrapper[4854]: I1007 13:49:30.493349 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1751e5586b97bbb83fed659eab40de026174a15c136108d56b1f0c7df5ddc139"} err="failed to get container status \"1751e5586b97bbb83fed659eab40de026174a15c136108d56b1f0c7df5ddc139\": rpc error: code = NotFound desc = could not find container \"1751e5586b97bbb83fed659eab40de026174a15c136108d56b1f0c7df5ddc139\": container with ID starting with 1751e5586b97bbb83fed659eab40de026174a15c136108d56b1f0c7df5ddc139 not found: ID does not exist" Oct 07 13:49:30 crc kubenswrapper[4854]: I1007 13:49:30.493367 4854 scope.go:117] "RemoveContainer" containerID="db11b31687c61c180611e04056b0dc6a40c5a86779e8365bcd1c9d7d89e339f8" Oct 07 13:49:30 crc kubenswrapper[4854]: E1007 13:49:30.493701 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"db11b31687c61c180611e04056b0dc6a40c5a86779e8365bcd1c9d7d89e339f8\": container with ID starting with db11b31687c61c180611e04056b0dc6a40c5a86779e8365bcd1c9d7d89e339f8 not found: ID does not exist" containerID="db11b31687c61c180611e04056b0dc6a40c5a86779e8365bcd1c9d7d89e339f8" Oct 07 13:49:30 crc kubenswrapper[4854]: I1007 13:49:30.493758 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db11b31687c61c180611e04056b0dc6a40c5a86779e8365bcd1c9d7d89e339f8"} err="failed to get container status \"db11b31687c61c180611e04056b0dc6a40c5a86779e8365bcd1c9d7d89e339f8\": rpc error: code = NotFound desc = could not find container \"db11b31687c61c180611e04056b0dc6a40c5a86779e8365bcd1c9d7d89e339f8\": container with ID starting with db11b31687c61c180611e04056b0dc6a40c5a86779e8365bcd1c9d7d89e339f8 not found: ID does not exist" Oct 07 13:49:30 crc kubenswrapper[4854]: I1007 13:49:30.721819 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="434856cf-3f9d-4116-bb83-a80ec60c3498" path="/var/lib/kubelet/pods/434856cf-3f9d-4116-bb83-a80ec60c3498/volumes" Oct 07 13:49:30 crc kubenswrapper[4854]: I1007 13:49:30.723379 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e188c677-83fc-47c3-b475-25da4a70b85d" path="/var/lib/kubelet/pods/e188c677-83fc-47c3-b475-25da4a70b85d/volumes" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.020025 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 07 13:50:07 crc kubenswrapper[4854]: E1007 13:50:07.021548 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="434856cf-3f9d-4116-bb83-a80ec60c3498" containerName="registry-server" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.021587 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="434856cf-3f9d-4116-bb83-a80ec60c3498" containerName="registry-server" Oct 07 13:50:07 crc kubenswrapper[4854]: E1007 13:50:07.021611 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e188c677-83fc-47c3-b475-25da4a70b85d" containerName="mariadb-client" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.021629 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="e188c677-83fc-47c3-b475-25da4a70b85d" containerName="mariadb-client" Oct 07 13:50:07 crc kubenswrapper[4854]: E1007 13:50:07.021702 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="434856cf-3f9d-4116-bb83-a80ec60c3498" containerName="extract-utilities" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.021721 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="434856cf-3f9d-4116-bb83-a80ec60c3498" containerName="extract-utilities" Oct 07 13:50:07 crc kubenswrapper[4854]: E1007 13:50:07.021764 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="434856cf-3f9d-4116-bb83-a80ec60c3498" containerName="extract-content" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.021781 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="434856cf-3f9d-4116-bb83-a80ec60c3498" containerName="extract-content" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.022187 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="e188c677-83fc-47c3-b475-25da4a70b85d" containerName="mariadb-client" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.022245 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="434856cf-3f9d-4116-bb83-a80ec60c3498" containerName="registry-server" 
Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.024116 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.035582 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-1"] Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.038212 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.038418 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.039000 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.041578 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-cnc68" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.053427 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-2"] Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.055675 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.063733 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.074141 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.082826 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.139065 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-f37693ac-831d-4995-a079-b07fb2ce03da\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f37693ac-831d-4995-a079-b07fb2ce03da\") pod \"ovsdbserver-nb-1\" (UID: \"1eccb89a-97d3-4621-9472-734665cd23c8\") " pod="openstack/ovsdbserver-nb-1" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.139130 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1eccb89a-97d3-4621-9472-734665cd23c8-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"1eccb89a-97d3-4621-9472-734665cd23c8\") " pod="openstack/ovsdbserver-nb-1" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.139202 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1eccb89a-97d3-4621-9472-734665cd23c8-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"1eccb89a-97d3-4621-9472-734665cd23c8\") " pod="openstack/ovsdbserver-nb-1" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.139240 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d940085c-87b0-4156-89ee-5ec89b1a7168-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"d940085c-87b0-4156-89ee-5ec89b1a7168\") " pod="openstack/ovsdbserver-nb-2" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.139273 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pvc-13df043b-70cd-4278-8213-a74c20209b43\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-13df043b-70cd-4278-8213-a74c20209b43\") pod \"ovsdbserver-nb-2\" (UID: \"d940085c-87b0-4156-89ee-5ec89b1a7168\") " pod="openstack/ovsdbserver-nb-2" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.139318 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5t85\" (UniqueName: \"kubernetes.io/projected/1eccb89a-97d3-4621-9472-734665cd23c8-kube-api-access-p5t85\") pod \"ovsdbserver-nb-1\" (UID: \"1eccb89a-97d3-4621-9472-734665cd23c8\") " pod="openstack/ovsdbserver-nb-1" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.139370 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a9504f6-d3a8-4e28-b54e-d1c4446d39aa-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"6a9504f6-d3a8-4e28-b54e-d1c4446d39aa\") " pod="openstack/ovsdbserver-nb-0" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.139419 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1eccb89a-97d3-4621-9472-734665cd23c8-config\") pod \"ovsdbserver-nb-1\" (UID: \"1eccb89a-97d3-4621-9472-734665cd23c8\") " pod="openstack/ovsdbserver-nb-1" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.139453 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hn4mm\" (UniqueName: \"kubernetes.io/projected/6a9504f6-d3a8-4e28-b54e-d1c4446d39aa-kube-api-access-hn4mm\") pod \"ovsdbserver-nb-0\" (UID: \"6a9504f6-d3a8-4e28-b54e-d1c4446d39aa\") " pod="openstack/ovsdbserver-nb-0" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.139502 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d940085c-87b0-4156-89ee-5ec89b1a7168-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"d940085c-87b0-4156-89ee-5ec89b1a7168\") " pod="openstack/ovsdbserver-nb-2" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.139532 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a9504f6-d3a8-4e28-b54e-d1c4446d39aa-config\") pod \"ovsdbserver-nb-0\" (UID: \"6a9504f6-d3a8-4e28-b54e-d1c4446d39aa\") " pod="openstack/ovsdbserver-nb-0" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.139563 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6a9504f6-d3a8-4e28-b54e-d1c4446d39aa-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"6a9504f6-d3a8-4e28-b54e-d1c4446d39aa\") " pod="openstack/ovsdbserver-nb-0" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.139601 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d940085c-87b0-4156-89ee-5ec89b1a7168-config\") pod \"ovsdbserver-nb-2\" (UID: \"d940085c-87b0-4156-89ee-5ec89b1a7168\") " pod="openstack/ovsdbserver-nb-2" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.139651 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/1eccb89a-97d3-4621-9472-734665cd23c8-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"1eccb89a-97d3-4621-9472-734665cd23c8\") " pod="openstack/ovsdbserver-nb-1" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.139700 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6a9504f6-d3a8-4e28-b54e-d1c4446d39aa-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"6a9504f6-d3a8-4e28-b54e-d1c4446d39aa\") " pod="openstack/ovsdbserver-nb-0" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.139731 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d940085c-87b0-4156-89ee-5ec89b1a7168-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"d940085c-87b0-4156-89ee-5ec89b1a7168\") " pod="openstack/ovsdbserver-nb-2" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.139761 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wppl7\" (UniqueName: \"kubernetes.io/projected/d940085c-87b0-4156-89ee-5ec89b1a7168-kube-api-access-wppl7\") pod \"ovsdbserver-nb-2\" (UID: \"d940085c-87b0-4156-89ee-5ec89b1a7168\") " pod="openstack/ovsdbserver-nb-2" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.140072 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-2ac99ab3-2724-4606-ac4e-d246435b69f6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2ac99ab3-2724-4606-ac4e-d246435b69f6\") pod \"ovsdbserver-nb-0\" (UID: \"6a9504f6-d3a8-4e28-b54e-d1c4446d39aa\") " pod="openstack/ovsdbserver-nb-0" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.198041 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.199973 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.201910 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-c9jvs" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.202471 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.202729 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.216109 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.227966 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-1"] Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.229546 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-1" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.241590 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1eccb89a-97d3-4621-9472-734665cd23c8-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"1eccb89a-97d3-4621-9472-734665cd23c8\") " pod="openstack/ovsdbserver-nb-1" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.241642 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c8eb4bfe-e9a2-412b-a166-431476bbcc10-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"c8eb4bfe-e9a2-412b-a166-431476bbcc10\") " pod="openstack/ovsdbserver-sb-0" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.241679 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1eccb89a-97d3-4621-9472-734665cd23c8-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"1eccb89a-97d3-4621-9472-734665cd23c8\") " pod="openstack/ovsdbserver-nb-1" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.241707 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d940085c-87b0-4156-89ee-5ec89b1a7168-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"d940085c-87b0-4156-89ee-5ec89b1a7168\") " pod="openstack/ovsdbserver-nb-2" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.241731 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ff152d1c-5d44-4eb1-a001-c8e55f42cb89\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ff152d1c-5d44-4eb1-a001-c8e55f42cb89\") pod \"ovsdbserver-sb-0\" (UID: \"c8eb4bfe-e9a2-412b-a166-431476bbcc10\") " pod="openstack/ovsdbserver-sb-0" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.241757 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-13df043b-70cd-4278-8213-a74c20209b43\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-13df043b-70cd-4278-8213-a74c20209b43\") pod \"ovsdbserver-nb-2\" (UID: \"d940085c-87b0-4156-89ee-5ec89b1a7168\") " pod="openstack/ovsdbserver-nb-2" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.241794 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5t85\" (UniqueName: \"kubernetes.io/projected/1eccb89a-97d3-4621-9472-734665cd23c8-kube-api-access-p5t85\") pod \"ovsdbserver-nb-1\" (UID: \"1eccb89a-97d3-4621-9472-734665cd23c8\") " pod="openstack/ovsdbserver-nb-1" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.241820 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8eb4bfe-e9a2-412b-a166-431476bbcc10-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"c8eb4bfe-e9a2-412b-a166-431476bbcc10\") " pod="openstack/ovsdbserver-sb-0" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.241853 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a9504f6-d3a8-4e28-b54e-d1c4446d39aa-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"6a9504f6-d3a8-4e28-b54e-d1c4446d39aa\") " pod="openstack/ovsdbserver-nb-0" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.241872 
4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c8eb4bfe-e9a2-412b-a166-431476bbcc10-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"c8eb4bfe-e9a2-412b-a166-431476bbcc10\") " pod="openstack/ovsdbserver-sb-0" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.241901 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlgc8\" (UniqueName: \"kubernetes.io/projected/c8eb4bfe-e9a2-412b-a166-431476bbcc10-kube-api-access-nlgc8\") pod \"ovsdbserver-sb-0\" (UID: \"c8eb4bfe-e9a2-412b-a166-431476bbcc10\") " pod="openstack/ovsdbserver-sb-0" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.241934 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1eccb89a-97d3-4621-9472-734665cd23c8-config\") pod \"ovsdbserver-nb-1\" (UID: \"1eccb89a-97d3-4621-9472-734665cd23c8\") " pod="openstack/ovsdbserver-nb-1" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.241956 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hn4mm\" (UniqueName: \"kubernetes.io/projected/6a9504f6-d3a8-4e28-b54e-d1c4446d39aa-kube-api-access-hn4mm\") pod \"ovsdbserver-nb-0\" (UID: \"6a9504f6-d3a8-4e28-b54e-d1c4446d39aa\") " pod="openstack/ovsdbserver-nb-0" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.241992 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d940085c-87b0-4156-89ee-5ec89b1a7168-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"d940085c-87b0-4156-89ee-5ec89b1a7168\") " pod="openstack/ovsdbserver-nb-2" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.242012 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a9504f6-d3a8-4e28-b54e-d1c4446d39aa-config\") pod \"ovsdbserver-nb-0\" (UID: \"6a9504f6-d3a8-4e28-b54e-d1c4446d39aa\") " pod="openstack/ovsdbserver-nb-0" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.242034 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8eb4bfe-e9a2-412b-a166-431476bbcc10-config\") pod \"ovsdbserver-sb-0\" (UID: \"c8eb4bfe-e9a2-412b-a166-431476bbcc10\") " pod="openstack/ovsdbserver-sb-0" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.242057 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6a9504f6-d3a8-4e28-b54e-d1c4446d39aa-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"6a9504f6-d3a8-4e28-b54e-d1c4446d39aa\") " pod="openstack/ovsdbserver-nb-0" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.242080 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d940085c-87b0-4156-89ee-5ec89b1a7168-config\") pod \"ovsdbserver-nb-2\" (UID: \"d940085c-87b0-4156-89ee-5ec89b1a7168\") " pod="openstack/ovsdbserver-nb-2" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.242116 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1eccb89a-97d3-4621-9472-734665cd23c8-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: 
\"1eccb89a-97d3-4621-9472-734665cd23c8\") " pod="openstack/ovsdbserver-nb-1" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.242172 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6a9504f6-d3a8-4e28-b54e-d1c4446d39aa-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"6a9504f6-d3a8-4e28-b54e-d1c4446d39aa\") " pod="openstack/ovsdbserver-nb-0" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.242200 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d940085c-87b0-4156-89ee-5ec89b1a7168-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"d940085c-87b0-4156-89ee-5ec89b1a7168\") " pod="openstack/ovsdbserver-nb-2" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.242219 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wppl7\" (UniqueName: \"kubernetes.io/projected/d940085c-87b0-4156-89ee-5ec89b1a7168-kube-api-access-wppl7\") pod \"ovsdbserver-nb-2\" (UID: \"d940085c-87b0-4156-89ee-5ec89b1a7168\") " pod="openstack/ovsdbserver-nb-2" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.242251 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-2ac99ab3-2724-4606-ac4e-d246435b69f6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2ac99ab3-2724-4606-ac4e-d246435b69f6\") pod \"ovsdbserver-nb-0\" (UID: \"6a9504f6-d3a8-4e28-b54e-d1c4446d39aa\") " pod="openstack/ovsdbserver-nb-0" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.242283 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-f37693ac-831d-4995-a079-b07fb2ce03da\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f37693ac-831d-4995-a079-b07fb2ce03da\") pod \"ovsdbserver-nb-1\" (UID: \"1eccb89a-97d3-4621-9472-734665cd23c8\") " pod="openstack/ovsdbserver-nb-1" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.243922 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1eccb89a-97d3-4621-9472-734665cd23c8-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"1eccb89a-97d3-4621-9472-734665cd23c8\") " pod="openstack/ovsdbserver-nb-1" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.244896 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-2"] Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.245460 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d940085c-87b0-4156-89ee-5ec89b1a7168-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"d940085c-87b0-4156-89ee-5ec89b1a7168\") " pod="openstack/ovsdbserver-nb-2" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.246371 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1eccb89a-97d3-4621-9472-734665cd23c8-config\") pod \"ovsdbserver-nb-1\" (UID: \"1eccb89a-97d3-4621-9472-734665cd23c8\") " pod="openstack/ovsdbserver-nb-1" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.246632 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-2" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.246868 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6a9504f6-d3a8-4e28-b54e-d1c4446d39aa-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"6a9504f6-d3a8-4e28-b54e-d1c4446d39aa\") " pod="openstack/ovsdbserver-nb-0" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.246983 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6a9504f6-d3a8-4e28-b54e-d1c4446d39aa-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"6a9504f6-d3a8-4e28-b54e-d1c4446d39aa\") " pod="openstack/ovsdbserver-nb-0" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.247244 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1eccb89a-97d3-4621-9472-734665cd23c8-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"1eccb89a-97d3-4621-9472-734665cd23c8\") " pod="openstack/ovsdbserver-nb-1" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.247773 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d940085c-87b0-4156-89ee-5ec89b1a7168-config\") pod \"ovsdbserver-nb-2\" (UID: \"d940085c-87b0-4156-89ee-5ec89b1a7168\") " pod="openstack/ovsdbserver-nb-2" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.248299 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d940085c-87b0-4156-89ee-5ec89b1a7168-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"d940085c-87b0-4156-89ee-5ec89b1a7168\") " pod="openstack/ovsdbserver-nb-2" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.248308 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a9504f6-d3a8-4e28-b54e-d1c4446d39aa-config\") pod \"ovsdbserver-nb-0\" (UID: \"6a9504f6-d3a8-4e28-b54e-d1c4446d39aa\") " pod="openstack/ovsdbserver-nb-0" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.256012 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a9504f6-d3a8-4e28-b54e-d1c4446d39aa-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"6a9504f6-d3a8-4e28-b54e-d1c4446d39aa\") " pod="openstack/ovsdbserver-nb-0" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.256070 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1eccb89a-97d3-4621-9472-734665cd23c8-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"1eccb89a-97d3-4621-9472-734665cd23c8\") " pod="openstack/ovsdbserver-nb-1" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.256429 4854 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.256522 4854 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-13df043b-70cd-4278-8213-a74c20209b43\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-13df043b-70cd-4278-8213-a74c20209b43\") pod \"ovsdbserver-nb-2\" (UID: \"d940085c-87b0-4156-89ee-5ec89b1a7168\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/91093cf5d46a00e4dd46498ee21b065778be1237d045c2cab07a1aaf8c3b9a8c/globalmount\"" pod="openstack/ovsdbserver-nb-2" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.256756 4854 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.256800 4854 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-2ac99ab3-2724-4606-ac4e-d246435b69f6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2ac99ab3-2724-4606-ac4e-d246435b69f6\") pod \"ovsdbserver-nb-0\" (UID: \"6a9504f6-d3a8-4e28-b54e-d1c4446d39aa\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/12d90591985b9e984f6b1722f02fbe0b8c8453cbc7f6abd6d5ff983f27353e77/globalmount\"" pod="openstack/ovsdbserver-nb-0" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.256999 4854 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.257056 4854 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-f37693ac-831d-4995-a079-b07fb2ce03da\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f37693ac-831d-4995-a079-b07fb2ce03da\") pod \"ovsdbserver-nb-1\" (UID: \"1eccb89a-97d3-4621-9472-734665cd23c8\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9d1cb713279de40903c22a7ce87ffbdb31afa3d4d7d14cb0cc2f067fa6458495/globalmount\"" pod="openstack/ovsdbserver-nb-1" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.261903 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.273841 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d940085c-87b0-4156-89ee-5ec89b1a7168-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"d940085c-87b0-4156-89ee-5ec89b1a7168\") " pod="openstack/ovsdbserver-nb-2" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.276671 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.279100 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wppl7\" (UniqueName: \"kubernetes.io/projected/d940085c-87b0-4156-89ee-5ec89b1a7168-kube-api-access-wppl7\") pod \"ovsdbserver-nb-2\" (UID: \"d940085c-87b0-4156-89ee-5ec89b1a7168\") " pod="openstack/ovsdbserver-nb-2" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.280486 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hn4mm\" (UniqueName: \"kubernetes.io/projected/6a9504f6-d3a8-4e28-b54e-d1c4446d39aa-kube-api-access-hn4mm\") pod \"ovsdbserver-nb-0\" (UID: \"6a9504f6-d3a8-4e28-b54e-d1c4446d39aa\") " pod="openstack/ovsdbserver-nb-0" Oct 07 13:50:07 crc 
kubenswrapper[4854]: I1007 13:50:07.294638 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5t85\" (UniqueName: \"kubernetes.io/projected/1eccb89a-97d3-4621-9472-734665cd23c8-kube-api-access-p5t85\") pod \"ovsdbserver-nb-1\" (UID: \"1eccb89a-97d3-4621-9472-734665cd23c8\") " pod="openstack/ovsdbserver-nb-1" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.316042 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-2ac99ab3-2724-4606-ac4e-d246435b69f6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2ac99ab3-2724-4606-ac4e-d246435b69f6\") pod \"ovsdbserver-nb-0\" (UID: \"6a9504f6-d3a8-4e28-b54e-d1c4446d39aa\") " pod="openstack/ovsdbserver-nb-0" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.319308 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-13df043b-70cd-4278-8213-a74c20209b43\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-13df043b-70cd-4278-8213-a74c20209b43\") pod \"ovsdbserver-nb-2\" (UID: \"d940085c-87b0-4156-89ee-5ec89b1a7168\") " pod="openstack/ovsdbserver-nb-2" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.320533 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-f37693ac-831d-4995-a079-b07fb2ce03da\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f37693ac-831d-4995-a079-b07fb2ce03da\") pod \"ovsdbserver-nb-1\" (UID: \"1eccb89a-97d3-4621-9472-734665cd23c8\") " pod="openstack/ovsdbserver-nb-1" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.343360 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-46d31d17-a38f-4177-b1f0-8527a0d18345\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-46d31d17-a38f-4177-b1f0-8527a0d18345\") pod \"ovsdbserver-sb-2\" (UID: \"02e41274-64df-46d9-bec2-4645006646c3\") " pod="openstack/ovsdbserver-sb-2" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.343485 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a35c4fc5-bf24-44b2-be7a-0da329e40adc-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"a35c4fc5-bf24-44b2-be7a-0da329e40adc\") " pod="openstack/ovsdbserver-sb-1" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.343593 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c8eb4bfe-e9a2-412b-a166-431476bbcc10-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"c8eb4bfe-e9a2-412b-a166-431476bbcc10\") " pod="openstack/ovsdbserver-sb-0" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.343675 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/02e41274-64df-46d9-bec2-4645006646c3-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"02e41274-64df-46d9-bec2-4645006646c3\") " pod="openstack/ovsdbserver-sb-2" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.343844 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fss2x\" (UniqueName: \"kubernetes.io/projected/02e41274-64df-46d9-bec2-4645006646c3-kube-api-access-fss2x\") pod \"ovsdbserver-sb-2\" (UID: \"02e41274-64df-46d9-bec2-4645006646c3\") " pod="openstack/ovsdbserver-sb-2" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.343941 4854 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ff152d1c-5d44-4eb1-a001-c8e55f42cb89\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ff152d1c-5d44-4eb1-a001-c8e55f42cb89\") pod \"ovsdbserver-sb-0\" (UID: \"c8eb4bfe-e9a2-412b-a166-431476bbcc10\") " pod="openstack/ovsdbserver-sb-0" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.344027 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a35c4fc5-bf24-44b2-be7a-0da329e40adc-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"a35c4fc5-bf24-44b2-be7a-0da329e40adc\") " pod="openstack/ovsdbserver-sb-1" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.344170 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/02e41274-64df-46d9-bec2-4645006646c3-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"02e41274-64df-46d9-bec2-4645006646c3\") " pod="openstack/ovsdbserver-sb-2" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.344242 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-addfddbe-5c49-481e-85d9-e43b6e42b337\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-addfddbe-5c49-481e-85d9-e43b6e42b337\") pod \"ovsdbserver-sb-1\" (UID: \"a35c4fc5-bf24-44b2-be7a-0da329e40adc\") " pod="openstack/ovsdbserver-sb-1" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.344341 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8eb4bfe-e9a2-412b-a166-431476bbcc10-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"c8eb4bfe-e9a2-412b-a166-431476bbcc10\") " pod="openstack/ovsdbserver-sb-0" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.344452 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c8eb4bfe-e9a2-412b-a166-431476bbcc10-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"c8eb4bfe-e9a2-412b-a166-431476bbcc10\") " pod="openstack/ovsdbserver-sb-0" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.344531 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlgc8\" (UniqueName: \"kubernetes.io/projected/c8eb4bfe-e9a2-412b-a166-431476bbcc10-kube-api-access-nlgc8\") pod \"ovsdbserver-sb-0\" (UID: \"c8eb4bfe-e9a2-412b-a166-431476bbcc10\") " pod="openstack/ovsdbserver-sb-0" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.344604 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02e41274-64df-46d9-bec2-4645006646c3-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"02e41274-64df-46d9-bec2-4645006646c3\") " pod="openstack/ovsdbserver-sb-2" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.344767 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkg4j\" (UniqueName: \"kubernetes.io/projected/a35c4fc5-bf24-44b2-be7a-0da329e40adc-kube-api-access-lkg4j\") pod \"ovsdbserver-sb-1\" (UID: \"a35c4fc5-bf24-44b2-be7a-0da329e40adc\") " pod="openstack/ovsdbserver-sb-1" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.344831 4854 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a35c4fc5-bf24-44b2-be7a-0da329e40adc-config\") pod \"ovsdbserver-sb-1\" (UID: \"a35c4fc5-bf24-44b2-be7a-0da329e40adc\") " pod="openstack/ovsdbserver-sb-1" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.345010 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c8eb4bfe-e9a2-412b-a166-431476bbcc10-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"c8eb4bfe-e9a2-412b-a166-431476bbcc10\") " pod="openstack/ovsdbserver-sb-0" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.345561 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8eb4bfe-e9a2-412b-a166-431476bbcc10-config\") pod \"ovsdbserver-sb-0\" (UID: \"c8eb4bfe-e9a2-412b-a166-431476bbcc10\") " pod="openstack/ovsdbserver-sb-0" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.345679 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8eb4bfe-e9a2-412b-a166-431476bbcc10-config\") pod \"ovsdbserver-sb-0\" (UID: \"c8eb4bfe-e9a2-412b-a166-431476bbcc10\") " pod="openstack/ovsdbserver-sb-0" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.345728 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a35c4fc5-bf24-44b2-be7a-0da329e40adc-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"a35c4fc5-bf24-44b2-be7a-0da329e40adc\") " pod="openstack/ovsdbserver-sb-1" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.345737 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c8eb4bfe-e9a2-412b-a166-431476bbcc10-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"c8eb4bfe-e9a2-412b-a166-431476bbcc10\") " pod="openstack/ovsdbserver-sb-0" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.345756 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02e41274-64df-46d9-bec2-4645006646c3-config\") pod \"ovsdbserver-sb-2\" (UID: \"02e41274-64df-46d9-bec2-4645006646c3\") " pod="openstack/ovsdbserver-sb-2" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.348810 4854 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.348860 4854 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ff152d1c-5d44-4eb1-a001-c8e55f42cb89\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ff152d1c-5d44-4eb1-a001-c8e55f42cb89\") pod \"ovsdbserver-sb-0\" (UID: \"c8eb4bfe-e9a2-412b-a166-431476bbcc10\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/0664db0102cbe8a8d4e935c8404f9daaa2d4df6d3756d21febf85d40fdf986e7/globalmount\"" pod="openstack/ovsdbserver-sb-0" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.354891 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8eb4bfe-e9a2-412b-a166-431476bbcc10-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"c8eb4bfe-e9a2-412b-a166-431476bbcc10\") " pod="openstack/ovsdbserver-sb-0" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.363076 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlgc8\" (UniqueName: \"kubernetes.io/projected/c8eb4bfe-e9a2-412b-a166-431476bbcc10-kube-api-access-nlgc8\") pod \"ovsdbserver-sb-0\" (UID: \"c8eb4bfe-e9a2-412b-a166-431476bbcc10\") " pod="openstack/ovsdbserver-sb-0" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.377737 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.378777 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ff152d1c-5d44-4eb1-a001-c8e55f42cb89\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ff152d1c-5d44-4eb1-a001-c8e55f42cb89\") pod \"ovsdbserver-sb-0\" (UID: \"c8eb4bfe-e9a2-412b-a166-431476bbcc10\") " pod="openstack/ovsdbserver-sb-0" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.392746 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.405558 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-2" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.446353 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02e41274-64df-46d9-bec2-4645006646c3-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"02e41274-64df-46d9-bec2-4645006646c3\") " pod="openstack/ovsdbserver-sb-2" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.446647 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkg4j\" (UniqueName: \"kubernetes.io/projected/a35c4fc5-bf24-44b2-be7a-0da329e40adc-kube-api-access-lkg4j\") pod \"ovsdbserver-sb-1\" (UID: \"a35c4fc5-bf24-44b2-be7a-0da329e40adc\") " pod="openstack/ovsdbserver-sb-1" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.446674 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a35c4fc5-bf24-44b2-be7a-0da329e40adc-config\") pod \"ovsdbserver-sb-1\" (UID: \"a35c4fc5-bf24-44b2-be7a-0da329e40adc\") " pod="openstack/ovsdbserver-sb-1" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.446696 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a35c4fc5-bf24-44b2-be7a-0da329e40adc-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"a35c4fc5-bf24-44b2-be7a-0da329e40adc\") " pod="openstack/ovsdbserver-sb-1" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.446715 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02e41274-64df-46d9-bec2-4645006646c3-config\") pod \"ovsdbserver-sb-2\" (UID: \"02e41274-64df-46d9-bec2-4645006646c3\") " pod="openstack/ovsdbserver-sb-2" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.446756 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-46d31d17-a38f-4177-b1f0-8527a0d18345\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-46d31d17-a38f-4177-b1f0-8527a0d18345\") pod \"ovsdbserver-sb-2\" (UID: \"02e41274-64df-46d9-bec2-4645006646c3\") " pod="openstack/ovsdbserver-sb-2" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.446775 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a35c4fc5-bf24-44b2-be7a-0da329e40adc-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"a35c4fc5-bf24-44b2-be7a-0da329e40adc\") " pod="openstack/ovsdbserver-sb-1" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.446804 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/02e41274-64df-46d9-bec2-4645006646c3-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"02e41274-64df-46d9-bec2-4645006646c3\") " pod="openstack/ovsdbserver-sb-2" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.446825 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fss2x\" (UniqueName: \"kubernetes.io/projected/02e41274-64df-46d9-bec2-4645006646c3-kube-api-access-fss2x\") pod \"ovsdbserver-sb-2\" (UID: \"02e41274-64df-46d9-bec2-4645006646c3\") " pod="openstack/ovsdbserver-sb-2" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.446848 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/a35c4fc5-bf24-44b2-be7a-0da329e40adc-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"a35c4fc5-bf24-44b2-be7a-0da329e40adc\") " pod="openstack/ovsdbserver-sb-1" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.446863 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/02e41274-64df-46d9-bec2-4645006646c3-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"02e41274-64df-46d9-bec2-4645006646c3\") " pod="openstack/ovsdbserver-sb-2" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.446885 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-addfddbe-5c49-481e-85d9-e43b6e42b337\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-addfddbe-5c49-481e-85d9-e43b6e42b337\") pod \"ovsdbserver-sb-1\" (UID: \"a35c4fc5-bf24-44b2-be7a-0da329e40adc\") " pod="openstack/ovsdbserver-sb-1" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.447960 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a35c4fc5-bf24-44b2-be7a-0da329e40adc-config\") pod \"ovsdbserver-sb-1\" (UID: \"a35c4fc5-bf24-44b2-be7a-0da329e40adc\") " pod="openstack/ovsdbserver-sb-1" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.448795 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/02e41274-64df-46d9-bec2-4645006646c3-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"02e41274-64df-46d9-bec2-4645006646c3\") " pod="openstack/ovsdbserver-sb-2" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.448847 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/02e41274-64df-46d9-bec2-4645006646c3-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"02e41274-64df-46d9-bec2-4645006646c3\") " pod="openstack/ovsdbserver-sb-2" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.449765 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02e41274-64df-46d9-bec2-4645006646c3-config\") pod \"ovsdbserver-sb-2\" (UID: \"02e41274-64df-46d9-bec2-4645006646c3\") " pod="openstack/ovsdbserver-sb-2" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.450464 4854 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.450519 4854 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-46d31d17-a38f-4177-b1f0-8527a0d18345\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-46d31d17-a38f-4177-b1f0-8527a0d18345\") pod \"ovsdbserver-sb-2\" (UID: \"02e41274-64df-46d9-bec2-4645006646c3\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/fa19f3e081e200dff92985e0150f0e2d577568ef1771874a7527c5d76996fc16/globalmount\"" pod="openstack/ovsdbserver-sb-2" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.450530 4854 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.450554 4854 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-addfddbe-5c49-481e-85d9-e43b6e42b337\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-addfddbe-5c49-481e-85d9-e43b6e42b337\") pod \"ovsdbserver-sb-1\" (UID: \"a35c4fc5-bf24-44b2-be7a-0da329e40adc\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d29ceaec1199e62cf8a0f16e1f2804b34278237a081c74da572e50c754b12032/globalmount\"" pod="openstack/ovsdbserver-sb-1" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.451539 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a35c4fc5-bf24-44b2-be7a-0da329e40adc-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"a35c4fc5-bf24-44b2-be7a-0da329e40adc\") " pod="openstack/ovsdbserver-sb-1" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.451557 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a35c4fc5-bf24-44b2-be7a-0da329e40adc-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"a35c4fc5-bf24-44b2-be7a-0da329e40adc\") " pod="openstack/ovsdbserver-sb-1" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.459162 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a35c4fc5-bf24-44b2-be7a-0da329e40adc-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"a35c4fc5-bf24-44b2-be7a-0da329e40adc\") " pod="openstack/ovsdbserver-sb-1" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.475337 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02e41274-64df-46d9-bec2-4645006646c3-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"02e41274-64df-46d9-bec2-4645006646c3\") " pod="openstack/ovsdbserver-sb-2" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.478363 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fss2x\" (UniqueName: \"kubernetes.io/projected/02e41274-64df-46d9-bec2-4645006646c3-kube-api-access-fss2x\") pod \"ovsdbserver-sb-2\" (UID: \"02e41274-64df-46d9-bec2-4645006646c3\") " pod="openstack/ovsdbserver-sb-2" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.489119 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkg4j\" (UniqueName: \"kubernetes.io/projected/a35c4fc5-bf24-44b2-be7a-0da329e40adc-kube-api-access-lkg4j\") pod \"ovsdbserver-sb-1\" (UID: \"a35c4fc5-bf24-44b2-be7a-0da329e40adc\") " pod="openstack/ovsdbserver-sb-1" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.515294 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-46d31d17-a38f-4177-b1f0-8527a0d18345\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-46d31d17-a38f-4177-b1f0-8527a0d18345\") pod \"ovsdbserver-sb-2\" (UID: \"02e41274-64df-46d9-bec2-4645006646c3\") " pod="openstack/ovsdbserver-sb-2" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.523927 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.526523 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-addfddbe-5c49-481e-85d9-e43b6e42b337\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-addfddbe-5c49-481e-85d9-e43b6e42b337\") pod \"ovsdbserver-sb-1\" (UID: \"a35c4fc5-bf24-44b2-be7a-0da329e40adc\") " pod="openstack/ovsdbserver-sb-1" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.633666 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.639318 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2" Oct 07 13:50:07 crc kubenswrapper[4854]: I1007 13:50:07.914007 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 07 13:50:08 crc kubenswrapper[4854]: I1007 13:50:08.014440 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Oct 07 13:50:08 crc kubenswrapper[4854]: I1007 13:50:08.178982 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 07 13:50:08 crc kubenswrapper[4854]: W1007 13:50:08.208940 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8eb4bfe_e9a2_412b_a166_431476bbcc10.slice/crio-1795f183907f3a52c748cb22b1dec830ec5886a36c4764e3e532085ad3e0fbaf WatchSource:0}: Error finding container 1795f183907f3a52c748cb22b1dec830ec5886a36c4764e3e532085ad3e0fbaf: Status 404 returned error can't find the container with id 1795f183907f3a52c748cb22b1dec830ec5886a36c4764e3e532085ad3e0fbaf Oct 07 13:50:08 crc kubenswrapper[4854]: I1007 13:50:08.278400 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Oct 07 13:50:08 crc kubenswrapper[4854]: I1007 13:50:08.628773 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Oct 07 13:50:08 crc kubenswrapper[4854]: I1007 13:50:08.795366 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"d940085c-87b0-4156-89ee-5ec89b1a7168","Type":"ContainerStarted","Data":"cffbef026254bcd03c33c09c23f816d8cbfa5388b97677e75042fbaac5b19433"} Oct 07 13:50:08 crc kubenswrapper[4854]: I1007 13:50:08.795423 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"d940085c-87b0-4156-89ee-5ec89b1a7168","Type":"ContainerStarted","Data":"db3c995fe940dd6ef851747a85eda75c13eaf20c4896d25b2672e2f658c0e863"} Oct 07 13:50:08 crc kubenswrapper[4854]: I1007 13:50:08.795439 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"d940085c-87b0-4156-89ee-5ec89b1a7168","Type":"ContainerStarted","Data":"afe0de383262f1b38f256d7b7084ccb4fdeb0b652d986bc7f53d124b72f25222"} Oct 07 13:50:08 crc kubenswrapper[4854]: I1007 13:50:08.799779 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"6a9504f6-d3a8-4e28-b54e-d1c4446d39aa","Type":"ContainerStarted","Data":"cd24d0540a1207ad82482167496b45e675fb9cfa2a59c11388e80d43cc173952"} Oct 07 13:50:08 crc kubenswrapper[4854]: I1007 13:50:08.799832 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"6a9504f6-d3a8-4e28-b54e-d1c4446d39aa","Type":"ContainerStarted","Data":"514500e3cdf487c78f666f75b3fe5d6cf5b61d518499c219a307732442d8409a"} Oct 07 13:50:08 crc kubenswrapper[4854]: I1007 13:50:08.799848 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"6a9504f6-d3a8-4e28-b54e-d1c4446d39aa","Type":"ContainerStarted","Data":"4910b71b22d64b2b41ad009df1834c459d5ef453f3760f6c90a3284a45a3905c"} Oct 07 13:50:08 crc kubenswrapper[4854]: I1007 13:50:08.802707 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"02e41274-64df-46d9-bec2-4645006646c3","Type":"ContainerStarted","Data":"0d2555c998485833ca562e0ac51239d5a0d37a05ed7b5a7c179e74ce53031fc5"} Oct 07 13:50:08 crc kubenswrapper[4854]: I1007 13:50:08.802763 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"02e41274-64df-46d9-bec2-4645006646c3","Type":"ContainerStarted","Data":"41f174601891b433670c8beadf5460fa69543004a94cadffe08068255307c131"} Oct 07 13:50:08 crc kubenswrapper[4854]: I1007 13:50:08.802791 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"02e41274-64df-46d9-bec2-4645006646c3","Type":"ContainerStarted","Data":"777e335568d331a0e1dff8e61e1366be06d8ff4c7740e62432e3eaa74b63b8d2"} Oct 07 13:50:08 crc kubenswrapper[4854]: I1007 13:50:08.805994 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"c8eb4bfe-e9a2-412b-a166-431476bbcc10","Type":"ContainerStarted","Data":"1375396ae5a756ffbf115ad3388619bec17190010c8c9ae033ae6ee428e6b8ea"} Oct 07 13:50:08 crc kubenswrapper[4854]: I1007 13:50:08.806028 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"c8eb4bfe-e9a2-412b-a166-431476bbcc10","Type":"ContainerStarted","Data":"c19b9d5af587c0e832a88403cb31a80940590da19f1655534fc9290d46b19ee1"} Oct 07 13:50:08 crc kubenswrapper[4854]: I1007 13:50:08.806040 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"c8eb4bfe-e9a2-412b-a166-431476bbcc10","Type":"ContainerStarted","Data":"1795f183907f3a52c748cb22b1dec830ec5886a36c4764e3e532085ad3e0fbaf"} Oct 07 13:50:08 crc kubenswrapper[4854]: I1007 13:50:08.808744 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"1eccb89a-97d3-4621-9472-734665cd23c8","Type":"ContainerStarted","Data":"f77553c674242f86751f0992fb9b0c4c9e6e9f33fee4c359e1f0968a8572940f"} Oct 07 13:50:08 crc kubenswrapper[4854]: I1007 13:50:08.808774 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"1eccb89a-97d3-4621-9472-734665cd23c8","Type":"ContainerStarted","Data":"0c5f12b7f8be9350bcd62402bd69d1adddeba090098042baa6437e7abbb79c67"} Oct 07 13:50:08 crc kubenswrapper[4854]: I1007 13:50:08.827613 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-2" podStartSLOduration=2.827585756 podStartE2EDuration="2.827585756s" podCreationTimestamp="2025-10-07 13:50:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:50:08.815324636 +0000 UTC m=+5124.803156901" watchObservedRunningTime="2025-10-07 13:50:08.827585756 +0000 UTC m=+5124.815418031" Oct 07 13:50:08 crc kubenswrapper[4854]: I1007 13:50:08.837024 4854 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=2.836987885 podStartE2EDuration="2.836987885s" podCreationTimestamp="2025-10-07 13:50:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:50:08.831971432 +0000 UTC m=+5124.819803687" watchObservedRunningTime="2025-10-07 13:50:08.836987885 +0000 UTC m=+5124.824820140" Oct 07 13:50:08 crc kubenswrapper[4854]: I1007 13:50:08.855004 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=3.854980049 podStartE2EDuration="3.854980049s" podCreationTimestamp="2025-10-07 13:50:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:50:08.850128441 +0000 UTC m=+5124.837960706" watchObservedRunningTime="2025-10-07 13:50:08.854980049 +0000 UTC m=+5124.842812304" Oct 07 13:50:08 crc kubenswrapper[4854]: I1007 13:50:08.866054 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-2" podStartSLOduration=2.866036255 podStartE2EDuration="2.866036255s" podCreationTimestamp="2025-10-07 13:50:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:50:08.865606473 +0000 UTC m=+5124.853438748" watchObservedRunningTime="2025-10-07 13:50:08.866036255 +0000 UTC m=+5124.853868500" Oct 07 13:50:08 crc kubenswrapper[4854]: I1007 13:50:08.900665 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Oct 07 13:50:09 crc kubenswrapper[4854]: I1007 13:50:09.823470 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"1eccb89a-97d3-4621-9472-734665cd23c8","Type":"ContainerStarted","Data":"1e84a422d25360cf496bed0d6c77dc33e0a92a60652f642baa4b5a93f4fa9f92"} Oct 07 13:50:09 crc kubenswrapper[4854]: I1007 13:50:09.826830 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"a35c4fc5-bf24-44b2-be7a-0da329e40adc","Type":"ContainerStarted","Data":"a01d34a9cefb058a65d1ec1c9280a7548dd6027f858738e2a2e3011d3d54a259"} Oct 07 13:50:09 crc kubenswrapper[4854]: I1007 13:50:09.826898 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"a35c4fc5-bf24-44b2-be7a-0da329e40adc","Type":"ContainerStarted","Data":"73846aa2039b2849cad711c5f3c0e9a52cddebfab48ca72a9a942159e7bbc57a"} Oct 07 13:50:09 crc kubenswrapper[4854]: I1007 13:50:09.826931 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"a35c4fc5-bf24-44b2-be7a-0da329e40adc","Type":"ContainerStarted","Data":"14026e2654c14d3ae1e18d444ead6fa1f76fada8871ccd3a891b44998282ad46"} Oct 07 13:50:09 crc kubenswrapper[4854]: I1007 13:50:09.862091 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-1" podStartSLOduration=3.862057231 podStartE2EDuration="3.862057231s" podCreationTimestamp="2025-10-07 13:50:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:50:09.854832105 +0000 UTC m=+5125.842664410" watchObservedRunningTime="2025-10-07 13:50:09.862057231 +0000 UTC m=+5125.849889526" Oct 07 13:50:09 crc kubenswrapper[4854]: I1007 13:50:09.888780 4854 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-1" podStartSLOduration=3.888752434 podStartE2EDuration="3.888752434s" podCreationTimestamp="2025-10-07 13:50:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:50:09.88723117 +0000 UTC m=+5125.875063465" watchObservedRunningTime="2025-10-07 13:50:09.888752434 +0000 UTC m=+5125.876584729" Oct 07 13:50:10 crc kubenswrapper[4854]: I1007 13:50:10.378569 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Oct 07 13:50:10 crc kubenswrapper[4854]: I1007 13:50:10.393947 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-1" Oct 07 13:50:10 crc kubenswrapper[4854]: I1007 13:50:10.406774 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-2" Oct 07 13:50:10 crc kubenswrapper[4854]: I1007 13:50:10.524496 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Oct 07 13:50:10 crc kubenswrapper[4854]: I1007 13:50:10.635018 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-1" Oct 07 13:50:10 crc kubenswrapper[4854]: I1007 13:50:10.640301 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-2" Oct 07 13:50:12 crc kubenswrapper[4854]: I1007 13:50:12.377931 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Oct 07 13:50:12 crc kubenswrapper[4854]: I1007 13:50:12.393107 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-1" Oct 07 13:50:12 crc kubenswrapper[4854]: I1007 13:50:12.407138 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-2" Oct 07 13:50:12 crc kubenswrapper[4854]: I1007 13:50:12.524475 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Oct 07 13:50:12 crc kubenswrapper[4854]: I1007 13:50:12.634949 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-1" Oct 07 13:50:12 crc kubenswrapper[4854]: I1007 13:50:12.640452 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-2" Oct 07 13:50:13 crc kubenswrapper[4854]: I1007 13:50:13.453763 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Oct 07 13:50:13 crc kubenswrapper[4854]: I1007 13:50:13.463709 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-2" Oct 07 13:50:13 crc kubenswrapper[4854]: I1007 13:50:13.471997 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-1" Oct 07 13:50:13 crc kubenswrapper[4854]: I1007 13:50:13.532936 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Oct 07 13:50:13 crc kubenswrapper[4854]: I1007 13:50:13.544651 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-2" Oct 07 13:50:13 crc kubenswrapper[4854]: I1007 13:50:13.586217 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Oct 07 13:50:13 crc 
kubenswrapper[4854]: I1007 13:50:13.652883 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Oct 07 13:50:13 crc kubenswrapper[4854]: I1007 13:50:13.702317 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-2" Oct 07 13:50:13 crc kubenswrapper[4854]: I1007 13:50:13.705027 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-1" Oct 07 13:50:13 crc kubenswrapper[4854]: I1007 13:50:13.750380 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-2" Oct 07 13:50:13 crc kubenswrapper[4854]: I1007 13:50:13.786028 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8578f89889-zfj8c"] Oct 07 13:50:13 crc kubenswrapper[4854]: I1007 13:50:13.788038 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8578f89889-zfj8c" Oct 07 13:50:13 crc kubenswrapper[4854]: I1007 13:50:13.791513 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 07 13:50:13 crc kubenswrapper[4854]: I1007 13:50:13.807714 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8578f89889-zfj8c"] Oct 07 13:50:13 crc kubenswrapper[4854]: I1007 13:50:13.878689 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33f319af-2acf-457e-af82-8776a48a24d7-config\") pod \"dnsmasq-dns-8578f89889-zfj8c\" (UID: \"33f319af-2acf-457e-af82-8776a48a24d7\") " pod="openstack/dnsmasq-dns-8578f89889-zfj8c" Oct 07 13:50:13 crc kubenswrapper[4854]: I1007 13:50:13.878740 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhnb9\" (UniqueName: \"kubernetes.io/projected/33f319af-2acf-457e-af82-8776a48a24d7-kube-api-access-zhnb9\") pod \"dnsmasq-dns-8578f89889-zfj8c\" (UID: \"33f319af-2acf-457e-af82-8776a48a24d7\") " pod="openstack/dnsmasq-dns-8578f89889-zfj8c" Oct 07 13:50:13 crc kubenswrapper[4854]: I1007 13:50:13.878764 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33f319af-2acf-457e-af82-8776a48a24d7-dns-svc\") pod \"dnsmasq-dns-8578f89889-zfj8c\" (UID: \"33f319af-2acf-457e-af82-8776a48a24d7\") " pod="openstack/dnsmasq-dns-8578f89889-zfj8c" Oct 07 13:50:13 crc kubenswrapper[4854]: I1007 13:50:13.878779 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33f319af-2acf-457e-af82-8776a48a24d7-ovsdbserver-nb\") pod \"dnsmasq-dns-8578f89889-zfj8c\" (UID: \"33f319af-2acf-457e-af82-8776a48a24d7\") " pod="openstack/dnsmasq-dns-8578f89889-zfj8c" Oct 07 13:50:13 crc kubenswrapper[4854]: I1007 13:50:13.971750 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-1" Oct 07 13:50:13 crc kubenswrapper[4854]: I1007 13:50:13.979929 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33f319af-2acf-457e-af82-8776a48a24d7-config\") pod \"dnsmasq-dns-8578f89889-zfj8c\" (UID: \"33f319af-2acf-457e-af82-8776a48a24d7\") " pod="openstack/dnsmasq-dns-8578f89889-zfj8c" Oct 07 13:50:13 crc kubenswrapper[4854]: I1007 13:50:13.979997 
4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhnb9\" (UniqueName: \"kubernetes.io/projected/33f319af-2acf-457e-af82-8776a48a24d7-kube-api-access-zhnb9\") pod \"dnsmasq-dns-8578f89889-zfj8c\" (UID: \"33f319af-2acf-457e-af82-8776a48a24d7\") " pod="openstack/dnsmasq-dns-8578f89889-zfj8c" Oct 07 13:50:13 crc kubenswrapper[4854]: I1007 13:50:13.980021 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33f319af-2acf-457e-af82-8776a48a24d7-dns-svc\") pod \"dnsmasq-dns-8578f89889-zfj8c\" (UID: \"33f319af-2acf-457e-af82-8776a48a24d7\") " pod="openstack/dnsmasq-dns-8578f89889-zfj8c" Oct 07 13:50:13 crc kubenswrapper[4854]: I1007 13:50:13.980039 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33f319af-2acf-457e-af82-8776a48a24d7-ovsdbserver-nb\") pod \"dnsmasq-dns-8578f89889-zfj8c\" (UID: \"33f319af-2acf-457e-af82-8776a48a24d7\") " pod="openstack/dnsmasq-dns-8578f89889-zfj8c" Oct 07 13:50:13 crc kubenswrapper[4854]: I1007 13:50:13.980911 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33f319af-2acf-457e-af82-8776a48a24d7-ovsdbserver-nb\") pod \"dnsmasq-dns-8578f89889-zfj8c\" (UID: \"33f319af-2acf-457e-af82-8776a48a24d7\") " pod="openstack/dnsmasq-dns-8578f89889-zfj8c" Oct 07 13:50:13 crc kubenswrapper[4854]: I1007 13:50:13.981054 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33f319af-2acf-457e-af82-8776a48a24d7-dns-svc\") pod \"dnsmasq-dns-8578f89889-zfj8c\" (UID: \"33f319af-2acf-457e-af82-8776a48a24d7\") " pod="openstack/dnsmasq-dns-8578f89889-zfj8c" Oct 07 13:50:13 crc kubenswrapper[4854]: I1007 13:50:13.981979 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33f319af-2acf-457e-af82-8776a48a24d7-config\") pod \"dnsmasq-dns-8578f89889-zfj8c\" (UID: \"33f319af-2acf-457e-af82-8776a48a24d7\") " pod="openstack/dnsmasq-dns-8578f89889-zfj8c" Oct 07 13:50:13 crc kubenswrapper[4854]: I1007 13:50:13.999724 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhnb9\" (UniqueName: \"kubernetes.io/projected/33f319af-2acf-457e-af82-8776a48a24d7-kube-api-access-zhnb9\") pod \"dnsmasq-dns-8578f89889-zfj8c\" (UID: \"33f319af-2acf-457e-af82-8776a48a24d7\") " pod="openstack/dnsmasq-dns-8578f89889-zfj8c" Oct 07 13:50:14 crc kubenswrapper[4854]: I1007 13:50:14.064427 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8578f89889-zfj8c"] Oct 07 13:50:14 crc kubenswrapper[4854]: I1007 13:50:14.064988 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8578f89889-zfj8c" Oct 07 13:50:14 crc kubenswrapper[4854]: I1007 13:50:14.081606 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6dd484fc77-4qglj"] Oct 07 13:50:14 crc kubenswrapper[4854]: I1007 13:50:14.082889 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6dd484fc77-4qglj" Oct 07 13:50:14 crc kubenswrapper[4854]: I1007 13:50:14.088625 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 07 13:50:14 crc kubenswrapper[4854]: I1007 13:50:14.099995 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6dd484fc77-4qglj"] Oct 07 13:50:14 crc kubenswrapper[4854]: I1007 13:50:14.184400 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d31c15dd-5829-4953-9784-0297b0ddf02a-ovsdbserver-sb\") pod \"dnsmasq-dns-6dd484fc77-4qglj\" (UID: \"d31c15dd-5829-4953-9784-0297b0ddf02a\") " pod="openstack/dnsmasq-dns-6dd484fc77-4qglj" Oct 07 13:50:14 crc kubenswrapper[4854]: I1007 13:50:14.184757 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d31c15dd-5829-4953-9784-0297b0ddf02a-ovsdbserver-nb\") pod \"dnsmasq-dns-6dd484fc77-4qglj\" (UID: \"d31c15dd-5829-4953-9784-0297b0ddf02a\") " pod="openstack/dnsmasq-dns-6dd484fc77-4qglj" Oct 07 13:50:14 crc kubenswrapper[4854]: I1007 13:50:14.184783 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d31c15dd-5829-4953-9784-0297b0ddf02a-dns-svc\") pod \"dnsmasq-dns-6dd484fc77-4qglj\" (UID: \"d31c15dd-5829-4953-9784-0297b0ddf02a\") " pod="openstack/dnsmasq-dns-6dd484fc77-4qglj" Oct 07 13:50:14 crc kubenswrapper[4854]: I1007 13:50:14.184811 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rd74m\" (UniqueName: \"kubernetes.io/projected/d31c15dd-5829-4953-9784-0297b0ddf02a-kube-api-access-rd74m\") pod \"dnsmasq-dns-6dd484fc77-4qglj\" (UID: \"d31c15dd-5829-4953-9784-0297b0ddf02a\") " pod="openstack/dnsmasq-dns-6dd484fc77-4qglj" Oct 07 13:50:14 crc kubenswrapper[4854]: I1007 13:50:14.184869 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d31c15dd-5829-4953-9784-0297b0ddf02a-config\") pod \"dnsmasq-dns-6dd484fc77-4qglj\" (UID: \"d31c15dd-5829-4953-9784-0297b0ddf02a\") " pod="openstack/dnsmasq-dns-6dd484fc77-4qglj" Oct 07 13:50:14 crc kubenswrapper[4854]: I1007 13:50:14.286742 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d31c15dd-5829-4953-9784-0297b0ddf02a-ovsdbserver-sb\") pod \"dnsmasq-dns-6dd484fc77-4qglj\" (UID: \"d31c15dd-5829-4953-9784-0297b0ddf02a\") " pod="openstack/dnsmasq-dns-6dd484fc77-4qglj" Oct 07 13:50:14 crc kubenswrapper[4854]: I1007 13:50:14.286794 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d31c15dd-5829-4953-9784-0297b0ddf02a-ovsdbserver-nb\") pod \"dnsmasq-dns-6dd484fc77-4qglj\" (UID: \"d31c15dd-5829-4953-9784-0297b0ddf02a\") " pod="openstack/dnsmasq-dns-6dd484fc77-4qglj" Oct 07 13:50:14 crc kubenswrapper[4854]: I1007 13:50:14.286823 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d31c15dd-5829-4953-9784-0297b0ddf02a-dns-svc\") pod \"dnsmasq-dns-6dd484fc77-4qglj\" (UID: \"d31c15dd-5829-4953-9784-0297b0ddf02a\") " 
pod="openstack/dnsmasq-dns-6dd484fc77-4qglj" Oct 07 13:50:14 crc kubenswrapper[4854]: I1007 13:50:14.286845 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rd74m\" (UniqueName: \"kubernetes.io/projected/d31c15dd-5829-4953-9784-0297b0ddf02a-kube-api-access-rd74m\") pod \"dnsmasq-dns-6dd484fc77-4qglj\" (UID: \"d31c15dd-5829-4953-9784-0297b0ddf02a\") " pod="openstack/dnsmasq-dns-6dd484fc77-4qglj" Oct 07 13:50:14 crc kubenswrapper[4854]: I1007 13:50:14.286899 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d31c15dd-5829-4953-9784-0297b0ddf02a-config\") pod \"dnsmasq-dns-6dd484fc77-4qglj\" (UID: \"d31c15dd-5829-4953-9784-0297b0ddf02a\") " pod="openstack/dnsmasq-dns-6dd484fc77-4qglj" Oct 07 13:50:14 crc kubenswrapper[4854]: I1007 13:50:14.287807 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d31c15dd-5829-4953-9784-0297b0ddf02a-ovsdbserver-sb\") pod \"dnsmasq-dns-6dd484fc77-4qglj\" (UID: \"d31c15dd-5829-4953-9784-0297b0ddf02a\") " pod="openstack/dnsmasq-dns-6dd484fc77-4qglj" Oct 07 13:50:14 crc kubenswrapper[4854]: I1007 13:50:14.287823 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d31c15dd-5829-4953-9784-0297b0ddf02a-config\") pod \"dnsmasq-dns-6dd484fc77-4qglj\" (UID: \"d31c15dd-5829-4953-9784-0297b0ddf02a\") " pod="openstack/dnsmasq-dns-6dd484fc77-4qglj" Oct 07 13:50:14 crc kubenswrapper[4854]: I1007 13:50:14.288392 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d31c15dd-5829-4953-9784-0297b0ddf02a-ovsdbserver-nb\") pod \"dnsmasq-dns-6dd484fc77-4qglj\" (UID: \"d31c15dd-5829-4953-9784-0297b0ddf02a\") " pod="openstack/dnsmasq-dns-6dd484fc77-4qglj" Oct 07 13:50:14 crc kubenswrapper[4854]: I1007 13:50:14.288835 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d31c15dd-5829-4953-9784-0297b0ddf02a-dns-svc\") pod \"dnsmasq-dns-6dd484fc77-4qglj\" (UID: \"d31c15dd-5829-4953-9784-0297b0ddf02a\") " pod="openstack/dnsmasq-dns-6dd484fc77-4qglj" Oct 07 13:50:14 crc kubenswrapper[4854]: I1007 13:50:14.317992 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rd74m\" (UniqueName: \"kubernetes.io/projected/d31c15dd-5829-4953-9784-0297b0ddf02a-kube-api-access-rd74m\") pod \"dnsmasq-dns-6dd484fc77-4qglj\" (UID: \"d31c15dd-5829-4953-9784-0297b0ddf02a\") " pod="openstack/dnsmasq-dns-6dd484fc77-4qglj" Oct 07 13:50:14 crc kubenswrapper[4854]: I1007 13:50:14.461514 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6dd484fc77-4qglj" Oct 07 13:50:14 crc kubenswrapper[4854]: I1007 13:50:14.533387 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8578f89889-zfj8c"] Oct 07 13:50:14 crc kubenswrapper[4854]: I1007 13:50:14.887595 4854 generic.go:334] "Generic (PLEG): container finished" podID="33f319af-2acf-457e-af82-8776a48a24d7" containerID="005a7060508e3da359b6e1ce351a017f684e12dc0c29058c535b1bf0a0d69002" exitCode=0 Oct 07 13:50:14 crc kubenswrapper[4854]: I1007 13:50:14.887811 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8578f89889-zfj8c" event={"ID":"33f319af-2acf-457e-af82-8776a48a24d7","Type":"ContainerDied","Data":"005a7060508e3da359b6e1ce351a017f684e12dc0c29058c535b1bf0a0d69002"} Oct 07 13:50:14 crc kubenswrapper[4854]: I1007 13:50:14.888205 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8578f89889-zfj8c" event={"ID":"33f319af-2acf-457e-af82-8776a48a24d7","Type":"ContainerStarted","Data":"626450828dc1f85b5ade1b76918a8c5dfd7dbcf0478b88999e49893e80b02457"} Oct 07 13:50:14 crc kubenswrapper[4854]: I1007 13:50:14.942167 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6dd484fc77-4qglj"] Oct 07 13:50:14 crc kubenswrapper[4854]: W1007 13:50:14.944885 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd31c15dd_5829_4953_9784_0297b0ddf02a.slice/crio-3060eaaf01a0aee91584ed801c627abdb0ca42d0be773ebb5e71f47809720920 WatchSource:0}: Error finding container 3060eaaf01a0aee91584ed801c627abdb0ca42d0be773ebb5e71f47809720920: Status 404 returned error can't find the container with id 3060eaaf01a0aee91584ed801c627abdb0ca42d0be773ebb5e71f47809720920 Oct 07 13:50:15 crc kubenswrapper[4854]: I1007 13:50:15.316952 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8578f89889-zfj8c" Oct 07 13:50:15 crc kubenswrapper[4854]: I1007 13:50:15.405189 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33f319af-2acf-457e-af82-8776a48a24d7-config\") pod \"33f319af-2acf-457e-af82-8776a48a24d7\" (UID: \"33f319af-2acf-457e-af82-8776a48a24d7\") " Oct 07 13:50:15 crc kubenswrapper[4854]: I1007 13:50:15.405568 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33f319af-2acf-457e-af82-8776a48a24d7-dns-svc\") pod \"33f319af-2acf-457e-af82-8776a48a24d7\" (UID: \"33f319af-2acf-457e-af82-8776a48a24d7\") " Oct 07 13:50:15 crc kubenswrapper[4854]: I1007 13:50:15.405698 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhnb9\" (UniqueName: \"kubernetes.io/projected/33f319af-2acf-457e-af82-8776a48a24d7-kube-api-access-zhnb9\") pod \"33f319af-2acf-457e-af82-8776a48a24d7\" (UID: \"33f319af-2acf-457e-af82-8776a48a24d7\") " Oct 07 13:50:15 crc kubenswrapper[4854]: I1007 13:50:15.405816 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33f319af-2acf-457e-af82-8776a48a24d7-ovsdbserver-nb\") pod \"33f319af-2acf-457e-af82-8776a48a24d7\" (UID: \"33f319af-2acf-457e-af82-8776a48a24d7\") " Oct 07 13:50:15 crc kubenswrapper[4854]: I1007 13:50:15.416529 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33f319af-2acf-457e-af82-8776a48a24d7-kube-api-access-zhnb9" (OuterVolumeSpecName: "kube-api-access-zhnb9") pod "33f319af-2acf-457e-af82-8776a48a24d7" (UID: "33f319af-2acf-457e-af82-8776a48a24d7"). InnerVolumeSpecName "kube-api-access-zhnb9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:50:15 crc kubenswrapper[4854]: I1007 13:50:15.442347 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33f319af-2acf-457e-af82-8776a48a24d7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "33f319af-2acf-457e-af82-8776a48a24d7" (UID: "33f319af-2acf-457e-af82-8776a48a24d7"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:50:15 crc kubenswrapper[4854]: I1007 13:50:15.449975 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33f319af-2acf-457e-af82-8776a48a24d7-config" (OuterVolumeSpecName: "config") pod "33f319af-2acf-457e-af82-8776a48a24d7" (UID: "33f319af-2acf-457e-af82-8776a48a24d7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:50:15 crc kubenswrapper[4854]: I1007 13:50:15.453208 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33f319af-2acf-457e-af82-8776a48a24d7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "33f319af-2acf-457e-af82-8776a48a24d7" (UID: "33f319af-2acf-457e-af82-8776a48a24d7"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:50:15 crc kubenswrapper[4854]: I1007 13:50:15.508402 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhnb9\" (UniqueName: \"kubernetes.io/projected/33f319af-2acf-457e-af82-8776a48a24d7-kube-api-access-zhnb9\") on node \"crc\" DevicePath \"\"" Oct 07 13:50:15 crc kubenswrapper[4854]: I1007 13:50:15.508459 4854 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33f319af-2acf-457e-af82-8776a48a24d7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 13:50:15 crc kubenswrapper[4854]: I1007 13:50:15.508475 4854 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33f319af-2acf-457e-af82-8776a48a24d7-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:50:15 crc kubenswrapper[4854]: I1007 13:50:15.508488 4854 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33f319af-2acf-457e-af82-8776a48a24d7-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 13:50:15 crc kubenswrapper[4854]: I1007 13:50:15.901090 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8578f89889-zfj8c" event={"ID":"33f319af-2acf-457e-af82-8776a48a24d7","Type":"ContainerDied","Data":"626450828dc1f85b5ade1b76918a8c5dfd7dbcf0478b88999e49893e80b02457"} Oct 07 13:50:15 crc kubenswrapper[4854]: I1007 13:50:15.901189 4854 scope.go:117] "RemoveContainer" containerID="005a7060508e3da359b6e1ce351a017f684e12dc0c29058c535b1bf0a0d69002" Oct 07 13:50:15 crc kubenswrapper[4854]: I1007 13:50:15.901198 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8578f89889-zfj8c" Oct 07 13:50:15 crc kubenswrapper[4854]: I1007 13:50:15.903280 4854 generic.go:334] "Generic (PLEG): container finished" podID="d31c15dd-5829-4953-9784-0297b0ddf02a" containerID="094191c9cb2c0c9d00db765f38a388b2be6c692866d0e7518f6c8f0e9261115a" exitCode=0 Oct 07 13:50:15 crc kubenswrapper[4854]: I1007 13:50:15.903346 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dd484fc77-4qglj" event={"ID":"d31c15dd-5829-4953-9784-0297b0ddf02a","Type":"ContainerDied","Data":"094191c9cb2c0c9d00db765f38a388b2be6c692866d0e7518f6c8f0e9261115a"} Oct 07 13:50:15 crc kubenswrapper[4854]: I1007 13:50:15.903434 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dd484fc77-4qglj" event={"ID":"d31c15dd-5829-4953-9784-0297b0ddf02a","Type":"ContainerStarted","Data":"3060eaaf01a0aee91584ed801c627abdb0ca42d0be773ebb5e71f47809720920"} Oct 07 13:50:16 crc kubenswrapper[4854]: I1007 13:50:16.007681 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8578f89889-zfj8c"] Oct 07 13:50:16 crc kubenswrapper[4854]: I1007 13:50:16.013830 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8578f89889-zfj8c"] Oct 07 13:50:16 crc kubenswrapper[4854]: I1007 13:50:16.721270 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33f319af-2acf-457e-af82-8776a48a24d7" path="/var/lib/kubelet/pods/33f319af-2acf-457e-af82-8776a48a24d7/volumes" Oct 07 13:50:16 crc kubenswrapper[4854]: I1007 13:50:16.915197 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dd484fc77-4qglj" 
event={"ID":"d31c15dd-5829-4953-9784-0297b0ddf02a","Type":"ContainerStarted","Data":"549317c676ec2762c187b3e3db339baf213e4e525320d62c088c10e6bed6c35a"} Oct 07 13:50:16 crc kubenswrapper[4854]: I1007 13:50:16.915488 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6dd484fc77-4qglj" Oct 07 13:50:17 crc kubenswrapper[4854]: I1007 13:50:17.700030 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-1" Oct 07 13:50:17 crc kubenswrapper[4854]: I1007 13:50:17.730844 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6dd484fc77-4qglj" podStartSLOduration=3.730816827 podStartE2EDuration="3.730816827s" podCreationTimestamp="2025-10-07 13:50:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:50:16.943603308 +0000 UTC m=+5132.931435603" watchObservedRunningTime="2025-10-07 13:50:17.730816827 +0000 UTC m=+5133.718649122" Oct 07 13:50:20 crc kubenswrapper[4854]: I1007 13:50:20.595386 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-copy-data"] Oct 07 13:50:20 crc kubenswrapper[4854]: E1007 13:50:20.596466 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33f319af-2acf-457e-af82-8776a48a24d7" containerName="init" Oct 07 13:50:20 crc kubenswrapper[4854]: I1007 13:50:20.596533 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="33f319af-2acf-457e-af82-8776a48a24d7" containerName="init" Oct 07 13:50:20 crc kubenswrapper[4854]: I1007 13:50:20.596874 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="33f319af-2acf-457e-af82-8776a48a24d7" containerName="init" Oct 07 13:50:20 crc kubenswrapper[4854]: I1007 13:50:20.597817 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Oct 07 13:50:20 crc kubenswrapper[4854]: I1007 13:50:20.604326 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Oct 07 13:50:20 crc kubenswrapper[4854]: I1007 13:50:20.605767 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovn-data-cert" Oct 07 13:50:20 crc kubenswrapper[4854]: I1007 13:50:20.704542 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/257e4531-5661-4bad-a586-900a88cca502-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"257e4531-5661-4bad-a586-900a88cca502\") " pod="openstack/ovn-copy-data" Oct 07 13:50:20 crc kubenswrapper[4854]: I1007 13:50:20.704608 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z97kf\" (UniqueName: \"kubernetes.io/projected/257e4531-5661-4bad-a586-900a88cca502-kube-api-access-z97kf\") pod \"ovn-copy-data\" (UID: \"257e4531-5661-4bad-a586-900a88cca502\") " pod="openstack/ovn-copy-data" Oct 07 13:50:20 crc kubenswrapper[4854]: I1007 13:50:20.704639 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c01cb7cd-cd19-4a3e-8ae7-23bd4964d51d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c01cb7cd-cd19-4a3e-8ae7-23bd4964d51d\") pod \"ovn-copy-data\" (UID: \"257e4531-5661-4bad-a586-900a88cca502\") " pod="openstack/ovn-copy-data" Oct 07 13:50:20 crc kubenswrapper[4854]: I1007 13:50:20.805955 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/257e4531-5661-4bad-a586-900a88cca502-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"257e4531-5661-4bad-a586-900a88cca502\") " pod="openstack/ovn-copy-data" Oct 07 13:50:20 crc kubenswrapper[4854]: I1007 13:50:20.806276 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z97kf\" (UniqueName: \"kubernetes.io/projected/257e4531-5661-4bad-a586-900a88cca502-kube-api-access-z97kf\") pod \"ovn-copy-data\" (UID: \"257e4531-5661-4bad-a586-900a88cca502\") " pod="openstack/ovn-copy-data" Oct 07 13:50:20 crc kubenswrapper[4854]: I1007 13:50:20.806307 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c01cb7cd-cd19-4a3e-8ae7-23bd4964d51d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c01cb7cd-cd19-4a3e-8ae7-23bd4964d51d\") pod \"ovn-copy-data\" (UID: \"257e4531-5661-4bad-a586-900a88cca502\") " pod="openstack/ovn-copy-data" Oct 07 13:50:20 crc kubenswrapper[4854]: I1007 13:50:20.810667 4854 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 07 13:50:20 crc kubenswrapper[4854]: I1007 13:50:20.810744 4854 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c01cb7cd-cd19-4a3e-8ae7-23bd4964d51d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c01cb7cd-cd19-4a3e-8ae7-23bd4964d51d\") pod \"ovn-copy-data\" (UID: \"257e4531-5661-4bad-a586-900a88cca502\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/023abdb6308bfe32a664570e83f42f735e2e92d7da06f92c1f1808d0202e8f1b/globalmount\"" pod="openstack/ovn-copy-data" Oct 07 13:50:20 crc kubenswrapper[4854]: I1007 13:50:20.825735 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/257e4531-5661-4bad-a586-900a88cca502-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"257e4531-5661-4bad-a586-900a88cca502\") " pod="openstack/ovn-copy-data" Oct 07 13:50:20 crc kubenswrapper[4854]: I1007 13:50:20.829659 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z97kf\" (UniqueName: \"kubernetes.io/projected/257e4531-5661-4bad-a586-900a88cca502-kube-api-access-z97kf\") pod \"ovn-copy-data\" (UID: \"257e4531-5661-4bad-a586-900a88cca502\") " pod="openstack/ovn-copy-data" Oct 07 13:50:20 crc kubenswrapper[4854]: I1007 13:50:20.878890 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c01cb7cd-cd19-4a3e-8ae7-23bd4964d51d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c01cb7cd-cd19-4a3e-8ae7-23bd4964d51d\") pod \"ovn-copy-data\" (UID: \"257e4531-5661-4bad-a586-900a88cca502\") " pod="openstack/ovn-copy-data" Oct 07 13:50:20 crc kubenswrapper[4854]: I1007 13:50:20.926910 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Oct 07 13:50:21 crc kubenswrapper[4854]: I1007 13:50:21.218453 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Oct 07 13:50:21 crc kubenswrapper[4854]: W1007 13:50:21.227559 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod257e4531_5661_4bad_a586_900a88cca502.slice/crio-9dcf8e809a855a8e7aa55331937bcaf8d638a2b188ccf4630861e321ce9b376c WatchSource:0}: Error finding container 9dcf8e809a855a8e7aa55331937bcaf8d638a2b188ccf4630861e321ce9b376c: Status 404 returned error can't find the container with id 9dcf8e809a855a8e7aa55331937bcaf8d638a2b188ccf4630861e321ce9b376c Oct 07 13:50:21 crc kubenswrapper[4854]: I1007 13:50:21.970390 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"257e4531-5661-4bad-a586-900a88cca502","Type":"ContainerStarted","Data":"438cd046fbcff75f3f3b88fb33d1e8202842f1056a9fb0b74dd47016af7e1250"} Oct 07 13:50:21 crc kubenswrapper[4854]: I1007 13:50:21.970655 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"257e4531-5661-4bad-a586-900a88cca502","Type":"ContainerStarted","Data":"9dcf8e809a855a8e7aa55331937bcaf8d638a2b188ccf4630861e321ce9b376c"} Oct 07 13:50:21 crc kubenswrapper[4854]: I1007 13:50:21.991399 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-copy-data" podStartSLOduration=2.991378932 podStartE2EDuration="2.991378932s" podCreationTimestamp="2025-10-07 13:50:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:50:21.991329851 +0000 UTC m=+5137.979162156" watchObservedRunningTime="2025-10-07 13:50:21.991378932 +0000 UTC m=+5137.979211197" Oct 07 13:50:24 crc kubenswrapper[4854]: I1007 13:50:24.463398 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6dd484fc77-4qglj" Oct 07 13:50:24 crc kubenswrapper[4854]: I1007 13:50:24.549982 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-pjkqv"] Oct 07 13:50:24 crc kubenswrapper[4854]: I1007 13:50:24.550255 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b7946d7b9-pjkqv" podUID="dd0705eb-9aa0-4760-86f2-af1fe2be570b" containerName="dnsmasq-dns" containerID="cri-o://c2612888d742e1ebcf334051c040771ef82f2849b4ab87da7ed074b8bb411678" gracePeriod=10 Oct 07 13:50:24 crc kubenswrapper[4854]: I1007 13:50:24.768298 4854 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5b7946d7b9-pjkqv" podUID="dd0705eb-9aa0-4760-86f2-af1fe2be570b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.245:5353: connect: connection refused" Oct 07 13:50:25 crc kubenswrapper[4854]: I1007 13:50:25.001742 4854 generic.go:334] "Generic (PLEG): container finished" podID="dd0705eb-9aa0-4760-86f2-af1fe2be570b" containerID="c2612888d742e1ebcf334051c040771ef82f2849b4ab87da7ed074b8bb411678" exitCode=0 Oct 07 13:50:25 crc kubenswrapper[4854]: I1007 13:50:25.001785 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-pjkqv" event={"ID":"dd0705eb-9aa0-4760-86f2-af1fe2be570b","Type":"ContainerDied","Data":"c2612888d742e1ebcf334051c040771ef82f2849b4ab87da7ed074b8bb411678"} Oct 07 13:50:25 crc kubenswrapper[4854]: I1007 
13:50:25.001810 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-pjkqv" event={"ID":"dd0705eb-9aa0-4760-86f2-af1fe2be570b","Type":"ContainerDied","Data":"e2abd5ef03c3f4d762c523d560ffa63262b13bbe8b51753b8fc4c9f5accf941b"} Oct 07 13:50:25 crc kubenswrapper[4854]: I1007 13:50:25.001821 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e2abd5ef03c3f4d762c523d560ffa63262b13bbe8b51753b8fc4c9f5accf941b" Oct 07 13:50:25 crc kubenswrapper[4854]: I1007 13:50:25.042530 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-pjkqv" Oct 07 13:50:25 crc kubenswrapper[4854]: I1007 13:50:25.202542 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd0705eb-9aa0-4760-86f2-af1fe2be570b-config\") pod \"dd0705eb-9aa0-4760-86f2-af1fe2be570b\" (UID: \"dd0705eb-9aa0-4760-86f2-af1fe2be570b\") " Oct 07 13:50:25 crc kubenswrapper[4854]: I1007 13:50:25.202653 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd0705eb-9aa0-4760-86f2-af1fe2be570b-dns-svc\") pod \"dd0705eb-9aa0-4760-86f2-af1fe2be570b\" (UID: \"dd0705eb-9aa0-4760-86f2-af1fe2be570b\") " Oct 07 13:50:25 crc kubenswrapper[4854]: I1007 13:50:25.202707 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6zqd\" (UniqueName: \"kubernetes.io/projected/dd0705eb-9aa0-4760-86f2-af1fe2be570b-kube-api-access-b6zqd\") pod \"dd0705eb-9aa0-4760-86f2-af1fe2be570b\" (UID: \"dd0705eb-9aa0-4760-86f2-af1fe2be570b\") " Oct 07 13:50:25 crc kubenswrapper[4854]: I1007 13:50:25.219013 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd0705eb-9aa0-4760-86f2-af1fe2be570b-kube-api-access-b6zqd" (OuterVolumeSpecName: "kube-api-access-b6zqd") pod "dd0705eb-9aa0-4760-86f2-af1fe2be570b" (UID: "dd0705eb-9aa0-4760-86f2-af1fe2be570b"). InnerVolumeSpecName "kube-api-access-b6zqd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:50:25 crc kubenswrapper[4854]: I1007 13:50:25.275451 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd0705eb-9aa0-4760-86f2-af1fe2be570b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dd0705eb-9aa0-4760-86f2-af1fe2be570b" (UID: "dd0705eb-9aa0-4760-86f2-af1fe2be570b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:50:25 crc kubenswrapper[4854]: I1007 13:50:25.296964 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd0705eb-9aa0-4760-86f2-af1fe2be570b-config" (OuterVolumeSpecName: "config") pod "dd0705eb-9aa0-4760-86f2-af1fe2be570b" (UID: "dd0705eb-9aa0-4760-86f2-af1fe2be570b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:50:25 crc kubenswrapper[4854]: I1007 13:50:25.305393 4854 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd0705eb-9aa0-4760-86f2-af1fe2be570b-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:50:25 crc kubenswrapper[4854]: I1007 13:50:25.305426 4854 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd0705eb-9aa0-4760-86f2-af1fe2be570b-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 13:50:25 crc kubenswrapper[4854]: I1007 13:50:25.305439 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6zqd\" (UniqueName: \"kubernetes.io/projected/dd0705eb-9aa0-4760-86f2-af1fe2be570b-kube-api-access-b6zqd\") on node \"crc\" DevicePath \"\"" Oct 07 13:50:26 crc kubenswrapper[4854]: I1007 13:50:26.014065 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-pjkqv" Oct 07 13:50:26 crc kubenswrapper[4854]: I1007 13:50:26.068193 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-pjkqv"] Oct 07 13:50:26 crc kubenswrapper[4854]: I1007 13:50:26.084968 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-pjkqv"] Oct 07 13:50:26 crc kubenswrapper[4854]: I1007 13:50:26.715212 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd0705eb-9aa0-4760-86f2-af1fe2be570b" path="/var/lib/kubelet/pods/dd0705eb-9aa0-4760-86f2-af1fe2be570b/volumes" Oct 07 13:50:27 crc kubenswrapper[4854]: I1007 13:50:27.777966 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Oct 07 13:50:27 crc kubenswrapper[4854]: E1007 13:50:27.778692 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd0705eb-9aa0-4760-86f2-af1fe2be570b" containerName="dnsmasq-dns" Oct 07 13:50:27 crc kubenswrapper[4854]: I1007 13:50:27.778718 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd0705eb-9aa0-4760-86f2-af1fe2be570b" containerName="dnsmasq-dns" Oct 07 13:50:27 crc kubenswrapper[4854]: E1007 13:50:27.778763 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd0705eb-9aa0-4760-86f2-af1fe2be570b" containerName="init" Oct 07 13:50:27 crc kubenswrapper[4854]: I1007 13:50:27.778775 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd0705eb-9aa0-4760-86f2-af1fe2be570b" containerName="init" Oct 07 13:50:27 crc kubenswrapper[4854]: I1007 13:50:27.779075 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd0705eb-9aa0-4760-86f2-af1fe2be570b" containerName="dnsmasq-dns" Oct 07 13:50:27 crc kubenswrapper[4854]: I1007 13:50:27.780658 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Oct 07 13:50:27 crc kubenswrapper[4854]: I1007 13:50:27.785350 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 07 13:50:27 crc kubenswrapper[4854]: I1007 13:50:27.788786 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Oct 07 13:50:27 crc kubenswrapper[4854]: I1007 13:50:27.788842 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Oct 07 13:50:27 crc kubenswrapper[4854]: I1007 13:50:27.788788 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-5fq5z" Oct 07 13:50:27 crc kubenswrapper[4854]: I1007 13:50:27.956965 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/62719138-e7a7-4238-ad62-01b9fbed8739-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"62719138-e7a7-4238-ad62-01b9fbed8739\") " pod="openstack/ovn-northd-0" Oct 07 13:50:27 crc kubenswrapper[4854]: I1007 13:50:27.957276 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62719138-e7a7-4238-ad62-01b9fbed8739-config\") pod \"ovn-northd-0\" (UID: \"62719138-e7a7-4238-ad62-01b9fbed8739\") " pod="openstack/ovn-northd-0" Oct 07 13:50:27 crc kubenswrapper[4854]: I1007 13:50:27.957315 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czxmc\" (UniqueName: \"kubernetes.io/projected/62719138-e7a7-4238-ad62-01b9fbed8739-kube-api-access-czxmc\") pod \"ovn-northd-0\" (UID: \"62719138-e7a7-4238-ad62-01b9fbed8739\") " pod="openstack/ovn-northd-0" Oct 07 13:50:27 crc kubenswrapper[4854]: I1007 13:50:27.957398 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62719138-e7a7-4238-ad62-01b9fbed8739-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"62719138-e7a7-4238-ad62-01b9fbed8739\") " pod="openstack/ovn-northd-0" Oct 07 13:50:27 crc kubenswrapper[4854]: I1007 13:50:27.957435 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/62719138-e7a7-4238-ad62-01b9fbed8739-scripts\") pod \"ovn-northd-0\" (UID: \"62719138-e7a7-4238-ad62-01b9fbed8739\") " pod="openstack/ovn-northd-0" Oct 07 13:50:28 crc kubenswrapper[4854]: I1007 13:50:28.058990 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/62719138-e7a7-4238-ad62-01b9fbed8739-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"62719138-e7a7-4238-ad62-01b9fbed8739\") " pod="openstack/ovn-northd-0" Oct 07 13:50:28 crc kubenswrapper[4854]: I1007 13:50:28.059052 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62719138-e7a7-4238-ad62-01b9fbed8739-config\") pod \"ovn-northd-0\" (UID: \"62719138-e7a7-4238-ad62-01b9fbed8739\") " pod="openstack/ovn-northd-0" Oct 07 13:50:28 crc kubenswrapper[4854]: I1007 13:50:28.059100 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czxmc\" (UniqueName: \"kubernetes.io/projected/62719138-e7a7-4238-ad62-01b9fbed8739-kube-api-access-czxmc\") pod \"ovn-northd-0\" 
(UID: \"62719138-e7a7-4238-ad62-01b9fbed8739\") " pod="openstack/ovn-northd-0" Oct 07 13:50:28 crc kubenswrapper[4854]: I1007 13:50:28.059184 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62719138-e7a7-4238-ad62-01b9fbed8739-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"62719138-e7a7-4238-ad62-01b9fbed8739\") " pod="openstack/ovn-northd-0" Oct 07 13:50:28 crc kubenswrapper[4854]: I1007 13:50:28.059211 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/62719138-e7a7-4238-ad62-01b9fbed8739-scripts\") pod \"ovn-northd-0\" (UID: \"62719138-e7a7-4238-ad62-01b9fbed8739\") " pod="openstack/ovn-northd-0" Oct 07 13:50:28 crc kubenswrapper[4854]: I1007 13:50:28.060117 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/62719138-e7a7-4238-ad62-01b9fbed8739-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"62719138-e7a7-4238-ad62-01b9fbed8739\") " pod="openstack/ovn-northd-0" Oct 07 13:50:28 crc kubenswrapper[4854]: I1007 13:50:28.060317 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/62719138-e7a7-4238-ad62-01b9fbed8739-scripts\") pod \"ovn-northd-0\" (UID: \"62719138-e7a7-4238-ad62-01b9fbed8739\") " pod="openstack/ovn-northd-0" Oct 07 13:50:28 crc kubenswrapper[4854]: I1007 13:50:28.060792 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62719138-e7a7-4238-ad62-01b9fbed8739-config\") pod \"ovn-northd-0\" (UID: \"62719138-e7a7-4238-ad62-01b9fbed8739\") " pod="openstack/ovn-northd-0" Oct 07 13:50:28 crc kubenswrapper[4854]: I1007 13:50:28.066264 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62719138-e7a7-4238-ad62-01b9fbed8739-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"62719138-e7a7-4238-ad62-01b9fbed8739\") " pod="openstack/ovn-northd-0" Oct 07 13:50:28 crc kubenswrapper[4854]: I1007 13:50:28.076941 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czxmc\" (UniqueName: \"kubernetes.io/projected/62719138-e7a7-4238-ad62-01b9fbed8739-kube-api-access-czxmc\") pod \"ovn-northd-0\" (UID: \"62719138-e7a7-4238-ad62-01b9fbed8739\") " pod="openstack/ovn-northd-0" Oct 07 13:50:28 crc kubenswrapper[4854]: I1007 13:50:28.101294 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Oct 07 13:50:28 crc kubenswrapper[4854]: I1007 13:50:28.550284 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 07 13:50:29 crc kubenswrapper[4854]: I1007 13:50:29.046859 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"62719138-e7a7-4238-ad62-01b9fbed8739","Type":"ContainerStarted","Data":"6f8b6c7f2520968f1ed3f0dc2e15b8e268af87132157f519a496ca3d13339659"} Oct 07 13:50:29 crc kubenswrapper[4854]: I1007 13:50:29.052406 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"62719138-e7a7-4238-ad62-01b9fbed8739","Type":"ContainerStarted","Data":"29463601eddebf2fabd77b17245ee0d9efffa5a5f2b192103d2785d73151ca7a"} Oct 07 13:50:29 crc kubenswrapper[4854]: I1007 13:50:29.052521 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Oct 07 13:50:29 crc kubenswrapper[4854]: I1007 13:50:29.052561 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"62719138-e7a7-4238-ad62-01b9fbed8739","Type":"ContainerStarted","Data":"143c67153b924c25418e0a96dc2eaf4eeccd47c4590c4acb2224d4502d89eab6"} Oct 07 13:50:29 crc kubenswrapper[4854]: I1007 13:50:29.081281 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.081257539 podStartE2EDuration="2.081257539s" podCreationTimestamp="2025-10-07 13:50:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:50:29.075673749 +0000 UTC m=+5145.063506014" watchObservedRunningTime="2025-10-07 13:50:29.081257539 +0000 UTC m=+5145.069089804" Oct 07 13:50:33 crc kubenswrapper[4854]: I1007 13:50:33.331419 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-pblnp"] Oct 07 13:50:33 crc kubenswrapper[4854]: I1007 13:50:33.333186 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-pblnp" Oct 07 13:50:33 crc kubenswrapper[4854]: I1007 13:50:33.341106 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-pblnp"] Oct 07 13:50:33 crc kubenswrapper[4854]: I1007 13:50:33.390697 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgw9w\" (UniqueName: \"kubernetes.io/projected/9d502c4b-5a47-4c65-846d-fb7256c0124f-kube-api-access-fgw9w\") pod \"keystone-db-create-pblnp\" (UID: \"9d502c4b-5a47-4c65-846d-fb7256c0124f\") " pod="openstack/keystone-db-create-pblnp" Oct 07 13:50:33 crc kubenswrapper[4854]: I1007 13:50:33.492222 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgw9w\" (UniqueName: \"kubernetes.io/projected/9d502c4b-5a47-4c65-846d-fb7256c0124f-kube-api-access-fgw9w\") pod \"keystone-db-create-pblnp\" (UID: \"9d502c4b-5a47-4c65-846d-fb7256c0124f\") " pod="openstack/keystone-db-create-pblnp" Oct 07 13:50:33 crc kubenswrapper[4854]: I1007 13:50:33.525400 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgw9w\" (UniqueName: \"kubernetes.io/projected/9d502c4b-5a47-4c65-846d-fb7256c0124f-kube-api-access-fgw9w\") pod \"keystone-db-create-pblnp\" (UID: \"9d502c4b-5a47-4c65-846d-fb7256c0124f\") " pod="openstack/keystone-db-create-pblnp" Oct 07 13:50:33 crc kubenswrapper[4854]: I1007 13:50:33.654444 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-pblnp" Oct 07 13:50:33 crc kubenswrapper[4854]: I1007 13:50:33.971184 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-pblnp"] Oct 07 13:50:33 crc kubenswrapper[4854]: W1007 13:50:33.977495 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d502c4b_5a47_4c65_846d_fb7256c0124f.slice/crio-0d876342563df19601a0a0d5f9b26a069d10589873c0474f912c5ca2887b4c12 WatchSource:0}: Error finding container 0d876342563df19601a0a0d5f9b26a069d10589873c0474f912c5ca2887b4c12: Status 404 returned error can't find the container with id 0d876342563df19601a0a0d5f9b26a069d10589873c0474f912c5ca2887b4c12 Oct 07 13:50:34 crc kubenswrapper[4854]: I1007 13:50:34.108843 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-pblnp" event={"ID":"9d502c4b-5a47-4c65-846d-fb7256c0124f","Type":"ContainerStarted","Data":"0d876342563df19601a0a0d5f9b26a069d10589873c0474f912c5ca2887b4c12"} Oct 07 13:50:35 crc kubenswrapper[4854]: I1007 13:50:35.123731 4854 generic.go:334] "Generic (PLEG): container finished" podID="9d502c4b-5a47-4c65-846d-fb7256c0124f" containerID="65bbe81ddb3ca6061d2d7b38db772081fe329ef5506ca532e06711bc89ff752b" exitCode=0 Oct 07 13:50:35 crc kubenswrapper[4854]: I1007 13:50:35.123860 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-pblnp" event={"ID":"9d502c4b-5a47-4c65-846d-fb7256c0124f","Type":"ContainerDied","Data":"65bbe81ddb3ca6061d2d7b38db772081fe329ef5506ca532e06711bc89ff752b"} Oct 07 13:50:36 crc kubenswrapper[4854]: I1007 13:50:36.598857 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-pblnp" Oct 07 13:50:36 crc kubenswrapper[4854]: I1007 13:50:36.670065 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgw9w\" (UniqueName: \"kubernetes.io/projected/9d502c4b-5a47-4c65-846d-fb7256c0124f-kube-api-access-fgw9w\") pod \"9d502c4b-5a47-4c65-846d-fb7256c0124f\" (UID: \"9d502c4b-5a47-4c65-846d-fb7256c0124f\") " Oct 07 13:50:36 crc kubenswrapper[4854]: I1007 13:50:36.677857 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d502c4b-5a47-4c65-846d-fb7256c0124f-kube-api-access-fgw9w" (OuterVolumeSpecName: "kube-api-access-fgw9w") pod "9d502c4b-5a47-4c65-846d-fb7256c0124f" (UID: "9d502c4b-5a47-4c65-846d-fb7256c0124f"). InnerVolumeSpecName "kube-api-access-fgw9w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:50:36 crc kubenswrapper[4854]: I1007 13:50:36.771824 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgw9w\" (UniqueName: \"kubernetes.io/projected/9d502c4b-5a47-4c65-846d-fb7256c0124f-kube-api-access-fgw9w\") on node \"crc\" DevicePath \"\"" Oct 07 13:50:37 crc kubenswrapper[4854]: I1007 13:50:37.156755 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-pblnp" event={"ID":"9d502c4b-5a47-4c65-846d-fb7256c0124f","Type":"ContainerDied","Data":"0d876342563df19601a0a0d5f9b26a069d10589873c0474f912c5ca2887b4c12"} Oct 07 13:50:37 crc kubenswrapper[4854]: I1007 13:50:37.156800 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d876342563df19601a0a0d5f9b26a069d10589873c0474f912c5ca2887b4c12" Oct 07 13:50:37 crc kubenswrapper[4854]: I1007 13:50:37.156845 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-pblnp" Oct 07 13:50:38 crc kubenswrapper[4854]: I1007 13:50:38.184374 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Oct 07 13:50:43 crc kubenswrapper[4854]: I1007 13:50:43.435389 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-4e30-account-create-8dlnn"] Oct 07 13:50:43 crc kubenswrapper[4854]: E1007 13:50:43.436558 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d502c4b-5a47-4c65-846d-fb7256c0124f" containerName="mariadb-database-create" Oct 07 13:50:43 crc kubenswrapper[4854]: I1007 13:50:43.436581 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d502c4b-5a47-4c65-846d-fb7256c0124f" containerName="mariadb-database-create" Oct 07 13:50:43 crc kubenswrapper[4854]: I1007 13:50:43.436822 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d502c4b-5a47-4c65-846d-fb7256c0124f" containerName="mariadb-database-create" Oct 07 13:50:43 crc kubenswrapper[4854]: I1007 13:50:43.437708 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-4e30-account-create-8dlnn" Oct 07 13:50:43 crc kubenswrapper[4854]: I1007 13:50:43.443672 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Oct 07 13:50:43 crc kubenswrapper[4854]: I1007 13:50:43.444040 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-4e30-account-create-8dlnn"] Oct 07 13:50:43 crc kubenswrapper[4854]: I1007 13:50:43.616115 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wx8dw\" (UniqueName: \"kubernetes.io/projected/cabac5b5-1b98-4966-8046-c3c6a34d5c06-kube-api-access-wx8dw\") pod \"keystone-4e30-account-create-8dlnn\" (UID: \"cabac5b5-1b98-4966-8046-c3c6a34d5c06\") " pod="openstack/keystone-4e30-account-create-8dlnn" Oct 07 13:50:43 crc kubenswrapper[4854]: I1007 13:50:43.717484 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wx8dw\" (UniqueName: \"kubernetes.io/projected/cabac5b5-1b98-4966-8046-c3c6a34d5c06-kube-api-access-wx8dw\") pod \"keystone-4e30-account-create-8dlnn\" (UID: \"cabac5b5-1b98-4966-8046-c3c6a34d5c06\") " pod="openstack/keystone-4e30-account-create-8dlnn" Oct 07 13:50:43 crc kubenswrapper[4854]: I1007 13:50:43.741966 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wx8dw\" (UniqueName: \"kubernetes.io/projected/cabac5b5-1b98-4966-8046-c3c6a34d5c06-kube-api-access-wx8dw\") pod \"keystone-4e30-account-create-8dlnn\" (UID: \"cabac5b5-1b98-4966-8046-c3c6a34d5c06\") " pod="openstack/keystone-4e30-account-create-8dlnn" Oct 07 13:50:43 crc kubenswrapper[4854]: I1007 13:50:43.760493 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-4e30-account-create-8dlnn" Oct 07 13:50:44 crc kubenswrapper[4854]: I1007 13:50:44.236472 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-4e30-account-create-8dlnn"] Oct 07 13:50:45 crc kubenswrapper[4854]: I1007 13:50:45.236510 4854 generic.go:334] "Generic (PLEG): container finished" podID="cabac5b5-1b98-4966-8046-c3c6a34d5c06" containerID="0d2342f0757e582d0325164ed00c5243aa7aa673ca6407273fe28faa0f3874d4" exitCode=0 Oct 07 13:50:45 crc kubenswrapper[4854]: I1007 13:50:45.236710 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-4e30-account-create-8dlnn" event={"ID":"cabac5b5-1b98-4966-8046-c3c6a34d5c06","Type":"ContainerDied","Data":"0d2342f0757e582d0325164ed00c5243aa7aa673ca6407273fe28faa0f3874d4"} Oct 07 13:50:45 crc kubenswrapper[4854]: I1007 13:50:45.236945 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-4e30-account-create-8dlnn" event={"ID":"cabac5b5-1b98-4966-8046-c3c6a34d5c06","Type":"ContainerStarted","Data":"51eb2e8a5f3171ff324e35a9e5f95f909bb869487d802c7b3a2e743437750262"} Oct 07 13:50:46 crc kubenswrapper[4854]: I1007 13:50:46.671881 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-4e30-account-create-8dlnn" Oct 07 13:50:46 crc kubenswrapper[4854]: I1007 13:50:46.770258 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wx8dw\" (UniqueName: \"kubernetes.io/projected/cabac5b5-1b98-4966-8046-c3c6a34d5c06-kube-api-access-wx8dw\") pod \"cabac5b5-1b98-4966-8046-c3c6a34d5c06\" (UID: \"cabac5b5-1b98-4966-8046-c3c6a34d5c06\") " Oct 07 13:50:46 crc kubenswrapper[4854]: I1007 13:50:46.792364 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cabac5b5-1b98-4966-8046-c3c6a34d5c06-kube-api-access-wx8dw" (OuterVolumeSpecName: "kube-api-access-wx8dw") pod "cabac5b5-1b98-4966-8046-c3c6a34d5c06" (UID: "cabac5b5-1b98-4966-8046-c3c6a34d5c06"). InnerVolumeSpecName "kube-api-access-wx8dw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:50:46 crc kubenswrapper[4854]: I1007 13:50:46.872809 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wx8dw\" (UniqueName: \"kubernetes.io/projected/cabac5b5-1b98-4966-8046-c3c6a34d5c06-kube-api-access-wx8dw\") on node \"crc\" DevicePath \"\"" Oct 07 13:50:47 crc kubenswrapper[4854]: I1007 13:50:47.258101 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-4e30-account-create-8dlnn" event={"ID":"cabac5b5-1b98-4966-8046-c3c6a34d5c06","Type":"ContainerDied","Data":"51eb2e8a5f3171ff324e35a9e5f95f909bb869487d802c7b3a2e743437750262"} Oct 07 13:50:47 crc kubenswrapper[4854]: I1007 13:50:47.258171 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51eb2e8a5f3171ff324e35a9e5f95f909bb869487d802c7b3a2e743437750262" Oct 07 13:50:47 crc kubenswrapper[4854]: I1007 13:50:47.258233 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-4e30-account-create-8dlnn" Oct 07 13:50:48 crc kubenswrapper[4854]: I1007 13:50:48.916211 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-q66ps"] Oct 07 13:50:48 crc kubenswrapper[4854]: E1007 13:50:48.916933 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cabac5b5-1b98-4966-8046-c3c6a34d5c06" containerName="mariadb-account-create" Oct 07 13:50:48 crc kubenswrapper[4854]: I1007 13:50:48.916950 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="cabac5b5-1b98-4966-8046-c3c6a34d5c06" containerName="mariadb-account-create" Oct 07 13:50:48 crc kubenswrapper[4854]: I1007 13:50:48.917242 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="cabac5b5-1b98-4966-8046-c3c6a34d5c06" containerName="mariadb-account-create" Oct 07 13:50:48 crc kubenswrapper[4854]: I1007 13:50:48.917975 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-q66ps" Oct 07 13:50:48 crc kubenswrapper[4854]: I1007 13:50:48.919800 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 07 13:50:48 crc kubenswrapper[4854]: I1007 13:50:48.920076 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 07 13:50:48 crc kubenswrapper[4854]: I1007 13:50:48.920256 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 07 13:50:48 crc kubenswrapper[4854]: I1007 13:50:48.920948 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-wb46k" Oct 07 13:50:48 crc kubenswrapper[4854]: I1007 13:50:48.927474 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-q66ps"] Oct 07 13:50:49 crc kubenswrapper[4854]: I1007 13:50:49.007784 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdf7l\" (UniqueName: \"kubernetes.io/projected/faca3163-f21e-446d-9672-6273c04b186d-kube-api-access-qdf7l\") pod \"keystone-db-sync-q66ps\" (UID: \"faca3163-f21e-446d-9672-6273c04b186d\") " pod="openstack/keystone-db-sync-q66ps" Oct 07 13:50:49 crc kubenswrapper[4854]: I1007 13:50:49.007856 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faca3163-f21e-446d-9672-6273c04b186d-combined-ca-bundle\") pod \"keystone-db-sync-q66ps\" (UID: \"faca3163-f21e-446d-9672-6273c04b186d\") " pod="openstack/keystone-db-sync-q66ps" Oct 07 13:50:49 crc kubenswrapper[4854]: I1007 13:50:49.007886 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faca3163-f21e-446d-9672-6273c04b186d-config-data\") pod \"keystone-db-sync-q66ps\" (UID: \"faca3163-f21e-446d-9672-6273c04b186d\") " pod="openstack/keystone-db-sync-q66ps" Oct 07 13:50:49 crc kubenswrapper[4854]: I1007 13:50:49.109117 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdf7l\" (UniqueName: \"kubernetes.io/projected/faca3163-f21e-446d-9672-6273c04b186d-kube-api-access-qdf7l\") pod \"keystone-db-sync-q66ps\" (UID: \"faca3163-f21e-446d-9672-6273c04b186d\") " pod="openstack/keystone-db-sync-q66ps" Oct 07 13:50:49 crc kubenswrapper[4854]: I1007 13:50:49.109202 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faca3163-f21e-446d-9672-6273c04b186d-combined-ca-bundle\") pod \"keystone-db-sync-q66ps\" (UID: \"faca3163-f21e-446d-9672-6273c04b186d\") " pod="openstack/keystone-db-sync-q66ps" Oct 07 13:50:49 crc kubenswrapper[4854]: I1007 13:50:49.109220 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faca3163-f21e-446d-9672-6273c04b186d-config-data\") pod \"keystone-db-sync-q66ps\" (UID: \"faca3163-f21e-446d-9672-6273c04b186d\") " pod="openstack/keystone-db-sync-q66ps" Oct 07 13:50:49 crc kubenswrapper[4854]: I1007 13:50:49.113598 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faca3163-f21e-446d-9672-6273c04b186d-combined-ca-bundle\") pod \"keystone-db-sync-q66ps\" (UID: \"faca3163-f21e-446d-9672-6273c04b186d\") " 
pod="openstack/keystone-db-sync-q66ps" Oct 07 13:50:49 crc kubenswrapper[4854]: I1007 13:50:49.114236 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faca3163-f21e-446d-9672-6273c04b186d-config-data\") pod \"keystone-db-sync-q66ps\" (UID: \"faca3163-f21e-446d-9672-6273c04b186d\") " pod="openstack/keystone-db-sync-q66ps" Oct 07 13:50:49 crc kubenswrapper[4854]: I1007 13:50:49.133698 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdf7l\" (UniqueName: \"kubernetes.io/projected/faca3163-f21e-446d-9672-6273c04b186d-kube-api-access-qdf7l\") pod \"keystone-db-sync-q66ps\" (UID: \"faca3163-f21e-446d-9672-6273c04b186d\") " pod="openstack/keystone-db-sync-q66ps" Oct 07 13:50:49 crc kubenswrapper[4854]: I1007 13:50:49.267851 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-q66ps" Oct 07 13:50:49 crc kubenswrapper[4854]: I1007 13:50:49.715901 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-q66ps"] Oct 07 13:50:49 crc kubenswrapper[4854]: W1007 13:50:49.715951 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfaca3163_f21e_446d_9672_6273c04b186d.slice/crio-f3e5d67df99ee1c6442b35827f61fa830cd0732c1c2605daba4ad7e26ec8e9ac WatchSource:0}: Error finding container f3e5d67df99ee1c6442b35827f61fa830cd0732c1c2605daba4ad7e26ec8e9ac: Status 404 returned error can't find the container with id f3e5d67df99ee1c6442b35827f61fa830cd0732c1c2605daba4ad7e26ec8e9ac Oct 07 13:50:50 crc kubenswrapper[4854]: I1007 13:50:50.289724 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-q66ps" event={"ID":"faca3163-f21e-446d-9672-6273c04b186d","Type":"ContainerStarted","Data":"48a55825ee74722bc5f78d76b3d4b17f0e142addc7a31a7c32a9a0dddc32f3e9"} Oct 07 13:50:50 crc kubenswrapper[4854]: I1007 13:50:50.289766 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-q66ps" event={"ID":"faca3163-f21e-446d-9672-6273c04b186d","Type":"ContainerStarted","Data":"f3e5d67df99ee1c6442b35827f61fa830cd0732c1c2605daba4ad7e26ec8e9ac"} Oct 07 13:50:50 crc kubenswrapper[4854]: I1007 13:50:50.311223 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-q66ps" podStartSLOduration=2.311198872 podStartE2EDuration="2.311198872s" podCreationTimestamp="2025-10-07 13:50:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:50:50.308601477 +0000 UTC m=+5166.296433762" watchObservedRunningTime="2025-10-07 13:50:50.311198872 +0000 UTC m=+5166.299031127" Oct 07 13:50:52 crc kubenswrapper[4854]: I1007 13:50:52.322798 4854 generic.go:334] "Generic (PLEG): container finished" podID="faca3163-f21e-446d-9672-6273c04b186d" containerID="48a55825ee74722bc5f78d76b3d4b17f0e142addc7a31a7c32a9a0dddc32f3e9" exitCode=0 Oct 07 13:50:52 crc kubenswrapper[4854]: I1007 13:50:52.322929 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-q66ps" event={"ID":"faca3163-f21e-446d-9672-6273c04b186d","Type":"ContainerDied","Data":"48a55825ee74722bc5f78d76b3d4b17f0e142addc7a31a7c32a9a0dddc32f3e9"} Oct 07 13:50:53 crc kubenswrapper[4854]: I1007 13:50:53.721120 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-q66ps" Oct 07 13:50:53 crc kubenswrapper[4854]: I1007 13:50:53.742699 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdf7l\" (UniqueName: \"kubernetes.io/projected/faca3163-f21e-446d-9672-6273c04b186d-kube-api-access-qdf7l\") pod \"faca3163-f21e-446d-9672-6273c04b186d\" (UID: \"faca3163-f21e-446d-9672-6273c04b186d\") " Oct 07 13:50:53 crc kubenswrapper[4854]: I1007 13:50:53.743004 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faca3163-f21e-446d-9672-6273c04b186d-config-data\") pod \"faca3163-f21e-446d-9672-6273c04b186d\" (UID: \"faca3163-f21e-446d-9672-6273c04b186d\") " Oct 07 13:50:53 crc kubenswrapper[4854]: I1007 13:50:53.743112 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faca3163-f21e-446d-9672-6273c04b186d-combined-ca-bundle\") pod \"faca3163-f21e-446d-9672-6273c04b186d\" (UID: \"faca3163-f21e-446d-9672-6273c04b186d\") " Oct 07 13:50:53 crc kubenswrapper[4854]: I1007 13:50:53.794929 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/faca3163-f21e-446d-9672-6273c04b186d-kube-api-access-qdf7l" (OuterVolumeSpecName: "kube-api-access-qdf7l") pod "faca3163-f21e-446d-9672-6273c04b186d" (UID: "faca3163-f21e-446d-9672-6273c04b186d"). InnerVolumeSpecName "kube-api-access-qdf7l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:50:53 crc kubenswrapper[4854]: I1007 13:50:53.795918 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/faca3163-f21e-446d-9672-6273c04b186d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "faca3163-f21e-446d-9672-6273c04b186d" (UID: "faca3163-f21e-446d-9672-6273c04b186d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:50:53 crc kubenswrapper[4854]: I1007 13:50:53.838467 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/faca3163-f21e-446d-9672-6273c04b186d-config-data" (OuterVolumeSpecName: "config-data") pod "faca3163-f21e-446d-9672-6273c04b186d" (UID: "faca3163-f21e-446d-9672-6273c04b186d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:50:53 crc kubenswrapper[4854]: I1007 13:50:53.844255 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faca3163-f21e-446d-9672-6273c04b186d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:50:53 crc kubenswrapper[4854]: I1007 13:50:53.844287 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdf7l\" (UniqueName: \"kubernetes.io/projected/faca3163-f21e-446d-9672-6273c04b186d-kube-api-access-qdf7l\") on node \"crc\" DevicePath \"\"" Oct 07 13:50:53 crc kubenswrapper[4854]: I1007 13:50:53.844297 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faca3163-f21e-446d-9672-6273c04b186d-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:50:54 crc kubenswrapper[4854]: I1007 13:50:54.344478 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-q66ps" event={"ID":"faca3163-f21e-446d-9672-6273c04b186d","Type":"ContainerDied","Data":"f3e5d67df99ee1c6442b35827f61fa830cd0732c1c2605daba4ad7e26ec8e9ac"} Oct 07 13:50:54 crc kubenswrapper[4854]: I1007 13:50:54.344522 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f3e5d67df99ee1c6442b35827f61fa830cd0732c1c2605daba4ad7e26ec8e9ac" Oct 07 13:50:54 crc kubenswrapper[4854]: I1007 13:50:54.344525 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-q66ps" Oct 07 13:50:54 crc kubenswrapper[4854]: I1007 13:50:54.613675 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85887f4b95-lk5hg"] Oct 07 13:50:54 crc kubenswrapper[4854]: E1007 13:50:54.614730 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faca3163-f21e-446d-9672-6273c04b186d" containerName="keystone-db-sync" Oct 07 13:50:54 crc kubenswrapper[4854]: I1007 13:50:54.614758 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="faca3163-f21e-446d-9672-6273c04b186d" containerName="keystone-db-sync" Oct 07 13:50:54 crc kubenswrapper[4854]: I1007 13:50:54.615181 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="faca3163-f21e-446d-9672-6273c04b186d" containerName="keystone-db-sync" Oct 07 13:50:54 crc kubenswrapper[4854]: I1007 13:50:54.616544 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85887f4b95-lk5hg" Oct 07 13:50:54 crc kubenswrapper[4854]: I1007 13:50:54.652411 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-xcbgn"] Oct 07 13:50:54 crc kubenswrapper[4854]: I1007 13:50:54.653841 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-xcbgn" Oct 07 13:50:54 crc kubenswrapper[4854]: I1007 13:50:54.655421 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85887f4b95-lk5hg"] Oct 07 13:50:54 crc kubenswrapper[4854]: I1007 13:50:54.661389 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 07 13:50:54 crc kubenswrapper[4854]: I1007 13:50:54.661734 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 07 13:50:54 crc kubenswrapper[4854]: I1007 13:50:54.661619 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 07 13:50:54 crc kubenswrapper[4854]: I1007 13:50:54.661738 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-wb46k" Oct 07 13:50:54 crc kubenswrapper[4854]: I1007 13:50:54.664669 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-xcbgn"] Oct 07 13:50:54 crc kubenswrapper[4854]: I1007 13:50:54.757935 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fefbc96f-0f15-4ad9-813f-763a96c25e30-ovsdbserver-nb\") pod \"dnsmasq-dns-85887f4b95-lk5hg\" (UID: \"fefbc96f-0f15-4ad9-813f-763a96c25e30\") " pod="openstack/dnsmasq-dns-85887f4b95-lk5hg" Oct 07 13:50:54 crc kubenswrapper[4854]: I1007 13:50:54.758263 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ebd8f9f7-d9c4-4901-a6a2-bb1028290c64-fernet-keys\") pod \"keystone-bootstrap-xcbgn\" (UID: \"ebd8f9f7-d9c4-4901-a6a2-bb1028290c64\") " pod="openstack/keystone-bootstrap-xcbgn" Oct 07 13:50:54 crc kubenswrapper[4854]: I1007 13:50:54.758453 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7p6x\" (UniqueName: \"kubernetes.io/projected/ebd8f9f7-d9c4-4901-a6a2-bb1028290c64-kube-api-access-t7p6x\") pod \"keystone-bootstrap-xcbgn\" (UID: \"ebd8f9f7-d9c4-4901-a6a2-bb1028290c64\") " pod="openstack/keystone-bootstrap-xcbgn" Oct 07 13:50:54 crc kubenswrapper[4854]: I1007 13:50:54.758564 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7s6jf\" (UniqueName: \"kubernetes.io/projected/fefbc96f-0f15-4ad9-813f-763a96c25e30-kube-api-access-7s6jf\") pod \"dnsmasq-dns-85887f4b95-lk5hg\" (UID: \"fefbc96f-0f15-4ad9-813f-763a96c25e30\") " pod="openstack/dnsmasq-dns-85887f4b95-lk5hg" Oct 07 13:50:54 crc kubenswrapper[4854]: I1007 13:50:54.758658 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ebd8f9f7-d9c4-4901-a6a2-bb1028290c64-credential-keys\") pod \"keystone-bootstrap-xcbgn\" (UID: \"ebd8f9f7-d9c4-4901-a6a2-bb1028290c64\") " pod="openstack/keystone-bootstrap-xcbgn" Oct 07 13:50:54 crc kubenswrapper[4854]: I1007 13:50:54.758783 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ebd8f9f7-d9c4-4901-a6a2-bb1028290c64-scripts\") pod \"keystone-bootstrap-xcbgn\" (UID: \"ebd8f9f7-d9c4-4901-a6a2-bb1028290c64\") " pod="openstack/keystone-bootstrap-xcbgn" Oct 07 13:50:54 crc kubenswrapper[4854]: I1007 13:50:54.759015 4854 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fefbc96f-0f15-4ad9-813f-763a96c25e30-config\") pod \"dnsmasq-dns-85887f4b95-lk5hg\" (UID: \"fefbc96f-0f15-4ad9-813f-763a96c25e30\") " pod="openstack/dnsmasq-dns-85887f4b95-lk5hg" Oct 07 13:50:54 crc kubenswrapper[4854]: I1007 13:50:54.759118 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fefbc96f-0f15-4ad9-813f-763a96c25e30-dns-svc\") pod \"dnsmasq-dns-85887f4b95-lk5hg\" (UID: \"fefbc96f-0f15-4ad9-813f-763a96c25e30\") " pod="openstack/dnsmasq-dns-85887f4b95-lk5hg" Oct 07 13:50:54 crc kubenswrapper[4854]: I1007 13:50:54.759222 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fefbc96f-0f15-4ad9-813f-763a96c25e30-ovsdbserver-sb\") pod \"dnsmasq-dns-85887f4b95-lk5hg\" (UID: \"fefbc96f-0f15-4ad9-813f-763a96c25e30\") " pod="openstack/dnsmasq-dns-85887f4b95-lk5hg" Oct 07 13:50:54 crc kubenswrapper[4854]: I1007 13:50:54.759286 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebd8f9f7-d9c4-4901-a6a2-bb1028290c64-combined-ca-bundle\") pod \"keystone-bootstrap-xcbgn\" (UID: \"ebd8f9f7-d9c4-4901-a6a2-bb1028290c64\") " pod="openstack/keystone-bootstrap-xcbgn" Oct 07 13:50:54 crc kubenswrapper[4854]: I1007 13:50:54.759485 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebd8f9f7-d9c4-4901-a6a2-bb1028290c64-config-data\") pod \"keystone-bootstrap-xcbgn\" (UID: \"ebd8f9f7-d9c4-4901-a6a2-bb1028290c64\") " pod="openstack/keystone-bootstrap-xcbgn" Oct 07 13:50:54 crc kubenswrapper[4854]: I1007 13:50:54.861454 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ebd8f9f7-d9c4-4901-a6a2-bb1028290c64-scripts\") pod \"keystone-bootstrap-xcbgn\" (UID: \"ebd8f9f7-d9c4-4901-a6a2-bb1028290c64\") " pod="openstack/keystone-bootstrap-xcbgn" Oct 07 13:50:54 crc kubenswrapper[4854]: I1007 13:50:54.861557 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fefbc96f-0f15-4ad9-813f-763a96c25e30-config\") pod \"dnsmasq-dns-85887f4b95-lk5hg\" (UID: \"fefbc96f-0f15-4ad9-813f-763a96c25e30\") " pod="openstack/dnsmasq-dns-85887f4b95-lk5hg" Oct 07 13:50:54 crc kubenswrapper[4854]: I1007 13:50:54.861593 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fefbc96f-0f15-4ad9-813f-763a96c25e30-dns-svc\") pod \"dnsmasq-dns-85887f4b95-lk5hg\" (UID: \"fefbc96f-0f15-4ad9-813f-763a96c25e30\") " pod="openstack/dnsmasq-dns-85887f4b95-lk5hg" Oct 07 13:50:54 crc kubenswrapper[4854]: I1007 13:50:54.861634 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fefbc96f-0f15-4ad9-813f-763a96c25e30-ovsdbserver-sb\") pod \"dnsmasq-dns-85887f4b95-lk5hg\" (UID: \"fefbc96f-0f15-4ad9-813f-763a96c25e30\") " pod="openstack/dnsmasq-dns-85887f4b95-lk5hg" Oct 07 13:50:54 crc kubenswrapper[4854]: I1007 13:50:54.861669 4854 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebd8f9f7-d9c4-4901-a6a2-bb1028290c64-combined-ca-bundle\") pod \"keystone-bootstrap-xcbgn\" (UID: \"ebd8f9f7-d9c4-4901-a6a2-bb1028290c64\") " pod="openstack/keystone-bootstrap-xcbgn" Oct 07 13:50:54 crc kubenswrapper[4854]: I1007 13:50:54.861718 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebd8f9f7-d9c4-4901-a6a2-bb1028290c64-config-data\") pod \"keystone-bootstrap-xcbgn\" (UID: \"ebd8f9f7-d9c4-4901-a6a2-bb1028290c64\") " pod="openstack/keystone-bootstrap-xcbgn" Oct 07 13:50:54 crc kubenswrapper[4854]: I1007 13:50:54.861754 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fefbc96f-0f15-4ad9-813f-763a96c25e30-ovsdbserver-nb\") pod \"dnsmasq-dns-85887f4b95-lk5hg\" (UID: \"fefbc96f-0f15-4ad9-813f-763a96c25e30\") " pod="openstack/dnsmasq-dns-85887f4b95-lk5hg" Oct 07 13:50:54 crc kubenswrapper[4854]: I1007 13:50:54.861856 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ebd8f9f7-d9c4-4901-a6a2-bb1028290c64-fernet-keys\") pod \"keystone-bootstrap-xcbgn\" (UID: \"ebd8f9f7-d9c4-4901-a6a2-bb1028290c64\") " pod="openstack/keystone-bootstrap-xcbgn" Oct 07 13:50:54 crc kubenswrapper[4854]: I1007 13:50:54.861895 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7p6x\" (UniqueName: \"kubernetes.io/projected/ebd8f9f7-d9c4-4901-a6a2-bb1028290c64-kube-api-access-t7p6x\") pod \"keystone-bootstrap-xcbgn\" (UID: \"ebd8f9f7-d9c4-4901-a6a2-bb1028290c64\") " pod="openstack/keystone-bootstrap-xcbgn" Oct 07 13:50:54 crc kubenswrapper[4854]: I1007 13:50:54.861940 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7s6jf\" (UniqueName: \"kubernetes.io/projected/fefbc96f-0f15-4ad9-813f-763a96c25e30-kube-api-access-7s6jf\") pod \"dnsmasq-dns-85887f4b95-lk5hg\" (UID: \"fefbc96f-0f15-4ad9-813f-763a96c25e30\") " pod="openstack/dnsmasq-dns-85887f4b95-lk5hg" Oct 07 13:50:54 crc kubenswrapper[4854]: I1007 13:50:54.862012 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ebd8f9f7-d9c4-4901-a6a2-bb1028290c64-credential-keys\") pod \"keystone-bootstrap-xcbgn\" (UID: \"ebd8f9f7-d9c4-4901-a6a2-bb1028290c64\") " pod="openstack/keystone-bootstrap-xcbgn" Oct 07 13:50:54 crc kubenswrapper[4854]: I1007 13:50:54.862481 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fefbc96f-0f15-4ad9-813f-763a96c25e30-config\") pod \"dnsmasq-dns-85887f4b95-lk5hg\" (UID: \"fefbc96f-0f15-4ad9-813f-763a96c25e30\") " pod="openstack/dnsmasq-dns-85887f4b95-lk5hg" Oct 07 13:50:54 crc kubenswrapper[4854]: I1007 13:50:54.863083 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fefbc96f-0f15-4ad9-813f-763a96c25e30-dns-svc\") pod \"dnsmasq-dns-85887f4b95-lk5hg\" (UID: \"fefbc96f-0f15-4ad9-813f-763a96c25e30\") " pod="openstack/dnsmasq-dns-85887f4b95-lk5hg" Oct 07 13:50:54 crc kubenswrapper[4854]: I1007 13:50:54.863083 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fefbc96f-0f15-4ad9-813f-763a96c25e30-ovsdbserver-sb\") 
pod \"dnsmasq-dns-85887f4b95-lk5hg\" (UID: \"fefbc96f-0f15-4ad9-813f-763a96c25e30\") " pod="openstack/dnsmasq-dns-85887f4b95-lk5hg" Oct 07 13:50:54 crc kubenswrapper[4854]: I1007 13:50:54.863229 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fefbc96f-0f15-4ad9-813f-763a96c25e30-ovsdbserver-nb\") pod \"dnsmasq-dns-85887f4b95-lk5hg\" (UID: \"fefbc96f-0f15-4ad9-813f-763a96c25e30\") " pod="openstack/dnsmasq-dns-85887f4b95-lk5hg" Oct 07 13:50:54 crc kubenswrapper[4854]: I1007 13:50:54.866404 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ebd8f9f7-d9c4-4901-a6a2-bb1028290c64-scripts\") pod \"keystone-bootstrap-xcbgn\" (UID: \"ebd8f9f7-d9c4-4901-a6a2-bb1028290c64\") " pod="openstack/keystone-bootstrap-xcbgn" Oct 07 13:50:54 crc kubenswrapper[4854]: I1007 13:50:54.867016 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ebd8f9f7-d9c4-4901-a6a2-bb1028290c64-credential-keys\") pod \"keystone-bootstrap-xcbgn\" (UID: \"ebd8f9f7-d9c4-4901-a6a2-bb1028290c64\") " pod="openstack/keystone-bootstrap-xcbgn" Oct 07 13:50:54 crc kubenswrapper[4854]: I1007 13:50:54.867715 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ebd8f9f7-d9c4-4901-a6a2-bb1028290c64-fernet-keys\") pod \"keystone-bootstrap-xcbgn\" (UID: \"ebd8f9f7-d9c4-4901-a6a2-bb1028290c64\") " pod="openstack/keystone-bootstrap-xcbgn" Oct 07 13:50:54 crc kubenswrapper[4854]: I1007 13:50:54.870193 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebd8f9f7-d9c4-4901-a6a2-bb1028290c64-combined-ca-bundle\") pod \"keystone-bootstrap-xcbgn\" (UID: \"ebd8f9f7-d9c4-4901-a6a2-bb1028290c64\") " pod="openstack/keystone-bootstrap-xcbgn" Oct 07 13:50:54 crc kubenswrapper[4854]: I1007 13:50:54.878702 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebd8f9f7-d9c4-4901-a6a2-bb1028290c64-config-data\") pod \"keystone-bootstrap-xcbgn\" (UID: \"ebd8f9f7-d9c4-4901-a6a2-bb1028290c64\") " pod="openstack/keystone-bootstrap-xcbgn" Oct 07 13:50:54 crc kubenswrapper[4854]: I1007 13:50:54.879536 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7s6jf\" (UniqueName: \"kubernetes.io/projected/fefbc96f-0f15-4ad9-813f-763a96c25e30-kube-api-access-7s6jf\") pod \"dnsmasq-dns-85887f4b95-lk5hg\" (UID: \"fefbc96f-0f15-4ad9-813f-763a96c25e30\") " pod="openstack/dnsmasq-dns-85887f4b95-lk5hg" Oct 07 13:50:54 crc kubenswrapper[4854]: I1007 13:50:54.881095 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7p6x\" (UniqueName: \"kubernetes.io/projected/ebd8f9f7-d9c4-4901-a6a2-bb1028290c64-kube-api-access-t7p6x\") pod \"keystone-bootstrap-xcbgn\" (UID: \"ebd8f9f7-d9c4-4901-a6a2-bb1028290c64\") " pod="openstack/keystone-bootstrap-xcbgn" Oct 07 13:50:54 crc kubenswrapper[4854]: I1007 13:50:54.962220 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85887f4b95-lk5hg" Oct 07 13:50:54 crc kubenswrapper[4854]: I1007 13:50:54.989333 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-xcbgn" Oct 07 13:50:55 crc kubenswrapper[4854]: I1007 13:50:55.448758 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85887f4b95-lk5hg"] Oct 07 13:50:55 crc kubenswrapper[4854]: I1007 13:50:55.502809 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-xcbgn"] Oct 07 13:50:55 crc kubenswrapper[4854]: W1007 13:50:55.503246 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podebd8f9f7_d9c4_4901_a6a2_bb1028290c64.slice/crio-8fa08875703082758fe9367f9cd4cda80bd1f04552aba5478b2fe27e6553e53f WatchSource:0}: Error finding container 8fa08875703082758fe9367f9cd4cda80bd1f04552aba5478b2fe27e6553e53f: Status 404 returned error can't find the container with id 8fa08875703082758fe9367f9cd4cda80bd1f04552aba5478b2fe27e6553e53f Oct 07 13:50:56 crc kubenswrapper[4854]: I1007 13:50:56.361353 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85887f4b95-lk5hg" event={"ID":"fefbc96f-0f15-4ad9-813f-763a96c25e30","Type":"ContainerDied","Data":"49927adb42284504a9f85efd83a85d5e15c25ec75515b8158d731b454c713685"} Oct 07 13:50:56 crc kubenswrapper[4854]: I1007 13:50:56.361137 4854 generic.go:334] "Generic (PLEG): container finished" podID="fefbc96f-0f15-4ad9-813f-763a96c25e30" containerID="49927adb42284504a9f85efd83a85d5e15c25ec75515b8158d731b454c713685" exitCode=0 Oct 07 13:50:56 crc kubenswrapper[4854]: I1007 13:50:56.362165 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85887f4b95-lk5hg" event={"ID":"fefbc96f-0f15-4ad9-813f-763a96c25e30","Type":"ContainerStarted","Data":"0eda816e3f5d26a2ff581d4def7c110ed0aedf96a8b0192e11a807fdd6adfaf7"} Oct 07 13:50:56 crc kubenswrapper[4854]: I1007 13:50:56.366486 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xcbgn" event={"ID":"ebd8f9f7-d9c4-4901-a6a2-bb1028290c64","Type":"ContainerStarted","Data":"32a0afc3d512ac404f6dbc29f85124ed93f3712c65f6763cc70eeb8a10f831bd"} Oct 07 13:50:56 crc kubenswrapper[4854]: I1007 13:50:56.366534 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xcbgn" event={"ID":"ebd8f9f7-d9c4-4901-a6a2-bb1028290c64","Type":"ContainerStarted","Data":"8fa08875703082758fe9367f9cd4cda80bd1f04552aba5478b2fe27e6553e53f"} Oct 07 13:50:56 crc kubenswrapper[4854]: I1007 13:50:56.409596 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-xcbgn" podStartSLOduration=2.409577221 podStartE2EDuration="2.409577221s" podCreationTimestamp="2025-10-07 13:50:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:50:56.405063142 +0000 UTC m=+5172.392895417" watchObservedRunningTime="2025-10-07 13:50:56.409577221 +0000 UTC m=+5172.397409476" Oct 07 13:50:57 crc kubenswrapper[4854]: I1007 13:50:57.382317 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85887f4b95-lk5hg" event={"ID":"fefbc96f-0f15-4ad9-813f-763a96c25e30","Type":"ContainerStarted","Data":"944fdd35ec9d291c34e62c80116adb620f505803872c14e3f989209fef4cf521"} Oct 07 13:50:57 crc kubenswrapper[4854]: I1007 13:50:57.383117 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-85887f4b95-lk5hg" Oct 07 13:50:57 crc kubenswrapper[4854]: I1007 
13:50:57.427366 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-85887f4b95-lk5hg" podStartSLOduration=3.427347059 podStartE2EDuration="3.427347059s" podCreationTimestamp="2025-10-07 13:50:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:50:57.420865794 +0000 UTC m=+5173.408698049" watchObservedRunningTime="2025-10-07 13:50:57.427347059 +0000 UTC m=+5173.415179314" Oct 07 13:50:59 crc kubenswrapper[4854]: I1007 13:50:59.406360 4854 generic.go:334] "Generic (PLEG): container finished" podID="ebd8f9f7-d9c4-4901-a6a2-bb1028290c64" containerID="32a0afc3d512ac404f6dbc29f85124ed93f3712c65f6763cc70eeb8a10f831bd" exitCode=0 Oct 07 13:50:59 crc kubenswrapper[4854]: I1007 13:50:59.406694 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xcbgn" event={"ID":"ebd8f9f7-d9c4-4901-a6a2-bb1028290c64","Type":"ContainerDied","Data":"32a0afc3d512ac404f6dbc29f85124ed93f3712c65f6763cc70eeb8a10f831bd"} Oct 07 13:51:00 crc kubenswrapper[4854]: I1007 13:51:00.861994 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xcbgn" Oct 07 13:51:00 crc kubenswrapper[4854]: I1007 13:51:00.991975 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ebd8f9f7-d9c4-4901-a6a2-bb1028290c64-credential-keys\") pod \"ebd8f9f7-d9c4-4901-a6a2-bb1028290c64\" (UID: \"ebd8f9f7-d9c4-4901-a6a2-bb1028290c64\") " Oct 07 13:51:00 crc kubenswrapper[4854]: I1007 13:51:00.992250 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebd8f9f7-d9c4-4901-a6a2-bb1028290c64-config-data\") pod \"ebd8f9f7-d9c4-4901-a6a2-bb1028290c64\" (UID: \"ebd8f9f7-d9c4-4901-a6a2-bb1028290c64\") " Oct 07 13:51:00 crc kubenswrapper[4854]: I1007 13:51:00.992449 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ebd8f9f7-d9c4-4901-a6a2-bb1028290c64-fernet-keys\") pod \"ebd8f9f7-d9c4-4901-a6a2-bb1028290c64\" (UID: \"ebd8f9f7-d9c4-4901-a6a2-bb1028290c64\") " Oct 07 13:51:00 crc kubenswrapper[4854]: I1007 13:51:00.992550 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7p6x\" (UniqueName: \"kubernetes.io/projected/ebd8f9f7-d9c4-4901-a6a2-bb1028290c64-kube-api-access-t7p6x\") pod \"ebd8f9f7-d9c4-4901-a6a2-bb1028290c64\" (UID: \"ebd8f9f7-d9c4-4901-a6a2-bb1028290c64\") " Oct 07 13:51:00 crc kubenswrapper[4854]: I1007 13:51:00.992620 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ebd8f9f7-d9c4-4901-a6a2-bb1028290c64-scripts\") pod \"ebd8f9f7-d9c4-4901-a6a2-bb1028290c64\" (UID: \"ebd8f9f7-d9c4-4901-a6a2-bb1028290c64\") " Oct 07 13:51:00 crc kubenswrapper[4854]: I1007 13:51:00.992655 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebd8f9f7-d9c4-4901-a6a2-bb1028290c64-combined-ca-bundle\") pod \"ebd8f9f7-d9c4-4901-a6a2-bb1028290c64\" (UID: \"ebd8f9f7-d9c4-4901-a6a2-bb1028290c64\") " Oct 07 13:51:00 crc kubenswrapper[4854]: I1007 13:51:00.999098 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/ebd8f9f7-d9c4-4901-a6a2-bb1028290c64-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "ebd8f9f7-d9c4-4901-a6a2-bb1028290c64" (UID: "ebd8f9f7-d9c4-4901-a6a2-bb1028290c64"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:51:01 crc kubenswrapper[4854]: I1007 13:51:01.001984 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebd8f9f7-d9c4-4901-a6a2-bb1028290c64-scripts" (OuterVolumeSpecName: "scripts") pod "ebd8f9f7-d9c4-4901-a6a2-bb1028290c64" (UID: "ebd8f9f7-d9c4-4901-a6a2-bb1028290c64"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:51:01 crc kubenswrapper[4854]: I1007 13:51:01.002536 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebd8f9f7-d9c4-4901-a6a2-bb1028290c64-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "ebd8f9f7-d9c4-4901-a6a2-bb1028290c64" (UID: "ebd8f9f7-d9c4-4901-a6a2-bb1028290c64"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:51:01 crc kubenswrapper[4854]: I1007 13:51:01.003419 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebd8f9f7-d9c4-4901-a6a2-bb1028290c64-kube-api-access-t7p6x" (OuterVolumeSpecName: "kube-api-access-t7p6x") pod "ebd8f9f7-d9c4-4901-a6a2-bb1028290c64" (UID: "ebd8f9f7-d9c4-4901-a6a2-bb1028290c64"). InnerVolumeSpecName "kube-api-access-t7p6x". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:51:01 crc kubenswrapper[4854]: I1007 13:51:01.030653 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebd8f9f7-d9c4-4901-a6a2-bb1028290c64-config-data" (OuterVolumeSpecName: "config-data") pod "ebd8f9f7-d9c4-4901-a6a2-bb1028290c64" (UID: "ebd8f9f7-d9c4-4901-a6a2-bb1028290c64"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:51:01 crc kubenswrapper[4854]: I1007 13:51:01.038263 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebd8f9f7-d9c4-4901-a6a2-bb1028290c64-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ebd8f9f7-d9c4-4901-a6a2-bb1028290c64" (UID: "ebd8f9f7-d9c4-4901-a6a2-bb1028290c64"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:51:01 crc kubenswrapper[4854]: I1007 13:51:01.095685 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebd8f9f7-d9c4-4901-a6a2-bb1028290c64-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:51:01 crc kubenswrapper[4854]: I1007 13:51:01.095744 4854 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ebd8f9f7-d9c4-4901-a6a2-bb1028290c64-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 07 13:51:01 crc kubenswrapper[4854]: I1007 13:51:01.095766 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7p6x\" (UniqueName: \"kubernetes.io/projected/ebd8f9f7-d9c4-4901-a6a2-bb1028290c64-kube-api-access-t7p6x\") on node \"crc\" DevicePath \"\"" Oct 07 13:51:01 crc kubenswrapper[4854]: I1007 13:51:01.095786 4854 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ebd8f9f7-d9c4-4901-a6a2-bb1028290c64-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 13:51:01 crc kubenswrapper[4854]: I1007 13:51:01.095806 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebd8f9f7-d9c4-4901-a6a2-bb1028290c64-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:51:01 crc kubenswrapper[4854]: I1007 13:51:01.095825 4854 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ebd8f9f7-d9c4-4901-a6a2-bb1028290c64-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 07 13:51:01 crc kubenswrapper[4854]: I1007 13:51:01.430622 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xcbgn" event={"ID":"ebd8f9f7-d9c4-4901-a6a2-bb1028290c64","Type":"ContainerDied","Data":"8fa08875703082758fe9367f9cd4cda80bd1f04552aba5478b2fe27e6553e53f"} Oct 07 13:51:01 crc kubenswrapper[4854]: I1007 13:51:01.430662 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8fa08875703082758fe9367f9cd4cda80bd1f04552aba5478b2fe27e6553e53f" Oct 07 13:51:01 crc kubenswrapper[4854]: I1007 13:51:01.430699 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-xcbgn" Oct 07 13:51:01 crc kubenswrapper[4854]: I1007 13:51:01.520673 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-xcbgn"] Oct 07 13:51:01 crc kubenswrapper[4854]: I1007 13:51:01.526575 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-xcbgn"] Oct 07 13:51:01 crc kubenswrapper[4854]: I1007 13:51:01.652464 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-257hr"] Oct 07 13:51:01 crc kubenswrapper[4854]: E1007 13:51:01.652784 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebd8f9f7-d9c4-4901-a6a2-bb1028290c64" containerName="keystone-bootstrap" Oct 07 13:51:01 crc kubenswrapper[4854]: I1007 13:51:01.652797 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebd8f9f7-d9c4-4901-a6a2-bb1028290c64" containerName="keystone-bootstrap" Oct 07 13:51:01 crc kubenswrapper[4854]: I1007 13:51:01.652971 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebd8f9f7-d9c4-4901-a6a2-bb1028290c64" containerName="keystone-bootstrap" Oct 07 13:51:01 crc kubenswrapper[4854]: I1007 13:51:01.653566 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-257hr" Oct 07 13:51:01 crc kubenswrapper[4854]: I1007 13:51:01.656304 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 07 13:51:01 crc kubenswrapper[4854]: I1007 13:51:01.656564 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-wb46k" Oct 07 13:51:01 crc kubenswrapper[4854]: I1007 13:51:01.656721 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 07 13:51:01 crc kubenswrapper[4854]: I1007 13:51:01.656741 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 07 13:51:01 crc kubenswrapper[4854]: I1007 13:51:01.688278 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-257hr"] Oct 07 13:51:01 crc kubenswrapper[4854]: I1007 13:51:01.807852 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ad13166b-d165-4483-9e36-967308c3f93b-credential-keys\") pod \"keystone-bootstrap-257hr\" (UID: \"ad13166b-d165-4483-9e36-967308c3f93b\") " pod="openstack/keystone-bootstrap-257hr" Oct 07 13:51:01 crc kubenswrapper[4854]: I1007 13:51:01.807953 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad13166b-d165-4483-9e36-967308c3f93b-combined-ca-bundle\") pod \"keystone-bootstrap-257hr\" (UID: \"ad13166b-d165-4483-9e36-967308c3f93b\") " pod="openstack/keystone-bootstrap-257hr" Oct 07 13:51:01 crc kubenswrapper[4854]: I1007 13:51:01.808077 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad13166b-d165-4483-9e36-967308c3f93b-config-data\") pod \"keystone-bootstrap-257hr\" (UID: \"ad13166b-d165-4483-9e36-967308c3f93b\") " pod="openstack/keystone-bootstrap-257hr" Oct 07 13:51:01 crc kubenswrapper[4854]: I1007 13:51:01.808591 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/ad13166b-d165-4483-9e36-967308c3f93b-fernet-keys\") pod \"keystone-bootstrap-257hr\" (UID: \"ad13166b-d165-4483-9e36-967308c3f93b\") " pod="openstack/keystone-bootstrap-257hr" Oct 07 13:51:01 crc kubenswrapper[4854]: I1007 13:51:01.808752 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8ndz\" (UniqueName: \"kubernetes.io/projected/ad13166b-d165-4483-9e36-967308c3f93b-kube-api-access-b8ndz\") pod \"keystone-bootstrap-257hr\" (UID: \"ad13166b-d165-4483-9e36-967308c3f93b\") " pod="openstack/keystone-bootstrap-257hr" Oct 07 13:51:01 crc kubenswrapper[4854]: I1007 13:51:01.808842 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad13166b-d165-4483-9e36-967308c3f93b-scripts\") pod \"keystone-bootstrap-257hr\" (UID: \"ad13166b-d165-4483-9e36-967308c3f93b\") " pod="openstack/keystone-bootstrap-257hr" Oct 07 13:51:01 crc kubenswrapper[4854]: I1007 13:51:01.910368 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ad13166b-d165-4483-9e36-967308c3f93b-fernet-keys\") pod \"keystone-bootstrap-257hr\" (UID: \"ad13166b-d165-4483-9e36-967308c3f93b\") " pod="openstack/keystone-bootstrap-257hr" Oct 07 13:51:01 crc kubenswrapper[4854]: I1007 13:51:01.910518 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8ndz\" (UniqueName: \"kubernetes.io/projected/ad13166b-d165-4483-9e36-967308c3f93b-kube-api-access-b8ndz\") pod \"keystone-bootstrap-257hr\" (UID: \"ad13166b-d165-4483-9e36-967308c3f93b\") " pod="openstack/keystone-bootstrap-257hr" Oct 07 13:51:01 crc kubenswrapper[4854]: I1007 13:51:01.910603 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad13166b-d165-4483-9e36-967308c3f93b-scripts\") pod \"keystone-bootstrap-257hr\" (UID: \"ad13166b-d165-4483-9e36-967308c3f93b\") " pod="openstack/keystone-bootstrap-257hr" Oct 07 13:51:01 crc kubenswrapper[4854]: I1007 13:51:01.910711 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ad13166b-d165-4483-9e36-967308c3f93b-credential-keys\") pod \"keystone-bootstrap-257hr\" (UID: \"ad13166b-d165-4483-9e36-967308c3f93b\") " pod="openstack/keystone-bootstrap-257hr" Oct 07 13:51:01 crc kubenswrapper[4854]: I1007 13:51:01.910768 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad13166b-d165-4483-9e36-967308c3f93b-combined-ca-bundle\") pod \"keystone-bootstrap-257hr\" (UID: \"ad13166b-d165-4483-9e36-967308c3f93b\") " pod="openstack/keystone-bootstrap-257hr" Oct 07 13:51:01 crc kubenswrapper[4854]: I1007 13:51:01.910858 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad13166b-d165-4483-9e36-967308c3f93b-config-data\") pod \"keystone-bootstrap-257hr\" (UID: \"ad13166b-d165-4483-9e36-967308c3f93b\") " pod="openstack/keystone-bootstrap-257hr" Oct 07 13:51:01 crc kubenswrapper[4854]: I1007 13:51:01.918813 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ad13166b-d165-4483-9e36-967308c3f93b-credential-keys\") pod \"keystone-bootstrap-257hr\" (UID: 
\"ad13166b-d165-4483-9e36-967308c3f93b\") " pod="openstack/keystone-bootstrap-257hr" Oct 07 13:51:01 crc kubenswrapper[4854]: I1007 13:51:01.919047 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ad13166b-d165-4483-9e36-967308c3f93b-fernet-keys\") pod \"keystone-bootstrap-257hr\" (UID: \"ad13166b-d165-4483-9e36-967308c3f93b\") " pod="openstack/keystone-bootstrap-257hr" Oct 07 13:51:01 crc kubenswrapper[4854]: I1007 13:51:01.922337 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad13166b-d165-4483-9e36-967308c3f93b-config-data\") pod \"keystone-bootstrap-257hr\" (UID: \"ad13166b-d165-4483-9e36-967308c3f93b\") " pod="openstack/keystone-bootstrap-257hr" Oct 07 13:51:01 crc kubenswrapper[4854]: I1007 13:51:01.922835 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad13166b-d165-4483-9e36-967308c3f93b-scripts\") pod \"keystone-bootstrap-257hr\" (UID: \"ad13166b-d165-4483-9e36-967308c3f93b\") " pod="openstack/keystone-bootstrap-257hr" Oct 07 13:51:01 crc kubenswrapper[4854]: I1007 13:51:01.927019 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad13166b-d165-4483-9e36-967308c3f93b-combined-ca-bundle\") pod \"keystone-bootstrap-257hr\" (UID: \"ad13166b-d165-4483-9e36-967308c3f93b\") " pod="openstack/keystone-bootstrap-257hr" Oct 07 13:51:01 crc kubenswrapper[4854]: I1007 13:51:01.945278 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8ndz\" (UniqueName: \"kubernetes.io/projected/ad13166b-d165-4483-9e36-967308c3f93b-kube-api-access-b8ndz\") pod \"keystone-bootstrap-257hr\" (UID: \"ad13166b-d165-4483-9e36-967308c3f93b\") " pod="openstack/keystone-bootstrap-257hr" Oct 07 13:51:01 crc kubenswrapper[4854]: I1007 13:51:01.990601 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-257hr" Oct 07 13:51:02 crc kubenswrapper[4854]: I1007 13:51:02.486707 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-257hr"] Oct 07 13:51:02 crc kubenswrapper[4854]: I1007 13:51:02.719327 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebd8f9f7-d9c4-4901-a6a2-bb1028290c64" path="/var/lib/kubelet/pods/ebd8f9f7-d9c4-4901-a6a2-bb1028290c64/volumes" Oct 07 13:51:03 crc kubenswrapper[4854]: I1007 13:51:03.459163 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-257hr" event={"ID":"ad13166b-d165-4483-9e36-967308c3f93b","Type":"ContainerStarted","Data":"39512b3157f2fc83c7c64b3331b61692b937f98a439b3da1fb9b2d783bb5ab8d"} Oct 07 13:51:03 crc kubenswrapper[4854]: I1007 13:51:03.459538 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-257hr" event={"ID":"ad13166b-d165-4483-9e36-967308c3f93b","Type":"ContainerStarted","Data":"f79eb3aaacccddabf17b06e544916d3ccd8b34e4df1440889f7f457fcf617f7e"} Oct 07 13:51:03 crc kubenswrapper[4854]: I1007 13:51:03.495299 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-257hr" podStartSLOduration=2.495230597 podStartE2EDuration="2.495230597s" podCreationTimestamp="2025-10-07 13:51:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:51:03.483688527 +0000 UTC m=+5179.471520812" watchObservedRunningTime="2025-10-07 13:51:03.495230597 +0000 UTC m=+5179.483062892" Oct 07 13:51:04 crc kubenswrapper[4854]: I1007 13:51:04.964435 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-85887f4b95-lk5hg" Oct 07 13:51:05 crc kubenswrapper[4854]: I1007 13:51:05.056216 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6dd484fc77-4qglj"] Oct 07 13:51:05 crc kubenswrapper[4854]: I1007 13:51:05.057193 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6dd484fc77-4qglj" podUID="d31c15dd-5829-4953-9784-0297b0ddf02a" containerName="dnsmasq-dns" containerID="cri-o://549317c676ec2762c187b3e3db339baf213e4e525320d62c088c10e6bed6c35a" gracePeriod=10 Oct 07 13:51:05 crc kubenswrapper[4854]: I1007 13:51:05.483601 4854 generic.go:334] "Generic (PLEG): container finished" podID="ad13166b-d165-4483-9e36-967308c3f93b" containerID="39512b3157f2fc83c7c64b3331b61692b937f98a439b3da1fb9b2d783bb5ab8d" exitCode=0 Oct 07 13:51:05 crc kubenswrapper[4854]: I1007 13:51:05.483648 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-257hr" event={"ID":"ad13166b-d165-4483-9e36-967308c3f93b","Type":"ContainerDied","Data":"39512b3157f2fc83c7c64b3331b61692b937f98a439b3da1fb9b2d783bb5ab8d"} Oct 07 13:51:05 crc kubenswrapper[4854]: I1007 13:51:05.485805 4854 generic.go:334] "Generic (PLEG): container finished" podID="d31c15dd-5829-4953-9784-0297b0ddf02a" containerID="549317c676ec2762c187b3e3db339baf213e4e525320d62c088c10e6bed6c35a" exitCode=0 Oct 07 13:51:05 crc kubenswrapper[4854]: I1007 13:51:05.485839 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dd484fc77-4qglj" event={"ID":"d31c15dd-5829-4953-9784-0297b0ddf02a","Type":"ContainerDied","Data":"549317c676ec2762c187b3e3db339baf213e4e525320d62c088c10e6bed6c35a"} Oct 07 13:51:05 crc kubenswrapper[4854]: I1007 
13:51:05.562389 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6dd484fc77-4qglj" Oct 07 13:51:05 crc kubenswrapper[4854]: I1007 13:51:05.580380 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d31c15dd-5829-4953-9784-0297b0ddf02a-ovsdbserver-sb\") pod \"d31c15dd-5829-4953-9784-0297b0ddf02a\" (UID: \"d31c15dd-5829-4953-9784-0297b0ddf02a\") " Oct 07 13:51:05 crc kubenswrapper[4854]: I1007 13:51:05.623996 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d31c15dd-5829-4953-9784-0297b0ddf02a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d31c15dd-5829-4953-9784-0297b0ddf02a" (UID: "d31c15dd-5829-4953-9784-0297b0ddf02a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:51:05 crc kubenswrapper[4854]: I1007 13:51:05.681633 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d31c15dd-5829-4953-9784-0297b0ddf02a-dns-svc\") pod \"d31c15dd-5829-4953-9784-0297b0ddf02a\" (UID: \"d31c15dd-5829-4953-9784-0297b0ddf02a\") " Oct 07 13:51:05 crc kubenswrapper[4854]: I1007 13:51:05.681739 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d31c15dd-5829-4953-9784-0297b0ddf02a-config\") pod \"d31c15dd-5829-4953-9784-0297b0ddf02a\" (UID: \"d31c15dd-5829-4953-9784-0297b0ddf02a\") " Oct 07 13:51:05 crc kubenswrapper[4854]: I1007 13:51:05.681785 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d31c15dd-5829-4953-9784-0297b0ddf02a-ovsdbserver-nb\") pod \"d31c15dd-5829-4953-9784-0297b0ddf02a\" (UID: \"d31c15dd-5829-4953-9784-0297b0ddf02a\") " Oct 07 13:51:05 crc kubenswrapper[4854]: I1007 13:51:05.681840 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rd74m\" (UniqueName: \"kubernetes.io/projected/d31c15dd-5829-4953-9784-0297b0ddf02a-kube-api-access-rd74m\") pod \"d31c15dd-5829-4953-9784-0297b0ddf02a\" (UID: \"d31c15dd-5829-4953-9784-0297b0ddf02a\") " Oct 07 13:51:05 crc kubenswrapper[4854]: I1007 13:51:05.682042 4854 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d31c15dd-5829-4953-9784-0297b0ddf02a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 13:51:05 crc kubenswrapper[4854]: I1007 13:51:05.684946 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d31c15dd-5829-4953-9784-0297b0ddf02a-kube-api-access-rd74m" (OuterVolumeSpecName: "kube-api-access-rd74m") pod "d31c15dd-5829-4953-9784-0297b0ddf02a" (UID: "d31c15dd-5829-4953-9784-0297b0ddf02a"). InnerVolumeSpecName "kube-api-access-rd74m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:51:05 crc kubenswrapper[4854]: I1007 13:51:05.715559 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d31c15dd-5829-4953-9784-0297b0ddf02a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d31c15dd-5829-4953-9784-0297b0ddf02a" (UID: "d31c15dd-5829-4953-9784-0297b0ddf02a"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:51:05 crc kubenswrapper[4854]: I1007 13:51:05.716622 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d31c15dd-5829-4953-9784-0297b0ddf02a-config" (OuterVolumeSpecName: "config") pod "d31c15dd-5829-4953-9784-0297b0ddf02a" (UID: "d31c15dd-5829-4953-9784-0297b0ddf02a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:51:05 crc kubenswrapper[4854]: I1007 13:51:05.729882 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d31c15dd-5829-4953-9784-0297b0ddf02a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d31c15dd-5829-4953-9784-0297b0ddf02a" (UID: "d31c15dd-5829-4953-9784-0297b0ddf02a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:51:05 crc kubenswrapper[4854]: I1007 13:51:05.783318 4854 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d31c15dd-5829-4953-9784-0297b0ddf02a-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 13:51:05 crc kubenswrapper[4854]: I1007 13:51:05.783355 4854 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d31c15dd-5829-4953-9784-0297b0ddf02a-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:51:05 crc kubenswrapper[4854]: I1007 13:51:05.783365 4854 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d31c15dd-5829-4953-9784-0297b0ddf02a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 13:51:05 crc kubenswrapper[4854]: I1007 13:51:05.783374 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rd74m\" (UniqueName: \"kubernetes.io/projected/d31c15dd-5829-4953-9784-0297b0ddf02a-kube-api-access-rd74m\") on node \"crc\" DevicePath \"\"" Oct 07 13:51:06 crc kubenswrapper[4854]: I1007 13:51:06.499993 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dd484fc77-4qglj" event={"ID":"d31c15dd-5829-4953-9784-0297b0ddf02a","Type":"ContainerDied","Data":"3060eaaf01a0aee91584ed801c627abdb0ca42d0be773ebb5e71f47809720920"} Oct 07 13:51:06 crc kubenswrapper[4854]: I1007 13:51:06.500051 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6dd484fc77-4qglj" Oct 07 13:51:06 crc kubenswrapper[4854]: I1007 13:51:06.501319 4854 scope.go:117] "RemoveContainer" containerID="549317c676ec2762c187b3e3db339baf213e4e525320d62c088c10e6bed6c35a" Oct 07 13:51:06 crc kubenswrapper[4854]: I1007 13:51:06.546667 4854 scope.go:117] "RemoveContainer" containerID="094191c9cb2c0c9d00db765f38a388b2be6c692866d0e7518f6c8f0e9261115a" Oct 07 13:51:06 crc kubenswrapper[4854]: I1007 13:51:06.573437 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6dd484fc77-4qglj"] Oct 07 13:51:06 crc kubenswrapper[4854]: I1007 13:51:06.581185 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6dd484fc77-4qglj"] Oct 07 13:51:06 crc kubenswrapper[4854]: I1007 13:51:06.717552 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d31c15dd-5829-4953-9784-0297b0ddf02a" path="/var/lib/kubelet/pods/d31c15dd-5829-4953-9784-0297b0ddf02a/volumes" Oct 07 13:51:06 crc kubenswrapper[4854]: I1007 13:51:06.902085 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-257hr" Oct 07 13:51:06 crc kubenswrapper[4854]: I1007 13:51:06.911238 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ad13166b-d165-4483-9e36-967308c3f93b-fernet-keys\") pod \"ad13166b-d165-4483-9e36-967308c3f93b\" (UID: \"ad13166b-d165-4483-9e36-967308c3f93b\") " Oct 07 13:51:06 crc kubenswrapper[4854]: I1007 13:51:06.911302 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad13166b-d165-4483-9e36-967308c3f93b-scripts\") pod \"ad13166b-d165-4483-9e36-967308c3f93b\" (UID: \"ad13166b-d165-4483-9e36-967308c3f93b\") " Oct 07 13:51:06 crc kubenswrapper[4854]: I1007 13:51:06.911337 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad13166b-d165-4483-9e36-967308c3f93b-config-data\") pod \"ad13166b-d165-4483-9e36-967308c3f93b\" (UID: \"ad13166b-d165-4483-9e36-967308c3f93b\") " Oct 07 13:51:06 crc kubenswrapper[4854]: I1007 13:51:06.911357 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8ndz\" (UniqueName: \"kubernetes.io/projected/ad13166b-d165-4483-9e36-967308c3f93b-kube-api-access-b8ndz\") pod \"ad13166b-d165-4483-9e36-967308c3f93b\" (UID: \"ad13166b-d165-4483-9e36-967308c3f93b\") " Oct 07 13:51:06 crc kubenswrapper[4854]: I1007 13:51:06.911406 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ad13166b-d165-4483-9e36-967308c3f93b-credential-keys\") pod \"ad13166b-d165-4483-9e36-967308c3f93b\" (UID: \"ad13166b-d165-4483-9e36-967308c3f93b\") " Oct 07 13:51:06 crc kubenswrapper[4854]: I1007 13:51:06.911479 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad13166b-d165-4483-9e36-967308c3f93b-combined-ca-bundle\") pod \"ad13166b-d165-4483-9e36-967308c3f93b\" (UID: \"ad13166b-d165-4483-9e36-967308c3f93b\") " Oct 07 13:51:06 crc kubenswrapper[4854]: I1007 13:51:06.915525 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad13166b-d165-4483-9e36-967308c3f93b-scripts" (OuterVolumeSpecName: "scripts") pod "ad13166b-d165-4483-9e36-967308c3f93b" (UID: "ad13166b-d165-4483-9e36-967308c3f93b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:51:06 crc kubenswrapper[4854]: I1007 13:51:06.915702 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad13166b-d165-4483-9e36-967308c3f93b-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "ad13166b-d165-4483-9e36-967308c3f93b" (UID: "ad13166b-d165-4483-9e36-967308c3f93b"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:51:06 crc kubenswrapper[4854]: I1007 13:51:06.923056 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad13166b-d165-4483-9e36-967308c3f93b-kube-api-access-b8ndz" (OuterVolumeSpecName: "kube-api-access-b8ndz") pod "ad13166b-d165-4483-9e36-967308c3f93b" (UID: "ad13166b-d165-4483-9e36-967308c3f93b"). InnerVolumeSpecName "kube-api-access-b8ndz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:51:06 crc kubenswrapper[4854]: I1007 13:51:06.926755 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad13166b-d165-4483-9e36-967308c3f93b-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "ad13166b-d165-4483-9e36-967308c3f93b" (UID: "ad13166b-d165-4483-9e36-967308c3f93b"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:51:06 crc kubenswrapper[4854]: I1007 13:51:06.947291 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad13166b-d165-4483-9e36-967308c3f93b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ad13166b-d165-4483-9e36-967308c3f93b" (UID: "ad13166b-d165-4483-9e36-967308c3f93b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:51:06 crc kubenswrapper[4854]: I1007 13:51:06.954605 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad13166b-d165-4483-9e36-967308c3f93b-config-data" (OuterVolumeSpecName: "config-data") pod "ad13166b-d165-4483-9e36-967308c3f93b" (UID: "ad13166b-d165-4483-9e36-967308c3f93b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:51:07 crc kubenswrapper[4854]: I1007 13:51:07.012852 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad13166b-d165-4483-9e36-967308c3f93b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:51:07 crc kubenswrapper[4854]: I1007 13:51:07.012936 4854 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ad13166b-d165-4483-9e36-967308c3f93b-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 07 13:51:07 crc kubenswrapper[4854]: I1007 13:51:07.012956 4854 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad13166b-d165-4483-9e36-967308c3f93b-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 13:51:07 crc kubenswrapper[4854]: I1007 13:51:07.012974 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad13166b-d165-4483-9e36-967308c3f93b-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:51:07 crc kubenswrapper[4854]: I1007 13:51:07.012988 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8ndz\" (UniqueName: \"kubernetes.io/projected/ad13166b-d165-4483-9e36-967308c3f93b-kube-api-access-b8ndz\") on node \"crc\" DevicePath \"\"" Oct 07 13:51:07 crc kubenswrapper[4854]: I1007 13:51:07.013006 4854 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ad13166b-d165-4483-9e36-967308c3f93b-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 07 13:51:07 crc kubenswrapper[4854]: I1007 13:51:07.512055 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-257hr" event={"ID":"ad13166b-d165-4483-9e36-967308c3f93b","Type":"ContainerDied","Data":"f79eb3aaacccddabf17b06e544916d3ccd8b34e4df1440889f7f457fcf617f7e"} Oct 07 13:51:07 crc kubenswrapper[4854]: I1007 13:51:07.512099 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f79eb3aaacccddabf17b06e544916d3ccd8b34e4df1440889f7f457fcf617f7e" Oct 07 13:51:07 crc kubenswrapper[4854]: I1007 13:51:07.512116 4854 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-257hr" Oct 07 13:51:08 crc kubenswrapper[4854]: I1007 13:51:08.104478 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-845977846b-69rp6"] Oct 07 13:51:08 crc kubenswrapper[4854]: E1007 13:51:08.105547 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad13166b-d165-4483-9e36-967308c3f93b" containerName="keystone-bootstrap" Oct 07 13:51:08 crc kubenswrapper[4854]: I1007 13:51:08.105576 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad13166b-d165-4483-9e36-967308c3f93b" containerName="keystone-bootstrap" Oct 07 13:51:08 crc kubenswrapper[4854]: E1007 13:51:08.105615 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d31c15dd-5829-4953-9784-0297b0ddf02a" containerName="dnsmasq-dns" Oct 07 13:51:08 crc kubenswrapper[4854]: I1007 13:51:08.105630 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="d31c15dd-5829-4953-9784-0297b0ddf02a" containerName="dnsmasq-dns" Oct 07 13:51:08 crc kubenswrapper[4854]: E1007 13:51:08.105761 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d31c15dd-5829-4953-9784-0297b0ddf02a" containerName="init" Oct 07 13:51:08 crc kubenswrapper[4854]: I1007 13:51:08.106662 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="d31c15dd-5829-4953-9784-0297b0ddf02a" containerName="init" Oct 07 13:51:08 crc kubenswrapper[4854]: I1007 13:51:08.106999 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad13166b-d165-4483-9e36-967308c3f93b" containerName="keystone-bootstrap" Oct 07 13:51:08 crc kubenswrapper[4854]: I1007 13:51:08.107042 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="d31c15dd-5829-4953-9784-0297b0ddf02a" containerName="dnsmasq-dns" Oct 07 13:51:08 crc kubenswrapper[4854]: I1007 13:51:08.107932 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-845977846b-69rp6" Oct 07 13:51:08 crc kubenswrapper[4854]: I1007 13:51:08.112566 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 07 13:51:08 crc kubenswrapper[4854]: I1007 13:51:08.112666 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-wb46k" Oct 07 13:51:08 crc kubenswrapper[4854]: I1007 13:51:08.112855 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 07 13:51:08 crc kubenswrapper[4854]: I1007 13:51:08.114183 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 07 13:51:08 crc kubenswrapper[4854]: I1007 13:51:08.118584 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-845977846b-69rp6"] Oct 07 13:51:08 crc kubenswrapper[4854]: I1007 13:51:08.134944 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rhrl\" (UniqueName: \"kubernetes.io/projected/ae5b44ef-2398-488e-bcef-c88d09cea90a-kube-api-access-8rhrl\") pod \"keystone-845977846b-69rp6\" (UID: \"ae5b44ef-2398-488e-bcef-c88d09cea90a\") " pod="openstack/keystone-845977846b-69rp6" Oct 07 13:51:08 crc kubenswrapper[4854]: I1007 13:51:08.135052 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae5b44ef-2398-488e-bcef-c88d09cea90a-combined-ca-bundle\") pod \"keystone-845977846b-69rp6\" (UID: \"ae5b44ef-2398-488e-bcef-c88d09cea90a\") " pod="openstack/keystone-845977846b-69rp6" Oct 07 13:51:08 crc kubenswrapper[4854]: I1007 13:51:08.135089 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ae5b44ef-2398-488e-bcef-c88d09cea90a-credential-keys\") pod \"keystone-845977846b-69rp6\" (UID: \"ae5b44ef-2398-488e-bcef-c88d09cea90a\") " pod="openstack/keystone-845977846b-69rp6" Oct 07 13:51:08 crc kubenswrapper[4854]: I1007 13:51:08.136859 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae5b44ef-2398-488e-bcef-c88d09cea90a-scripts\") pod \"keystone-845977846b-69rp6\" (UID: \"ae5b44ef-2398-488e-bcef-c88d09cea90a\") " pod="openstack/keystone-845977846b-69rp6" Oct 07 13:51:08 crc kubenswrapper[4854]: I1007 13:51:08.137027 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae5b44ef-2398-488e-bcef-c88d09cea90a-config-data\") pod \"keystone-845977846b-69rp6\" (UID: \"ae5b44ef-2398-488e-bcef-c88d09cea90a\") " pod="openstack/keystone-845977846b-69rp6" Oct 07 13:51:08 crc kubenswrapper[4854]: I1007 13:51:08.137066 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ae5b44ef-2398-488e-bcef-c88d09cea90a-fernet-keys\") pod \"keystone-845977846b-69rp6\" (UID: \"ae5b44ef-2398-488e-bcef-c88d09cea90a\") " pod="openstack/keystone-845977846b-69rp6" Oct 07 13:51:08 crc kubenswrapper[4854]: I1007 13:51:08.238057 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae5b44ef-2398-488e-bcef-c88d09cea90a-config-data\") pod 
\"keystone-845977846b-69rp6\" (UID: \"ae5b44ef-2398-488e-bcef-c88d09cea90a\") " pod="openstack/keystone-845977846b-69rp6" Oct 07 13:51:08 crc kubenswrapper[4854]: I1007 13:51:08.238113 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ae5b44ef-2398-488e-bcef-c88d09cea90a-fernet-keys\") pod \"keystone-845977846b-69rp6\" (UID: \"ae5b44ef-2398-488e-bcef-c88d09cea90a\") " pod="openstack/keystone-845977846b-69rp6" Oct 07 13:51:08 crc kubenswrapper[4854]: I1007 13:51:08.238161 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rhrl\" (UniqueName: \"kubernetes.io/projected/ae5b44ef-2398-488e-bcef-c88d09cea90a-kube-api-access-8rhrl\") pod \"keystone-845977846b-69rp6\" (UID: \"ae5b44ef-2398-488e-bcef-c88d09cea90a\") " pod="openstack/keystone-845977846b-69rp6" Oct 07 13:51:08 crc kubenswrapper[4854]: I1007 13:51:08.238247 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae5b44ef-2398-488e-bcef-c88d09cea90a-combined-ca-bundle\") pod \"keystone-845977846b-69rp6\" (UID: \"ae5b44ef-2398-488e-bcef-c88d09cea90a\") " pod="openstack/keystone-845977846b-69rp6" Oct 07 13:51:08 crc kubenswrapper[4854]: I1007 13:51:08.238575 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ae5b44ef-2398-488e-bcef-c88d09cea90a-credential-keys\") pod \"keystone-845977846b-69rp6\" (UID: \"ae5b44ef-2398-488e-bcef-c88d09cea90a\") " pod="openstack/keystone-845977846b-69rp6" Oct 07 13:51:08 crc kubenswrapper[4854]: I1007 13:51:08.238667 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae5b44ef-2398-488e-bcef-c88d09cea90a-scripts\") pod \"keystone-845977846b-69rp6\" (UID: \"ae5b44ef-2398-488e-bcef-c88d09cea90a\") " pod="openstack/keystone-845977846b-69rp6" Oct 07 13:51:08 crc kubenswrapper[4854]: I1007 13:51:08.245326 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae5b44ef-2398-488e-bcef-c88d09cea90a-config-data\") pod \"keystone-845977846b-69rp6\" (UID: \"ae5b44ef-2398-488e-bcef-c88d09cea90a\") " pod="openstack/keystone-845977846b-69rp6" Oct 07 13:51:08 crc kubenswrapper[4854]: I1007 13:51:08.246957 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ae5b44ef-2398-488e-bcef-c88d09cea90a-credential-keys\") pod \"keystone-845977846b-69rp6\" (UID: \"ae5b44ef-2398-488e-bcef-c88d09cea90a\") " pod="openstack/keystone-845977846b-69rp6" Oct 07 13:51:08 crc kubenswrapper[4854]: I1007 13:51:08.247574 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae5b44ef-2398-488e-bcef-c88d09cea90a-combined-ca-bundle\") pod \"keystone-845977846b-69rp6\" (UID: \"ae5b44ef-2398-488e-bcef-c88d09cea90a\") " pod="openstack/keystone-845977846b-69rp6" Oct 07 13:51:08 crc kubenswrapper[4854]: I1007 13:51:08.250565 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ae5b44ef-2398-488e-bcef-c88d09cea90a-fernet-keys\") pod \"keystone-845977846b-69rp6\" (UID: \"ae5b44ef-2398-488e-bcef-c88d09cea90a\") " pod="openstack/keystone-845977846b-69rp6" Oct 07 13:51:08 crc 
kubenswrapper[4854]: I1007 13:51:08.257650 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae5b44ef-2398-488e-bcef-c88d09cea90a-scripts\") pod \"keystone-845977846b-69rp6\" (UID: \"ae5b44ef-2398-488e-bcef-c88d09cea90a\") " pod="openstack/keystone-845977846b-69rp6" Oct 07 13:51:08 crc kubenswrapper[4854]: I1007 13:51:08.262237 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rhrl\" (UniqueName: \"kubernetes.io/projected/ae5b44ef-2398-488e-bcef-c88d09cea90a-kube-api-access-8rhrl\") pod \"keystone-845977846b-69rp6\" (UID: \"ae5b44ef-2398-488e-bcef-c88d09cea90a\") " pod="openstack/keystone-845977846b-69rp6" Oct 07 13:51:08 crc kubenswrapper[4854]: I1007 13:51:08.452181 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-845977846b-69rp6" Oct 07 13:51:09 crc kubenswrapper[4854]: I1007 13:51:09.024983 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-845977846b-69rp6"] Oct 07 13:51:09 crc kubenswrapper[4854]: I1007 13:51:09.554170 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-845977846b-69rp6" event={"ID":"ae5b44ef-2398-488e-bcef-c88d09cea90a","Type":"ContainerStarted","Data":"444a4a2f18c07679f256bf51adb961e591b69663bfd3839392abf0082cbe96c4"} Oct 07 13:51:09 crc kubenswrapper[4854]: I1007 13:51:09.554485 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-845977846b-69rp6" Oct 07 13:51:09 crc kubenswrapper[4854]: I1007 13:51:09.554500 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-845977846b-69rp6" event={"ID":"ae5b44ef-2398-488e-bcef-c88d09cea90a","Type":"ContainerStarted","Data":"d6b90d2f3063757ba6499d6264d740069e7adf8bc94417ff83a95311ff07aacc"} Oct 07 13:51:09 crc kubenswrapper[4854]: I1007 13:51:09.586018 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-845977846b-69rp6" podStartSLOduration=1.585986089 podStartE2EDuration="1.585986089s" podCreationTimestamp="2025-10-07 13:51:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:51:09.580550564 +0000 UTC m=+5185.568382909" watchObservedRunningTime="2025-10-07 13:51:09.585986089 +0000 UTC m=+5185.573818374" Oct 07 13:51:21 crc kubenswrapper[4854]: I1007 13:51:21.316631 4854 scope.go:117] "RemoveContainer" containerID="83a414b7ec949942f79be6d1dbf0616e11b4720a9ebf75f080cb6c96a5cc8d52" Oct 07 13:51:21 crc kubenswrapper[4854]: I1007 13:51:21.350024 4854 scope.go:117] "RemoveContainer" containerID="f71286733b12d0ccc72887cfbca56225b404a500c1af4dc2a966b93beefed5ca" Oct 07 13:51:21 crc kubenswrapper[4854]: I1007 13:51:21.418927 4854 scope.go:117] "RemoveContainer" containerID="c47c93bffe35beb85bb46619b63814b0763e0aadad854e5621f7607081674ac8" Oct 07 13:51:21 crc kubenswrapper[4854]: I1007 13:51:21.446402 4854 scope.go:117] "RemoveContainer" containerID="180d1c3cca5b5fb4919d220de5dbc9ba6836d2fd415b58d06ca62ef6f5bd79c7" Oct 07 13:51:21 crc kubenswrapper[4854]: I1007 13:51:21.478415 4854 scope.go:117] "RemoveContainer" containerID="c9c277661c4c241994c2a178c0bcd561c74a72a971d800d23051227389780cf4" Oct 07 13:51:21 crc kubenswrapper[4854]: I1007 13:51:21.530467 4854 scope.go:117] "RemoveContainer" containerID="c97da054a70551a5a5af8d805d868fa7a027f32be0bad5b5e0e603957efd1385" Oct 07 13:51:21 crc kubenswrapper[4854]: I1007 
13:51:21.580555 4854 scope.go:117] "RemoveContainer" containerID="7a10764ed191d5690ef344096502c56ee095f17da21ceeca6c4c9aa959627a2d" Oct 07 13:51:21 crc kubenswrapper[4854]: I1007 13:51:21.610695 4854 scope.go:117] "RemoveContainer" containerID="9a8a7a345e3ad86e7194a48c26f089d9964d099de6188ff54c7c31044ad55383" Oct 07 13:51:21 crc kubenswrapper[4854]: I1007 13:51:21.635378 4854 scope.go:117] "RemoveContainer" containerID="c2612888d742e1ebcf334051c040771ef82f2849b4ab87da7ed074b8bb411678" Oct 07 13:51:39 crc kubenswrapper[4854]: I1007 13:51:39.872903 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-845977846b-69rp6" Oct 07 13:51:40 crc kubenswrapper[4854]: I1007 13:51:40.808232 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:51:40 crc kubenswrapper[4854]: I1007 13:51:40.808492 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:51:43 crc kubenswrapper[4854]: I1007 13:51:43.493548 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 07 13:51:43 crc kubenswrapper[4854]: I1007 13:51:43.496775 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 07 13:51:43 crc kubenswrapper[4854]: I1007 13:51:43.501353 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Oct 07 13:51:43 crc kubenswrapper[4854]: I1007 13:51:43.501408 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Oct 07 13:51:43 crc kubenswrapper[4854]: I1007 13:51:43.501496 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-v725t" Oct 07 13:51:43 crc kubenswrapper[4854]: I1007 13:51:43.506458 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 07 13:51:43 crc kubenswrapper[4854]: I1007 13:51:43.546990 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvk2s\" (UniqueName: \"kubernetes.io/projected/a75d91b5-4a5c-492f-9c59-a4338368ed9d-kube-api-access-fvk2s\") pod \"openstackclient\" (UID: \"a75d91b5-4a5c-492f-9c59-a4338368ed9d\") " pod="openstack/openstackclient" Oct 07 13:51:43 crc kubenswrapper[4854]: I1007 13:51:43.547122 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a75d91b5-4a5c-492f-9c59-a4338368ed9d-openstack-config-secret\") pod \"openstackclient\" (UID: \"a75d91b5-4a5c-492f-9c59-a4338368ed9d\") " pod="openstack/openstackclient" Oct 07 13:51:43 crc kubenswrapper[4854]: I1007 13:51:43.547207 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a75d91b5-4a5c-492f-9c59-a4338368ed9d-openstack-config\") pod \"openstackclient\" (UID: \"a75d91b5-4a5c-492f-9c59-a4338368ed9d\") 
" pod="openstack/openstackclient" Oct 07 13:51:43 crc kubenswrapper[4854]: I1007 13:51:43.554336 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Oct 07 13:51:43 crc kubenswrapper[4854]: E1007 13:51:43.555325 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-fvk2s openstack-config openstack-config-secret], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/openstackclient" podUID="a75d91b5-4a5c-492f-9c59-a4338368ed9d" Oct 07 13:51:43 crc kubenswrapper[4854]: I1007 13:51:43.561850 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Oct 07 13:51:43 crc kubenswrapper[4854]: I1007 13:51:43.573278 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 07 13:51:43 crc kubenswrapper[4854]: I1007 13:51:43.574522 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 07 13:51:43 crc kubenswrapper[4854]: I1007 13:51:43.598933 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 07 13:51:43 crc kubenswrapper[4854]: I1007 13:51:43.648626 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/66c67541-4e54-48ab-96dd-7d904d0ff7d2-openstack-config-secret\") pod \"openstackclient\" (UID: \"66c67541-4e54-48ab-96dd-7d904d0ff7d2\") " pod="openstack/openstackclient" Oct 07 13:51:43 crc kubenswrapper[4854]: I1007 13:51:43.648698 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a75d91b5-4a5c-492f-9c59-a4338368ed9d-openstack-config-secret\") pod \"openstackclient\" (UID: \"a75d91b5-4a5c-492f-9c59-a4338368ed9d\") " pod="openstack/openstackclient" Oct 07 13:51:43 crc kubenswrapper[4854]: I1007 13:51:43.648752 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a75d91b5-4a5c-492f-9c59-a4338368ed9d-openstack-config\") pod \"openstackclient\" (UID: \"a75d91b5-4a5c-492f-9c59-a4338368ed9d\") " pod="openstack/openstackclient" Oct 07 13:51:43 crc kubenswrapper[4854]: I1007 13:51:43.648863 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvk2s\" (UniqueName: \"kubernetes.io/projected/a75d91b5-4a5c-492f-9c59-a4338368ed9d-kube-api-access-fvk2s\") pod \"openstackclient\" (UID: \"a75d91b5-4a5c-492f-9c59-a4338368ed9d\") " pod="openstack/openstackclient" Oct 07 13:51:43 crc kubenswrapper[4854]: I1007 13:51:43.648923 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/66c67541-4e54-48ab-96dd-7d904d0ff7d2-openstack-config\") pod \"openstackclient\" (UID: \"66c67541-4e54-48ab-96dd-7d904d0ff7d2\") " pod="openstack/openstackclient" Oct 07 13:51:43 crc kubenswrapper[4854]: I1007 13:51:43.648948 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgfkf\" (UniqueName: \"kubernetes.io/projected/66c67541-4e54-48ab-96dd-7d904d0ff7d2-kube-api-access-kgfkf\") pod \"openstackclient\" (UID: \"66c67541-4e54-48ab-96dd-7d904d0ff7d2\") " pod="openstack/openstackclient" Oct 07 13:51:43 crc kubenswrapper[4854]: I1007 13:51:43.650315 4854 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a75d91b5-4a5c-492f-9c59-a4338368ed9d-openstack-config\") pod \"openstackclient\" (UID: \"a75d91b5-4a5c-492f-9c59-a4338368ed9d\") " pod="openstack/openstackclient" Oct 07 13:51:43 crc kubenswrapper[4854]: E1007 13:51:43.653788 4854 projected.go:194] Error preparing data for projected volume kube-api-access-fvk2s for pod openstack/openstackclient: failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (a75d91b5-4a5c-492f-9c59-a4338368ed9d) does not match the UID in record. The object might have been deleted and then recreated Oct 07 13:51:43 crc kubenswrapper[4854]: E1007 13:51:43.653903 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a75d91b5-4a5c-492f-9c59-a4338368ed9d-kube-api-access-fvk2s podName:a75d91b5-4a5c-492f-9c59-a4338368ed9d nodeName:}" failed. No retries permitted until 2025-10-07 13:51:44.153873844 +0000 UTC m=+5220.141706129 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-fvk2s" (UniqueName: "kubernetes.io/projected/a75d91b5-4a5c-492f-9c59-a4338368ed9d-kube-api-access-fvk2s") pod "openstackclient" (UID: "a75d91b5-4a5c-492f-9c59-a4338368ed9d") : failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (a75d91b5-4a5c-492f-9c59-a4338368ed9d) does not match the UID in record. The object might have been deleted and then recreated Oct 07 13:51:43 crc kubenswrapper[4854]: I1007 13:51:43.657940 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a75d91b5-4a5c-492f-9c59-a4338368ed9d-openstack-config-secret\") pod \"openstackclient\" (UID: \"a75d91b5-4a5c-492f-9c59-a4338368ed9d\") " pod="openstack/openstackclient" Oct 07 13:51:43 crc kubenswrapper[4854]: I1007 13:51:43.749881 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgfkf\" (UniqueName: \"kubernetes.io/projected/66c67541-4e54-48ab-96dd-7d904d0ff7d2-kube-api-access-kgfkf\") pod \"openstackclient\" (UID: \"66c67541-4e54-48ab-96dd-7d904d0ff7d2\") " pod="openstack/openstackclient" Oct 07 13:51:43 crc kubenswrapper[4854]: I1007 13:51:43.749953 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/66c67541-4e54-48ab-96dd-7d904d0ff7d2-openstack-config-secret\") pod \"openstackclient\" (UID: \"66c67541-4e54-48ab-96dd-7d904d0ff7d2\") " pod="openstack/openstackclient" Oct 07 13:51:43 crc kubenswrapper[4854]: I1007 13:51:43.750299 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/66c67541-4e54-48ab-96dd-7d904d0ff7d2-openstack-config\") pod \"openstackclient\" (UID: \"66c67541-4e54-48ab-96dd-7d904d0ff7d2\") " pod="openstack/openstackclient" Oct 07 13:51:43 crc kubenswrapper[4854]: I1007 13:51:43.751185 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/66c67541-4e54-48ab-96dd-7d904d0ff7d2-openstack-config\") pod \"openstackclient\" (UID: \"66c67541-4e54-48ab-96dd-7d904d0ff7d2\") " pod="openstack/openstackclient" Oct 07 13:51:43 crc kubenswrapper[4854]: I1007 13:51:43.759014 4854 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/66c67541-4e54-48ab-96dd-7d904d0ff7d2-openstack-config-secret\") pod \"openstackclient\" (UID: \"66c67541-4e54-48ab-96dd-7d904d0ff7d2\") " pod="openstack/openstackclient" Oct 07 13:51:43 crc kubenswrapper[4854]: I1007 13:51:43.778086 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgfkf\" (UniqueName: \"kubernetes.io/projected/66c67541-4e54-48ab-96dd-7d904d0ff7d2-kube-api-access-kgfkf\") pod \"openstackclient\" (UID: \"66c67541-4e54-48ab-96dd-7d904d0ff7d2\") " pod="openstack/openstackclient" Oct 07 13:51:43 crc kubenswrapper[4854]: I1007 13:51:43.899199 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 07 13:51:43 crc kubenswrapper[4854]: I1007 13:51:43.985792 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 07 13:51:43 crc kubenswrapper[4854]: I1007 13:51:43.989579 4854 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="a75d91b5-4a5c-492f-9c59-a4338368ed9d" podUID="66c67541-4e54-48ab-96dd-7d904d0ff7d2" Oct 07 13:51:44 crc kubenswrapper[4854]: I1007 13:51:44.002994 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 07 13:51:44 crc kubenswrapper[4854]: I1007 13:51:44.054759 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a75d91b5-4a5c-492f-9c59-a4338368ed9d-openstack-config-secret\") pod \"a75d91b5-4a5c-492f-9c59-a4338368ed9d\" (UID: \"a75d91b5-4a5c-492f-9c59-a4338368ed9d\") " Oct 07 13:51:44 crc kubenswrapper[4854]: I1007 13:51:44.054872 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a75d91b5-4a5c-492f-9c59-a4338368ed9d-openstack-config\") pod \"a75d91b5-4a5c-492f-9c59-a4338368ed9d\" (UID: \"a75d91b5-4a5c-492f-9c59-a4338368ed9d\") " Oct 07 13:51:44 crc kubenswrapper[4854]: I1007 13:51:44.055298 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvk2s\" (UniqueName: \"kubernetes.io/projected/a75d91b5-4a5c-492f-9c59-a4338368ed9d-kube-api-access-fvk2s\") on node \"crc\" DevicePath \"\"" Oct 07 13:51:44 crc kubenswrapper[4854]: I1007 13:51:44.055407 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a75d91b5-4a5c-492f-9c59-a4338368ed9d-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "a75d91b5-4a5c-492f-9c59-a4338368ed9d" (UID: "a75d91b5-4a5c-492f-9c59-a4338368ed9d"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:51:44 crc kubenswrapper[4854]: I1007 13:51:44.060232 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a75d91b5-4a5c-492f-9c59-a4338368ed9d-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "a75d91b5-4a5c-492f-9c59-a4338368ed9d" (UID: "a75d91b5-4a5c-492f-9c59-a4338368ed9d"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:51:44 crc kubenswrapper[4854]: I1007 13:51:44.156674 4854 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a75d91b5-4a5c-492f-9c59-a4338368ed9d-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 07 13:51:44 crc kubenswrapper[4854]: I1007 13:51:44.156714 4854 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a75d91b5-4a5c-492f-9c59-a4338368ed9d-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:51:44 crc kubenswrapper[4854]: I1007 13:51:44.362532 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 07 13:51:44 crc kubenswrapper[4854]: I1007 13:51:44.712210 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a75d91b5-4a5c-492f-9c59-a4338368ed9d" path="/var/lib/kubelet/pods/a75d91b5-4a5c-492f-9c59-a4338368ed9d/volumes" Oct 07 13:51:44 crc kubenswrapper[4854]: I1007 13:51:44.994647 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 07 13:51:44 crc kubenswrapper[4854]: I1007 13:51:44.995888 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"66c67541-4e54-48ab-96dd-7d904d0ff7d2","Type":"ContainerStarted","Data":"7a04548110984e439eb546bdcce8e1598dbda5abb0f426cacc29230e34833f37"} Oct 07 13:51:44 crc kubenswrapper[4854]: I1007 13:51:44.995928 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"66c67541-4e54-48ab-96dd-7d904d0ff7d2","Type":"ContainerStarted","Data":"afd32c99e2ba57fd9e93eea8897241e04f4cf5bf485a248bddbbe53d20119f74"} Oct 07 13:51:45 crc kubenswrapper[4854]: I1007 13:51:45.026524 4854 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="a75d91b5-4a5c-492f-9c59-a4338368ed9d" podUID="66c67541-4e54-48ab-96dd-7d904d0ff7d2" Oct 07 13:51:45 crc kubenswrapper[4854]: I1007 13:51:45.027276 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.027249645 podStartE2EDuration="2.027249645s" podCreationTimestamp="2025-10-07 13:51:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:51:45.022987113 +0000 UTC m=+5221.010819368" watchObservedRunningTime="2025-10-07 13:51:45.027249645 +0000 UTC m=+5221.015081940" Oct 07 13:51:47 crc kubenswrapper[4854]: E1007 13:51:47.337401 4854 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda75d91b5_4a5c_492f_9c59_a4338368ed9d.slice\": RecentStats: unable to find data in memory cache]" Oct 07 13:51:57 crc kubenswrapper[4854]: E1007 13:51:57.616448 4854 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda75d91b5_4a5c_492f_9c59_a4338368ed9d.slice\": RecentStats: unable to find data in memory cache]" Oct 07 13:52:07 crc kubenswrapper[4854]: E1007 13:52:07.858117 4854 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda75d91b5_4a5c_492f_9c59_a4338368ed9d.slice\": RecentStats: unable to find data in memory cache]" Oct 07 13:52:10 crc kubenswrapper[4854]: I1007 13:52:10.808187 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:52:10 crc kubenswrapper[4854]: I1007 13:52:10.808724 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:52:18 crc kubenswrapper[4854]: E1007 13:52:18.058087 4854 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda75d91b5_4a5c_492f_9c59_a4338368ed9d.slice\": RecentStats: unable to find data in memory cache]" Oct 07 13:52:28 crc kubenswrapper[4854]: E1007 13:52:28.283674 4854 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda75d91b5_4a5c_492f_9c59_a4338368ed9d.slice\": RecentStats: unable to find data in memory cache]" Oct 07 13:52:38 crc kubenswrapper[4854]: E1007 13:52:38.525769 4854 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda75d91b5_4a5c_492f_9c59_a4338368ed9d.slice\": RecentStats: unable to find data in memory cache]" Oct 07 13:52:39 crc kubenswrapper[4854]: I1007 13:52:39.006524 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-p2lsl"] Oct 07 13:52:39 crc kubenswrapper[4854]: I1007 13:52:39.009513 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p2lsl" Oct 07 13:52:39 crc kubenswrapper[4854]: I1007 13:52:39.029552 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p2lsl"] Oct 07 13:52:39 crc kubenswrapper[4854]: I1007 13:52:39.111591 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc9d3897-13e7-47db-bf2c-25613506e0cf-catalog-content\") pod \"community-operators-p2lsl\" (UID: \"bc9d3897-13e7-47db-bf2c-25613506e0cf\") " pod="openshift-marketplace/community-operators-p2lsl" Oct 07 13:52:39 crc kubenswrapper[4854]: I1007 13:52:39.111687 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66bnl\" (UniqueName: \"kubernetes.io/projected/bc9d3897-13e7-47db-bf2c-25613506e0cf-kube-api-access-66bnl\") pod \"community-operators-p2lsl\" (UID: \"bc9d3897-13e7-47db-bf2c-25613506e0cf\") " pod="openshift-marketplace/community-operators-p2lsl" Oct 07 13:52:39 crc kubenswrapper[4854]: I1007 13:52:39.111735 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc9d3897-13e7-47db-bf2c-25613506e0cf-utilities\") pod \"community-operators-p2lsl\" (UID: \"bc9d3897-13e7-47db-bf2c-25613506e0cf\") " pod="openshift-marketplace/community-operators-p2lsl" Oct 07 13:52:39 crc kubenswrapper[4854]: I1007 13:52:39.213838 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc9d3897-13e7-47db-bf2c-25613506e0cf-utilities\") pod \"community-operators-p2lsl\" (UID: \"bc9d3897-13e7-47db-bf2c-25613506e0cf\") " pod="openshift-marketplace/community-operators-p2lsl" Oct 07 13:52:39 crc kubenswrapper[4854]: I1007 13:52:39.214125 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc9d3897-13e7-47db-bf2c-25613506e0cf-catalog-content\") pod \"community-operators-p2lsl\" (UID: \"bc9d3897-13e7-47db-bf2c-25613506e0cf\") " pod="openshift-marketplace/community-operators-p2lsl" Oct 07 13:52:39 crc kubenswrapper[4854]: I1007 13:52:39.214232 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66bnl\" (UniqueName: \"kubernetes.io/projected/bc9d3897-13e7-47db-bf2c-25613506e0cf-kube-api-access-66bnl\") pod \"community-operators-p2lsl\" (UID: \"bc9d3897-13e7-47db-bf2c-25613506e0cf\") " pod="openshift-marketplace/community-operators-p2lsl" Oct 07 13:52:39 crc kubenswrapper[4854]: I1007 13:52:39.214441 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc9d3897-13e7-47db-bf2c-25613506e0cf-utilities\") pod \"community-operators-p2lsl\" (UID: \"bc9d3897-13e7-47db-bf2c-25613506e0cf\") " pod="openshift-marketplace/community-operators-p2lsl" Oct 07 13:52:39 crc kubenswrapper[4854]: I1007 13:52:39.214708 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc9d3897-13e7-47db-bf2c-25613506e0cf-catalog-content\") pod \"community-operators-p2lsl\" (UID: \"bc9d3897-13e7-47db-bf2c-25613506e0cf\") " pod="openshift-marketplace/community-operators-p2lsl" Oct 07 13:52:39 crc kubenswrapper[4854]: I1007 13:52:39.259717 4854 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-66bnl\" (UniqueName: \"kubernetes.io/projected/bc9d3897-13e7-47db-bf2c-25613506e0cf-kube-api-access-66bnl\") pod \"community-operators-p2lsl\" (UID: \"bc9d3897-13e7-47db-bf2c-25613506e0cf\") " pod="openshift-marketplace/community-operators-p2lsl" Oct 07 13:52:39 crc kubenswrapper[4854]: I1007 13:52:39.369553 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p2lsl" Oct 07 13:52:39 crc kubenswrapper[4854]: I1007 13:52:39.903949 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p2lsl"] Oct 07 13:52:39 crc kubenswrapper[4854]: W1007 13:52:39.916588 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc9d3897_13e7_47db_bf2c_25613506e0cf.slice/crio-6a4389bd26e5280e41e197a9549606eb95faa63d8893b46bd6e3d43284c9f8a6 WatchSource:0}: Error finding container 6a4389bd26e5280e41e197a9549606eb95faa63d8893b46bd6e3d43284c9f8a6: Status 404 returned error can't find the container with id 6a4389bd26e5280e41e197a9549606eb95faa63d8893b46bd6e3d43284c9f8a6 Oct 07 13:52:40 crc kubenswrapper[4854]: I1007 13:52:40.575267 4854 generic.go:334] "Generic (PLEG): container finished" podID="bc9d3897-13e7-47db-bf2c-25613506e0cf" containerID="b37b32d6df4192feac703121b21f3db69f97b8e8b00f5530066c847c370a78c4" exitCode=0 Oct 07 13:52:40 crc kubenswrapper[4854]: I1007 13:52:40.575376 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p2lsl" event={"ID":"bc9d3897-13e7-47db-bf2c-25613506e0cf","Type":"ContainerDied","Data":"b37b32d6df4192feac703121b21f3db69f97b8e8b00f5530066c847c370a78c4"} Oct 07 13:52:40 crc kubenswrapper[4854]: I1007 13:52:40.575709 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p2lsl" event={"ID":"bc9d3897-13e7-47db-bf2c-25613506e0cf","Type":"ContainerStarted","Data":"6a4389bd26e5280e41e197a9549606eb95faa63d8893b46bd6e3d43284c9f8a6"} Oct 07 13:52:40 crc kubenswrapper[4854]: I1007 13:52:40.578294 4854 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 13:52:40 crc kubenswrapper[4854]: I1007 13:52:40.810458 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:52:40 crc kubenswrapper[4854]: I1007 13:52:40.810558 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:52:40 crc kubenswrapper[4854]: I1007 13:52:40.810657 4854 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" Oct 07 13:52:40 crc kubenswrapper[4854]: I1007 13:52:40.811950 4854 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8f3095d8e8fcb8dde419577c13f52762f3d4a6c040f7266414571999ecd4046e"} 
pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 13:52:40 crc kubenswrapper[4854]: I1007 13:52:40.812091 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" containerID="cri-o://8f3095d8e8fcb8dde419577c13f52762f3d4a6c040f7266414571999ecd4046e" gracePeriod=600 Oct 07 13:52:40 crc kubenswrapper[4854]: E1007 13:52:40.938682 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:52:41 crc kubenswrapper[4854]: I1007 13:52:41.592590 4854 generic.go:334] "Generic (PLEG): container finished" podID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerID="8f3095d8e8fcb8dde419577c13f52762f3d4a6c040f7266414571999ecd4046e" exitCode=0 Oct 07 13:52:41 crc kubenswrapper[4854]: I1007 13:52:41.592633 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" event={"ID":"40b8b82d-cfd5-41d7-8673-5774db092c85","Type":"ContainerDied","Data":"8f3095d8e8fcb8dde419577c13f52762f3d4a6c040f7266414571999ecd4046e"} Oct 07 13:52:41 crc kubenswrapper[4854]: I1007 13:52:41.592668 4854 scope.go:117] "RemoveContainer" containerID="2fcfadcb8d4fbb46c6073e004d4af8599758985c036e716d488d12dd6011cd13" Oct 07 13:52:41 crc kubenswrapper[4854]: I1007 13:52:41.593613 4854 scope.go:117] "RemoveContainer" containerID="8f3095d8e8fcb8dde419577c13f52762f3d4a6c040f7266414571999ecd4046e" Oct 07 13:52:41 crc kubenswrapper[4854]: E1007 13:52:41.594099 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:52:42 crc kubenswrapper[4854]: I1007 13:52:42.614410 4854 generic.go:334] "Generic (PLEG): container finished" podID="bc9d3897-13e7-47db-bf2c-25613506e0cf" containerID="3731fcb4854bfe4828df12a5971f61a22aa09a7697922c430e664a752a60cbc5" exitCode=0 Oct 07 13:52:42 crc kubenswrapper[4854]: I1007 13:52:42.614536 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p2lsl" event={"ID":"bc9d3897-13e7-47db-bf2c-25613506e0cf","Type":"ContainerDied","Data":"3731fcb4854bfe4828df12a5971f61a22aa09a7697922c430e664a752a60cbc5"} Oct 07 13:52:43 crc kubenswrapper[4854]: I1007 13:52:43.629934 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p2lsl" event={"ID":"bc9d3897-13e7-47db-bf2c-25613506e0cf","Type":"ContainerStarted","Data":"5908df7f134756b9e723cdb99f0495075f33fb5e91acdfa47550d229b151fdd5"} Oct 07 13:52:43 crc kubenswrapper[4854]: I1007 13:52:43.672896 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/community-operators-p2lsl" podStartSLOduration=3.077829536 podStartE2EDuration="5.672862162s" podCreationTimestamp="2025-10-07 13:52:38 +0000 UTC" firstStartedPulling="2025-10-07 13:52:40.577834837 +0000 UTC m=+5276.565667132" lastFinishedPulling="2025-10-07 13:52:43.172867503 +0000 UTC m=+5279.160699758" observedRunningTime="2025-10-07 13:52:43.656194426 +0000 UTC m=+5279.644026691" watchObservedRunningTime="2025-10-07 13:52:43.672862162 +0000 UTC m=+5279.660694457" Oct 07 13:52:49 crc kubenswrapper[4854]: I1007 13:52:49.370632 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-p2lsl" Oct 07 13:52:49 crc kubenswrapper[4854]: I1007 13:52:49.373824 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-p2lsl" Oct 07 13:52:49 crc kubenswrapper[4854]: I1007 13:52:49.446313 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-p2lsl" Oct 07 13:52:49 crc kubenswrapper[4854]: I1007 13:52:49.763245 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-p2lsl" Oct 07 13:52:49 crc kubenswrapper[4854]: I1007 13:52:49.813432 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p2lsl"] Oct 07 13:52:51 crc kubenswrapper[4854]: I1007 13:52:51.711117 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-p2lsl" podUID="bc9d3897-13e7-47db-bf2c-25613506e0cf" containerName="registry-server" containerID="cri-o://5908df7f134756b9e723cdb99f0495075f33fb5e91acdfa47550d229b151fdd5" gracePeriod=2 Oct 07 13:52:52 crc kubenswrapper[4854]: I1007 13:52:52.237815 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p2lsl" Oct 07 13:52:52 crc kubenswrapper[4854]: I1007 13:52:52.380014 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc9d3897-13e7-47db-bf2c-25613506e0cf-catalog-content\") pod \"bc9d3897-13e7-47db-bf2c-25613506e0cf\" (UID: \"bc9d3897-13e7-47db-bf2c-25613506e0cf\") " Oct 07 13:52:52 crc kubenswrapper[4854]: I1007 13:52:52.380830 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66bnl\" (UniqueName: \"kubernetes.io/projected/bc9d3897-13e7-47db-bf2c-25613506e0cf-kube-api-access-66bnl\") pod \"bc9d3897-13e7-47db-bf2c-25613506e0cf\" (UID: \"bc9d3897-13e7-47db-bf2c-25613506e0cf\") " Oct 07 13:52:52 crc kubenswrapper[4854]: I1007 13:52:52.380889 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc9d3897-13e7-47db-bf2c-25613506e0cf-utilities\") pod \"bc9d3897-13e7-47db-bf2c-25613506e0cf\" (UID: \"bc9d3897-13e7-47db-bf2c-25613506e0cf\") " Oct 07 13:52:52 crc kubenswrapper[4854]: I1007 13:52:52.383365 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc9d3897-13e7-47db-bf2c-25613506e0cf-utilities" (OuterVolumeSpecName: "utilities") pod "bc9d3897-13e7-47db-bf2c-25613506e0cf" (UID: "bc9d3897-13e7-47db-bf2c-25613506e0cf"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:52:52 crc kubenswrapper[4854]: I1007 13:52:52.390806 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc9d3897-13e7-47db-bf2c-25613506e0cf-kube-api-access-66bnl" (OuterVolumeSpecName: "kube-api-access-66bnl") pod "bc9d3897-13e7-47db-bf2c-25613506e0cf" (UID: "bc9d3897-13e7-47db-bf2c-25613506e0cf"). InnerVolumeSpecName "kube-api-access-66bnl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:52:52 crc kubenswrapper[4854]: I1007 13:52:52.458825 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc9d3897-13e7-47db-bf2c-25613506e0cf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bc9d3897-13e7-47db-bf2c-25613506e0cf" (UID: "bc9d3897-13e7-47db-bf2c-25613506e0cf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:52:52 crc kubenswrapper[4854]: I1007 13:52:52.483068 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66bnl\" (UniqueName: \"kubernetes.io/projected/bc9d3897-13e7-47db-bf2c-25613506e0cf-kube-api-access-66bnl\") on node \"crc\" DevicePath \"\"" Oct 07 13:52:52 crc kubenswrapper[4854]: I1007 13:52:52.483100 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc9d3897-13e7-47db-bf2c-25613506e0cf-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:52:52 crc kubenswrapper[4854]: I1007 13:52:52.483110 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc9d3897-13e7-47db-bf2c-25613506e0cf-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:52:52 crc kubenswrapper[4854]: I1007 13:52:52.725033 4854 generic.go:334] "Generic (PLEG): container finished" podID="bc9d3897-13e7-47db-bf2c-25613506e0cf" containerID="5908df7f134756b9e723cdb99f0495075f33fb5e91acdfa47550d229b151fdd5" exitCode=0 Oct 07 13:52:52 crc kubenswrapper[4854]: I1007 13:52:52.725095 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p2lsl" event={"ID":"bc9d3897-13e7-47db-bf2c-25613506e0cf","Type":"ContainerDied","Data":"5908df7f134756b9e723cdb99f0495075f33fb5e91acdfa47550d229b151fdd5"} Oct 07 13:52:52 crc kubenswrapper[4854]: I1007 13:52:52.725116 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p2lsl" Oct 07 13:52:52 crc kubenswrapper[4854]: I1007 13:52:52.725135 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p2lsl" event={"ID":"bc9d3897-13e7-47db-bf2c-25613506e0cf","Type":"ContainerDied","Data":"6a4389bd26e5280e41e197a9549606eb95faa63d8893b46bd6e3d43284c9f8a6"} Oct 07 13:52:52 crc kubenswrapper[4854]: I1007 13:52:52.725185 4854 scope.go:117] "RemoveContainer" containerID="5908df7f134756b9e723cdb99f0495075f33fb5e91acdfa47550d229b151fdd5" Oct 07 13:52:52 crc kubenswrapper[4854]: I1007 13:52:52.756633 4854 scope.go:117] "RemoveContainer" containerID="3731fcb4854bfe4828df12a5971f61a22aa09a7697922c430e664a752a60cbc5" Oct 07 13:52:52 crc kubenswrapper[4854]: I1007 13:52:52.783343 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p2lsl"] Oct 07 13:52:52 crc kubenswrapper[4854]: I1007 13:52:52.789532 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-p2lsl"] Oct 07 13:52:52 crc kubenswrapper[4854]: I1007 13:52:52.801562 4854 scope.go:117] "RemoveContainer" containerID="b37b32d6df4192feac703121b21f3db69f97b8e8b00f5530066c847c370a78c4" Oct 07 13:52:52 crc kubenswrapper[4854]: I1007 13:52:52.850321 4854 scope.go:117] "RemoveContainer" containerID="5908df7f134756b9e723cdb99f0495075f33fb5e91acdfa47550d229b151fdd5" Oct 07 13:52:52 crc kubenswrapper[4854]: E1007 13:52:52.850946 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5908df7f134756b9e723cdb99f0495075f33fb5e91acdfa47550d229b151fdd5\": container with ID starting with 5908df7f134756b9e723cdb99f0495075f33fb5e91acdfa47550d229b151fdd5 not found: ID does not exist" containerID="5908df7f134756b9e723cdb99f0495075f33fb5e91acdfa47550d229b151fdd5" Oct 07 13:52:52 crc kubenswrapper[4854]: I1007 13:52:52.851047 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5908df7f134756b9e723cdb99f0495075f33fb5e91acdfa47550d229b151fdd5"} err="failed to get container status \"5908df7f134756b9e723cdb99f0495075f33fb5e91acdfa47550d229b151fdd5\": rpc error: code = NotFound desc = could not find container \"5908df7f134756b9e723cdb99f0495075f33fb5e91acdfa47550d229b151fdd5\": container with ID starting with 5908df7f134756b9e723cdb99f0495075f33fb5e91acdfa47550d229b151fdd5 not found: ID does not exist" Oct 07 13:52:52 crc kubenswrapper[4854]: I1007 13:52:52.851127 4854 scope.go:117] "RemoveContainer" containerID="3731fcb4854bfe4828df12a5971f61a22aa09a7697922c430e664a752a60cbc5" Oct 07 13:52:52 crc kubenswrapper[4854]: E1007 13:52:52.851694 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3731fcb4854bfe4828df12a5971f61a22aa09a7697922c430e664a752a60cbc5\": container with ID starting with 3731fcb4854bfe4828df12a5971f61a22aa09a7697922c430e664a752a60cbc5 not found: ID does not exist" containerID="3731fcb4854bfe4828df12a5971f61a22aa09a7697922c430e664a752a60cbc5" Oct 07 13:52:52 crc kubenswrapper[4854]: I1007 13:52:52.851774 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3731fcb4854bfe4828df12a5971f61a22aa09a7697922c430e664a752a60cbc5"} err="failed to get container status \"3731fcb4854bfe4828df12a5971f61a22aa09a7697922c430e664a752a60cbc5\": rpc error: code = NotFound desc = could not find 
container \"3731fcb4854bfe4828df12a5971f61a22aa09a7697922c430e664a752a60cbc5\": container with ID starting with 3731fcb4854bfe4828df12a5971f61a22aa09a7697922c430e664a752a60cbc5 not found: ID does not exist" Oct 07 13:52:52 crc kubenswrapper[4854]: I1007 13:52:52.851804 4854 scope.go:117] "RemoveContainer" containerID="b37b32d6df4192feac703121b21f3db69f97b8e8b00f5530066c847c370a78c4" Oct 07 13:52:52 crc kubenswrapper[4854]: E1007 13:52:52.852166 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b37b32d6df4192feac703121b21f3db69f97b8e8b00f5530066c847c370a78c4\": container with ID starting with b37b32d6df4192feac703121b21f3db69f97b8e8b00f5530066c847c370a78c4 not found: ID does not exist" containerID="b37b32d6df4192feac703121b21f3db69f97b8e8b00f5530066c847c370a78c4" Oct 07 13:52:52 crc kubenswrapper[4854]: I1007 13:52:52.852213 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b37b32d6df4192feac703121b21f3db69f97b8e8b00f5530066c847c370a78c4"} err="failed to get container status \"b37b32d6df4192feac703121b21f3db69f97b8e8b00f5530066c847c370a78c4\": rpc error: code = NotFound desc = could not find container \"b37b32d6df4192feac703121b21f3db69f97b8e8b00f5530066c847c370a78c4\": container with ID starting with b37b32d6df4192feac703121b21f3db69f97b8e8b00f5530066c847c370a78c4 not found: ID does not exist" Oct 07 13:52:54 crc kubenswrapper[4854]: I1007 13:52:54.707769 4854 scope.go:117] "RemoveContainer" containerID="8f3095d8e8fcb8dde419577c13f52762f3d4a6c040f7266414571999ecd4046e" Oct 07 13:52:54 crc kubenswrapper[4854]: E1007 13:52:54.708384 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:52:54 crc kubenswrapper[4854]: I1007 13:52:54.721858 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc9d3897-13e7-47db-bf2c-25613506e0cf" path="/var/lib/kubelet/pods/bc9d3897-13e7-47db-bf2c-25613506e0cf/volumes" Oct 07 13:53:08 crc kubenswrapper[4854]: I1007 13:53:08.703818 4854 scope.go:117] "RemoveContainer" containerID="8f3095d8e8fcb8dde419577c13f52762f3d4a6c040f7266414571999ecd4046e" Oct 07 13:53:08 crc kubenswrapper[4854]: E1007 13:53:08.706902 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:53:20 crc kubenswrapper[4854]: I1007 13:53:20.702725 4854 scope.go:117] "RemoveContainer" containerID="8f3095d8e8fcb8dde419577c13f52762f3d4a6c040f7266414571999ecd4046e" Oct 07 13:53:20 crc kubenswrapper[4854]: E1007 13:53:20.703642 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:53:22 crc kubenswrapper[4854]: E1007 13:53:22.535711 4854 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.243:60104->38.102.83.243:43445: write tcp 38.102.83.243:60104->38.102.83.243:43445: write: broken pipe Oct 07 13:53:31 crc kubenswrapper[4854]: I1007 13:53:31.383813 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-r5dpl"] Oct 07 13:53:31 crc kubenswrapper[4854]: E1007 13:53:31.384624 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc9d3897-13e7-47db-bf2c-25613506e0cf" containerName="registry-server" Oct 07 13:53:31 crc kubenswrapper[4854]: I1007 13:53:31.384637 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc9d3897-13e7-47db-bf2c-25613506e0cf" containerName="registry-server" Oct 07 13:53:31 crc kubenswrapper[4854]: E1007 13:53:31.384656 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc9d3897-13e7-47db-bf2c-25613506e0cf" containerName="extract-utilities" Oct 07 13:53:31 crc kubenswrapper[4854]: I1007 13:53:31.384662 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc9d3897-13e7-47db-bf2c-25613506e0cf" containerName="extract-utilities" Oct 07 13:53:31 crc kubenswrapper[4854]: E1007 13:53:31.384673 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc9d3897-13e7-47db-bf2c-25613506e0cf" containerName="extract-content" Oct 07 13:53:31 crc kubenswrapper[4854]: I1007 13:53:31.384680 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc9d3897-13e7-47db-bf2c-25613506e0cf" containerName="extract-content" Oct 07 13:53:31 crc kubenswrapper[4854]: I1007 13:53:31.384859 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc9d3897-13e7-47db-bf2c-25613506e0cf" containerName="registry-server" Oct 07 13:53:31 crc kubenswrapper[4854]: I1007 13:53:31.385384 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-r5dpl" Oct 07 13:53:31 crc kubenswrapper[4854]: I1007 13:53:31.392448 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-r5dpl"] Oct 07 13:53:31 crc kubenswrapper[4854]: I1007 13:53:31.445845 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zl72w\" (UniqueName: \"kubernetes.io/projected/16b73e39-9c15-4fa3-9e93-896d656a9d48-kube-api-access-zl72w\") pod \"barbican-db-create-r5dpl\" (UID: \"16b73e39-9c15-4fa3-9e93-896d656a9d48\") " pod="openstack/barbican-db-create-r5dpl" Oct 07 13:53:31 crc kubenswrapper[4854]: I1007 13:53:31.546572 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zl72w\" (UniqueName: \"kubernetes.io/projected/16b73e39-9c15-4fa3-9e93-896d656a9d48-kube-api-access-zl72w\") pod \"barbican-db-create-r5dpl\" (UID: \"16b73e39-9c15-4fa3-9e93-896d656a9d48\") " pod="openstack/barbican-db-create-r5dpl" Oct 07 13:53:31 crc kubenswrapper[4854]: I1007 13:53:31.573999 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zl72w\" (UniqueName: \"kubernetes.io/projected/16b73e39-9c15-4fa3-9e93-896d656a9d48-kube-api-access-zl72w\") pod \"barbican-db-create-r5dpl\" (UID: \"16b73e39-9c15-4fa3-9e93-896d656a9d48\") " pod="openstack/barbican-db-create-r5dpl" Oct 07 13:53:31 crc kubenswrapper[4854]: I1007 13:53:31.702610 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-r5dpl" Oct 07 13:53:31 crc kubenswrapper[4854]: I1007 13:53:31.704020 4854 scope.go:117] "RemoveContainer" containerID="8f3095d8e8fcb8dde419577c13f52762f3d4a6c040f7266414571999ecd4046e" Oct 07 13:53:31 crc kubenswrapper[4854]: E1007 13:53:31.704377 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:53:32 crc kubenswrapper[4854]: I1007 13:53:32.192629 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-r5dpl"] Oct 07 13:53:33 crc kubenswrapper[4854]: I1007 13:53:33.182542 4854 generic.go:334] "Generic (PLEG): container finished" podID="16b73e39-9c15-4fa3-9e93-896d656a9d48" containerID="94898441204b9ad0926883493e28808ab3574e7a59ae8f51998dd9feb030da96" exitCode=0 Oct 07 13:53:33 crc kubenswrapper[4854]: I1007 13:53:33.182806 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-r5dpl" event={"ID":"16b73e39-9c15-4fa3-9e93-896d656a9d48","Type":"ContainerDied","Data":"94898441204b9ad0926883493e28808ab3574e7a59ae8f51998dd9feb030da96"} Oct 07 13:53:33 crc kubenswrapper[4854]: I1007 13:53:33.182831 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-r5dpl" event={"ID":"16b73e39-9c15-4fa3-9e93-896d656a9d48","Type":"ContainerStarted","Data":"b9f0097282d0a09ea20f5ac3da588cfe56d35d4c785a1620c87e6f61cbde7eae"} Oct 07 13:53:34 crc kubenswrapper[4854]: I1007 13:53:34.620189 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-r5dpl" Oct 07 13:53:34 crc kubenswrapper[4854]: I1007 13:53:34.805841 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zl72w\" (UniqueName: \"kubernetes.io/projected/16b73e39-9c15-4fa3-9e93-896d656a9d48-kube-api-access-zl72w\") pod \"16b73e39-9c15-4fa3-9e93-896d656a9d48\" (UID: \"16b73e39-9c15-4fa3-9e93-896d656a9d48\") " Oct 07 13:53:34 crc kubenswrapper[4854]: I1007 13:53:34.814603 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16b73e39-9c15-4fa3-9e93-896d656a9d48-kube-api-access-zl72w" (OuterVolumeSpecName: "kube-api-access-zl72w") pod "16b73e39-9c15-4fa3-9e93-896d656a9d48" (UID: "16b73e39-9c15-4fa3-9e93-896d656a9d48"). InnerVolumeSpecName "kube-api-access-zl72w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:53:34 crc kubenswrapper[4854]: I1007 13:53:34.908718 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zl72w\" (UniqueName: \"kubernetes.io/projected/16b73e39-9c15-4fa3-9e93-896d656a9d48-kube-api-access-zl72w\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:35 crc kubenswrapper[4854]: I1007 13:53:35.206998 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-r5dpl" event={"ID":"16b73e39-9c15-4fa3-9e93-896d656a9d48","Type":"ContainerDied","Data":"b9f0097282d0a09ea20f5ac3da588cfe56d35d4c785a1620c87e6f61cbde7eae"} Oct 07 13:53:35 crc kubenswrapper[4854]: I1007 13:53:35.207268 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-r5dpl" Oct 07 13:53:35 crc kubenswrapper[4854]: I1007 13:53:35.207273 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9f0097282d0a09ea20f5ac3da588cfe56d35d4c785a1620c87e6f61cbde7eae" Oct 07 13:53:41 crc kubenswrapper[4854]: I1007 13:53:41.419513 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-540c-account-create-dvvgd"] Oct 07 13:53:41 crc kubenswrapper[4854]: E1007 13:53:41.420684 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16b73e39-9c15-4fa3-9e93-896d656a9d48" containerName="mariadb-database-create" Oct 07 13:53:41 crc kubenswrapper[4854]: I1007 13:53:41.420709 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="16b73e39-9c15-4fa3-9e93-896d656a9d48" containerName="mariadb-database-create" Oct 07 13:53:41 crc kubenswrapper[4854]: I1007 13:53:41.421036 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="16b73e39-9c15-4fa3-9e93-896d656a9d48" containerName="mariadb-database-create" Oct 07 13:53:41 crc kubenswrapper[4854]: I1007 13:53:41.422313 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-540c-account-create-dvvgd" Oct 07 13:53:41 crc kubenswrapper[4854]: I1007 13:53:41.426636 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Oct 07 13:53:41 crc kubenswrapper[4854]: I1007 13:53:41.427843 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-540c-account-create-dvvgd"] Oct 07 13:53:41 crc kubenswrapper[4854]: I1007 13:53:41.532501 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vk75\" (UniqueName: \"kubernetes.io/projected/5ec29f36-a768-4b39-beba-e680db595dbf-kube-api-access-9vk75\") pod \"barbican-540c-account-create-dvvgd\" (UID: \"5ec29f36-a768-4b39-beba-e680db595dbf\") " pod="openstack/barbican-540c-account-create-dvvgd" Oct 07 13:53:41 crc kubenswrapper[4854]: I1007 13:53:41.634889 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vk75\" (UniqueName: \"kubernetes.io/projected/5ec29f36-a768-4b39-beba-e680db595dbf-kube-api-access-9vk75\") pod \"barbican-540c-account-create-dvvgd\" (UID: \"5ec29f36-a768-4b39-beba-e680db595dbf\") " pod="openstack/barbican-540c-account-create-dvvgd" Oct 07 13:53:41 crc kubenswrapper[4854]: I1007 13:53:41.676135 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vk75\" (UniqueName: \"kubernetes.io/projected/5ec29f36-a768-4b39-beba-e680db595dbf-kube-api-access-9vk75\") pod \"barbican-540c-account-create-dvvgd\" (UID: \"5ec29f36-a768-4b39-beba-e680db595dbf\") " pod="openstack/barbican-540c-account-create-dvvgd" Oct 07 13:53:41 crc kubenswrapper[4854]: I1007 13:53:41.768850 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-540c-account-create-dvvgd" Oct 07 13:53:42 crc kubenswrapper[4854]: I1007 13:53:42.077817 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-540c-account-create-dvvgd"] Oct 07 13:53:42 crc kubenswrapper[4854]: W1007 13:53:42.090600 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ec29f36_a768_4b39_beba_e680db595dbf.slice/crio-92587f04839cf672897b858e407e6a2dfce3e0fb3af05e67bd48bfa70a09ca83 WatchSource:0}: Error finding container 92587f04839cf672897b858e407e6a2dfce3e0fb3af05e67bd48bfa70a09ca83: Status 404 returned error can't find the container with id 92587f04839cf672897b858e407e6a2dfce3e0fb3af05e67bd48bfa70a09ca83 Oct 07 13:53:42 crc kubenswrapper[4854]: I1007 13:53:42.280941 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-540c-account-create-dvvgd" event={"ID":"5ec29f36-a768-4b39-beba-e680db595dbf","Type":"ContainerStarted","Data":"92587f04839cf672897b858e407e6a2dfce3e0fb3af05e67bd48bfa70a09ca83"} Oct 07 13:53:42 crc kubenswrapper[4854]: I1007 13:53:42.703748 4854 scope.go:117] "RemoveContainer" containerID="8f3095d8e8fcb8dde419577c13f52762f3d4a6c040f7266414571999ecd4046e" Oct 07 13:53:42 crc kubenswrapper[4854]: E1007 13:53:42.704261 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:53:43 crc kubenswrapper[4854]: I1007 13:53:43.291784 4854 generic.go:334] "Generic (PLEG): container finished" podID="5ec29f36-a768-4b39-beba-e680db595dbf" containerID="a9f7a9e2f357dedfca8c3ddacc7e07c3ff5a3263c7476c830c5ac6002d376f66" exitCode=0 Oct 07 13:53:43 crc kubenswrapper[4854]: I1007 13:53:43.291885 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-540c-account-create-dvvgd" event={"ID":"5ec29f36-a768-4b39-beba-e680db595dbf","Type":"ContainerDied","Data":"a9f7a9e2f357dedfca8c3ddacc7e07c3ff5a3263c7476c830c5ac6002d376f66"} Oct 07 13:53:44 crc kubenswrapper[4854]: I1007 13:53:44.691240 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-540c-account-create-dvvgd" Oct 07 13:53:44 crc kubenswrapper[4854]: I1007 13:53:44.798619 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vk75\" (UniqueName: \"kubernetes.io/projected/5ec29f36-a768-4b39-beba-e680db595dbf-kube-api-access-9vk75\") pod \"5ec29f36-a768-4b39-beba-e680db595dbf\" (UID: \"5ec29f36-a768-4b39-beba-e680db595dbf\") " Oct 07 13:53:44 crc kubenswrapper[4854]: I1007 13:53:44.803646 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ec29f36-a768-4b39-beba-e680db595dbf-kube-api-access-9vk75" (OuterVolumeSpecName: "kube-api-access-9vk75") pod "5ec29f36-a768-4b39-beba-e680db595dbf" (UID: "5ec29f36-a768-4b39-beba-e680db595dbf"). InnerVolumeSpecName "kube-api-access-9vk75". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:53:44 crc kubenswrapper[4854]: I1007 13:53:44.900687 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vk75\" (UniqueName: \"kubernetes.io/projected/5ec29f36-a768-4b39-beba-e680db595dbf-kube-api-access-9vk75\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:45 crc kubenswrapper[4854]: I1007 13:53:45.318232 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-540c-account-create-dvvgd" event={"ID":"5ec29f36-a768-4b39-beba-e680db595dbf","Type":"ContainerDied","Data":"92587f04839cf672897b858e407e6a2dfce3e0fb3af05e67bd48bfa70a09ca83"} Oct 07 13:53:45 crc kubenswrapper[4854]: I1007 13:53:45.318312 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92587f04839cf672897b858e407e6a2dfce3e0fb3af05e67bd48bfa70a09ca83" Oct 07 13:53:45 crc kubenswrapper[4854]: I1007 13:53:45.318420 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-540c-account-create-dvvgd" Oct 07 13:53:46 crc kubenswrapper[4854]: I1007 13:53:46.678220 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-npzds"] Oct 07 13:53:46 crc kubenswrapper[4854]: E1007 13:53:46.679174 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ec29f36-a768-4b39-beba-e680db595dbf" containerName="mariadb-account-create" Oct 07 13:53:46 crc kubenswrapper[4854]: I1007 13:53:46.679201 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ec29f36-a768-4b39-beba-e680db595dbf" containerName="mariadb-account-create" Oct 07 13:53:46 crc kubenswrapper[4854]: I1007 13:53:46.679512 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ec29f36-a768-4b39-beba-e680db595dbf" containerName="mariadb-account-create" Oct 07 13:53:46 crc kubenswrapper[4854]: I1007 13:53:46.680583 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-npzds" Oct 07 13:53:46 crc kubenswrapper[4854]: I1007 13:53:46.686011 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-tw7bv" Oct 07 13:53:46 crc kubenswrapper[4854]: I1007 13:53:46.686404 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 07 13:53:46 crc kubenswrapper[4854]: I1007 13:53:46.691666 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-npzds"] Oct 07 13:53:46 crc kubenswrapper[4854]: I1007 13:53:46.836203 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da560e8e-470c-4f4e-b14a-e90b4b0a40fc-combined-ca-bundle\") pod \"barbican-db-sync-npzds\" (UID: \"da560e8e-470c-4f4e-b14a-e90b4b0a40fc\") " pod="openstack/barbican-db-sync-npzds" Oct 07 13:53:46 crc kubenswrapper[4854]: I1007 13:53:46.836340 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9sf8\" (UniqueName: \"kubernetes.io/projected/da560e8e-470c-4f4e-b14a-e90b4b0a40fc-kube-api-access-d9sf8\") pod \"barbican-db-sync-npzds\" (UID: \"da560e8e-470c-4f4e-b14a-e90b4b0a40fc\") " pod="openstack/barbican-db-sync-npzds" Oct 07 13:53:46 crc kubenswrapper[4854]: I1007 13:53:46.836414 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/da560e8e-470c-4f4e-b14a-e90b4b0a40fc-db-sync-config-data\") pod \"barbican-db-sync-npzds\" (UID: \"da560e8e-470c-4f4e-b14a-e90b4b0a40fc\") " pod="openstack/barbican-db-sync-npzds" Oct 07 13:53:46 crc kubenswrapper[4854]: I1007 13:53:46.937779 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da560e8e-470c-4f4e-b14a-e90b4b0a40fc-combined-ca-bundle\") pod \"barbican-db-sync-npzds\" (UID: \"da560e8e-470c-4f4e-b14a-e90b4b0a40fc\") " pod="openstack/barbican-db-sync-npzds" Oct 07 13:53:46 crc kubenswrapper[4854]: I1007 13:53:46.937853 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9sf8\" (UniqueName: \"kubernetes.io/projected/da560e8e-470c-4f4e-b14a-e90b4b0a40fc-kube-api-access-d9sf8\") pod \"barbican-db-sync-npzds\" (UID: \"da560e8e-470c-4f4e-b14a-e90b4b0a40fc\") " pod="openstack/barbican-db-sync-npzds" Oct 07 13:53:46 crc kubenswrapper[4854]: I1007 13:53:46.937903 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/da560e8e-470c-4f4e-b14a-e90b4b0a40fc-db-sync-config-data\") pod \"barbican-db-sync-npzds\" (UID: \"da560e8e-470c-4f4e-b14a-e90b4b0a40fc\") " pod="openstack/barbican-db-sync-npzds" Oct 07 13:53:46 crc kubenswrapper[4854]: I1007 13:53:46.943687 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/da560e8e-470c-4f4e-b14a-e90b4b0a40fc-db-sync-config-data\") pod \"barbican-db-sync-npzds\" (UID: \"da560e8e-470c-4f4e-b14a-e90b4b0a40fc\") " pod="openstack/barbican-db-sync-npzds" Oct 07 13:53:46 crc kubenswrapper[4854]: I1007 13:53:46.946001 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/da560e8e-470c-4f4e-b14a-e90b4b0a40fc-combined-ca-bundle\") pod \"barbican-db-sync-npzds\" (UID: \"da560e8e-470c-4f4e-b14a-e90b4b0a40fc\") " pod="openstack/barbican-db-sync-npzds" Oct 07 13:53:46 crc kubenswrapper[4854]: I1007 13:53:46.963949 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9sf8\" (UniqueName: \"kubernetes.io/projected/da560e8e-470c-4f4e-b14a-e90b4b0a40fc-kube-api-access-d9sf8\") pod \"barbican-db-sync-npzds\" (UID: \"da560e8e-470c-4f4e-b14a-e90b4b0a40fc\") " pod="openstack/barbican-db-sync-npzds" Oct 07 13:53:47 crc kubenswrapper[4854]: I1007 13:53:47.001034 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-npzds" Oct 07 13:53:47 crc kubenswrapper[4854]: I1007 13:53:47.491832 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-npzds"] Oct 07 13:53:47 crc kubenswrapper[4854]: W1007 13:53:47.503543 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda560e8e_470c_4f4e_b14a_e90b4b0a40fc.slice/crio-a3393ec1f60a627e8e7b471cb5b74ef44e5c98d79628ff572abce7795c5903f9 WatchSource:0}: Error finding container a3393ec1f60a627e8e7b471cb5b74ef44e5c98d79628ff572abce7795c5903f9: Status 404 returned error can't find the container with id a3393ec1f60a627e8e7b471cb5b74ef44e5c98d79628ff572abce7795c5903f9 Oct 07 13:53:48 crc kubenswrapper[4854]: I1007 13:53:48.351331 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-npzds" event={"ID":"da560e8e-470c-4f4e-b14a-e90b4b0a40fc","Type":"ContainerStarted","Data":"c54d529ac6202c0e1e2098f8fd876e8a2c2c23ac1c6e793a62b7236a8e9db0c4"} Oct 07 13:53:48 crc kubenswrapper[4854]: I1007 13:53:48.351709 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-npzds" event={"ID":"da560e8e-470c-4f4e-b14a-e90b4b0a40fc","Type":"ContainerStarted","Data":"a3393ec1f60a627e8e7b471cb5b74ef44e5c98d79628ff572abce7795c5903f9"} Oct 07 13:53:48 crc kubenswrapper[4854]: I1007 13:53:48.376348 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-npzds" podStartSLOduration=2.376326361 podStartE2EDuration="2.376326361s" podCreationTimestamp="2025-10-07 13:53:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:53:48.368263571 +0000 UTC m=+5344.356095826" watchObservedRunningTime="2025-10-07 13:53:48.376326361 +0000 UTC m=+5344.364158616" Oct 07 13:53:49 crc kubenswrapper[4854]: I1007 13:53:49.362764 4854 generic.go:334] "Generic (PLEG): container finished" podID="da560e8e-470c-4f4e-b14a-e90b4b0a40fc" containerID="c54d529ac6202c0e1e2098f8fd876e8a2c2c23ac1c6e793a62b7236a8e9db0c4" exitCode=0 Oct 07 13:53:49 crc kubenswrapper[4854]: I1007 13:53:49.362918 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-npzds" event={"ID":"da560e8e-470c-4f4e-b14a-e90b4b0a40fc","Type":"ContainerDied","Data":"c54d529ac6202c0e1e2098f8fd876e8a2c2c23ac1c6e793a62b7236a8e9db0c4"} Oct 07 13:53:50 crc kubenswrapper[4854]: I1007 13:53:50.720924 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-npzds" Oct 07 13:53:50 crc kubenswrapper[4854]: I1007 13:53:50.807677 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da560e8e-470c-4f4e-b14a-e90b4b0a40fc-combined-ca-bundle\") pod \"da560e8e-470c-4f4e-b14a-e90b4b0a40fc\" (UID: \"da560e8e-470c-4f4e-b14a-e90b4b0a40fc\") " Oct 07 13:53:50 crc kubenswrapper[4854]: I1007 13:53:50.807829 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9sf8\" (UniqueName: \"kubernetes.io/projected/da560e8e-470c-4f4e-b14a-e90b4b0a40fc-kube-api-access-d9sf8\") pod \"da560e8e-470c-4f4e-b14a-e90b4b0a40fc\" (UID: \"da560e8e-470c-4f4e-b14a-e90b4b0a40fc\") " Oct 07 13:53:50 crc kubenswrapper[4854]: I1007 13:53:50.807894 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/da560e8e-470c-4f4e-b14a-e90b4b0a40fc-db-sync-config-data\") pod \"da560e8e-470c-4f4e-b14a-e90b4b0a40fc\" (UID: \"da560e8e-470c-4f4e-b14a-e90b4b0a40fc\") " Oct 07 13:53:50 crc kubenswrapper[4854]: I1007 13:53:50.813577 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da560e8e-470c-4f4e-b14a-e90b4b0a40fc-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "da560e8e-470c-4f4e-b14a-e90b4b0a40fc" (UID: "da560e8e-470c-4f4e-b14a-e90b4b0a40fc"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:53:50 crc kubenswrapper[4854]: I1007 13:53:50.814175 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da560e8e-470c-4f4e-b14a-e90b4b0a40fc-kube-api-access-d9sf8" (OuterVolumeSpecName: "kube-api-access-d9sf8") pod "da560e8e-470c-4f4e-b14a-e90b4b0a40fc" (UID: "da560e8e-470c-4f4e-b14a-e90b4b0a40fc"). InnerVolumeSpecName "kube-api-access-d9sf8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:53:50 crc kubenswrapper[4854]: I1007 13:53:50.850189 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da560e8e-470c-4f4e-b14a-e90b4b0a40fc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "da560e8e-470c-4f4e-b14a-e90b4b0a40fc" (UID: "da560e8e-470c-4f4e-b14a-e90b4b0a40fc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:53:50 crc kubenswrapper[4854]: I1007 13:53:50.910141 4854 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/da560e8e-470c-4f4e-b14a-e90b4b0a40fc-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:50 crc kubenswrapper[4854]: I1007 13:53:50.910195 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da560e8e-470c-4f4e-b14a-e90b4b0a40fc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:50 crc kubenswrapper[4854]: I1007 13:53:50.910208 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9sf8\" (UniqueName: \"kubernetes.io/projected/da560e8e-470c-4f4e-b14a-e90b4b0a40fc-kube-api-access-d9sf8\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:51 crc kubenswrapper[4854]: I1007 13:53:51.385179 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-npzds" event={"ID":"da560e8e-470c-4f4e-b14a-e90b4b0a40fc","Type":"ContainerDied","Data":"a3393ec1f60a627e8e7b471cb5b74ef44e5c98d79628ff572abce7795c5903f9"} Oct 07 13:53:51 crc kubenswrapper[4854]: I1007 13:53:51.385468 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3393ec1f60a627e8e7b471cb5b74ef44e5c98d79628ff572abce7795c5903f9" Oct 07 13:53:51 crc kubenswrapper[4854]: I1007 13:53:51.385264 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-npzds" Oct 07 13:53:51 crc kubenswrapper[4854]: I1007 13:53:51.637633 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-6c6fc899c8-gbvnz"] Oct 07 13:53:51 crc kubenswrapper[4854]: E1007 13:53:51.638054 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da560e8e-470c-4f4e-b14a-e90b4b0a40fc" containerName="barbican-db-sync" Oct 07 13:53:51 crc kubenswrapper[4854]: I1007 13:53:51.638074 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="da560e8e-470c-4f4e-b14a-e90b4b0a40fc" containerName="barbican-db-sync" Oct 07 13:53:51 crc kubenswrapper[4854]: I1007 13:53:51.638281 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="da560e8e-470c-4f4e-b14a-e90b4b0a40fc" containerName="barbican-db-sync" Oct 07 13:53:51 crc kubenswrapper[4854]: I1007 13:53:51.639344 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6c6fc899c8-gbvnz" Oct 07 13:53:51 crc kubenswrapper[4854]: I1007 13:53:51.641122 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-tw7bv" Oct 07 13:53:51 crc kubenswrapper[4854]: I1007 13:53:51.642876 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 07 13:53:51 crc kubenswrapper[4854]: I1007 13:53:51.643190 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 07 13:53:51 crc kubenswrapper[4854]: I1007 13:53:51.647035 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6c6fc899c8-gbvnz"] Oct 07 13:53:51 crc kubenswrapper[4854]: I1007 13:53:51.718201 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-6cb9795ff9-628qg"] Oct 07 13:53:51 crc kubenswrapper[4854]: I1007 13:53:51.719603 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6cb9795ff9-628qg" Oct 07 13:53:51 crc kubenswrapper[4854]: I1007 13:53:51.721264 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Oct 07 13:53:51 crc kubenswrapper[4854]: I1007 13:53:51.724970 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjhnf\" (UniqueName: \"kubernetes.io/projected/43fe2335-824f-4cad-bfff-ea8487237d61-kube-api-access-qjhnf\") pod \"barbican-keystone-listener-6c6fc899c8-gbvnz\" (UID: \"43fe2335-824f-4cad-bfff-ea8487237d61\") " pod="openstack/barbican-keystone-listener-6c6fc899c8-gbvnz" Oct 07 13:53:51 crc kubenswrapper[4854]: I1007 13:53:51.725203 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/43fe2335-824f-4cad-bfff-ea8487237d61-config-data-custom\") pod \"barbican-keystone-listener-6c6fc899c8-gbvnz\" (UID: \"43fe2335-824f-4cad-bfff-ea8487237d61\") " pod="openstack/barbican-keystone-listener-6c6fc899c8-gbvnz" Oct 07 13:53:51 crc kubenswrapper[4854]: I1007 13:53:51.725306 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43fe2335-824f-4cad-bfff-ea8487237d61-config-data\") pod \"barbican-keystone-listener-6c6fc899c8-gbvnz\" (UID: \"43fe2335-824f-4cad-bfff-ea8487237d61\") " pod="openstack/barbican-keystone-listener-6c6fc899c8-gbvnz" Oct 07 13:53:51 crc kubenswrapper[4854]: I1007 13:53:51.725372 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43fe2335-824f-4cad-bfff-ea8487237d61-combined-ca-bundle\") pod \"barbican-keystone-listener-6c6fc899c8-gbvnz\" (UID: \"43fe2335-824f-4cad-bfff-ea8487237d61\") " pod="openstack/barbican-keystone-listener-6c6fc899c8-gbvnz" Oct 07 13:53:51 crc kubenswrapper[4854]: I1007 13:53:51.725426 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43fe2335-824f-4cad-bfff-ea8487237d61-logs\") pod \"barbican-keystone-listener-6c6fc899c8-gbvnz\" (UID: \"43fe2335-824f-4cad-bfff-ea8487237d61\") " pod="openstack/barbican-keystone-listener-6c6fc899c8-gbvnz" Oct 07 13:53:51 crc kubenswrapper[4854]: I1007 13:53:51.741561 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6cb9795ff9-628qg"] Oct 07 13:53:51 crc kubenswrapper[4854]: I1007 13:53:51.776475 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c4cbb9589-4zg2l"] Oct 07 13:53:51 crc kubenswrapper[4854]: I1007 13:53:51.777998 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c4cbb9589-4zg2l" Oct 07 13:53:51 crc kubenswrapper[4854]: I1007 13:53:51.794228 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c4cbb9589-4zg2l"] Oct 07 13:53:51 crc kubenswrapper[4854]: I1007 13:53:51.864023 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjhnf\" (UniqueName: \"kubernetes.io/projected/43fe2335-824f-4cad-bfff-ea8487237d61-kube-api-access-qjhnf\") pod \"barbican-keystone-listener-6c6fc899c8-gbvnz\" (UID: \"43fe2335-824f-4cad-bfff-ea8487237d61\") " pod="openstack/barbican-keystone-listener-6c6fc899c8-gbvnz" Oct 07 13:53:51 crc kubenswrapper[4854]: I1007 13:53:51.864309 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/43fe2335-824f-4cad-bfff-ea8487237d61-config-data-custom\") pod \"barbican-keystone-listener-6c6fc899c8-gbvnz\" (UID: \"43fe2335-824f-4cad-bfff-ea8487237d61\") " pod="openstack/barbican-keystone-listener-6c6fc899c8-gbvnz" Oct 07 13:53:51 crc kubenswrapper[4854]: I1007 13:53:51.864396 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43fe2335-824f-4cad-bfff-ea8487237d61-config-data\") pod \"barbican-keystone-listener-6c6fc899c8-gbvnz\" (UID: \"43fe2335-824f-4cad-bfff-ea8487237d61\") " pod="openstack/barbican-keystone-listener-6c6fc899c8-gbvnz" Oct 07 13:53:51 crc kubenswrapper[4854]: I1007 13:53:51.864474 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43fe2335-824f-4cad-bfff-ea8487237d61-combined-ca-bundle\") pod \"barbican-keystone-listener-6c6fc899c8-gbvnz\" (UID: \"43fe2335-824f-4cad-bfff-ea8487237d61\") " pod="openstack/barbican-keystone-listener-6c6fc899c8-gbvnz" Oct 07 13:53:51 crc kubenswrapper[4854]: I1007 13:53:51.864511 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7636878-b435-49bd-850a-c610d62c62fe-combined-ca-bundle\") pod \"barbican-worker-6cb9795ff9-628qg\" (UID: \"d7636878-b435-49bd-850a-c610d62c62fe\") " pod="openstack/barbican-worker-6cb9795ff9-628qg" Oct 07 13:53:51 crc kubenswrapper[4854]: I1007 13:53:51.864575 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7636878-b435-49bd-850a-c610d62c62fe-config-data\") pod \"barbican-worker-6cb9795ff9-628qg\" (UID: \"d7636878-b435-49bd-850a-c610d62c62fe\") " pod="openstack/barbican-worker-6cb9795ff9-628qg" Oct 07 13:53:51 crc kubenswrapper[4854]: I1007 13:53:51.864609 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43fe2335-824f-4cad-bfff-ea8487237d61-logs\") pod \"barbican-keystone-listener-6c6fc899c8-gbvnz\" (UID: \"43fe2335-824f-4cad-bfff-ea8487237d61\") " pod="openstack/barbican-keystone-listener-6c6fc899c8-gbvnz" Oct 07 13:53:51 crc kubenswrapper[4854]: I1007 13:53:51.864887 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6f8m\" (UniqueName: \"kubernetes.io/projected/d7636878-b435-49bd-850a-c610d62c62fe-kube-api-access-g6f8m\") pod \"barbican-worker-6cb9795ff9-628qg\" (UID: \"d7636878-b435-49bd-850a-c610d62c62fe\") " 
pod="openstack/barbican-worker-6cb9795ff9-628qg" Oct 07 13:53:51 crc kubenswrapper[4854]: I1007 13:53:51.865012 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43fe2335-824f-4cad-bfff-ea8487237d61-logs\") pod \"barbican-keystone-listener-6c6fc899c8-gbvnz\" (UID: \"43fe2335-824f-4cad-bfff-ea8487237d61\") " pod="openstack/barbican-keystone-listener-6c6fc899c8-gbvnz" Oct 07 13:53:51 crc kubenswrapper[4854]: I1007 13:53:51.865161 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7636878-b435-49bd-850a-c610d62c62fe-logs\") pod \"barbican-worker-6cb9795ff9-628qg\" (UID: \"d7636878-b435-49bd-850a-c610d62c62fe\") " pod="openstack/barbican-worker-6cb9795ff9-628qg" Oct 07 13:53:51 crc kubenswrapper[4854]: I1007 13:53:51.865220 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d7636878-b435-49bd-850a-c610d62c62fe-config-data-custom\") pod \"barbican-worker-6cb9795ff9-628qg\" (UID: \"d7636878-b435-49bd-850a-c610d62c62fe\") " pod="openstack/barbican-worker-6cb9795ff9-628qg" Oct 07 13:53:51 crc kubenswrapper[4854]: I1007 13:53:51.884337 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43fe2335-824f-4cad-bfff-ea8487237d61-config-data\") pod \"barbican-keystone-listener-6c6fc899c8-gbvnz\" (UID: \"43fe2335-824f-4cad-bfff-ea8487237d61\") " pod="openstack/barbican-keystone-listener-6c6fc899c8-gbvnz" Oct 07 13:53:51 crc kubenswrapper[4854]: I1007 13:53:51.885302 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/43fe2335-824f-4cad-bfff-ea8487237d61-config-data-custom\") pod \"barbican-keystone-listener-6c6fc899c8-gbvnz\" (UID: \"43fe2335-824f-4cad-bfff-ea8487237d61\") " pod="openstack/barbican-keystone-listener-6c6fc899c8-gbvnz" Oct 07 13:53:51 crc kubenswrapper[4854]: I1007 13:53:51.886483 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43fe2335-824f-4cad-bfff-ea8487237d61-combined-ca-bundle\") pod \"barbican-keystone-listener-6c6fc899c8-gbvnz\" (UID: \"43fe2335-824f-4cad-bfff-ea8487237d61\") " pod="openstack/barbican-keystone-listener-6c6fc899c8-gbvnz" Oct 07 13:53:51 crc kubenswrapper[4854]: I1007 13:53:51.896751 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-574d8f6594-5smkj"] Oct 07 13:53:51 crc kubenswrapper[4854]: I1007 13:53:51.898191 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-574d8f6594-5smkj" Oct 07 13:53:51 crc kubenswrapper[4854]: I1007 13:53:51.900333 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjhnf\" (UniqueName: \"kubernetes.io/projected/43fe2335-824f-4cad-bfff-ea8487237d61-kube-api-access-qjhnf\") pod \"barbican-keystone-listener-6c6fc899c8-gbvnz\" (UID: \"43fe2335-824f-4cad-bfff-ea8487237d61\") " pod="openstack/barbican-keystone-listener-6c6fc899c8-gbvnz" Oct 07 13:53:51 crc kubenswrapper[4854]: I1007 13:53:51.900654 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Oct 07 13:53:51 crc kubenswrapper[4854]: I1007 13:53:51.910827 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-574d8f6594-5smkj"] Oct 07 13:53:51 crc kubenswrapper[4854]: I1007 13:53:51.966826 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d7636878-b435-49bd-850a-c610d62c62fe-config-data-custom\") pod \"barbican-worker-6cb9795ff9-628qg\" (UID: \"d7636878-b435-49bd-850a-c610d62c62fe\") " pod="openstack/barbican-worker-6cb9795ff9-628qg" Oct 07 13:53:51 crc kubenswrapper[4854]: I1007 13:53:51.966890 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/236c6cf4-03fe-4c4a-b1d4-7a68519c509b-ovsdbserver-sb\") pod \"dnsmasq-dns-5c4cbb9589-4zg2l\" (UID: \"236c6cf4-03fe-4c4a-b1d4-7a68519c509b\") " pod="openstack/dnsmasq-dns-5c4cbb9589-4zg2l" Oct 07 13:53:51 crc kubenswrapper[4854]: I1007 13:53:51.966913 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/236c6cf4-03fe-4c4a-b1d4-7a68519c509b-config\") pod \"dnsmasq-dns-5c4cbb9589-4zg2l\" (UID: \"236c6cf4-03fe-4c4a-b1d4-7a68519c509b\") " pod="openstack/dnsmasq-dns-5c4cbb9589-4zg2l" Oct 07 13:53:51 crc kubenswrapper[4854]: I1007 13:53:51.966934 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sz4sj\" (UniqueName: \"kubernetes.io/projected/236c6cf4-03fe-4c4a-b1d4-7a68519c509b-kube-api-access-sz4sj\") pod \"dnsmasq-dns-5c4cbb9589-4zg2l\" (UID: \"236c6cf4-03fe-4c4a-b1d4-7a68519c509b\") " pod="openstack/dnsmasq-dns-5c4cbb9589-4zg2l" Oct 07 13:53:51 crc kubenswrapper[4854]: I1007 13:53:51.966954 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/236c6cf4-03fe-4c4a-b1d4-7a68519c509b-dns-svc\") pod \"dnsmasq-dns-5c4cbb9589-4zg2l\" (UID: \"236c6cf4-03fe-4c4a-b1d4-7a68519c509b\") " pod="openstack/dnsmasq-dns-5c4cbb9589-4zg2l" Oct 07 13:53:51 crc kubenswrapper[4854]: I1007 13:53:51.967128 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7636878-b435-49bd-850a-c610d62c62fe-combined-ca-bundle\") pod \"barbican-worker-6cb9795ff9-628qg\" (UID: \"d7636878-b435-49bd-850a-c610d62c62fe\") " pod="openstack/barbican-worker-6cb9795ff9-628qg" Oct 07 13:53:51 crc kubenswrapper[4854]: I1007 13:53:51.967201 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/236c6cf4-03fe-4c4a-b1d4-7a68519c509b-ovsdbserver-nb\") pod 
\"dnsmasq-dns-5c4cbb9589-4zg2l\" (UID: \"236c6cf4-03fe-4c4a-b1d4-7a68519c509b\") " pod="openstack/dnsmasq-dns-5c4cbb9589-4zg2l" Oct 07 13:53:51 crc kubenswrapper[4854]: I1007 13:53:51.967261 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7636878-b435-49bd-850a-c610d62c62fe-config-data\") pod \"barbican-worker-6cb9795ff9-628qg\" (UID: \"d7636878-b435-49bd-850a-c610d62c62fe\") " pod="openstack/barbican-worker-6cb9795ff9-628qg" Oct 07 13:53:51 crc kubenswrapper[4854]: I1007 13:53:51.967321 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6f8m\" (UniqueName: \"kubernetes.io/projected/d7636878-b435-49bd-850a-c610d62c62fe-kube-api-access-g6f8m\") pod \"barbican-worker-6cb9795ff9-628qg\" (UID: \"d7636878-b435-49bd-850a-c610d62c62fe\") " pod="openstack/barbican-worker-6cb9795ff9-628qg" Oct 07 13:53:51 crc kubenswrapper[4854]: I1007 13:53:51.967413 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7636878-b435-49bd-850a-c610d62c62fe-logs\") pod \"barbican-worker-6cb9795ff9-628qg\" (UID: \"d7636878-b435-49bd-850a-c610d62c62fe\") " pod="openstack/barbican-worker-6cb9795ff9-628qg" Oct 07 13:53:51 crc kubenswrapper[4854]: I1007 13:53:51.967956 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7636878-b435-49bd-850a-c610d62c62fe-logs\") pod \"barbican-worker-6cb9795ff9-628qg\" (UID: \"d7636878-b435-49bd-850a-c610d62c62fe\") " pod="openstack/barbican-worker-6cb9795ff9-628qg" Oct 07 13:53:51 crc kubenswrapper[4854]: I1007 13:53:51.970697 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d7636878-b435-49bd-850a-c610d62c62fe-config-data-custom\") pod \"barbican-worker-6cb9795ff9-628qg\" (UID: \"d7636878-b435-49bd-850a-c610d62c62fe\") " pod="openstack/barbican-worker-6cb9795ff9-628qg" Oct 07 13:53:51 crc kubenswrapper[4854]: I1007 13:53:51.971296 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7636878-b435-49bd-850a-c610d62c62fe-combined-ca-bundle\") pod \"barbican-worker-6cb9795ff9-628qg\" (UID: \"d7636878-b435-49bd-850a-c610d62c62fe\") " pod="openstack/barbican-worker-6cb9795ff9-628qg" Oct 07 13:53:51 crc kubenswrapper[4854]: I1007 13:53:51.973237 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7636878-b435-49bd-850a-c610d62c62fe-config-data\") pod \"barbican-worker-6cb9795ff9-628qg\" (UID: \"d7636878-b435-49bd-850a-c610d62c62fe\") " pod="openstack/barbican-worker-6cb9795ff9-628qg" Oct 07 13:53:51 crc kubenswrapper[4854]: I1007 13:53:51.981813 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6f8m\" (UniqueName: \"kubernetes.io/projected/d7636878-b435-49bd-850a-c610d62c62fe-kube-api-access-g6f8m\") pod \"barbican-worker-6cb9795ff9-628qg\" (UID: \"d7636878-b435-49bd-850a-c610d62c62fe\") " pod="openstack/barbican-worker-6cb9795ff9-628qg" Oct 07 13:53:51 crc kubenswrapper[4854]: I1007 13:53:51.982795 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-6c6fc899c8-gbvnz" Oct 07 13:53:52 crc kubenswrapper[4854]: I1007 13:53:52.044738 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6cb9795ff9-628qg" Oct 07 13:53:52 crc kubenswrapper[4854]: I1007 13:53:52.068548 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/745402b7-4980-4af2-9c25-3195444f8960-config-data\") pod \"barbican-api-574d8f6594-5smkj\" (UID: \"745402b7-4980-4af2-9c25-3195444f8960\") " pod="openstack/barbican-api-574d8f6594-5smkj" Oct 07 13:53:52 crc kubenswrapper[4854]: I1007 13:53:52.068599 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/236c6cf4-03fe-4c4a-b1d4-7a68519c509b-ovsdbserver-sb\") pod \"dnsmasq-dns-5c4cbb9589-4zg2l\" (UID: \"236c6cf4-03fe-4c4a-b1d4-7a68519c509b\") " pod="openstack/dnsmasq-dns-5c4cbb9589-4zg2l" Oct 07 13:53:52 crc kubenswrapper[4854]: I1007 13:53:52.068619 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/236c6cf4-03fe-4c4a-b1d4-7a68519c509b-config\") pod \"dnsmasq-dns-5c4cbb9589-4zg2l\" (UID: \"236c6cf4-03fe-4c4a-b1d4-7a68519c509b\") " pod="openstack/dnsmasq-dns-5c4cbb9589-4zg2l" Oct 07 13:53:52 crc kubenswrapper[4854]: I1007 13:53:52.068642 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sz4sj\" (UniqueName: \"kubernetes.io/projected/236c6cf4-03fe-4c4a-b1d4-7a68519c509b-kube-api-access-sz4sj\") pod \"dnsmasq-dns-5c4cbb9589-4zg2l\" (UID: \"236c6cf4-03fe-4c4a-b1d4-7a68519c509b\") " pod="openstack/dnsmasq-dns-5c4cbb9589-4zg2l" Oct 07 13:53:52 crc kubenswrapper[4854]: I1007 13:53:52.068662 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/236c6cf4-03fe-4c4a-b1d4-7a68519c509b-dns-svc\") pod \"dnsmasq-dns-5c4cbb9589-4zg2l\" (UID: \"236c6cf4-03fe-4c4a-b1d4-7a68519c509b\") " pod="openstack/dnsmasq-dns-5c4cbb9589-4zg2l" Oct 07 13:53:52 crc kubenswrapper[4854]: I1007 13:53:52.068704 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/236c6cf4-03fe-4c4a-b1d4-7a68519c509b-ovsdbserver-nb\") pod \"dnsmasq-dns-5c4cbb9589-4zg2l\" (UID: \"236c6cf4-03fe-4c4a-b1d4-7a68519c509b\") " pod="openstack/dnsmasq-dns-5c4cbb9589-4zg2l" Oct 07 13:53:52 crc kubenswrapper[4854]: I1007 13:53:52.068741 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/745402b7-4980-4af2-9c25-3195444f8960-logs\") pod \"barbican-api-574d8f6594-5smkj\" (UID: \"745402b7-4980-4af2-9c25-3195444f8960\") " pod="openstack/barbican-api-574d8f6594-5smkj" Oct 07 13:53:52 crc kubenswrapper[4854]: I1007 13:53:52.068758 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/745402b7-4980-4af2-9c25-3195444f8960-config-data-custom\") pod \"barbican-api-574d8f6594-5smkj\" (UID: \"745402b7-4980-4af2-9c25-3195444f8960\") " pod="openstack/barbican-api-574d8f6594-5smkj" Oct 07 13:53:52 crc kubenswrapper[4854]: I1007 13:53:52.068806 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/745402b7-4980-4af2-9c25-3195444f8960-combined-ca-bundle\") pod \"barbican-api-574d8f6594-5smkj\" (UID: \"745402b7-4980-4af2-9c25-3195444f8960\") " pod="openstack/barbican-api-574d8f6594-5smkj" Oct 07 13:53:52 crc kubenswrapper[4854]: I1007 13:53:52.068826 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pddk6\" (UniqueName: \"kubernetes.io/projected/745402b7-4980-4af2-9c25-3195444f8960-kube-api-access-pddk6\") pod \"barbican-api-574d8f6594-5smkj\" (UID: \"745402b7-4980-4af2-9c25-3195444f8960\") " pod="openstack/barbican-api-574d8f6594-5smkj" Oct 07 13:53:52 crc kubenswrapper[4854]: I1007 13:53:52.069771 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/236c6cf4-03fe-4c4a-b1d4-7a68519c509b-dns-svc\") pod \"dnsmasq-dns-5c4cbb9589-4zg2l\" (UID: \"236c6cf4-03fe-4c4a-b1d4-7a68519c509b\") " pod="openstack/dnsmasq-dns-5c4cbb9589-4zg2l" Oct 07 13:53:52 crc kubenswrapper[4854]: I1007 13:53:52.069806 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/236c6cf4-03fe-4c4a-b1d4-7a68519c509b-config\") pod \"dnsmasq-dns-5c4cbb9589-4zg2l\" (UID: \"236c6cf4-03fe-4c4a-b1d4-7a68519c509b\") " pod="openstack/dnsmasq-dns-5c4cbb9589-4zg2l" Oct 07 13:53:52 crc kubenswrapper[4854]: I1007 13:53:52.069832 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/236c6cf4-03fe-4c4a-b1d4-7a68519c509b-ovsdbserver-sb\") pod \"dnsmasq-dns-5c4cbb9589-4zg2l\" (UID: \"236c6cf4-03fe-4c4a-b1d4-7a68519c509b\") " pod="openstack/dnsmasq-dns-5c4cbb9589-4zg2l" Oct 07 13:53:52 crc kubenswrapper[4854]: I1007 13:53:52.070011 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/236c6cf4-03fe-4c4a-b1d4-7a68519c509b-ovsdbserver-nb\") pod \"dnsmasq-dns-5c4cbb9589-4zg2l\" (UID: \"236c6cf4-03fe-4c4a-b1d4-7a68519c509b\") " pod="openstack/dnsmasq-dns-5c4cbb9589-4zg2l" Oct 07 13:53:52 crc kubenswrapper[4854]: I1007 13:53:52.090170 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sz4sj\" (UniqueName: \"kubernetes.io/projected/236c6cf4-03fe-4c4a-b1d4-7a68519c509b-kube-api-access-sz4sj\") pod \"dnsmasq-dns-5c4cbb9589-4zg2l\" (UID: \"236c6cf4-03fe-4c4a-b1d4-7a68519c509b\") " pod="openstack/dnsmasq-dns-5c4cbb9589-4zg2l" Oct 07 13:53:52 crc kubenswrapper[4854]: I1007 13:53:52.093583 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c4cbb9589-4zg2l" Oct 07 13:53:52 crc kubenswrapper[4854]: I1007 13:53:52.172610 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/745402b7-4980-4af2-9c25-3195444f8960-logs\") pod \"barbican-api-574d8f6594-5smkj\" (UID: \"745402b7-4980-4af2-9c25-3195444f8960\") " pod="openstack/barbican-api-574d8f6594-5smkj" Oct 07 13:53:52 crc kubenswrapper[4854]: I1007 13:53:52.172673 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/745402b7-4980-4af2-9c25-3195444f8960-config-data-custom\") pod \"barbican-api-574d8f6594-5smkj\" (UID: \"745402b7-4980-4af2-9c25-3195444f8960\") " pod="openstack/barbican-api-574d8f6594-5smkj" Oct 07 13:53:52 crc kubenswrapper[4854]: I1007 13:53:52.172748 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/745402b7-4980-4af2-9c25-3195444f8960-combined-ca-bundle\") pod \"barbican-api-574d8f6594-5smkj\" (UID: \"745402b7-4980-4af2-9c25-3195444f8960\") " pod="openstack/barbican-api-574d8f6594-5smkj" Oct 07 13:53:52 crc kubenswrapper[4854]: I1007 13:53:52.172770 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pddk6\" (UniqueName: \"kubernetes.io/projected/745402b7-4980-4af2-9c25-3195444f8960-kube-api-access-pddk6\") pod \"barbican-api-574d8f6594-5smkj\" (UID: \"745402b7-4980-4af2-9c25-3195444f8960\") " pod="openstack/barbican-api-574d8f6594-5smkj" Oct 07 13:53:52 crc kubenswrapper[4854]: I1007 13:53:52.172927 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/745402b7-4980-4af2-9c25-3195444f8960-config-data\") pod \"barbican-api-574d8f6594-5smkj\" (UID: \"745402b7-4980-4af2-9c25-3195444f8960\") " pod="openstack/barbican-api-574d8f6594-5smkj" Oct 07 13:53:52 crc kubenswrapper[4854]: I1007 13:53:52.174023 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/745402b7-4980-4af2-9c25-3195444f8960-logs\") pod \"barbican-api-574d8f6594-5smkj\" (UID: \"745402b7-4980-4af2-9c25-3195444f8960\") " pod="openstack/barbican-api-574d8f6594-5smkj" Oct 07 13:53:52 crc kubenswrapper[4854]: I1007 13:53:52.178377 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/745402b7-4980-4af2-9c25-3195444f8960-combined-ca-bundle\") pod \"barbican-api-574d8f6594-5smkj\" (UID: \"745402b7-4980-4af2-9c25-3195444f8960\") " pod="openstack/barbican-api-574d8f6594-5smkj" Oct 07 13:53:52 crc kubenswrapper[4854]: I1007 13:53:52.182423 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/745402b7-4980-4af2-9c25-3195444f8960-config-data\") pod \"barbican-api-574d8f6594-5smkj\" (UID: \"745402b7-4980-4af2-9c25-3195444f8960\") " pod="openstack/barbican-api-574d8f6594-5smkj" Oct 07 13:53:52 crc kubenswrapper[4854]: I1007 13:53:52.182950 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/745402b7-4980-4af2-9c25-3195444f8960-config-data-custom\") pod \"barbican-api-574d8f6594-5smkj\" (UID: \"745402b7-4980-4af2-9c25-3195444f8960\") " pod="openstack/barbican-api-574d8f6594-5smkj" Oct 07 13:53:52 crc 
kubenswrapper[4854]: I1007 13:53:52.195730 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pddk6\" (UniqueName: \"kubernetes.io/projected/745402b7-4980-4af2-9c25-3195444f8960-kube-api-access-pddk6\") pod \"barbican-api-574d8f6594-5smkj\" (UID: \"745402b7-4980-4af2-9c25-3195444f8960\") " pod="openstack/barbican-api-574d8f6594-5smkj" Oct 07 13:53:52 crc kubenswrapper[4854]: I1007 13:53:52.252519 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-574d8f6594-5smkj" Oct 07 13:53:52 crc kubenswrapper[4854]: I1007 13:53:52.464434 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6c6fc899c8-gbvnz"] Oct 07 13:53:52 crc kubenswrapper[4854]: W1007 13:53:52.470967 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43fe2335_824f_4cad_bfff_ea8487237d61.slice/crio-27945d2f6a497421c5a2c0d0b664e8635ec37784f73669f9bd8752e14add3d55 WatchSource:0}: Error finding container 27945d2f6a497421c5a2c0d0b664e8635ec37784f73669f9bd8752e14add3d55: Status 404 returned error can't find the container with id 27945d2f6a497421c5a2c0d0b664e8635ec37784f73669f9bd8752e14add3d55 Oct 07 13:53:52 crc kubenswrapper[4854]: I1007 13:53:52.525866 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6cb9795ff9-628qg"] Oct 07 13:53:52 crc kubenswrapper[4854]: I1007 13:53:52.628472 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c4cbb9589-4zg2l"] Oct 07 13:53:52 crc kubenswrapper[4854]: I1007 13:53:52.840794 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-574d8f6594-5smkj"] Oct 07 13:53:52 crc kubenswrapper[4854]: W1007 13:53:52.853399 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod745402b7_4980_4af2_9c25_3195444f8960.slice/crio-14e4b75843b7814186c71f84f8dd3ac10672377548e780d94895b9477bf2fe74 WatchSource:0}: Error finding container 14e4b75843b7814186c71f84f8dd3ac10672377548e780d94895b9477bf2fe74: Status 404 returned error can't find the container with id 14e4b75843b7814186c71f84f8dd3ac10672377548e780d94895b9477bf2fe74 Oct 07 13:53:53 crc kubenswrapper[4854]: I1007 13:53:53.406627 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6cb9795ff9-628qg" event={"ID":"d7636878-b435-49bd-850a-c610d62c62fe","Type":"ContainerStarted","Data":"95c2675e535fa619c79fccb4fad230f9e84edcbef4177cfc71bd90f846f58f0d"} Oct 07 13:53:53 crc kubenswrapper[4854]: I1007 13:53:53.407017 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6cb9795ff9-628qg" event={"ID":"d7636878-b435-49bd-850a-c610d62c62fe","Type":"ContainerStarted","Data":"3efcb0d764bc37246383e4a32555e3c22ab9e246256c78aa5b32b60637de2420"} Oct 07 13:53:53 crc kubenswrapper[4854]: I1007 13:53:53.407035 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6cb9795ff9-628qg" event={"ID":"d7636878-b435-49bd-850a-c610d62c62fe","Type":"ContainerStarted","Data":"73d77100d09da634aca53cab10af33f1ecb2822d4a7adbdd4561a7d23b5a7cc2"} Oct 07 13:53:53 crc kubenswrapper[4854]: I1007 13:53:53.413559 4854 generic.go:334] "Generic (PLEG): container finished" podID="236c6cf4-03fe-4c4a-b1d4-7a68519c509b" containerID="cf7591d792c13b3eb0d9267fd27664542d4c41c3491a7b153541d5dee1b1ce35" exitCode=0 Oct 07 13:53:53 crc 
kubenswrapper[4854]: I1007 13:53:53.413629 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c4cbb9589-4zg2l" event={"ID":"236c6cf4-03fe-4c4a-b1d4-7a68519c509b","Type":"ContainerDied","Data":"cf7591d792c13b3eb0d9267fd27664542d4c41c3491a7b153541d5dee1b1ce35"} Oct 07 13:53:53 crc kubenswrapper[4854]: I1007 13:53:53.413682 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c4cbb9589-4zg2l" event={"ID":"236c6cf4-03fe-4c4a-b1d4-7a68519c509b","Type":"ContainerStarted","Data":"bbf5bf8968e1a5744d146313b1b5f0a741913184b2e3e25bacbc654b149efb2d"} Oct 07 13:53:53 crc kubenswrapper[4854]: I1007 13:53:53.417955 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-574d8f6594-5smkj" event={"ID":"745402b7-4980-4af2-9c25-3195444f8960","Type":"ContainerStarted","Data":"5eaee5f9948031ebe818f09279a2e956b704192f0cb9d341d7060cbe4af15248"} Oct 07 13:53:53 crc kubenswrapper[4854]: I1007 13:53:53.418014 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-574d8f6594-5smkj" event={"ID":"745402b7-4980-4af2-9c25-3195444f8960","Type":"ContainerStarted","Data":"a009e7040ef5af0077e483a8c4d85e5b1c5879df7dbd2d292453fc2d4559c438"} Oct 07 13:53:53 crc kubenswrapper[4854]: I1007 13:53:53.418035 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-574d8f6594-5smkj" event={"ID":"745402b7-4980-4af2-9c25-3195444f8960","Type":"ContainerStarted","Data":"14e4b75843b7814186c71f84f8dd3ac10672377548e780d94895b9477bf2fe74"} Oct 07 13:53:53 crc kubenswrapper[4854]: I1007 13:53:53.418270 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-574d8f6594-5smkj" Oct 07 13:53:53 crc kubenswrapper[4854]: I1007 13:53:53.419110 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-574d8f6594-5smkj" Oct 07 13:53:53 crc kubenswrapper[4854]: I1007 13:53:53.426633 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-6cb9795ff9-628qg" podStartSLOduration=2.426610807 podStartE2EDuration="2.426610807s" podCreationTimestamp="2025-10-07 13:53:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:53:53.419939527 +0000 UTC m=+5349.407771782" watchObservedRunningTime="2025-10-07 13:53:53.426610807 +0000 UTC m=+5349.414443062" Oct 07 13:53:53 crc kubenswrapper[4854]: I1007 13:53:53.427017 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6c6fc899c8-gbvnz" event={"ID":"43fe2335-824f-4cad-bfff-ea8487237d61","Type":"ContainerStarted","Data":"7a5feeea8f49ebb8c3a60bb9caf2d009f72ee5b7a1c391b2493a23c8f459ef7d"} Oct 07 13:53:53 crc kubenswrapper[4854]: I1007 13:53:53.428124 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6c6fc899c8-gbvnz" event={"ID":"43fe2335-824f-4cad-bfff-ea8487237d61","Type":"ContainerStarted","Data":"5afeccbc54c1faf652fb092f96ca9cc2d33fb07425771484e3303db3b2daa81d"} Oct 07 13:53:53 crc kubenswrapper[4854]: I1007 13:53:53.428170 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6c6fc899c8-gbvnz" event={"ID":"43fe2335-824f-4cad-bfff-ea8487237d61","Type":"ContainerStarted","Data":"27945d2f6a497421c5a2c0d0b664e8635ec37784f73669f9bd8752e14add3d55"} Oct 07 13:53:53 crc kubenswrapper[4854]: I1007 13:53:53.448733 4854 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-574d8f6594-5smkj" podStartSLOduration=2.448666167 podStartE2EDuration="2.448666167s" podCreationTimestamp="2025-10-07 13:53:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:53:53.439367102 +0000 UTC m=+5349.427199347" watchObservedRunningTime="2025-10-07 13:53:53.448666167 +0000 UTC m=+5349.436498422" Oct 07 13:53:53 crc kubenswrapper[4854]: I1007 13:53:53.486337 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-6c6fc899c8-gbvnz" podStartSLOduration=2.486321004 podStartE2EDuration="2.486321004s" podCreationTimestamp="2025-10-07 13:53:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:53:53.486171249 +0000 UTC m=+5349.474003524" watchObservedRunningTime="2025-10-07 13:53:53.486321004 +0000 UTC m=+5349.474153259" Oct 07 13:53:54 crc kubenswrapper[4854]: I1007 13:53:54.439804 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c4cbb9589-4zg2l" event={"ID":"236c6cf4-03fe-4c4a-b1d4-7a68519c509b","Type":"ContainerStarted","Data":"15f2a324b3ce1b1b7bef3b4ffdf9cc6118ebe7432415cd9ff7238cdb732a107e"} Oct 07 13:53:54 crc kubenswrapper[4854]: I1007 13:53:54.440240 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c4cbb9589-4zg2l" Oct 07 13:53:54 crc kubenswrapper[4854]: I1007 13:53:54.462303 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c4cbb9589-4zg2l" podStartSLOduration=3.4622850769999998 podStartE2EDuration="3.462285077s" podCreationTimestamp="2025-10-07 13:53:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:53:54.456896183 +0000 UTC m=+5350.444728438" watchObservedRunningTime="2025-10-07 13:53:54.462285077 +0000 UTC m=+5350.450117332" Oct 07 13:53:55 crc kubenswrapper[4854]: I1007 13:53:55.702510 4854 scope.go:117] "RemoveContainer" containerID="8f3095d8e8fcb8dde419577c13f52762f3d4a6c040f7266414571999ecd4046e" Oct 07 13:53:55 crc kubenswrapper[4854]: E1007 13:53:55.702752 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:54:02 crc kubenswrapper[4854]: I1007 13:54:02.096362 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c4cbb9589-4zg2l" Oct 07 13:54:02 crc kubenswrapper[4854]: I1007 13:54:02.167651 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85887f4b95-lk5hg"] Oct 07 13:54:02 crc kubenswrapper[4854]: I1007 13:54:02.168305 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-85887f4b95-lk5hg" podUID="fefbc96f-0f15-4ad9-813f-763a96c25e30" containerName="dnsmasq-dns" containerID="cri-o://944fdd35ec9d291c34e62c80116adb620f505803872c14e3f989209fef4cf521" gracePeriod=10 Oct 07 13:54:02 crc 
kubenswrapper[4854]: I1007 13:54:02.546980 4854 generic.go:334] "Generic (PLEG): container finished" podID="fefbc96f-0f15-4ad9-813f-763a96c25e30" containerID="944fdd35ec9d291c34e62c80116adb620f505803872c14e3f989209fef4cf521" exitCode=0 Oct 07 13:54:02 crc kubenswrapper[4854]: I1007 13:54:02.547067 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85887f4b95-lk5hg" event={"ID":"fefbc96f-0f15-4ad9-813f-763a96c25e30","Type":"ContainerDied","Data":"944fdd35ec9d291c34e62c80116adb620f505803872c14e3f989209fef4cf521"} Oct 07 13:54:02 crc kubenswrapper[4854]: I1007 13:54:02.619307 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85887f4b95-lk5hg" Oct 07 13:54:02 crc kubenswrapper[4854]: I1007 13:54:02.776723 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fefbc96f-0f15-4ad9-813f-763a96c25e30-config\") pod \"fefbc96f-0f15-4ad9-813f-763a96c25e30\" (UID: \"fefbc96f-0f15-4ad9-813f-763a96c25e30\") " Oct 07 13:54:02 crc kubenswrapper[4854]: I1007 13:54:02.776782 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fefbc96f-0f15-4ad9-813f-763a96c25e30-ovsdbserver-nb\") pod \"fefbc96f-0f15-4ad9-813f-763a96c25e30\" (UID: \"fefbc96f-0f15-4ad9-813f-763a96c25e30\") " Oct 07 13:54:02 crc kubenswrapper[4854]: I1007 13:54:02.776825 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fefbc96f-0f15-4ad9-813f-763a96c25e30-ovsdbserver-sb\") pod \"fefbc96f-0f15-4ad9-813f-763a96c25e30\" (UID: \"fefbc96f-0f15-4ad9-813f-763a96c25e30\") " Oct 07 13:54:02 crc kubenswrapper[4854]: I1007 13:54:02.776927 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fefbc96f-0f15-4ad9-813f-763a96c25e30-dns-svc\") pod \"fefbc96f-0f15-4ad9-813f-763a96c25e30\" (UID: \"fefbc96f-0f15-4ad9-813f-763a96c25e30\") " Oct 07 13:54:02 crc kubenswrapper[4854]: I1007 13:54:02.777008 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7s6jf\" (UniqueName: \"kubernetes.io/projected/fefbc96f-0f15-4ad9-813f-763a96c25e30-kube-api-access-7s6jf\") pod \"fefbc96f-0f15-4ad9-813f-763a96c25e30\" (UID: \"fefbc96f-0f15-4ad9-813f-763a96c25e30\") " Oct 07 13:54:02 crc kubenswrapper[4854]: I1007 13:54:02.796559 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fefbc96f-0f15-4ad9-813f-763a96c25e30-kube-api-access-7s6jf" (OuterVolumeSpecName: "kube-api-access-7s6jf") pod "fefbc96f-0f15-4ad9-813f-763a96c25e30" (UID: "fefbc96f-0f15-4ad9-813f-763a96c25e30"). InnerVolumeSpecName "kube-api-access-7s6jf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:54:02 crc kubenswrapper[4854]: I1007 13:54:02.821100 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fefbc96f-0f15-4ad9-813f-763a96c25e30-config" (OuterVolumeSpecName: "config") pod "fefbc96f-0f15-4ad9-813f-763a96c25e30" (UID: "fefbc96f-0f15-4ad9-813f-763a96c25e30"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:54:02 crc kubenswrapper[4854]: I1007 13:54:02.824619 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fefbc96f-0f15-4ad9-813f-763a96c25e30-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fefbc96f-0f15-4ad9-813f-763a96c25e30" (UID: "fefbc96f-0f15-4ad9-813f-763a96c25e30"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:54:02 crc kubenswrapper[4854]: I1007 13:54:02.838699 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fefbc96f-0f15-4ad9-813f-763a96c25e30-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fefbc96f-0f15-4ad9-813f-763a96c25e30" (UID: "fefbc96f-0f15-4ad9-813f-763a96c25e30"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:54:02 crc kubenswrapper[4854]: I1007 13:54:02.844264 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fefbc96f-0f15-4ad9-813f-763a96c25e30-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fefbc96f-0f15-4ad9-813f-763a96c25e30" (UID: "fefbc96f-0f15-4ad9-813f-763a96c25e30"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:54:02 crc kubenswrapper[4854]: I1007 13:54:02.879422 4854 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fefbc96f-0f15-4ad9-813f-763a96c25e30-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 13:54:02 crc kubenswrapper[4854]: I1007 13:54:02.879454 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7s6jf\" (UniqueName: \"kubernetes.io/projected/fefbc96f-0f15-4ad9-813f-763a96c25e30-kube-api-access-7s6jf\") on node \"crc\" DevicePath \"\"" Oct 07 13:54:02 crc kubenswrapper[4854]: I1007 13:54:02.879464 4854 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fefbc96f-0f15-4ad9-813f-763a96c25e30-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:54:02 crc kubenswrapper[4854]: I1007 13:54:02.879473 4854 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fefbc96f-0f15-4ad9-813f-763a96c25e30-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 13:54:02 crc kubenswrapper[4854]: I1007 13:54:02.879483 4854 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fefbc96f-0f15-4ad9-813f-763a96c25e30-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 13:54:03 crc kubenswrapper[4854]: I1007 13:54:03.570799 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85887f4b95-lk5hg" event={"ID":"fefbc96f-0f15-4ad9-813f-763a96c25e30","Type":"ContainerDied","Data":"0eda816e3f5d26a2ff581d4def7c110ed0aedf96a8b0192e11a807fdd6adfaf7"} Oct 07 13:54:03 crc kubenswrapper[4854]: I1007 13:54:03.570862 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85887f4b95-lk5hg" Oct 07 13:54:03 crc kubenswrapper[4854]: I1007 13:54:03.570905 4854 scope.go:117] "RemoveContainer" containerID="944fdd35ec9d291c34e62c80116adb620f505803872c14e3f989209fef4cf521" Oct 07 13:54:03 crc kubenswrapper[4854]: I1007 13:54:03.604465 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-574d8f6594-5smkj" Oct 07 13:54:03 crc kubenswrapper[4854]: I1007 13:54:03.627295 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85887f4b95-lk5hg"] Oct 07 13:54:03 crc kubenswrapper[4854]: I1007 13:54:03.627917 4854 scope.go:117] "RemoveContainer" containerID="49927adb42284504a9f85efd83a85d5e15c25ec75515b8158d731b454c713685" Oct 07 13:54:03 crc kubenswrapper[4854]: I1007 13:54:03.641894 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85887f4b95-lk5hg"] Oct 07 13:54:03 crc kubenswrapper[4854]: I1007 13:54:03.781846 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-574d8f6594-5smkj" Oct 07 13:54:04 crc kubenswrapper[4854]: I1007 13:54:04.731710 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fefbc96f-0f15-4ad9-813f-763a96c25e30" path="/var/lib/kubelet/pods/fefbc96f-0f15-4ad9-813f-763a96c25e30/volumes" Oct 07 13:54:08 crc kubenswrapper[4854]: I1007 13:54:08.702332 4854 scope.go:117] "RemoveContainer" containerID="8f3095d8e8fcb8dde419577c13f52762f3d4a6c040f7266414571999ecd4046e" Oct 07 13:54:08 crc kubenswrapper[4854]: E1007 13:54:08.702840 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:54:18 crc kubenswrapper[4854]: I1007 13:54:18.135024 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-n2lx2"] Oct 07 13:54:18 crc kubenswrapper[4854]: E1007 13:54:18.136356 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fefbc96f-0f15-4ad9-813f-763a96c25e30" containerName="init" Oct 07 13:54:18 crc kubenswrapper[4854]: I1007 13:54:18.136381 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="fefbc96f-0f15-4ad9-813f-763a96c25e30" containerName="init" Oct 07 13:54:18 crc kubenswrapper[4854]: E1007 13:54:18.136406 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fefbc96f-0f15-4ad9-813f-763a96c25e30" containerName="dnsmasq-dns" Oct 07 13:54:18 crc kubenswrapper[4854]: I1007 13:54:18.136415 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="fefbc96f-0f15-4ad9-813f-763a96c25e30" containerName="dnsmasq-dns" Oct 07 13:54:18 crc kubenswrapper[4854]: I1007 13:54:18.136682 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="fefbc96f-0f15-4ad9-813f-763a96c25e30" containerName="dnsmasq-dns" Oct 07 13:54:18 crc kubenswrapper[4854]: I1007 13:54:18.137719 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-n2lx2" Oct 07 13:54:18 crc kubenswrapper[4854]: I1007 13:54:18.144586 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-n2lx2"] Oct 07 13:54:18 crc kubenswrapper[4854]: I1007 13:54:18.267410 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ckqp\" (UniqueName: \"kubernetes.io/projected/977c4588-d10e-4792-a3e5-111bbf45f37f-kube-api-access-8ckqp\") pod \"neutron-db-create-n2lx2\" (UID: \"977c4588-d10e-4792-a3e5-111bbf45f37f\") " pod="openstack/neutron-db-create-n2lx2" Oct 07 13:54:18 crc kubenswrapper[4854]: I1007 13:54:18.369836 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ckqp\" (UniqueName: \"kubernetes.io/projected/977c4588-d10e-4792-a3e5-111bbf45f37f-kube-api-access-8ckqp\") pod \"neutron-db-create-n2lx2\" (UID: \"977c4588-d10e-4792-a3e5-111bbf45f37f\") " pod="openstack/neutron-db-create-n2lx2" Oct 07 13:54:18 crc kubenswrapper[4854]: I1007 13:54:18.408041 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ckqp\" (UniqueName: \"kubernetes.io/projected/977c4588-d10e-4792-a3e5-111bbf45f37f-kube-api-access-8ckqp\") pod \"neutron-db-create-n2lx2\" (UID: \"977c4588-d10e-4792-a3e5-111bbf45f37f\") " pod="openstack/neutron-db-create-n2lx2" Oct 07 13:54:18 crc kubenswrapper[4854]: I1007 13:54:18.467884 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-n2lx2" Oct 07 13:54:18 crc kubenswrapper[4854]: I1007 13:54:18.947521 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-n2lx2"] Oct 07 13:54:19 crc kubenswrapper[4854]: I1007 13:54:19.768614 4854 generic.go:334] "Generic (PLEG): container finished" podID="977c4588-d10e-4792-a3e5-111bbf45f37f" containerID="c6730326036e7cf76206447bfc9286ccf192805e8a7ebd87432d6869c2f90e8c" exitCode=0 Oct 07 13:54:19 crc kubenswrapper[4854]: I1007 13:54:19.768704 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-n2lx2" event={"ID":"977c4588-d10e-4792-a3e5-111bbf45f37f","Type":"ContainerDied","Data":"c6730326036e7cf76206447bfc9286ccf192805e8a7ebd87432d6869c2f90e8c"} Oct 07 13:54:19 crc kubenswrapper[4854]: I1007 13:54:19.769365 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-n2lx2" event={"ID":"977c4588-d10e-4792-a3e5-111bbf45f37f","Type":"ContainerStarted","Data":"d6891c1feb02e52b592f4d28a7b613f4a655cc0c6480461bbb5e28c105883711"} Oct 07 13:54:21 crc kubenswrapper[4854]: I1007 13:54:21.264672 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-n2lx2" Oct 07 13:54:21 crc kubenswrapper[4854]: I1007 13:54:21.443613 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ckqp\" (UniqueName: \"kubernetes.io/projected/977c4588-d10e-4792-a3e5-111bbf45f37f-kube-api-access-8ckqp\") pod \"977c4588-d10e-4792-a3e5-111bbf45f37f\" (UID: \"977c4588-d10e-4792-a3e5-111bbf45f37f\") " Oct 07 13:54:21 crc kubenswrapper[4854]: I1007 13:54:21.452750 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/977c4588-d10e-4792-a3e5-111bbf45f37f-kube-api-access-8ckqp" (OuterVolumeSpecName: "kube-api-access-8ckqp") pod "977c4588-d10e-4792-a3e5-111bbf45f37f" (UID: "977c4588-d10e-4792-a3e5-111bbf45f37f"). 
InnerVolumeSpecName "kube-api-access-8ckqp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:54:21 crc kubenswrapper[4854]: I1007 13:54:21.546221 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ckqp\" (UniqueName: \"kubernetes.io/projected/977c4588-d10e-4792-a3e5-111bbf45f37f-kube-api-access-8ckqp\") on node \"crc\" DevicePath \"\"" Oct 07 13:54:21 crc kubenswrapper[4854]: I1007 13:54:21.792516 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-n2lx2" event={"ID":"977c4588-d10e-4792-a3e5-111bbf45f37f","Type":"ContainerDied","Data":"d6891c1feb02e52b592f4d28a7b613f4a655cc0c6480461bbb5e28c105883711"} Oct 07 13:54:21 crc kubenswrapper[4854]: I1007 13:54:21.792555 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6891c1feb02e52b592f4d28a7b613f4a655cc0c6480461bbb5e28c105883711" Oct 07 13:54:21 crc kubenswrapper[4854]: I1007 13:54:21.792916 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-n2lx2" Oct 07 13:54:23 crc kubenswrapper[4854]: I1007 13:54:23.702368 4854 scope.go:117] "RemoveContainer" containerID="8f3095d8e8fcb8dde419577c13f52762f3d4a6c040f7266414571999ecd4046e" Oct 07 13:54:23 crc kubenswrapper[4854]: E1007 13:54:23.702790 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:54:28 crc kubenswrapper[4854]: I1007 13:54:28.264113 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-d74c-account-create-kchkm"] Oct 07 13:54:28 crc kubenswrapper[4854]: E1007 13:54:28.265237 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="977c4588-d10e-4792-a3e5-111bbf45f37f" containerName="mariadb-database-create" Oct 07 13:54:28 crc kubenswrapper[4854]: I1007 13:54:28.265270 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="977c4588-d10e-4792-a3e5-111bbf45f37f" containerName="mariadb-database-create" Oct 07 13:54:28 crc kubenswrapper[4854]: I1007 13:54:28.265736 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="977c4588-d10e-4792-a3e5-111bbf45f37f" containerName="mariadb-database-create" Oct 07 13:54:28 crc kubenswrapper[4854]: I1007 13:54:28.267082 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-d74c-account-create-kchkm" Oct 07 13:54:28 crc kubenswrapper[4854]: I1007 13:54:28.284284 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-d74c-account-create-kchkm"] Oct 07 13:54:28 crc kubenswrapper[4854]: I1007 13:54:28.298572 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Oct 07 13:54:28 crc kubenswrapper[4854]: I1007 13:54:28.371857 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whd8x\" (UniqueName: \"kubernetes.io/projected/be5bf13e-a91f-45c2-b7b7-5acaa12b0c49-kube-api-access-whd8x\") pod \"neutron-d74c-account-create-kchkm\" (UID: \"be5bf13e-a91f-45c2-b7b7-5acaa12b0c49\") " pod="openstack/neutron-d74c-account-create-kchkm" Oct 07 13:54:28 crc kubenswrapper[4854]: I1007 13:54:28.474050 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whd8x\" (UniqueName: \"kubernetes.io/projected/be5bf13e-a91f-45c2-b7b7-5acaa12b0c49-kube-api-access-whd8x\") pod \"neutron-d74c-account-create-kchkm\" (UID: \"be5bf13e-a91f-45c2-b7b7-5acaa12b0c49\") " pod="openstack/neutron-d74c-account-create-kchkm" Oct 07 13:54:28 crc kubenswrapper[4854]: I1007 13:54:28.492975 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whd8x\" (UniqueName: \"kubernetes.io/projected/be5bf13e-a91f-45c2-b7b7-5acaa12b0c49-kube-api-access-whd8x\") pod \"neutron-d74c-account-create-kchkm\" (UID: \"be5bf13e-a91f-45c2-b7b7-5acaa12b0c49\") " pod="openstack/neutron-d74c-account-create-kchkm" Oct 07 13:54:28 crc kubenswrapper[4854]: I1007 13:54:28.598118 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-d74c-account-create-kchkm" Oct 07 13:54:29 crc kubenswrapper[4854]: I1007 13:54:29.044495 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-d74c-account-create-kchkm"] Oct 07 13:54:29 crc kubenswrapper[4854]: I1007 13:54:29.879060 4854 generic.go:334] "Generic (PLEG): container finished" podID="be5bf13e-a91f-45c2-b7b7-5acaa12b0c49" containerID="bef464f12e82227f4575a1e08ee74e815686e974f3903c927ef00f78a0e4fa41" exitCode=0 Oct 07 13:54:29 crc kubenswrapper[4854]: I1007 13:54:29.879137 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d74c-account-create-kchkm" event={"ID":"be5bf13e-a91f-45c2-b7b7-5acaa12b0c49","Type":"ContainerDied","Data":"bef464f12e82227f4575a1e08ee74e815686e974f3903c927ef00f78a0e4fa41"} Oct 07 13:54:29 crc kubenswrapper[4854]: I1007 13:54:29.879210 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d74c-account-create-kchkm" event={"ID":"be5bf13e-a91f-45c2-b7b7-5acaa12b0c49","Type":"ContainerStarted","Data":"694b1c5fcd5cbe9e032a1c04d6577449b736417d877409056c36fdc5743d946f"} Oct 07 13:54:31 crc kubenswrapper[4854]: I1007 13:54:31.262183 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-d74c-account-create-kchkm" Oct 07 13:54:31 crc kubenswrapper[4854]: I1007 13:54:31.428893 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whd8x\" (UniqueName: \"kubernetes.io/projected/be5bf13e-a91f-45c2-b7b7-5acaa12b0c49-kube-api-access-whd8x\") pod \"be5bf13e-a91f-45c2-b7b7-5acaa12b0c49\" (UID: \"be5bf13e-a91f-45c2-b7b7-5acaa12b0c49\") " Oct 07 13:54:31 crc kubenswrapper[4854]: I1007 13:54:31.437692 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be5bf13e-a91f-45c2-b7b7-5acaa12b0c49-kube-api-access-whd8x" (OuterVolumeSpecName: "kube-api-access-whd8x") pod "be5bf13e-a91f-45c2-b7b7-5acaa12b0c49" (UID: "be5bf13e-a91f-45c2-b7b7-5acaa12b0c49"). InnerVolumeSpecName "kube-api-access-whd8x". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:54:31 crc kubenswrapper[4854]: I1007 13:54:31.531985 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whd8x\" (UniqueName: \"kubernetes.io/projected/be5bf13e-a91f-45c2-b7b7-5acaa12b0c49-kube-api-access-whd8x\") on node \"crc\" DevicePath \"\"" Oct 07 13:54:31 crc kubenswrapper[4854]: I1007 13:54:31.902806 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d74c-account-create-kchkm" event={"ID":"be5bf13e-a91f-45c2-b7b7-5acaa12b0c49","Type":"ContainerDied","Data":"694b1c5fcd5cbe9e032a1c04d6577449b736417d877409056c36fdc5743d946f"} Oct 07 13:54:31 crc kubenswrapper[4854]: I1007 13:54:31.902852 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="694b1c5fcd5cbe9e032a1c04d6577449b736417d877409056c36fdc5743d946f" Oct 07 13:54:31 crc kubenswrapper[4854]: I1007 13:54:31.902948 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-d74c-account-create-kchkm" Oct 07 13:54:33 crc kubenswrapper[4854]: I1007 13:54:33.392038 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-gbrck"] Oct 07 13:54:33 crc kubenswrapper[4854]: E1007 13:54:33.393069 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be5bf13e-a91f-45c2-b7b7-5acaa12b0c49" containerName="mariadb-account-create" Oct 07 13:54:33 crc kubenswrapper[4854]: I1007 13:54:33.393103 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="be5bf13e-a91f-45c2-b7b7-5acaa12b0c49" containerName="mariadb-account-create" Oct 07 13:54:33 crc kubenswrapper[4854]: I1007 13:54:33.393576 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="be5bf13e-a91f-45c2-b7b7-5acaa12b0c49" containerName="mariadb-account-create" Oct 07 13:54:33 crc kubenswrapper[4854]: I1007 13:54:33.394874 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-gbrck" Oct 07 13:54:33 crc kubenswrapper[4854]: I1007 13:54:33.397087 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 07 13:54:33 crc kubenswrapper[4854]: I1007 13:54:33.397989 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-6929f" Oct 07 13:54:33 crc kubenswrapper[4854]: I1007 13:54:33.398253 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 07 13:54:33 crc kubenswrapper[4854]: I1007 13:54:33.405707 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-gbrck"] Oct 07 13:54:33 crc kubenswrapper[4854]: I1007 13:54:33.569405 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/48dd4b01-cc3a-47cd-b66f-3eed3eed17ac-config\") pod \"neutron-db-sync-gbrck\" (UID: \"48dd4b01-cc3a-47cd-b66f-3eed3eed17ac\") " pod="openstack/neutron-db-sync-gbrck" Oct 07 13:54:33 crc kubenswrapper[4854]: I1007 13:54:33.569676 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dkgq\" (UniqueName: \"kubernetes.io/projected/48dd4b01-cc3a-47cd-b66f-3eed3eed17ac-kube-api-access-2dkgq\") pod \"neutron-db-sync-gbrck\" (UID: \"48dd4b01-cc3a-47cd-b66f-3eed3eed17ac\") " pod="openstack/neutron-db-sync-gbrck" Oct 07 13:54:33 crc kubenswrapper[4854]: I1007 13:54:33.569818 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48dd4b01-cc3a-47cd-b66f-3eed3eed17ac-combined-ca-bundle\") pod \"neutron-db-sync-gbrck\" (UID: \"48dd4b01-cc3a-47cd-b66f-3eed3eed17ac\") " pod="openstack/neutron-db-sync-gbrck" Oct 07 13:54:33 crc kubenswrapper[4854]: I1007 13:54:33.671019 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/48dd4b01-cc3a-47cd-b66f-3eed3eed17ac-config\") pod \"neutron-db-sync-gbrck\" (UID: \"48dd4b01-cc3a-47cd-b66f-3eed3eed17ac\") " pod="openstack/neutron-db-sync-gbrck" Oct 07 13:54:33 crc kubenswrapper[4854]: I1007 13:54:33.671214 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dkgq\" (UniqueName: \"kubernetes.io/projected/48dd4b01-cc3a-47cd-b66f-3eed3eed17ac-kube-api-access-2dkgq\") pod \"neutron-db-sync-gbrck\" (UID: \"48dd4b01-cc3a-47cd-b66f-3eed3eed17ac\") " pod="openstack/neutron-db-sync-gbrck" Oct 07 13:54:33 crc kubenswrapper[4854]: I1007 13:54:33.671281 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48dd4b01-cc3a-47cd-b66f-3eed3eed17ac-combined-ca-bundle\") pod \"neutron-db-sync-gbrck\" (UID: \"48dd4b01-cc3a-47cd-b66f-3eed3eed17ac\") " pod="openstack/neutron-db-sync-gbrck" Oct 07 13:54:33 crc kubenswrapper[4854]: I1007 13:54:33.677326 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48dd4b01-cc3a-47cd-b66f-3eed3eed17ac-combined-ca-bundle\") pod \"neutron-db-sync-gbrck\" (UID: \"48dd4b01-cc3a-47cd-b66f-3eed3eed17ac\") " pod="openstack/neutron-db-sync-gbrck" Oct 07 13:54:33 crc kubenswrapper[4854]: I1007 13:54:33.685556 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/secret/48dd4b01-cc3a-47cd-b66f-3eed3eed17ac-config\") pod \"neutron-db-sync-gbrck\" (UID: \"48dd4b01-cc3a-47cd-b66f-3eed3eed17ac\") " pod="openstack/neutron-db-sync-gbrck" Oct 07 13:54:33 crc kubenswrapper[4854]: I1007 13:54:33.688872 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dkgq\" (UniqueName: \"kubernetes.io/projected/48dd4b01-cc3a-47cd-b66f-3eed3eed17ac-kube-api-access-2dkgq\") pod \"neutron-db-sync-gbrck\" (UID: \"48dd4b01-cc3a-47cd-b66f-3eed3eed17ac\") " pod="openstack/neutron-db-sync-gbrck" Oct 07 13:54:33 crc kubenswrapper[4854]: I1007 13:54:33.721829 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-gbrck" Oct 07 13:54:34 crc kubenswrapper[4854]: I1007 13:54:34.222416 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-gbrck"] Oct 07 13:54:34 crc kubenswrapper[4854]: I1007 13:54:34.940705 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-gbrck" event={"ID":"48dd4b01-cc3a-47cd-b66f-3eed3eed17ac","Type":"ContainerStarted","Data":"17bf3e4416c2cf0f463c74e0acd70e269e1dcdbc5a1c014d14def2c2f65ec64f"} Oct 07 13:54:34 crc kubenswrapper[4854]: I1007 13:54:34.941031 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-gbrck" event={"ID":"48dd4b01-cc3a-47cd-b66f-3eed3eed17ac","Type":"ContainerStarted","Data":"620bdbd0592316aa93c743451ae654e7e83ff26f55a296ee074deae8a9748b04"} Oct 07 13:54:34 crc kubenswrapper[4854]: I1007 13:54:34.968566 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-gbrck" podStartSLOduration=1.968548791 podStartE2EDuration="1.968548791s" podCreationTimestamp="2025-10-07 13:54:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:54:34.963221779 +0000 UTC m=+5390.951054034" watchObservedRunningTime="2025-10-07 13:54:34.968548791 +0000 UTC m=+5390.956381046" Oct 07 13:54:36 crc kubenswrapper[4854]: I1007 13:54:36.703294 4854 scope.go:117] "RemoveContainer" containerID="8f3095d8e8fcb8dde419577c13f52762f3d4a6c040f7266414571999ecd4046e" Oct 07 13:54:36 crc kubenswrapper[4854]: E1007 13:54:36.703921 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:54:38 crc kubenswrapper[4854]: I1007 13:54:38.985263 4854 generic.go:334] "Generic (PLEG): container finished" podID="48dd4b01-cc3a-47cd-b66f-3eed3eed17ac" containerID="17bf3e4416c2cf0f463c74e0acd70e269e1dcdbc5a1c014d14def2c2f65ec64f" exitCode=0 Oct 07 13:54:38 crc kubenswrapper[4854]: I1007 13:54:38.985332 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-gbrck" event={"ID":"48dd4b01-cc3a-47cd-b66f-3eed3eed17ac","Type":"ContainerDied","Data":"17bf3e4416c2cf0f463c74e0acd70e269e1dcdbc5a1c014d14def2c2f65ec64f"} Oct 07 13:54:40 crc kubenswrapper[4854]: I1007 13:54:40.347846 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-gbrck" Oct 07 13:54:40 crc kubenswrapper[4854]: I1007 13:54:40.494423 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48dd4b01-cc3a-47cd-b66f-3eed3eed17ac-combined-ca-bundle\") pod \"48dd4b01-cc3a-47cd-b66f-3eed3eed17ac\" (UID: \"48dd4b01-cc3a-47cd-b66f-3eed3eed17ac\") " Oct 07 13:54:40 crc kubenswrapper[4854]: I1007 13:54:40.494528 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dkgq\" (UniqueName: \"kubernetes.io/projected/48dd4b01-cc3a-47cd-b66f-3eed3eed17ac-kube-api-access-2dkgq\") pod \"48dd4b01-cc3a-47cd-b66f-3eed3eed17ac\" (UID: \"48dd4b01-cc3a-47cd-b66f-3eed3eed17ac\") " Oct 07 13:54:40 crc kubenswrapper[4854]: I1007 13:54:40.494615 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/48dd4b01-cc3a-47cd-b66f-3eed3eed17ac-config\") pod \"48dd4b01-cc3a-47cd-b66f-3eed3eed17ac\" (UID: \"48dd4b01-cc3a-47cd-b66f-3eed3eed17ac\") " Oct 07 13:54:40 crc kubenswrapper[4854]: I1007 13:54:40.500767 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48dd4b01-cc3a-47cd-b66f-3eed3eed17ac-kube-api-access-2dkgq" (OuterVolumeSpecName: "kube-api-access-2dkgq") pod "48dd4b01-cc3a-47cd-b66f-3eed3eed17ac" (UID: "48dd4b01-cc3a-47cd-b66f-3eed3eed17ac"). InnerVolumeSpecName "kube-api-access-2dkgq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:54:40 crc kubenswrapper[4854]: I1007 13:54:40.517432 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48dd4b01-cc3a-47cd-b66f-3eed3eed17ac-config" (OuterVolumeSpecName: "config") pod "48dd4b01-cc3a-47cd-b66f-3eed3eed17ac" (UID: "48dd4b01-cc3a-47cd-b66f-3eed3eed17ac"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:54:40 crc kubenswrapper[4854]: I1007 13:54:40.517649 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48dd4b01-cc3a-47cd-b66f-3eed3eed17ac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "48dd4b01-cc3a-47cd-b66f-3eed3eed17ac" (UID: "48dd4b01-cc3a-47cd-b66f-3eed3eed17ac"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:54:40 crc kubenswrapper[4854]: I1007 13:54:40.596903 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48dd4b01-cc3a-47cd-b66f-3eed3eed17ac-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:54:40 crc kubenswrapper[4854]: I1007 13:54:40.596951 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dkgq\" (UniqueName: \"kubernetes.io/projected/48dd4b01-cc3a-47cd-b66f-3eed3eed17ac-kube-api-access-2dkgq\") on node \"crc\" DevicePath \"\"" Oct 07 13:54:40 crc kubenswrapper[4854]: I1007 13:54:40.596968 4854 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/48dd4b01-cc3a-47cd-b66f-3eed3eed17ac-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:54:41 crc kubenswrapper[4854]: I1007 13:54:41.009243 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-gbrck" event={"ID":"48dd4b01-cc3a-47cd-b66f-3eed3eed17ac","Type":"ContainerDied","Data":"620bdbd0592316aa93c743451ae654e7e83ff26f55a296ee074deae8a9748b04"} Oct 07 13:54:41 crc kubenswrapper[4854]: I1007 13:54:41.009309 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="620bdbd0592316aa93c743451ae654e7e83ff26f55a296ee074deae8a9748b04" Oct 07 13:54:41 crc kubenswrapper[4854]: I1007 13:54:41.009360 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-gbrck" Oct 07 13:54:41 crc kubenswrapper[4854]: I1007 13:54:41.254340 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-65cd69c4d9-w5szp"] Oct 07 13:54:41 crc kubenswrapper[4854]: E1007 13:54:41.254694 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48dd4b01-cc3a-47cd-b66f-3eed3eed17ac" containerName="neutron-db-sync" Oct 07 13:54:41 crc kubenswrapper[4854]: I1007 13:54:41.254710 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="48dd4b01-cc3a-47cd-b66f-3eed3eed17ac" containerName="neutron-db-sync" Oct 07 13:54:41 crc kubenswrapper[4854]: I1007 13:54:41.254874 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="48dd4b01-cc3a-47cd-b66f-3eed3eed17ac" containerName="neutron-db-sync" Oct 07 13:54:41 crc kubenswrapper[4854]: I1007 13:54:41.255822 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65cd69c4d9-w5szp" Oct 07 13:54:41 crc kubenswrapper[4854]: I1007 13:54:41.281861 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65cd69c4d9-w5szp"] Oct 07 13:54:41 crc kubenswrapper[4854]: I1007 13:54:41.320846 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/107eee8d-eb8a-4e06-973c-92521472ce75-config\") pod \"dnsmasq-dns-65cd69c4d9-w5szp\" (UID: \"107eee8d-eb8a-4e06-973c-92521472ce75\") " pod="openstack/dnsmasq-dns-65cd69c4d9-w5szp" Oct 07 13:54:41 crc kubenswrapper[4854]: I1007 13:54:41.320919 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/107eee8d-eb8a-4e06-973c-92521472ce75-ovsdbserver-nb\") pod \"dnsmasq-dns-65cd69c4d9-w5szp\" (UID: \"107eee8d-eb8a-4e06-973c-92521472ce75\") " pod="openstack/dnsmasq-dns-65cd69c4d9-w5szp" Oct 07 13:54:41 crc kubenswrapper[4854]: I1007 13:54:41.320966 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtkjn\" (UniqueName: \"kubernetes.io/projected/107eee8d-eb8a-4e06-973c-92521472ce75-kube-api-access-mtkjn\") pod \"dnsmasq-dns-65cd69c4d9-w5szp\" (UID: \"107eee8d-eb8a-4e06-973c-92521472ce75\") " pod="openstack/dnsmasq-dns-65cd69c4d9-w5szp" Oct 07 13:54:41 crc kubenswrapper[4854]: I1007 13:54:41.321016 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/107eee8d-eb8a-4e06-973c-92521472ce75-dns-svc\") pod \"dnsmasq-dns-65cd69c4d9-w5szp\" (UID: \"107eee8d-eb8a-4e06-973c-92521472ce75\") " pod="openstack/dnsmasq-dns-65cd69c4d9-w5szp" Oct 07 13:54:41 crc kubenswrapper[4854]: I1007 13:54:41.321033 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/107eee8d-eb8a-4e06-973c-92521472ce75-ovsdbserver-sb\") pod \"dnsmasq-dns-65cd69c4d9-w5szp\" (UID: \"107eee8d-eb8a-4e06-973c-92521472ce75\") " pod="openstack/dnsmasq-dns-65cd69c4d9-w5szp" Oct 07 13:54:41 crc kubenswrapper[4854]: I1007 13:54:41.406500 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6d49dfff85-kb4jj"] Oct 07 13:54:41 crc kubenswrapper[4854]: I1007 13:54:41.410932 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6d49dfff85-kb4jj" Oct 07 13:54:41 crc kubenswrapper[4854]: I1007 13:54:41.418126 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 07 13:54:41 crc kubenswrapper[4854]: I1007 13:54:41.418219 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-6929f" Oct 07 13:54:41 crc kubenswrapper[4854]: I1007 13:54:41.422901 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtkjn\" (UniqueName: \"kubernetes.io/projected/107eee8d-eb8a-4e06-973c-92521472ce75-kube-api-access-mtkjn\") pod \"dnsmasq-dns-65cd69c4d9-w5szp\" (UID: \"107eee8d-eb8a-4e06-973c-92521472ce75\") " pod="openstack/dnsmasq-dns-65cd69c4d9-w5szp" Oct 07 13:54:41 crc kubenswrapper[4854]: I1007 13:54:41.422997 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/107eee8d-eb8a-4e06-973c-92521472ce75-dns-svc\") pod \"dnsmasq-dns-65cd69c4d9-w5szp\" (UID: \"107eee8d-eb8a-4e06-973c-92521472ce75\") " pod="openstack/dnsmasq-dns-65cd69c4d9-w5szp" Oct 07 13:54:41 crc kubenswrapper[4854]: I1007 13:54:41.423029 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/107eee8d-eb8a-4e06-973c-92521472ce75-ovsdbserver-sb\") pod \"dnsmasq-dns-65cd69c4d9-w5szp\" (UID: \"107eee8d-eb8a-4e06-973c-92521472ce75\") " pod="openstack/dnsmasq-dns-65cd69c4d9-w5szp" Oct 07 13:54:41 crc kubenswrapper[4854]: I1007 13:54:41.423178 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/107eee8d-eb8a-4e06-973c-92521472ce75-config\") pod \"dnsmasq-dns-65cd69c4d9-w5szp\" (UID: \"107eee8d-eb8a-4e06-973c-92521472ce75\") " pod="openstack/dnsmasq-dns-65cd69c4d9-w5szp" Oct 07 13:54:41 crc kubenswrapper[4854]: I1007 13:54:41.423213 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/107eee8d-eb8a-4e06-973c-92521472ce75-ovsdbserver-nb\") pod \"dnsmasq-dns-65cd69c4d9-w5szp\" (UID: \"107eee8d-eb8a-4e06-973c-92521472ce75\") " pod="openstack/dnsmasq-dns-65cd69c4d9-w5szp" Oct 07 13:54:41 crc kubenswrapper[4854]: I1007 13:54:41.424001 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/107eee8d-eb8a-4e06-973c-92521472ce75-ovsdbserver-sb\") pod \"dnsmasq-dns-65cd69c4d9-w5szp\" (UID: \"107eee8d-eb8a-4e06-973c-92521472ce75\") " pod="openstack/dnsmasq-dns-65cd69c4d9-w5szp" Oct 07 13:54:41 crc kubenswrapper[4854]: I1007 13:54:41.424442 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/107eee8d-eb8a-4e06-973c-92521472ce75-dns-svc\") pod \"dnsmasq-dns-65cd69c4d9-w5szp\" (UID: \"107eee8d-eb8a-4e06-973c-92521472ce75\") " pod="openstack/dnsmasq-dns-65cd69c4d9-w5szp" Oct 07 13:54:41 crc kubenswrapper[4854]: I1007 13:54:41.424524 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/107eee8d-eb8a-4e06-973c-92521472ce75-ovsdbserver-nb\") pod \"dnsmasq-dns-65cd69c4d9-w5szp\" (UID: \"107eee8d-eb8a-4e06-973c-92521472ce75\") " pod="openstack/dnsmasq-dns-65cd69c4d9-w5szp" Oct 07 13:54:41 crc kubenswrapper[4854]: I1007 13:54:41.424535 4854 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/107eee8d-eb8a-4e06-973c-92521472ce75-config\") pod \"dnsmasq-dns-65cd69c4d9-w5szp\" (UID: \"107eee8d-eb8a-4e06-973c-92521472ce75\") " pod="openstack/dnsmasq-dns-65cd69c4d9-w5szp" Oct 07 13:54:41 crc kubenswrapper[4854]: I1007 13:54:41.426299 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6d49dfff85-kb4jj"] Oct 07 13:54:41 crc kubenswrapper[4854]: I1007 13:54:41.437966 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 07 13:54:41 crc kubenswrapper[4854]: I1007 13:54:41.446185 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtkjn\" (UniqueName: \"kubernetes.io/projected/107eee8d-eb8a-4e06-973c-92521472ce75-kube-api-access-mtkjn\") pod \"dnsmasq-dns-65cd69c4d9-w5szp\" (UID: \"107eee8d-eb8a-4e06-973c-92521472ce75\") " pod="openstack/dnsmasq-dns-65cd69c4d9-w5szp" Oct 07 13:54:41 crc kubenswrapper[4854]: I1007 13:54:41.524188 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clcts\" (UniqueName: \"kubernetes.io/projected/8fe6746e-d7d8-41be-bb1a-63f0aa67044a-kube-api-access-clcts\") pod \"neutron-6d49dfff85-kb4jj\" (UID: \"8fe6746e-d7d8-41be-bb1a-63f0aa67044a\") " pod="openstack/neutron-6d49dfff85-kb4jj" Oct 07 13:54:41 crc kubenswrapper[4854]: I1007 13:54:41.524477 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8fe6746e-d7d8-41be-bb1a-63f0aa67044a-httpd-config\") pod \"neutron-6d49dfff85-kb4jj\" (UID: \"8fe6746e-d7d8-41be-bb1a-63f0aa67044a\") " pod="openstack/neutron-6d49dfff85-kb4jj" Oct 07 13:54:41 crc kubenswrapper[4854]: I1007 13:54:41.524559 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8fe6746e-d7d8-41be-bb1a-63f0aa67044a-config\") pod \"neutron-6d49dfff85-kb4jj\" (UID: \"8fe6746e-d7d8-41be-bb1a-63f0aa67044a\") " pod="openstack/neutron-6d49dfff85-kb4jj" Oct 07 13:54:41 crc kubenswrapper[4854]: I1007 13:54:41.524598 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fe6746e-d7d8-41be-bb1a-63f0aa67044a-combined-ca-bundle\") pod \"neutron-6d49dfff85-kb4jj\" (UID: \"8fe6746e-d7d8-41be-bb1a-63f0aa67044a\") " pod="openstack/neutron-6d49dfff85-kb4jj" Oct 07 13:54:41 crc kubenswrapper[4854]: I1007 13:54:41.625584 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8fe6746e-d7d8-41be-bb1a-63f0aa67044a-httpd-config\") pod \"neutron-6d49dfff85-kb4jj\" (UID: \"8fe6746e-d7d8-41be-bb1a-63f0aa67044a\") " pod="openstack/neutron-6d49dfff85-kb4jj" Oct 07 13:54:41 crc kubenswrapper[4854]: I1007 13:54:41.625638 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8fe6746e-d7d8-41be-bb1a-63f0aa67044a-config\") pod \"neutron-6d49dfff85-kb4jj\" (UID: \"8fe6746e-d7d8-41be-bb1a-63f0aa67044a\") " pod="openstack/neutron-6d49dfff85-kb4jj" Oct 07 13:54:41 crc kubenswrapper[4854]: I1007 13:54:41.625675 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8fe6746e-d7d8-41be-bb1a-63f0aa67044a-combined-ca-bundle\") pod \"neutron-6d49dfff85-kb4jj\" (UID: \"8fe6746e-d7d8-41be-bb1a-63f0aa67044a\") " pod="openstack/neutron-6d49dfff85-kb4jj" Oct 07 13:54:41 crc kubenswrapper[4854]: I1007 13:54:41.625726 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clcts\" (UniqueName: \"kubernetes.io/projected/8fe6746e-d7d8-41be-bb1a-63f0aa67044a-kube-api-access-clcts\") pod \"neutron-6d49dfff85-kb4jj\" (UID: \"8fe6746e-d7d8-41be-bb1a-63f0aa67044a\") " pod="openstack/neutron-6d49dfff85-kb4jj" Oct 07 13:54:41 crc kubenswrapper[4854]: I1007 13:54:41.629727 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8fe6746e-d7d8-41be-bb1a-63f0aa67044a-httpd-config\") pod \"neutron-6d49dfff85-kb4jj\" (UID: \"8fe6746e-d7d8-41be-bb1a-63f0aa67044a\") " pod="openstack/neutron-6d49dfff85-kb4jj" Oct 07 13:54:41 crc kubenswrapper[4854]: I1007 13:54:41.630077 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fe6746e-d7d8-41be-bb1a-63f0aa67044a-combined-ca-bundle\") pod \"neutron-6d49dfff85-kb4jj\" (UID: \"8fe6746e-d7d8-41be-bb1a-63f0aa67044a\") " pod="openstack/neutron-6d49dfff85-kb4jj" Oct 07 13:54:41 crc kubenswrapper[4854]: I1007 13:54:41.630731 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65cd69c4d9-w5szp" Oct 07 13:54:41 crc kubenswrapper[4854]: I1007 13:54:41.631023 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/8fe6746e-d7d8-41be-bb1a-63f0aa67044a-config\") pod \"neutron-6d49dfff85-kb4jj\" (UID: \"8fe6746e-d7d8-41be-bb1a-63f0aa67044a\") " pod="openstack/neutron-6d49dfff85-kb4jj" Oct 07 13:54:41 crc kubenswrapper[4854]: I1007 13:54:41.645324 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clcts\" (UniqueName: \"kubernetes.io/projected/8fe6746e-d7d8-41be-bb1a-63f0aa67044a-kube-api-access-clcts\") pod \"neutron-6d49dfff85-kb4jj\" (UID: \"8fe6746e-d7d8-41be-bb1a-63f0aa67044a\") " pod="openstack/neutron-6d49dfff85-kb4jj" Oct 07 13:54:41 crc kubenswrapper[4854]: I1007 13:54:41.743010 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6d49dfff85-kb4jj" Oct 07 13:54:42 crc kubenswrapper[4854]: I1007 13:54:42.129975 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65cd69c4d9-w5szp"] Oct 07 13:54:42 crc kubenswrapper[4854]: I1007 13:54:42.310683 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6d49dfff85-kb4jj"] Oct 07 13:54:42 crc kubenswrapper[4854]: W1007 13:54:42.313482 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8fe6746e_d7d8_41be_bb1a_63f0aa67044a.slice/crio-ad0b04d8e65a2f290d49a1c8c9b886ee06b213ca9fb22d28ec33f5695deb4a62 WatchSource:0}: Error finding container ad0b04d8e65a2f290d49a1c8c9b886ee06b213ca9fb22d28ec33f5695deb4a62: Status 404 returned error can't find the container with id ad0b04d8e65a2f290d49a1c8c9b886ee06b213ca9fb22d28ec33f5695deb4a62 Oct 07 13:54:43 crc kubenswrapper[4854]: I1007 13:54:43.029077 4854 generic.go:334] "Generic (PLEG): container finished" podID="107eee8d-eb8a-4e06-973c-92521472ce75" containerID="619c69f8dd1682d3820a73b821bd9820740d2977dc4c4c8907b03ded28b4f1ba" exitCode=0 Oct 07 13:54:43 crc kubenswrapper[4854]: I1007 13:54:43.029410 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65cd69c4d9-w5szp" event={"ID":"107eee8d-eb8a-4e06-973c-92521472ce75","Type":"ContainerDied","Data":"619c69f8dd1682d3820a73b821bd9820740d2977dc4c4c8907b03ded28b4f1ba"} Oct 07 13:54:43 crc kubenswrapper[4854]: I1007 13:54:43.029436 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65cd69c4d9-w5szp" event={"ID":"107eee8d-eb8a-4e06-973c-92521472ce75","Type":"ContainerStarted","Data":"f8e19b59e326fbbc2f5cc236b8df9292738b907d8633d1067ac594893545c29c"} Oct 07 13:54:43 crc kubenswrapper[4854]: I1007 13:54:43.033306 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6d49dfff85-kb4jj" event={"ID":"8fe6746e-d7d8-41be-bb1a-63f0aa67044a","Type":"ContainerStarted","Data":"cccb72a8123d76d6cf93a67b394aeb18e5b2b0ff50c1fe1c7059229bf9a76bb8"} Oct 07 13:54:43 crc kubenswrapper[4854]: I1007 13:54:43.033359 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6d49dfff85-kb4jj" event={"ID":"8fe6746e-d7d8-41be-bb1a-63f0aa67044a","Type":"ContainerStarted","Data":"9ee3542d7d5e58c80877ef400d1d9c8215508ae98a5a0915923d2b053f01dfdb"} Oct 07 13:54:43 crc kubenswrapper[4854]: I1007 13:54:43.033372 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6d49dfff85-kb4jj" event={"ID":"8fe6746e-d7d8-41be-bb1a-63f0aa67044a","Type":"ContainerStarted","Data":"ad0b04d8e65a2f290d49a1c8c9b886ee06b213ca9fb22d28ec33f5695deb4a62"} Oct 07 13:54:43 crc kubenswrapper[4854]: I1007 13:54:43.033541 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6d49dfff85-kb4jj" Oct 07 13:54:43 crc kubenswrapper[4854]: I1007 13:54:43.094631 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6d49dfff85-kb4jj" podStartSLOduration=2.094609872 podStartE2EDuration="2.094609872s" podCreationTimestamp="2025-10-07 13:54:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:54:43.090554136 +0000 UTC m=+5399.078386391" watchObservedRunningTime="2025-10-07 13:54:43.094609872 +0000 UTC m=+5399.082442137" Oct 07 13:54:44 crc kubenswrapper[4854]: I1007 13:54:44.055730 4854 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65cd69c4d9-w5szp" event={"ID":"107eee8d-eb8a-4e06-973c-92521472ce75","Type":"ContainerStarted","Data":"b6a4459a1bcbe5a5ff513fe9ee90698e65a2ab75c621c24c3c7f733da0329897"} Oct 07 13:54:44 crc kubenswrapper[4854]: I1007 13:54:44.088276 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-65cd69c4d9-w5szp" podStartSLOduration=3.08825128 podStartE2EDuration="3.08825128s" podCreationTimestamp="2025-10-07 13:54:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:54:44.087331864 +0000 UTC m=+5400.075164139" watchObservedRunningTime="2025-10-07 13:54:44.08825128 +0000 UTC m=+5400.076083535" Oct 07 13:54:45 crc kubenswrapper[4854]: I1007 13:54:45.063826 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-65cd69c4d9-w5szp" Oct 07 13:54:49 crc kubenswrapper[4854]: I1007 13:54:49.703475 4854 scope.go:117] "RemoveContainer" containerID="8f3095d8e8fcb8dde419577c13f52762f3d4a6c040f7266414571999ecd4046e" Oct 07 13:54:49 crc kubenswrapper[4854]: E1007 13:54:49.704657 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:54:51 crc kubenswrapper[4854]: I1007 13:54:51.633412 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-65cd69c4d9-w5szp" Oct 07 13:54:51 crc kubenswrapper[4854]: I1007 13:54:51.735911 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c4cbb9589-4zg2l"] Oct 07 13:54:51 crc kubenswrapper[4854]: I1007 13:54:51.736424 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c4cbb9589-4zg2l" podUID="236c6cf4-03fe-4c4a-b1d4-7a68519c509b" containerName="dnsmasq-dns" containerID="cri-o://15f2a324b3ce1b1b7bef3b4ffdf9cc6118ebe7432415cd9ff7238cdb732a107e" gracePeriod=10 Oct 07 13:54:51 crc kubenswrapper[4854]: E1007 13:54:51.999426 4854 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod236c6cf4_03fe_4c4a_b1d4_7a68519c509b.slice/crio-conmon-15f2a324b3ce1b1b7bef3b4ffdf9cc6118ebe7432415cd9ff7238cdb732a107e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod236c6cf4_03fe_4c4a_b1d4_7a68519c509b.slice/crio-15f2a324b3ce1b1b7bef3b4ffdf9cc6118ebe7432415cd9ff7238cdb732a107e.scope\": RecentStats: unable to find data in memory cache]" Oct 07 13:54:52 crc kubenswrapper[4854]: I1007 13:54:52.095014 4854 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5c4cbb9589-4zg2l" podUID="236c6cf4-03fe-4c4a-b1d4-7a68519c509b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.1.32:5353: connect: connection refused" Oct 07 13:54:52 crc kubenswrapper[4854]: I1007 13:54:52.133323 4854 generic.go:334] "Generic (PLEG): container finished" podID="236c6cf4-03fe-4c4a-b1d4-7a68519c509b" 
containerID="15f2a324b3ce1b1b7bef3b4ffdf9cc6118ebe7432415cd9ff7238cdb732a107e" exitCode=0 Oct 07 13:54:52 crc kubenswrapper[4854]: I1007 13:54:52.133371 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c4cbb9589-4zg2l" event={"ID":"236c6cf4-03fe-4c4a-b1d4-7a68519c509b","Type":"ContainerDied","Data":"15f2a324b3ce1b1b7bef3b4ffdf9cc6118ebe7432415cd9ff7238cdb732a107e"} Oct 07 13:54:52 crc kubenswrapper[4854]: I1007 13:54:52.372871 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c4cbb9589-4zg2l" Oct 07 13:54:52 crc kubenswrapper[4854]: I1007 13:54:52.447399 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sz4sj\" (UniqueName: \"kubernetes.io/projected/236c6cf4-03fe-4c4a-b1d4-7a68519c509b-kube-api-access-sz4sj\") pod \"236c6cf4-03fe-4c4a-b1d4-7a68519c509b\" (UID: \"236c6cf4-03fe-4c4a-b1d4-7a68519c509b\") " Oct 07 13:54:52 crc kubenswrapper[4854]: I1007 13:54:52.447472 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/236c6cf4-03fe-4c4a-b1d4-7a68519c509b-dns-svc\") pod \"236c6cf4-03fe-4c4a-b1d4-7a68519c509b\" (UID: \"236c6cf4-03fe-4c4a-b1d4-7a68519c509b\") " Oct 07 13:54:52 crc kubenswrapper[4854]: I1007 13:54:52.447536 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/236c6cf4-03fe-4c4a-b1d4-7a68519c509b-ovsdbserver-sb\") pod \"236c6cf4-03fe-4c4a-b1d4-7a68519c509b\" (UID: \"236c6cf4-03fe-4c4a-b1d4-7a68519c509b\") " Oct 07 13:54:52 crc kubenswrapper[4854]: I1007 13:54:52.447559 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/236c6cf4-03fe-4c4a-b1d4-7a68519c509b-ovsdbserver-nb\") pod \"236c6cf4-03fe-4c4a-b1d4-7a68519c509b\" (UID: \"236c6cf4-03fe-4c4a-b1d4-7a68519c509b\") " Oct 07 13:54:52 crc kubenswrapper[4854]: I1007 13:54:52.447605 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/236c6cf4-03fe-4c4a-b1d4-7a68519c509b-config\") pod \"236c6cf4-03fe-4c4a-b1d4-7a68519c509b\" (UID: \"236c6cf4-03fe-4c4a-b1d4-7a68519c509b\") " Oct 07 13:54:52 crc kubenswrapper[4854]: I1007 13:54:52.473002 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/236c6cf4-03fe-4c4a-b1d4-7a68519c509b-kube-api-access-sz4sj" (OuterVolumeSpecName: "kube-api-access-sz4sj") pod "236c6cf4-03fe-4c4a-b1d4-7a68519c509b" (UID: "236c6cf4-03fe-4c4a-b1d4-7a68519c509b"). InnerVolumeSpecName "kube-api-access-sz4sj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:54:52 crc kubenswrapper[4854]: I1007 13:54:52.509223 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/236c6cf4-03fe-4c4a-b1d4-7a68519c509b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "236c6cf4-03fe-4c4a-b1d4-7a68519c509b" (UID: "236c6cf4-03fe-4c4a-b1d4-7a68519c509b"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:54:52 crc kubenswrapper[4854]: I1007 13:54:52.509744 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/236c6cf4-03fe-4c4a-b1d4-7a68519c509b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "236c6cf4-03fe-4c4a-b1d4-7a68519c509b" (UID: "236c6cf4-03fe-4c4a-b1d4-7a68519c509b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:54:52 crc kubenswrapper[4854]: I1007 13:54:52.515705 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/236c6cf4-03fe-4c4a-b1d4-7a68519c509b-config" (OuterVolumeSpecName: "config") pod "236c6cf4-03fe-4c4a-b1d4-7a68519c509b" (UID: "236c6cf4-03fe-4c4a-b1d4-7a68519c509b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:54:52 crc kubenswrapper[4854]: I1007 13:54:52.549412 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sz4sj\" (UniqueName: \"kubernetes.io/projected/236c6cf4-03fe-4c4a-b1d4-7a68519c509b-kube-api-access-sz4sj\") on node \"crc\" DevicePath \"\"" Oct 07 13:54:52 crc kubenswrapper[4854]: I1007 13:54:52.549450 4854 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/236c6cf4-03fe-4c4a-b1d4-7a68519c509b-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 13:54:52 crc kubenswrapper[4854]: I1007 13:54:52.549460 4854 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/236c6cf4-03fe-4c4a-b1d4-7a68519c509b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 13:54:52 crc kubenswrapper[4854]: I1007 13:54:52.549468 4854 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/236c6cf4-03fe-4c4a-b1d4-7a68519c509b-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:54:52 crc kubenswrapper[4854]: I1007 13:54:52.550179 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/236c6cf4-03fe-4c4a-b1d4-7a68519c509b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "236c6cf4-03fe-4c4a-b1d4-7a68519c509b" (UID: "236c6cf4-03fe-4c4a-b1d4-7a68519c509b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:54:52 crc kubenswrapper[4854]: I1007 13:54:52.650991 4854 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/236c6cf4-03fe-4c4a-b1d4-7a68519c509b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 13:54:53 crc kubenswrapper[4854]: I1007 13:54:53.147503 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c4cbb9589-4zg2l" event={"ID":"236c6cf4-03fe-4c4a-b1d4-7a68519c509b","Type":"ContainerDied","Data":"bbf5bf8968e1a5744d146313b1b5f0a741913184b2e3e25bacbc654b149efb2d"} Oct 07 13:54:53 crc kubenswrapper[4854]: I1007 13:54:53.147568 4854 scope.go:117] "RemoveContainer" containerID="15f2a324b3ce1b1b7bef3b4ffdf9cc6118ebe7432415cd9ff7238cdb732a107e" Oct 07 13:54:53 crc kubenswrapper[4854]: I1007 13:54:53.147634 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c4cbb9589-4zg2l" Oct 07 13:54:53 crc kubenswrapper[4854]: I1007 13:54:53.179614 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c4cbb9589-4zg2l"] Oct 07 13:54:53 crc kubenswrapper[4854]: I1007 13:54:53.185481 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c4cbb9589-4zg2l"] Oct 07 13:54:53 crc kubenswrapper[4854]: I1007 13:54:53.189419 4854 scope.go:117] "RemoveContainer" containerID="cf7591d792c13b3eb0d9267fd27664542d4c41c3491a7b153541d5dee1b1ce35" Oct 07 13:54:54 crc kubenswrapper[4854]: I1007 13:54:54.721066 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="236c6cf4-03fe-4c4a-b1d4-7a68519c509b" path="/var/lib/kubelet/pods/236c6cf4-03fe-4c4a-b1d4-7a68519c509b/volumes" Oct 07 13:55:02 crc kubenswrapper[4854]: I1007 13:55:02.704136 4854 scope.go:117] "RemoveContainer" containerID="8f3095d8e8fcb8dde419577c13f52762f3d4a6c040f7266414571999ecd4046e" Oct 07 13:55:02 crc kubenswrapper[4854]: E1007 13:55:02.705401 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:55:11 crc kubenswrapper[4854]: I1007 13:55:11.759679 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6d49dfff85-kb4jj" Oct 07 13:55:15 crc kubenswrapper[4854]: I1007 13:55:15.703522 4854 scope.go:117] "RemoveContainer" containerID="8f3095d8e8fcb8dde419577c13f52762f3d4a6c040f7266414571999ecd4046e" Oct 07 13:55:15 crc kubenswrapper[4854]: E1007 13:55:15.704393 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:55:19 crc kubenswrapper[4854]: I1007 13:55:19.907421 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-qjpwd"] Oct 07 13:55:19 crc kubenswrapper[4854]: E1007 13:55:19.908168 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="236c6cf4-03fe-4c4a-b1d4-7a68519c509b" containerName="init" Oct 07 13:55:19 crc kubenswrapper[4854]: I1007 13:55:19.908183 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="236c6cf4-03fe-4c4a-b1d4-7a68519c509b" containerName="init" Oct 07 13:55:19 crc kubenswrapper[4854]: E1007 13:55:19.908229 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="236c6cf4-03fe-4c4a-b1d4-7a68519c509b" containerName="dnsmasq-dns" Oct 07 13:55:19 crc kubenswrapper[4854]: I1007 13:55:19.908239 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="236c6cf4-03fe-4c4a-b1d4-7a68519c509b" containerName="dnsmasq-dns" Oct 07 13:55:19 crc kubenswrapper[4854]: I1007 13:55:19.908428 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="236c6cf4-03fe-4c4a-b1d4-7a68519c509b" containerName="dnsmasq-dns" Oct 07 13:55:19 crc kubenswrapper[4854]: I1007 13:55:19.909268 4854 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-qjpwd" Oct 07 13:55:19 crc kubenswrapper[4854]: I1007 13:55:19.915670 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-qjpwd"] Oct 07 13:55:20 crc kubenswrapper[4854]: I1007 13:55:20.052391 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dldl8\" (UniqueName: \"kubernetes.io/projected/d0dc3885-3e3b-40c9-a58c-d255e2f321f6-kube-api-access-dldl8\") pod \"glance-db-create-qjpwd\" (UID: \"d0dc3885-3e3b-40c9-a58c-d255e2f321f6\") " pod="openstack/glance-db-create-qjpwd" Oct 07 13:55:20 crc kubenswrapper[4854]: I1007 13:55:20.154176 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dldl8\" (UniqueName: \"kubernetes.io/projected/d0dc3885-3e3b-40c9-a58c-d255e2f321f6-kube-api-access-dldl8\") pod \"glance-db-create-qjpwd\" (UID: \"d0dc3885-3e3b-40c9-a58c-d255e2f321f6\") " pod="openstack/glance-db-create-qjpwd" Oct 07 13:55:20 crc kubenswrapper[4854]: I1007 13:55:20.182325 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dldl8\" (UniqueName: \"kubernetes.io/projected/d0dc3885-3e3b-40c9-a58c-d255e2f321f6-kube-api-access-dldl8\") pod \"glance-db-create-qjpwd\" (UID: \"d0dc3885-3e3b-40c9-a58c-d255e2f321f6\") " pod="openstack/glance-db-create-qjpwd" Oct 07 13:55:20 crc kubenswrapper[4854]: I1007 13:55:20.241963 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-qjpwd" Oct 07 13:55:20 crc kubenswrapper[4854]: I1007 13:55:20.744525 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-qjpwd"] Oct 07 13:55:21 crc kubenswrapper[4854]: I1007 13:55:21.456624 4854 generic.go:334] "Generic (PLEG): container finished" podID="d0dc3885-3e3b-40c9-a58c-d255e2f321f6" containerID="9417fa9bccf9488ffc05b9934733eeb027211979eb3310c30edda1b45ad53f81" exitCode=0 Oct 07 13:55:21 crc kubenswrapper[4854]: I1007 13:55:21.456811 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-qjpwd" event={"ID":"d0dc3885-3e3b-40c9-a58c-d255e2f321f6","Type":"ContainerDied","Data":"9417fa9bccf9488ffc05b9934733eeb027211979eb3310c30edda1b45ad53f81"} Oct 07 13:55:21 crc kubenswrapper[4854]: I1007 13:55:21.458045 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-qjpwd" event={"ID":"d0dc3885-3e3b-40c9-a58c-d255e2f321f6","Type":"ContainerStarted","Data":"d8683953128d2d2d0626f7ab85a90a3f00325854f71823fd761498c03057554a"} Oct 07 13:55:22 crc kubenswrapper[4854]: I1007 13:55:22.861540 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-qjpwd" Oct 07 13:55:23 crc kubenswrapper[4854]: I1007 13:55:23.012977 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dldl8\" (UniqueName: \"kubernetes.io/projected/d0dc3885-3e3b-40c9-a58c-d255e2f321f6-kube-api-access-dldl8\") pod \"d0dc3885-3e3b-40c9-a58c-d255e2f321f6\" (UID: \"d0dc3885-3e3b-40c9-a58c-d255e2f321f6\") " Oct 07 13:55:23 crc kubenswrapper[4854]: I1007 13:55:23.020343 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0dc3885-3e3b-40c9-a58c-d255e2f321f6-kube-api-access-dldl8" (OuterVolumeSpecName: "kube-api-access-dldl8") pod "d0dc3885-3e3b-40c9-a58c-d255e2f321f6" (UID: "d0dc3885-3e3b-40c9-a58c-d255e2f321f6"). InnerVolumeSpecName "kube-api-access-dldl8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:55:23 crc kubenswrapper[4854]: I1007 13:55:23.114906 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dldl8\" (UniqueName: \"kubernetes.io/projected/d0dc3885-3e3b-40c9-a58c-d255e2f321f6-kube-api-access-dldl8\") on node \"crc\" DevicePath \"\"" Oct 07 13:55:23 crc kubenswrapper[4854]: I1007 13:55:23.484065 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-qjpwd" event={"ID":"d0dc3885-3e3b-40c9-a58c-d255e2f321f6","Type":"ContainerDied","Data":"d8683953128d2d2d0626f7ab85a90a3f00325854f71823fd761498c03057554a"} Oct 07 13:55:23 crc kubenswrapper[4854]: I1007 13:55:23.484403 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8683953128d2d2d0626f7ab85a90a3f00325854f71823fd761498c03057554a" Oct 07 13:55:23 crc kubenswrapper[4854]: I1007 13:55:23.484167 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-qjpwd" Oct 07 13:55:28 crc kubenswrapper[4854]: I1007 13:55:28.704391 4854 scope.go:117] "RemoveContainer" containerID="8f3095d8e8fcb8dde419577c13f52762f3d4a6c040f7266414571999ecd4046e" Oct 07 13:55:28 crc kubenswrapper[4854]: E1007 13:55:28.705490 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:55:30 crc kubenswrapper[4854]: I1007 13:55:30.038120 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-5199-account-create-tqt8m"] Oct 07 13:55:30 crc kubenswrapper[4854]: E1007 13:55:30.038817 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0dc3885-3e3b-40c9-a58c-d255e2f321f6" containerName="mariadb-database-create" Oct 07 13:55:30 crc kubenswrapper[4854]: I1007 13:55:30.038832 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0dc3885-3e3b-40c9-a58c-d255e2f321f6" containerName="mariadb-database-create" Oct 07 13:55:30 crc kubenswrapper[4854]: I1007 13:55:30.039024 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0dc3885-3e3b-40c9-a58c-d255e2f321f6" containerName="mariadb-database-create" Oct 07 13:55:30 crc kubenswrapper[4854]: I1007 13:55:30.039605 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-5199-account-create-tqt8m" Oct 07 13:55:30 crc kubenswrapper[4854]: I1007 13:55:30.041481 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Oct 07 13:55:30 crc kubenswrapper[4854]: I1007 13:55:30.043763 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqcrz\" (UniqueName: \"kubernetes.io/projected/4699aaa5-6e90-4333-bb68-b9294ff720d0-kube-api-access-jqcrz\") pod \"glance-5199-account-create-tqt8m\" (UID: \"4699aaa5-6e90-4333-bb68-b9294ff720d0\") " pod="openstack/glance-5199-account-create-tqt8m" Oct 07 13:55:30 crc kubenswrapper[4854]: I1007 13:55:30.047527 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-5199-account-create-tqt8m"] Oct 07 13:55:30 crc kubenswrapper[4854]: I1007 13:55:30.144851 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqcrz\" (UniqueName: \"kubernetes.io/projected/4699aaa5-6e90-4333-bb68-b9294ff720d0-kube-api-access-jqcrz\") pod \"glance-5199-account-create-tqt8m\" (UID: \"4699aaa5-6e90-4333-bb68-b9294ff720d0\") " pod="openstack/glance-5199-account-create-tqt8m" Oct 07 13:55:30 crc kubenswrapper[4854]: I1007 13:55:30.166027 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqcrz\" (UniqueName: \"kubernetes.io/projected/4699aaa5-6e90-4333-bb68-b9294ff720d0-kube-api-access-jqcrz\") pod \"glance-5199-account-create-tqt8m\" (UID: \"4699aaa5-6e90-4333-bb68-b9294ff720d0\") " pod="openstack/glance-5199-account-create-tqt8m" Oct 07 13:55:30 crc kubenswrapper[4854]: I1007 13:55:30.369701 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-5199-account-create-tqt8m" Oct 07 13:55:30 crc kubenswrapper[4854]: I1007 13:55:30.845243 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-5199-account-create-tqt8m"] Oct 07 13:55:30 crc kubenswrapper[4854]: W1007 13:55:30.858633 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4699aaa5_6e90_4333_bb68_b9294ff720d0.slice/crio-00494857472c31da7777828102d6f7c4058919c0c3845984d192a6cb73dbc531 WatchSource:0}: Error finding container 00494857472c31da7777828102d6f7c4058919c0c3845984d192a6cb73dbc531: Status 404 returned error can't find the container with id 00494857472c31da7777828102d6f7c4058919c0c3845984d192a6cb73dbc531 Oct 07 13:55:31 crc kubenswrapper[4854]: I1007 13:55:31.549350 4854 generic.go:334] "Generic (PLEG): container finished" podID="4699aaa5-6e90-4333-bb68-b9294ff720d0" containerID="b895906b4b9e14129973bb658195cdd113392aed549dac8ab95bdd59464938db" exitCode=0 Oct 07 13:55:31 crc kubenswrapper[4854]: I1007 13:55:31.549409 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-5199-account-create-tqt8m" event={"ID":"4699aaa5-6e90-4333-bb68-b9294ff720d0","Type":"ContainerDied","Data":"b895906b4b9e14129973bb658195cdd113392aed549dac8ab95bdd59464938db"} Oct 07 13:55:31 crc kubenswrapper[4854]: I1007 13:55:31.549683 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-5199-account-create-tqt8m" event={"ID":"4699aaa5-6e90-4333-bb68-b9294ff720d0","Type":"ContainerStarted","Data":"00494857472c31da7777828102d6f7c4058919c0c3845984d192a6cb73dbc531"} Oct 07 13:55:32 crc kubenswrapper[4854]: I1007 13:55:32.933282 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-5199-account-create-tqt8m" Oct 07 13:55:33 crc kubenswrapper[4854]: I1007 13:55:33.096037 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqcrz\" (UniqueName: \"kubernetes.io/projected/4699aaa5-6e90-4333-bb68-b9294ff720d0-kube-api-access-jqcrz\") pod \"4699aaa5-6e90-4333-bb68-b9294ff720d0\" (UID: \"4699aaa5-6e90-4333-bb68-b9294ff720d0\") " Oct 07 13:55:33 crc kubenswrapper[4854]: I1007 13:55:33.111417 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4699aaa5-6e90-4333-bb68-b9294ff720d0-kube-api-access-jqcrz" (OuterVolumeSpecName: "kube-api-access-jqcrz") pod "4699aaa5-6e90-4333-bb68-b9294ff720d0" (UID: "4699aaa5-6e90-4333-bb68-b9294ff720d0"). InnerVolumeSpecName "kube-api-access-jqcrz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:55:33 crc kubenswrapper[4854]: I1007 13:55:33.198342 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqcrz\" (UniqueName: \"kubernetes.io/projected/4699aaa5-6e90-4333-bb68-b9294ff720d0-kube-api-access-jqcrz\") on node \"crc\" DevicePath \"\"" Oct 07 13:55:33 crc kubenswrapper[4854]: I1007 13:55:33.572875 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-5199-account-create-tqt8m" event={"ID":"4699aaa5-6e90-4333-bb68-b9294ff720d0","Type":"ContainerDied","Data":"00494857472c31da7777828102d6f7c4058919c0c3845984d192a6cb73dbc531"} Oct 07 13:55:33 crc kubenswrapper[4854]: I1007 13:55:33.572932 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-5199-account-create-tqt8m" Oct 07 13:55:33 crc kubenswrapper[4854]: I1007 13:55:33.572944 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00494857472c31da7777828102d6f7c4058919c0c3845984d192a6cb73dbc531" Oct 07 13:55:35 crc kubenswrapper[4854]: I1007 13:55:35.112990 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-gmm8k"] Oct 07 13:55:35 crc kubenswrapper[4854]: E1007 13:55:35.113886 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4699aaa5-6e90-4333-bb68-b9294ff720d0" containerName="mariadb-account-create" Oct 07 13:55:35 crc kubenswrapper[4854]: I1007 13:55:35.113911 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="4699aaa5-6e90-4333-bb68-b9294ff720d0" containerName="mariadb-account-create" Oct 07 13:55:35 crc kubenswrapper[4854]: I1007 13:55:35.114223 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="4699aaa5-6e90-4333-bb68-b9294ff720d0" containerName="mariadb-account-create" Oct 07 13:55:35 crc kubenswrapper[4854]: I1007 13:55:35.115063 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-gmm8k" Oct 07 13:55:35 crc kubenswrapper[4854]: I1007 13:55:35.117753 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Oct 07 13:55:35 crc kubenswrapper[4854]: I1007 13:55:35.126464 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-b2xjl" Oct 07 13:55:35 crc kubenswrapper[4854]: I1007 13:55:35.128411 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-gmm8k"] Oct 07 13:55:35 crc kubenswrapper[4854]: I1007 13:55:35.236085 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7vc4\" (UniqueName: \"kubernetes.io/projected/a12bba49-d1b4-469f-96fa-c74a02c4f509-kube-api-access-m7vc4\") pod \"glance-db-sync-gmm8k\" (UID: \"a12bba49-d1b4-469f-96fa-c74a02c4f509\") " pod="openstack/glance-db-sync-gmm8k" Oct 07 13:55:35 crc kubenswrapper[4854]: I1007 13:55:35.236443 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a12bba49-d1b4-469f-96fa-c74a02c4f509-config-data\") pod \"glance-db-sync-gmm8k\" (UID: \"a12bba49-d1b4-469f-96fa-c74a02c4f509\") " pod="openstack/glance-db-sync-gmm8k" Oct 07 13:55:35 crc kubenswrapper[4854]: I1007 13:55:35.236566 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a12bba49-d1b4-469f-96fa-c74a02c4f509-combined-ca-bundle\") pod \"glance-db-sync-gmm8k\" (UID: \"a12bba49-d1b4-469f-96fa-c74a02c4f509\") " pod="openstack/glance-db-sync-gmm8k" Oct 07 13:55:35 crc kubenswrapper[4854]: I1007 13:55:35.236653 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a12bba49-d1b4-469f-96fa-c74a02c4f509-db-sync-config-data\") pod \"glance-db-sync-gmm8k\" (UID: \"a12bba49-d1b4-469f-96fa-c74a02c4f509\") " pod="openstack/glance-db-sync-gmm8k" Oct 07 13:55:35 crc kubenswrapper[4854]: I1007 13:55:35.338808 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7vc4\" (UniqueName: \"kubernetes.io/projected/a12bba49-d1b4-469f-96fa-c74a02c4f509-kube-api-access-m7vc4\") pod \"glance-db-sync-gmm8k\" (UID: \"a12bba49-d1b4-469f-96fa-c74a02c4f509\") " pod="openstack/glance-db-sync-gmm8k" Oct 07 13:55:35 crc kubenswrapper[4854]: I1007 13:55:35.338932 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a12bba49-d1b4-469f-96fa-c74a02c4f509-config-data\") pod \"glance-db-sync-gmm8k\" (UID: \"a12bba49-d1b4-469f-96fa-c74a02c4f509\") " pod="openstack/glance-db-sync-gmm8k" Oct 07 13:55:35 crc kubenswrapper[4854]: I1007 13:55:35.338972 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a12bba49-d1b4-469f-96fa-c74a02c4f509-combined-ca-bundle\") pod \"glance-db-sync-gmm8k\" (UID: \"a12bba49-d1b4-469f-96fa-c74a02c4f509\") " pod="openstack/glance-db-sync-gmm8k" Oct 07 13:55:35 crc kubenswrapper[4854]: I1007 13:55:35.338997 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a12bba49-d1b4-469f-96fa-c74a02c4f509-db-sync-config-data\") pod 
\"glance-db-sync-gmm8k\" (UID: \"a12bba49-d1b4-469f-96fa-c74a02c4f509\") " pod="openstack/glance-db-sync-gmm8k" Oct 07 13:55:35 crc kubenswrapper[4854]: I1007 13:55:35.358522 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a12bba49-d1b4-469f-96fa-c74a02c4f509-config-data\") pod \"glance-db-sync-gmm8k\" (UID: \"a12bba49-d1b4-469f-96fa-c74a02c4f509\") " pod="openstack/glance-db-sync-gmm8k" Oct 07 13:55:35 crc kubenswrapper[4854]: I1007 13:55:35.358871 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a12bba49-d1b4-469f-96fa-c74a02c4f509-db-sync-config-data\") pod \"glance-db-sync-gmm8k\" (UID: \"a12bba49-d1b4-469f-96fa-c74a02c4f509\") " pod="openstack/glance-db-sync-gmm8k" Oct 07 13:55:35 crc kubenswrapper[4854]: I1007 13:55:35.360051 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a12bba49-d1b4-469f-96fa-c74a02c4f509-combined-ca-bundle\") pod \"glance-db-sync-gmm8k\" (UID: \"a12bba49-d1b4-469f-96fa-c74a02c4f509\") " pod="openstack/glance-db-sync-gmm8k" Oct 07 13:55:35 crc kubenswrapper[4854]: I1007 13:55:35.370566 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7vc4\" (UniqueName: \"kubernetes.io/projected/a12bba49-d1b4-469f-96fa-c74a02c4f509-kube-api-access-m7vc4\") pod \"glance-db-sync-gmm8k\" (UID: \"a12bba49-d1b4-469f-96fa-c74a02c4f509\") " pod="openstack/glance-db-sync-gmm8k" Oct 07 13:55:35 crc kubenswrapper[4854]: I1007 13:55:35.433601 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-gmm8k" Oct 07 13:55:36 crc kubenswrapper[4854]: I1007 13:55:36.072266 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-gmm8k"] Oct 07 13:55:36 crc kubenswrapper[4854]: I1007 13:55:36.652209 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-gmm8k" event={"ID":"a12bba49-d1b4-469f-96fa-c74a02c4f509","Type":"ContainerStarted","Data":"3aacd9218c357a9a595eb9b8f50e2ccdf5c28c8a92851849b885bdc36a1d7cf7"} Oct 07 13:55:36 crc kubenswrapper[4854]: I1007 13:55:36.652676 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-gmm8k" event={"ID":"a12bba49-d1b4-469f-96fa-c74a02c4f509","Type":"ContainerStarted","Data":"91560d656c1d6a5c03e671ae4d7ded0cfb9eeda27ba2b40a97b35da2f20b5b7a"} Oct 07 13:55:36 crc kubenswrapper[4854]: I1007 13:55:36.676587 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-gmm8k" podStartSLOduration=1.6765635250000002 podStartE2EDuration="1.676563525s" podCreationTimestamp="2025-10-07 13:55:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:55:36.672379465 +0000 UTC m=+5452.660211730" watchObservedRunningTime="2025-10-07 13:55:36.676563525 +0000 UTC m=+5452.664395790" Oct 07 13:55:40 crc kubenswrapper[4854]: I1007 13:55:40.695983 4854 generic.go:334] "Generic (PLEG): container finished" podID="a12bba49-d1b4-469f-96fa-c74a02c4f509" containerID="3aacd9218c357a9a595eb9b8f50e2ccdf5c28c8a92851849b885bdc36a1d7cf7" exitCode=0 Oct 07 13:55:40 crc kubenswrapper[4854]: I1007 13:55:40.696076 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-gmm8k" 
event={"ID":"a12bba49-d1b4-469f-96fa-c74a02c4f509","Type":"ContainerDied","Data":"3aacd9218c357a9a595eb9b8f50e2ccdf5c28c8a92851849b885bdc36a1d7cf7"} Oct 07 13:55:42 crc kubenswrapper[4854]: I1007 13:55:42.207439 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-gmm8k" Oct 07 13:55:42 crc kubenswrapper[4854]: I1007 13:55:42.223287 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a12bba49-d1b4-469f-96fa-c74a02c4f509-combined-ca-bundle\") pod \"a12bba49-d1b4-469f-96fa-c74a02c4f509\" (UID: \"a12bba49-d1b4-469f-96fa-c74a02c4f509\") " Oct 07 13:55:42 crc kubenswrapper[4854]: I1007 13:55:42.223369 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7vc4\" (UniqueName: \"kubernetes.io/projected/a12bba49-d1b4-469f-96fa-c74a02c4f509-kube-api-access-m7vc4\") pod \"a12bba49-d1b4-469f-96fa-c74a02c4f509\" (UID: \"a12bba49-d1b4-469f-96fa-c74a02c4f509\") " Oct 07 13:55:42 crc kubenswrapper[4854]: I1007 13:55:42.223466 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a12bba49-d1b4-469f-96fa-c74a02c4f509-config-data\") pod \"a12bba49-d1b4-469f-96fa-c74a02c4f509\" (UID: \"a12bba49-d1b4-469f-96fa-c74a02c4f509\") " Oct 07 13:55:42 crc kubenswrapper[4854]: I1007 13:55:42.223493 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a12bba49-d1b4-469f-96fa-c74a02c4f509-db-sync-config-data\") pod \"a12bba49-d1b4-469f-96fa-c74a02c4f509\" (UID: \"a12bba49-d1b4-469f-96fa-c74a02c4f509\") " Oct 07 13:55:42 crc kubenswrapper[4854]: I1007 13:55:42.234460 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a12bba49-d1b4-469f-96fa-c74a02c4f509-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "a12bba49-d1b4-469f-96fa-c74a02c4f509" (UID: "a12bba49-d1b4-469f-96fa-c74a02c4f509"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:55:42 crc kubenswrapper[4854]: I1007 13:55:42.235718 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a12bba49-d1b4-469f-96fa-c74a02c4f509-kube-api-access-m7vc4" (OuterVolumeSpecName: "kube-api-access-m7vc4") pod "a12bba49-d1b4-469f-96fa-c74a02c4f509" (UID: "a12bba49-d1b4-469f-96fa-c74a02c4f509"). InnerVolumeSpecName "kube-api-access-m7vc4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:55:42 crc kubenswrapper[4854]: I1007 13:55:42.266296 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a12bba49-d1b4-469f-96fa-c74a02c4f509-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a12bba49-d1b4-469f-96fa-c74a02c4f509" (UID: "a12bba49-d1b4-469f-96fa-c74a02c4f509"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:55:42 crc kubenswrapper[4854]: I1007 13:55:42.293178 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a12bba49-d1b4-469f-96fa-c74a02c4f509-config-data" (OuterVolumeSpecName: "config-data") pod "a12bba49-d1b4-469f-96fa-c74a02c4f509" (UID: "a12bba49-d1b4-469f-96fa-c74a02c4f509"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:55:42 crc kubenswrapper[4854]: I1007 13:55:42.324851 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a12bba49-d1b4-469f-96fa-c74a02c4f509-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:55:42 crc kubenswrapper[4854]: I1007 13:55:42.324897 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7vc4\" (UniqueName: \"kubernetes.io/projected/a12bba49-d1b4-469f-96fa-c74a02c4f509-kube-api-access-m7vc4\") on node \"crc\" DevicePath \"\"" Oct 07 13:55:42 crc kubenswrapper[4854]: I1007 13:55:42.324914 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a12bba49-d1b4-469f-96fa-c74a02c4f509-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:55:42 crc kubenswrapper[4854]: I1007 13:55:42.324930 4854 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a12bba49-d1b4-469f-96fa-c74a02c4f509-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:55:42 crc kubenswrapper[4854]: I1007 13:55:42.702909 4854 scope.go:117] "RemoveContainer" containerID="8f3095d8e8fcb8dde419577c13f52762f3d4a6c040f7266414571999ecd4046e" Oct 07 13:55:42 crc kubenswrapper[4854]: E1007 13:55:42.703654 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:55:42 crc kubenswrapper[4854]: I1007 13:55:42.721308 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-gmm8k" event={"ID":"a12bba49-d1b4-469f-96fa-c74a02c4f509","Type":"ContainerDied","Data":"91560d656c1d6a5c03e671ae4d7ded0cfb9eeda27ba2b40a97b35da2f20b5b7a"} Oct 07 13:55:42 crc kubenswrapper[4854]: I1007 13:55:42.721377 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91560d656c1d6a5c03e671ae4d7ded0cfb9eeda27ba2b40a97b35da2f20b5b7a" Oct 07 13:55:42 crc kubenswrapper[4854]: I1007 13:55:42.721415 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-gmm8k" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.121177 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-574bb48d57-zvm4r"] Oct 07 13:55:43 crc kubenswrapper[4854]: E1007 13:55:43.121614 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a12bba49-d1b4-469f-96fa-c74a02c4f509" containerName="glance-db-sync" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.121631 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="a12bba49-d1b4-469f-96fa-c74a02c4f509" containerName="glance-db-sync" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.121796 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="a12bba49-d1b4-469f-96fa-c74a02c4f509" containerName="glance-db-sync" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.122752 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-574bb48d57-zvm4r" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.138850 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.140675 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.141435 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9aa2f194-429f-467c-8802-273b1ee8b633-config\") pod \"dnsmasq-dns-574bb48d57-zvm4r\" (UID: \"9aa2f194-429f-467c-8802-273b1ee8b633\") " pod="openstack/dnsmasq-dns-574bb48d57-zvm4r" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.141488 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9aa2f194-429f-467c-8802-273b1ee8b633-ovsdbserver-sb\") pod \"dnsmasq-dns-574bb48d57-zvm4r\" (UID: \"9aa2f194-429f-467c-8802-273b1ee8b633\") " pod="openstack/dnsmasq-dns-574bb48d57-zvm4r" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.141538 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9aa2f194-429f-467c-8802-273b1ee8b633-ovsdbserver-nb\") pod \"dnsmasq-dns-574bb48d57-zvm4r\" (UID: \"9aa2f194-429f-467c-8802-273b1ee8b633\") " pod="openstack/dnsmasq-dns-574bb48d57-zvm4r" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.141567 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6r8w\" (UniqueName: \"kubernetes.io/projected/9aa2f194-429f-467c-8802-273b1ee8b633-kube-api-access-z6r8w\") pod \"dnsmasq-dns-574bb48d57-zvm4r\" (UID: \"9aa2f194-429f-467c-8802-273b1ee8b633\") " pod="openstack/dnsmasq-dns-574bb48d57-zvm4r" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.141611 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9aa2f194-429f-467c-8802-273b1ee8b633-dns-svc\") pod \"dnsmasq-dns-574bb48d57-zvm4r\" (UID: \"9aa2f194-429f-467c-8802-273b1ee8b633\") " pod="openstack/dnsmasq-dns-574bb48d57-zvm4r" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.148408 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.148673 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.148815 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-b2xjl" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.155548 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.177613 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.190319 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-574bb48d57-zvm4r"] Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.242823 4854 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/540ba45d-ad38-4582-995f-3cc8952f70b7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"540ba45d-ad38-4582-995f-3cc8952f70b7\") " pod="openstack/glance-default-external-api-0" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.242900 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/540ba45d-ad38-4582-995f-3cc8952f70b7-config-data\") pod \"glance-default-external-api-0\" (UID: \"540ba45d-ad38-4582-995f-3cc8952f70b7\") " pod="openstack/glance-default-external-api-0" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.242954 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9aa2f194-429f-467c-8802-273b1ee8b633-config\") pod \"dnsmasq-dns-574bb48d57-zvm4r\" (UID: \"9aa2f194-429f-467c-8802-273b1ee8b633\") " pod="openstack/dnsmasq-dns-574bb48d57-zvm4r" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.242989 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9aa2f194-429f-467c-8802-273b1ee8b633-ovsdbserver-sb\") pod \"dnsmasq-dns-574bb48d57-zvm4r\" (UID: \"9aa2f194-429f-467c-8802-273b1ee8b633\") " pod="openstack/dnsmasq-dns-574bb48d57-zvm4r" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.243016 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/540ba45d-ad38-4582-995f-3cc8952f70b7-scripts\") pod \"glance-default-external-api-0\" (UID: \"540ba45d-ad38-4582-995f-3cc8952f70b7\") " pod="openstack/glance-default-external-api-0" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.243120 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9aa2f194-429f-467c-8802-273b1ee8b633-ovsdbserver-nb\") pod \"dnsmasq-dns-574bb48d57-zvm4r\" (UID: \"9aa2f194-429f-467c-8802-273b1ee8b633\") " pod="openstack/dnsmasq-dns-574bb48d57-zvm4r" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.243224 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/540ba45d-ad38-4582-995f-3cc8952f70b7-ceph\") pod \"glance-default-external-api-0\" (UID: \"540ba45d-ad38-4582-995f-3cc8952f70b7\") " pod="openstack/glance-default-external-api-0" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.243261 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6r8w\" (UniqueName: \"kubernetes.io/projected/9aa2f194-429f-467c-8802-273b1ee8b633-kube-api-access-z6r8w\") pod \"dnsmasq-dns-574bb48d57-zvm4r\" (UID: \"9aa2f194-429f-467c-8802-273b1ee8b633\") " pod="openstack/dnsmasq-dns-574bb48d57-zvm4r" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.243283 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/540ba45d-ad38-4582-995f-3cc8952f70b7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"540ba45d-ad38-4582-995f-3cc8952f70b7\") " pod="openstack/glance-default-external-api-0" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.243334 4854 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/540ba45d-ad38-4582-995f-3cc8952f70b7-logs\") pod \"glance-default-external-api-0\" (UID: \"540ba45d-ad38-4582-995f-3cc8952f70b7\") " pod="openstack/glance-default-external-api-0" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.243371 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9aa2f194-429f-467c-8802-273b1ee8b633-dns-svc\") pod \"dnsmasq-dns-574bb48d57-zvm4r\" (UID: \"9aa2f194-429f-467c-8802-273b1ee8b633\") " pod="openstack/dnsmasq-dns-574bb48d57-zvm4r" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.243391 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltvlv\" (UniqueName: \"kubernetes.io/projected/540ba45d-ad38-4582-995f-3cc8952f70b7-kube-api-access-ltvlv\") pod \"glance-default-external-api-0\" (UID: \"540ba45d-ad38-4582-995f-3cc8952f70b7\") " pod="openstack/glance-default-external-api-0" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.244010 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9aa2f194-429f-467c-8802-273b1ee8b633-config\") pod \"dnsmasq-dns-574bb48d57-zvm4r\" (UID: \"9aa2f194-429f-467c-8802-273b1ee8b633\") " pod="openstack/dnsmasq-dns-574bb48d57-zvm4r" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.244308 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9aa2f194-429f-467c-8802-273b1ee8b633-ovsdbserver-nb\") pod \"dnsmasq-dns-574bb48d57-zvm4r\" (UID: \"9aa2f194-429f-467c-8802-273b1ee8b633\") " pod="openstack/dnsmasq-dns-574bb48d57-zvm4r" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.244589 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9aa2f194-429f-467c-8802-273b1ee8b633-dns-svc\") pod \"dnsmasq-dns-574bb48d57-zvm4r\" (UID: \"9aa2f194-429f-467c-8802-273b1ee8b633\") " pod="openstack/dnsmasq-dns-574bb48d57-zvm4r" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.245075 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9aa2f194-429f-467c-8802-273b1ee8b633-ovsdbserver-sb\") pod \"dnsmasq-dns-574bb48d57-zvm4r\" (UID: \"9aa2f194-429f-467c-8802-273b1ee8b633\") " pod="openstack/dnsmasq-dns-574bb48d57-zvm4r" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.266314 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6r8w\" (UniqueName: \"kubernetes.io/projected/9aa2f194-429f-467c-8802-273b1ee8b633-kube-api-access-z6r8w\") pod \"dnsmasq-dns-574bb48d57-zvm4r\" (UID: \"9aa2f194-429f-467c-8802-273b1ee8b633\") " pod="openstack/dnsmasq-dns-574bb48d57-zvm4r" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.303310 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.304931 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.310374 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.319544 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.344187 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7a05b143-d4bf-4a71-bcc6-ab90d97fddf1-ceph\") pod \"glance-default-internal-api-0\" (UID: \"7a05b143-d4bf-4a71-bcc6-ab90d97fddf1\") " pod="openstack/glance-default-internal-api-0" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.344231 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a05b143-d4bf-4a71-bcc6-ab90d97fddf1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7a05b143-d4bf-4a71-bcc6-ab90d97fddf1\") " pod="openstack/glance-default-internal-api-0" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.344269 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/540ba45d-ad38-4582-995f-3cc8952f70b7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"540ba45d-ad38-4582-995f-3cc8952f70b7\") " pod="openstack/glance-default-external-api-0" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.344289 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a05b143-d4bf-4a71-bcc6-ab90d97fddf1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7a05b143-d4bf-4a71-bcc6-ab90d97fddf1\") " pod="openstack/glance-default-internal-api-0" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.344312 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/540ba45d-ad38-4582-995f-3cc8952f70b7-config-data\") pod \"glance-default-external-api-0\" (UID: \"540ba45d-ad38-4582-995f-3cc8952f70b7\") " pod="openstack/glance-default-external-api-0" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.344339 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7a05b143-d4bf-4a71-bcc6-ab90d97fddf1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7a05b143-d4bf-4a71-bcc6-ab90d97fddf1\") " pod="openstack/glance-default-internal-api-0" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.344361 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a05b143-d4bf-4a71-bcc6-ab90d97fddf1-logs\") pod \"glance-default-internal-api-0\" (UID: \"7a05b143-d4bf-4a71-bcc6-ab90d97fddf1\") " pod="openstack/glance-default-internal-api-0" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.344383 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/540ba45d-ad38-4582-995f-3cc8952f70b7-scripts\") pod \"glance-default-external-api-0\" (UID: \"540ba45d-ad38-4582-995f-3cc8952f70b7\") " pod="openstack/glance-default-external-api-0" Oct 07 
13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.344419 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/540ba45d-ad38-4582-995f-3cc8952f70b7-ceph\") pod \"glance-default-external-api-0\" (UID: \"540ba45d-ad38-4582-995f-3cc8952f70b7\") " pod="openstack/glance-default-external-api-0" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.344440 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/540ba45d-ad38-4582-995f-3cc8952f70b7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"540ba45d-ad38-4582-995f-3cc8952f70b7\") " pod="openstack/glance-default-external-api-0" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.344461 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/540ba45d-ad38-4582-995f-3cc8952f70b7-logs\") pod \"glance-default-external-api-0\" (UID: \"540ba45d-ad38-4582-995f-3cc8952f70b7\") " pod="openstack/glance-default-external-api-0" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.344481 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltvlv\" (UniqueName: \"kubernetes.io/projected/540ba45d-ad38-4582-995f-3cc8952f70b7-kube-api-access-ltvlv\") pod \"glance-default-external-api-0\" (UID: \"540ba45d-ad38-4582-995f-3cc8952f70b7\") " pod="openstack/glance-default-external-api-0" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.344511 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdwts\" (UniqueName: \"kubernetes.io/projected/7a05b143-d4bf-4a71-bcc6-ab90d97fddf1-kube-api-access-sdwts\") pod \"glance-default-internal-api-0\" (UID: \"7a05b143-d4bf-4a71-bcc6-ab90d97fddf1\") " pod="openstack/glance-default-internal-api-0" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.344537 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a05b143-d4bf-4a71-bcc6-ab90d97fddf1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7a05b143-d4bf-4a71-bcc6-ab90d97fddf1\") " pod="openstack/glance-default-internal-api-0" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.346323 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/540ba45d-ad38-4582-995f-3cc8952f70b7-logs\") pod \"glance-default-external-api-0\" (UID: \"540ba45d-ad38-4582-995f-3cc8952f70b7\") " pod="openstack/glance-default-external-api-0" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.346372 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/540ba45d-ad38-4582-995f-3cc8952f70b7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"540ba45d-ad38-4582-995f-3cc8952f70b7\") " pod="openstack/glance-default-external-api-0" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.358329 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/540ba45d-ad38-4582-995f-3cc8952f70b7-config-data\") pod \"glance-default-external-api-0\" (UID: \"540ba45d-ad38-4582-995f-3cc8952f70b7\") " pod="openstack/glance-default-external-api-0" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.358793 4854 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/540ba45d-ad38-4582-995f-3cc8952f70b7-scripts\") pod \"glance-default-external-api-0\" (UID: \"540ba45d-ad38-4582-995f-3cc8952f70b7\") " pod="openstack/glance-default-external-api-0" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.358935 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/540ba45d-ad38-4582-995f-3cc8952f70b7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"540ba45d-ad38-4582-995f-3cc8952f70b7\") " pod="openstack/glance-default-external-api-0" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.359295 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/540ba45d-ad38-4582-995f-3cc8952f70b7-ceph\") pod \"glance-default-external-api-0\" (UID: \"540ba45d-ad38-4582-995f-3cc8952f70b7\") " pod="openstack/glance-default-external-api-0" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.363224 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltvlv\" (UniqueName: \"kubernetes.io/projected/540ba45d-ad38-4582-995f-3cc8952f70b7-kube-api-access-ltvlv\") pod \"glance-default-external-api-0\" (UID: \"540ba45d-ad38-4582-995f-3cc8952f70b7\") " pod="openstack/glance-default-external-api-0" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.445033 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7a05b143-d4bf-4a71-bcc6-ab90d97fddf1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7a05b143-d4bf-4a71-bcc6-ab90d97fddf1\") " pod="openstack/glance-default-internal-api-0" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.445088 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a05b143-d4bf-4a71-bcc6-ab90d97fddf1-logs\") pod \"glance-default-internal-api-0\" (UID: \"7a05b143-d4bf-4a71-bcc6-ab90d97fddf1\") " pod="openstack/glance-default-internal-api-0" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.445184 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdwts\" (UniqueName: \"kubernetes.io/projected/7a05b143-d4bf-4a71-bcc6-ab90d97fddf1-kube-api-access-sdwts\") pod \"glance-default-internal-api-0\" (UID: \"7a05b143-d4bf-4a71-bcc6-ab90d97fddf1\") " pod="openstack/glance-default-internal-api-0" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.445209 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a05b143-d4bf-4a71-bcc6-ab90d97fddf1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7a05b143-d4bf-4a71-bcc6-ab90d97fddf1\") " pod="openstack/glance-default-internal-api-0" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.445228 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7a05b143-d4bf-4a71-bcc6-ab90d97fddf1-ceph\") pod \"glance-default-internal-api-0\" (UID: \"7a05b143-d4bf-4a71-bcc6-ab90d97fddf1\") " pod="openstack/glance-default-internal-api-0" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.445247 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/7a05b143-d4bf-4a71-bcc6-ab90d97fddf1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7a05b143-d4bf-4a71-bcc6-ab90d97fddf1\") " pod="openstack/glance-default-internal-api-0" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.445282 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a05b143-d4bf-4a71-bcc6-ab90d97fddf1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7a05b143-d4bf-4a71-bcc6-ab90d97fddf1\") " pod="openstack/glance-default-internal-api-0" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.445610 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7a05b143-d4bf-4a71-bcc6-ab90d97fddf1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7a05b143-d4bf-4a71-bcc6-ab90d97fddf1\") " pod="openstack/glance-default-internal-api-0" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.445683 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a05b143-d4bf-4a71-bcc6-ab90d97fddf1-logs\") pod \"glance-default-internal-api-0\" (UID: \"7a05b143-d4bf-4a71-bcc6-ab90d97fddf1\") " pod="openstack/glance-default-internal-api-0" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.449478 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a05b143-d4bf-4a71-bcc6-ab90d97fddf1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7a05b143-d4bf-4a71-bcc6-ab90d97fddf1\") " pod="openstack/glance-default-internal-api-0" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.450884 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a05b143-d4bf-4a71-bcc6-ab90d97fddf1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7a05b143-d4bf-4a71-bcc6-ab90d97fddf1\") " pod="openstack/glance-default-internal-api-0" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.451450 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a05b143-d4bf-4a71-bcc6-ab90d97fddf1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7a05b143-d4bf-4a71-bcc6-ab90d97fddf1\") " pod="openstack/glance-default-internal-api-0" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.451778 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7a05b143-d4bf-4a71-bcc6-ab90d97fddf1-ceph\") pod \"glance-default-internal-api-0\" (UID: \"7a05b143-d4bf-4a71-bcc6-ab90d97fddf1\") " pod="openstack/glance-default-internal-api-0" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.461708 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdwts\" (UniqueName: \"kubernetes.io/projected/7a05b143-d4bf-4a71-bcc6-ab90d97fddf1-kube-api-access-sdwts\") pod \"glance-default-internal-api-0\" (UID: \"7a05b143-d4bf-4a71-bcc6-ab90d97fddf1\") " pod="openstack/glance-default-internal-api-0" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.468521 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-574bb48d57-zvm4r" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.499256 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 13:55:43 crc kubenswrapper[4854]: I1007 13:55:43.633118 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 13:55:44 crc kubenswrapper[4854]: I1007 13:55:44.045407 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-574bb48d57-zvm4r"] Oct 07 13:55:44 crc kubenswrapper[4854]: I1007 13:55:44.099625 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 13:55:44 crc kubenswrapper[4854]: I1007 13:55:44.549854 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 13:55:44 crc kubenswrapper[4854]: I1007 13:55:44.807243 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-574bb48d57-zvm4r" event={"ID":"9aa2f194-429f-467c-8802-273b1ee8b633","Type":"ContainerDied","Data":"b6c297c3b68447f4330cbfe5ba573f366c5be7f173754660db3099a8b04ac0aa"} Oct 07 13:55:44 crc kubenswrapper[4854]: I1007 13:55:44.806030 4854 generic.go:334] "Generic (PLEG): container finished" podID="9aa2f194-429f-467c-8802-273b1ee8b633" containerID="b6c297c3b68447f4330cbfe5ba573f366c5be7f173754660db3099a8b04ac0aa" exitCode=0 Oct 07 13:55:44 crc kubenswrapper[4854]: I1007 13:55:44.807503 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-574bb48d57-zvm4r" event={"ID":"9aa2f194-429f-467c-8802-273b1ee8b633","Type":"ContainerStarted","Data":"06cd65627bb3ef81afe0fabe19d3e24d14b2783766277d0e4f6058ae4441d97d"} Oct 07 13:55:44 crc kubenswrapper[4854]: I1007 13:55:44.814379 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"540ba45d-ad38-4582-995f-3cc8952f70b7","Type":"ContainerStarted","Data":"2804aa5ffaf38dd8aeb06e9cdb2eed986e32e0045a93dd4ad3c8d3e4b855bc22"} Oct 07 13:55:44 crc kubenswrapper[4854]: I1007 13:55:44.814423 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"540ba45d-ad38-4582-995f-3cc8952f70b7","Type":"ContainerStarted","Data":"e931289052252ff46f4a6a363149d1ac3c8967f7a7e12bb4fc6bd2f5f0b9da59"} Oct 07 13:55:45 crc kubenswrapper[4854]: I1007 13:55:45.122336 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 13:55:45 crc kubenswrapper[4854]: W1007 13:55:45.139344 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a05b143_d4bf_4a71_bcc6_ab90d97fddf1.slice/crio-3e6cde4fc69d9338b6329429f07ef3e8c67a1035042904ec914beb5e829bc85f WatchSource:0}: Error finding container 3e6cde4fc69d9338b6329429f07ef3e8c67a1035042904ec914beb5e829bc85f: Status 404 returned error can't find the container with id 3e6cde4fc69d9338b6329429f07ef3e8c67a1035042904ec914beb5e829bc85f Oct 07 13:55:45 crc kubenswrapper[4854]: I1007 13:55:45.825746 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-574bb48d57-zvm4r" event={"ID":"9aa2f194-429f-467c-8802-273b1ee8b633","Type":"ContainerStarted","Data":"d028e05a3d21ca7d28719abff4cf97190f2d637b1baa10a9f7169f65c5d1262b"} Oct 07 13:55:45 crc kubenswrapper[4854]: I1007 13:55:45.826103 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-574bb48d57-zvm4r" Oct 07 13:55:45 crc kubenswrapper[4854]: I1007 13:55:45.829087 4854 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"540ba45d-ad38-4582-995f-3cc8952f70b7","Type":"ContainerStarted","Data":"f9842cd92904d034714e0830cfc0a2e4ca46dd09c76789054b5f1b70517810fc"} Oct 07 13:55:45 crc kubenswrapper[4854]: I1007 13:55:45.829215 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="540ba45d-ad38-4582-995f-3cc8952f70b7" containerName="glance-log" containerID="cri-o://2804aa5ffaf38dd8aeb06e9cdb2eed986e32e0045a93dd4ad3c8d3e4b855bc22" gracePeriod=30 Oct 07 13:55:45 crc kubenswrapper[4854]: I1007 13:55:45.829381 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="540ba45d-ad38-4582-995f-3cc8952f70b7" containerName="glance-httpd" containerID="cri-o://f9842cd92904d034714e0830cfc0a2e4ca46dd09c76789054b5f1b70517810fc" gracePeriod=30 Oct 07 13:55:45 crc kubenswrapper[4854]: I1007 13:55:45.832342 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7a05b143-d4bf-4a71-bcc6-ab90d97fddf1","Type":"ContainerStarted","Data":"5bd1297536f4907a7cdc9a7cfbd46cde9ec9f3d78109c49a6f9476188fe6f4eb"} Oct 07 13:55:45 crc kubenswrapper[4854]: I1007 13:55:45.832395 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7a05b143-d4bf-4a71-bcc6-ab90d97fddf1","Type":"ContainerStarted","Data":"3e6cde4fc69d9338b6329429f07ef3e8c67a1035042904ec914beb5e829bc85f"} Oct 07 13:55:45 crc kubenswrapper[4854]: I1007 13:55:45.851101 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-574bb48d57-zvm4r" podStartSLOduration=2.85107796 podStartE2EDuration="2.85107796s" podCreationTimestamp="2025-10-07 13:55:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:55:45.842387422 +0000 UTC m=+5461.830219687" watchObservedRunningTime="2025-10-07 13:55:45.85107796 +0000 UTC m=+5461.838910215" Oct 07 13:55:45 crc kubenswrapper[4854]: I1007 13:55:45.865736 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=2.865714848 podStartE2EDuration="2.865714848s" podCreationTimestamp="2025-10-07 13:55:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:55:45.861813027 +0000 UTC m=+5461.849645282" watchObservedRunningTime="2025-10-07 13:55:45.865714848 +0000 UTC m=+5461.853547103" Oct 07 13:55:46 crc kubenswrapper[4854]: I1007 13:55:46.591493 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 13:55:46 crc kubenswrapper[4854]: I1007 13:55:46.708119 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/540ba45d-ad38-4582-995f-3cc8952f70b7-ceph\") pod \"540ba45d-ad38-4582-995f-3cc8952f70b7\" (UID: \"540ba45d-ad38-4582-995f-3cc8952f70b7\") " Oct 07 13:55:46 crc kubenswrapper[4854]: I1007 13:55:46.708192 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/540ba45d-ad38-4582-995f-3cc8952f70b7-combined-ca-bundle\") pod \"540ba45d-ad38-4582-995f-3cc8952f70b7\" (UID: \"540ba45d-ad38-4582-995f-3cc8952f70b7\") " Oct 07 13:55:46 crc kubenswrapper[4854]: I1007 13:55:46.708259 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/540ba45d-ad38-4582-995f-3cc8952f70b7-config-data\") pod \"540ba45d-ad38-4582-995f-3cc8952f70b7\" (UID: \"540ba45d-ad38-4582-995f-3cc8952f70b7\") " Oct 07 13:55:46 crc kubenswrapper[4854]: I1007 13:55:46.708314 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltvlv\" (UniqueName: \"kubernetes.io/projected/540ba45d-ad38-4582-995f-3cc8952f70b7-kube-api-access-ltvlv\") pod \"540ba45d-ad38-4582-995f-3cc8952f70b7\" (UID: \"540ba45d-ad38-4582-995f-3cc8952f70b7\") " Oct 07 13:55:46 crc kubenswrapper[4854]: I1007 13:55:46.708360 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/540ba45d-ad38-4582-995f-3cc8952f70b7-logs\") pod \"540ba45d-ad38-4582-995f-3cc8952f70b7\" (UID: \"540ba45d-ad38-4582-995f-3cc8952f70b7\") " Oct 07 13:55:46 crc kubenswrapper[4854]: I1007 13:55:46.708475 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/540ba45d-ad38-4582-995f-3cc8952f70b7-scripts\") pod \"540ba45d-ad38-4582-995f-3cc8952f70b7\" (UID: \"540ba45d-ad38-4582-995f-3cc8952f70b7\") " Oct 07 13:55:46 crc kubenswrapper[4854]: I1007 13:55:46.708561 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/540ba45d-ad38-4582-995f-3cc8952f70b7-httpd-run\") pod \"540ba45d-ad38-4582-995f-3cc8952f70b7\" (UID: \"540ba45d-ad38-4582-995f-3cc8952f70b7\") " Oct 07 13:55:46 crc kubenswrapper[4854]: I1007 13:55:46.709264 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/540ba45d-ad38-4582-995f-3cc8952f70b7-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "540ba45d-ad38-4582-995f-3cc8952f70b7" (UID: "540ba45d-ad38-4582-995f-3cc8952f70b7"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:55:46 crc kubenswrapper[4854]: I1007 13:55:46.712647 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/540ba45d-ad38-4582-995f-3cc8952f70b7-ceph" (OuterVolumeSpecName: "ceph") pod "540ba45d-ad38-4582-995f-3cc8952f70b7" (UID: "540ba45d-ad38-4582-995f-3cc8952f70b7"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:55:46 crc kubenswrapper[4854]: I1007 13:55:46.713201 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/540ba45d-ad38-4582-995f-3cc8952f70b7-logs" (OuterVolumeSpecName: "logs") pod "540ba45d-ad38-4582-995f-3cc8952f70b7" (UID: "540ba45d-ad38-4582-995f-3cc8952f70b7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:55:46 crc kubenswrapper[4854]: I1007 13:55:46.718808 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/540ba45d-ad38-4582-995f-3cc8952f70b7-kube-api-access-ltvlv" (OuterVolumeSpecName: "kube-api-access-ltvlv") pod "540ba45d-ad38-4582-995f-3cc8952f70b7" (UID: "540ba45d-ad38-4582-995f-3cc8952f70b7"). InnerVolumeSpecName "kube-api-access-ltvlv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:55:46 crc kubenswrapper[4854]: I1007 13:55:46.720235 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/540ba45d-ad38-4582-995f-3cc8952f70b7-scripts" (OuterVolumeSpecName: "scripts") pod "540ba45d-ad38-4582-995f-3cc8952f70b7" (UID: "540ba45d-ad38-4582-995f-3cc8952f70b7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:55:46 crc kubenswrapper[4854]: I1007 13:55:46.747198 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/540ba45d-ad38-4582-995f-3cc8952f70b7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "540ba45d-ad38-4582-995f-3cc8952f70b7" (UID: "540ba45d-ad38-4582-995f-3cc8952f70b7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:55:46 crc kubenswrapper[4854]: I1007 13:55:46.755224 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/540ba45d-ad38-4582-995f-3cc8952f70b7-config-data" (OuterVolumeSpecName: "config-data") pod "540ba45d-ad38-4582-995f-3cc8952f70b7" (UID: "540ba45d-ad38-4582-995f-3cc8952f70b7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:55:46 crc kubenswrapper[4854]: I1007 13:55:46.810433 4854 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/540ba45d-ad38-4582-995f-3cc8952f70b7-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 07 13:55:46 crc kubenswrapper[4854]: I1007 13:55:46.810472 4854 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/540ba45d-ad38-4582-995f-3cc8952f70b7-ceph\") on node \"crc\" DevicePath \"\"" Oct 07 13:55:46 crc kubenswrapper[4854]: I1007 13:55:46.810482 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/540ba45d-ad38-4582-995f-3cc8952f70b7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:55:46 crc kubenswrapper[4854]: I1007 13:55:46.810493 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/540ba45d-ad38-4582-995f-3cc8952f70b7-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:55:46 crc kubenswrapper[4854]: I1007 13:55:46.810504 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltvlv\" (UniqueName: \"kubernetes.io/projected/540ba45d-ad38-4582-995f-3cc8952f70b7-kube-api-access-ltvlv\") on node \"crc\" DevicePath \"\"" Oct 07 13:55:46 crc kubenswrapper[4854]: I1007 13:55:46.810515 4854 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/540ba45d-ad38-4582-995f-3cc8952f70b7-logs\") on node \"crc\" DevicePath \"\"" Oct 07 13:55:46 crc kubenswrapper[4854]: I1007 13:55:46.810539 4854 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/540ba45d-ad38-4582-995f-3cc8952f70b7-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 13:55:46 crc kubenswrapper[4854]: I1007 13:55:46.842219 4854 generic.go:334] "Generic (PLEG): container finished" podID="540ba45d-ad38-4582-995f-3cc8952f70b7" containerID="f9842cd92904d034714e0830cfc0a2e4ca46dd09c76789054b5f1b70517810fc" exitCode=0 Oct 07 13:55:46 crc kubenswrapper[4854]: I1007 13:55:46.842262 4854 generic.go:334] "Generic (PLEG): container finished" podID="540ba45d-ad38-4582-995f-3cc8952f70b7" containerID="2804aa5ffaf38dd8aeb06e9cdb2eed986e32e0045a93dd4ad3c8d3e4b855bc22" exitCode=143 Oct 07 13:55:46 crc kubenswrapper[4854]: I1007 13:55:46.842297 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 13:55:46 crc kubenswrapper[4854]: I1007 13:55:46.842333 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"540ba45d-ad38-4582-995f-3cc8952f70b7","Type":"ContainerDied","Data":"f9842cd92904d034714e0830cfc0a2e4ca46dd09c76789054b5f1b70517810fc"} Oct 07 13:55:46 crc kubenswrapper[4854]: I1007 13:55:46.842372 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"540ba45d-ad38-4582-995f-3cc8952f70b7","Type":"ContainerDied","Data":"2804aa5ffaf38dd8aeb06e9cdb2eed986e32e0045a93dd4ad3c8d3e4b855bc22"} Oct 07 13:55:46 crc kubenswrapper[4854]: I1007 13:55:46.842391 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"540ba45d-ad38-4582-995f-3cc8952f70b7","Type":"ContainerDied","Data":"e931289052252ff46f4a6a363149d1ac3c8967f7a7e12bb4fc6bd2f5f0b9da59"} Oct 07 13:55:46 crc kubenswrapper[4854]: I1007 13:55:46.842414 4854 scope.go:117] "RemoveContainer" containerID="f9842cd92904d034714e0830cfc0a2e4ca46dd09c76789054b5f1b70517810fc" Oct 07 13:55:46 crc kubenswrapper[4854]: I1007 13:55:46.845540 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7a05b143-d4bf-4a71-bcc6-ab90d97fddf1","Type":"ContainerStarted","Data":"f18b8bd8acf8fd5801719f077a13db622c8fdb504024940331379a967b4c5cc5"} Oct 07 13:55:46 crc kubenswrapper[4854]: I1007 13:55:46.865632 4854 scope.go:117] "RemoveContainer" containerID="2804aa5ffaf38dd8aeb06e9cdb2eed986e32e0045a93dd4ad3c8d3e4b855bc22" Oct 07 13:55:46 crc kubenswrapper[4854]: I1007 13:55:46.866166 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.866120049 podStartE2EDuration="3.866120049s" podCreationTimestamp="2025-10-07 13:55:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:55:46.862800704 +0000 UTC m=+5462.850632959" watchObservedRunningTime="2025-10-07 13:55:46.866120049 +0000 UTC m=+5462.853952314" Oct 07 13:55:46 crc kubenswrapper[4854]: I1007 13:55:46.889125 4854 scope.go:117] "RemoveContainer" containerID="f9842cd92904d034714e0830cfc0a2e4ca46dd09c76789054b5f1b70517810fc" Oct 07 13:55:46 crc kubenswrapper[4854]: E1007 13:55:46.889534 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9842cd92904d034714e0830cfc0a2e4ca46dd09c76789054b5f1b70517810fc\": container with ID starting with f9842cd92904d034714e0830cfc0a2e4ca46dd09c76789054b5f1b70517810fc not found: ID does not exist" containerID="f9842cd92904d034714e0830cfc0a2e4ca46dd09c76789054b5f1b70517810fc" Oct 07 13:55:46 crc kubenswrapper[4854]: I1007 13:55:46.889662 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9842cd92904d034714e0830cfc0a2e4ca46dd09c76789054b5f1b70517810fc"} err="failed to get container status \"f9842cd92904d034714e0830cfc0a2e4ca46dd09c76789054b5f1b70517810fc\": rpc error: code = NotFound desc = could not find container \"f9842cd92904d034714e0830cfc0a2e4ca46dd09c76789054b5f1b70517810fc\": container with ID starting with f9842cd92904d034714e0830cfc0a2e4ca46dd09c76789054b5f1b70517810fc not found: ID does not exist" Oct 07 13:55:46 crc kubenswrapper[4854]: I1007 13:55:46.889793 
4854 scope.go:117] "RemoveContainer" containerID="2804aa5ffaf38dd8aeb06e9cdb2eed986e32e0045a93dd4ad3c8d3e4b855bc22" Oct 07 13:55:46 crc kubenswrapper[4854]: E1007 13:55:46.890197 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2804aa5ffaf38dd8aeb06e9cdb2eed986e32e0045a93dd4ad3c8d3e4b855bc22\": container with ID starting with 2804aa5ffaf38dd8aeb06e9cdb2eed986e32e0045a93dd4ad3c8d3e4b855bc22 not found: ID does not exist" containerID="2804aa5ffaf38dd8aeb06e9cdb2eed986e32e0045a93dd4ad3c8d3e4b855bc22" Oct 07 13:55:46 crc kubenswrapper[4854]: I1007 13:55:46.890231 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2804aa5ffaf38dd8aeb06e9cdb2eed986e32e0045a93dd4ad3c8d3e4b855bc22"} err="failed to get container status \"2804aa5ffaf38dd8aeb06e9cdb2eed986e32e0045a93dd4ad3c8d3e4b855bc22\": rpc error: code = NotFound desc = could not find container \"2804aa5ffaf38dd8aeb06e9cdb2eed986e32e0045a93dd4ad3c8d3e4b855bc22\": container with ID starting with 2804aa5ffaf38dd8aeb06e9cdb2eed986e32e0045a93dd4ad3c8d3e4b855bc22 not found: ID does not exist" Oct 07 13:55:46 crc kubenswrapper[4854]: I1007 13:55:46.890257 4854 scope.go:117] "RemoveContainer" containerID="f9842cd92904d034714e0830cfc0a2e4ca46dd09c76789054b5f1b70517810fc" Oct 07 13:55:46 crc kubenswrapper[4854]: I1007 13:55:46.890584 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9842cd92904d034714e0830cfc0a2e4ca46dd09c76789054b5f1b70517810fc"} err="failed to get container status \"f9842cd92904d034714e0830cfc0a2e4ca46dd09c76789054b5f1b70517810fc\": rpc error: code = NotFound desc = could not find container \"f9842cd92904d034714e0830cfc0a2e4ca46dd09c76789054b5f1b70517810fc\": container with ID starting with f9842cd92904d034714e0830cfc0a2e4ca46dd09c76789054b5f1b70517810fc not found: ID does not exist" Oct 07 13:55:46 crc kubenswrapper[4854]: I1007 13:55:46.890626 4854 scope.go:117] "RemoveContainer" containerID="2804aa5ffaf38dd8aeb06e9cdb2eed986e32e0045a93dd4ad3c8d3e4b855bc22" Oct 07 13:55:46 crc kubenswrapper[4854]: I1007 13:55:46.891326 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2804aa5ffaf38dd8aeb06e9cdb2eed986e32e0045a93dd4ad3c8d3e4b855bc22"} err="failed to get container status \"2804aa5ffaf38dd8aeb06e9cdb2eed986e32e0045a93dd4ad3c8d3e4b855bc22\": rpc error: code = NotFound desc = could not find container \"2804aa5ffaf38dd8aeb06e9cdb2eed986e32e0045a93dd4ad3c8d3e4b855bc22\": container with ID starting with 2804aa5ffaf38dd8aeb06e9cdb2eed986e32e0045a93dd4ad3c8d3e4b855bc22 not found: ID does not exist" Oct 07 13:55:46 crc kubenswrapper[4854]: I1007 13:55:46.894647 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 13:55:46 crc kubenswrapper[4854]: I1007 13:55:46.907519 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 13:55:46 crc kubenswrapper[4854]: I1007 13:55:46.926964 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 13:55:46 crc kubenswrapper[4854]: E1007 13:55:46.928024 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="540ba45d-ad38-4582-995f-3cc8952f70b7" containerName="glance-log" Oct 07 13:55:46 crc kubenswrapper[4854]: I1007 13:55:46.928050 4854 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="540ba45d-ad38-4582-995f-3cc8952f70b7" containerName="glance-log" Oct 07 13:55:46 crc kubenswrapper[4854]: E1007 13:55:46.928071 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="540ba45d-ad38-4582-995f-3cc8952f70b7" containerName="glance-httpd" Oct 07 13:55:46 crc kubenswrapper[4854]: I1007 13:55:46.928081 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="540ba45d-ad38-4582-995f-3cc8952f70b7" containerName="glance-httpd" Oct 07 13:55:46 crc kubenswrapper[4854]: I1007 13:55:46.929375 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="540ba45d-ad38-4582-995f-3cc8952f70b7" containerName="glance-log" Oct 07 13:55:46 crc kubenswrapper[4854]: I1007 13:55:46.929435 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="540ba45d-ad38-4582-995f-3cc8952f70b7" containerName="glance-httpd" Oct 07 13:55:46 crc kubenswrapper[4854]: I1007 13:55:46.942197 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 13:55:46 crc kubenswrapper[4854]: I1007 13:55:46.945836 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 07 13:55:46 crc kubenswrapper[4854]: I1007 13:55:46.960591 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 13:55:47 crc kubenswrapper[4854]: I1007 13:55:47.014308 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e598d141-22f5-4f79-8f6f-ca3f5051e894-scripts\") pod \"glance-default-external-api-0\" (UID: \"e598d141-22f5-4f79-8f6f-ca3f5051e894\") " pod="openstack/glance-default-external-api-0" Oct 07 13:55:47 crc kubenswrapper[4854]: I1007 13:55:47.014384 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e598d141-22f5-4f79-8f6f-ca3f5051e894-config-data\") pod \"glance-default-external-api-0\" (UID: \"e598d141-22f5-4f79-8f6f-ca3f5051e894\") " pod="openstack/glance-default-external-api-0" Oct 07 13:55:47 crc kubenswrapper[4854]: I1007 13:55:47.014414 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e598d141-22f5-4f79-8f6f-ca3f5051e894-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e598d141-22f5-4f79-8f6f-ca3f5051e894\") " pod="openstack/glance-default-external-api-0" Oct 07 13:55:47 crc kubenswrapper[4854]: I1007 13:55:47.014483 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9j9z\" (UniqueName: \"kubernetes.io/projected/e598d141-22f5-4f79-8f6f-ca3f5051e894-kube-api-access-j9j9z\") pod \"glance-default-external-api-0\" (UID: \"e598d141-22f5-4f79-8f6f-ca3f5051e894\") " pod="openstack/glance-default-external-api-0" Oct 07 13:55:47 crc kubenswrapper[4854]: I1007 13:55:47.014506 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e598d141-22f5-4f79-8f6f-ca3f5051e894-ceph\") pod \"glance-default-external-api-0\" (UID: \"e598d141-22f5-4f79-8f6f-ca3f5051e894\") " pod="openstack/glance-default-external-api-0" Oct 07 13:55:47 crc kubenswrapper[4854]: I1007 13:55:47.014569 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e598d141-22f5-4f79-8f6f-ca3f5051e894-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e598d141-22f5-4f79-8f6f-ca3f5051e894\") " pod="openstack/glance-default-external-api-0" Oct 07 13:55:47 crc kubenswrapper[4854]: I1007 13:55:47.014608 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e598d141-22f5-4f79-8f6f-ca3f5051e894-logs\") pod \"glance-default-external-api-0\" (UID: \"e598d141-22f5-4f79-8f6f-ca3f5051e894\") " pod="openstack/glance-default-external-api-0" Oct 07 13:55:47 crc kubenswrapper[4854]: I1007 13:55:47.115578 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9j9z\" (UniqueName: \"kubernetes.io/projected/e598d141-22f5-4f79-8f6f-ca3f5051e894-kube-api-access-j9j9z\") pod \"glance-default-external-api-0\" (UID: \"e598d141-22f5-4f79-8f6f-ca3f5051e894\") " pod="openstack/glance-default-external-api-0" Oct 07 13:55:47 crc kubenswrapper[4854]: I1007 13:55:47.115629 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e598d141-22f5-4f79-8f6f-ca3f5051e894-ceph\") pod \"glance-default-external-api-0\" (UID: \"e598d141-22f5-4f79-8f6f-ca3f5051e894\") " pod="openstack/glance-default-external-api-0" Oct 07 13:55:47 crc kubenswrapper[4854]: I1007 13:55:47.115689 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e598d141-22f5-4f79-8f6f-ca3f5051e894-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e598d141-22f5-4f79-8f6f-ca3f5051e894\") " pod="openstack/glance-default-external-api-0" Oct 07 13:55:47 crc kubenswrapper[4854]: I1007 13:55:47.115727 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e598d141-22f5-4f79-8f6f-ca3f5051e894-logs\") pod \"glance-default-external-api-0\" (UID: \"e598d141-22f5-4f79-8f6f-ca3f5051e894\") " pod="openstack/glance-default-external-api-0" Oct 07 13:55:47 crc kubenswrapper[4854]: I1007 13:55:47.115793 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e598d141-22f5-4f79-8f6f-ca3f5051e894-scripts\") pod \"glance-default-external-api-0\" (UID: \"e598d141-22f5-4f79-8f6f-ca3f5051e894\") " pod="openstack/glance-default-external-api-0" Oct 07 13:55:47 crc kubenswrapper[4854]: I1007 13:55:47.116981 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e598d141-22f5-4f79-8f6f-ca3f5051e894-logs\") pod \"glance-default-external-api-0\" (UID: \"e598d141-22f5-4f79-8f6f-ca3f5051e894\") " pod="openstack/glance-default-external-api-0" Oct 07 13:55:47 crc kubenswrapper[4854]: I1007 13:55:47.117064 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e598d141-22f5-4f79-8f6f-ca3f5051e894-config-data\") pod \"glance-default-external-api-0\" (UID: \"e598d141-22f5-4f79-8f6f-ca3f5051e894\") " pod="openstack/glance-default-external-api-0" Oct 07 13:55:47 crc kubenswrapper[4854]: I1007 13:55:47.117090 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/e598d141-22f5-4f79-8f6f-ca3f5051e894-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e598d141-22f5-4f79-8f6f-ca3f5051e894\") " pod="openstack/glance-default-external-api-0" Oct 07 13:55:47 crc kubenswrapper[4854]: I1007 13:55:47.117595 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e598d141-22f5-4f79-8f6f-ca3f5051e894-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e598d141-22f5-4f79-8f6f-ca3f5051e894\") " pod="openstack/glance-default-external-api-0" Oct 07 13:55:47 crc kubenswrapper[4854]: I1007 13:55:47.119881 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e598d141-22f5-4f79-8f6f-ca3f5051e894-ceph\") pod \"glance-default-external-api-0\" (UID: \"e598d141-22f5-4f79-8f6f-ca3f5051e894\") " pod="openstack/glance-default-external-api-0" Oct 07 13:55:47 crc kubenswrapper[4854]: I1007 13:55:47.120049 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e598d141-22f5-4f79-8f6f-ca3f5051e894-scripts\") pod \"glance-default-external-api-0\" (UID: \"e598d141-22f5-4f79-8f6f-ca3f5051e894\") " pod="openstack/glance-default-external-api-0" Oct 07 13:55:47 crc kubenswrapper[4854]: I1007 13:55:47.121232 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e598d141-22f5-4f79-8f6f-ca3f5051e894-config-data\") pod \"glance-default-external-api-0\" (UID: \"e598d141-22f5-4f79-8f6f-ca3f5051e894\") " pod="openstack/glance-default-external-api-0" Oct 07 13:55:47 crc kubenswrapper[4854]: I1007 13:55:47.121954 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e598d141-22f5-4f79-8f6f-ca3f5051e894-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e598d141-22f5-4f79-8f6f-ca3f5051e894\") " pod="openstack/glance-default-external-api-0" Oct 07 13:55:47 crc kubenswrapper[4854]: I1007 13:55:47.131063 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9j9z\" (UniqueName: \"kubernetes.io/projected/e598d141-22f5-4f79-8f6f-ca3f5051e894-kube-api-access-j9j9z\") pod \"glance-default-external-api-0\" (UID: \"e598d141-22f5-4f79-8f6f-ca3f5051e894\") " pod="openstack/glance-default-external-api-0" Oct 07 13:55:47 crc kubenswrapper[4854]: I1007 13:55:47.271283 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 13:55:47 crc kubenswrapper[4854]: I1007 13:55:47.345066 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 13:55:47 crc kubenswrapper[4854]: I1007 13:55:47.838302 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 13:55:47 crc kubenswrapper[4854]: W1007 13:55:47.863911 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode598d141_22f5_4f79_8f6f_ca3f5051e894.slice/crio-cbb3222cd52901334f071a9da5dee43576b9aba82ab3e08bc733a55f2bd28c6d WatchSource:0}: Error finding container cbb3222cd52901334f071a9da5dee43576b9aba82ab3e08bc733a55f2bd28c6d: Status 404 returned error can't find the container with id cbb3222cd52901334f071a9da5dee43576b9aba82ab3e08bc733a55f2bd28c6d Oct 07 13:55:48 crc kubenswrapper[4854]: I1007 13:55:48.723577 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="540ba45d-ad38-4582-995f-3cc8952f70b7" path="/var/lib/kubelet/pods/540ba45d-ad38-4582-995f-3cc8952f70b7/volumes" Oct 07 13:55:48 crc kubenswrapper[4854]: I1007 13:55:48.881467 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e598d141-22f5-4f79-8f6f-ca3f5051e894","Type":"ContainerStarted","Data":"b87e87cd1bf9464c9fcfc1bb9b5ffa1bc656a45581b40934961e33158b1b568e"} Oct 07 13:55:48 crc kubenswrapper[4854]: I1007 13:55:48.881520 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e598d141-22f5-4f79-8f6f-ca3f5051e894","Type":"ContainerStarted","Data":"cbb3222cd52901334f071a9da5dee43576b9aba82ab3e08bc733a55f2bd28c6d"} Oct 07 13:55:48 crc kubenswrapper[4854]: I1007 13:55:48.881612 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="7a05b143-d4bf-4a71-bcc6-ab90d97fddf1" containerName="glance-log" containerID="cri-o://5bd1297536f4907a7cdc9a7cfbd46cde9ec9f3d78109c49a6f9476188fe6f4eb" gracePeriod=30 Oct 07 13:55:48 crc kubenswrapper[4854]: I1007 13:55:48.881651 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="7a05b143-d4bf-4a71-bcc6-ab90d97fddf1" containerName="glance-httpd" containerID="cri-o://f18b8bd8acf8fd5801719f077a13db622c8fdb504024940331379a967b4c5cc5" gracePeriod=30 Oct 07 13:55:49 crc kubenswrapper[4854]: I1007 13:55:49.498568 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 13:55:49 crc kubenswrapper[4854]: I1007 13:55:49.567609 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a05b143-d4bf-4a71-bcc6-ab90d97fddf1-combined-ca-bundle\") pod \"7a05b143-d4bf-4a71-bcc6-ab90d97fddf1\" (UID: \"7a05b143-d4bf-4a71-bcc6-ab90d97fddf1\") " Oct 07 13:55:49 crc kubenswrapper[4854]: I1007 13:55:49.567698 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a05b143-d4bf-4a71-bcc6-ab90d97fddf1-config-data\") pod \"7a05b143-d4bf-4a71-bcc6-ab90d97fddf1\" (UID: \"7a05b143-d4bf-4a71-bcc6-ab90d97fddf1\") " Oct 07 13:55:49 crc kubenswrapper[4854]: I1007 13:55:49.567755 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdwts\" (UniqueName: \"kubernetes.io/projected/7a05b143-d4bf-4a71-bcc6-ab90d97fddf1-kube-api-access-sdwts\") pod \"7a05b143-d4bf-4a71-bcc6-ab90d97fddf1\" (UID: \"7a05b143-d4bf-4a71-bcc6-ab90d97fddf1\") " Oct 07 13:55:49 crc kubenswrapper[4854]: I1007 13:55:49.567781 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a05b143-d4bf-4a71-bcc6-ab90d97fddf1-scripts\") pod \"7a05b143-d4bf-4a71-bcc6-ab90d97fddf1\" (UID: \"7a05b143-d4bf-4a71-bcc6-ab90d97fddf1\") " Oct 07 13:55:49 crc kubenswrapper[4854]: I1007 13:55:49.567950 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7a05b143-d4bf-4a71-bcc6-ab90d97fddf1-httpd-run\") pod \"7a05b143-d4bf-4a71-bcc6-ab90d97fddf1\" (UID: \"7a05b143-d4bf-4a71-bcc6-ab90d97fddf1\") " Oct 07 13:55:49 crc kubenswrapper[4854]: I1007 13:55:49.567985 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7a05b143-d4bf-4a71-bcc6-ab90d97fddf1-ceph\") pod \"7a05b143-d4bf-4a71-bcc6-ab90d97fddf1\" (UID: \"7a05b143-d4bf-4a71-bcc6-ab90d97fddf1\") " Oct 07 13:55:49 crc kubenswrapper[4854]: I1007 13:55:49.568029 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a05b143-d4bf-4a71-bcc6-ab90d97fddf1-logs\") pod \"7a05b143-d4bf-4a71-bcc6-ab90d97fddf1\" (UID: \"7a05b143-d4bf-4a71-bcc6-ab90d97fddf1\") " Oct 07 13:55:49 crc kubenswrapper[4854]: I1007 13:55:49.568536 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a05b143-d4bf-4a71-bcc6-ab90d97fddf1-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7a05b143-d4bf-4a71-bcc6-ab90d97fddf1" (UID: "7a05b143-d4bf-4a71-bcc6-ab90d97fddf1"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:55:49 crc kubenswrapper[4854]: I1007 13:55:49.568766 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a05b143-d4bf-4a71-bcc6-ab90d97fddf1-logs" (OuterVolumeSpecName: "logs") pod "7a05b143-d4bf-4a71-bcc6-ab90d97fddf1" (UID: "7a05b143-d4bf-4a71-bcc6-ab90d97fddf1"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:55:49 crc kubenswrapper[4854]: I1007 13:55:49.572822 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a05b143-d4bf-4a71-bcc6-ab90d97fddf1-ceph" (OuterVolumeSpecName: "ceph") pod "7a05b143-d4bf-4a71-bcc6-ab90d97fddf1" (UID: "7a05b143-d4bf-4a71-bcc6-ab90d97fddf1"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:55:49 crc kubenswrapper[4854]: I1007 13:55:49.574366 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a05b143-d4bf-4a71-bcc6-ab90d97fddf1-kube-api-access-sdwts" (OuterVolumeSpecName: "kube-api-access-sdwts") pod "7a05b143-d4bf-4a71-bcc6-ab90d97fddf1" (UID: "7a05b143-d4bf-4a71-bcc6-ab90d97fddf1"). InnerVolumeSpecName "kube-api-access-sdwts". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:55:49 crc kubenswrapper[4854]: I1007 13:55:49.574962 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a05b143-d4bf-4a71-bcc6-ab90d97fddf1-scripts" (OuterVolumeSpecName: "scripts") pod "7a05b143-d4bf-4a71-bcc6-ab90d97fddf1" (UID: "7a05b143-d4bf-4a71-bcc6-ab90d97fddf1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:55:49 crc kubenswrapper[4854]: I1007 13:55:49.605943 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a05b143-d4bf-4a71-bcc6-ab90d97fddf1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7a05b143-d4bf-4a71-bcc6-ab90d97fddf1" (UID: "7a05b143-d4bf-4a71-bcc6-ab90d97fddf1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:55:49 crc kubenswrapper[4854]: I1007 13:55:49.615537 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a05b143-d4bf-4a71-bcc6-ab90d97fddf1-config-data" (OuterVolumeSpecName: "config-data") pod "7a05b143-d4bf-4a71-bcc6-ab90d97fddf1" (UID: "7a05b143-d4bf-4a71-bcc6-ab90d97fddf1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:55:49 crc kubenswrapper[4854]: I1007 13:55:49.677273 4854 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7a05b143-d4bf-4a71-bcc6-ab90d97fddf1-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 07 13:55:49 crc kubenswrapper[4854]: I1007 13:55:49.677308 4854 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7a05b143-d4bf-4a71-bcc6-ab90d97fddf1-ceph\") on node \"crc\" DevicePath \"\"" Oct 07 13:55:49 crc kubenswrapper[4854]: I1007 13:55:49.677316 4854 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a05b143-d4bf-4a71-bcc6-ab90d97fddf1-logs\") on node \"crc\" DevicePath \"\"" Oct 07 13:55:49 crc kubenswrapper[4854]: I1007 13:55:49.677324 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a05b143-d4bf-4a71-bcc6-ab90d97fddf1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:55:49 crc kubenswrapper[4854]: I1007 13:55:49.677334 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a05b143-d4bf-4a71-bcc6-ab90d97fddf1-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:55:49 crc kubenswrapper[4854]: I1007 13:55:49.677344 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdwts\" (UniqueName: \"kubernetes.io/projected/7a05b143-d4bf-4a71-bcc6-ab90d97fddf1-kube-api-access-sdwts\") on node \"crc\" DevicePath \"\"" Oct 07 13:55:49 crc kubenswrapper[4854]: I1007 13:55:49.677352 4854 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a05b143-d4bf-4a71-bcc6-ab90d97fddf1-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 13:55:49 crc kubenswrapper[4854]: I1007 13:55:49.890926 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e598d141-22f5-4f79-8f6f-ca3f5051e894","Type":"ContainerStarted","Data":"169ec0c025bfa7ffec1106bc9a13be9fbd77d7b8286f71799c28c7c03dc9a0eb"} Oct 07 13:55:49 crc kubenswrapper[4854]: I1007 13:55:49.893721 4854 generic.go:334] "Generic (PLEG): container finished" podID="7a05b143-d4bf-4a71-bcc6-ab90d97fddf1" containerID="f18b8bd8acf8fd5801719f077a13db622c8fdb504024940331379a967b4c5cc5" exitCode=0 Oct 07 13:55:49 crc kubenswrapper[4854]: I1007 13:55:49.893747 4854 generic.go:334] "Generic (PLEG): container finished" podID="7a05b143-d4bf-4a71-bcc6-ab90d97fddf1" containerID="5bd1297536f4907a7cdc9a7cfbd46cde9ec9f3d78109c49a6f9476188fe6f4eb" exitCode=143 Oct 07 13:55:49 crc kubenswrapper[4854]: I1007 13:55:49.893771 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7a05b143-d4bf-4a71-bcc6-ab90d97fddf1","Type":"ContainerDied","Data":"f18b8bd8acf8fd5801719f077a13db622c8fdb504024940331379a967b4c5cc5"} Oct 07 13:55:49 crc kubenswrapper[4854]: I1007 13:55:49.893835 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7a05b143-d4bf-4a71-bcc6-ab90d97fddf1","Type":"ContainerDied","Data":"5bd1297536f4907a7cdc9a7cfbd46cde9ec9f3d78109c49a6f9476188fe6f4eb"} Oct 07 13:55:49 crc kubenswrapper[4854]: I1007 13:55:49.893847 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"7a05b143-d4bf-4a71-bcc6-ab90d97fddf1","Type":"ContainerDied","Data":"3e6cde4fc69d9338b6329429f07ef3e8c67a1035042904ec914beb5e829bc85f"} Oct 07 13:55:49 crc kubenswrapper[4854]: I1007 13:55:49.893867 4854 scope.go:117] "RemoveContainer" containerID="f18b8bd8acf8fd5801719f077a13db622c8fdb504024940331379a967b4c5cc5" Oct 07 13:55:49 crc kubenswrapper[4854]: I1007 13:55:49.894122 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 13:55:49 crc kubenswrapper[4854]: I1007 13:55:49.915244 4854 scope.go:117] "RemoveContainer" containerID="5bd1297536f4907a7cdc9a7cfbd46cde9ec9f3d78109c49a6f9476188fe6f4eb" Oct 07 13:55:49 crc kubenswrapper[4854]: I1007 13:55:49.928257 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.928236143 podStartE2EDuration="3.928236143s" podCreationTimestamp="2025-10-07 13:55:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:55:49.914646125 +0000 UTC m=+5465.902478380" watchObservedRunningTime="2025-10-07 13:55:49.928236143 +0000 UTC m=+5465.916068398" Oct 07 13:55:49 crc kubenswrapper[4854]: I1007 13:55:49.934631 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 13:55:49 crc kubenswrapper[4854]: I1007 13:55:49.941812 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 13:55:49 crc kubenswrapper[4854]: I1007 13:55:49.943005 4854 scope.go:117] "RemoveContainer" containerID="f18b8bd8acf8fd5801719f077a13db622c8fdb504024940331379a967b4c5cc5" Oct 07 13:55:49 crc kubenswrapper[4854]: E1007 13:55:49.943523 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f18b8bd8acf8fd5801719f077a13db622c8fdb504024940331379a967b4c5cc5\": container with ID starting with f18b8bd8acf8fd5801719f077a13db622c8fdb504024940331379a967b4c5cc5 not found: ID does not exist" containerID="f18b8bd8acf8fd5801719f077a13db622c8fdb504024940331379a967b4c5cc5" Oct 07 13:55:49 crc kubenswrapper[4854]: I1007 13:55:49.943558 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f18b8bd8acf8fd5801719f077a13db622c8fdb504024940331379a967b4c5cc5"} err="failed to get container status \"f18b8bd8acf8fd5801719f077a13db622c8fdb504024940331379a967b4c5cc5\": rpc error: code = NotFound desc = could not find container \"f18b8bd8acf8fd5801719f077a13db622c8fdb504024940331379a967b4c5cc5\": container with ID starting with f18b8bd8acf8fd5801719f077a13db622c8fdb504024940331379a967b4c5cc5 not found: ID does not exist" Oct 07 13:55:49 crc kubenswrapper[4854]: I1007 13:55:49.943589 4854 scope.go:117] "RemoveContainer" containerID="5bd1297536f4907a7cdc9a7cfbd46cde9ec9f3d78109c49a6f9476188fe6f4eb" Oct 07 13:55:49 crc kubenswrapper[4854]: E1007 13:55:49.946381 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bd1297536f4907a7cdc9a7cfbd46cde9ec9f3d78109c49a6f9476188fe6f4eb\": container with ID starting with 5bd1297536f4907a7cdc9a7cfbd46cde9ec9f3d78109c49a6f9476188fe6f4eb not found: ID does not exist" containerID="5bd1297536f4907a7cdc9a7cfbd46cde9ec9f3d78109c49a6f9476188fe6f4eb" Oct 07 13:55:49 crc kubenswrapper[4854]: I1007 13:55:49.946437 4854 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bd1297536f4907a7cdc9a7cfbd46cde9ec9f3d78109c49a6f9476188fe6f4eb"} err="failed to get container status \"5bd1297536f4907a7cdc9a7cfbd46cde9ec9f3d78109c49a6f9476188fe6f4eb\": rpc error: code = NotFound desc = could not find container \"5bd1297536f4907a7cdc9a7cfbd46cde9ec9f3d78109c49a6f9476188fe6f4eb\": container with ID starting with 5bd1297536f4907a7cdc9a7cfbd46cde9ec9f3d78109c49a6f9476188fe6f4eb not found: ID does not exist" Oct 07 13:55:49 crc kubenswrapper[4854]: I1007 13:55:49.946469 4854 scope.go:117] "RemoveContainer" containerID="f18b8bd8acf8fd5801719f077a13db622c8fdb504024940331379a967b4c5cc5" Oct 07 13:55:49 crc kubenswrapper[4854]: I1007 13:55:49.947115 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f18b8bd8acf8fd5801719f077a13db622c8fdb504024940331379a967b4c5cc5"} err="failed to get container status \"f18b8bd8acf8fd5801719f077a13db622c8fdb504024940331379a967b4c5cc5\": rpc error: code = NotFound desc = could not find container \"f18b8bd8acf8fd5801719f077a13db622c8fdb504024940331379a967b4c5cc5\": container with ID starting with f18b8bd8acf8fd5801719f077a13db622c8fdb504024940331379a967b4c5cc5 not found: ID does not exist" Oct 07 13:55:49 crc kubenswrapper[4854]: I1007 13:55:49.947169 4854 scope.go:117] "RemoveContainer" containerID="5bd1297536f4907a7cdc9a7cfbd46cde9ec9f3d78109c49a6f9476188fe6f4eb" Oct 07 13:55:49 crc kubenswrapper[4854]: I1007 13:55:49.947568 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bd1297536f4907a7cdc9a7cfbd46cde9ec9f3d78109c49a6f9476188fe6f4eb"} err="failed to get container status \"5bd1297536f4907a7cdc9a7cfbd46cde9ec9f3d78109c49a6f9476188fe6f4eb\": rpc error: code = NotFound desc = could not find container \"5bd1297536f4907a7cdc9a7cfbd46cde9ec9f3d78109c49a6f9476188fe6f4eb\": container with ID starting with 5bd1297536f4907a7cdc9a7cfbd46cde9ec9f3d78109c49a6f9476188fe6f4eb not found: ID does not exist" Oct 07 13:55:49 crc kubenswrapper[4854]: I1007 13:55:49.958014 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 13:55:49 crc kubenswrapper[4854]: E1007 13:55:49.958412 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a05b143-d4bf-4a71-bcc6-ab90d97fddf1" containerName="glance-httpd" Oct 07 13:55:49 crc kubenswrapper[4854]: I1007 13:55:49.958429 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a05b143-d4bf-4a71-bcc6-ab90d97fddf1" containerName="glance-httpd" Oct 07 13:55:49 crc kubenswrapper[4854]: E1007 13:55:49.958447 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a05b143-d4bf-4a71-bcc6-ab90d97fddf1" containerName="glance-log" Oct 07 13:55:49 crc kubenswrapper[4854]: I1007 13:55:49.958455 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a05b143-d4bf-4a71-bcc6-ab90d97fddf1" containerName="glance-log" Oct 07 13:55:49 crc kubenswrapper[4854]: I1007 13:55:49.958604 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a05b143-d4bf-4a71-bcc6-ab90d97fddf1" containerName="glance-httpd" Oct 07 13:55:49 crc kubenswrapper[4854]: I1007 13:55:49.958627 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a05b143-d4bf-4a71-bcc6-ab90d97fddf1" containerName="glance-log" Oct 07 13:55:49 crc kubenswrapper[4854]: I1007 13:55:49.959477 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 13:55:49 crc kubenswrapper[4854]: I1007 13:55:49.963287 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 07 13:55:49 crc kubenswrapper[4854]: I1007 13:55:49.976825 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 13:55:50 crc kubenswrapper[4854]: I1007 13:55:50.085422 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e9c69fe-fbae-4bb3-9b45-99adc96401fb-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6e9c69fe-fbae-4bb3-9b45-99adc96401fb\") " pod="openstack/glance-default-internal-api-0" Oct 07 13:55:50 crc kubenswrapper[4854]: I1007 13:55:50.085479 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e9c69fe-fbae-4bb3-9b45-99adc96401fb-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6e9c69fe-fbae-4bb3-9b45-99adc96401fb\") " pod="openstack/glance-default-internal-api-0" Oct 07 13:55:50 crc kubenswrapper[4854]: I1007 13:55:50.085502 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e9c69fe-fbae-4bb3-9b45-99adc96401fb-logs\") pod \"glance-default-internal-api-0\" (UID: \"6e9c69fe-fbae-4bb3-9b45-99adc96401fb\") " pod="openstack/glance-default-internal-api-0" Oct 07 13:55:50 crc kubenswrapper[4854]: I1007 13:55:50.085536 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhtkd\" (UniqueName: \"kubernetes.io/projected/6e9c69fe-fbae-4bb3-9b45-99adc96401fb-kube-api-access-lhtkd\") pod \"glance-default-internal-api-0\" (UID: \"6e9c69fe-fbae-4bb3-9b45-99adc96401fb\") " pod="openstack/glance-default-internal-api-0" Oct 07 13:55:50 crc kubenswrapper[4854]: I1007 13:55:50.085557 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e9c69fe-fbae-4bb3-9b45-99adc96401fb-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6e9c69fe-fbae-4bb3-9b45-99adc96401fb\") " pod="openstack/glance-default-internal-api-0" Oct 07 13:55:50 crc kubenswrapper[4854]: I1007 13:55:50.085600 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6e9c69fe-fbae-4bb3-9b45-99adc96401fb-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6e9c69fe-fbae-4bb3-9b45-99adc96401fb\") " pod="openstack/glance-default-internal-api-0" Oct 07 13:55:50 crc kubenswrapper[4854]: I1007 13:55:50.085633 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6e9c69fe-fbae-4bb3-9b45-99adc96401fb-ceph\") pod \"glance-default-internal-api-0\" (UID: \"6e9c69fe-fbae-4bb3-9b45-99adc96401fb\") " pod="openstack/glance-default-internal-api-0" Oct 07 13:55:50 crc kubenswrapper[4854]: I1007 13:55:50.187116 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e9c69fe-fbae-4bb3-9b45-99adc96401fb-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6e9c69fe-fbae-4bb3-9b45-99adc96401fb\") 
" pod="openstack/glance-default-internal-api-0" Oct 07 13:55:50 crc kubenswrapper[4854]: I1007 13:55:50.187210 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e9c69fe-fbae-4bb3-9b45-99adc96401fb-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6e9c69fe-fbae-4bb3-9b45-99adc96401fb\") " pod="openstack/glance-default-internal-api-0" Oct 07 13:55:50 crc kubenswrapper[4854]: I1007 13:55:50.187239 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e9c69fe-fbae-4bb3-9b45-99adc96401fb-logs\") pod \"glance-default-internal-api-0\" (UID: \"6e9c69fe-fbae-4bb3-9b45-99adc96401fb\") " pod="openstack/glance-default-internal-api-0" Oct 07 13:55:50 crc kubenswrapper[4854]: I1007 13:55:50.187291 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhtkd\" (UniqueName: \"kubernetes.io/projected/6e9c69fe-fbae-4bb3-9b45-99adc96401fb-kube-api-access-lhtkd\") pod \"glance-default-internal-api-0\" (UID: \"6e9c69fe-fbae-4bb3-9b45-99adc96401fb\") " pod="openstack/glance-default-internal-api-0" Oct 07 13:55:50 crc kubenswrapper[4854]: I1007 13:55:50.187322 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e9c69fe-fbae-4bb3-9b45-99adc96401fb-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6e9c69fe-fbae-4bb3-9b45-99adc96401fb\") " pod="openstack/glance-default-internal-api-0" Oct 07 13:55:50 crc kubenswrapper[4854]: I1007 13:55:50.187384 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6e9c69fe-fbae-4bb3-9b45-99adc96401fb-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6e9c69fe-fbae-4bb3-9b45-99adc96401fb\") " pod="openstack/glance-default-internal-api-0" Oct 07 13:55:50 crc kubenswrapper[4854]: I1007 13:55:50.187431 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6e9c69fe-fbae-4bb3-9b45-99adc96401fb-ceph\") pod \"glance-default-internal-api-0\" (UID: \"6e9c69fe-fbae-4bb3-9b45-99adc96401fb\") " pod="openstack/glance-default-internal-api-0" Oct 07 13:55:50 crc kubenswrapper[4854]: I1007 13:55:50.188442 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e9c69fe-fbae-4bb3-9b45-99adc96401fb-logs\") pod \"glance-default-internal-api-0\" (UID: \"6e9c69fe-fbae-4bb3-9b45-99adc96401fb\") " pod="openstack/glance-default-internal-api-0" Oct 07 13:55:50 crc kubenswrapper[4854]: I1007 13:55:50.188960 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6e9c69fe-fbae-4bb3-9b45-99adc96401fb-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6e9c69fe-fbae-4bb3-9b45-99adc96401fb\") " pod="openstack/glance-default-internal-api-0" Oct 07 13:55:50 crc kubenswrapper[4854]: I1007 13:55:50.192764 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e9c69fe-fbae-4bb3-9b45-99adc96401fb-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6e9c69fe-fbae-4bb3-9b45-99adc96401fb\") " pod="openstack/glance-default-internal-api-0" Oct 07 13:55:50 crc kubenswrapper[4854]: I1007 13:55:50.193655 4854 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e9c69fe-fbae-4bb3-9b45-99adc96401fb-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6e9c69fe-fbae-4bb3-9b45-99adc96401fb\") " pod="openstack/glance-default-internal-api-0" Oct 07 13:55:50 crc kubenswrapper[4854]: I1007 13:55:50.195205 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6e9c69fe-fbae-4bb3-9b45-99adc96401fb-ceph\") pod \"glance-default-internal-api-0\" (UID: \"6e9c69fe-fbae-4bb3-9b45-99adc96401fb\") " pod="openstack/glance-default-internal-api-0" Oct 07 13:55:50 crc kubenswrapper[4854]: I1007 13:55:50.196774 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e9c69fe-fbae-4bb3-9b45-99adc96401fb-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6e9c69fe-fbae-4bb3-9b45-99adc96401fb\") " pod="openstack/glance-default-internal-api-0" Oct 07 13:55:50 crc kubenswrapper[4854]: I1007 13:55:50.209024 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhtkd\" (UniqueName: \"kubernetes.io/projected/6e9c69fe-fbae-4bb3-9b45-99adc96401fb-kube-api-access-lhtkd\") pod \"glance-default-internal-api-0\" (UID: \"6e9c69fe-fbae-4bb3-9b45-99adc96401fb\") " pod="openstack/glance-default-internal-api-0" Oct 07 13:55:50 crc kubenswrapper[4854]: I1007 13:55:50.285487 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 13:55:50 crc kubenswrapper[4854]: I1007 13:55:50.741258 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a05b143-d4bf-4a71-bcc6-ab90d97fddf1" path="/var/lib/kubelet/pods/7a05b143-d4bf-4a71-bcc6-ab90d97fddf1/volumes" Oct 07 13:55:50 crc kubenswrapper[4854]: I1007 13:55:50.830266 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 13:55:50 crc kubenswrapper[4854]: W1007 13:55:50.836818 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e9c69fe_fbae_4bb3_9b45_99adc96401fb.slice/crio-8a6c8c31589d0c20774cc13ea1acef279ba367856b90af62b5dc865aff9e128c WatchSource:0}: Error finding container 8a6c8c31589d0c20774cc13ea1acef279ba367856b90af62b5dc865aff9e128c: Status 404 returned error can't find the container with id 8a6c8c31589d0c20774cc13ea1acef279ba367856b90af62b5dc865aff9e128c Oct 07 13:55:50 crc kubenswrapper[4854]: I1007 13:55:50.904997 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6e9c69fe-fbae-4bb3-9b45-99adc96401fb","Type":"ContainerStarted","Data":"8a6c8c31589d0c20774cc13ea1acef279ba367856b90af62b5dc865aff9e128c"} Oct 07 13:55:51 crc kubenswrapper[4854]: I1007 13:55:51.916278 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6e9c69fe-fbae-4bb3-9b45-99adc96401fb","Type":"ContainerStarted","Data":"5650f623e058e72a787570039c9691294f481f5d28b04702b86e50542b55c0e7"} Oct 07 13:55:52 crc kubenswrapper[4854]: I1007 13:55:52.929536 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6e9c69fe-fbae-4bb3-9b45-99adc96401fb","Type":"ContainerStarted","Data":"b86f007cec90b15b2a5e447fdd890d97dcf68481f31dcd42468995c12137b3e6"} Oct 07 13:55:52 crc 
kubenswrapper[4854]: I1007 13:55:52.967488 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.967463624 podStartE2EDuration="3.967463624s" podCreationTimestamp="2025-10-07 13:55:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:55:52.950563691 +0000 UTC m=+5468.938395966" watchObservedRunningTime="2025-10-07 13:55:52.967463624 +0000 UTC m=+5468.955295889" Oct 07 13:55:53 crc kubenswrapper[4854]: I1007 13:55:53.469375 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-574bb48d57-zvm4r" Oct 07 13:55:53 crc kubenswrapper[4854]: I1007 13:55:53.555057 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65cd69c4d9-w5szp"] Oct 07 13:55:53 crc kubenswrapper[4854]: I1007 13:55:53.555310 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-65cd69c4d9-w5szp" podUID="107eee8d-eb8a-4e06-973c-92521472ce75" containerName="dnsmasq-dns" containerID="cri-o://b6a4459a1bcbe5a5ff513fe9ee90698e65a2ab75c621c24c3c7f733da0329897" gracePeriod=10 Oct 07 13:55:53 crc kubenswrapper[4854]: I1007 13:55:53.702219 4854 scope.go:117] "RemoveContainer" containerID="8f3095d8e8fcb8dde419577c13f52762f3d4a6c040f7266414571999ecd4046e" Oct 07 13:55:53 crc kubenswrapper[4854]: E1007 13:55:53.702788 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:55:53 crc kubenswrapper[4854]: I1007 13:55:53.946688 4854 generic.go:334] "Generic (PLEG): container finished" podID="107eee8d-eb8a-4e06-973c-92521472ce75" containerID="b6a4459a1bcbe5a5ff513fe9ee90698e65a2ab75c621c24c3c7f733da0329897" exitCode=0 Oct 07 13:55:53 crc kubenswrapper[4854]: I1007 13:55:53.946838 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65cd69c4d9-w5szp" event={"ID":"107eee8d-eb8a-4e06-973c-92521472ce75","Type":"ContainerDied","Data":"b6a4459a1bcbe5a5ff513fe9ee90698e65a2ab75c621c24c3c7f733da0329897"} Oct 07 13:55:54 crc kubenswrapper[4854]: I1007 13:55:54.046955 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65cd69c4d9-w5szp" Oct 07 13:55:54 crc kubenswrapper[4854]: I1007 13:55:54.152639 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/107eee8d-eb8a-4e06-973c-92521472ce75-dns-svc\") pod \"107eee8d-eb8a-4e06-973c-92521472ce75\" (UID: \"107eee8d-eb8a-4e06-973c-92521472ce75\") " Oct 07 13:55:54 crc kubenswrapper[4854]: I1007 13:55:54.152806 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/107eee8d-eb8a-4e06-973c-92521472ce75-config\") pod \"107eee8d-eb8a-4e06-973c-92521472ce75\" (UID: \"107eee8d-eb8a-4e06-973c-92521472ce75\") " Oct 07 13:55:54 crc kubenswrapper[4854]: I1007 13:55:54.152900 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/107eee8d-eb8a-4e06-973c-92521472ce75-ovsdbserver-nb\") pod \"107eee8d-eb8a-4e06-973c-92521472ce75\" (UID: \"107eee8d-eb8a-4e06-973c-92521472ce75\") " Oct 07 13:55:54 crc kubenswrapper[4854]: I1007 13:55:54.152924 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtkjn\" (UniqueName: \"kubernetes.io/projected/107eee8d-eb8a-4e06-973c-92521472ce75-kube-api-access-mtkjn\") pod \"107eee8d-eb8a-4e06-973c-92521472ce75\" (UID: \"107eee8d-eb8a-4e06-973c-92521472ce75\") " Oct 07 13:55:54 crc kubenswrapper[4854]: I1007 13:55:54.152981 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/107eee8d-eb8a-4e06-973c-92521472ce75-ovsdbserver-sb\") pod \"107eee8d-eb8a-4e06-973c-92521472ce75\" (UID: \"107eee8d-eb8a-4e06-973c-92521472ce75\") " Oct 07 13:55:54 crc kubenswrapper[4854]: I1007 13:55:54.162552 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/107eee8d-eb8a-4e06-973c-92521472ce75-kube-api-access-mtkjn" (OuterVolumeSpecName: "kube-api-access-mtkjn") pod "107eee8d-eb8a-4e06-973c-92521472ce75" (UID: "107eee8d-eb8a-4e06-973c-92521472ce75"). InnerVolumeSpecName "kube-api-access-mtkjn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:55:54 crc kubenswrapper[4854]: I1007 13:55:54.212773 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/107eee8d-eb8a-4e06-973c-92521472ce75-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "107eee8d-eb8a-4e06-973c-92521472ce75" (UID: "107eee8d-eb8a-4e06-973c-92521472ce75"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:55:54 crc kubenswrapper[4854]: I1007 13:55:54.216219 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/107eee8d-eb8a-4e06-973c-92521472ce75-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "107eee8d-eb8a-4e06-973c-92521472ce75" (UID: "107eee8d-eb8a-4e06-973c-92521472ce75"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:55:54 crc kubenswrapper[4854]: I1007 13:55:54.218618 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/107eee8d-eb8a-4e06-973c-92521472ce75-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "107eee8d-eb8a-4e06-973c-92521472ce75" (UID: "107eee8d-eb8a-4e06-973c-92521472ce75"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:55:54 crc kubenswrapper[4854]: I1007 13:55:54.223734 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/107eee8d-eb8a-4e06-973c-92521472ce75-config" (OuterVolumeSpecName: "config") pod "107eee8d-eb8a-4e06-973c-92521472ce75" (UID: "107eee8d-eb8a-4e06-973c-92521472ce75"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:55:54 crc kubenswrapper[4854]: I1007 13:55:54.255432 4854 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/107eee8d-eb8a-4e06-973c-92521472ce75-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:55:54 crc kubenswrapper[4854]: I1007 13:55:54.255461 4854 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/107eee8d-eb8a-4e06-973c-92521472ce75-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 13:55:54 crc kubenswrapper[4854]: I1007 13:55:54.255474 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtkjn\" (UniqueName: \"kubernetes.io/projected/107eee8d-eb8a-4e06-973c-92521472ce75-kube-api-access-mtkjn\") on node \"crc\" DevicePath \"\"" Oct 07 13:55:54 crc kubenswrapper[4854]: I1007 13:55:54.255483 4854 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/107eee8d-eb8a-4e06-973c-92521472ce75-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 13:55:54 crc kubenswrapper[4854]: I1007 13:55:54.255491 4854 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/107eee8d-eb8a-4e06-973c-92521472ce75-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 13:55:54 crc kubenswrapper[4854]: I1007 13:55:54.958960 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65cd69c4d9-w5szp" event={"ID":"107eee8d-eb8a-4e06-973c-92521472ce75","Type":"ContainerDied","Data":"f8e19b59e326fbbc2f5cc236b8df9292738b907d8633d1067ac594893545c29c"} Oct 07 13:55:54 crc kubenswrapper[4854]: I1007 13:55:54.959104 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65cd69c4d9-w5szp" Oct 07 13:55:54 crc kubenswrapper[4854]: I1007 13:55:54.959521 4854 scope.go:117] "RemoveContainer" containerID="b6a4459a1bcbe5a5ff513fe9ee90698e65a2ab75c621c24c3c7f733da0329897" Oct 07 13:55:54 crc kubenswrapper[4854]: I1007 13:55:54.987885 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65cd69c4d9-w5szp"] Oct 07 13:55:54 crc kubenswrapper[4854]: I1007 13:55:54.990701 4854 scope.go:117] "RemoveContainer" containerID="619c69f8dd1682d3820a73b821bd9820740d2977dc4c4c8907b03ded28b4f1ba" Oct 07 13:55:54 crc kubenswrapper[4854]: I1007 13:55:54.997541 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-65cd69c4d9-w5szp"] Oct 07 13:55:56 crc kubenswrapper[4854]: I1007 13:55:56.718451 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="107eee8d-eb8a-4e06-973c-92521472ce75" path="/var/lib/kubelet/pods/107eee8d-eb8a-4e06-973c-92521472ce75/volumes" Oct 07 13:55:57 crc kubenswrapper[4854]: I1007 13:55:57.271707 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 07 13:55:57 crc kubenswrapper[4854]: I1007 13:55:57.271752 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 07 13:55:57 crc kubenswrapper[4854]: I1007 13:55:57.298227 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 07 13:55:57 crc kubenswrapper[4854]: I1007 13:55:57.308076 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 07 13:55:57 crc kubenswrapper[4854]: I1007 13:55:57.989561 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 07 13:55:57 crc kubenswrapper[4854]: I1007 13:55:57.989639 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 07 13:55:59 crc kubenswrapper[4854]: I1007 13:55:59.842997 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 07 13:55:59 crc kubenswrapper[4854]: I1007 13:55:59.972232 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 07 13:56:00 crc kubenswrapper[4854]: I1007 13:56:00.285840 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 07 13:56:00 crc kubenswrapper[4854]: I1007 13:56:00.285890 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 07 13:56:00 crc kubenswrapper[4854]: I1007 13:56:00.323730 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 07 13:56:00 crc kubenswrapper[4854]: I1007 13:56:00.343444 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 07 13:56:01 crc kubenswrapper[4854]: I1007 13:56:01.040060 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 07 13:56:01 crc kubenswrapper[4854]: I1007 13:56:01.040135 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/glance-default-internal-api-0" Oct 07 13:56:02 crc kubenswrapper[4854]: I1007 13:56:02.877560 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 07 13:56:02 crc kubenswrapper[4854]: I1007 13:56:02.896707 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 07 13:56:06 crc kubenswrapper[4854]: I1007 13:56:06.703223 4854 scope.go:117] "RemoveContainer" containerID="8f3095d8e8fcb8dde419577c13f52762f3d4a6c040f7266414571999ecd4046e" Oct 07 13:56:06 crc kubenswrapper[4854]: E1007 13:56:06.704001 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:56:11 crc kubenswrapper[4854]: I1007 13:56:11.710231 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-58fb9"] Oct 07 13:56:11 crc kubenswrapper[4854]: E1007 13:56:11.711255 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="107eee8d-eb8a-4e06-973c-92521472ce75" containerName="init" Oct 07 13:56:11 crc kubenswrapper[4854]: I1007 13:56:11.711273 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="107eee8d-eb8a-4e06-973c-92521472ce75" containerName="init" Oct 07 13:56:11 crc kubenswrapper[4854]: E1007 13:56:11.711310 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="107eee8d-eb8a-4e06-973c-92521472ce75" containerName="dnsmasq-dns" Oct 07 13:56:11 crc kubenswrapper[4854]: I1007 13:56:11.711339 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="107eee8d-eb8a-4e06-973c-92521472ce75" containerName="dnsmasq-dns" Oct 07 13:56:11 crc kubenswrapper[4854]: I1007 13:56:11.711596 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="107eee8d-eb8a-4e06-973c-92521472ce75" containerName="dnsmasq-dns" Oct 07 13:56:11 crc kubenswrapper[4854]: I1007 13:56:11.712588 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-58fb9" Oct 07 13:56:11 crc kubenswrapper[4854]: I1007 13:56:11.722777 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-58fb9"] Oct 07 13:56:11 crc kubenswrapper[4854]: I1007 13:56:11.796598 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4m56\" (UniqueName: \"kubernetes.io/projected/32f4a848-c854-4263-98be-8d6ac718eff6-kube-api-access-c4m56\") pod \"placement-db-create-58fb9\" (UID: \"32f4a848-c854-4263-98be-8d6ac718eff6\") " pod="openstack/placement-db-create-58fb9" Oct 07 13:56:11 crc kubenswrapper[4854]: I1007 13:56:11.898491 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4m56\" (UniqueName: \"kubernetes.io/projected/32f4a848-c854-4263-98be-8d6ac718eff6-kube-api-access-c4m56\") pod \"placement-db-create-58fb9\" (UID: \"32f4a848-c854-4263-98be-8d6ac718eff6\") " pod="openstack/placement-db-create-58fb9" Oct 07 13:56:11 crc kubenswrapper[4854]: I1007 13:56:11.922499 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4m56\" (UniqueName: \"kubernetes.io/projected/32f4a848-c854-4263-98be-8d6ac718eff6-kube-api-access-c4m56\") pod \"placement-db-create-58fb9\" (UID: \"32f4a848-c854-4263-98be-8d6ac718eff6\") " pod="openstack/placement-db-create-58fb9" Oct 07 13:56:12 crc kubenswrapper[4854]: I1007 13:56:12.042722 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-58fb9" Oct 07 13:56:12 crc kubenswrapper[4854]: I1007 13:56:12.373127 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-58fb9"] Oct 07 13:56:12 crc kubenswrapper[4854]: W1007 13:56:12.376285 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32f4a848_c854_4263_98be_8d6ac718eff6.slice/crio-f703f4dbd1e4cfb5e36e86d01d4825aa448d0543b44fa3380001b34953ee10e1 WatchSource:0}: Error finding container f703f4dbd1e4cfb5e36e86d01d4825aa448d0543b44fa3380001b34953ee10e1: Status 404 returned error can't find the container with id f703f4dbd1e4cfb5e36e86d01d4825aa448d0543b44fa3380001b34953ee10e1 Oct 07 13:56:13 crc kubenswrapper[4854]: I1007 13:56:13.171913 4854 generic.go:334] "Generic (PLEG): container finished" podID="32f4a848-c854-4263-98be-8d6ac718eff6" containerID="4e44f53ef93e55b13229f45f8c6a7879f7ee6658377eb002d71600ec998e067c" exitCode=0 Oct 07 13:56:13 crc kubenswrapper[4854]: I1007 13:56:13.171961 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-58fb9" event={"ID":"32f4a848-c854-4263-98be-8d6ac718eff6","Type":"ContainerDied","Data":"4e44f53ef93e55b13229f45f8c6a7879f7ee6658377eb002d71600ec998e067c"} Oct 07 13:56:13 crc kubenswrapper[4854]: I1007 13:56:13.171994 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-58fb9" event={"ID":"32f4a848-c854-4263-98be-8d6ac718eff6","Type":"ContainerStarted","Data":"f703f4dbd1e4cfb5e36e86d01d4825aa448d0543b44fa3380001b34953ee10e1"} Oct 07 13:56:14 crc kubenswrapper[4854]: I1007 13:56:14.582921 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-58fb9" Oct 07 13:56:14 crc kubenswrapper[4854]: I1007 13:56:14.656963 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4m56\" (UniqueName: \"kubernetes.io/projected/32f4a848-c854-4263-98be-8d6ac718eff6-kube-api-access-c4m56\") pod \"32f4a848-c854-4263-98be-8d6ac718eff6\" (UID: \"32f4a848-c854-4263-98be-8d6ac718eff6\") " Oct 07 13:56:14 crc kubenswrapper[4854]: I1007 13:56:14.666203 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32f4a848-c854-4263-98be-8d6ac718eff6-kube-api-access-c4m56" (OuterVolumeSpecName: "kube-api-access-c4m56") pod "32f4a848-c854-4263-98be-8d6ac718eff6" (UID: "32f4a848-c854-4263-98be-8d6ac718eff6"). InnerVolumeSpecName "kube-api-access-c4m56". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:56:14 crc kubenswrapper[4854]: I1007 13:56:14.759280 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4m56\" (UniqueName: \"kubernetes.io/projected/32f4a848-c854-4263-98be-8d6ac718eff6-kube-api-access-c4m56\") on node \"crc\" DevicePath \"\"" Oct 07 13:56:15 crc kubenswrapper[4854]: I1007 13:56:15.197508 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-58fb9" event={"ID":"32f4a848-c854-4263-98be-8d6ac718eff6","Type":"ContainerDied","Data":"f703f4dbd1e4cfb5e36e86d01d4825aa448d0543b44fa3380001b34953ee10e1"} Oct 07 13:56:15 crc kubenswrapper[4854]: I1007 13:56:15.197567 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f703f4dbd1e4cfb5e36e86d01d4825aa448d0543b44fa3380001b34953ee10e1" Oct 07 13:56:15 crc kubenswrapper[4854]: I1007 13:56:15.197652 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-58fb9" Oct 07 13:56:18 crc kubenswrapper[4854]: I1007 13:56:18.703794 4854 scope.go:117] "RemoveContainer" containerID="8f3095d8e8fcb8dde419577c13f52762f3d4a6c040f7266414571999ecd4046e" Oct 07 13:56:18 crc kubenswrapper[4854]: E1007 13:56:18.704697 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:56:21 crc kubenswrapper[4854]: I1007 13:56:21.771508 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-336c-account-create-679lq"] Oct 07 13:56:21 crc kubenswrapper[4854]: E1007 13:56:21.772498 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32f4a848-c854-4263-98be-8d6ac718eff6" containerName="mariadb-database-create" Oct 07 13:56:21 crc kubenswrapper[4854]: I1007 13:56:21.772520 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="32f4a848-c854-4263-98be-8d6ac718eff6" containerName="mariadb-database-create" Oct 07 13:56:21 crc kubenswrapper[4854]: I1007 13:56:21.772836 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="32f4a848-c854-4263-98be-8d6ac718eff6" containerName="mariadb-database-create" Oct 07 13:56:21 crc kubenswrapper[4854]: I1007 13:56:21.774073 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-336c-account-create-679lq" Oct 07 13:56:21 crc kubenswrapper[4854]: I1007 13:56:21.780963 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-336c-account-create-679lq"] Oct 07 13:56:21 crc kubenswrapper[4854]: I1007 13:56:21.785591 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Oct 07 13:56:21 crc kubenswrapper[4854]: I1007 13:56:21.903039 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkfxh\" (UniqueName: \"kubernetes.io/projected/8de4cb82-f1ed-484b-ad54-53bab54436c6-kube-api-access-pkfxh\") pod \"placement-336c-account-create-679lq\" (UID: \"8de4cb82-f1ed-484b-ad54-53bab54436c6\") " pod="openstack/placement-336c-account-create-679lq" Oct 07 13:56:21 crc kubenswrapper[4854]: I1007 13:56:21.966787 4854 scope.go:117] "RemoveContainer" containerID="5d7c695acd025dac0a728ba21da2df3779cd37281da99052676fc8850d8306e6" Oct 07 13:56:21 crc kubenswrapper[4854]: I1007 13:56:21.989097 4854 scope.go:117] "RemoveContainer" containerID="f64584e039f9e3bb3a5480e3f5d3e5fa7f8ee5cf1043284bdd5f2a0ae210f71f" Oct 07 13:56:22 crc kubenswrapper[4854]: I1007 13:56:22.006857 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkfxh\" (UniqueName: \"kubernetes.io/projected/8de4cb82-f1ed-484b-ad54-53bab54436c6-kube-api-access-pkfxh\") pod \"placement-336c-account-create-679lq\" (UID: \"8de4cb82-f1ed-484b-ad54-53bab54436c6\") " pod="openstack/placement-336c-account-create-679lq" Oct 07 13:56:22 crc kubenswrapper[4854]: I1007 13:56:22.035411 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkfxh\" (UniqueName: \"kubernetes.io/projected/8de4cb82-f1ed-484b-ad54-53bab54436c6-kube-api-access-pkfxh\") pod \"placement-336c-account-create-679lq\" (UID: \"8de4cb82-f1ed-484b-ad54-53bab54436c6\") " pod="openstack/placement-336c-account-create-679lq" Oct 07 13:56:22 crc kubenswrapper[4854]: I1007 13:56:22.097298 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-336c-account-create-679lq" Oct 07 13:56:22 crc kubenswrapper[4854]: I1007 13:56:22.363367 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-336c-account-create-679lq"] Oct 07 13:56:22 crc kubenswrapper[4854]: W1007 13:56:22.367357 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8de4cb82_f1ed_484b_ad54_53bab54436c6.slice/crio-0d3d444195dfa0a470982a2e47dbf91a045103b079cf0e28732fd7760cea14b8 WatchSource:0}: Error finding container 0d3d444195dfa0a470982a2e47dbf91a045103b079cf0e28732fd7760cea14b8: Status 404 returned error can't find the container with id 0d3d444195dfa0a470982a2e47dbf91a045103b079cf0e28732fd7760cea14b8 Oct 07 13:56:23 crc kubenswrapper[4854]: I1007 13:56:23.285573 4854 generic.go:334] "Generic (PLEG): container finished" podID="8de4cb82-f1ed-484b-ad54-53bab54436c6" containerID="fb33b8b9f7f2fe3595900494c9d909bc664f86175b0a618f1b1e0254a4c8ebda" exitCode=0 Oct 07 13:56:23 crc kubenswrapper[4854]: I1007 13:56:23.285672 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-336c-account-create-679lq" event={"ID":"8de4cb82-f1ed-484b-ad54-53bab54436c6","Type":"ContainerDied","Data":"fb33b8b9f7f2fe3595900494c9d909bc664f86175b0a618f1b1e0254a4c8ebda"} Oct 07 13:56:23 crc kubenswrapper[4854]: I1007 13:56:23.286008 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-336c-account-create-679lq" event={"ID":"8de4cb82-f1ed-484b-ad54-53bab54436c6","Type":"ContainerStarted","Data":"0d3d444195dfa0a470982a2e47dbf91a045103b079cf0e28732fd7760cea14b8"} Oct 07 13:56:24 crc kubenswrapper[4854]: I1007 13:56:24.711859 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-336c-account-create-679lq" Oct 07 13:56:24 crc kubenswrapper[4854]: I1007 13:56:24.867323 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkfxh\" (UniqueName: \"kubernetes.io/projected/8de4cb82-f1ed-484b-ad54-53bab54436c6-kube-api-access-pkfxh\") pod \"8de4cb82-f1ed-484b-ad54-53bab54436c6\" (UID: \"8de4cb82-f1ed-484b-ad54-53bab54436c6\") " Oct 07 13:56:24 crc kubenswrapper[4854]: I1007 13:56:24.874536 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8de4cb82-f1ed-484b-ad54-53bab54436c6-kube-api-access-pkfxh" (OuterVolumeSpecName: "kube-api-access-pkfxh") pod "8de4cb82-f1ed-484b-ad54-53bab54436c6" (UID: "8de4cb82-f1ed-484b-ad54-53bab54436c6"). InnerVolumeSpecName "kube-api-access-pkfxh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:56:24 crc kubenswrapper[4854]: I1007 13:56:24.970825 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkfxh\" (UniqueName: \"kubernetes.io/projected/8de4cb82-f1ed-484b-ad54-53bab54436c6-kube-api-access-pkfxh\") on node \"crc\" DevicePath \"\"" Oct 07 13:56:25 crc kubenswrapper[4854]: I1007 13:56:25.311539 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-336c-account-create-679lq" event={"ID":"8de4cb82-f1ed-484b-ad54-53bab54436c6","Type":"ContainerDied","Data":"0d3d444195dfa0a470982a2e47dbf91a045103b079cf0e28732fd7760cea14b8"} Oct 07 13:56:25 crc kubenswrapper[4854]: I1007 13:56:25.311640 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d3d444195dfa0a470982a2e47dbf91a045103b079cf0e28732fd7760cea14b8" Oct 07 13:56:25 crc kubenswrapper[4854]: I1007 13:56:25.311647 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-336c-account-create-679lq" Oct 07 13:56:27 crc kubenswrapper[4854]: I1007 13:56:27.199430 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7dfbc9b597-t66bm"] Oct 07 13:56:27 crc kubenswrapper[4854]: E1007 13:56:27.200221 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8de4cb82-f1ed-484b-ad54-53bab54436c6" containerName="mariadb-account-create" Oct 07 13:56:27 crc kubenswrapper[4854]: I1007 13:56:27.200238 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="8de4cb82-f1ed-484b-ad54-53bab54436c6" containerName="mariadb-account-create" Oct 07 13:56:27 crc kubenswrapper[4854]: I1007 13:56:27.213604 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="8de4cb82-f1ed-484b-ad54-53bab54436c6" containerName="mariadb-account-create" Oct 07 13:56:27 crc kubenswrapper[4854]: I1007 13:56:27.214507 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7dfbc9b597-t66bm"] Oct 07 13:56:27 crc kubenswrapper[4854]: I1007 13:56:27.214592 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7dfbc9b597-t66bm" Oct 07 13:56:27 crc kubenswrapper[4854]: I1007 13:56:27.241102 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-7g4fc"] Oct 07 13:56:27 crc kubenswrapper[4854]: I1007 13:56:27.242472 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-7g4fc" Oct 07 13:56:27 crc kubenswrapper[4854]: I1007 13:56:27.245688 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-g56mn" Oct 07 13:56:27 crc kubenswrapper[4854]: I1007 13:56:27.246442 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 07 13:56:27 crc kubenswrapper[4854]: I1007 13:56:27.250502 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 07 13:56:27 crc kubenswrapper[4854]: I1007 13:56:27.254367 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-7g4fc"] Oct 07 13:56:27 crc kubenswrapper[4854]: I1007 13:56:27.327532 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9309351f-1023-49a9-b6ff-205232665c04-dns-svc\") pod \"dnsmasq-dns-7dfbc9b597-t66bm\" (UID: \"9309351f-1023-49a9-b6ff-205232665c04\") " pod="openstack/dnsmasq-dns-7dfbc9b597-t66bm" Oct 07 13:56:27 crc kubenswrapper[4854]: I1007 13:56:27.327592 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9309351f-1023-49a9-b6ff-205232665c04-ovsdbserver-nb\") pod \"dnsmasq-dns-7dfbc9b597-t66bm\" (UID: \"9309351f-1023-49a9-b6ff-205232665c04\") " pod="openstack/dnsmasq-dns-7dfbc9b597-t66bm" Oct 07 13:56:27 crc kubenswrapper[4854]: I1007 13:56:27.327622 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9309351f-1023-49a9-b6ff-205232665c04-config\") pod \"dnsmasq-dns-7dfbc9b597-t66bm\" (UID: \"9309351f-1023-49a9-b6ff-205232665c04\") " pod="openstack/dnsmasq-dns-7dfbc9b597-t66bm" Oct 07 13:56:27 crc kubenswrapper[4854]: I1007 13:56:27.327708 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrnws\" (UniqueName: \"kubernetes.io/projected/9309351f-1023-49a9-b6ff-205232665c04-kube-api-access-rrnws\") pod \"dnsmasq-dns-7dfbc9b597-t66bm\" (UID: \"9309351f-1023-49a9-b6ff-205232665c04\") " pod="openstack/dnsmasq-dns-7dfbc9b597-t66bm" Oct 07 13:56:27 crc kubenswrapper[4854]: I1007 13:56:27.327922 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9309351f-1023-49a9-b6ff-205232665c04-ovsdbserver-sb\") pod \"dnsmasq-dns-7dfbc9b597-t66bm\" (UID: \"9309351f-1023-49a9-b6ff-205232665c04\") " pod="openstack/dnsmasq-dns-7dfbc9b597-t66bm" Oct 07 13:56:27 crc kubenswrapper[4854]: I1007 13:56:27.429851 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9t989\" (UniqueName: \"kubernetes.io/projected/a15fef45-0955-4710-9c77-a73aea90e94a-kube-api-access-9t989\") pod \"placement-db-sync-7g4fc\" (UID: \"a15fef45-0955-4710-9c77-a73aea90e94a\") " pod="openstack/placement-db-sync-7g4fc" Oct 07 13:56:27 crc kubenswrapper[4854]: I1007 13:56:27.429935 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9309351f-1023-49a9-b6ff-205232665c04-ovsdbserver-sb\") pod \"dnsmasq-dns-7dfbc9b597-t66bm\" (UID: \"9309351f-1023-49a9-b6ff-205232665c04\") " 
pod="openstack/dnsmasq-dns-7dfbc9b597-t66bm" Oct 07 13:56:27 crc kubenswrapper[4854]: I1007 13:56:27.429978 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a15fef45-0955-4710-9c77-a73aea90e94a-combined-ca-bundle\") pod \"placement-db-sync-7g4fc\" (UID: \"a15fef45-0955-4710-9c77-a73aea90e94a\") " pod="openstack/placement-db-sync-7g4fc" Oct 07 13:56:27 crc kubenswrapper[4854]: I1007 13:56:27.430012 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9309351f-1023-49a9-b6ff-205232665c04-dns-svc\") pod \"dnsmasq-dns-7dfbc9b597-t66bm\" (UID: \"9309351f-1023-49a9-b6ff-205232665c04\") " pod="openstack/dnsmasq-dns-7dfbc9b597-t66bm" Oct 07 13:56:27 crc kubenswrapper[4854]: I1007 13:56:27.430030 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a15fef45-0955-4710-9c77-a73aea90e94a-logs\") pod \"placement-db-sync-7g4fc\" (UID: \"a15fef45-0955-4710-9c77-a73aea90e94a\") " pod="openstack/placement-db-sync-7g4fc" Oct 07 13:56:27 crc kubenswrapper[4854]: I1007 13:56:27.430063 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a15fef45-0955-4710-9c77-a73aea90e94a-config-data\") pod \"placement-db-sync-7g4fc\" (UID: \"a15fef45-0955-4710-9c77-a73aea90e94a\") " pod="openstack/placement-db-sync-7g4fc" Oct 07 13:56:27 crc kubenswrapper[4854]: I1007 13:56:27.430082 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9309351f-1023-49a9-b6ff-205232665c04-ovsdbserver-nb\") pod \"dnsmasq-dns-7dfbc9b597-t66bm\" (UID: \"9309351f-1023-49a9-b6ff-205232665c04\") " pod="openstack/dnsmasq-dns-7dfbc9b597-t66bm" Oct 07 13:56:27 crc kubenswrapper[4854]: I1007 13:56:27.430109 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9309351f-1023-49a9-b6ff-205232665c04-config\") pod \"dnsmasq-dns-7dfbc9b597-t66bm\" (UID: \"9309351f-1023-49a9-b6ff-205232665c04\") " pod="openstack/dnsmasq-dns-7dfbc9b597-t66bm" Oct 07 13:56:27 crc kubenswrapper[4854]: I1007 13:56:27.430131 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrnws\" (UniqueName: \"kubernetes.io/projected/9309351f-1023-49a9-b6ff-205232665c04-kube-api-access-rrnws\") pod \"dnsmasq-dns-7dfbc9b597-t66bm\" (UID: \"9309351f-1023-49a9-b6ff-205232665c04\") " pod="openstack/dnsmasq-dns-7dfbc9b597-t66bm" Oct 07 13:56:27 crc kubenswrapper[4854]: I1007 13:56:27.430158 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a15fef45-0955-4710-9c77-a73aea90e94a-scripts\") pod \"placement-db-sync-7g4fc\" (UID: \"a15fef45-0955-4710-9c77-a73aea90e94a\") " pod="openstack/placement-db-sync-7g4fc" Oct 07 13:56:27 crc kubenswrapper[4854]: I1007 13:56:27.430953 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9309351f-1023-49a9-b6ff-205232665c04-ovsdbserver-sb\") pod \"dnsmasq-dns-7dfbc9b597-t66bm\" (UID: \"9309351f-1023-49a9-b6ff-205232665c04\") " pod="openstack/dnsmasq-dns-7dfbc9b597-t66bm" Oct 07 13:56:27 crc 
kubenswrapper[4854]: I1007 13:56:27.431485 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9309351f-1023-49a9-b6ff-205232665c04-dns-svc\") pod \"dnsmasq-dns-7dfbc9b597-t66bm\" (UID: \"9309351f-1023-49a9-b6ff-205232665c04\") " pod="openstack/dnsmasq-dns-7dfbc9b597-t66bm" Oct 07 13:56:27 crc kubenswrapper[4854]: I1007 13:56:27.432001 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9309351f-1023-49a9-b6ff-205232665c04-ovsdbserver-nb\") pod \"dnsmasq-dns-7dfbc9b597-t66bm\" (UID: \"9309351f-1023-49a9-b6ff-205232665c04\") " pod="openstack/dnsmasq-dns-7dfbc9b597-t66bm" Oct 07 13:56:27 crc kubenswrapper[4854]: I1007 13:56:27.432536 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9309351f-1023-49a9-b6ff-205232665c04-config\") pod \"dnsmasq-dns-7dfbc9b597-t66bm\" (UID: \"9309351f-1023-49a9-b6ff-205232665c04\") " pod="openstack/dnsmasq-dns-7dfbc9b597-t66bm" Oct 07 13:56:27 crc kubenswrapper[4854]: I1007 13:56:27.457404 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrnws\" (UniqueName: \"kubernetes.io/projected/9309351f-1023-49a9-b6ff-205232665c04-kube-api-access-rrnws\") pod \"dnsmasq-dns-7dfbc9b597-t66bm\" (UID: \"9309351f-1023-49a9-b6ff-205232665c04\") " pod="openstack/dnsmasq-dns-7dfbc9b597-t66bm" Oct 07 13:56:27 crc kubenswrapper[4854]: I1007 13:56:27.532234 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a15fef45-0955-4710-9c77-a73aea90e94a-combined-ca-bundle\") pod \"placement-db-sync-7g4fc\" (UID: \"a15fef45-0955-4710-9c77-a73aea90e94a\") " pod="openstack/placement-db-sync-7g4fc" Oct 07 13:56:27 crc kubenswrapper[4854]: I1007 13:56:27.532541 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a15fef45-0955-4710-9c77-a73aea90e94a-logs\") pod \"placement-db-sync-7g4fc\" (UID: \"a15fef45-0955-4710-9c77-a73aea90e94a\") " pod="openstack/placement-db-sync-7g4fc" Oct 07 13:56:27 crc kubenswrapper[4854]: I1007 13:56:27.532580 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a15fef45-0955-4710-9c77-a73aea90e94a-config-data\") pod \"placement-db-sync-7g4fc\" (UID: \"a15fef45-0955-4710-9c77-a73aea90e94a\") " pod="openstack/placement-db-sync-7g4fc" Oct 07 13:56:27 crc kubenswrapper[4854]: I1007 13:56:27.532617 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a15fef45-0955-4710-9c77-a73aea90e94a-scripts\") pod \"placement-db-sync-7g4fc\" (UID: \"a15fef45-0955-4710-9c77-a73aea90e94a\") " pod="openstack/placement-db-sync-7g4fc" Oct 07 13:56:27 crc kubenswrapper[4854]: I1007 13:56:27.532642 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9t989\" (UniqueName: \"kubernetes.io/projected/a15fef45-0955-4710-9c77-a73aea90e94a-kube-api-access-9t989\") pod \"placement-db-sync-7g4fc\" (UID: \"a15fef45-0955-4710-9c77-a73aea90e94a\") " pod="openstack/placement-db-sync-7g4fc" Oct 07 13:56:27 crc kubenswrapper[4854]: I1007 13:56:27.533223 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/a15fef45-0955-4710-9c77-a73aea90e94a-logs\") pod \"placement-db-sync-7g4fc\" (UID: \"a15fef45-0955-4710-9c77-a73aea90e94a\") " pod="openstack/placement-db-sync-7g4fc" Oct 07 13:56:27 crc kubenswrapper[4854]: I1007 13:56:27.537637 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a15fef45-0955-4710-9c77-a73aea90e94a-scripts\") pod \"placement-db-sync-7g4fc\" (UID: \"a15fef45-0955-4710-9c77-a73aea90e94a\") " pod="openstack/placement-db-sync-7g4fc" Oct 07 13:56:27 crc kubenswrapper[4854]: I1007 13:56:27.538182 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a15fef45-0955-4710-9c77-a73aea90e94a-config-data\") pod \"placement-db-sync-7g4fc\" (UID: \"a15fef45-0955-4710-9c77-a73aea90e94a\") " pod="openstack/placement-db-sync-7g4fc" Oct 07 13:56:27 crc kubenswrapper[4854]: I1007 13:56:27.538930 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7dfbc9b597-t66bm" Oct 07 13:56:27 crc kubenswrapper[4854]: I1007 13:56:27.541823 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a15fef45-0955-4710-9c77-a73aea90e94a-combined-ca-bundle\") pod \"placement-db-sync-7g4fc\" (UID: \"a15fef45-0955-4710-9c77-a73aea90e94a\") " pod="openstack/placement-db-sync-7g4fc" Oct 07 13:56:27 crc kubenswrapper[4854]: I1007 13:56:27.555240 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9t989\" (UniqueName: \"kubernetes.io/projected/a15fef45-0955-4710-9c77-a73aea90e94a-kube-api-access-9t989\") pod \"placement-db-sync-7g4fc\" (UID: \"a15fef45-0955-4710-9c77-a73aea90e94a\") " pod="openstack/placement-db-sync-7g4fc" Oct 07 13:56:27 crc kubenswrapper[4854]: I1007 13:56:27.560522 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-7g4fc" Oct 07 13:56:28 crc kubenswrapper[4854]: I1007 13:56:28.011549 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-7g4fc"] Oct 07 13:56:28 crc kubenswrapper[4854]: I1007 13:56:28.020378 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7dfbc9b597-t66bm"] Oct 07 13:56:28 crc kubenswrapper[4854]: W1007 13:56:28.021562 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda15fef45_0955_4710_9c77_a73aea90e94a.slice/crio-538d01ec591a753c145bb33ebebc0b0a4eabb0029dbb9408c85273301c30c670 WatchSource:0}: Error finding container 538d01ec591a753c145bb33ebebc0b0a4eabb0029dbb9408c85273301c30c670: Status 404 returned error can't find the container with id 538d01ec591a753c145bb33ebebc0b0a4eabb0029dbb9408c85273301c30c670 Oct 07 13:56:28 crc kubenswrapper[4854]: W1007 13:56:28.021819 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9309351f_1023_49a9_b6ff_205232665c04.slice/crio-c29995054050c131aca52b43c8981da649e0ba25ffeaf57e968c14fe1b216440 WatchSource:0}: Error finding container c29995054050c131aca52b43c8981da649e0ba25ffeaf57e968c14fe1b216440: Status 404 returned error can't find the container with id c29995054050c131aca52b43c8981da649e0ba25ffeaf57e968c14fe1b216440 Oct 07 13:56:28 crc kubenswrapper[4854]: I1007 13:56:28.340245 4854 generic.go:334] "Generic (PLEG): container finished" podID="9309351f-1023-49a9-b6ff-205232665c04" containerID="18ebb44fc33eb07c52e333aec1d675d152dcaead7e17401fdff86eceea16933e" exitCode=0 Oct 07 13:56:28 crc kubenswrapper[4854]: I1007 13:56:28.340337 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7dfbc9b597-t66bm" event={"ID":"9309351f-1023-49a9-b6ff-205232665c04","Type":"ContainerDied","Data":"18ebb44fc33eb07c52e333aec1d675d152dcaead7e17401fdff86eceea16933e"} Oct 07 13:56:28 crc kubenswrapper[4854]: I1007 13:56:28.340401 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7dfbc9b597-t66bm" event={"ID":"9309351f-1023-49a9-b6ff-205232665c04","Type":"ContainerStarted","Data":"c29995054050c131aca52b43c8981da649e0ba25ffeaf57e968c14fe1b216440"} Oct 07 13:56:28 crc kubenswrapper[4854]: I1007 13:56:28.341827 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-7g4fc" event={"ID":"a15fef45-0955-4710-9c77-a73aea90e94a","Type":"ContainerStarted","Data":"6f09e8ba6f54f1fe26a8b948a03572e9a4e13a42a054bf343d28f68255c8912b"} Oct 07 13:56:28 crc kubenswrapper[4854]: I1007 13:56:28.341857 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-7g4fc" event={"ID":"a15fef45-0955-4710-9c77-a73aea90e94a","Type":"ContainerStarted","Data":"538d01ec591a753c145bb33ebebc0b0a4eabb0029dbb9408c85273301c30c670"} Oct 07 13:56:28 crc kubenswrapper[4854]: I1007 13:56:28.387313 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-7g4fc" podStartSLOduration=1.3872946609999999 podStartE2EDuration="1.387294661s" podCreationTimestamp="2025-10-07 13:56:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:56:28.382448343 +0000 UTC m=+5504.370280608" watchObservedRunningTime="2025-10-07 13:56:28.387294661 +0000 UTC m=+5504.375126916" Oct 07 
13:56:29 crc kubenswrapper[4854]: I1007 13:56:29.354517 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7dfbc9b597-t66bm" event={"ID":"9309351f-1023-49a9-b6ff-205232665c04","Type":"ContainerStarted","Data":"18b0a4d37522dcd0c2daa09cbadfcd71e73f6cb9d57b2ecedcb1490898b463d3"} Oct 07 13:56:29 crc kubenswrapper[4854]: I1007 13:56:29.354701 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7dfbc9b597-t66bm" Oct 07 13:56:29 crc kubenswrapper[4854]: I1007 13:56:29.391724 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7dfbc9b597-t66bm" podStartSLOduration=2.3916856859999998 podStartE2EDuration="2.391685686s" podCreationTimestamp="2025-10-07 13:56:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:56:29.378840939 +0000 UTC m=+5505.366673234" watchObservedRunningTime="2025-10-07 13:56:29.391685686 +0000 UTC m=+5505.379518031" Oct 07 13:56:30 crc kubenswrapper[4854]: I1007 13:56:30.368511 4854 generic.go:334] "Generic (PLEG): container finished" podID="a15fef45-0955-4710-9c77-a73aea90e94a" containerID="6f09e8ba6f54f1fe26a8b948a03572e9a4e13a42a054bf343d28f68255c8912b" exitCode=0 Oct 07 13:56:30 crc kubenswrapper[4854]: I1007 13:56:30.368637 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-7g4fc" event={"ID":"a15fef45-0955-4710-9c77-a73aea90e94a","Type":"ContainerDied","Data":"6f09e8ba6f54f1fe26a8b948a03572e9a4e13a42a054bf343d28f68255c8912b"} Oct 07 13:56:32 crc kubenswrapper[4854]: I1007 13:56:31.758493 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-7g4fc" Oct 07 13:56:32 crc kubenswrapper[4854]: I1007 13:56:31.927013 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a15fef45-0955-4710-9c77-a73aea90e94a-scripts\") pod \"a15fef45-0955-4710-9c77-a73aea90e94a\" (UID: \"a15fef45-0955-4710-9c77-a73aea90e94a\") " Oct 07 13:56:32 crc kubenswrapper[4854]: I1007 13:56:31.927096 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a15fef45-0955-4710-9c77-a73aea90e94a-combined-ca-bundle\") pod \"a15fef45-0955-4710-9c77-a73aea90e94a\" (UID: \"a15fef45-0955-4710-9c77-a73aea90e94a\") " Oct 07 13:56:32 crc kubenswrapper[4854]: I1007 13:56:31.927212 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a15fef45-0955-4710-9c77-a73aea90e94a-config-data\") pod \"a15fef45-0955-4710-9c77-a73aea90e94a\" (UID: \"a15fef45-0955-4710-9c77-a73aea90e94a\") " Oct 07 13:56:32 crc kubenswrapper[4854]: I1007 13:56:31.927303 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9t989\" (UniqueName: \"kubernetes.io/projected/a15fef45-0955-4710-9c77-a73aea90e94a-kube-api-access-9t989\") pod \"a15fef45-0955-4710-9c77-a73aea90e94a\" (UID: \"a15fef45-0955-4710-9c77-a73aea90e94a\") " Oct 07 13:56:32 crc kubenswrapper[4854]: I1007 13:56:31.927356 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a15fef45-0955-4710-9c77-a73aea90e94a-logs\") pod \"a15fef45-0955-4710-9c77-a73aea90e94a\" (UID: \"a15fef45-0955-4710-9c77-a73aea90e94a\") " 
Oct 07 13:56:32 crc kubenswrapper[4854]: I1007 13:56:31.927745 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a15fef45-0955-4710-9c77-a73aea90e94a-logs" (OuterVolumeSpecName: "logs") pod "a15fef45-0955-4710-9c77-a73aea90e94a" (UID: "a15fef45-0955-4710-9c77-a73aea90e94a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:56:32 crc kubenswrapper[4854]: I1007 13:56:31.928117 4854 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a15fef45-0955-4710-9c77-a73aea90e94a-logs\") on node \"crc\" DevicePath \"\"" Oct 07 13:56:32 crc kubenswrapper[4854]: I1007 13:56:31.932397 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a15fef45-0955-4710-9c77-a73aea90e94a-kube-api-access-9t989" (OuterVolumeSpecName: "kube-api-access-9t989") pod "a15fef45-0955-4710-9c77-a73aea90e94a" (UID: "a15fef45-0955-4710-9c77-a73aea90e94a"). InnerVolumeSpecName "kube-api-access-9t989". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:56:32 crc kubenswrapper[4854]: I1007 13:56:31.932711 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a15fef45-0955-4710-9c77-a73aea90e94a-scripts" (OuterVolumeSpecName: "scripts") pod "a15fef45-0955-4710-9c77-a73aea90e94a" (UID: "a15fef45-0955-4710-9c77-a73aea90e94a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:56:32 crc kubenswrapper[4854]: I1007 13:56:31.952550 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a15fef45-0955-4710-9c77-a73aea90e94a-config-data" (OuterVolumeSpecName: "config-data") pod "a15fef45-0955-4710-9c77-a73aea90e94a" (UID: "a15fef45-0955-4710-9c77-a73aea90e94a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:56:32 crc kubenswrapper[4854]: I1007 13:56:31.954306 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a15fef45-0955-4710-9c77-a73aea90e94a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a15fef45-0955-4710-9c77-a73aea90e94a" (UID: "a15fef45-0955-4710-9c77-a73aea90e94a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:56:32 crc kubenswrapper[4854]: I1007 13:56:32.030363 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9t989\" (UniqueName: \"kubernetes.io/projected/a15fef45-0955-4710-9c77-a73aea90e94a-kube-api-access-9t989\") on node \"crc\" DevicePath \"\"" Oct 07 13:56:32 crc kubenswrapper[4854]: I1007 13:56:32.030705 4854 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a15fef45-0955-4710-9c77-a73aea90e94a-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 13:56:32 crc kubenswrapper[4854]: I1007 13:56:32.030721 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a15fef45-0955-4710-9c77-a73aea90e94a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:56:32 crc kubenswrapper[4854]: I1007 13:56:32.030733 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a15fef45-0955-4710-9c77-a73aea90e94a-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:56:32 crc kubenswrapper[4854]: I1007 13:56:32.392980 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-7g4fc" event={"ID":"a15fef45-0955-4710-9c77-a73aea90e94a","Type":"ContainerDied","Data":"538d01ec591a753c145bb33ebebc0b0a4eabb0029dbb9408c85273301c30c670"} Oct 07 13:56:32 crc kubenswrapper[4854]: I1007 13:56:32.393037 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="538d01ec591a753c145bb33ebebc0b0a4eabb0029dbb9408c85273301c30c670" Oct 07 13:56:32 crc kubenswrapper[4854]: I1007 13:56:32.393127 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-7g4fc" Oct 07 13:56:32 crc kubenswrapper[4854]: I1007 13:56:32.502440 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-bc8c5c6c6-8plcb"] Oct 07 13:56:32 crc kubenswrapper[4854]: E1007 13:56:32.503005 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a15fef45-0955-4710-9c77-a73aea90e94a" containerName="placement-db-sync" Oct 07 13:56:32 crc kubenswrapper[4854]: I1007 13:56:32.503032 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="a15fef45-0955-4710-9c77-a73aea90e94a" containerName="placement-db-sync" Oct 07 13:56:32 crc kubenswrapper[4854]: I1007 13:56:32.503313 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="a15fef45-0955-4710-9c77-a73aea90e94a" containerName="placement-db-sync" Oct 07 13:56:32 crc kubenswrapper[4854]: I1007 13:56:32.505305 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-bc8c5c6c6-8plcb" Oct 07 13:56:32 crc kubenswrapper[4854]: I1007 13:56:32.509777 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 07 13:56:32 crc kubenswrapper[4854]: I1007 13:56:32.509817 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 07 13:56:32 crc kubenswrapper[4854]: I1007 13:56:32.509994 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-g56mn" Oct 07 13:56:32 crc kubenswrapper[4854]: I1007 13:56:32.522297 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-bc8c5c6c6-8plcb"] Oct 07 13:56:32 crc kubenswrapper[4854]: I1007 13:56:32.640841 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/803e95e2-2ec6-4a78-8083-e327c47e478f-combined-ca-bundle\") pod \"placement-bc8c5c6c6-8plcb\" (UID: \"803e95e2-2ec6-4a78-8083-e327c47e478f\") " pod="openstack/placement-bc8c5c6c6-8plcb" Oct 07 13:56:32 crc kubenswrapper[4854]: I1007 13:56:32.640896 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/803e95e2-2ec6-4a78-8083-e327c47e478f-scripts\") pod \"placement-bc8c5c6c6-8plcb\" (UID: \"803e95e2-2ec6-4a78-8083-e327c47e478f\") " pod="openstack/placement-bc8c5c6c6-8plcb" Oct 07 13:56:32 crc kubenswrapper[4854]: I1007 13:56:32.640964 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/803e95e2-2ec6-4a78-8083-e327c47e478f-config-data\") pod \"placement-bc8c5c6c6-8plcb\" (UID: \"803e95e2-2ec6-4a78-8083-e327c47e478f\") " pod="openstack/placement-bc8c5c6c6-8plcb" Oct 07 13:56:32 crc kubenswrapper[4854]: I1007 13:56:32.641124 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9wxj\" (UniqueName: \"kubernetes.io/projected/803e95e2-2ec6-4a78-8083-e327c47e478f-kube-api-access-v9wxj\") pod \"placement-bc8c5c6c6-8plcb\" (UID: \"803e95e2-2ec6-4a78-8083-e327c47e478f\") " pod="openstack/placement-bc8c5c6c6-8plcb" Oct 07 13:56:32 crc kubenswrapper[4854]: I1007 13:56:32.641468 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/803e95e2-2ec6-4a78-8083-e327c47e478f-logs\") pod \"placement-bc8c5c6c6-8plcb\" (UID: \"803e95e2-2ec6-4a78-8083-e327c47e478f\") " pod="openstack/placement-bc8c5c6c6-8plcb" Oct 07 13:56:32 crc kubenswrapper[4854]: I1007 13:56:32.703063 4854 scope.go:117] "RemoveContainer" containerID="8f3095d8e8fcb8dde419577c13f52762f3d4a6c040f7266414571999ecd4046e" Oct 07 13:56:32 crc kubenswrapper[4854]: E1007 13:56:32.703473 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:56:32 crc kubenswrapper[4854]: I1007 13:56:32.743588 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/803e95e2-2ec6-4a78-8083-e327c47e478f-logs\") pod \"placement-bc8c5c6c6-8plcb\" (UID: \"803e95e2-2ec6-4a78-8083-e327c47e478f\") " pod="openstack/placement-bc8c5c6c6-8plcb" Oct 07 13:56:32 crc kubenswrapper[4854]: I1007 13:56:32.743732 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/803e95e2-2ec6-4a78-8083-e327c47e478f-combined-ca-bundle\") pod \"placement-bc8c5c6c6-8plcb\" (UID: \"803e95e2-2ec6-4a78-8083-e327c47e478f\") " pod="openstack/placement-bc8c5c6c6-8plcb" Oct 07 13:56:32 crc kubenswrapper[4854]: I1007 13:56:32.743772 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/803e95e2-2ec6-4a78-8083-e327c47e478f-scripts\") pod \"placement-bc8c5c6c6-8plcb\" (UID: \"803e95e2-2ec6-4a78-8083-e327c47e478f\") " pod="openstack/placement-bc8c5c6c6-8plcb" Oct 07 13:56:32 crc kubenswrapper[4854]: I1007 13:56:32.743831 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/803e95e2-2ec6-4a78-8083-e327c47e478f-config-data\") pod \"placement-bc8c5c6c6-8plcb\" (UID: \"803e95e2-2ec6-4a78-8083-e327c47e478f\") " pod="openstack/placement-bc8c5c6c6-8plcb" Oct 07 13:56:32 crc kubenswrapper[4854]: I1007 13:56:32.743857 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9wxj\" (UniqueName: \"kubernetes.io/projected/803e95e2-2ec6-4a78-8083-e327c47e478f-kube-api-access-v9wxj\") pod \"placement-bc8c5c6c6-8plcb\" (UID: \"803e95e2-2ec6-4a78-8083-e327c47e478f\") " pod="openstack/placement-bc8c5c6c6-8plcb" Oct 07 13:56:32 crc kubenswrapper[4854]: I1007 13:56:32.744210 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/803e95e2-2ec6-4a78-8083-e327c47e478f-logs\") pod \"placement-bc8c5c6c6-8plcb\" (UID: \"803e95e2-2ec6-4a78-8083-e327c47e478f\") " pod="openstack/placement-bc8c5c6c6-8plcb" Oct 07 13:56:32 crc kubenswrapper[4854]: I1007 13:56:32.747986 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/803e95e2-2ec6-4a78-8083-e327c47e478f-scripts\") pod \"placement-bc8c5c6c6-8plcb\" (UID: \"803e95e2-2ec6-4a78-8083-e327c47e478f\") " pod="openstack/placement-bc8c5c6c6-8plcb" Oct 07 13:56:32 crc kubenswrapper[4854]: I1007 13:56:32.748811 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/803e95e2-2ec6-4a78-8083-e327c47e478f-combined-ca-bundle\") pod \"placement-bc8c5c6c6-8plcb\" (UID: \"803e95e2-2ec6-4a78-8083-e327c47e478f\") " pod="openstack/placement-bc8c5c6c6-8plcb" Oct 07 13:56:32 crc kubenswrapper[4854]: I1007 13:56:32.753336 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/803e95e2-2ec6-4a78-8083-e327c47e478f-config-data\") pod \"placement-bc8c5c6c6-8plcb\" (UID: \"803e95e2-2ec6-4a78-8083-e327c47e478f\") " pod="openstack/placement-bc8c5c6c6-8plcb" Oct 07 13:56:32 crc kubenswrapper[4854]: I1007 13:56:32.779098 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9wxj\" (UniqueName: \"kubernetes.io/projected/803e95e2-2ec6-4a78-8083-e327c47e478f-kube-api-access-v9wxj\") pod \"placement-bc8c5c6c6-8plcb\" (UID: \"803e95e2-2ec6-4a78-8083-e327c47e478f\") " 
pod="openstack/placement-bc8c5c6c6-8plcb" Oct 07 13:56:32 crc kubenswrapper[4854]: I1007 13:56:32.839593 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-bc8c5c6c6-8plcb" Oct 07 13:56:33 crc kubenswrapper[4854]: I1007 13:56:33.363932 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-bc8c5c6c6-8plcb"] Oct 07 13:56:33 crc kubenswrapper[4854]: I1007 13:56:33.419989 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-bc8c5c6c6-8plcb" event={"ID":"803e95e2-2ec6-4a78-8083-e327c47e478f","Type":"ContainerStarted","Data":"2206a52b04927451c087179f1aba1b529fa379e8f86381f54e14953ec9594f88"} Oct 07 13:56:34 crc kubenswrapper[4854]: I1007 13:56:34.431236 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-bc8c5c6c6-8plcb" event={"ID":"803e95e2-2ec6-4a78-8083-e327c47e478f","Type":"ContainerStarted","Data":"4f0e530e577a8ddb38ad3b75797214eb5702fde352f6472fc6bc7c01068d9a09"} Oct 07 13:56:34 crc kubenswrapper[4854]: I1007 13:56:34.431878 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-bc8c5c6c6-8plcb" Oct 07 13:56:34 crc kubenswrapper[4854]: I1007 13:56:34.431896 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-bc8c5c6c6-8plcb" Oct 07 13:56:34 crc kubenswrapper[4854]: I1007 13:56:34.431906 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-bc8c5c6c6-8plcb" event={"ID":"803e95e2-2ec6-4a78-8083-e327c47e478f","Type":"ContainerStarted","Data":"cb116afa18e7e7c2b09790ef381b8b0484faff9c72cfdf660677aa6d107f928b"} Oct 07 13:56:34 crc kubenswrapper[4854]: I1007 13:56:34.464200 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-bc8c5c6c6-8plcb" podStartSLOduration=2.464176685 podStartE2EDuration="2.464176685s" podCreationTimestamp="2025-10-07 13:56:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:56:34.459061719 +0000 UTC m=+5510.446893994" watchObservedRunningTime="2025-10-07 13:56:34.464176685 +0000 UTC m=+5510.452008960" Oct 07 13:56:36 crc kubenswrapper[4854]: I1007 13:56:36.934454 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dwv82"] Oct 07 13:56:36 crc kubenswrapper[4854]: I1007 13:56:36.936911 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dwv82" Oct 07 13:56:36 crc kubenswrapper[4854]: I1007 13:56:36.954783 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dwv82"] Oct 07 13:56:37 crc kubenswrapper[4854]: I1007 13:56:37.028919 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3707344-a7d5-4678-afc0-ee83b239c6c8-utilities\") pod \"redhat-marketplace-dwv82\" (UID: \"a3707344-a7d5-4678-afc0-ee83b239c6c8\") " pod="openshift-marketplace/redhat-marketplace-dwv82" Oct 07 13:56:37 crc kubenswrapper[4854]: I1007 13:56:37.029076 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3707344-a7d5-4678-afc0-ee83b239c6c8-catalog-content\") pod \"redhat-marketplace-dwv82\" (UID: \"a3707344-a7d5-4678-afc0-ee83b239c6c8\") " pod="openshift-marketplace/redhat-marketplace-dwv82" Oct 07 13:56:37 crc kubenswrapper[4854]: I1007 13:56:37.029115 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxj49\" (UniqueName: \"kubernetes.io/projected/a3707344-a7d5-4678-afc0-ee83b239c6c8-kube-api-access-mxj49\") pod \"redhat-marketplace-dwv82\" (UID: \"a3707344-a7d5-4678-afc0-ee83b239c6c8\") " pod="openshift-marketplace/redhat-marketplace-dwv82" Oct 07 13:56:37 crc kubenswrapper[4854]: I1007 13:56:37.131412 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3707344-a7d5-4678-afc0-ee83b239c6c8-catalog-content\") pod \"redhat-marketplace-dwv82\" (UID: \"a3707344-a7d5-4678-afc0-ee83b239c6c8\") " pod="openshift-marketplace/redhat-marketplace-dwv82" Oct 07 13:56:37 crc kubenswrapper[4854]: I1007 13:56:37.131476 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxj49\" (UniqueName: \"kubernetes.io/projected/a3707344-a7d5-4678-afc0-ee83b239c6c8-kube-api-access-mxj49\") pod \"redhat-marketplace-dwv82\" (UID: \"a3707344-a7d5-4678-afc0-ee83b239c6c8\") " pod="openshift-marketplace/redhat-marketplace-dwv82" Oct 07 13:56:37 crc kubenswrapper[4854]: I1007 13:56:37.131598 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3707344-a7d5-4678-afc0-ee83b239c6c8-utilities\") pod \"redhat-marketplace-dwv82\" (UID: \"a3707344-a7d5-4678-afc0-ee83b239c6c8\") " pod="openshift-marketplace/redhat-marketplace-dwv82" Oct 07 13:56:37 crc kubenswrapper[4854]: I1007 13:56:37.132029 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3707344-a7d5-4678-afc0-ee83b239c6c8-catalog-content\") pod \"redhat-marketplace-dwv82\" (UID: \"a3707344-a7d5-4678-afc0-ee83b239c6c8\") " pod="openshift-marketplace/redhat-marketplace-dwv82" Oct 07 13:56:37 crc kubenswrapper[4854]: I1007 13:56:37.132104 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3707344-a7d5-4678-afc0-ee83b239c6c8-utilities\") pod \"redhat-marketplace-dwv82\" (UID: \"a3707344-a7d5-4678-afc0-ee83b239c6c8\") " pod="openshift-marketplace/redhat-marketplace-dwv82" Oct 07 13:56:37 crc kubenswrapper[4854]: I1007 13:56:37.154164 4854 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-mxj49\" (UniqueName: \"kubernetes.io/projected/a3707344-a7d5-4678-afc0-ee83b239c6c8-kube-api-access-mxj49\") pod \"redhat-marketplace-dwv82\" (UID: \"a3707344-a7d5-4678-afc0-ee83b239c6c8\") " pod="openshift-marketplace/redhat-marketplace-dwv82" Oct 07 13:56:37 crc kubenswrapper[4854]: I1007 13:56:37.277703 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dwv82" Oct 07 13:56:37 crc kubenswrapper[4854]: I1007 13:56:37.540315 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7dfbc9b597-t66bm" Oct 07 13:56:37 crc kubenswrapper[4854]: I1007 13:56:37.589324 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-574bb48d57-zvm4r"] Oct 07 13:56:37 crc kubenswrapper[4854]: I1007 13:56:37.589552 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-574bb48d57-zvm4r" podUID="9aa2f194-429f-467c-8802-273b1ee8b633" containerName="dnsmasq-dns" containerID="cri-o://d028e05a3d21ca7d28719abff4cf97190f2d637b1baa10a9f7169f65c5d1262b" gracePeriod=10 Oct 07 13:56:37 crc kubenswrapper[4854]: I1007 13:56:37.727382 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dwv82"] Oct 07 13:56:38 crc kubenswrapper[4854]: I1007 13:56:38.031504 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-574bb48d57-zvm4r" Oct 07 13:56:38 crc kubenswrapper[4854]: I1007 13:56:38.147705 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9aa2f194-429f-467c-8802-273b1ee8b633-ovsdbserver-nb\") pod \"9aa2f194-429f-467c-8802-273b1ee8b633\" (UID: \"9aa2f194-429f-467c-8802-273b1ee8b633\") " Oct 07 13:56:38 crc kubenswrapper[4854]: I1007 13:56:38.147825 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6r8w\" (UniqueName: \"kubernetes.io/projected/9aa2f194-429f-467c-8802-273b1ee8b633-kube-api-access-z6r8w\") pod \"9aa2f194-429f-467c-8802-273b1ee8b633\" (UID: \"9aa2f194-429f-467c-8802-273b1ee8b633\") " Oct 07 13:56:38 crc kubenswrapper[4854]: I1007 13:56:38.147857 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9aa2f194-429f-467c-8802-273b1ee8b633-dns-svc\") pod \"9aa2f194-429f-467c-8802-273b1ee8b633\" (UID: \"9aa2f194-429f-467c-8802-273b1ee8b633\") " Oct 07 13:56:38 crc kubenswrapper[4854]: I1007 13:56:38.147927 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9aa2f194-429f-467c-8802-273b1ee8b633-config\") pod \"9aa2f194-429f-467c-8802-273b1ee8b633\" (UID: \"9aa2f194-429f-467c-8802-273b1ee8b633\") " Oct 07 13:56:38 crc kubenswrapper[4854]: I1007 13:56:38.147958 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9aa2f194-429f-467c-8802-273b1ee8b633-ovsdbserver-sb\") pod \"9aa2f194-429f-467c-8802-273b1ee8b633\" (UID: \"9aa2f194-429f-467c-8802-273b1ee8b633\") " Oct 07 13:56:38 crc kubenswrapper[4854]: I1007 13:56:38.177297 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9aa2f194-429f-467c-8802-273b1ee8b633-kube-api-access-z6r8w" (OuterVolumeSpecName: 
"kube-api-access-z6r8w") pod "9aa2f194-429f-467c-8802-273b1ee8b633" (UID: "9aa2f194-429f-467c-8802-273b1ee8b633"). InnerVolumeSpecName "kube-api-access-z6r8w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:56:38 crc kubenswrapper[4854]: I1007 13:56:38.198188 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9aa2f194-429f-467c-8802-273b1ee8b633-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9aa2f194-429f-467c-8802-273b1ee8b633" (UID: "9aa2f194-429f-467c-8802-273b1ee8b633"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:56:38 crc kubenswrapper[4854]: I1007 13:56:38.204021 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9aa2f194-429f-467c-8802-273b1ee8b633-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9aa2f194-429f-467c-8802-273b1ee8b633" (UID: "9aa2f194-429f-467c-8802-273b1ee8b633"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:56:38 crc kubenswrapper[4854]: I1007 13:56:38.207236 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9aa2f194-429f-467c-8802-273b1ee8b633-config" (OuterVolumeSpecName: "config") pod "9aa2f194-429f-467c-8802-273b1ee8b633" (UID: "9aa2f194-429f-467c-8802-273b1ee8b633"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:56:38 crc kubenswrapper[4854]: I1007 13:56:38.207512 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9aa2f194-429f-467c-8802-273b1ee8b633-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9aa2f194-429f-467c-8802-273b1ee8b633" (UID: "9aa2f194-429f-467c-8802-273b1ee8b633"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:56:38 crc kubenswrapper[4854]: I1007 13:56:38.249955 4854 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9aa2f194-429f-467c-8802-273b1ee8b633-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 13:56:38 crc kubenswrapper[4854]: I1007 13:56:38.249988 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6r8w\" (UniqueName: \"kubernetes.io/projected/9aa2f194-429f-467c-8802-273b1ee8b633-kube-api-access-z6r8w\") on node \"crc\" DevicePath \"\"" Oct 07 13:56:38 crc kubenswrapper[4854]: I1007 13:56:38.250001 4854 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9aa2f194-429f-467c-8802-273b1ee8b633-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 13:56:38 crc kubenswrapper[4854]: I1007 13:56:38.250011 4854 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9aa2f194-429f-467c-8802-273b1ee8b633-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:56:38 crc kubenswrapper[4854]: I1007 13:56:38.250020 4854 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9aa2f194-429f-467c-8802-273b1ee8b633-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 13:56:38 crc kubenswrapper[4854]: I1007 13:56:38.476420 4854 generic.go:334] "Generic (PLEG): container finished" podID="a3707344-a7d5-4678-afc0-ee83b239c6c8" containerID="bea7cc1b206749f19056ee574d628b6274435693183e638f4a0b3143d3174a42" exitCode=0 Oct 07 13:56:38 crc kubenswrapper[4854]: I1007 13:56:38.476517 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dwv82" event={"ID":"a3707344-a7d5-4678-afc0-ee83b239c6c8","Type":"ContainerDied","Data":"bea7cc1b206749f19056ee574d628b6274435693183e638f4a0b3143d3174a42"} Oct 07 13:56:38 crc kubenswrapper[4854]: I1007 13:56:38.476838 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dwv82" event={"ID":"a3707344-a7d5-4678-afc0-ee83b239c6c8","Type":"ContainerStarted","Data":"b481d5c27406ca5e8bce538af38e8ca418a0a9765ad1c31af0ee20e697f6b94f"} Oct 07 13:56:38 crc kubenswrapper[4854]: I1007 13:56:38.480029 4854 generic.go:334] "Generic (PLEG): container finished" podID="9aa2f194-429f-467c-8802-273b1ee8b633" containerID="d028e05a3d21ca7d28719abff4cf97190f2d637b1baa10a9f7169f65c5d1262b" exitCode=0 Oct 07 13:56:38 crc kubenswrapper[4854]: I1007 13:56:38.480075 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-574bb48d57-zvm4r" event={"ID":"9aa2f194-429f-467c-8802-273b1ee8b633","Type":"ContainerDied","Data":"d028e05a3d21ca7d28719abff4cf97190f2d637b1baa10a9f7169f65c5d1262b"} Oct 07 13:56:38 crc kubenswrapper[4854]: I1007 13:56:38.480108 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-574bb48d57-zvm4r" event={"ID":"9aa2f194-429f-467c-8802-273b1ee8b633","Type":"ContainerDied","Data":"06cd65627bb3ef81afe0fabe19d3e24d14b2783766277d0e4f6058ae4441d97d"} Oct 07 13:56:38 crc kubenswrapper[4854]: I1007 13:56:38.480114 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-574bb48d57-zvm4r" Oct 07 13:56:38 crc kubenswrapper[4854]: I1007 13:56:38.480134 4854 scope.go:117] "RemoveContainer" containerID="d028e05a3d21ca7d28719abff4cf97190f2d637b1baa10a9f7169f65c5d1262b" Oct 07 13:56:38 crc kubenswrapper[4854]: I1007 13:56:38.531494 4854 scope.go:117] "RemoveContainer" containerID="b6c297c3b68447f4330cbfe5ba573f366c5be7f173754660db3099a8b04ac0aa" Oct 07 13:56:38 crc kubenswrapper[4854]: I1007 13:56:38.543260 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-574bb48d57-zvm4r"] Oct 07 13:56:38 crc kubenswrapper[4854]: I1007 13:56:38.552849 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-574bb48d57-zvm4r"] Oct 07 13:56:38 crc kubenswrapper[4854]: I1007 13:56:38.559125 4854 scope.go:117] "RemoveContainer" containerID="d028e05a3d21ca7d28719abff4cf97190f2d637b1baa10a9f7169f65c5d1262b" Oct 07 13:56:38 crc kubenswrapper[4854]: E1007 13:56:38.559764 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d028e05a3d21ca7d28719abff4cf97190f2d637b1baa10a9f7169f65c5d1262b\": container with ID starting with d028e05a3d21ca7d28719abff4cf97190f2d637b1baa10a9f7169f65c5d1262b not found: ID does not exist" containerID="d028e05a3d21ca7d28719abff4cf97190f2d637b1baa10a9f7169f65c5d1262b" Oct 07 13:56:38 crc kubenswrapper[4854]: I1007 13:56:38.559836 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d028e05a3d21ca7d28719abff4cf97190f2d637b1baa10a9f7169f65c5d1262b"} err="failed to get container status \"d028e05a3d21ca7d28719abff4cf97190f2d637b1baa10a9f7169f65c5d1262b\": rpc error: code = NotFound desc = could not find container \"d028e05a3d21ca7d28719abff4cf97190f2d637b1baa10a9f7169f65c5d1262b\": container with ID starting with d028e05a3d21ca7d28719abff4cf97190f2d637b1baa10a9f7169f65c5d1262b not found: ID does not exist" Oct 07 13:56:38 crc kubenswrapper[4854]: I1007 13:56:38.559878 4854 scope.go:117] "RemoveContainer" containerID="b6c297c3b68447f4330cbfe5ba573f366c5be7f173754660db3099a8b04ac0aa" Oct 07 13:56:38 crc kubenswrapper[4854]: E1007 13:56:38.560660 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6c297c3b68447f4330cbfe5ba573f366c5be7f173754660db3099a8b04ac0aa\": container with ID starting with b6c297c3b68447f4330cbfe5ba573f366c5be7f173754660db3099a8b04ac0aa not found: ID does not exist" containerID="b6c297c3b68447f4330cbfe5ba573f366c5be7f173754660db3099a8b04ac0aa" Oct 07 13:56:38 crc kubenswrapper[4854]: I1007 13:56:38.560700 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6c297c3b68447f4330cbfe5ba573f366c5be7f173754660db3099a8b04ac0aa"} err="failed to get container status \"b6c297c3b68447f4330cbfe5ba573f366c5be7f173754660db3099a8b04ac0aa\": rpc error: code = NotFound desc = could not find container \"b6c297c3b68447f4330cbfe5ba573f366c5be7f173754660db3099a8b04ac0aa\": container with ID starting with b6c297c3b68447f4330cbfe5ba573f366c5be7f173754660db3099a8b04ac0aa not found: ID does not exist" Oct 07 13:56:38 crc kubenswrapper[4854]: I1007 13:56:38.717535 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9aa2f194-429f-467c-8802-273b1ee8b633" path="/var/lib/kubelet/pods/9aa2f194-429f-467c-8802-273b1ee8b633/volumes" Oct 07 13:56:40 crc kubenswrapper[4854]: I1007 13:56:40.511709 4854 
generic.go:334] "Generic (PLEG): container finished" podID="a3707344-a7d5-4678-afc0-ee83b239c6c8" containerID="b159c90767714963fda23a987255664608495a9eed9476874b99e3ec1b07ab0f" exitCode=0 Oct 07 13:56:40 crc kubenswrapper[4854]: I1007 13:56:40.511908 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dwv82" event={"ID":"a3707344-a7d5-4678-afc0-ee83b239c6c8","Type":"ContainerDied","Data":"b159c90767714963fda23a987255664608495a9eed9476874b99e3ec1b07ab0f"} Oct 07 13:56:41 crc kubenswrapper[4854]: I1007 13:56:41.527772 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dwv82" event={"ID":"a3707344-a7d5-4678-afc0-ee83b239c6c8","Type":"ContainerStarted","Data":"fa254708ee8a79a3bfd88826f96329f8ac1cf29e2e888ac01a3f9e8622209e83"} Oct 07 13:56:41 crc kubenswrapper[4854]: I1007 13:56:41.557411 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dwv82" podStartSLOduration=3.105950236 podStartE2EDuration="5.557382996s" podCreationTimestamp="2025-10-07 13:56:36 +0000 UTC" firstStartedPulling="2025-10-07 13:56:38.479623115 +0000 UTC m=+5514.467455380" lastFinishedPulling="2025-10-07 13:56:40.931055855 +0000 UTC m=+5516.918888140" observedRunningTime="2025-10-07 13:56:41.553237767 +0000 UTC m=+5517.541070092" watchObservedRunningTime="2025-10-07 13:56:41.557382996 +0000 UTC m=+5517.545215281" Oct 07 13:56:44 crc kubenswrapper[4854]: I1007 13:56:44.712739 4854 scope.go:117] "RemoveContainer" containerID="8f3095d8e8fcb8dde419577c13f52762f3d4a6c040f7266414571999ecd4046e" Oct 07 13:56:44 crc kubenswrapper[4854]: E1007 13:56:44.713200 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:56:47 crc kubenswrapper[4854]: I1007 13:56:47.278847 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dwv82" Oct 07 13:56:47 crc kubenswrapper[4854]: I1007 13:56:47.279379 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dwv82" Oct 07 13:56:47 crc kubenswrapper[4854]: I1007 13:56:47.362002 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dwv82" Oct 07 13:56:47 crc kubenswrapper[4854]: I1007 13:56:47.672970 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dwv82" Oct 07 13:56:47 crc kubenswrapper[4854]: I1007 13:56:47.735294 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dwv82"] Oct 07 13:56:49 crc kubenswrapper[4854]: I1007 13:56:49.628220 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dwv82" podUID="a3707344-a7d5-4678-afc0-ee83b239c6c8" containerName="registry-server" containerID="cri-o://fa254708ee8a79a3bfd88826f96329f8ac1cf29e2e888ac01a3f9e8622209e83" gracePeriod=2 Oct 07 13:56:50 crc kubenswrapper[4854]: I1007 13:56:50.181253 4854 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dwv82" Oct 07 13:56:50 crc kubenswrapper[4854]: I1007 13:56:50.218546 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxj49\" (UniqueName: \"kubernetes.io/projected/a3707344-a7d5-4678-afc0-ee83b239c6c8-kube-api-access-mxj49\") pod \"a3707344-a7d5-4678-afc0-ee83b239c6c8\" (UID: \"a3707344-a7d5-4678-afc0-ee83b239c6c8\") " Oct 07 13:56:50 crc kubenswrapper[4854]: I1007 13:56:50.218978 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3707344-a7d5-4678-afc0-ee83b239c6c8-catalog-content\") pod \"a3707344-a7d5-4678-afc0-ee83b239c6c8\" (UID: \"a3707344-a7d5-4678-afc0-ee83b239c6c8\") " Oct 07 13:56:50 crc kubenswrapper[4854]: I1007 13:56:50.219072 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3707344-a7d5-4678-afc0-ee83b239c6c8-utilities\") pod \"a3707344-a7d5-4678-afc0-ee83b239c6c8\" (UID: \"a3707344-a7d5-4678-afc0-ee83b239c6c8\") " Oct 07 13:56:50 crc kubenswrapper[4854]: I1007 13:56:50.220440 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3707344-a7d5-4678-afc0-ee83b239c6c8-utilities" (OuterVolumeSpecName: "utilities") pod "a3707344-a7d5-4678-afc0-ee83b239c6c8" (UID: "a3707344-a7d5-4678-afc0-ee83b239c6c8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:56:50 crc kubenswrapper[4854]: I1007 13:56:50.225029 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3707344-a7d5-4678-afc0-ee83b239c6c8-kube-api-access-mxj49" (OuterVolumeSpecName: "kube-api-access-mxj49") pod "a3707344-a7d5-4678-afc0-ee83b239c6c8" (UID: "a3707344-a7d5-4678-afc0-ee83b239c6c8"). InnerVolumeSpecName "kube-api-access-mxj49". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:56:50 crc kubenswrapper[4854]: I1007 13:56:50.244996 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3707344-a7d5-4678-afc0-ee83b239c6c8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a3707344-a7d5-4678-afc0-ee83b239c6c8" (UID: "a3707344-a7d5-4678-afc0-ee83b239c6c8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:56:50 crc kubenswrapper[4854]: I1007 13:56:50.322287 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxj49\" (UniqueName: \"kubernetes.io/projected/a3707344-a7d5-4678-afc0-ee83b239c6c8-kube-api-access-mxj49\") on node \"crc\" DevicePath \"\"" Oct 07 13:56:50 crc kubenswrapper[4854]: I1007 13:56:50.322347 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3707344-a7d5-4678-afc0-ee83b239c6c8-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:56:50 crc kubenswrapper[4854]: I1007 13:56:50.322374 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3707344-a7d5-4678-afc0-ee83b239c6c8-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:56:50 crc kubenswrapper[4854]: I1007 13:56:50.639737 4854 generic.go:334] "Generic (PLEG): container finished" podID="a3707344-a7d5-4678-afc0-ee83b239c6c8" containerID="fa254708ee8a79a3bfd88826f96329f8ac1cf29e2e888ac01a3f9e8622209e83" exitCode=0 Oct 07 13:56:50 crc kubenswrapper[4854]: I1007 13:56:50.639811 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dwv82" event={"ID":"a3707344-a7d5-4678-afc0-ee83b239c6c8","Type":"ContainerDied","Data":"fa254708ee8a79a3bfd88826f96329f8ac1cf29e2e888ac01a3f9e8622209e83"} Oct 07 13:56:50 crc kubenswrapper[4854]: I1007 13:56:50.639844 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dwv82" Oct 07 13:56:50 crc kubenswrapper[4854]: I1007 13:56:50.639873 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dwv82" event={"ID":"a3707344-a7d5-4678-afc0-ee83b239c6c8","Type":"ContainerDied","Data":"b481d5c27406ca5e8bce538af38e8ca418a0a9765ad1c31af0ee20e697f6b94f"} Oct 07 13:56:50 crc kubenswrapper[4854]: I1007 13:56:50.639912 4854 scope.go:117] "RemoveContainer" containerID="fa254708ee8a79a3bfd88826f96329f8ac1cf29e2e888ac01a3f9e8622209e83" Oct 07 13:56:50 crc kubenswrapper[4854]: I1007 13:56:50.668097 4854 scope.go:117] "RemoveContainer" containerID="b159c90767714963fda23a987255664608495a9eed9476874b99e3ec1b07ab0f" Oct 07 13:56:50 crc kubenswrapper[4854]: I1007 13:56:50.710024 4854 scope.go:117] "RemoveContainer" containerID="bea7cc1b206749f19056ee574d628b6274435693183e638f4a0b3143d3174a42" Oct 07 13:56:50 crc kubenswrapper[4854]: I1007 13:56:50.718724 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dwv82"] Oct 07 13:56:50 crc kubenswrapper[4854]: I1007 13:56:50.723792 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dwv82"] Oct 07 13:56:50 crc kubenswrapper[4854]: I1007 13:56:50.737934 4854 scope.go:117] "RemoveContainer" containerID="fa254708ee8a79a3bfd88826f96329f8ac1cf29e2e888ac01a3f9e8622209e83" Oct 07 13:56:50 crc kubenswrapper[4854]: E1007 13:56:50.738563 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa254708ee8a79a3bfd88826f96329f8ac1cf29e2e888ac01a3f9e8622209e83\": container with ID starting with fa254708ee8a79a3bfd88826f96329f8ac1cf29e2e888ac01a3f9e8622209e83 not found: ID does not exist" containerID="fa254708ee8a79a3bfd88826f96329f8ac1cf29e2e888ac01a3f9e8622209e83" Oct 07 13:56:50 crc kubenswrapper[4854]: I1007 13:56:50.738616 4854 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa254708ee8a79a3bfd88826f96329f8ac1cf29e2e888ac01a3f9e8622209e83"} err="failed to get container status \"fa254708ee8a79a3bfd88826f96329f8ac1cf29e2e888ac01a3f9e8622209e83\": rpc error: code = NotFound desc = could not find container \"fa254708ee8a79a3bfd88826f96329f8ac1cf29e2e888ac01a3f9e8622209e83\": container with ID starting with fa254708ee8a79a3bfd88826f96329f8ac1cf29e2e888ac01a3f9e8622209e83 not found: ID does not exist" Oct 07 13:56:50 crc kubenswrapper[4854]: I1007 13:56:50.738691 4854 scope.go:117] "RemoveContainer" containerID="b159c90767714963fda23a987255664608495a9eed9476874b99e3ec1b07ab0f" Oct 07 13:56:50 crc kubenswrapper[4854]: E1007 13:56:50.739122 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b159c90767714963fda23a987255664608495a9eed9476874b99e3ec1b07ab0f\": container with ID starting with b159c90767714963fda23a987255664608495a9eed9476874b99e3ec1b07ab0f not found: ID does not exist" containerID="b159c90767714963fda23a987255664608495a9eed9476874b99e3ec1b07ab0f" Oct 07 13:56:50 crc kubenswrapper[4854]: I1007 13:56:50.739180 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b159c90767714963fda23a987255664608495a9eed9476874b99e3ec1b07ab0f"} err="failed to get container status \"b159c90767714963fda23a987255664608495a9eed9476874b99e3ec1b07ab0f\": rpc error: code = NotFound desc = could not find container \"b159c90767714963fda23a987255664608495a9eed9476874b99e3ec1b07ab0f\": container with ID starting with b159c90767714963fda23a987255664608495a9eed9476874b99e3ec1b07ab0f not found: ID does not exist" Oct 07 13:56:50 crc kubenswrapper[4854]: I1007 13:56:50.739208 4854 scope.go:117] "RemoveContainer" containerID="bea7cc1b206749f19056ee574d628b6274435693183e638f4a0b3143d3174a42" Oct 07 13:56:50 crc kubenswrapper[4854]: E1007 13:56:50.739540 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bea7cc1b206749f19056ee574d628b6274435693183e638f4a0b3143d3174a42\": container with ID starting with bea7cc1b206749f19056ee574d628b6274435693183e638f4a0b3143d3174a42 not found: ID does not exist" containerID="bea7cc1b206749f19056ee574d628b6274435693183e638f4a0b3143d3174a42" Oct 07 13:56:50 crc kubenswrapper[4854]: I1007 13:56:50.739568 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bea7cc1b206749f19056ee574d628b6274435693183e638f4a0b3143d3174a42"} err="failed to get container status \"bea7cc1b206749f19056ee574d628b6274435693183e638f4a0b3143d3174a42\": rpc error: code = NotFound desc = could not find container \"bea7cc1b206749f19056ee574d628b6274435693183e638f4a0b3143d3174a42\": container with ID starting with bea7cc1b206749f19056ee574d628b6274435693183e638f4a0b3143d3174a42 not found: ID does not exist" Oct 07 13:56:52 crc kubenswrapper[4854]: I1007 13:56:52.712877 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3707344-a7d5-4678-afc0-ee83b239c6c8" path="/var/lib/kubelet/pods/a3707344-a7d5-4678-afc0-ee83b239c6c8/volumes" Oct 07 13:56:55 crc kubenswrapper[4854]: I1007 13:56:55.703538 4854 scope.go:117] "RemoveContainer" containerID="8f3095d8e8fcb8dde419577c13f52762f3d4a6c040f7266414571999ecd4046e" Oct 07 13:56:55 crc kubenswrapper[4854]: E1007 13:56:55.704225 4854 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:57:03 crc kubenswrapper[4854]: I1007 13:57:03.841241 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-bc8c5c6c6-8plcb" Oct 07 13:57:03 crc kubenswrapper[4854]: I1007 13:57:03.843217 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-bc8c5c6c6-8plcb" Oct 07 13:57:07 crc kubenswrapper[4854]: I1007 13:57:07.702415 4854 scope.go:117] "RemoveContainer" containerID="8f3095d8e8fcb8dde419577c13f52762f3d4a6c040f7266414571999ecd4046e" Oct 07 13:57:07 crc kubenswrapper[4854]: E1007 13:57:07.703013 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:57:18 crc kubenswrapper[4854]: I1007 13:57:18.704994 4854 scope.go:117] "RemoveContainer" containerID="8f3095d8e8fcb8dde419577c13f52762f3d4a6c040f7266414571999ecd4046e" Oct 07 13:57:18 crc kubenswrapper[4854]: E1007 13:57:18.706186 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:57:20 crc kubenswrapper[4854]: I1007 13:57:20.015980 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-g98jt"] Oct 07 13:57:20 crc kubenswrapper[4854]: E1007 13:57:20.016945 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9aa2f194-429f-467c-8802-273b1ee8b633" containerName="init" Oct 07 13:57:20 crc kubenswrapper[4854]: I1007 13:57:20.016972 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aa2f194-429f-467c-8802-273b1ee8b633" containerName="init" Oct 07 13:57:20 crc kubenswrapper[4854]: E1007 13:57:20.017045 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3707344-a7d5-4678-afc0-ee83b239c6c8" containerName="extract-content" Oct 07 13:57:20 crc kubenswrapper[4854]: I1007 13:57:20.017060 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3707344-a7d5-4678-afc0-ee83b239c6c8" containerName="extract-content" Oct 07 13:57:20 crc kubenswrapper[4854]: E1007 13:57:20.017118 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3707344-a7d5-4678-afc0-ee83b239c6c8" containerName="registry-server" Oct 07 13:57:20 crc kubenswrapper[4854]: I1007 13:57:20.017133 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3707344-a7d5-4678-afc0-ee83b239c6c8" containerName="registry-server" Oct 07 13:57:20 crc kubenswrapper[4854]: E1007 13:57:20.017199 4854 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9aa2f194-429f-467c-8802-273b1ee8b633" containerName="dnsmasq-dns" Oct 07 13:57:20 crc kubenswrapper[4854]: I1007 13:57:20.017213 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aa2f194-429f-467c-8802-273b1ee8b633" containerName="dnsmasq-dns" Oct 07 13:57:20 crc kubenswrapper[4854]: E1007 13:57:20.017289 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3707344-a7d5-4678-afc0-ee83b239c6c8" containerName="extract-utilities" Oct 07 13:57:20 crc kubenswrapper[4854]: I1007 13:57:20.017303 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3707344-a7d5-4678-afc0-ee83b239c6c8" containerName="extract-utilities" Oct 07 13:57:20 crc kubenswrapper[4854]: I1007 13:57:20.017850 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3707344-a7d5-4678-afc0-ee83b239c6c8" containerName="registry-server" Oct 07 13:57:20 crc kubenswrapper[4854]: I1007 13:57:20.017945 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="9aa2f194-429f-467c-8802-273b1ee8b633" containerName="dnsmasq-dns" Oct 07 13:57:20 crc kubenswrapper[4854]: I1007 13:57:20.022796 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-g98jt" Oct 07 13:57:20 crc kubenswrapper[4854]: I1007 13:57:20.043289 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g98jt"] Oct 07 13:57:20 crc kubenswrapper[4854]: I1007 13:57:20.167709 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhpkc\" (UniqueName: \"kubernetes.io/projected/42960866-e33a-47d2-901d-9c581683e7ab-kube-api-access-nhpkc\") pod \"certified-operators-g98jt\" (UID: \"42960866-e33a-47d2-901d-9c581683e7ab\") " pod="openshift-marketplace/certified-operators-g98jt" Oct 07 13:57:20 crc kubenswrapper[4854]: I1007 13:57:20.167777 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42960866-e33a-47d2-901d-9c581683e7ab-utilities\") pod \"certified-operators-g98jt\" (UID: \"42960866-e33a-47d2-901d-9c581683e7ab\") " pod="openshift-marketplace/certified-operators-g98jt" Oct 07 13:57:20 crc kubenswrapper[4854]: I1007 13:57:20.167854 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42960866-e33a-47d2-901d-9c581683e7ab-catalog-content\") pod \"certified-operators-g98jt\" (UID: \"42960866-e33a-47d2-901d-9c581683e7ab\") " pod="openshift-marketplace/certified-operators-g98jt" Oct 07 13:57:20 crc kubenswrapper[4854]: I1007 13:57:20.269374 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42960866-e33a-47d2-901d-9c581683e7ab-catalog-content\") pod \"certified-operators-g98jt\" (UID: \"42960866-e33a-47d2-901d-9c581683e7ab\") " pod="openshift-marketplace/certified-operators-g98jt" Oct 07 13:57:20 crc kubenswrapper[4854]: I1007 13:57:20.269743 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhpkc\" (UniqueName: \"kubernetes.io/projected/42960866-e33a-47d2-901d-9c581683e7ab-kube-api-access-nhpkc\") pod \"certified-operators-g98jt\" (UID: \"42960866-e33a-47d2-901d-9c581683e7ab\") " pod="openshift-marketplace/certified-operators-g98jt" Oct 07 13:57:20 crc kubenswrapper[4854]: I1007 13:57:20.269773 4854 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42960866-e33a-47d2-901d-9c581683e7ab-utilities\") pod \"certified-operators-g98jt\" (UID: \"42960866-e33a-47d2-901d-9c581683e7ab\") " pod="openshift-marketplace/certified-operators-g98jt" Oct 07 13:57:20 crc kubenswrapper[4854]: I1007 13:57:20.269933 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42960866-e33a-47d2-901d-9c581683e7ab-catalog-content\") pod \"certified-operators-g98jt\" (UID: \"42960866-e33a-47d2-901d-9c581683e7ab\") " pod="openshift-marketplace/certified-operators-g98jt" Oct 07 13:57:20 crc kubenswrapper[4854]: I1007 13:57:20.270119 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42960866-e33a-47d2-901d-9c581683e7ab-utilities\") pod \"certified-operators-g98jt\" (UID: \"42960866-e33a-47d2-901d-9c581683e7ab\") " pod="openshift-marketplace/certified-operators-g98jt" Oct 07 13:57:20 crc kubenswrapper[4854]: I1007 13:57:20.292387 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhpkc\" (UniqueName: \"kubernetes.io/projected/42960866-e33a-47d2-901d-9c581683e7ab-kube-api-access-nhpkc\") pod \"certified-operators-g98jt\" (UID: \"42960866-e33a-47d2-901d-9c581683e7ab\") " pod="openshift-marketplace/certified-operators-g98jt" Oct 07 13:57:20 crc kubenswrapper[4854]: I1007 13:57:20.365652 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-g98jt" Oct 07 13:57:20 crc kubenswrapper[4854]: I1007 13:57:20.856336 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g98jt"] Oct 07 13:57:20 crc kubenswrapper[4854]: I1007 13:57:20.947510 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g98jt" event={"ID":"42960866-e33a-47d2-901d-9c581683e7ab","Type":"ContainerStarted","Data":"dfcd968ea644409f9c350b6df80d76196a461e124f672589e4a49a74f0c6e93b"} Oct 07 13:57:21 crc kubenswrapper[4854]: I1007 13:57:21.957763 4854 generic.go:334] "Generic (PLEG): container finished" podID="42960866-e33a-47d2-901d-9c581683e7ab" containerID="19e9c42289c2e86dc097ba1c6cf4a00e18ac7add26f5e57066a1a775dac9ae67" exitCode=0 Oct 07 13:57:21 crc kubenswrapper[4854]: I1007 13:57:21.958066 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g98jt" event={"ID":"42960866-e33a-47d2-901d-9c581683e7ab","Type":"ContainerDied","Data":"19e9c42289c2e86dc097ba1c6cf4a00e18ac7add26f5e57066a1a775dac9ae67"} Oct 07 13:57:22 crc kubenswrapper[4854]: I1007 13:57:22.149164 4854 scope.go:117] "RemoveContainer" containerID="32a0afc3d512ac404f6dbc29f85124ed93f3712c65f6763cc70eeb8a10f831bd" Oct 07 13:57:22 crc kubenswrapper[4854]: I1007 13:57:22.970918 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g98jt" event={"ID":"42960866-e33a-47d2-901d-9c581683e7ab","Type":"ContainerStarted","Data":"d71b6e1cea196ad1aaea024264b6f4e097fdca400d19fedf56e1bdade2e8db42"} Oct 07 13:57:23 crc kubenswrapper[4854]: I1007 13:57:23.982496 4854 generic.go:334] "Generic (PLEG): container finished" podID="42960866-e33a-47d2-901d-9c581683e7ab" containerID="d71b6e1cea196ad1aaea024264b6f4e097fdca400d19fedf56e1bdade2e8db42" exitCode=0 Oct 07 13:57:23 crc kubenswrapper[4854]: 
I1007 13:57:23.982548 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g98jt" event={"ID":"42960866-e33a-47d2-901d-9c581683e7ab","Type":"ContainerDied","Data":"d71b6e1cea196ad1aaea024264b6f4e097fdca400d19fedf56e1bdade2e8db42"} Oct 07 13:57:24 crc kubenswrapper[4854]: I1007 13:57:24.995206 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g98jt" event={"ID":"42960866-e33a-47d2-901d-9c581683e7ab","Type":"ContainerStarted","Data":"146a7c594c57c96699d12df482fc0fe5d4702c5c1dd8558f1fad84ad5d7da37a"} Oct 07 13:57:25 crc kubenswrapper[4854]: I1007 13:57:25.016708 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-g98jt" podStartSLOduration=3.508086452 podStartE2EDuration="6.016688877s" podCreationTimestamp="2025-10-07 13:57:19 +0000 UTC" firstStartedPulling="2025-10-07 13:57:21.959774121 +0000 UTC m=+5557.947606366" lastFinishedPulling="2025-10-07 13:57:24.468376536 +0000 UTC m=+5560.456208791" observedRunningTime="2025-10-07 13:57:25.010216012 +0000 UTC m=+5560.998048307" watchObservedRunningTime="2025-10-07 13:57:25.016688877 +0000 UTC m=+5561.004521142" Oct 07 13:57:26 crc kubenswrapper[4854]: I1007 13:57:26.183123 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-x4q2z"] Oct 07 13:57:26 crc kubenswrapper[4854]: I1007 13:57:26.185665 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-x4q2z" Oct 07 13:57:26 crc kubenswrapper[4854]: I1007 13:57:26.199681 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-x4q2z"] Oct 07 13:57:26 crc kubenswrapper[4854]: I1007 13:57:26.255285 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-ffkq6"] Oct 07 13:57:26 crc kubenswrapper[4854]: I1007 13:57:26.256644 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-ffkq6" Oct 07 13:57:26 crc kubenswrapper[4854]: I1007 13:57:26.262233 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-ffkq6"] Oct 07 13:57:26 crc kubenswrapper[4854]: I1007 13:57:26.296541 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xb2rb\" (UniqueName: \"kubernetes.io/projected/14baabea-8903-4587-bb73-3183643a716f-kube-api-access-xb2rb\") pod \"nova-api-db-create-x4q2z\" (UID: \"14baabea-8903-4587-bb73-3183643a716f\") " pod="openstack/nova-api-db-create-x4q2z" Oct 07 13:57:26 crc kubenswrapper[4854]: I1007 13:57:26.354965 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-chs97"] Oct 07 13:57:26 crc kubenswrapper[4854]: I1007 13:57:26.356248 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-chs97" Oct 07 13:57:26 crc kubenswrapper[4854]: I1007 13:57:26.366331 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-chs97"] Oct 07 13:57:26 crc kubenswrapper[4854]: I1007 13:57:26.398351 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v67fc\" (UniqueName: \"kubernetes.io/projected/b39e71d1-d571-4a66-ac83-d4f3a2816fda-kube-api-access-v67fc\") pod \"nova-cell0-db-create-ffkq6\" (UID: \"b39e71d1-d571-4a66-ac83-d4f3a2816fda\") " pod="openstack/nova-cell0-db-create-ffkq6" Oct 07 13:57:26 crc kubenswrapper[4854]: I1007 13:57:26.398408 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xb2rb\" (UniqueName: \"kubernetes.io/projected/14baabea-8903-4587-bb73-3183643a716f-kube-api-access-xb2rb\") pod \"nova-api-db-create-x4q2z\" (UID: \"14baabea-8903-4587-bb73-3183643a716f\") " pod="openstack/nova-api-db-create-x4q2z" Oct 07 13:57:26 crc kubenswrapper[4854]: I1007 13:57:26.420358 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xb2rb\" (UniqueName: \"kubernetes.io/projected/14baabea-8903-4587-bb73-3183643a716f-kube-api-access-xb2rb\") pod \"nova-api-db-create-x4q2z\" (UID: \"14baabea-8903-4587-bb73-3183643a716f\") " pod="openstack/nova-api-db-create-x4q2z" Oct 07 13:57:26 crc kubenswrapper[4854]: I1007 13:57:26.499581 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v67fc\" (UniqueName: \"kubernetes.io/projected/b39e71d1-d571-4a66-ac83-d4f3a2816fda-kube-api-access-v67fc\") pod \"nova-cell0-db-create-ffkq6\" (UID: \"b39e71d1-d571-4a66-ac83-d4f3a2816fda\") " pod="openstack/nova-cell0-db-create-ffkq6" Oct 07 13:57:26 crc kubenswrapper[4854]: I1007 13:57:26.499764 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mzv6\" (UniqueName: \"kubernetes.io/projected/5491e135-8a4d-4f3b-b914-d837087b3826-kube-api-access-2mzv6\") pod \"nova-cell1-db-create-chs97\" (UID: \"5491e135-8a4d-4f3b-b914-d837087b3826\") " pod="openstack/nova-cell1-db-create-chs97" Oct 07 13:57:26 crc kubenswrapper[4854]: I1007 13:57:26.506831 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-x4q2z" Oct 07 13:57:26 crc kubenswrapper[4854]: I1007 13:57:26.519901 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v67fc\" (UniqueName: \"kubernetes.io/projected/b39e71d1-d571-4a66-ac83-d4f3a2816fda-kube-api-access-v67fc\") pod \"nova-cell0-db-create-ffkq6\" (UID: \"b39e71d1-d571-4a66-ac83-d4f3a2816fda\") " pod="openstack/nova-cell0-db-create-ffkq6" Oct 07 13:57:26 crc kubenswrapper[4854]: I1007 13:57:26.584297 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-ffkq6" Oct 07 13:57:26 crc kubenswrapper[4854]: I1007 13:57:26.602438 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mzv6\" (UniqueName: \"kubernetes.io/projected/5491e135-8a4d-4f3b-b914-d837087b3826-kube-api-access-2mzv6\") pod \"nova-cell1-db-create-chs97\" (UID: \"5491e135-8a4d-4f3b-b914-d837087b3826\") " pod="openstack/nova-cell1-db-create-chs97" Oct 07 13:57:26 crc kubenswrapper[4854]: I1007 13:57:26.619790 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mzv6\" (UniqueName: \"kubernetes.io/projected/5491e135-8a4d-4f3b-b914-d837087b3826-kube-api-access-2mzv6\") pod \"nova-cell1-db-create-chs97\" (UID: \"5491e135-8a4d-4f3b-b914-d837087b3826\") " pod="openstack/nova-cell1-db-create-chs97" Oct 07 13:57:26 crc kubenswrapper[4854]: I1007 13:57:26.690270 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-chs97" Oct 07 13:57:27 crc kubenswrapper[4854]: I1007 13:57:27.025215 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-x4q2z"] Oct 07 13:57:27 crc kubenswrapper[4854]: I1007 13:57:27.130798 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-ffkq6"] Oct 07 13:57:27 crc kubenswrapper[4854]: W1007 13:57:27.133502 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb39e71d1_d571_4a66_ac83_d4f3a2816fda.slice/crio-3be2f651ed96a2c0e6921ecdd7edb4919ac854eb7f271f2a3e6fe051ccd284bf WatchSource:0}: Error finding container 3be2f651ed96a2c0e6921ecdd7edb4919ac854eb7f271f2a3e6fe051ccd284bf: Status 404 returned error can't find the container with id 3be2f651ed96a2c0e6921ecdd7edb4919ac854eb7f271f2a3e6fe051ccd284bf Oct 07 13:57:27 crc kubenswrapper[4854]: I1007 13:57:27.194954 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-chs97"] Oct 07 13:57:27 crc kubenswrapper[4854]: W1007 13:57:27.199164 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5491e135_8a4d_4f3b_b914_d837087b3826.slice/crio-b5157f76863954b814c1037f2c44546c2e3c57fb94687dc19bc5f7fbf11d9b22 WatchSource:0}: Error finding container b5157f76863954b814c1037f2c44546c2e3c57fb94687dc19bc5f7fbf11d9b22: Status 404 returned error can't find the container with id b5157f76863954b814c1037f2c44546c2e3c57fb94687dc19bc5f7fbf11d9b22 Oct 07 13:57:28 crc kubenswrapper[4854]: I1007 13:57:28.026993 4854 generic.go:334] "Generic (PLEG): container finished" podID="14baabea-8903-4587-bb73-3183643a716f" containerID="2e783d4b8d5383605800e1c8e6ea4a86935e417c49a7dc3493ad122f4a0c171e" exitCode=0 Oct 07 13:57:28 crc kubenswrapper[4854]: I1007 13:57:28.027119 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-x4q2z" event={"ID":"14baabea-8903-4587-bb73-3183643a716f","Type":"ContainerDied","Data":"2e783d4b8d5383605800e1c8e6ea4a86935e417c49a7dc3493ad122f4a0c171e"} Oct 07 13:57:28 crc kubenswrapper[4854]: I1007 13:57:28.027292 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-x4q2z" event={"ID":"14baabea-8903-4587-bb73-3183643a716f","Type":"ContainerStarted","Data":"6ab2f3a9b8ccf15c2e51b9cd52b628b8678bc199caa039b0336969e473935a55"} Oct 07 13:57:28 crc kubenswrapper[4854]: I1007 13:57:28.030126 4854 
generic.go:334] "Generic (PLEG): container finished" podID="b39e71d1-d571-4a66-ac83-d4f3a2816fda" containerID="7e9fae7d0ed33c260a17f3617aaaf076c4863385974f314669ae18e9c647b54c" exitCode=0 Oct 07 13:57:28 crc kubenswrapper[4854]: I1007 13:57:28.030190 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-ffkq6" event={"ID":"b39e71d1-d571-4a66-ac83-d4f3a2816fda","Type":"ContainerDied","Data":"7e9fae7d0ed33c260a17f3617aaaf076c4863385974f314669ae18e9c647b54c"} Oct 07 13:57:28 crc kubenswrapper[4854]: I1007 13:57:28.030208 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-ffkq6" event={"ID":"b39e71d1-d571-4a66-ac83-d4f3a2816fda","Type":"ContainerStarted","Data":"3be2f651ed96a2c0e6921ecdd7edb4919ac854eb7f271f2a3e6fe051ccd284bf"} Oct 07 13:57:28 crc kubenswrapper[4854]: I1007 13:57:28.032415 4854 generic.go:334] "Generic (PLEG): container finished" podID="5491e135-8a4d-4f3b-b914-d837087b3826" containerID="b4ed2e1c5d94007508f04b23ef0de26b29d3ba95195164bbf76397bb026525a0" exitCode=0 Oct 07 13:57:28 crc kubenswrapper[4854]: I1007 13:57:28.032438 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-chs97" event={"ID":"5491e135-8a4d-4f3b-b914-d837087b3826","Type":"ContainerDied","Data":"b4ed2e1c5d94007508f04b23ef0de26b29d3ba95195164bbf76397bb026525a0"} Oct 07 13:57:28 crc kubenswrapper[4854]: I1007 13:57:28.032453 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-chs97" event={"ID":"5491e135-8a4d-4f3b-b914-d837087b3826","Type":"ContainerStarted","Data":"b5157f76863954b814c1037f2c44546c2e3c57fb94687dc19bc5f7fbf11d9b22"} Oct 07 13:57:29 crc kubenswrapper[4854]: I1007 13:57:29.473674 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-chs97" Oct 07 13:57:29 crc kubenswrapper[4854]: I1007 13:57:29.479717 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-x4q2z" Oct 07 13:57:29 crc kubenswrapper[4854]: I1007 13:57:29.488906 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-ffkq6" Oct 07 13:57:29 crc kubenswrapper[4854]: I1007 13:57:29.572306 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mzv6\" (UniqueName: \"kubernetes.io/projected/5491e135-8a4d-4f3b-b914-d837087b3826-kube-api-access-2mzv6\") pod \"5491e135-8a4d-4f3b-b914-d837087b3826\" (UID: \"5491e135-8a4d-4f3b-b914-d837087b3826\") " Oct 07 13:57:29 crc kubenswrapper[4854]: I1007 13:57:29.578088 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5491e135-8a4d-4f3b-b914-d837087b3826-kube-api-access-2mzv6" (OuterVolumeSpecName: "kube-api-access-2mzv6") pod "5491e135-8a4d-4f3b-b914-d837087b3826" (UID: "5491e135-8a4d-4f3b-b914-d837087b3826"). InnerVolumeSpecName "kube-api-access-2mzv6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:57:29 crc kubenswrapper[4854]: I1007 13:57:29.674302 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xb2rb\" (UniqueName: \"kubernetes.io/projected/14baabea-8903-4587-bb73-3183643a716f-kube-api-access-xb2rb\") pod \"14baabea-8903-4587-bb73-3183643a716f\" (UID: \"14baabea-8903-4587-bb73-3183643a716f\") " Oct 07 13:57:29 crc kubenswrapper[4854]: I1007 13:57:29.674388 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v67fc\" (UniqueName: \"kubernetes.io/projected/b39e71d1-d571-4a66-ac83-d4f3a2816fda-kube-api-access-v67fc\") pod \"b39e71d1-d571-4a66-ac83-d4f3a2816fda\" (UID: \"b39e71d1-d571-4a66-ac83-d4f3a2816fda\") " Oct 07 13:57:29 crc kubenswrapper[4854]: I1007 13:57:29.674919 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mzv6\" (UniqueName: \"kubernetes.io/projected/5491e135-8a4d-4f3b-b914-d837087b3826-kube-api-access-2mzv6\") on node \"crc\" DevicePath \"\"" Oct 07 13:57:29 crc kubenswrapper[4854]: I1007 13:57:29.677880 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14baabea-8903-4587-bb73-3183643a716f-kube-api-access-xb2rb" (OuterVolumeSpecName: "kube-api-access-xb2rb") pod "14baabea-8903-4587-bb73-3183643a716f" (UID: "14baabea-8903-4587-bb73-3183643a716f"). InnerVolumeSpecName "kube-api-access-xb2rb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:57:29 crc kubenswrapper[4854]: I1007 13:57:29.678383 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b39e71d1-d571-4a66-ac83-d4f3a2816fda-kube-api-access-v67fc" (OuterVolumeSpecName: "kube-api-access-v67fc") pod "b39e71d1-d571-4a66-ac83-d4f3a2816fda" (UID: "b39e71d1-d571-4a66-ac83-d4f3a2816fda"). InnerVolumeSpecName "kube-api-access-v67fc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:57:29 crc kubenswrapper[4854]: I1007 13:57:29.777000 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xb2rb\" (UniqueName: \"kubernetes.io/projected/14baabea-8903-4587-bb73-3183643a716f-kube-api-access-xb2rb\") on node \"crc\" DevicePath \"\"" Oct 07 13:57:29 crc kubenswrapper[4854]: I1007 13:57:29.777030 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v67fc\" (UniqueName: \"kubernetes.io/projected/b39e71d1-d571-4a66-ac83-d4f3a2816fda-kube-api-access-v67fc\") on node \"crc\" DevicePath \"\"" Oct 07 13:57:30 crc kubenswrapper[4854]: I1007 13:57:30.061948 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-x4q2z" Oct 07 13:57:30 crc kubenswrapper[4854]: I1007 13:57:30.062003 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-x4q2z" event={"ID":"14baabea-8903-4587-bb73-3183643a716f","Type":"ContainerDied","Data":"6ab2f3a9b8ccf15c2e51b9cd52b628b8678bc199caa039b0336969e473935a55"} Oct 07 13:57:30 crc kubenswrapper[4854]: I1007 13:57:30.062053 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ab2f3a9b8ccf15c2e51b9cd52b628b8678bc199caa039b0336969e473935a55" Oct 07 13:57:30 crc kubenswrapper[4854]: I1007 13:57:30.064917 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-ffkq6" Oct 07 13:57:30 crc kubenswrapper[4854]: I1007 13:57:30.065223 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-ffkq6" event={"ID":"b39e71d1-d571-4a66-ac83-d4f3a2816fda","Type":"ContainerDied","Data":"3be2f651ed96a2c0e6921ecdd7edb4919ac854eb7f271f2a3e6fe051ccd284bf"} Oct 07 13:57:30 crc kubenswrapper[4854]: I1007 13:57:30.065277 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3be2f651ed96a2c0e6921ecdd7edb4919ac854eb7f271f2a3e6fe051ccd284bf" Oct 07 13:57:30 crc kubenswrapper[4854]: I1007 13:57:30.067394 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-chs97" event={"ID":"5491e135-8a4d-4f3b-b914-d837087b3826","Type":"ContainerDied","Data":"b5157f76863954b814c1037f2c44546c2e3c57fb94687dc19bc5f7fbf11d9b22"} Oct 07 13:57:30 crc kubenswrapper[4854]: I1007 13:57:30.067438 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5157f76863954b814c1037f2c44546c2e3c57fb94687dc19bc5f7fbf11d9b22" Oct 07 13:57:30 crc kubenswrapper[4854]: I1007 13:57:30.067736 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-chs97" Oct 07 13:57:30 crc kubenswrapper[4854]: I1007 13:57:30.366454 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-g98jt" Oct 07 13:57:30 crc kubenswrapper[4854]: I1007 13:57:30.366527 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-g98jt" Oct 07 13:57:30 crc kubenswrapper[4854]: I1007 13:57:30.432889 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-g98jt" Oct 07 13:57:31 crc kubenswrapper[4854]: I1007 13:57:31.165097 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-g98jt" Oct 07 13:57:31 crc kubenswrapper[4854]: I1007 13:57:31.219916 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-g98jt"] Oct 07 13:57:32 crc kubenswrapper[4854]: I1007 13:57:32.702867 4854 scope.go:117] "RemoveContainer" containerID="8f3095d8e8fcb8dde419577c13f52762f3d4a6c040f7266414571999ecd4046e" Oct 07 13:57:32 crc kubenswrapper[4854]: E1007 13:57:32.703560 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 13:57:33 crc kubenswrapper[4854]: I1007 13:57:33.099176 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-g98jt" podUID="42960866-e33a-47d2-901d-9c581683e7ab" containerName="registry-server" containerID="cri-o://146a7c594c57c96699d12df482fc0fe5d4702c5c1dd8558f1fad84ad5d7da37a" gracePeriod=2 Oct 07 13:57:33 crc kubenswrapper[4854]: I1007 13:57:33.582533 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-g98jt" Oct 07 13:57:33 crc kubenswrapper[4854]: I1007 13:57:33.758957 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42960866-e33a-47d2-901d-9c581683e7ab-utilities\") pod \"42960866-e33a-47d2-901d-9c581683e7ab\" (UID: \"42960866-e33a-47d2-901d-9c581683e7ab\") " Oct 07 13:57:33 crc kubenswrapper[4854]: I1007 13:57:33.759114 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhpkc\" (UniqueName: \"kubernetes.io/projected/42960866-e33a-47d2-901d-9c581683e7ab-kube-api-access-nhpkc\") pod \"42960866-e33a-47d2-901d-9c581683e7ab\" (UID: \"42960866-e33a-47d2-901d-9c581683e7ab\") " Oct 07 13:57:33 crc kubenswrapper[4854]: I1007 13:57:33.759231 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42960866-e33a-47d2-901d-9c581683e7ab-catalog-content\") pod \"42960866-e33a-47d2-901d-9c581683e7ab\" (UID: \"42960866-e33a-47d2-901d-9c581683e7ab\") " Oct 07 13:57:33 crc kubenswrapper[4854]: I1007 13:57:33.760001 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42960866-e33a-47d2-901d-9c581683e7ab-utilities" (OuterVolumeSpecName: "utilities") pod "42960866-e33a-47d2-901d-9c581683e7ab" (UID: "42960866-e33a-47d2-901d-9c581683e7ab"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:57:33 crc kubenswrapper[4854]: I1007 13:57:33.765477 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42960866-e33a-47d2-901d-9c581683e7ab-kube-api-access-nhpkc" (OuterVolumeSpecName: "kube-api-access-nhpkc") pod "42960866-e33a-47d2-901d-9c581683e7ab" (UID: "42960866-e33a-47d2-901d-9c581683e7ab"). InnerVolumeSpecName "kube-api-access-nhpkc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:57:33 crc kubenswrapper[4854]: I1007 13:57:33.811431 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42960866-e33a-47d2-901d-9c581683e7ab-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "42960866-e33a-47d2-901d-9c581683e7ab" (UID: "42960866-e33a-47d2-901d-9c581683e7ab"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:57:33 crc kubenswrapper[4854]: I1007 13:57:33.862182 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhpkc\" (UniqueName: \"kubernetes.io/projected/42960866-e33a-47d2-901d-9c581683e7ab-kube-api-access-nhpkc\") on node \"crc\" DevicePath \"\"" Oct 07 13:57:33 crc kubenswrapper[4854]: I1007 13:57:33.862228 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42960866-e33a-47d2-901d-9c581683e7ab-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:57:33 crc kubenswrapper[4854]: I1007 13:57:33.862240 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42960866-e33a-47d2-901d-9c581683e7ab-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:57:34 crc kubenswrapper[4854]: I1007 13:57:34.111042 4854 generic.go:334] "Generic (PLEG): container finished" podID="42960866-e33a-47d2-901d-9c581683e7ab" containerID="146a7c594c57c96699d12df482fc0fe5d4702c5c1dd8558f1fad84ad5d7da37a" exitCode=0 Oct 07 13:57:34 crc kubenswrapper[4854]: I1007 13:57:34.111176 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g98jt" event={"ID":"42960866-e33a-47d2-901d-9c581683e7ab","Type":"ContainerDied","Data":"146a7c594c57c96699d12df482fc0fe5d4702c5c1dd8558f1fad84ad5d7da37a"} Oct 07 13:57:34 crc kubenswrapper[4854]: I1007 13:57:34.111234 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g98jt" event={"ID":"42960866-e33a-47d2-901d-9c581683e7ab","Type":"ContainerDied","Data":"dfcd968ea644409f9c350b6df80d76196a461e124f672589e4a49a74f0c6e93b"} Oct 07 13:57:34 crc kubenswrapper[4854]: I1007 13:57:34.111255 4854 scope.go:117] "RemoveContainer" containerID="146a7c594c57c96699d12df482fc0fe5d4702c5c1dd8558f1fad84ad5d7da37a" Oct 07 13:57:34 crc kubenswrapper[4854]: I1007 13:57:34.111252 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-g98jt" Oct 07 13:57:34 crc kubenswrapper[4854]: I1007 13:57:34.162501 4854 scope.go:117] "RemoveContainer" containerID="d71b6e1cea196ad1aaea024264b6f4e097fdca400d19fedf56e1bdade2e8db42" Oct 07 13:57:34 crc kubenswrapper[4854]: I1007 13:57:34.165338 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-g98jt"] Oct 07 13:57:34 crc kubenswrapper[4854]: I1007 13:57:34.174583 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-g98jt"] Oct 07 13:57:34 crc kubenswrapper[4854]: I1007 13:57:34.189261 4854 scope.go:117] "RemoveContainer" containerID="19e9c42289c2e86dc097ba1c6cf4a00e18ac7add26f5e57066a1a775dac9ae67" Oct 07 13:57:34 crc kubenswrapper[4854]: I1007 13:57:34.231374 4854 scope.go:117] "RemoveContainer" containerID="146a7c594c57c96699d12df482fc0fe5d4702c5c1dd8558f1fad84ad5d7da37a" Oct 07 13:57:34 crc kubenswrapper[4854]: E1007 13:57:34.231888 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"146a7c594c57c96699d12df482fc0fe5d4702c5c1dd8558f1fad84ad5d7da37a\": container with ID starting with 146a7c594c57c96699d12df482fc0fe5d4702c5c1dd8558f1fad84ad5d7da37a not found: ID does not exist" containerID="146a7c594c57c96699d12df482fc0fe5d4702c5c1dd8558f1fad84ad5d7da37a" Oct 07 13:57:34 crc kubenswrapper[4854]: I1007 13:57:34.231966 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"146a7c594c57c96699d12df482fc0fe5d4702c5c1dd8558f1fad84ad5d7da37a"} err="failed to get container status \"146a7c594c57c96699d12df482fc0fe5d4702c5c1dd8558f1fad84ad5d7da37a\": rpc error: code = NotFound desc = could not find container \"146a7c594c57c96699d12df482fc0fe5d4702c5c1dd8558f1fad84ad5d7da37a\": container with ID starting with 146a7c594c57c96699d12df482fc0fe5d4702c5c1dd8558f1fad84ad5d7da37a not found: ID does not exist" Oct 07 13:57:34 crc kubenswrapper[4854]: I1007 13:57:34.232000 4854 scope.go:117] "RemoveContainer" containerID="d71b6e1cea196ad1aaea024264b6f4e097fdca400d19fedf56e1bdade2e8db42" Oct 07 13:57:34 crc kubenswrapper[4854]: E1007 13:57:34.232470 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d71b6e1cea196ad1aaea024264b6f4e097fdca400d19fedf56e1bdade2e8db42\": container with ID starting with d71b6e1cea196ad1aaea024264b6f4e097fdca400d19fedf56e1bdade2e8db42 not found: ID does not exist" containerID="d71b6e1cea196ad1aaea024264b6f4e097fdca400d19fedf56e1bdade2e8db42" Oct 07 13:57:34 crc kubenswrapper[4854]: I1007 13:57:34.232522 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d71b6e1cea196ad1aaea024264b6f4e097fdca400d19fedf56e1bdade2e8db42"} err="failed to get container status \"d71b6e1cea196ad1aaea024264b6f4e097fdca400d19fedf56e1bdade2e8db42\": rpc error: code = NotFound desc = could not find container \"d71b6e1cea196ad1aaea024264b6f4e097fdca400d19fedf56e1bdade2e8db42\": container with ID starting with d71b6e1cea196ad1aaea024264b6f4e097fdca400d19fedf56e1bdade2e8db42 not found: ID does not exist" Oct 07 13:57:34 crc kubenswrapper[4854]: I1007 13:57:34.232553 4854 scope.go:117] "RemoveContainer" containerID="19e9c42289c2e86dc097ba1c6cf4a00e18ac7add26f5e57066a1a775dac9ae67" Oct 07 13:57:34 crc kubenswrapper[4854]: E1007 13:57:34.233587 4854 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"19e9c42289c2e86dc097ba1c6cf4a00e18ac7add26f5e57066a1a775dac9ae67\": container with ID starting with 19e9c42289c2e86dc097ba1c6cf4a00e18ac7add26f5e57066a1a775dac9ae67 not found: ID does not exist" containerID="19e9c42289c2e86dc097ba1c6cf4a00e18ac7add26f5e57066a1a775dac9ae67" Oct 07 13:57:34 crc kubenswrapper[4854]: I1007 13:57:34.233618 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19e9c42289c2e86dc097ba1c6cf4a00e18ac7add26f5e57066a1a775dac9ae67"} err="failed to get container status \"19e9c42289c2e86dc097ba1c6cf4a00e18ac7add26f5e57066a1a775dac9ae67\": rpc error: code = NotFound desc = could not find container \"19e9c42289c2e86dc097ba1c6cf4a00e18ac7add26f5e57066a1a775dac9ae67\": container with ID starting with 19e9c42289c2e86dc097ba1c6cf4a00e18ac7add26f5e57066a1a775dac9ae67 not found: ID does not exist" Oct 07 13:57:34 crc kubenswrapper[4854]: I1007 13:57:34.724597 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42960866-e33a-47d2-901d-9c581683e7ab" path="/var/lib/kubelet/pods/42960866-e33a-47d2-901d-9c581683e7ab/volumes" Oct 07 13:57:36 crc kubenswrapper[4854]: I1007 13:57:36.421576 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-02b4-account-create-ws7zm"] Oct 07 13:57:36 crc kubenswrapper[4854]: E1007 13:57:36.422224 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b39e71d1-d571-4a66-ac83-d4f3a2816fda" containerName="mariadb-database-create" Oct 07 13:57:36 crc kubenswrapper[4854]: I1007 13:57:36.422236 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="b39e71d1-d571-4a66-ac83-d4f3a2816fda" containerName="mariadb-database-create" Oct 07 13:57:36 crc kubenswrapper[4854]: E1007 13:57:36.422253 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42960866-e33a-47d2-901d-9c581683e7ab" containerName="registry-server" Oct 07 13:57:36 crc kubenswrapper[4854]: I1007 13:57:36.422259 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="42960866-e33a-47d2-901d-9c581683e7ab" containerName="registry-server" Oct 07 13:57:36 crc kubenswrapper[4854]: E1007 13:57:36.422277 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5491e135-8a4d-4f3b-b914-d837087b3826" containerName="mariadb-database-create" Oct 07 13:57:36 crc kubenswrapper[4854]: I1007 13:57:36.422283 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="5491e135-8a4d-4f3b-b914-d837087b3826" containerName="mariadb-database-create" Oct 07 13:57:36 crc kubenswrapper[4854]: E1007 13:57:36.422291 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42960866-e33a-47d2-901d-9c581683e7ab" containerName="extract-content" Oct 07 13:57:36 crc kubenswrapper[4854]: I1007 13:57:36.422297 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="42960866-e33a-47d2-901d-9c581683e7ab" containerName="extract-content" Oct 07 13:57:36 crc kubenswrapper[4854]: E1007 13:57:36.422330 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14baabea-8903-4587-bb73-3183643a716f" containerName="mariadb-database-create" Oct 07 13:57:36 crc kubenswrapper[4854]: I1007 13:57:36.422339 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="14baabea-8903-4587-bb73-3183643a716f" containerName="mariadb-database-create" Oct 07 13:57:36 crc kubenswrapper[4854]: E1007 13:57:36.422350 4854 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="42960866-e33a-47d2-901d-9c581683e7ab" containerName="extract-utilities" Oct 07 13:57:36 crc kubenswrapper[4854]: I1007 13:57:36.422357 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="42960866-e33a-47d2-901d-9c581683e7ab" containerName="extract-utilities" Oct 07 13:57:36 crc kubenswrapper[4854]: I1007 13:57:36.422546 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="14baabea-8903-4587-bb73-3183643a716f" containerName="mariadb-database-create" Oct 07 13:57:36 crc kubenswrapper[4854]: I1007 13:57:36.422558 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="b39e71d1-d571-4a66-ac83-d4f3a2816fda" containerName="mariadb-database-create" Oct 07 13:57:36 crc kubenswrapper[4854]: I1007 13:57:36.422570 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="42960866-e33a-47d2-901d-9c581683e7ab" containerName="registry-server" Oct 07 13:57:36 crc kubenswrapper[4854]: I1007 13:57:36.422589 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="5491e135-8a4d-4f3b-b914-d837087b3826" containerName="mariadb-database-create" Oct 07 13:57:36 crc kubenswrapper[4854]: I1007 13:57:36.423296 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-02b4-account-create-ws7zm" Oct 07 13:57:36 crc kubenswrapper[4854]: I1007 13:57:36.425632 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Oct 07 13:57:36 crc kubenswrapper[4854]: I1007 13:57:36.439840 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-02b4-account-create-ws7zm"] Oct 07 13:57:36 crc kubenswrapper[4854]: I1007 13:57:36.604245 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-65a7-account-create-88qrr"] Oct 07 13:57:36 crc kubenswrapper[4854]: I1007 13:57:36.613106 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-65a7-account-create-88qrr" Oct 07 13:57:36 crc kubenswrapper[4854]: I1007 13:57:36.617302 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Oct 07 13:57:36 crc kubenswrapper[4854]: I1007 13:57:36.618972 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tpnk\" (UniqueName: \"kubernetes.io/projected/29949f4f-fe34-418c-9dc7-4468eb0749d4-kube-api-access-9tpnk\") pod \"nova-api-02b4-account-create-ws7zm\" (UID: \"29949f4f-fe34-418c-9dc7-4468eb0749d4\") " pod="openstack/nova-api-02b4-account-create-ws7zm" Oct 07 13:57:36 crc kubenswrapper[4854]: I1007 13:57:36.619802 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-65a7-account-create-88qrr"] Oct 07 13:57:36 crc kubenswrapper[4854]: I1007 13:57:36.721725 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmcrl\" (UniqueName: \"kubernetes.io/projected/fea85a90-a128-4e5a-b27c-b2cb1fb901d3-kube-api-access-xmcrl\") pod \"nova-cell0-65a7-account-create-88qrr\" (UID: \"fea85a90-a128-4e5a-b27c-b2cb1fb901d3\") " pod="openstack/nova-cell0-65a7-account-create-88qrr" Oct 07 13:57:36 crc kubenswrapper[4854]: I1007 13:57:36.721809 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tpnk\" (UniqueName: \"kubernetes.io/projected/29949f4f-fe34-418c-9dc7-4468eb0749d4-kube-api-access-9tpnk\") pod \"nova-api-02b4-account-create-ws7zm\" (UID: \"29949f4f-fe34-418c-9dc7-4468eb0749d4\") " pod="openstack/nova-api-02b4-account-create-ws7zm" Oct 07 13:57:36 crc kubenswrapper[4854]: I1007 13:57:36.751672 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tpnk\" (UniqueName: \"kubernetes.io/projected/29949f4f-fe34-418c-9dc7-4468eb0749d4-kube-api-access-9tpnk\") pod \"nova-api-02b4-account-create-ws7zm\" (UID: \"29949f4f-fe34-418c-9dc7-4468eb0749d4\") " pod="openstack/nova-api-02b4-account-create-ws7zm" Oct 07 13:57:36 crc kubenswrapper[4854]: I1007 13:57:36.759386 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-02b4-account-create-ws7zm" Oct 07 13:57:36 crc kubenswrapper[4854]: I1007 13:57:36.801016 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-c7d6-account-create-w5tkr"] Oct 07 13:57:36 crc kubenswrapper[4854]: I1007 13:57:36.802765 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-c7d6-account-create-w5tkr" Oct 07 13:57:36 crc kubenswrapper[4854]: I1007 13:57:36.805736 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Oct 07 13:57:36 crc kubenswrapper[4854]: I1007 13:57:36.807661 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-c7d6-account-create-w5tkr"] Oct 07 13:57:36 crc kubenswrapper[4854]: I1007 13:57:36.823459 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmcrl\" (UniqueName: \"kubernetes.io/projected/fea85a90-a128-4e5a-b27c-b2cb1fb901d3-kube-api-access-xmcrl\") pod \"nova-cell0-65a7-account-create-88qrr\" (UID: \"fea85a90-a128-4e5a-b27c-b2cb1fb901d3\") " pod="openstack/nova-cell0-65a7-account-create-88qrr" Oct 07 13:57:36 crc kubenswrapper[4854]: I1007 13:57:36.840549 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmcrl\" (UniqueName: \"kubernetes.io/projected/fea85a90-a128-4e5a-b27c-b2cb1fb901d3-kube-api-access-xmcrl\") pod \"nova-cell0-65a7-account-create-88qrr\" (UID: \"fea85a90-a128-4e5a-b27c-b2cb1fb901d3\") " pod="openstack/nova-cell0-65a7-account-create-88qrr" Oct 07 13:57:36 crc kubenswrapper[4854]: I1007 13:57:36.928190 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5jsm\" (UniqueName: \"kubernetes.io/projected/a445c23d-afd6-466d-8bc4-b47492fab261-kube-api-access-n5jsm\") pod \"nova-cell1-c7d6-account-create-w5tkr\" (UID: \"a445c23d-afd6-466d-8bc4-b47492fab261\") " pod="openstack/nova-cell1-c7d6-account-create-w5tkr" Oct 07 13:57:36 crc kubenswrapper[4854]: I1007 13:57:36.930480 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-65a7-account-create-88qrr" Oct 07 13:57:37 crc kubenswrapper[4854]: I1007 13:57:37.029613 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5jsm\" (UniqueName: \"kubernetes.io/projected/a445c23d-afd6-466d-8bc4-b47492fab261-kube-api-access-n5jsm\") pod \"nova-cell1-c7d6-account-create-w5tkr\" (UID: \"a445c23d-afd6-466d-8bc4-b47492fab261\") " pod="openstack/nova-cell1-c7d6-account-create-w5tkr" Oct 07 13:57:37 crc kubenswrapper[4854]: I1007 13:57:37.051004 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5jsm\" (UniqueName: \"kubernetes.io/projected/a445c23d-afd6-466d-8bc4-b47492fab261-kube-api-access-n5jsm\") pod \"nova-cell1-c7d6-account-create-w5tkr\" (UID: \"a445c23d-afd6-466d-8bc4-b47492fab261\") " pod="openstack/nova-cell1-c7d6-account-create-w5tkr" Oct 07 13:57:37 crc kubenswrapper[4854]: I1007 13:57:37.211648 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-c7d6-account-create-w5tkr" Oct 07 13:57:37 crc kubenswrapper[4854]: I1007 13:57:37.310075 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-02b4-account-create-ws7zm"] Oct 07 13:57:37 crc kubenswrapper[4854]: W1007 13:57:37.378502 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfea85a90_a128_4e5a_b27c_b2cb1fb901d3.slice/crio-3f3d5c93ddec1e741884777d405feb5b037f538f7712bd24591824958941b3b7 WatchSource:0}: Error finding container 3f3d5c93ddec1e741884777d405feb5b037f538f7712bd24591824958941b3b7: Status 404 returned error can't find the container with id 3f3d5c93ddec1e741884777d405feb5b037f538f7712bd24591824958941b3b7 Oct 07 13:57:37 crc kubenswrapper[4854]: I1007 13:57:37.385302 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-65a7-account-create-88qrr"] Oct 07 13:57:37 crc kubenswrapper[4854]: I1007 13:57:37.651782 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-c7d6-account-create-w5tkr"] Oct 07 13:57:37 crc kubenswrapper[4854]: W1007 13:57:37.697557 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda445c23d_afd6_466d_8bc4_b47492fab261.slice/crio-0cc0b6a7499621c631874b900444b269dc8a57e51eb6b059a5ba285ff104b0ce WatchSource:0}: Error finding container 0cc0b6a7499621c631874b900444b269dc8a57e51eb6b059a5ba285ff104b0ce: Status 404 returned error can't find the container with id 0cc0b6a7499621c631874b900444b269dc8a57e51eb6b059a5ba285ff104b0ce Oct 07 13:57:38 crc kubenswrapper[4854]: I1007 13:57:38.160818 4854 generic.go:334] "Generic (PLEG): container finished" podID="29949f4f-fe34-418c-9dc7-4468eb0749d4" containerID="97695d80808032506527452ea15f07d07327d3c990087dc5ec39d10f030b9c42" exitCode=0 Oct 07 13:57:38 crc kubenswrapper[4854]: I1007 13:57:38.160862 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-02b4-account-create-ws7zm" event={"ID":"29949f4f-fe34-418c-9dc7-4468eb0749d4","Type":"ContainerDied","Data":"97695d80808032506527452ea15f07d07327d3c990087dc5ec39d10f030b9c42"} Oct 07 13:57:38 crc kubenswrapper[4854]: I1007 13:57:38.160921 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-02b4-account-create-ws7zm" event={"ID":"29949f4f-fe34-418c-9dc7-4468eb0749d4","Type":"ContainerStarted","Data":"4a4f4ca763e89b072aa79740abc55076a834a6753e76e2963b1b88ae0ddfda5b"} Oct 07 13:57:38 crc kubenswrapper[4854]: I1007 13:57:38.166191 4854 generic.go:334] "Generic (PLEG): container finished" podID="a445c23d-afd6-466d-8bc4-b47492fab261" containerID="47a262d42eeb5b9f85dd35bcceb4918ee64bdf7a0d248afef269dc9c7999272b" exitCode=0 Oct 07 13:57:38 crc kubenswrapper[4854]: I1007 13:57:38.166254 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-c7d6-account-create-w5tkr" event={"ID":"a445c23d-afd6-466d-8bc4-b47492fab261","Type":"ContainerDied","Data":"47a262d42eeb5b9f85dd35bcceb4918ee64bdf7a0d248afef269dc9c7999272b"} Oct 07 13:57:38 crc kubenswrapper[4854]: I1007 13:57:38.166318 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-c7d6-account-create-w5tkr" event={"ID":"a445c23d-afd6-466d-8bc4-b47492fab261","Type":"ContainerStarted","Data":"0cc0b6a7499621c631874b900444b269dc8a57e51eb6b059a5ba285ff104b0ce"} Oct 07 13:57:38 crc kubenswrapper[4854]: I1007 13:57:38.168531 4854 generic.go:334] 
"Generic (PLEG): container finished" podID="fea85a90-a128-4e5a-b27c-b2cb1fb901d3" containerID="7cae3b88b9c7a9cc363911f17a67d7217527ff9f01af5c23ac170030384adae5" exitCode=0 Oct 07 13:57:38 crc kubenswrapper[4854]: I1007 13:57:38.168602 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-65a7-account-create-88qrr" event={"ID":"fea85a90-a128-4e5a-b27c-b2cb1fb901d3","Type":"ContainerDied","Data":"7cae3b88b9c7a9cc363911f17a67d7217527ff9f01af5c23ac170030384adae5"} Oct 07 13:57:38 crc kubenswrapper[4854]: I1007 13:57:38.168652 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-65a7-account-create-88qrr" event={"ID":"fea85a90-a128-4e5a-b27c-b2cb1fb901d3","Type":"ContainerStarted","Data":"3f3d5c93ddec1e741884777d405feb5b037f538f7712bd24591824958941b3b7"} Oct 07 13:57:39 crc kubenswrapper[4854]: I1007 13:57:39.585396 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-65a7-account-create-88qrr" Oct 07 13:57:39 crc kubenswrapper[4854]: I1007 13:57:39.594149 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-02b4-account-create-ws7zm" Oct 07 13:57:39 crc kubenswrapper[4854]: I1007 13:57:39.607550 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-c7d6-account-create-w5tkr" Oct 07 13:57:39 crc kubenswrapper[4854]: I1007 13:57:39.680759 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9tpnk\" (UniqueName: \"kubernetes.io/projected/29949f4f-fe34-418c-9dc7-4468eb0749d4-kube-api-access-9tpnk\") pod \"29949f4f-fe34-418c-9dc7-4468eb0749d4\" (UID: \"29949f4f-fe34-418c-9dc7-4468eb0749d4\") " Oct 07 13:57:39 crc kubenswrapper[4854]: I1007 13:57:39.680852 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmcrl\" (UniqueName: \"kubernetes.io/projected/fea85a90-a128-4e5a-b27c-b2cb1fb901d3-kube-api-access-xmcrl\") pod \"fea85a90-a128-4e5a-b27c-b2cb1fb901d3\" (UID: \"fea85a90-a128-4e5a-b27c-b2cb1fb901d3\") " Oct 07 13:57:39 crc kubenswrapper[4854]: I1007 13:57:39.692490 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fea85a90-a128-4e5a-b27c-b2cb1fb901d3-kube-api-access-xmcrl" (OuterVolumeSpecName: "kube-api-access-xmcrl") pod "fea85a90-a128-4e5a-b27c-b2cb1fb901d3" (UID: "fea85a90-a128-4e5a-b27c-b2cb1fb901d3"). InnerVolumeSpecName "kube-api-access-xmcrl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:57:39 crc kubenswrapper[4854]: I1007 13:57:39.692803 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29949f4f-fe34-418c-9dc7-4468eb0749d4-kube-api-access-9tpnk" (OuterVolumeSpecName: "kube-api-access-9tpnk") pod "29949f4f-fe34-418c-9dc7-4468eb0749d4" (UID: "29949f4f-fe34-418c-9dc7-4468eb0749d4"). InnerVolumeSpecName "kube-api-access-9tpnk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:57:39 crc kubenswrapper[4854]: I1007 13:57:39.782200 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5jsm\" (UniqueName: \"kubernetes.io/projected/a445c23d-afd6-466d-8bc4-b47492fab261-kube-api-access-n5jsm\") pod \"a445c23d-afd6-466d-8bc4-b47492fab261\" (UID: \"a445c23d-afd6-466d-8bc4-b47492fab261\") " Oct 07 13:57:39 crc kubenswrapper[4854]: I1007 13:57:39.782650 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmcrl\" (UniqueName: \"kubernetes.io/projected/fea85a90-a128-4e5a-b27c-b2cb1fb901d3-kube-api-access-xmcrl\") on node \"crc\" DevicePath \"\"" Oct 07 13:57:39 crc kubenswrapper[4854]: I1007 13:57:39.782667 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9tpnk\" (UniqueName: \"kubernetes.io/projected/29949f4f-fe34-418c-9dc7-4468eb0749d4-kube-api-access-9tpnk\") on node \"crc\" DevicePath \"\"" Oct 07 13:57:39 crc kubenswrapper[4854]: I1007 13:57:39.785973 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a445c23d-afd6-466d-8bc4-b47492fab261-kube-api-access-n5jsm" (OuterVolumeSpecName: "kube-api-access-n5jsm") pod "a445c23d-afd6-466d-8bc4-b47492fab261" (UID: "a445c23d-afd6-466d-8bc4-b47492fab261"). InnerVolumeSpecName "kube-api-access-n5jsm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:57:39 crc kubenswrapper[4854]: I1007 13:57:39.885463 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5jsm\" (UniqueName: \"kubernetes.io/projected/a445c23d-afd6-466d-8bc4-b47492fab261-kube-api-access-n5jsm\") on node \"crc\" DevicePath \"\"" Oct 07 13:57:40 crc kubenswrapper[4854]: I1007 13:57:40.193616 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-65a7-account-create-88qrr" event={"ID":"fea85a90-a128-4e5a-b27c-b2cb1fb901d3","Type":"ContainerDied","Data":"3f3d5c93ddec1e741884777d405feb5b037f538f7712bd24591824958941b3b7"} Oct 07 13:57:40 crc kubenswrapper[4854]: I1007 13:57:40.193704 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f3d5c93ddec1e741884777d405feb5b037f538f7712bd24591824958941b3b7" Oct 07 13:57:40 crc kubenswrapper[4854]: I1007 13:57:40.193677 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-65a7-account-create-88qrr" Oct 07 13:57:40 crc kubenswrapper[4854]: I1007 13:57:40.215820 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-02b4-account-create-ws7zm" event={"ID":"29949f4f-fe34-418c-9dc7-4468eb0749d4","Type":"ContainerDied","Data":"4a4f4ca763e89b072aa79740abc55076a834a6753e76e2963b1b88ae0ddfda5b"} Oct 07 13:57:40 crc kubenswrapper[4854]: I1007 13:57:40.215873 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a4f4ca763e89b072aa79740abc55076a834a6753e76e2963b1b88ae0ddfda5b" Oct 07 13:57:40 crc kubenswrapper[4854]: I1007 13:57:40.215962 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-02b4-account-create-ws7zm" Oct 07 13:57:40 crc kubenswrapper[4854]: I1007 13:57:40.220234 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-c7d6-account-create-w5tkr" event={"ID":"a445c23d-afd6-466d-8bc4-b47492fab261","Type":"ContainerDied","Data":"0cc0b6a7499621c631874b900444b269dc8a57e51eb6b059a5ba285ff104b0ce"} Oct 07 13:57:40 crc kubenswrapper[4854]: I1007 13:57:40.220281 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0cc0b6a7499621c631874b900444b269dc8a57e51eb6b059a5ba285ff104b0ce" Oct 07 13:57:40 crc kubenswrapper[4854]: I1007 13:57:40.220382 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-c7d6-account-create-w5tkr" Oct 07 13:57:41 crc kubenswrapper[4854]: I1007 13:57:41.722871 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-tf2vc"] Oct 07 13:57:41 crc kubenswrapper[4854]: E1007 13:57:41.723306 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fea85a90-a128-4e5a-b27c-b2cb1fb901d3" containerName="mariadb-account-create" Oct 07 13:57:41 crc kubenswrapper[4854]: I1007 13:57:41.723319 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="fea85a90-a128-4e5a-b27c-b2cb1fb901d3" containerName="mariadb-account-create" Oct 07 13:57:41 crc kubenswrapper[4854]: E1007 13:57:41.723343 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29949f4f-fe34-418c-9dc7-4468eb0749d4" containerName="mariadb-account-create" Oct 07 13:57:41 crc kubenswrapper[4854]: I1007 13:57:41.723349 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="29949f4f-fe34-418c-9dc7-4468eb0749d4" containerName="mariadb-account-create" Oct 07 13:57:41 crc kubenswrapper[4854]: E1007 13:57:41.723372 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a445c23d-afd6-466d-8bc4-b47492fab261" containerName="mariadb-account-create" Oct 07 13:57:41 crc kubenswrapper[4854]: I1007 13:57:41.723379 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="a445c23d-afd6-466d-8bc4-b47492fab261" containerName="mariadb-account-create" Oct 07 13:57:41 crc kubenswrapper[4854]: I1007 13:57:41.723532 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="fea85a90-a128-4e5a-b27c-b2cb1fb901d3" containerName="mariadb-account-create" Oct 07 13:57:41 crc kubenswrapper[4854]: I1007 13:57:41.723550 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="a445c23d-afd6-466d-8bc4-b47492fab261" containerName="mariadb-account-create" Oct 07 13:57:41 crc kubenswrapper[4854]: I1007 13:57:41.723563 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="29949f4f-fe34-418c-9dc7-4468eb0749d4" containerName="mariadb-account-create" Oct 07 13:57:41 crc kubenswrapper[4854]: I1007 13:57:41.724135 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-tf2vc" Oct 07 13:57:41 crc kubenswrapper[4854]: I1007 13:57:41.729580 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-722f9" Oct 07 13:57:41 crc kubenswrapper[4854]: I1007 13:57:41.729629 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Oct 07 13:57:41 crc kubenswrapper[4854]: I1007 13:57:41.729725 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 07 13:57:41 crc kubenswrapper[4854]: I1007 13:57:41.733738 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-tf2vc"] Oct 07 13:57:41 crc kubenswrapper[4854]: I1007 13:57:41.828712 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91276b96-40b5-468a-a8e4-11446b2f0cfe-scripts\") pod \"nova-cell0-conductor-db-sync-tf2vc\" (UID: \"91276b96-40b5-468a-a8e4-11446b2f0cfe\") " pod="openstack/nova-cell0-conductor-db-sync-tf2vc" Oct 07 13:57:41 crc kubenswrapper[4854]: I1007 13:57:41.828751 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91276b96-40b5-468a-a8e4-11446b2f0cfe-config-data\") pod \"nova-cell0-conductor-db-sync-tf2vc\" (UID: \"91276b96-40b5-468a-a8e4-11446b2f0cfe\") " pod="openstack/nova-cell0-conductor-db-sync-tf2vc" Oct 07 13:57:41 crc kubenswrapper[4854]: I1007 13:57:41.828817 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91276b96-40b5-468a-a8e4-11446b2f0cfe-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-tf2vc\" (UID: \"91276b96-40b5-468a-a8e4-11446b2f0cfe\") " pod="openstack/nova-cell0-conductor-db-sync-tf2vc" Oct 07 13:57:41 crc kubenswrapper[4854]: I1007 13:57:41.829248 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78jv8\" (UniqueName: \"kubernetes.io/projected/91276b96-40b5-468a-a8e4-11446b2f0cfe-kube-api-access-78jv8\") pod \"nova-cell0-conductor-db-sync-tf2vc\" (UID: \"91276b96-40b5-468a-a8e4-11446b2f0cfe\") " pod="openstack/nova-cell0-conductor-db-sync-tf2vc" Oct 07 13:57:41 crc kubenswrapper[4854]: I1007 13:57:41.931098 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78jv8\" (UniqueName: \"kubernetes.io/projected/91276b96-40b5-468a-a8e4-11446b2f0cfe-kube-api-access-78jv8\") pod \"nova-cell0-conductor-db-sync-tf2vc\" (UID: \"91276b96-40b5-468a-a8e4-11446b2f0cfe\") " pod="openstack/nova-cell0-conductor-db-sync-tf2vc" Oct 07 13:57:41 crc kubenswrapper[4854]: I1007 13:57:41.931203 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91276b96-40b5-468a-a8e4-11446b2f0cfe-scripts\") pod \"nova-cell0-conductor-db-sync-tf2vc\" (UID: \"91276b96-40b5-468a-a8e4-11446b2f0cfe\") " pod="openstack/nova-cell0-conductor-db-sync-tf2vc" Oct 07 13:57:41 crc kubenswrapper[4854]: I1007 13:57:41.931229 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91276b96-40b5-468a-a8e4-11446b2f0cfe-config-data\") pod \"nova-cell0-conductor-db-sync-tf2vc\" (UID: 
\"91276b96-40b5-468a-a8e4-11446b2f0cfe\") " pod="openstack/nova-cell0-conductor-db-sync-tf2vc" Oct 07 13:57:41 crc kubenswrapper[4854]: I1007 13:57:41.931306 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91276b96-40b5-468a-a8e4-11446b2f0cfe-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-tf2vc\" (UID: \"91276b96-40b5-468a-a8e4-11446b2f0cfe\") " pod="openstack/nova-cell0-conductor-db-sync-tf2vc" Oct 07 13:57:41 crc kubenswrapper[4854]: I1007 13:57:41.937190 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91276b96-40b5-468a-a8e4-11446b2f0cfe-config-data\") pod \"nova-cell0-conductor-db-sync-tf2vc\" (UID: \"91276b96-40b5-468a-a8e4-11446b2f0cfe\") " pod="openstack/nova-cell0-conductor-db-sync-tf2vc" Oct 07 13:57:41 crc kubenswrapper[4854]: I1007 13:57:41.949019 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91276b96-40b5-468a-a8e4-11446b2f0cfe-scripts\") pod \"nova-cell0-conductor-db-sync-tf2vc\" (UID: \"91276b96-40b5-468a-a8e4-11446b2f0cfe\") " pod="openstack/nova-cell0-conductor-db-sync-tf2vc" Oct 07 13:57:41 crc kubenswrapper[4854]: I1007 13:57:41.949706 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91276b96-40b5-468a-a8e4-11446b2f0cfe-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-tf2vc\" (UID: \"91276b96-40b5-468a-a8e4-11446b2f0cfe\") " pod="openstack/nova-cell0-conductor-db-sync-tf2vc" Oct 07 13:57:41 crc kubenswrapper[4854]: I1007 13:57:41.962131 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78jv8\" (UniqueName: \"kubernetes.io/projected/91276b96-40b5-468a-a8e4-11446b2f0cfe-kube-api-access-78jv8\") pod \"nova-cell0-conductor-db-sync-tf2vc\" (UID: \"91276b96-40b5-468a-a8e4-11446b2f0cfe\") " pod="openstack/nova-cell0-conductor-db-sync-tf2vc" Oct 07 13:57:42 crc kubenswrapper[4854]: I1007 13:57:42.039117 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-tf2vc" Oct 07 13:57:42 crc kubenswrapper[4854]: I1007 13:57:42.489573 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-tf2vc"] Oct 07 13:57:43 crc kubenswrapper[4854]: I1007 13:57:43.256323 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-tf2vc" event={"ID":"91276b96-40b5-468a-a8e4-11446b2f0cfe","Type":"ContainerStarted","Data":"cca28208affe024a95ef54b2966477e7abcd6ed9f4ab6feea62378dcae7c014f"} Oct 07 13:57:43 crc kubenswrapper[4854]: I1007 13:57:43.256988 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-tf2vc" event={"ID":"91276b96-40b5-468a-a8e4-11446b2f0cfe","Type":"ContainerStarted","Data":"d1f09d432a33b29267aa42269b4c3dd9967ca14f249ca7ec47ca6366471d07e0"} Oct 07 13:57:43 crc kubenswrapper[4854]: I1007 13:57:43.281218 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-tf2vc" podStartSLOduration=2.281195109 podStartE2EDuration="2.281195109s" podCreationTimestamp="2025-10-07 13:57:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:57:43.276488475 +0000 UTC m=+5579.264320760" watchObservedRunningTime="2025-10-07 13:57:43.281195109 +0000 UTC m=+5579.269027374" Oct 07 13:57:46 crc kubenswrapper[4854]: I1007 13:57:46.703329 4854 scope.go:117] "RemoveContainer" containerID="8f3095d8e8fcb8dde419577c13f52762f3d4a6c040f7266414571999ecd4046e" Oct 07 13:57:47 crc kubenswrapper[4854]: I1007 13:57:47.306639 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" event={"ID":"40b8b82d-cfd5-41d7-8673-5774db092c85","Type":"ContainerStarted","Data":"faf9abd5a5e7753883aa95fb4b691a8be7baef96f7df5d4fa745c25c86f27153"} Oct 07 13:57:48 crc kubenswrapper[4854]: I1007 13:57:48.319935 4854 generic.go:334] "Generic (PLEG): container finished" podID="91276b96-40b5-468a-a8e4-11446b2f0cfe" containerID="cca28208affe024a95ef54b2966477e7abcd6ed9f4ab6feea62378dcae7c014f" exitCode=0 Oct 07 13:57:48 crc kubenswrapper[4854]: I1007 13:57:48.320049 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-tf2vc" event={"ID":"91276b96-40b5-468a-a8e4-11446b2f0cfe","Type":"ContainerDied","Data":"cca28208affe024a95ef54b2966477e7abcd6ed9f4ab6feea62378dcae7c014f"} Oct 07 13:57:49 crc kubenswrapper[4854]: I1007 13:57:49.843085 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-tf2vc" Oct 07 13:57:49 crc kubenswrapper[4854]: I1007 13:57:49.999821 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78jv8\" (UniqueName: \"kubernetes.io/projected/91276b96-40b5-468a-a8e4-11446b2f0cfe-kube-api-access-78jv8\") pod \"91276b96-40b5-468a-a8e4-11446b2f0cfe\" (UID: \"91276b96-40b5-468a-a8e4-11446b2f0cfe\") " Oct 07 13:57:50 crc kubenswrapper[4854]: I1007 13:57:49.999927 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91276b96-40b5-468a-a8e4-11446b2f0cfe-scripts\") pod \"91276b96-40b5-468a-a8e4-11446b2f0cfe\" (UID: \"91276b96-40b5-468a-a8e4-11446b2f0cfe\") " Oct 07 13:57:50 crc kubenswrapper[4854]: I1007 13:57:50.000046 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91276b96-40b5-468a-a8e4-11446b2f0cfe-combined-ca-bundle\") pod \"91276b96-40b5-468a-a8e4-11446b2f0cfe\" (UID: \"91276b96-40b5-468a-a8e4-11446b2f0cfe\") " Oct 07 13:57:50 crc kubenswrapper[4854]: I1007 13:57:50.000103 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91276b96-40b5-468a-a8e4-11446b2f0cfe-config-data\") pod \"91276b96-40b5-468a-a8e4-11446b2f0cfe\" (UID: \"91276b96-40b5-468a-a8e4-11446b2f0cfe\") " Oct 07 13:57:50 crc kubenswrapper[4854]: I1007 13:57:50.005804 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91276b96-40b5-468a-a8e4-11446b2f0cfe-kube-api-access-78jv8" (OuterVolumeSpecName: "kube-api-access-78jv8") pod "91276b96-40b5-468a-a8e4-11446b2f0cfe" (UID: "91276b96-40b5-468a-a8e4-11446b2f0cfe"). InnerVolumeSpecName "kube-api-access-78jv8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:57:50 crc kubenswrapper[4854]: I1007 13:57:50.007362 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91276b96-40b5-468a-a8e4-11446b2f0cfe-scripts" (OuterVolumeSpecName: "scripts") pod "91276b96-40b5-468a-a8e4-11446b2f0cfe" (UID: "91276b96-40b5-468a-a8e4-11446b2f0cfe"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:57:50 crc kubenswrapper[4854]: E1007 13:57:50.036068 4854 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91276b96-40b5-468a-a8e4-11446b2f0cfe-combined-ca-bundle podName:91276b96-40b5-468a-a8e4-11446b2f0cfe nodeName:}" failed. No retries permitted until 2025-10-07 13:57:50.53602995 +0000 UTC m=+5586.523862245 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/91276b96-40b5-468a-a8e4-11446b2f0cfe-combined-ca-bundle") pod "91276b96-40b5-468a-a8e4-11446b2f0cfe" (UID: "91276b96-40b5-468a-a8e4-11446b2f0cfe") : error deleting /var/lib/kubelet/pods/91276b96-40b5-468a-a8e4-11446b2f0cfe/volume-subpaths: remove /var/lib/kubelet/pods/91276b96-40b5-468a-a8e4-11446b2f0cfe/volume-subpaths: no such file or directory Oct 07 13:57:50 crc kubenswrapper[4854]: I1007 13:57:50.040464 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91276b96-40b5-468a-a8e4-11446b2f0cfe-config-data" (OuterVolumeSpecName: "config-data") pod "91276b96-40b5-468a-a8e4-11446b2f0cfe" (UID: "91276b96-40b5-468a-a8e4-11446b2f0cfe"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:57:50 crc kubenswrapper[4854]: I1007 13:57:50.102068 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91276b96-40b5-468a-a8e4-11446b2f0cfe-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:57:50 crc kubenswrapper[4854]: I1007 13:57:50.102100 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78jv8\" (UniqueName: \"kubernetes.io/projected/91276b96-40b5-468a-a8e4-11446b2f0cfe-kube-api-access-78jv8\") on node \"crc\" DevicePath \"\"" Oct 07 13:57:50 crc kubenswrapper[4854]: I1007 13:57:50.102110 4854 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91276b96-40b5-468a-a8e4-11446b2f0cfe-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 13:57:50 crc kubenswrapper[4854]: I1007 13:57:50.343906 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-tf2vc" event={"ID":"91276b96-40b5-468a-a8e4-11446b2f0cfe","Type":"ContainerDied","Data":"d1f09d432a33b29267aa42269b4c3dd9967ca14f249ca7ec47ca6366471d07e0"} Oct 07 13:57:50 crc kubenswrapper[4854]: I1007 13:57:50.343956 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1f09d432a33b29267aa42269b4c3dd9967ca14f249ca7ec47ca6366471d07e0" Oct 07 13:57:50 crc kubenswrapper[4854]: I1007 13:57:50.343957 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-tf2vc" Oct 07 13:57:50 crc kubenswrapper[4854]: I1007 13:57:50.413128 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 07 13:57:50 crc kubenswrapper[4854]: E1007 13:57:50.413569 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91276b96-40b5-468a-a8e4-11446b2f0cfe" containerName="nova-cell0-conductor-db-sync" Oct 07 13:57:50 crc kubenswrapper[4854]: I1007 13:57:50.413594 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="91276b96-40b5-468a-a8e4-11446b2f0cfe" containerName="nova-cell0-conductor-db-sync" Oct 07 13:57:50 crc kubenswrapper[4854]: I1007 13:57:50.413837 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="91276b96-40b5-468a-a8e4-11446b2f0cfe" containerName="nova-cell0-conductor-db-sync" Oct 07 13:57:50 crc kubenswrapper[4854]: I1007 13:57:50.414577 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 07 13:57:50 crc kubenswrapper[4854]: I1007 13:57:50.426015 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 07 13:57:50 crc kubenswrapper[4854]: I1007 13:57:50.508906 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f226ca89-20c7-4d17-a02b-43c1c8353353-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"f226ca89-20c7-4d17-a02b-43c1c8353353\") " pod="openstack/nova-cell0-conductor-0" Oct 07 13:57:50 crc kubenswrapper[4854]: I1007 13:57:50.509092 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f226ca89-20c7-4d17-a02b-43c1c8353353-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"f226ca89-20c7-4d17-a02b-43c1c8353353\") " pod="openstack/nova-cell0-conductor-0" Oct 07 13:57:50 crc kubenswrapper[4854]: I1007 13:57:50.509134 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdzd6\" (UniqueName: \"kubernetes.io/projected/f226ca89-20c7-4d17-a02b-43c1c8353353-kube-api-access-bdzd6\") pod \"nova-cell0-conductor-0\" (UID: \"f226ca89-20c7-4d17-a02b-43c1c8353353\") " pod="openstack/nova-cell0-conductor-0" Oct 07 13:57:50 crc kubenswrapper[4854]: I1007 13:57:50.610278 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91276b96-40b5-468a-a8e4-11446b2f0cfe-combined-ca-bundle\") pod \"91276b96-40b5-468a-a8e4-11446b2f0cfe\" (UID: \"91276b96-40b5-468a-a8e4-11446b2f0cfe\") " Oct 07 13:57:50 crc kubenswrapper[4854]: I1007 13:57:50.610500 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f226ca89-20c7-4d17-a02b-43c1c8353353-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"f226ca89-20c7-4d17-a02b-43c1c8353353\") " pod="openstack/nova-cell0-conductor-0" Oct 07 13:57:50 crc kubenswrapper[4854]: I1007 13:57:50.610612 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f226ca89-20c7-4d17-a02b-43c1c8353353-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"f226ca89-20c7-4d17-a02b-43c1c8353353\") " pod="openstack/nova-cell0-conductor-0" Oct 07 13:57:50 crc kubenswrapper[4854]: I1007 13:57:50.610640 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdzd6\" (UniqueName: \"kubernetes.io/projected/f226ca89-20c7-4d17-a02b-43c1c8353353-kube-api-access-bdzd6\") pod \"nova-cell0-conductor-0\" (UID: \"f226ca89-20c7-4d17-a02b-43c1c8353353\") " pod="openstack/nova-cell0-conductor-0" Oct 07 13:57:50 crc kubenswrapper[4854]: I1007 13:57:50.613790 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91276b96-40b5-468a-a8e4-11446b2f0cfe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "91276b96-40b5-468a-a8e4-11446b2f0cfe" (UID: "91276b96-40b5-468a-a8e4-11446b2f0cfe"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:57:50 crc kubenswrapper[4854]: I1007 13:57:50.613987 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f226ca89-20c7-4d17-a02b-43c1c8353353-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"f226ca89-20c7-4d17-a02b-43c1c8353353\") " pod="openstack/nova-cell0-conductor-0" Oct 07 13:57:50 crc kubenswrapper[4854]: I1007 13:57:50.626082 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f226ca89-20c7-4d17-a02b-43c1c8353353-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"f226ca89-20c7-4d17-a02b-43c1c8353353\") " pod="openstack/nova-cell0-conductor-0" Oct 07 13:57:50 crc kubenswrapper[4854]: I1007 13:57:50.640297 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdzd6\" (UniqueName: \"kubernetes.io/projected/f226ca89-20c7-4d17-a02b-43c1c8353353-kube-api-access-bdzd6\") pod \"nova-cell0-conductor-0\" (UID: \"f226ca89-20c7-4d17-a02b-43c1c8353353\") " pod="openstack/nova-cell0-conductor-0" Oct 07 13:57:50 crc kubenswrapper[4854]: I1007 13:57:50.712253 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91276b96-40b5-468a-a8e4-11446b2f0cfe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:57:50 crc kubenswrapper[4854]: I1007 13:57:50.740232 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 07 13:57:51 crc kubenswrapper[4854]: I1007 13:57:51.270890 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 07 13:57:51 crc kubenswrapper[4854]: W1007 13:57:51.280003 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf226ca89_20c7_4d17_a02b_43c1c8353353.slice/crio-7863dcd968942cfaafd0199bf93c648d791b71b5608d66a2e662c66125e86c79 WatchSource:0}: Error finding container 7863dcd968942cfaafd0199bf93c648d791b71b5608d66a2e662c66125e86c79: Status 404 returned error can't find the container with id 7863dcd968942cfaafd0199bf93c648d791b71b5608d66a2e662c66125e86c79 Oct 07 13:57:51 crc kubenswrapper[4854]: I1007 13:57:51.359785 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"f226ca89-20c7-4d17-a02b-43c1c8353353","Type":"ContainerStarted","Data":"7863dcd968942cfaafd0199bf93c648d791b71b5608d66a2e662c66125e86c79"} Oct 07 13:57:52 crc kubenswrapper[4854]: I1007 13:57:52.368646 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"f226ca89-20c7-4d17-a02b-43c1c8353353","Type":"ContainerStarted","Data":"ec455117753f3a6d9579cb1fb05d4848361e3fcf5d5bcd7580d80f6fbd7412fb"} Oct 07 13:57:52 crc kubenswrapper[4854]: I1007 13:57:52.369773 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 07 13:57:52 crc kubenswrapper[4854]: I1007 13:57:52.396238 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.396220343 podStartE2EDuration="2.396220343s" podCreationTimestamp="2025-10-07 13:57:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 
13:57:52.388648287 +0000 UTC m=+5588.376480552" watchObservedRunningTime="2025-10-07 13:57:52.396220343 +0000 UTC m=+5588.384052608" Oct 07 13:58:00 crc kubenswrapper[4854]: I1007 13:58:00.791423 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.332316 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-866fh"] Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.334094 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-866fh" Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.335713 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.342259 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.345429 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-866fh"] Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.423839 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a7655f0-c07c-4ca3-9074-449f647da536-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-866fh\" (UID: \"4a7655f0-c07c-4ca3-9074-449f647da536\") " pod="openstack/nova-cell0-cell-mapping-866fh" Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.423939 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a7655f0-c07c-4ca3-9074-449f647da536-config-data\") pod \"nova-cell0-cell-mapping-866fh\" (UID: \"4a7655f0-c07c-4ca3-9074-449f647da536\") " pod="openstack/nova-cell0-cell-mapping-866fh" Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.423995 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5msfs\" (UniqueName: \"kubernetes.io/projected/4a7655f0-c07c-4ca3-9074-449f647da536-kube-api-access-5msfs\") pod \"nova-cell0-cell-mapping-866fh\" (UID: \"4a7655f0-c07c-4ca3-9074-449f647da536\") " pod="openstack/nova-cell0-cell-mapping-866fh" Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.424027 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a7655f0-c07c-4ca3-9074-449f647da536-scripts\") pod \"nova-cell0-cell-mapping-866fh\" (UID: \"4a7655f0-c07c-4ca3-9074-449f647da536\") " pod="openstack/nova-cell0-cell-mapping-866fh" Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.471558 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.472696 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.474704 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.486306 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.525059 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a7655f0-c07c-4ca3-9074-449f647da536-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-866fh\" (UID: \"4a7655f0-c07c-4ca3-9074-449f647da536\") " pod="openstack/nova-cell0-cell-mapping-866fh" Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.525127 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a7655f0-c07c-4ca3-9074-449f647da536-config-data\") pod \"nova-cell0-cell-mapping-866fh\" (UID: \"4a7655f0-c07c-4ca3-9074-449f647da536\") " pod="openstack/nova-cell0-cell-mapping-866fh" Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.525196 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5msfs\" (UniqueName: \"kubernetes.io/projected/4a7655f0-c07c-4ca3-9074-449f647da536-kube-api-access-5msfs\") pod \"nova-cell0-cell-mapping-866fh\" (UID: \"4a7655f0-c07c-4ca3-9074-449f647da536\") " pod="openstack/nova-cell0-cell-mapping-866fh" Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.525221 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a7655f0-c07c-4ca3-9074-449f647da536-scripts\") pod \"nova-cell0-cell-mapping-866fh\" (UID: \"4a7655f0-c07c-4ca3-9074-449f647da536\") " pod="openstack/nova-cell0-cell-mapping-866fh" Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.531467 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a7655f0-c07c-4ca3-9074-449f647da536-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-866fh\" (UID: \"4a7655f0-c07c-4ca3-9074-449f647da536\") " pod="openstack/nova-cell0-cell-mapping-866fh" Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.531494 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a7655f0-c07c-4ca3-9074-449f647da536-config-data\") pod \"nova-cell0-cell-mapping-866fh\" (UID: \"4a7655f0-c07c-4ca3-9074-449f647da536\") " pod="openstack/nova-cell0-cell-mapping-866fh" Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.538538 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.540000 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.543364 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.546650 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a7655f0-c07c-4ca3-9074-449f647da536-scripts\") pod \"nova-cell0-cell-mapping-866fh\" (UID: \"4a7655f0-c07c-4ca3-9074-449f647da536\") " pod="openstack/nova-cell0-cell-mapping-866fh" Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.550013 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5msfs\" (UniqueName: \"kubernetes.io/projected/4a7655f0-c07c-4ca3-9074-449f647da536-kube-api-access-5msfs\") pod \"nova-cell0-cell-mapping-866fh\" (UID: \"4a7655f0-c07c-4ca3-9074-449f647da536\") " pod="openstack/nova-cell0-cell-mapping-866fh" Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.559658 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.626121 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21a3f9e9-afda-4138-8c6a-c1351412b296-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"21a3f9e9-afda-4138-8c6a-c1351412b296\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.626386 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcqt9\" (UniqueName: \"kubernetes.io/projected/7cc99286-dead-49ed-8dec-60c75a175855-kube-api-access-hcqt9\") pod \"nova-scheduler-0\" (UID: \"7cc99286-dead-49ed-8dec-60c75a175855\") " pod="openstack/nova-scheduler-0" Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.626524 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cc99286-dead-49ed-8dec-60c75a175855-config-data\") pod \"nova-scheduler-0\" (UID: \"7cc99286-dead-49ed-8dec-60c75a175855\") " pod="openstack/nova-scheduler-0" Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.626561 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21a3f9e9-afda-4138-8c6a-c1351412b296-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"21a3f9e9-afda-4138-8c6a-c1351412b296\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.626602 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cc99286-dead-49ed-8dec-60c75a175855-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7cc99286-dead-49ed-8dec-60c75a175855\") " pod="openstack/nova-scheduler-0" Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.626620 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f95vx\" (UniqueName: \"kubernetes.io/projected/21a3f9e9-afda-4138-8c6a-c1351412b296-kube-api-access-f95vx\") pod \"nova-cell1-novncproxy-0\" (UID: \"21a3f9e9-afda-4138-8c6a-c1351412b296\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 13:58:01 crc 
kubenswrapper[4854]: I1007 13:58:01.668620 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-866fh" Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.674516 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.675981 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.684824 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.692994 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.694680 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.697923 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.711435 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.728291 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21a3f9e9-afda-4138-8c6a-c1351412b296-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"21a3f9e9-afda-4138-8c6a-c1351412b296\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.728402 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcqt9\" (UniqueName: \"kubernetes.io/projected/7cc99286-dead-49ed-8dec-60c75a175855-kube-api-access-hcqt9\") pod \"nova-scheduler-0\" (UID: \"7cc99286-dead-49ed-8dec-60c75a175855\") " pod="openstack/nova-scheduler-0" Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.728469 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cc99286-dead-49ed-8dec-60c75a175855-config-data\") pod \"nova-scheduler-0\" (UID: \"7cc99286-dead-49ed-8dec-60c75a175855\") " pod="openstack/nova-scheduler-0" Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.728498 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21a3f9e9-afda-4138-8c6a-c1351412b296-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"21a3f9e9-afda-4138-8c6a-c1351412b296\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.728525 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cc99286-dead-49ed-8dec-60c75a175855-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7cc99286-dead-49ed-8dec-60c75a175855\") " pod="openstack/nova-scheduler-0" Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.728550 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f95vx\" (UniqueName: \"kubernetes.io/projected/21a3f9e9-afda-4138-8c6a-c1351412b296-kube-api-access-f95vx\") pod \"nova-cell1-novncproxy-0\" (UID: \"21a3f9e9-afda-4138-8c6a-c1351412b296\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 13:58:01 crc 
kubenswrapper[4854]: I1007 13:58:01.739434 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21a3f9e9-afda-4138-8c6a-c1351412b296-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"21a3f9e9-afda-4138-8c6a-c1351412b296\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.744534 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cc99286-dead-49ed-8dec-60c75a175855-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7cc99286-dead-49ed-8dec-60c75a175855\") " pod="openstack/nova-scheduler-0" Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.744611 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.754981 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cc99286-dead-49ed-8dec-60c75a175855-config-data\") pod \"nova-scheduler-0\" (UID: \"7cc99286-dead-49ed-8dec-60c75a175855\") " pod="openstack/nova-scheduler-0" Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.759852 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21a3f9e9-afda-4138-8c6a-c1351412b296-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"21a3f9e9-afda-4138-8c6a-c1351412b296\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.770859 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f95vx\" (UniqueName: \"kubernetes.io/projected/21a3f9e9-afda-4138-8c6a-c1351412b296-kube-api-access-f95vx\") pod \"nova-cell1-novncproxy-0\" (UID: \"21a3f9e9-afda-4138-8c6a-c1351412b296\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.772748 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcqt9\" (UniqueName: \"kubernetes.io/projected/7cc99286-dead-49ed-8dec-60c75a175855-kube-api-access-hcqt9\") pod \"nova-scheduler-0\" (UID: \"7cc99286-dead-49ed-8dec-60c75a175855\") " pod="openstack/nova-scheduler-0" Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.807586 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.830307 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5703fa35-b0ed-450d-8b90-2e9308878e46-config-data\") pod \"nova-api-0\" (UID: \"5703fa35-b0ed-450d-8b90-2e9308878e46\") " pod="openstack/nova-api-0" Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.830686 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dl5f2\" (UniqueName: \"kubernetes.io/projected/bf8bce1d-3d34-42f1-a24f-6a1a8465b4de-kube-api-access-dl5f2\") pod \"nova-metadata-0\" (UID: \"bf8bce1d-3d34-42f1-a24f-6a1a8465b4de\") " pod="openstack/nova-metadata-0" Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.830721 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf8bce1d-3d34-42f1-a24f-6a1a8465b4de-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bf8bce1d-3d34-42f1-a24f-6a1a8465b4de\") " pod="openstack/nova-metadata-0" Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.830767 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf8bce1d-3d34-42f1-a24f-6a1a8465b4de-logs\") pod \"nova-metadata-0\" (UID: \"bf8bce1d-3d34-42f1-a24f-6a1a8465b4de\") " pod="openstack/nova-metadata-0" Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.830803 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbr84\" (UniqueName: \"kubernetes.io/projected/5703fa35-b0ed-450d-8b90-2e9308878e46-kube-api-access-cbr84\") pod \"nova-api-0\" (UID: \"5703fa35-b0ed-450d-8b90-2e9308878e46\") " pod="openstack/nova-api-0" Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.830869 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5703fa35-b0ed-450d-8b90-2e9308878e46-logs\") pod \"nova-api-0\" (UID: \"5703fa35-b0ed-450d-8b90-2e9308878e46\") " pod="openstack/nova-api-0" Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.832929 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf8bce1d-3d34-42f1-a24f-6a1a8465b4de-config-data\") pod \"nova-metadata-0\" (UID: \"bf8bce1d-3d34-42f1-a24f-6a1a8465b4de\") " pod="openstack/nova-metadata-0" Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.834281 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5703fa35-b0ed-450d-8b90-2e9308878e46-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5703fa35-b0ed-450d-8b90-2e9308878e46\") " pod="openstack/nova-api-0" Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.839789 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d9676b587-h7nbq"] Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.853019 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d9676b587-h7nbq"] Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.853164 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d9676b587-h7nbq" Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.926457 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.941512 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbr84\" (UniqueName: \"kubernetes.io/projected/5703fa35-b0ed-450d-8b90-2e9308878e46-kube-api-access-cbr84\") pod \"nova-api-0\" (UID: \"5703fa35-b0ed-450d-8b90-2e9308878e46\") " pod="openstack/nova-api-0" Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.941696 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5703fa35-b0ed-450d-8b90-2e9308878e46-logs\") pod \"nova-api-0\" (UID: \"5703fa35-b0ed-450d-8b90-2e9308878e46\") " pod="openstack/nova-api-0" Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.941734 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf8bce1d-3d34-42f1-a24f-6a1a8465b4de-config-data\") pod \"nova-metadata-0\" (UID: \"bf8bce1d-3d34-42f1-a24f-6a1a8465b4de\") " pod="openstack/nova-metadata-0" Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.941793 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5703fa35-b0ed-450d-8b90-2e9308878e46-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5703fa35-b0ed-450d-8b90-2e9308878e46\") " pod="openstack/nova-api-0" Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.941924 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5703fa35-b0ed-450d-8b90-2e9308878e46-config-data\") pod \"nova-api-0\" (UID: \"5703fa35-b0ed-450d-8b90-2e9308878e46\") " pod="openstack/nova-api-0" Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.942016 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dl5f2\" (UniqueName: \"kubernetes.io/projected/bf8bce1d-3d34-42f1-a24f-6a1a8465b4de-kube-api-access-dl5f2\") pod \"nova-metadata-0\" (UID: \"bf8bce1d-3d34-42f1-a24f-6a1a8465b4de\") " pod="openstack/nova-metadata-0" Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.943019 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5703fa35-b0ed-450d-8b90-2e9308878e46-logs\") pod \"nova-api-0\" (UID: \"5703fa35-b0ed-450d-8b90-2e9308878e46\") " pod="openstack/nova-api-0" Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.955212 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf8bce1d-3d34-42f1-a24f-6a1a8465b4de-config-data\") pod \"nova-metadata-0\" (UID: \"bf8bce1d-3d34-42f1-a24f-6a1a8465b4de\") " pod="openstack/nova-metadata-0" Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.962017 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5703fa35-b0ed-450d-8b90-2e9308878e46-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5703fa35-b0ed-450d-8b90-2e9308878e46\") " pod="openstack/nova-api-0" Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.962801 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/5703fa35-b0ed-450d-8b90-2e9308878e46-config-data\") pod \"nova-api-0\" (UID: \"5703fa35-b0ed-450d-8b90-2e9308878e46\") " pod="openstack/nova-api-0" Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.963274 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf8bce1d-3d34-42f1-a24f-6a1a8465b4de-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bf8bce1d-3d34-42f1-a24f-6a1a8465b4de\") " pod="openstack/nova-metadata-0" Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.963387 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf8bce1d-3d34-42f1-a24f-6a1a8465b4de-logs\") pod \"nova-metadata-0\" (UID: \"bf8bce1d-3d34-42f1-a24f-6a1a8465b4de\") " pod="openstack/nova-metadata-0" Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.963834 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf8bce1d-3d34-42f1-a24f-6a1a8465b4de-logs\") pod \"nova-metadata-0\" (UID: \"bf8bce1d-3d34-42f1-a24f-6a1a8465b4de\") " pod="openstack/nova-metadata-0" Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.968737 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbr84\" (UniqueName: \"kubernetes.io/projected/5703fa35-b0ed-450d-8b90-2e9308878e46-kube-api-access-cbr84\") pod \"nova-api-0\" (UID: \"5703fa35-b0ed-450d-8b90-2e9308878e46\") " pod="openstack/nova-api-0" Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.978620 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf8bce1d-3d34-42f1-a24f-6a1a8465b4de-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bf8bce1d-3d34-42f1-a24f-6a1a8465b4de\") " pod="openstack/nova-metadata-0" Oct 07 13:58:01 crc kubenswrapper[4854]: I1007 13:58:01.997972 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dl5f2\" (UniqueName: \"kubernetes.io/projected/bf8bce1d-3d34-42f1-a24f-6a1a8465b4de-kube-api-access-dl5f2\") pod \"nova-metadata-0\" (UID: \"bf8bce1d-3d34-42f1-a24f-6a1a8465b4de\") " pod="openstack/nova-metadata-0" Oct 07 13:58:02 crc kubenswrapper[4854]: I1007 13:58:02.033782 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 07 13:58:02 crc kubenswrapper[4854]: I1007 13:58:02.034468 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 13:58:02 crc kubenswrapper[4854]: I1007 13:58:02.071588 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0dea76b-4966-46bf-9b83-effccb6e8985-config\") pod \"dnsmasq-dns-6d9676b587-h7nbq\" (UID: \"c0dea76b-4966-46bf-9b83-effccb6e8985\") " pod="openstack/dnsmasq-dns-6d9676b587-h7nbq" Oct 07 13:58:02 crc kubenswrapper[4854]: I1007 13:58:02.072278 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c0dea76b-4966-46bf-9b83-effccb6e8985-ovsdbserver-sb\") pod \"dnsmasq-dns-6d9676b587-h7nbq\" (UID: \"c0dea76b-4966-46bf-9b83-effccb6e8985\") " pod="openstack/dnsmasq-dns-6d9676b587-h7nbq" Oct 07 13:58:02 crc kubenswrapper[4854]: I1007 13:58:02.072401 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glb7w\" (UniqueName: \"kubernetes.io/projected/c0dea76b-4966-46bf-9b83-effccb6e8985-kube-api-access-glb7w\") pod \"dnsmasq-dns-6d9676b587-h7nbq\" (UID: \"c0dea76b-4966-46bf-9b83-effccb6e8985\") " pod="openstack/dnsmasq-dns-6d9676b587-h7nbq" Oct 07 13:58:02 crc kubenswrapper[4854]: I1007 13:58:02.072499 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c0dea76b-4966-46bf-9b83-effccb6e8985-ovsdbserver-nb\") pod \"dnsmasq-dns-6d9676b587-h7nbq\" (UID: \"c0dea76b-4966-46bf-9b83-effccb6e8985\") " pod="openstack/dnsmasq-dns-6d9676b587-h7nbq" Oct 07 13:58:02 crc kubenswrapper[4854]: I1007 13:58:02.072686 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0dea76b-4966-46bf-9b83-effccb6e8985-dns-svc\") pod \"dnsmasq-dns-6d9676b587-h7nbq\" (UID: \"c0dea76b-4966-46bf-9b83-effccb6e8985\") " pod="openstack/dnsmasq-dns-6d9676b587-h7nbq" Oct 07 13:58:02 crc kubenswrapper[4854]: I1007 13:58:02.174325 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c0dea76b-4966-46bf-9b83-effccb6e8985-ovsdbserver-sb\") pod \"dnsmasq-dns-6d9676b587-h7nbq\" (UID: \"c0dea76b-4966-46bf-9b83-effccb6e8985\") " pod="openstack/dnsmasq-dns-6d9676b587-h7nbq" Oct 07 13:58:02 crc kubenswrapper[4854]: I1007 13:58:02.174373 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glb7w\" (UniqueName: \"kubernetes.io/projected/c0dea76b-4966-46bf-9b83-effccb6e8985-kube-api-access-glb7w\") pod \"dnsmasq-dns-6d9676b587-h7nbq\" (UID: \"c0dea76b-4966-46bf-9b83-effccb6e8985\") " pod="openstack/dnsmasq-dns-6d9676b587-h7nbq" Oct 07 13:58:02 crc kubenswrapper[4854]: I1007 13:58:02.174392 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c0dea76b-4966-46bf-9b83-effccb6e8985-ovsdbserver-nb\") pod \"dnsmasq-dns-6d9676b587-h7nbq\" (UID: \"c0dea76b-4966-46bf-9b83-effccb6e8985\") " pod="openstack/dnsmasq-dns-6d9676b587-h7nbq" Oct 07 13:58:02 crc kubenswrapper[4854]: I1007 13:58:02.174444 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0dea76b-4966-46bf-9b83-effccb6e8985-dns-svc\") pod \"dnsmasq-dns-6d9676b587-h7nbq\" (UID: 
\"c0dea76b-4966-46bf-9b83-effccb6e8985\") " pod="openstack/dnsmasq-dns-6d9676b587-h7nbq" Oct 07 13:58:02 crc kubenswrapper[4854]: I1007 13:58:02.174507 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0dea76b-4966-46bf-9b83-effccb6e8985-config\") pod \"dnsmasq-dns-6d9676b587-h7nbq\" (UID: \"c0dea76b-4966-46bf-9b83-effccb6e8985\") " pod="openstack/dnsmasq-dns-6d9676b587-h7nbq" Oct 07 13:58:02 crc kubenswrapper[4854]: I1007 13:58:02.175404 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0dea76b-4966-46bf-9b83-effccb6e8985-config\") pod \"dnsmasq-dns-6d9676b587-h7nbq\" (UID: \"c0dea76b-4966-46bf-9b83-effccb6e8985\") " pod="openstack/dnsmasq-dns-6d9676b587-h7nbq" Oct 07 13:58:02 crc kubenswrapper[4854]: I1007 13:58:02.175920 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c0dea76b-4966-46bf-9b83-effccb6e8985-ovsdbserver-sb\") pod \"dnsmasq-dns-6d9676b587-h7nbq\" (UID: \"c0dea76b-4966-46bf-9b83-effccb6e8985\") " pod="openstack/dnsmasq-dns-6d9676b587-h7nbq" Oct 07 13:58:02 crc kubenswrapper[4854]: I1007 13:58:02.176654 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c0dea76b-4966-46bf-9b83-effccb6e8985-ovsdbserver-nb\") pod \"dnsmasq-dns-6d9676b587-h7nbq\" (UID: \"c0dea76b-4966-46bf-9b83-effccb6e8985\") " pod="openstack/dnsmasq-dns-6d9676b587-h7nbq" Oct 07 13:58:02 crc kubenswrapper[4854]: I1007 13:58:02.189861 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 13:58:02 crc kubenswrapper[4854]: I1007 13:58:02.189932 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0dea76b-4966-46bf-9b83-effccb6e8985-dns-svc\") pod \"dnsmasq-dns-6d9676b587-h7nbq\" (UID: \"c0dea76b-4966-46bf-9b83-effccb6e8985\") " pod="openstack/dnsmasq-dns-6d9676b587-h7nbq" Oct 07 13:58:02 crc kubenswrapper[4854]: I1007 13:58:02.212650 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glb7w\" (UniqueName: \"kubernetes.io/projected/c0dea76b-4966-46bf-9b83-effccb6e8985-kube-api-access-glb7w\") pod \"dnsmasq-dns-6d9676b587-h7nbq\" (UID: \"c0dea76b-4966-46bf-9b83-effccb6e8985\") " pod="openstack/dnsmasq-dns-6d9676b587-h7nbq" Oct 07 13:58:02 crc kubenswrapper[4854]: W1007 13:58:02.230553 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7cc99286_dead_49ed_8dec_60c75a175855.slice/crio-cc604f8a08848d0737f4a17674518ebaa910d17058769fb5122321aa8a53cf94 WatchSource:0}: Error finding container cc604f8a08848d0737f4a17674518ebaa910d17058769fb5122321aa8a53cf94: Status 404 returned error can't find the container with id cc604f8a08848d0737f4a17674518ebaa910d17058769fb5122321aa8a53cf94 Oct 07 13:58:02 crc kubenswrapper[4854]: I1007 13:58:02.338653 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-866fh"] Oct 07 13:58:02 crc kubenswrapper[4854]: W1007 13:58:02.360035 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a7655f0_c07c_4ca3_9074_449f647da536.slice/crio-af55fa6c9eaf45983e1ea09597f2c095684bf1d907fd66f531a86985976770b2 WatchSource:0}: Error finding 
container af55fa6c9eaf45983e1ea09597f2c095684bf1d907fd66f531a86985976770b2: Status 404 returned error can't find the container with id af55fa6c9eaf45983e1ea09597f2c095684bf1d907fd66f531a86985976770b2 Oct 07 13:58:02 crc kubenswrapper[4854]: I1007 13:58:02.495638 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d9676b587-h7nbq" Oct 07 13:58:02 crc kubenswrapper[4854]: I1007 13:58:02.497583 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-866fh" event={"ID":"4a7655f0-c07c-4ca3-9074-449f647da536","Type":"ContainerStarted","Data":"af55fa6c9eaf45983e1ea09597f2c095684bf1d907fd66f531a86985976770b2"} Oct 07 13:58:02 crc kubenswrapper[4854]: I1007 13:58:02.508083 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7cc99286-dead-49ed-8dec-60c75a175855","Type":"ContainerStarted","Data":"c7966ee62d53d283edbc2b95ca285ded36c86a09f0d15a8bebb9c84e029d0f75"} Oct 07 13:58:02 crc kubenswrapper[4854]: I1007 13:58:02.508142 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7cc99286-dead-49ed-8dec-60c75a175855","Type":"ContainerStarted","Data":"cc604f8a08848d0737f4a17674518ebaa910d17058769fb5122321aa8a53cf94"} Oct 07 13:58:02 crc kubenswrapper[4854]: I1007 13:58:02.526729 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.52671661 podStartE2EDuration="1.52671661s" podCreationTimestamp="2025-10-07 13:58:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:58:02.526499313 +0000 UTC m=+5598.514331568" watchObservedRunningTime="2025-10-07 13:58:02.52671661 +0000 UTC m=+5598.514548865" Oct 07 13:58:02 crc kubenswrapper[4854]: I1007 13:58:02.615067 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 13:58:02 crc kubenswrapper[4854]: I1007 13:58:02.643293 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 13:58:02 crc kubenswrapper[4854]: W1007 13:58:02.665560 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21a3f9e9_afda_4138_8c6a_c1351412b296.slice/crio-8f5ff4e769bb107dfdf442113b6057dca6538422ad460fc64d3b3bb3db118502 WatchSource:0}: Error finding container 8f5ff4e769bb107dfdf442113b6057dca6538422ad460fc64d3b3bb3db118502: Status 404 returned error can't find the container with id 8f5ff4e769bb107dfdf442113b6057dca6538422ad460fc64d3b3bb3db118502 Oct 07 13:58:02 crc kubenswrapper[4854]: I1007 13:58:02.746181 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 07 13:58:02 crc kubenswrapper[4854]: I1007 13:58:02.978237 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d9676b587-h7nbq"] Oct 07 13:58:03 crc kubenswrapper[4854]: I1007 13:58:03.007697 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-mk6pz"] Oct 07 13:58:03 crc kubenswrapper[4854]: I1007 13:58:03.010228 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-mk6pz" Oct 07 13:58:03 crc kubenswrapper[4854]: I1007 13:58:03.012528 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 07 13:58:03 crc kubenswrapper[4854]: I1007 13:58:03.014580 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Oct 07 13:58:03 crc kubenswrapper[4854]: I1007 13:58:03.022851 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-mk6pz"] Oct 07 13:58:03 crc kubenswrapper[4854]: I1007 13:58:03.111271 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/931987ae-eef0-4720-99ba-2b538e98726b-scripts\") pod \"nova-cell1-conductor-db-sync-mk6pz\" (UID: \"931987ae-eef0-4720-99ba-2b538e98726b\") " pod="openstack/nova-cell1-conductor-db-sync-mk6pz" Oct 07 13:58:03 crc kubenswrapper[4854]: I1007 13:58:03.111334 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/931987ae-eef0-4720-99ba-2b538e98726b-config-data\") pod \"nova-cell1-conductor-db-sync-mk6pz\" (UID: \"931987ae-eef0-4720-99ba-2b538e98726b\") " pod="openstack/nova-cell1-conductor-db-sync-mk6pz" Oct 07 13:58:03 crc kubenswrapper[4854]: I1007 13:58:03.111479 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/931987ae-eef0-4720-99ba-2b538e98726b-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-mk6pz\" (UID: \"931987ae-eef0-4720-99ba-2b538e98726b\") " pod="openstack/nova-cell1-conductor-db-sync-mk6pz" Oct 07 13:58:03 crc kubenswrapper[4854]: I1007 13:58:03.111554 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9p9dp\" (UniqueName: \"kubernetes.io/projected/931987ae-eef0-4720-99ba-2b538e98726b-kube-api-access-9p9dp\") pod \"nova-cell1-conductor-db-sync-mk6pz\" (UID: \"931987ae-eef0-4720-99ba-2b538e98726b\") " pod="openstack/nova-cell1-conductor-db-sync-mk6pz" Oct 07 13:58:03 crc kubenswrapper[4854]: I1007 13:58:03.213098 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/931987ae-eef0-4720-99ba-2b538e98726b-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-mk6pz\" (UID: \"931987ae-eef0-4720-99ba-2b538e98726b\") " pod="openstack/nova-cell1-conductor-db-sync-mk6pz" Oct 07 13:58:03 crc kubenswrapper[4854]: I1007 13:58:03.213185 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9p9dp\" (UniqueName: \"kubernetes.io/projected/931987ae-eef0-4720-99ba-2b538e98726b-kube-api-access-9p9dp\") pod \"nova-cell1-conductor-db-sync-mk6pz\" (UID: \"931987ae-eef0-4720-99ba-2b538e98726b\") " pod="openstack/nova-cell1-conductor-db-sync-mk6pz" Oct 07 13:58:03 crc kubenswrapper[4854]: I1007 13:58:03.213225 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/931987ae-eef0-4720-99ba-2b538e98726b-scripts\") pod \"nova-cell1-conductor-db-sync-mk6pz\" (UID: \"931987ae-eef0-4720-99ba-2b538e98726b\") " pod="openstack/nova-cell1-conductor-db-sync-mk6pz" Oct 07 13:58:03 crc kubenswrapper[4854]: I1007 13:58:03.213254 4854 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/931987ae-eef0-4720-99ba-2b538e98726b-config-data\") pod \"nova-cell1-conductor-db-sync-mk6pz\" (UID: \"931987ae-eef0-4720-99ba-2b538e98726b\") " pod="openstack/nova-cell1-conductor-db-sync-mk6pz" Oct 07 13:58:03 crc kubenswrapper[4854]: I1007 13:58:03.218565 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/931987ae-eef0-4720-99ba-2b538e98726b-scripts\") pod \"nova-cell1-conductor-db-sync-mk6pz\" (UID: \"931987ae-eef0-4720-99ba-2b538e98726b\") " pod="openstack/nova-cell1-conductor-db-sync-mk6pz" Oct 07 13:58:03 crc kubenswrapper[4854]: I1007 13:58:03.218614 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/931987ae-eef0-4720-99ba-2b538e98726b-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-mk6pz\" (UID: \"931987ae-eef0-4720-99ba-2b538e98726b\") " pod="openstack/nova-cell1-conductor-db-sync-mk6pz" Oct 07 13:58:03 crc kubenswrapper[4854]: I1007 13:58:03.219670 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/931987ae-eef0-4720-99ba-2b538e98726b-config-data\") pod \"nova-cell1-conductor-db-sync-mk6pz\" (UID: \"931987ae-eef0-4720-99ba-2b538e98726b\") " pod="openstack/nova-cell1-conductor-db-sync-mk6pz" Oct 07 13:58:03 crc kubenswrapper[4854]: I1007 13:58:03.232764 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9p9dp\" (UniqueName: \"kubernetes.io/projected/931987ae-eef0-4720-99ba-2b538e98726b-kube-api-access-9p9dp\") pod \"nova-cell1-conductor-db-sync-mk6pz\" (UID: \"931987ae-eef0-4720-99ba-2b538e98726b\") " pod="openstack/nova-cell1-conductor-db-sync-mk6pz" Oct 07 13:58:03 crc kubenswrapper[4854]: I1007 13:58:03.363567 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-mk6pz" Oct 07 13:58:03 crc kubenswrapper[4854]: I1007 13:58:03.531902 4854 generic.go:334] "Generic (PLEG): container finished" podID="c0dea76b-4966-46bf-9b83-effccb6e8985" containerID="04f85082a72c363b3f0d71f69b375648f22c563df8dd1b56055469d61c05b6eb" exitCode=0 Oct 07 13:58:03 crc kubenswrapper[4854]: I1007 13:58:03.531994 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d9676b587-h7nbq" event={"ID":"c0dea76b-4966-46bf-9b83-effccb6e8985","Type":"ContainerDied","Data":"04f85082a72c363b3f0d71f69b375648f22c563df8dd1b56055469d61c05b6eb"} Oct 07 13:58:03 crc kubenswrapper[4854]: I1007 13:58:03.532253 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d9676b587-h7nbq" event={"ID":"c0dea76b-4966-46bf-9b83-effccb6e8985","Type":"ContainerStarted","Data":"0edfb0b74158573a75debde8c431be1e60e330dbc30709d6ca4a1277f781d28b"} Oct 07 13:58:03 crc kubenswrapper[4854]: I1007 13:58:03.535141 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-866fh" event={"ID":"4a7655f0-c07c-4ca3-9074-449f647da536","Type":"ContainerStarted","Data":"bcf8dc0d93e3aeda3f9ef4dc9c53fb2af324c36e6e24b46342f879499bf4b7ca"} Oct 07 13:58:03 crc kubenswrapper[4854]: I1007 13:58:03.539649 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bf8bce1d-3d34-42f1-a24f-6a1a8465b4de","Type":"ContainerStarted","Data":"6374906bd038e074c7f6d42dea844447a53f245f2c75744401ae08c2eeb71e53"} Oct 07 13:58:03 crc kubenswrapper[4854]: I1007 13:58:03.539721 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bf8bce1d-3d34-42f1-a24f-6a1a8465b4de","Type":"ContainerStarted","Data":"38b5e37ba8bcfdbc5b8566e536cc24701d88c6c2e559e288669f91794aa42af4"} Oct 07 13:58:03 crc kubenswrapper[4854]: I1007 13:58:03.539737 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bf8bce1d-3d34-42f1-a24f-6a1a8465b4de","Type":"ContainerStarted","Data":"c876a88ca7fd00a956769e57054079f6217edc6bfc280103eeb3f71e67919dbc"} Oct 07 13:58:03 crc kubenswrapper[4854]: I1007 13:58:03.540913 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"21a3f9e9-afda-4138-8c6a-c1351412b296","Type":"ContainerStarted","Data":"d5fba4b59a0dad959f787307e1a7aa2f3f1aba64f6f701c3f3a42598db6b4776"} Oct 07 13:58:03 crc kubenswrapper[4854]: I1007 13:58:03.540940 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"21a3f9e9-afda-4138-8c6a-c1351412b296","Type":"ContainerStarted","Data":"8f5ff4e769bb107dfdf442113b6057dca6538422ad460fc64d3b3bb3db118502"} Oct 07 13:58:03 crc kubenswrapper[4854]: I1007 13:58:03.544265 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5703fa35-b0ed-450d-8b90-2e9308878e46","Type":"ContainerStarted","Data":"b6d768382e6f71136c676ec300dca2b662583a783220c9c8afdce3206e1ac6c6"} Oct 07 13:58:03 crc kubenswrapper[4854]: I1007 13:58:03.544297 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5703fa35-b0ed-450d-8b90-2e9308878e46","Type":"ContainerStarted","Data":"d941ba05d9c0c19c9fff5e35932a31c7aa9dec03a922cac552289f9597345c26"} Oct 07 13:58:03 crc kubenswrapper[4854]: I1007 13:58:03.544309 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"5703fa35-b0ed-450d-8b90-2e9308878e46","Type":"ContainerStarted","Data":"b7aeaf2fedbcc9a52f30e34ec63acb2c44c1af9d633b4192468f5c7a5d5d252b"} Oct 07 13:58:03 crc kubenswrapper[4854]: I1007 13:58:03.591338 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.591318806 podStartE2EDuration="2.591318806s" podCreationTimestamp="2025-10-07 13:58:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:58:03.571451278 +0000 UTC m=+5599.559283553" watchObservedRunningTime="2025-10-07 13:58:03.591318806 +0000 UTC m=+5599.579151061" Oct 07 13:58:03 crc kubenswrapper[4854]: I1007 13:58:03.602082 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-866fh" podStartSLOduration=2.602054323 podStartE2EDuration="2.602054323s" podCreationTimestamp="2025-10-07 13:58:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:58:03.586837118 +0000 UTC m=+5599.574669373" watchObservedRunningTime="2025-10-07 13:58:03.602054323 +0000 UTC m=+5599.589886578" Oct 07 13:58:03 crc kubenswrapper[4854]: I1007 13:58:03.611109 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.6110868910000002 podStartE2EDuration="2.611086891s" podCreationTimestamp="2025-10-07 13:58:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:58:03.60440701 +0000 UTC m=+5599.592239265" watchObservedRunningTime="2025-10-07 13:58:03.611086891 +0000 UTC m=+5599.598919146" Oct 07 13:58:03 crc kubenswrapper[4854]: I1007 13:58:03.643016 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.642995063 podStartE2EDuration="2.642995063s" podCreationTimestamp="2025-10-07 13:58:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:58:03.638909236 +0000 UTC m=+5599.626741491" watchObservedRunningTime="2025-10-07 13:58:03.642995063 +0000 UTC m=+5599.630827318" Oct 07 13:58:03 crc kubenswrapper[4854]: I1007 13:58:03.815216 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-mk6pz"] Oct 07 13:58:04 crc kubenswrapper[4854]: I1007 13:58:04.565322 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d9676b587-h7nbq" event={"ID":"c0dea76b-4966-46bf-9b83-effccb6e8985","Type":"ContainerStarted","Data":"29302c4a90f66ea8ad01c90f0ab237b01f0481834252226e4425f0d87050a455"} Oct 07 13:58:04 crc kubenswrapper[4854]: I1007 13:58:04.567648 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d9676b587-h7nbq" Oct 07 13:58:04 crc kubenswrapper[4854]: I1007 13:58:04.572034 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-mk6pz" event={"ID":"931987ae-eef0-4720-99ba-2b538e98726b","Type":"ContainerStarted","Data":"0cde039c2e3cb974a2412ea453914ae9d2562a76caa6294b3ca89ee1cdb69039"} Oct 07 13:58:04 crc kubenswrapper[4854]: I1007 13:58:04.572083 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-mk6pz" 
event={"ID":"931987ae-eef0-4720-99ba-2b538e98726b","Type":"ContainerStarted","Data":"ee7bb55bd770b7e00cc427e0003bce9baa984b473d41f1dacd1803769244ef66"} Oct 07 13:58:04 crc kubenswrapper[4854]: I1007 13:58:04.610328 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d9676b587-h7nbq" podStartSLOduration=3.6103048490000003 podStartE2EDuration="3.610304849s" podCreationTimestamp="2025-10-07 13:58:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:58:04.597136172 +0000 UTC m=+5600.584968437" watchObservedRunningTime="2025-10-07 13:58:04.610304849 +0000 UTC m=+5600.598137104" Oct 07 13:58:04 crc kubenswrapper[4854]: I1007 13:58:04.618557 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-mk6pz" podStartSLOduration=2.618533074 podStartE2EDuration="2.618533074s" podCreationTimestamp="2025-10-07 13:58:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:58:04.613940412 +0000 UTC m=+5600.601772667" watchObservedRunningTime="2025-10-07 13:58:04.618533074 +0000 UTC m=+5600.606365329" Oct 07 13:58:06 crc kubenswrapper[4854]: I1007 13:58:06.808988 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 07 13:58:06 crc kubenswrapper[4854]: I1007 13:58:06.927622 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 07 13:58:07 crc kubenswrapper[4854]: I1007 13:58:07.036265 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 07 13:58:07 crc kubenswrapper[4854]: I1007 13:58:07.037877 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 07 13:58:07 crc kubenswrapper[4854]: E1007 13:58:07.423367 4854 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a7655f0_c07c_4ca3_9074_449f647da536.slice/crio-bcf8dc0d93e3aeda3f9ef4dc9c53fb2af324c36e6e24b46342f879499bf4b7ca.scope\": RecentStats: unable to find data in memory cache]" Oct 07 13:58:07 crc kubenswrapper[4854]: I1007 13:58:07.601891 4854 generic.go:334] "Generic (PLEG): container finished" podID="931987ae-eef0-4720-99ba-2b538e98726b" containerID="0cde039c2e3cb974a2412ea453914ae9d2562a76caa6294b3ca89ee1cdb69039" exitCode=0 Oct 07 13:58:07 crc kubenswrapper[4854]: I1007 13:58:07.602715 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-mk6pz" event={"ID":"931987ae-eef0-4720-99ba-2b538e98726b","Type":"ContainerDied","Data":"0cde039c2e3cb974a2412ea453914ae9d2562a76caa6294b3ca89ee1cdb69039"} Oct 07 13:58:07 crc kubenswrapper[4854]: I1007 13:58:07.603652 4854 generic.go:334] "Generic (PLEG): container finished" podID="4a7655f0-c07c-4ca3-9074-449f647da536" containerID="bcf8dc0d93e3aeda3f9ef4dc9c53fb2af324c36e6e24b46342f879499bf4b7ca" exitCode=0 Oct 07 13:58:07 crc kubenswrapper[4854]: I1007 13:58:07.604911 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-866fh" event={"ID":"4a7655f0-c07c-4ca3-9074-449f647da536","Type":"ContainerDied","Data":"bcf8dc0d93e3aeda3f9ef4dc9c53fb2af324c36e6e24b46342f879499bf4b7ca"} Oct 07 13:58:09 crc 
kubenswrapper[4854]: I1007 13:58:09.106017 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-866fh" Oct 07 13:58:09 crc kubenswrapper[4854]: I1007 13:58:09.112860 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-mk6pz" Oct 07 13:58:09 crc kubenswrapper[4854]: I1007 13:58:09.183795 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5msfs\" (UniqueName: \"kubernetes.io/projected/4a7655f0-c07c-4ca3-9074-449f647da536-kube-api-access-5msfs\") pod \"4a7655f0-c07c-4ca3-9074-449f647da536\" (UID: \"4a7655f0-c07c-4ca3-9074-449f647da536\") " Oct 07 13:58:09 crc kubenswrapper[4854]: I1007 13:58:09.183869 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/931987ae-eef0-4720-99ba-2b538e98726b-combined-ca-bundle\") pod \"931987ae-eef0-4720-99ba-2b538e98726b\" (UID: \"931987ae-eef0-4720-99ba-2b538e98726b\") " Oct 07 13:58:09 crc kubenswrapper[4854]: I1007 13:58:09.183908 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/931987ae-eef0-4720-99ba-2b538e98726b-config-data\") pod \"931987ae-eef0-4720-99ba-2b538e98726b\" (UID: \"931987ae-eef0-4720-99ba-2b538e98726b\") " Oct 07 13:58:09 crc kubenswrapper[4854]: I1007 13:58:09.183944 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a7655f0-c07c-4ca3-9074-449f647da536-scripts\") pod \"4a7655f0-c07c-4ca3-9074-449f647da536\" (UID: \"4a7655f0-c07c-4ca3-9074-449f647da536\") " Oct 07 13:58:09 crc kubenswrapper[4854]: I1007 13:58:09.183975 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9p9dp\" (UniqueName: \"kubernetes.io/projected/931987ae-eef0-4720-99ba-2b538e98726b-kube-api-access-9p9dp\") pod \"931987ae-eef0-4720-99ba-2b538e98726b\" (UID: \"931987ae-eef0-4720-99ba-2b538e98726b\") " Oct 07 13:58:09 crc kubenswrapper[4854]: I1007 13:58:09.184006 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a7655f0-c07c-4ca3-9074-449f647da536-combined-ca-bundle\") pod \"4a7655f0-c07c-4ca3-9074-449f647da536\" (UID: \"4a7655f0-c07c-4ca3-9074-449f647da536\") " Oct 07 13:58:09 crc kubenswrapper[4854]: I1007 13:58:09.184082 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/931987ae-eef0-4720-99ba-2b538e98726b-scripts\") pod \"931987ae-eef0-4720-99ba-2b538e98726b\" (UID: \"931987ae-eef0-4720-99ba-2b538e98726b\") " Oct 07 13:58:09 crc kubenswrapper[4854]: I1007 13:58:09.184129 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a7655f0-c07c-4ca3-9074-449f647da536-config-data\") pod \"4a7655f0-c07c-4ca3-9074-449f647da536\" (UID: \"4a7655f0-c07c-4ca3-9074-449f647da536\") " Oct 07 13:58:09 crc kubenswrapper[4854]: I1007 13:58:09.197137 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a7655f0-c07c-4ca3-9074-449f647da536-scripts" (OuterVolumeSpecName: "scripts") pod "4a7655f0-c07c-4ca3-9074-449f647da536" (UID: "4a7655f0-c07c-4ca3-9074-449f647da536"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:58:09 crc kubenswrapper[4854]: I1007 13:58:09.197247 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a7655f0-c07c-4ca3-9074-449f647da536-kube-api-access-5msfs" (OuterVolumeSpecName: "kube-api-access-5msfs") pod "4a7655f0-c07c-4ca3-9074-449f647da536" (UID: "4a7655f0-c07c-4ca3-9074-449f647da536"). InnerVolumeSpecName "kube-api-access-5msfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:58:09 crc kubenswrapper[4854]: I1007 13:58:09.197283 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/931987ae-eef0-4720-99ba-2b538e98726b-kube-api-access-9p9dp" (OuterVolumeSpecName: "kube-api-access-9p9dp") pod "931987ae-eef0-4720-99ba-2b538e98726b" (UID: "931987ae-eef0-4720-99ba-2b538e98726b"). InnerVolumeSpecName "kube-api-access-9p9dp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:58:09 crc kubenswrapper[4854]: I1007 13:58:09.201780 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/931987ae-eef0-4720-99ba-2b538e98726b-scripts" (OuterVolumeSpecName: "scripts") pod "931987ae-eef0-4720-99ba-2b538e98726b" (UID: "931987ae-eef0-4720-99ba-2b538e98726b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:58:09 crc kubenswrapper[4854]: I1007 13:58:09.220141 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a7655f0-c07c-4ca3-9074-449f647da536-config-data" (OuterVolumeSpecName: "config-data") pod "4a7655f0-c07c-4ca3-9074-449f647da536" (UID: "4a7655f0-c07c-4ca3-9074-449f647da536"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:58:09 crc kubenswrapper[4854]: I1007 13:58:09.222372 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/931987ae-eef0-4720-99ba-2b538e98726b-config-data" (OuterVolumeSpecName: "config-data") pod "931987ae-eef0-4720-99ba-2b538e98726b" (UID: "931987ae-eef0-4720-99ba-2b538e98726b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:58:09 crc kubenswrapper[4854]: I1007 13:58:09.224776 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/931987ae-eef0-4720-99ba-2b538e98726b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "931987ae-eef0-4720-99ba-2b538e98726b" (UID: "931987ae-eef0-4720-99ba-2b538e98726b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:58:09 crc kubenswrapper[4854]: I1007 13:58:09.229480 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a7655f0-c07c-4ca3-9074-449f647da536-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4a7655f0-c07c-4ca3-9074-449f647da536" (UID: "4a7655f0-c07c-4ca3-9074-449f647da536"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:58:09 crc kubenswrapper[4854]: I1007 13:58:09.286514 4854 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/931987ae-eef0-4720-99ba-2b538e98726b-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 13:58:09 crc kubenswrapper[4854]: I1007 13:58:09.286553 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a7655f0-c07c-4ca3-9074-449f647da536-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:58:09 crc kubenswrapper[4854]: I1007 13:58:09.286565 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5msfs\" (UniqueName: \"kubernetes.io/projected/4a7655f0-c07c-4ca3-9074-449f647da536-kube-api-access-5msfs\") on node \"crc\" DevicePath \"\"" Oct 07 13:58:09 crc kubenswrapper[4854]: I1007 13:58:09.286574 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/931987ae-eef0-4720-99ba-2b538e98726b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:58:09 crc kubenswrapper[4854]: I1007 13:58:09.286583 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/931987ae-eef0-4720-99ba-2b538e98726b-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:58:09 crc kubenswrapper[4854]: I1007 13:58:09.286590 4854 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a7655f0-c07c-4ca3-9074-449f647da536-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 13:58:09 crc kubenswrapper[4854]: I1007 13:58:09.286598 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9p9dp\" (UniqueName: \"kubernetes.io/projected/931987ae-eef0-4720-99ba-2b538e98726b-kube-api-access-9p9dp\") on node \"crc\" DevicePath \"\"" Oct 07 13:58:09 crc kubenswrapper[4854]: I1007 13:58:09.286608 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a7655f0-c07c-4ca3-9074-449f647da536-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:58:09 crc kubenswrapper[4854]: I1007 13:58:09.626263 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-mk6pz" event={"ID":"931987ae-eef0-4720-99ba-2b538e98726b","Type":"ContainerDied","Data":"ee7bb55bd770b7e00cc427e0003bce9baa984b473d41f1dacd1803769244ef66"} Oct 07 13:58:09 crc kubenswrapper[4854]: I1007 13:58:09.626321 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee7bb55bd770b7e00cc427e0003bce9baa984b473d41f1dacd1803769244ef66" Oct 07 13:58:09 crc kubenswrapper[4854]: I1007 13:58:09.626332 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-mk6pz" Oct 07 13:58:09 crc kubenswrapper[4854]: I1007 13:58:09.629128 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-866fh" Oct 07 13:58:09 crc kubenswrapper[4854]: I1007 13:58:09.633299 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-866fh" event={"ID":"4a7655f0-c07c-4ca3-9074-449f647da536","Type":"ContainerDied","Data":"af55fa6c9eaf45983e1ea09597f2c095684bf1d907fd66f531a86985976770b2"} Oct 07 13:58:09 crc kubenswrapper[4854]: I1007 13:58:09.633351 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af55fa6c9eaf45983e1ea09597f2c095684bf1d907fd66f531a86985976770b2" Oct 07 13:58:09 crc kubenswrapper[4854]: I1007 13:58:09.731331 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 07 13:58:09 crc kubenswrapper[4854]: E1007 13:58:09.733959 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a7655f0-c07c-4ca3-9074-449f647da536" containerName="nova-manage" Oct 07 13:58:09 crc kubenswrapper[4854]: I1007 13:58:09.733991 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a7655f0-c07c-4ca3-9074-449f647da536" containerName="nova-manage" Oct 07 13:58:09 crc kubenswrapper[4854]: E1007 13:58:09.734036 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="931987ae-eef0-4720-99ba-2b538e98726b" containerName="nova-cell1-conductor-db-sync" Oct 07 13:58:09 crc kubenswrapper[4854]: I1007 13:58:09.734047 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="931987ae-eef0-4720-99ba-2b538e98726b" containerName="nova-cell1-conductor-db-sync" Oct 07 13:58:09 crc kubenswrapper[4854]: I1007 13:58:09.734335 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="931987ae-eef0-4720-99ba-2b538e98726b" containerName="nova-cell1-conductor-db-sync" Oct 07 13:58:09 crc kubenswrapper[4854]: I1007 13:58:09.734371 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a7655f0-c07c-4ca3-9074-449f647da536" containerName="nova-manage" Oct 07 13:58:09 crc kubenswrapper[4854]: I1007 13:58:09.737858 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 07 13:58:09 crc kubenswrapper[4854]: I1007 13:58:09.749410 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 07 13:58:09 crc kubenswrapper[4854]: I1007 13:58:09.753855 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 07 13:58:09 crc kubenswrapper[4854]: I1007 13:58:09.801313 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c81fc08-b696-4c22-88fa-6842e8725af3-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"1c81fc08-b696-4c22-88fa-6842e8725af3\") " pod="openstack/nova-cell1-conductor-0" Oct 07 13:58:09 crc kubenswrapper[4854]: I1007 13:58:09.801400 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c81fc08-b696-4c22-88fa-6842e8725af3-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"1c81fc08-b696-4c22-88fa-6842e8725af3\") " pod="openstack/nova-cell1-conductor-0" Oct 07 13:58:09 crc kubenswrapper[4854]: I1007 13:58:09.801488 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qm4nr\" (UniqueName: \"kubernetes.io/projected/1c81fc08-b696-4c22-88fa-6842e8725af3-kube-api-access-qm4nr\") pod \"nova-cell1-conductor-0\" (UID: \"1c81fc08-b696-4c22-88fa-6842e8725af3\") " pod="openstack/nova-cell1-conductor-0" Oct 07 13:58:09 crc kubenswrapper[4854]: I1007 13:58:09.877752 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 13:58:09 crc kubenswrapper[4854]: I1007 13:58:09.877946 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="7cc99286-dead-49ed-8dec-60c75a175855" containerName="nova-scheduler-scheduler" containerID="cri-o://c7966ee62d53d283edbc2b95ca285ded36c86a09f0d15a8bebb9c84e029d0f75" gracePeriod=30 Oct 07 13:58:09 crc kubenswrapper[4854]: I1007 13:58:09.891001 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 07 13:58:09 crc kubenswrapper[4854]: I1007 13:58:09.891537 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5703fa35-b0ed-450d-8b90-2e9308878e46" containerName="nova-api-log" containerID="cri-o://d941ba05d9c0c19c9fff5e35932a31c7aa9dec03a922cac552289f9597345c26" gracePeriod=30 Oct 07 13:58:09 crc kubenswrapper[4854]: I1007 13:58:09.891585 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5703fa35-b0ed-450d-8b90-2e9308878e46" containerName="nova-api-api" containerID="cri-o://b6d768382e6f71136c676ec300dca2b662583a783220c9c8afdce3206e1ac6c6" gracePeriod=30 Oct 07 13:58:09 crc kubenswrapper[4854]: I1007 13:58:09.901370 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 13:58:09 crc kubenswrapper[4854]: I1007 13:58:09.901643 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="bf8bce1d-3d34-42f1-a24f-6a1a8465b4de" containerName="nova-metadata-log" containerID="cri-o://38b5e37ba8bcfdbc5b8566e536cc24701d88c6c2e559e288669f91794aa42af4" gracePeriod=30 Oct 07 13:58:09 crc kubenswrapper[4854]: I1007 13:58:09.901794 4854 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/nova-metadata-0" podUID="bf8bce1d-3d34-42f1-a24f-6a1a8465b4de" containerName="nova-metadata-metadata" containerID="cri-o://6374906bd038e074c7f6d42dea844447a53f245f2c75744401ae08c2eeb71e53" gracePeriod=30 Oct 07 13:58:09 crc kubenswrapper[4854]: I1007 13:58:09.902896 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c81fc08-b696-4c22-88fa-6842e8725af3-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"1c81fc08-b696-4c22-88fa-6842e8725af3\") " pod="openstack/nova-cell1-conductor-0" Oct 07 13:58:09 crc kubenswrapper[4854]: I1007 13:58:09.902941 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c81fc08-b696-4c22-88fa-6842e8725af3-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"1c81fc08-b696-4c22-88fa-6842e8725af3\") " pod="openstack/nova-cell1-conductor-0" Oct 07 13:58:09 crc kubenswrapper[4854]: I1007 13:58:09.903620 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qm4nr\" (UniqueName: \"kubernetes.io/projected/1c81fc08-b696-4c22-88fa-6842e8725af3-kube-api-access-qm4nr\") pod \"nova-cell1-conductor-0\" (UID: \"1c81fc08-b696-4c22-88fa-6842e8725af3\") " pod="openstack/nova-cell1-conductor-0" Oct 07 13:58:09 crc kubenswrapper[4854]: I1007 13:58:09.907914 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c81fc08-b696-4c22-88fa-6842e8725af3-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"1c81fc08-b696-4c22-88fa-6842e8725af3\") " pod="openstack/nova-cell1-conductor-0" Oct 07 13:58:09 crc kubenswrapper[4854]: I1007 13:58:09.907987 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c81fc08-b696-4c22-88fa-6842e8725af3-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"1c81fc08-b696-4c22-88fa-6842e8725af3\") " pod="openstack/nova-cell1-conductor-0" Oct 07 13:58:09 crc kubenswrapper[4854]: I1007 13:58:09.930058 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qm4nr\" (UniqueName: \"kubernetes.io/projected/1c81fc08-b696-4c22-88fa-6842e8725af3-kube-api-access-qm4nr\") pod \"nova-cell1-conductor-0\" (UID: \"1c81fc08-b696-4c22-88fa-6842e8725af3\") " pod="openstack/nova-cell1-conductor-0" Oct 07 13:58:10 crc kubenswrapper[4854]: I1007 13:58:10.063278 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 07 13:58:10 crc kubenswrapper[4854]: I1007 13:58:10.618114 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 07 13:58:10 crc kubenswrapper[4854]: I1007 13:58:10.621807 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 13:58:10 crc kubenswrapper[4854]: I1007 13:58:10.647539 4854 generic.go:334] "Generic (PLEG): container finished" podID="5703fa35-b0ed-450d-8b90-2e9308878e46" containerID="b6d768382e6f71136c676ec300dca2b662583a783220c9c8afdce3206e1ac6c6" exitCode=0 Oct 07 13:58:10 crc kubenswrapper[4854]: I1007 13:58:10.647580 4854 generic.go:334] "Generic (PLEG): container finished" podID="5703fa35-b0ed-450d-8b90-2e9308878e46" containerID="d941ba05d9c0c19c9fff5e35932a31c7aa9dec03a922cac552289f9597345c26" exitCode=143 Oct 07 13:58:10 crc kubenswrapper[4854]: I1007 13:58:10.647626 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5703fa35-b0ed-450d-8b90-2e9308878e46","Type":"ContainerDied","Data":"b6d768382e6f71136c676ec300dca2b662583a783220c9c8afdce3206e1ac6c6"} Oct 07 13:58:10 crc kubenswrapper[4854]: I1007 13:58:10.647649 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5703fa35-b0ed-450d-8b90-2e9308878e46","Type":"ContainerDied","Data":"d941ba05d9c0c19c9fff5e35932a31c7aa9dec03a922cac552289f9597345c26"} Oct 07 13:58:10 crc kubenswrapper[4854]: I1007 13:58:10.647660 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5703fa35-b0ed-450d-8b90-2e9308878e46","Type":"ContainerDied","Data":"b7aeaf2fedbcc9a52f30e34ec63acb2c44c1af9d633b4192468f5c7a5d5d252b"} Oct 07 13:58:10 crc kubenswrapper[4854]: I1007 13:58:10.647677 4854 scope.go:117] "RemoveContainer" containerID="b6d768382e6f71136c676ec300dca2b662583a783220c9c8afdce3206e1ac6c6" Oct 07 13:58:10 crc kubenswrapper[4854]: I1007 13:58:10.647825 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 07 13:58:10 crc kubenswrapper[4854]: I1007 13:58:10.656099 4854 generic.go:334] "Generic (PLEG): container finished" podID="bf8bce1d-3d34-42f1-a24f-6a1a8465b4de" containerID="6374906bd038e074c7f6d42dea844447a53f245f2c75744401ae08c2eeb71e53" exitCode=0 Oct 07 13:58:10 crc kubenswrapper[4854]: I1007 13:58:10.656121 4854 generic.go:334] "Generic (PLEG): container finished" podID="bf8bce1d-3d34-42f1-a24f-6a1a8465b4de" containerID="38b5e37ba8bcfdbc5b8566e536cc24701d88c6c2e559e288669f91794aa42af4" exitCode=143 Oct 07 13:58:10 crc kubenswrapper[4854]: I1007 13:58:10.656160 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bf8bce1d-3d34-42f1-a24f-6a1a8465b4de","Type":"ContainerDied","Data":"6374906bd038e074c7f6d42dea844447a53f245f2c75744401ae08c2eeb71e53"} Oct 07 13:58:10 crc kubenswrapper[4854]: I1007 13:58:10.656179 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bf8bce1d-3d34-42f1-a24f-6a1a8465b4de","Type":"ContainerDied","Data":"38b5e37ba8bcfdbc5b8566e536cc24701d88c6c2e559e288669f91794aa42af4"} Oct 07 13:58:10 crc kubenswrapper[4854]: I1007 13:58:10.656191 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bf8bce1d-3d34-42f1-a24f-6a1a8465b4de","Type":"ContainerDied","Data":"c876a88ca7fd00a956769e57054079f6217edc6bfc280103eeb3f71e67919dbc"} Oct 07 13:58:10 crc kubenswrapper[4854]: I1007 13:58:10.656250 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 13:58:10 crc kubenswrapper[4854]: I1007 13:58:10.693568 4854 scope.go:117] "RemoveContainer" containerID="d941ba05d9c0c19c9fff5e35932a31c7aa9dec03a922cac552289f9597345c26" Oct 07 13:58:10 crc kubenswrapper[4854]: I1007 13:58:10.722528 4854 scope.go:117] "RemoveContainer" containerID="b6d768382e6f71136c676ec300dca2b662583a783220c9c8afdce3206e1ac6c6" Oct 07 13:58:10 crc kubenswrapper[4854]: E1007 13:58:10.722976 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6d768382e6f71136c676ec300dca2b662583a783220c9c8afdce3206e1ac6c6\": container with ID starting with b6d768382e6f71136c676ec300dca2b662583a783220c9c8afdce3206e1ac6c6 not found: ID does not exist" containerID="b6d768382e6f71136c676ec300dca2b662583a783220c9c8afdce3206e1ac6c6" Oct 07 13:58:10 crc kubenswrapper[4854]: I1007 13:58:10.723019 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6d768382e6f71136c676ec300dca2b662583a783220c9c8afdce3206e1ac6c6"} err="failed to get container status \"b6d768382e6f71136c676ec300dca2b662583a783220c9c8afdce3206e1ac6c6\": rpc error: code = NotFound desc = could not find container \"b6d768382e6f71136c676ec300dca2b662583a783220c9c8afdce3206e1ac6c6\": container with ID starting with b6d768382e6f71136c676ec300dca2b662583a783220c9c8afdce3206e1ac6c6 not found: ID does not exist" Oct 07 13:58:10 crc kubenswrapper[4854]: I1007 13:58:10.723046 4854 scope.go:117] "RemoveContainer" containerID="d941ba05d9c0c19c9fff5e35932a31c7aa9dec03a922cac552289f9597345c26" Oct 07 13:58:10 crc kubenswrapper[4854]: E1007 13:58:10.723307 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d941ba05d9c0c19c9fff5e35932a31c7aa9dec03a922cac552289f9597345c26\": container with ID starting with d941ba05d9c0c19c9fff5e35932a31c7aa9dec03a922cac552289f9597345c26 not found: ID does not exist" containerID="d941ba05d9c0c19c9fff5e35932a31c7aa9dec03a922cac552289f9597345c26" Oct 07 13:58:10 crc kubenswrapper[4854]: I1007 13:58:10.723341 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d941ba05d9c0c19c9fff5e35932a31c7aa9dec03a922cac552289f9597345c26"} err="failed to get container status \"d941ba05d9c0c19c9fff5e35932a31c7aa9dec03a922cac552289f9597345c26\": rpc error: code = NotFound desc = could not find container \"d941ba05d9c0c19c9fff5e35932a31c7aa9dec03a922cac552289f9597345c26\": container with ID starting with d941ba05d9c0c19c9fff5e35932a31c7aa9dec03a922cac552289f9597345c26 not found: ID does not exist" Oct 07 13:58:10 crc kubenswrapper[4854]: I1007 13:58:10.723368 4854 scope.go:117] "RemoveContainer" containerID="b6d768382e6f71136c676ec300dca2b662583a783220c9c8afdce3206e1ac6c6" Oct 07 13:58:10 crc kubenswrapper[4854]: I1007 13:58:10.723585 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6d768382e6f71136c676ec300dca2b662583a783220c9c8afdce3206e1ac6c6"} err="failed to get container status \"b6d768382e6f71136c676ec300dca2b662583a783220c9c8afdce3206e1ac6c6\": rpc error: code = NotFound desc = could not find container \"b6d768382e6f71136c676ec300dca2b662583a783220c9c8afdce3206e1ac6c6\": container with ID starting with b6d768382e6f71136c676ec300dca2b662583a783220c9c8afdce3206e1ac6c6 not found: ID does not exist" Oct 07 13:58:10 crc kubenswrapper[4854]: I1007 13:58:10.723600 4854 
scope.go:117] "RemoveContainer" containerID="d941ba05d9c0c19c9fff5e35932a31c7aa9dec03a922cac552289f9597345c26" Oct 07 13:58:10 crc kubenswrapper[4854]: I1007 13:58:10.723759 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5703fa35-b0ed-450d-8b90-2e9308878e46-logs\") pod \"5703fa35-b0ed-450d-8b90-2e9308878e46\" (UID: \"5703fa35-b0ed-450d-8b90-2e9308878e46\") " Oct 07 13:58:10 crc kubenswrapper[4854]: I1007 13:58:10.723874 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dl5f2\" (UniqueName: \"kubernetes.io/projected/bf8bce1d-3d34-42f1-a24f-6a1a8465b4de-kube-api-access-dl5f2\") pod \"bf8bce1d-3d34-42f1-a24f-6a1a8465b4de\" (UID: \"bf8bce1d-3d34-42f1-a24f-6a1a8465b4de\") " Oct 07 13:58:10 crc kubenswrapper[4854]: I1007 13:58:10.723921 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d941ba05d9c0c19c9fff5e35932a31c7aa9dec03a922cac552289f9597345c26"} err="failed to get container status \"d941ba05d9c0c19c9fff5e35932a31c7aa9dec03a922cac552289f9597345c26\": rpc error: code = NotFound desc = could not find container \"d941ba05d9c0c19c9fff5e35932a31c7aa9dec03a922cac552289f9597345c26\": container with ID starting with d941ba05d9c0c19c9fff5e35932a31c7aa9dec03a922cac552289f9597345c26 not found: ID does not exist" Oct 07 13:58:10 crc kubenswrapper[4854]: I1007 13:58:10.723977 4854 scope.go:117] "RemoveContainer" containerID="6374906bd038e074c7f6d42dea844447a53f245f2c75744401ae08c2eeb71e53" Oct 07 13:58:10 crc kubenswrapper[4854]: I1007 13:58:10.723957 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5703fa35-b0ed-450d-8b90-2e9308878e46-combined-ca-bundle\") pod \"5703fa35-b0ed-450d-8b90-2e9308878e46\" (UID: \"5703fa35-b0ed-450d-8b90-2e9308878e46\") " Oct 07 13:58:10 crc kubenswrapper[4854]: I1007 13:58:10.724279 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf8bce1d-3d34-42f1-a24f-6a1a8465b4de-logs\") pod \"bf8bce1d-3d34-42f1-a24f-6a1a8465b4de\" (UID: \"bf8bce1d-3d34-42f1-a24f-6a1a8465b4de\") " Oct 07 13:58:10 crc kubenswrapper[4854]: I1007 13:58:10.724363 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbr84\" (UniqueName: \"kubernetes.io/projected/5703fa35-b0ed-450d-8b90-2e9308878e46-kube-api-access-cbr84\") pod \"5703fa35-b0ed-450d-8b90-2e9308878e46\" (UID: \"5703fa35-b0ed-450d-8b90-2e9308878e46\") " Oct 07 13:58:10 crc kubenswrapper[4854]: I1007 13:58:10.724411 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf8bce1d-3d34-42f1-a24f-6a1a8465b4de-combined-ca-bundle\") pod \"bf8bce1d-3d34-42f1-a24f-6a1a8465b4de\" (UID: \"bf8bce1d-3d34-42f1-a24f-6a1a8465b4de\") " Oct 07 13:58:10 crc kubenswrapper[4854]: I1007 13:58:10.724527 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5703fa35-b0ed-450d-8b90-2e9308878e46-config-data\") pod \"5703fa35-b0ed-450d-8b90-2e9308878e46\" (UID: \"5703fa35-b0ed-450d-8b90-2e9308878e46\") " Oct 07 13:58:10 crc kubenswrapper[4854]: I1007 13:58:10.724559 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/bf8bce1d-3d34-42f1-a24f-6a1a8465b4de-config-data\") pod \"bf8bce1d-3d34-42f1-a24f-6a1a8465b4de\" (UID: \"bf8bce1d-3d34-42f1-a24f-6a1a8465b4de\") " Oct 07 13:58:10 crc kubenswrapper[4854]: I1007 13:58:10.724693 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf8bce1d-3d34-42f1-a24f-6a1a8465b4de-logs" (OuterVolumeSpecName: "logs") pod "bf8bce1d-3d34-42f1-a24f-6a1a8465b4de" (UID: "bf8bce1d-3d34-42f1-a24f-6a1a8465b4de"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:58:10 crc kubenswrapper[4854]: I1007 13:58:10.725552 4854 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf8bce1d-3d34-42f1-a24f-6a1a8465b4de-logs\") on node \"crc\" DevicePath \"\"" Oct 07 13:58:10 crc kubenswrapper[4854]: I1007 13:58:10.726776 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5703fa35-b0ed-450d-8b90-2e9308878e46-logs" (OuterVolumeSpecName: "logs") pod "5703fa35-b0ed-450d-8b90-2e9308878e46" (UID: "5703fa35-b0ed-450d-8b90-2e9308878e46"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:58:10 crc kubenswrapper[4854]: I1007 13:58:10.729732 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf8bce1d-3d34-42f1-a24f-6a1a8465b4de-kube-api-access-dl5f2" (OuterVolumeSpecName: "kube-api-access-dl5f2") pod "bf8bce1d-3d34-42f1-a24f-6a1a8465b4de" (UID: "bf8bce1d-3d34-42f1-a24f-6a1a8465b4de"). InnerVolumeSpecName "kube-api-access-dl5f2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:58:10 crc kubenswrapper[4854]: I1007 13:58:10.730185 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5703fa35-b0ed-450d-8b90-2e9308878e46-kube-api-access-cbr84" (OuterVolumeSpecName: "kube-api-access-cbr84") pod "5703fa35-b0ed-450d-8b90-2e9308878e46" (UID: "5703fa35-b0ed-450d-8b90-2e9308878e46"). InnerVolumeSpecName "kube-api-access-cbr84". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:58:10 crc kubenswrapper[4854]: I1007 13:58:10.760032 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf8bce1d-3d34-42f1-a24f-6a1a8465b4de-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bf8bce1d-3d34-42f1-a24f-6a1a8465b4de" (UID: "bf8bce1d-3d34-42f1-a24f-6a1a8465b4de"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:58:10 crc kubenswrapper[4854]: I1007 13:58:10.763103 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5703fa35-b0ed-450d-8b90-2e9308878e46-config-data" (OuterVolumeSpecName: "config-data") pod "5703fa35-b0ed-450d-8b90-2e9308878e46" (UID: "5703fa35-b0ed-450d-8b90-2e9308878e46"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:58:10 crc kubenswrapper[4854]: I1007 13:58:10.763621 4854 scope.go:117] "RemoveContainer" containerID="38b5e37ba8bcfdbc5b8566e536cc24701d88c6c2e559e288669f91794aa42af4" Oct 07 13:58:10 crc kubenswrapper[4854]: I1007 13:58:10.764923 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 07 13:58:10 crc kubenswrapper[4854]: I1007 13:58:10.767531 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf8bce1d-3d34-42f1-a24f-6a1a8465b4de-config-data" (OuterVolumeSpecName: "config-data") pod "bf8bce1d-3d34-42f1-a24f-6a1a8465b4de" (UID: "bf8bce1d-3d34-42f1-a24f-6a1a8465b4de"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:58:10 crc kubenswrapper[4854]: I1007 13:58:10.777165 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5703fa35-b0ed-450d-8b90-2e9308878e46-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5703fa35-b0ed-450d-8b90-2e9308878e46" (UID: "5703fa35-b0ed-450d-8b90-2e9308878e46"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:58:10 crc kubenswrapper[4854]: I1007 13:58:10.782922 4854 scope.go:117] "RemoveContainer" containerID="6374906bd038e074c7f6d42dea844447a53f245f2c75744401ae08c2eeb71e53" Oct 07 13:58:10 crc kubenswrapper[4854]: E1007 13:58:10.785787 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6374906bd038e074c7f6d42dea844447a53f245f2c75744401ae08c2eeb71e53\": container with ID starting with 6374906bd038e074c7f6d42dea844447a53f245f2c75744401ae08c2eeb71e53 not found: ID does not exist" containerID="6374906bd038e074c7f6d42dea844447a53f245f2c75744401ae08c2eeb71e53" Oct 07 13:58:10 crc kubenswrapper[4854]: I1007 13:58:10.785978 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6374906bd038e074c7f6d42dea844447a53f245f2c75744401ae08c2eeb71e53"} err="failed to get container status \"6374906bd038e074c7f6d42dea844447a53f245f2c75744401ae08c2eeb71e53\": rpc error: code = NotFound desc = could not find container \"6374906bd038e074c7f6d42dea844447a53f245f2c75744401ae08c2eeb71e53\": container with ID starting with 6374906bd038e074c7f6d42dea844447a53f245f2c75744401ae08c2eeb71e53 not found: ID does not exist" Oct 07 13:58:10 crc kubenswrapper[4854]: I1007 13:58:10.786208 4854 scope.go:117] "RemoveContainer" containerID="38b5e37ba8bcfdbc5b8566e536cc24701d88c6c2e559e288669f91794aa42af4" Oct 07 13:58:10 crc kubenswrapper[4854]: E1007 13:58:10.786766 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38b5e37ba8bcfdbc5b8566e536cc24701d88c6c2e559e288669f91794aa42af4\": container with ID starting with 38b5e37ba8bcfdbc5b8566e536cc24701d88c6c2e559e288669f91794aa42af4 not found: ID does not exist" containerID="38b5e37ba8bcfdbc5b8566e536cc24701d88c6c2e559e288669f91794aa42af4" Oct 07 13:58:10 crc kubenswrapper[4854]: I1007 13:58:10.786815 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38b5e37ba8bcfdbc5b8566e536cc24701d88c6c2e559e288669f91794aa42af4"} err="failed to get container status \"38b5e37ba8bcfdbc5b8566e536cc24701d88c6c2e559e288669f91794aa42af4\": rpc error: code = NotFound desc = could not find container 
\"38b5e37ba8bcfdbc5b8566e536cc24701d88c6c2e559e288669f91794aa42af4\": container with ID starting with 38b5e37ba8bcfdbc5b8566e536cc24701d88c6c2e559e288669f91794aa42af4 not found: ID does not exist" Oct 07 13:58:10 crc kubenswrapper[4854]: I1007 13:58:10.786844 4854 scope.go:117] "RemoveContainer" containerID="6374906bd038e074c7f6d42dea844447a53f245f2c75744401ae08c2eeb71e53" Oct 07 13:58:10 crc kubenswrapper[4854]: I1007 13:58:10.788126 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6374906bd038e074c7f6d42dea844447a53f245f2c75744401ae08c2eeb71e53"} err="failed to get container status \"6374906bd038e074c7f6d42dea844447a53f245f2c75744401ae08c2eeb71e53\": rpc error: code = NotFound desc = could not find container \"6374906bd038e074c7f6d42dea844447a53f245f2c75744401ae08c2eeb71e53\": container with ID starting with 6374906bd038e074c7f6d42dea844447a53f245f2c75744401ae08c2eeb71e53 not found: ID does not exist" Oct 07 13:58:10 crc kubenswrapper[4854]: I1007 13:58:10.788169 4854 scope.go:117] "RemoveContainer" containerID="38b5e37ba8bcfdbc5b8566e536cc24701d88c6c2e559e288669f91794aa42af4" Oct 07 13:58:10 crc kubenswrapper[4854]: I1007 13:58:10.788433 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38b5e37ba8bcfdbc5b8566e536cc24701d88c6c2e559e288669f91794aa42af4"} err="failed to get container status \"38b5e37ba8bcfdbc5b8566e536cc24701d88c6c2e559e288669f91794aa42af4\": rpc error: code = NotFound desc = could not find container \"38b5e37ba8bcfdbc5b8566e536cc24701d88c6c2e559e288669f91794aa42af4\": container with ID starting with 38b5e37ba8bcfdbc5b8566e536cc24701d88c6c2e559e288669f91794aa42af4 not found: ID does not exist" Oct 07 13:58:10 crc kubenswrapper[4854]: I1007 13:58:10.826738 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5703fa35-b0ed-450d-8b90-2e9308878e46-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:58:10 crc kubenswrapper[4854]: I1007 13:58:10.826770 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf8bce1d-3d34-42f1-a24f-6a1a8465b4de-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:58:10 crc kubenswrapper[4854]: I1007 13:58:10.826779 4854 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5703fa35-b0ed-450d-8b90-2e9308878e46-logs\") on node \"crc\" DevicePath \"\"" Oct 07 13:58:10 crc kubenswrapper[4854]: I1007 13:58:10.826789 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dl5f2\" (UniqueName: \"kubernetes.io/projected/bf8bce1d-3d34-42f1-a24f-6a1a8465b4de-kube-api-access-dl5f2\") on node \"crc\" DevicePath \"\"" Oct 07 13:58:10 crc kubenswrapper[4854]: I1007 13:58:10.826799 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5703fa35-b0ed-450d-8b90-2e9308878e46-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:58:10 crc kubenswrapper[4854]: I1007 13:58:10.826807 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbr84\" (UniqueName: \"kubernetes.io/projected/5703fa35-b0ed-450d-8b90-2e9308878e46-kube-api-access-cbr84\") on node \"crc\" DevicePath \"\"" Oct 07 13:58:10 crc kubenswrapper[4854]: I1007 13:58:10.826816 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bf8bce1d-3d34-42f1-a24f-6a1a8465b4de-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:58:11 crc kubenswrapper[4854]: I1007 13:58:11.066679 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 07 13:58:11 crc kubenswrapper[4854]: I1007 13:58:11.082525 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 07 13:58:11 crc kubenswrapper[4854]: I1007 13:58:11.097676 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 13:58:11 crc kubenswrapper[4854]: I1007 13:58:11.107130 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 13:58:11 crc kubenswrapper[4854]: I1007 13:58:11.127640 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 07 13:58:11 crc kubenswrapper[4854]: E1007 13:58:11.128062 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf8bce1d-3d34-42f1-a24f-6a1a8465b4de" containerName="nova-metadata-metadata" Oct 07 13:58:11 crc kubenswrapper[4854]: I1007 13:58:11.128080 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf8bce1d-3d34-42f1-a24f-6a1a8465b4de" containerName="nova-metadata-metadata" Oct 07 13:58:11 crc kubenswrapper[4854]: E1007 13:58:11.128102 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf8bce1d-3d34-42f1-a24f-6a1a8465b4de" containerName="nova-metadata-log" Oct 07 13:58:11 crc kubenswrapper[4854]: I1007 13:58:11.128109 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf8bce1d-3d34-42f1-a24f-6a1a8465b4de" containerName="nova-metadata-log" Oct 07 13:58:11 crc kubenswrapper[4854]: E1007 13:58:11.128121 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5703fa35-b0ed-450d-8b90-2e9308878e46" containerName="nova-api-log" Oct 07 13:58:11 crc kubenswrapper[4854]: I1007 13:58:11.128128 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="5703fa35-b0ed-450d-8b90-2e9308878e46" containerName="nova-api-log" Oct 07 13:58:11 crc kubenswrapper[4854]: E1007 13:58:11.128165 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5703fa35-b0ed-450d-8b90-2e9308878e46" containerName="nova-api-api" Oct 07 13:58:11 crc kubenswrapper[4854]: I1007 13:58:11.128171 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="5703fa35-b0ed-450d-8b90-2e9308878e46" containerName="nova-api-api" Oct 07 13:58:11 crc kubenswrapper[4854]: I1007 13:58:11.128341 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="5703fa35-b0ed-450d-8b90-2e9308878e46" containerName="nova-api-log" Oct 07 13:58:11 crc kubenswrapper[4854]: I1007 13:58:11.128352 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="5703fa35-b0ed-450d-8b90-2e9308878e46" containerName="nova-api-api" Oct 07 13:58:11 crc kubenswrapper[4854]: I1007 13:58:11.128360 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf8bce1d-3d34-42f1-a24f-6a1a8465b4de" containerName="nova-metadata-metadata" Oct 07 13:58:11 crc kubenswrapper[4854]: I1007 13:58:11.128371 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf8bce1d-3d34-42f1-a24f-6a1a8465b4de" containerName="nova-metadata-log" Oct 07 13:58:11 crc kubenswrapper[4854]: I1007 13:58:11.129596 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 07 13:58:11 crc kubenswrapper[4854]: I1007 13:58:11.134620 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 07 13:58:11 crc kubenswrapper[4854]: I1007 13:58:11.136051 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 07 13:58:11 crc kubenswrapper[4854]: I1007 13:58:11.156046 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 07 13:58:11 crc kubenswrapper[4854]: I1007 13:58:11.164861 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 13:58:11 crc kubenswrapper[4854]: I1007 13:58:11.190258 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 07 13:58:11 crc kubenswrapper[4854]: I1007 13:58:11.202221 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 13:58:11 crc kubenswrapper[4854]: I1007 13:58:11.241030 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4be194e-57aa-4419-9561-11bd83a70a6b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e4be194e-57aa-4419-9561-11bd83a70a6b\") " pod="openstack/nova-api-0" Oct 07 13:58:11 crc kubenswrapper[4854]: I1007 13:58:11.241120 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4be194e-57aa-4419-9561-11bd83a70a6b-config-data\") pod \"nova-api-0\" (UID: \"e4be194e-57aa-4419-9561-11bd83a70a6b\") " pod="openstack/nova-api-0" Oct 07 13:58:11 crc kubenswrapper[4854]: I1007 13:58:11.241538 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8fgj\" (UniqueName: \"kubernetes.io/projected/e4be194e-57aa-4419-9561-11bd83a70a6b-kube-api-access-j8fgj\") pod \"nova-api-0\" (UID: \"e4be194e-57aa-4419-9561-11bd83a70a6b\") " pod="openstack/nova-api-0" Oct 07 13:58:11 crc kubenswrapper[4854]: I1007 13:58:11.241846 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dee1e3fd-c301-45c0-b555-a70c1f3c86e7-config-data\") pod \"nova-metadata-0\" (UID: \"dee1e3fd-c301-45c0-b555-a70c1f3c86e7\") " pod="openstack/nova-metadata-0" Oct 07 13:58:11 crc kubenswrapper[4854]: I1007 13:58:11.241914 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4be194e-57aa-4419-9561-11bd83a70a6b-logs\") pod \"nova-api-0\" (UID: \"e4be194e-57aa-4419-9561-11bd83a70a6b\") " pod="openstack/nova-api-0" Oct 07 13:58:11 crc kubenswrapper[4854]: I1007 13:58:11.242011 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dee1e3fd-c301-45c0-b555-a70c1f3c86e7-logs\") pod \"nova-metadata-0\" (UID: \"dee1e3fd-c301-45c0-b555-a70c1f3c86e7\") " pod="openstack/nova-metadata-0" Oct 07 13:58:11 crc kubenswrapper[4854]: I1007 13:58:11.242046 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dee1e3fd-c301-45c0-b555-a70c1f3c86e7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"dee1e3fd-c301-45c0-b555-a70c1f3c86e7\") " pod="openstack/nova-metadata-0" Oct 07 13:58:11 crc kubenswrapper[4854]: I1007 13:58:11.242190 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5n9t4\" (UniqueName: \"kubernetes.io/projected/dee1e3fd-c301-45c0-b555-a70c1f3c86e7-kube-api-access-5n9t4\") pod \"nova-metadata-0\" (UID: \"dee1e3fd-c301-45c0-b555-a70c1f3c86e7\") " pod="openstack/nova-metadata-0" Oct 07 13:58:11 crc kubenswrapper[4854]: I1007 13:58:11.344333 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dee1e3fd-c301-45c0-b555-a70c1f3c86e7-config-data\") pod \"nova-metadata-0\" (UID: \"dee1e3fd-c301-45c0-b555-a70c1f3c86e7\") " pod="openstack/nova-metadata-0" Oct 07 13:58:11 crc kubenswrapper[4854]: I1007 13:58:11.344380 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4be194e-57aa-4419-9561-11bd83a70a6b-logs\") pod \"nova-api-0\" (UID: \"e4be194e-57aa-4419-9561-11bd83a70a6b\") " pod="openstack/nova-api-0" Oct 07 13:58:11 crc kubenswrapper[4854]: I1007 13:58:11.344409 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dee1e3fd-c301-45c0-b555-a70c1f3c86e7-logs\") pod \"nova-metadata-0\" (UID: \"dee1e3fd-c301-45c0-b555-a70c1f3c86e7\") " pod="openstack/nova-metadata-0" Oct 07 13:58:11 crc kubenswrapper[4854]: I1007 13:58:11.344426 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dee1e3fd-c301-45c0-b555-a70c1f3c86e7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"dee1e3fd-c301-45c0-b555-a70c1f3c86e7\") " pod="openstack/nova-metadata-0" Oct 07 13:58:11 crc kubenswrapper[4854]: I1007 13:58:11.344467 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5n9t4\" (UniqueName: \"kubernetes.io/projected/dee1e3fd-c301-45c0-b555-a70c1f3c86e7-kube-api-access-5n9t4\") pod \"nova-metadata-0\" (UID: \"dee1e3fd-c301-45c0-b555-a70c1f3c86e7\") " pod="openstack/nova-metadata-0" Oct 07 13:58:11 crc kubenswrapper[4854]: I1007 13:58:11.344500 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4be194e-57aa-4419-9561-11bd83a70a6b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e4be194e-57aa-4419-9561-11bd83a70a6b\") " pod="openstack/nova-api-0" Oct 07 13:58:11 crc kubenswrapper[4854]: I1007 13:58:11.344526 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4be194e-57aa-4419-9561-11bd83a70a6b-config-data\") pod \"nova-api-0\" (UID: \"e4be194e-57aa-4419-9561-11bd83a70a6b\") " pod="openstack/nova-api-0" Oct 07 13:58:11 crc kubenswrapper[4854]: I1007 13:58:11.344561 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8fgj\" (UniqueName: \"kubernetes.io/projected/e4be194e-57aa-4419-9561-11bd83a70a6b-kube-api-access-j8fgj\") pod \"nova-api-0\" (UID: \"e4be194e-57aa-4419-9561-11bd83a70a6b\") " pod="openstack/nova-api-0" Oct 07 13:58:11 crc kubenswrapper[4854]: I1007 13:58:11.345001 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4be194e-57aa-4419-9561-11bd83a70a6b-logs\") pod 
\"nova-api-0\" (UID: \"e4be194e-57aa-4419-9561-11bd83a70a6b\") " pod="openstack/nova-api-0" Oct 07 13:58:11 crc kubenswrapper[4854]: I1007 13:58:11.345556 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dee1e3fd-c301-45c0-b555-a70c1f3c86e7-logs\") pod \"nova-metadata-0\" (UID: \"dee1e3fd-c301-45c0-b555-a70c1f3c86e7\") " pod="openstack/nova-metadata-0" Oct 07 13:58:11 crc kubenswrapper[4854]: I1007 13:58:11.349565 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4be194e-57aa-4419-9561-11bd83a70a6b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e4be194e-57aa-4419-9561-11bd83a70a6b\") " pod="openstack/nova-api-0" Oct 07 13:58:11 crc kubenswrapper[4854]: I1007 13:58:11.361850 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dee1e3fd-c301-45c0-b555-a70c1f3c86e7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"dee1e3fd-c301-45c0-b555-a70c1f3c86e7\") " pod="openstack/nova-metadata-0" Oct 07 13:58:11 crc kubenswrapper[4854]: I1007 13:58:11.362040 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dee1e3fd-c301-45c0-b555-a70c1f3c86e7-config-data\") pod \"nova-metadata-0\" (UID: \"dee1e3fd-c301-45c0-b555-a70c1f3c86e7\") " pod="openstack/nova-metadata-0" Oct 07 13:58:11 crc kubenswrapper[4854]: I1007 13:58:11.362396 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4be194e-57aa-4419-9561-11bd83a70a6b-config-data\") pod \"nova-api-0\" (UID: \"e4be194e-57aa-4419-9561-11bd83a70a6b\") " pod="openstack/nova-api-0" Oct 07 13:58:11 crc kubenswrapper[4854]: I1007 13:58:11.365656 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5n9t4\" (UniqueName: \"kubernetes.io/projected/dee1e3fd-c301-45c0-b555-a70c1f3c86e7-kube-api-access-5n9t4\") pod \"nova-metadata-0\" (UID: \"dee1e3fd-c301-45c0-b555-a70c1f3c86e7\") " pod="openstack/nova-metadata-0" Oct 07 13:58:11 crc kubenswrapper[4854]: I1007 13:58:11.373345 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8fgj\" (UniqueName: \"kubernetes.io/projected/e4be194e-57aa-4419-9561-11bd83a70a6b-kube-api-access-j8fgj\") pod \"nova-api-0\" (UID: \"e4be194e-57aa-4419-9561-11bd83a70a6b\") " pod="openstack/nova-api-0" Oct 07 13:58:11 crc kubenswrapper[4854]: I1007 13:58:11.477176 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 07 13:58:11 crc kubenswrapper[4854]: I1007 13:58:11.500245 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 13:58:11 crc kubenswrapper[4854]: I1007 13:58:11.680033 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"1c81fc08-b696-4c22-88fa-6842e8725af3","Type":"ContainerStarted","Data":"4f13e983b0dd409c06f1c98b96c956be1610e44a71af7d4186271aed239db28b"} Oct 07 13:58:11 crc kubenswrapper[4854]: I1007 13:58:11.680426 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"1c81fc08-b696-4c22-88fa-6842e8725af3","Type":"ContainerStarted","Data":"9f607307c26511e2856d05a69e7b44b9b6e3a65f4cb46a85c5e9a62c5f0d150d"} Oct 07 13:58:11 crc kubenswrapper[4854]: I1007 13:58:11.680454 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 07 13:58:11 crc kubenswrapper[4854]: I1007 13:58:11.698439 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.698408364 podStartE2EDuration="2.698408364s" podCreationTimestamp="2025-10-07 13:58:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:58:11.695784299 +0000 UTC m=+5607.683616564" watchObservedRunningTime="2025-10-07 13:58:11.698408364 +0000 UTC m=+5607.686240639" Oct 07 13:58:11 crc kubenswrapper[4854]: I1007 13:58:11.927686 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Oct 07 13:58:11 crc kubenswrapper[4854]: I1007 13:58:11.942333 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Oct 07 13:58:11 crc kubenswrapper[4854]: I1007 13:58:11.995916 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 13:58:12 crc kubenswrapper[4854]: I1007 13:58:12.046361 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 07 13:58:12 crc kubenswrapper[4854]: W1007 13:58:12.051632 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4be194e_57aa_4419_9561_11bd83a70a6b.slice/crio-4d41535791212d11062ad3c8b288065f8f147310cbfc1c5a7707041849ae1f35 WatchSource:0}: Error finding container 4d41535791212d11062ad3c8b288065f8f147310cbfc1c5a7707041849ae1f35: Status 404 returned error can't find the container with id 4d41535791212d11062ad3c8b288065f8f147310cbfc1c5a7707041849ae1f35 Oct 07 13:58:12 crc kubenswrapper[4854]: I1007 13:58:12.497623 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6d9676b587-h7nbq" Oct 07 13:58:12 crc kubenswrapper[4854]: I1007 13:58:12.578112 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7dfbc9b597-t66bm"] Oct 07 13:58:12 crc kubenswrapper[4854]: I1007 13:58:12.578394 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7dfbc9b597-t66bm" podUID="9309351f-1023-49a9-b6ff-205232665c04" containerName="dnsmasq-dns" containerID="cri-o://18b0a4d37522dcd0c2daa09cbadfcd71e73f6cb9d57b2ecedcb1490898b463d3" gracePeriod=10 Oct 07 13:58:12 crc kubenswrapper[4854]: I1007 13:58:12.726950 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.726924349 podStartE2EDuration="1.726924349s" 
podCreationTimestamp="2025-10-07 13:58:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:58:12.719796945 +0000 UTC m=+5608.707629200" watchObservedRunningTime="2025-10-07 13:58:12.726924349 +0000 UTC m=+5608.714756604" Oct 07 13:58:12 crc kubenswrapper[4854]: I1007 13:58:12.726988 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5703fa35-b0ed-450d-8b90-2e9308878e46" path="/var/lib/kubelet/pods/5703fa35-b0ed-450d-8b90-2e9308878e46/volumes" Oct 07 13:58:12 crc kubenswrapper[4854]: I1007 13:58:12.736078 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf8bce1d-3d34-42f1-a24f-6a1a8465b4de" path="/var/lib/kubelet/pods/bf8bce1d-3d34-42f1-a24f-6a1a8465b4de/volumes" Oct 07 13:58:12 crc kubenswrapper[4854]: I1007 13:58:12.737193 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Oct 07 13:58:12 crc kubenswrapper[4854]: I1007 13:58:12.737224 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dee1e3fd-c301-45c0-b555-a70c1f3c86e7","Type":"ContainerStarted","Data":"d5ea5a54496e838153824526480a99292395b4eea6263367b2914bf07b7c35b5"} Oct 07 13:58:12 crc kubenswrapper[4854]: I1007 13:58:12.737242 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dee1e3fd-c301-45c0-b555-a70c1f3c86e7","Type":"ContainerStarted","Data":"68b18e28ecd134481ddd133013f390e339bcb384a33f06a8f6642c70c7f66270"} Oct 07 13:58:12 crc kubenswrapper[4854]: I1007 13:58:12.737255 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dee1e3fd-c301-45c0-b555-a70c1f3c86e7","Type":"ContainerStarted","Data":"914ccd041df4ea7d26e9cc13402c84acf1e7c045e892ddce4a6604381cd360d7"} Oct 07 13:58:12 crc kubenswrapper[4854]: I1007 13:58:12.737264 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e4be194e-57aa-4419-9561-11bd83a70a6b","Type":"ContainerStarted","Data":"eca8e3be4b0ddfa1d24447041c7aa4ec4c7cd0e95817a57a30c933d5ebdc5f13"} Oct 07 13:58:12 crc kubenswrapper[4854]: I1007 13:58:12.737273 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e4be194e-57aa-4419-9561-11bd83a70a6b","Type":"ContainerStarted","Data":"983101f09c64e776079735a0d4632e94cacbc8d6f7f3c09b54cccacf13ff51dc"} Oct 07 13:58:12 crc kubenswrapper[4854]: I1007 13:58:12.737282 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e4be194e-57aa-4419-9561-11bd83a70a6b","Type":"ContainerStarted","Data":"4d41535791212d11062ad3c8b288065f8f147310cbfc1c5a7707041849ae1f35"} Oct 07 13:58:12 crc kubenswrapper[4854]: I1007 13:58:12.750431 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.75040729 podStartE2EDuration="1.75040729s" podCreationTimestamp="2025-10-07 13:58:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:58:12.741389773 +0000 UTC m=+5608.729222028" watchObservedRunningTime="2025-10-07 13:58:12.75040729 +0000 UTC m=+5608.738239545" Oct 07 13:58:13 crc kubenswrapper[4854]: I1007 13:58:13.158055 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7dfbc9b597-t66bm" Oct 07 13:58:13 crc kubenswrapper[4854]: I1007 13:58:13.184528 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9309351f-1023-49a9-b6ff-205232665c04-config\") pod \"9309351f-1023-49a9-b6ff-205232665c04\" (UID: \"9309351f-1023-49a9-b6ff-205232665c04\") " Oct 07 13:58:13 crc kubenswrapper[4854]: I1007 13:58:13.184902 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9309351f-1023-49a9-b6ff-205232665c04-ovsdbserver-nb\") pod \"9309351f-1023-49a9-b6ff-205232665c04\" (UID: \"9309351f-1023-49a9-b6ff-205232665c04\") " Oct 07 13:58:13 crc kubenswrapper[4854]: I1007 13:58:13.184926 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9309351f-1023-49a9-b6ff-205232665c04-ovsdbserver-sb\") pod \"9309351f-1023-49a9-b6ff-205232665c04\" (UID: \"9309351f-1023-49a9-b6ff-205232665c04\") " Oct 07 13:58:13 crc kubenswrapper[4854]: I1007 13:58:13.185058 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrnws\" (UniqueName: \"kubernetes.io/projected/9309351f-1023-49a9-b6ff-205232665c04-kube-api-access-rrnws\") pod \"9309351f-1023-49a9-b6ff-205232665c04\" (UID: \"9309351f-1023-49a9-b6ff-205232665c04\") " Oct 07 13:58:13 crc kubenswrapper[4854]: I1007 13:58:13.185099 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9309351f-1023-49a9-b6ff-205232665c04-dns-svc\") pod \"9309351f-1023-49a9-b6ff-205232665c04\" (UID: \"9309351f-1023-49a9-b6ff-205232665c04\") " Oct 07 13:58:13 crc kubenswrapper[4854]: I1007 13:58:13.222629 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9309351f-1023-49a9-b6ff-205232665c04-kube-api-access-rrnws" (OuterVolumeSpecName: "kube-api-access-rrnws") pod "9309351f-1023-49a9-b6ff-205232665c04" (UID: "9309351f-1023-49a9-b6ff-205232665c04"). InnerVolumeSpecName "kube-api-access-rrnws". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:58:13 crc kubenswrapper[4854]: I1007 13:58:13.250279 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9309351f-1023-49a9-b6ff-205232665c04-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9309351f-1023-49a9-b6ff-205232665c04" (UID: "9309351f-1023-49a9-b6ff-205232665c04"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:58:13 crc kubenswrapper[4854]: I1007 13:58:13.251431 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9309351f-1023-49a9-b6ff-205232665c04-config" (OuterVolumeSpecName: "config") pod "9309351f-1023-49a9-b6ff-205232665c04" (UID: "9309351f-1023-49a9-b6ff-205232665c04"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:58:13 crc kubenswrapper[4854]: I1007 13:58:13.255404 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9309351f-1023-49a9-b6ff-205232665c04-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9309351f-1023-49a9-b6ff-205232665c04" (UID: "9309351f-1023-49a9-b6ff-205232665c04"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:58:13 crc kubenswrapper[4854]: I1007 13:58:13.258475 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9309351f-1023-49a9-b6ff-205232665c04-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9309351f-1023-49a9-b6ff-205232665c04" (UID: "9309351f-1023-49a9-b6ff-205232665c04"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:58:13 crc kubenswrapper[4854]: I1007 13:58:13.287101 4854 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9309351f-1023-49a9-b6ff-205232665c04-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 13:58:13 crc kubenswrapper[4854]: I1007 13:58:13.287135 4854 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9309351f-1023-49a9-b6ff-205232665c04-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:58:13 crc kubenswrapper[4854]: I1007 13:58:13.287164 4854 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9309351f-1023-49a9-b6ff-205232665c04-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 13:58:13 crc kubenswrapper[4854]: I1007 13:58:13.287174 4854 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9309351f-1023-49a9-b6ff-205232665c04-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 13:58:13 crc kubenswrapper[4854]: I1007 13:58:13.287185 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrnws\" (UniqueName: \"kubernetes.io/projected/9309351f-1023-49a9-b6ff-205232665c04-kube-api-access-rrnws\") on node \"crc\" DevicePath \"\"" Oct 07 13:58:13 crc kubenswrapper[4854]: I1007 13:58:13.723926 4854 generic.go:334] "Generic (PLEG): container finished" podID="9309351f-1023-49a9-b6ff-205232665c04" containerID="18b0a4d37522dcd0c2daa09cbadfcd71e73f6cb9d57b2ecedcb1490898b463d3" exitCode=0 Oct 07 13:58:13 crc kubenswrapper[4854]: I1007 13:58:13.724965 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7dfbc9b597-t66bm" Oct 07 13:58:13 crc kubenswrapper[4854]: I1007 13:58:13.726805 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7dfbc9b597-t66bm" event={"ID":"9309351f-1023-49a9-b6ff-205232665c04","Type":"ContainerDied","Data":"18b0a4d37522dcd0c2daa09cbadfcd71e73f6cb9d57b2ecedcb1490898b463d3"} Oct 07 13:58:13 crc kubenswrapper[4854]: I1007 13:58:13.726848 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7dfbc9b597-t66bm" event={"ID":"9309351f-1023-49a9-b6ff-205232665c04","Type":"ContainerDied","Data":"c29995054050c131aca52b43c8981da649e0ba25ffeaf57e968c14fe1b216440"} Oct 07 13:58:13 crc kubenswrapper[4854]: I1007 13:58:13.726871 4854 scope.go:117] "RemoveContainer" containerID="18b0a4d37522dcd0c2daa09cbadfcd71e73f6cb9d57b2ecedcb1490898b463d3" Oct 07 13:58:13 crc kubenswrapper[4854]: I1007 13:58:13.769317 4854 scope.go:117] "RemoveContainer" containerID="18ebb44fc33eb07c52e333aec1d675d152dcaead7e17401fdff86eceea16933e" Oct 07 13:58:13 crc kubenswrapper[4854]: I1007 13:58:13.782234 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7dfbc9b597-t66bm"] Oct 07 13:58:13 crc kubenswrapper[4854]: I1007 13:58:13.797751 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7dfbc9b597-t66bm"] Oct 07 13:58:13 crc kubenswrapper[4854]: I1007 13:58:13.841305 4854 scope.go:117] "RemoveContainer" containerID="18b0a4d37522dcd0c2daa09cbadfcd71e73f6cb9d57b2ecedcb1490898b463d3" Oct 07 13:58:13 crc kubenswrapper[4854]: E1007 13:58:13.841617 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18b0a4d37522dcd0c2daa09cbadfcd71e73f6cb9d57b2ecedcb1490898b463d3\": container with ID starting with 18b0a4d37522dcd0c2daa09cbadfcd71e73f6cb9d57b2ecedcb1490898b463d3 not found: ID does not exist" containerID="18b0a4d37522dcd0c2daa09cbadfcd71e73f6cb9d57b2ecedcb1490898b463d3" Oct 07 13:58:13 crc kubenswrapper[4854]: I1007 13:58:13.841656 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18b0a4d37522dcd0c2daa09cbadfcd71e73f6cb9d57b2ecedcb1490898b463d3"} err="failed to get container status \"18b0a4d37522dcd0c2daa09cbadfcd71e73f6cb9d57b2ecedcb1490898b463d3\": rpc error: code = NotFound desc = could not find container \"18b0a4d37522dcd0c2daa09cbadfcd71e73f6cb9d57b2ecedcb1490898b463d3\": container with ID starting with 18b0a4d37522dcd0c2daa09cbadfcd71e73f6cb9d57b2ecedcb1490898b463d3 not found: ID does not exist" Oct 07 13:58:13 crc kubenswrapper[4854]: I1007 13:58:13.841681 4854 scope.go:117] "RemoveContainer" containerID="18ebb44fc33eb07c52e333aec1d675d152dcaead7e17401fdff86eceea16933e" Oct 07 13:58:13 crc kubenswrapper[4854]: E1007 13:58:13.842142 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18ebb44fc33eb07c52e333aec1d675d152dcaead7e17401fdff86eceea16933e\": container with ID starting with 18ebb44fc33eb07c52e333aec1d675d152dcaead7e17401fdff86eceea16933e not found: ID does not exist" containerID="18ebb44fc33eb07c52e333aec1d675d152dcaead7e17401fdff86eceea16933e" Oct 07 13:58:13 crc kubenswrapper[4854]: I1007 13:58:13.842172 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18ebb44fc33eb07c52e333aec1d675d152dcaead7e17401fdff86eceea16933e"} err="failed to get container status 
\"18ebb44fc33eb07c52e333aec1d675d152dcaead7e17401fdff86eceea16933e\": rpc error: code = NotFound desc = could not find container \"18ebb44fc33eb07c52e333aec1d675d152dcaead7e17401fdff86eceea16933e\": container with ID starting with 18ebb44fc33eb07c52e333aec1d675d152dcaead7e17401fdff86eceea16933e not found: ID does not exist" Oct 07 13:58:14 crc kubenswrapper[4854]: I1007 13:58:14.198594 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 13:58:14 crc kubenswrapper[4854]: I1007 13:58:14.205712 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cc99286-dead-49ed-8dec-60c75a175855-config-data\") pod \"7cc99286-dead-49ed-8dec-60c75a175855\" (UID: \"7cc99286-dead-49ed-8dec-60c75a175855\") " Oct 07 13:58:14 crc kubenswrapper[4854]: I1007 13:58:14.205781 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cc99286-dead-49ed-8dec-60c75a175855-combined-ca-bundle\") pod \"7cc99286-dead-49ed-8dec-60c75a175855\" (UID: \"7cc99286-dead-49ed-8dec-60c75a175855\") " Oct 07 13:58:14 crc kubenswrapper[4854]: I1007 13:58:14.205907 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcqt9\" (UniqueName: \"kubernetes.io/projected/7cc99286-dead-49ed-8dec-60c75a175855-kube-api-access-hcqt9\") pod \"7cc99286-dead-49ed-8dec-60c75a175855\" (UID: \"7cc99286-dead-49ed-8dec-60c75a175855\") " Oct 07 13:58:14 crc kubenswrapper[4854]: I1007 13:58:14.210293 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cc99286-dead-49ed-8dec-60c75a175855-kube-api-access-hcqt9" (OuterVolumeSpecName: "kube-api-access-hcqt9") pod "7cc99286-dead-49ed-8dec-60c75a175855" (UID: "7cc99286-dead-49ed-8dec-60c75a175855"). InnerVolumeSpecName "kube-api-access-hcqt9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:58:14 crc kubenswrapper[4854]: I1007 13:58:14.240390 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cc99286-dead-49ed-8dec-60c75a175855-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7cc99286-dead-49ed-8dec-60c75a175855" (UID: "7cc99286-dead-49ed-8dec-60c75a175855"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:58:14 crc kubenswrapper[4854]: I1007 13:58:14.257557 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cc99286-dead-49ed-8dec-60c75a175855-config-data" (OuterVolumeSpecName: "config-data") pod "7cc99286-dead-49ed-8dec-60c75a175855" (UID: "7cc99286-dead-49ed-8dec-60c75a175855"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:58:14 crc kubenswrapper[4854]: I1007 13:58:14.317616 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cc99286-dead-49ed-8dec-60c75a175855-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:58:14 crc kubenswrapper[4854]: I1007 13:58:14.317665 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hcqt9\" (UniqueName: \"kubernetes.io/projected/7cc99286-dead-49ed-8dec-60c75a175855-kube-api-access-hcqt9\") on node \"crc\" DevicePath \"\"" Oct 07 13:58:14 crc kubenswrapper[4854]: I1007 13:58:14.317695 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cc99286-dead-49ed-8dec-60c75a175855-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:58:14 crc kubenswrapper[4854]: I1007 13:58:14.716624 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9309351f-1023-49a9-b6ff-205232665c04" path="/var/lib/kubelet/pods/9309351f-1023-49a9-b6ff-205232665c04/volumes" Oct 07 13:58:14 crc kubenswrapper[4854]: I1007 13:58:14.740073 4854 generic.go:334] "Generic (PLEG): container finished" podID="7cc99286-dead-49ed-8dec-60c75a175855" containerID="c7966ee62d53d283edbc2b95ca285ded36c86a09f0d15a8bebb9c84e029d0f75" exitCode=0 Oct 07 13:58:14 crc kubenswrapper[4854]: I1007 13:58:14.740138 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7cc99286-dead-49ed-8dec-60c75a175855","Type":"ContainerDied","Data":"c7966ee62d53d283edbc2b95ca285ded36c86a09f0d15a8bebb9c84e029d0f75"} Oct 07 13:58:14 crc kubenswrapper[4854]: I1007 13:58:14.740185 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7cc99286-dead-49ed-8dec-60c75a175855","Type":"ContainerDied","Data":"cc604f8a08848d0737f4a17674518ebaa910d17058769fb5122321aa8a53cf94"} Oct 07 13:58:14 crc kubenswrapper[4854]: I1007 13:58:14.740232 4854 scope.go:117] "RemoveContainer" containerID="c7966ee62d53d283edbc2b95ca285ded36c86a09f0d15a8bebb9c84e029d0f75" Oct 07 13:58:14 crc kubenswrapper[4854]: I1007 13:58:14.740362 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 13:58:14 crc kubenswrapper[4854]: I1007 13:58:14.771306 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 13:58:14 crc kubenswrapper[4854]: I1007 13:58:14.787760 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 13:58:14 crc kubenswrapper[4854]: I1007 13:58:14.796363 4854 scope.go:117] "RemoveContainer" containerID="c7966ee62d53d283edbc2b95ca285ded36c86a09f0d15a8bebb9c84e029d0f75" Oct 07 13:58:14 crc kubenswrapper[4854]: E1007 13:58:14.796910 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7966ee62d53d283edbc2b95ca285ded36c86a09f0d15a8bebb9c84e029d0f75\": container with ID starting with c7966ee62d53d283edbc2b95ca285ded36c86a09f0d15a8bebb9c84e029d0f75 not found: ID does not exist" containerID="c7966ee62d53d283edbc2b95ca285ded36c86a09f0d15a8bebb9c84e029d0f75" Oct 07 13:58:14 crc kubenswrapper[4854]: I1007 13:58:14.796977 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7966ee62d53d283edbc2b95ca285ded36c86a09f0d15a8bebb9c84e029d0f75"} err="failed to get container status \"c7966ee62d53d283edbc2b95ca285ded36c86a09f0d15a8bebb9c84e029d0f75\": rpc error: code = NotFound desc = could not find container \"c7966ee62d53d283edbc2b95ca285ded36c86a09f0d15a8bebb9c84e029d0f75\": container with ID starting with c7966ee62d53d283edbc2b95ca285ded36c86a09f0d15a8bebb9c84e029d0f75 not found: ID does not exist" Oct 07 13:58:14 crc kubenswrapper[4854]: I1007 13:58:14.797756 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 13:58:14 crc kubenswrapper[4854]: E1007 13:58:14.799668 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cc99286-dead-49ed-8dec-60c75a175855" containerName="nova-scheduler-scheduler" Oct 07 13:58:14 crc kubenswrapper[4854]: I1007 13:58:14.799696 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cc99286-dead-49ed-8dec-60c75a175855" containerName="nova-scheduler-scheduler" Oct 07 13:58:14 crc kubenswrapper[4854]: E1007 13:58:14.799715 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9309351f-1023-49a9-b6ff-205232665c04" containerName="dnsmasq-dns" Oct 07 13:58:14 crc kubenswrapper[4854]: I1007 13:58:14.799724 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="9309351f-1023-49a9-b6ff-205232665c04" containerName="dnsmasq-dns" Oct 07 13:58:14 crc kubenswrapper[4854]: E1007 13:58:14.799771 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9309351f-1023-49a9-b6ff-205232665c04" containerName="init" Oct 07 13:58:14 crc kubenswrapper[4854]: I1007 13:58:14.799779 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="9309351f-1023-49a9-b6ff-205232665c04" containerName="init" Oct 07 13:58:14 crc kubenswrapper[4854]: I1007 13:58:14.800300 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="9309351f-1023-49a9-b6ff-205232665c04" containerName="dnsmasq-dns" Oct 07 13:58:14 crc kubenswrapper[4854]: I1007 13:58:14.800336 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cc99286-dead-49ed-8dec-60c75a175855" containerName="nova-scheduler-scheduler" Oct 07 13:58:14 crc kubenswrapper[4854]: I1007 13:58:14.801612 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 13:58:14 crc kubenswrapper[4854]: I1007 13:58:14.805508 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 07 13:58:14 crc kubenswrapper[4854]: I1007 13:58:14.810572 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 13:58:14 crc kubenswrapper[4854]: I1007 13:58:14.824274 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/863c1175-864e-474e-a51b-4a07adfff8e9-config-data\") pod \"nova-scheduler-0\" (UID: \"863c1175-864e-474e-a51b-4a07adfff8e9\") " pod="openstack/nova-scheduler-0" Oct 07 13:58:14 crc kubenswrapper[4854]: I1007 13:58:14.824418 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/863c1175-864e-474e-a51b-4a07adfff8e9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"863c1175-864e-474e-a51b-4a07adfff8e9\") " pod="openstack/nova-scheduler-0" Oct 07 13:58:14 crc kubenswrapper[4854]: I1007 13:58:14.824577 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvhl4\" (UniqueName: \"kubernetes.io/projected/863c1175-864e-474e-a51b-4a07adfff8e9-kube-api-access-cvhl4\") pod \"nova-scheduler-0\" (UID: \"863c1175-864e-474e-a51b-4a07adfff8e9\") " pod="openstack/nova-scheduler-0" Oct 07 13:58:14 crc kubenswrapper[4854]: I1007 13:58:14.927792 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/863c1175-864e-474e-a51b-4a07adfff8e9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"863c1175-864e-474e-a51b-4a07adfff8e9\") " pod="openstack/nova-scheduler-0" Oct 07 13:58:14 crc kubenswrapper[4854]: I1007 13:58:14.928090 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvhl4\" (UniqueName: \"kubernetes.io/projected/863c1175-864e-474e-a51b-4a07adfff8e9-kube-api-access-cvhl4\") pod \"nova-scheduler-0\" (UID: \"863c1175-864e-474e-a51b-4a07adfff8e9\") " pod="openstack/nova-scheduler-0" Oct 07 13:58:14 crc kubenswrapper[4854]: I1007 13:58:14.928221 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/863c1175-864e-474e-a51b-4a07adfff8e9-config-data\") pod \"nova-scheduler-0\" (UID: \"863c1175-864e-474e-a51b-4a07adfff8e9\") " pod="openstack/nova-scheduler-0" Oct 07 13:58:14 crc kubenswrapper[4854]: I1007 13:58:14.931943 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/863c1175-864e-474e-a51b-4a07adfff8e9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"863c1175-864e-474e-a51b-4a07adfff8e9\") " pod="openstack/nova-scheduler-0" Oct 07 13:58:14 crc kubenswrapper[4854]: I1007 13:58:14.933260 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/863c1175-864e-474e-a51b-4a07adfff8e9-config-data\") pod \"nova-scheduler-0\" (UID: \"863c1175-864e-474e-a51b-4a07adfff8e9\") " pod="openstack/nova-scheduler-0" Oct 07 13:58:14 crc kubenswrapper[4854]: I1007 13:58:14.950584 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvhl4\" (UniqueName: 
\"kubernetes.io/projected/863c1175-864e-474e-a51b-4a07adfff8e9-kube-api-access-cvhl4\") pod \"nova-scheduler-0\" (UID: \"863c1175-864e-474e-a51b-4a07adfff8e9\") " pod="openstack/nova-scheduler-0" Oct 07 13:58:15 crc kubenswrapper[4854]: I1007 13:58:15.101269 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 07 13:58:15 crc kubenswrapper[4854]: I1007 13:58:15.130019 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 13:58:15 crc kubenswrapper[4854]: I1007 13:58:15.617835 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 13:58:15 crc kubenswrapper[4854]: I1007 13:58:15.677930 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-7jltj"] Oct 07 13:58:15 crc kubenswrapper[4854]: I1007 13:58:15.679105 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-7jltj" Oct 07 13:58:15 crc kubenswrapper[4854]: I1007 13:58:15.684512 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Oct 07 13:58:15 crc kubenswrapper[4854]: I1007 13:58:15.684675 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Oct 07 13:58:15 crc kubenswrapper[4854]: I1007 13:58:15.708559 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-7jltj"] Oct 07 13:58:15 crc kubenswrapper[4854]: I1007 13:58:15.748448 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24483653-65d3-4679-b34c-c25b9a30a6cc-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-7jltj\" (UID: \"24483653-65d3-4679-b34c-c25b9a30a6cc\") " pod="openstack/nova-cell1-cell-mapping-7jltj" Oct 07 13:58:15 crc kubenswrapper[4854]: I1007 13:58:15.748521 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7g2r\" (UniqueName: \"kubernetes.io/projected/24483653-65d3-4679-b34c-c25b9a30a6cc-kube-api-access-d7g2r\") pod \"nova-cell1-cell-mapping-7jltj\" (UID: \"24483653-65d3-4679-b34c-c25b9a30a6cc\") " pod="openstack/nova-cell1-cell-mapping-7jltj" Oct 07 13:58:15 crc kubenswrapper[4854]: I1007 13:58:15.748552 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24483653-65d3-4679-b34c-c25b9a30a6cc-config-data\") pod \"nova-cell1-cell-mapping-7jltj\" (UID: \"24483653-65d3-4679-b34c-c25b9a30a6cc\") " pod="openstack/nova-cell1-cell-mapping-7jltj" Oct 07 13:58:15 crc kubenswrapper[4854]: I1007 13:58:15.748598 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24483653-65d3-4679-b34c-c25b9a30a6cc-scripts\") pod \"nova-cell1-cell-mapping-7jltj\" (UID: \"24483653-65d3-4679-b34c-c25b9a30a6cc\") " pod="openstack/nova-cell1-cell-mapping-7jltj" Oct 07 13:58:15 crc kubenswrapper[4854]: I1007 13:58:15.757177 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"863c1175-864e-474e-a51b-4a07adfff8e9","Type":"ContainerStarted","Data":"7c12fd9fdecf6e7a7f68ef96ad9d307a8424439c545fddd3412aea71dbaf3093"} Oct 07 13:58:15 crc kubenswrapper[4854]: I1007 13:58:15.850373 
4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24483653-65d3-4679-b34c-c25b9a30a6cc-scripts\") pod \"nova-cell1-cell-mapping-7jltj\" (UID: \"24483653-65d3-4679-b34c-c25b9a30a6cc\") " pod="openstack/nova-cell1-cell-mapping-7jltj" Oct 07 13:58:15 crc kubenswrapper[4854]: I1007 13:58:15.850507 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24483653-65d3-4679-b34c-c25b9a30a6cc-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-7jltj\" (UID: \"24483653-65d3-4679-b34c-c25b9a30a6cc\") " pod="openstack/nova-cell1-cell-mapping-7jltj" Oct 07 13:58:15 crc kubenswrapper[4854]: I1007 13:58:15.850552 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7g2r\" (UniqueName: \"kubernetes.io/projected/24483653-65d3-4679-b34c-c25b9a30a6cc-kube-api-access-d7g2r\") pod \"nova-cell1-cell-mapping-7jltj\" (UID: \"24483653-65d3-4679-b34c-c25b9a30a6cc\") " pod="openstack/nova-cell1-cell-mapping-7jltj" Oct 07 13:58:15 crc kubenswrapper[4854]: I1007 13:58:15.850585 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24483653-65d3-4679-b34c-c25b9a30a6cc-config-data\") pod \"nova-cell1-cell-mapping-7jltj\" (UID: \"24483653-65d3-4679-b34c-c25b9a30a6cc\") " pod="openstack/nova-cell1-cell-mapping-7jltj" Oct 07 13:58:15 crc kubenswrapper[4854]: I1007 13:58:15.854562 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24483653-65d3-4679-b34c-c25b9a30a6cc-scripts\") pod \"nova-cell1-cell-mapping-7jltj\" (UID: \"24483653-65d3-4679-b34c-c25b9a30a6cc\") " pod="openstack/nova-cell1-cell-mapping-7jltj" Oct 07 13:58:15 crc kubenswrapper[4854]: I1007 13:58:15.854929 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24483653-65d3-4679-b34c-c25b9a30a6cc-config-data\") pod \"nova-cell1-cell-mapping-7jltj\" (UID: \"24483653-65d3-4679-b34c-c25b9a30a6cc\") " pod="openstack/nova-cell1-cell-mapping-7jltj" Oct 07 13:58:15 crc kubenswrapper[4854]: I1007 13:58:15.854989 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24483653-65d3-4679-b34c-c25b9a30a6cc-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-7jltj\" (UID: \"24483653-65d3-4679-b34c-c25b9a30a6cc\") " pod="openstack/nova-cell1-cell-mapping-7jltj" Oct 07 13:58:15 crc kubenswrapper[4854]: I1007 13:58:15.867065 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7g2r\" (UniqueName: \"kubernetes.io/projected/24483653-65d3-4679-b34c-c25b9a30a6cc-kube-api-access-d7g2r\") pod \"nova-cell1-cell-mapping-7jltj\" (UID: \"24483653-65d3-4679-b34c-c25b9a30a6cc\") " pod="openstack/nova-cell1-cell-mapping-7jltj" Oct 07 13:58:16 crc kubenswrapper[4854]: I1007 13:58:16.036724 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-7jltj" Oct 07 13:58:16 crc kubenswrapper[4854]: I1007 13:58:16.493186 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-7jltj"] Oct 07 13:58:16 crc kubenswrapper[4854]: I1007 13:58:16.500456 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 07 13:58:16 crc kubenswrapper[4854]: I1007 13:58:16.500762 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 07 13:58:16 crc kubenswrapper[4854]: I1007 13:58:16.718559 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cc99286-dead-49ed-8dec-60c75a175855" path="/var/lib/kubelet/pods/7cc99286-dead-49ed-8dec-60c75a175855/volumes" Oct 07 13:58:16 crc kubenswrapper[4854]: I1007 13:58:16.769064 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-7jltj" event={"ID":"24483653-65d3-4679-b34c-c25b9a30a6cc","Type":"ContainerStarted","Data":"64dc2353957cd2136fe7e081d1d2716580400493d4919029ba9f566824dc836e"} Oct 07 13:58:16 crc kubenswrapper[4854]: I1007 13:58:16.769477 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-7jltj" event={"ID":"24483653-65d3-4679-b34c-c25b9a30a6cc","Type":"ContainerStarted","Data":"3a7995e3b8d42604fa1b5c4b477017edad4ab50919c2d94985f70c13f6e9cd7d"} Oct 07 13:58:16 crc kubenswrapper[4854]: I1007 13:58:16.776472 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"863c1175-864e-474e-a51b-4a07adfff8e9","Type":"ContainerStarted","Data":"8a81d31b69b1232195467e0086a9a8fbf6237f343b47b990762ab6c3ecb9a23a"} Oct 07 13:58:16 crc kubenswrapper[4854]: I1007 13:58:16.794316 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-7jltj" podStartSLOduration=1.794288522 podStartE2EDuration="1.794288522s" podCreationTimestamp="2025-10-07 13:58:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:58:16.785270724 +0000 UTC m=+5612.773103019" watchObservedRunningTime="2025-10-07 13:58:16.794288522 +0000 UTC m=+5612.782120777" Oct 07 13:58:16 crc kubenswrapper[4854]: I1007 13:58:16.824563 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.824536837 podStartE2EDuration="2.824536837s" podCreationTimestamp="2025-10-07 13:58:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:58:16.82221856 +0000 UTC m=+5612.810050815" watchObservedRunningTime="2025-10-07 13:58:16.824536837 +0000 UTC m=+5612.812369112" Oct 07 13:58:20 crc kubenswrapper[4854]: I1007 13:58:20.130388 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 07 13:58:21 crc kubenswrapper[4854]: I1007 13:58:21.477648 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 07 13:58:21 crc kubenswrapper[4854]: I1007 13:58:21.480952 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 07 13:58:21 crc kubenswrapper[4854]: I1007 13:58:21.500862 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 07 
Oct 07 13:58:21 crc kubenswrapper[4854]: I1007 13:58:21.500933 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 07 13:58:21 crc kubenswrapper[4854]: I1007 13:58:21.850078 4854 generic.go:334] "Generic (PLEG): container finished" podID="24483653-65d3-4679-b34c-c25b9a30a6cc" containerID="64dc2353957cd2136fe7e081d1d2716580400493d4919029ba9f566824dc836e" exitCode=0 Oct 07 13:58:21 crc kubenswrapper[4854]: I1007 13:58:21.850169 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-7jltj" event={"ID":"24483653-65d3-4679-b34c-c25b9a30a6cc","Type":"ContainerDied","Data":"64dc2353957cd2136fe7e081d1d2716580400493d4919029ba9f566824dc836e"} Oct 07 13:58:22 crc kubenswrapper[4854]: I1007 13:58:22.520542 4854 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e4be194e-57aa-4419-9561-11bd83a70a6b" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.70:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 07 13:58:22 crc kubenswrapper[4854]: I1007 13:58:22.520661 4854 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e4be194e-57aa-4419-9561-11bd83a70a6b" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.70:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 07 13:58:22 crc kubenswrapper[4854]: I1007 13:58:22.603417 4854 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="dee1e3fd-c301-45c0-b555-a70c1f3c86e7" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.71:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 07 13:58:22 crc kubenswrapper[4854]: I1007 13:58:22.603381 4854 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="dee1e3fd-c301-45c0-b555-a70c1f3c86e7" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.71:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
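
The four "Probe failed" entries above are startup probes timing out: the kubelet issues an HTTP GET against the pod IP (port 8774 for nova-api, 8775 for nova-metadata) and gives up when no response headers arrive within the probe timeout. A minimal sketch of that failing check, assuming a one-second timeout (the actual probe settings are not visible in this log):

package main

import (
    "fmt"
    "net/http"
    "time"
)

func main() {
    // The prober's HTTP check reduced to its essence: a GET with a hard
    // client timeout. Address from the log; timeout value is an assumption.
    client := &http.Client{Timeout: 1 * time.Second}
    resp, err := client.Get("http://10.217.1.70:8774/")
    if err != nil {
        // While nova-api is still initialising this yields the same kind of
        // "context deadline exceeded (Client.Timeout exceeded while awaiting
        // headers)" error recorded in probeResult above.
        fmt.Println("probe failure:", err)
        return
    }
    resp.Body.Close()
    fmt.Println("probe success:", resp.Status)
}
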
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-7jltj" Oct 07 13:58:23 crc kubenswrapper[4854]: I1007 13:58:23.303676 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7g2r\" (UniqueName: \"kubernetes.io/projected/24483653-65d3-4679-b34c-c25b9a30a6cc-kube-api-access-d7g2r\") pod \"24483653-65d3-4679-b34c-c25b9a30a6cc\" (UID: \"24483653-65d3-4679-b34c-c25b9a30a6cc\") " Oct 07 13:58:23 crc kubenswrapper[4854]: I1007 13:58:23.303735 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24483653-65d3-4679-b34c-c25b9a30a6cc-combined-ca-bundle\") pod \"24483653-65d3-4679-b34c-c25b9a30a6cc\" (UID: \"24483653-65d3-4679-b34c-c25b9a30a6cc\") " Oct 07 13:58:23 crc kubenswrapper[4854]: I1007 13:58:23.303828 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24483653-65d3-4679-b34c-c25b9a30a6cc-scripts\") pod \"24483653-65d3-4679-b34c-c25b9a30a6cc\" (UID: \"24483653-65d3-4679-b34c-c25b9a30a6cc\") " Oct 07 13:58:23 crc kubenswrapper[4854]: I1007 13:58:23.304639 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24483653-65d3-4679-b34c-c25b9a30a6cc-config-data\") pod \"24483653-65d3-4679-b34c-c25b9a30a6cc\" (UID: \"24483653-65d3-4679-b34c-c25b9a30a6cc\") " Oct 07 13:58:23 crc kubenswrapper[4854]: I1007 13:58:23.309376 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24483653-65d3-4679-b34c-c25b9a30a6cc-kube-api-access-d7g2r" (OuterVolumeSpecName: "kube-api-access-d7g2r") pod "24483653-65d3-4679-b34c-c25b9a30a6cc" (UID: "24483653-65d3-4679-b34c-c25b9a30a6cc"). InnerVolumeSpecName "kube-api-access-d7g2r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:58:23 crc kubenswrapper[4854]: I1007 13:58:23.309837 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24483653-65d3-4679-b34c-c25b9a30a6cc-scripts" (OuterVolumeSpecName: "scripts") pod "24483653-65d3-4679-b34c-c25b9a30a6cc" (UID: "24483653-65d3-4679-b34c-c25b9a30a6cc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:58:23 crc kubenswrapper[4854]: I1007 13:58:23.334753 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24483653-65d3-4679-b34c-c25b9a30a6cc-config-data" (OuterVolumeSpecName: "config-data") pod "24483653-65d3-4679-b34c-c25b9a30a6cc" (UID: "24483653-65d3-4679-b34c-c25b9a30a6cc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:58:23 crc kubenswrapper[4854]: I1007 13:58:23.356342 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24483653-65d3-4679-b34c-c25b9a30a6cc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "24483653-65d3-4679-b34c-c25b9a30a6cc" (UID: "24483653-65d3-4679-b34c-c25b9a30a6cc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:58:23 crc kubenswrapper[4854]: I1007 13:58:23.406686 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7g2r\" (UniqueName: \"kubernetes.io/projected/24483653-65d3-4679-b34c-c25b9a30a6cc-kube-api-access-d7g2r\") on node \"crc\" DevicePath \"\"" Oct 07 13:58:23 crc kubenswrapper[4854]: I1007 13:58:23.406728 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24483653-65d3-4679-b34c-c25b9a30a6cc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:58:23 crc kubenswrapper[4854]: I1007 13:58:23.406737 4854 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24483653-65d3-4679-b34c-c25b9a30a6cc-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 13:58:23 crc kubenswrapper[4854]: I1007 13:58:23.406748 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24483653-65d3-4679-b34c-c25b9a30a6cc-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:58:23 crc kubenswrapper[4854]: I1007 13:58:23.871595 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-7jltj" event={"ID":"24483653-65d3-4679-b34c-c25b9a30a6cc","Type":"ContainerDied","Data":"3a7995e3b8d42604fa1b5c4b477017edad4ab50919c2d94985f70c13f6e9cd7d"} Oct 07 13:58:23 crc kubenswrapper[4854]: I1007 13:58:23.871650 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a7995e3b8d42604fa1b5c4b477017edad4ab50919c2d94985f70c13f6e9cd7d" Oct 07 13:58:23 crc kubenswrapper[4854]: I1007 13:58:23.871724 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-7jltj" Oct 07 13:58:24 crc kubenswrapper[4854]: I1007 13:58:24.067818 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 13:58:24 crc kubenswrapper[4854]: I1007 13:58:24.068094 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="863c1175-864e-474e-a51b-4a07adfff8e9" containerName="nova-scheduler-scheduler" containerID="cri-o://8a81d31b69b1232195467e0086a9a8fbf6237f343b47b990762ab6c3ecb9a23a" gracePeriod=30 Oct 07 13:58:24 crc kubenswrapper[4854]: I1007 13:58:24.077761 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 07 13:58:24 crc kubenswrapper[4854]: I1007 13:58:24.078104 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e4be194e-57aa-4419-9561-11bd83a70a6b" containerName="nova-api-log" containerID="cri-o://983101f09c64e776079735a0d4632e94cacbc8d6f7f3c09b54cccacf13ff51dc" gracePeriod=30 Oct 07 13:58:24 crc kubenswrapper[4854]: I1007 13:58:24.078165 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e4be194e-57aa-4419-9561-11bd83a70a6b" containerName="nova-api-api" containerID="cri-o://eca8e3be4b0ddfa1d24447041c7aa4ec4c7cd0e95817a57a30c933d5ebdc5f13" gracePeriod=30 Oct 07 13:58:24 crc kubenswrapper[4854]: I1007 13:58:24.089057 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 13:58:24 crc kubenswrapper[4854]: I1007 13:58:24.089986 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="dee1e3fd-c301-45c0-b555-a70c1f3c86e7" 
containerName="nova-metadata-metadata" containerID="cri-o://d5ea5a54496e838153824526480a99292395b4eea6263367b2914bf07b7c35b5" gracePeriod=30 Oct 07 13:58:24 crc kubenswrapper[4854]: I1007 13:58:24.093327 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="dee1e3fd-c301-45c0-b555-a70c1f3c86e7" containerName="nova-metadata-log" containerID="cri-o://68b18e28ecd134481ddd133013f390e339bcb384a33f06a8f6642c70c7f66270" gracePeriod=30 Oct 07 13:58:24 crc kubenswrapper[4854]: I1007 13:58:24.882701 4854 generic.go:334] "Generic (PLEG): container finished" podID="e4be194e-57aa-4419-9561-11bd83a70a6b" containerID="983101f09c64e776079735a0d4632e94cacbc8d6f7f3c09b54cccacf13ff51dc" exitCode=143 Oct 07 13:58:24 crc kubenswrapper[4854]: I1007 13:58:24.882811 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e4be194e-57aa-4419-9561-11bd83a70a6b","Type":"ContainerDied","Data":"983101f09c64e776079735a0d4632e94cacbc8d6f7f3c09b54cccacf13ff51dc"} Oct 07 13:58:24 crc kubenswrapper[4854]: I1007 13:58:24.885393 4854 generic.go:334] "Generic (PLEG): container finished" podID="dee1e3fd-c301-45c0-b555-a70c1f3c86e7" containerID="68b18e28ecd134481ddd133013f390e339bcb384a33f06a8f6642c70c7f66270" exitCode=143 Oct 07 13:58:24 crc kubenswrapper[4854]: I1007 13:58:24.885444 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dee1e3fd-c301-45c0-b555-a70c1f3c86e7","Type":"ContainerDied","Data":"68b18e28ecd134481ddd133013f390e339bcb384a33f06a8f6642c70c7f66270"} Oct 07 13:58:26 crc kubenswrapper[4854]: I1007 13:58:26.911245 4854 generic.go:334] "Generic (PLEG): container finished" podID="863c1175-864e-474e-a51b-4a07adfff8e9" containerID="8a81d31b69b1232195467e0086a9a8fbf6237f343b47b990762ab6c3ecb9a23a" exitCode=0 Oct 07 13:58:26 crc kubenswrapper[4854]: I1007 13:58:26.911527 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"863c1175-864e-474e-a51b-4a07adfff8e9","Type":"ContainerDied","Data":"8a81d31b69b1232195467e0086a9a8fbf6237f343b47b990762ab6c3ecb9a23a"} Oct 07 13:58:27 crc kubenswrapper[4854]: I1007 13:58:27.427741 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 13:58:27 crc kubenswrapper[4854]: I1007 13:58:27.493624 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvhl4\" (UniqueName: \"kubernetes.io/projected/863c1175-864e-474e-a51b-4a07adfff8e9-kube-api-access-cvhl4\") pod \"863c1175-864e-474e-a51b-4a07adfff8e9\" (UID: \"863c1175-864e-474e-a51b-4a07adfff8e9\") " Oct 07 13:58:27 crc kubenswrapper[4854]: I1007 13:58:27.493725 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/863c1175-864e-474e-a51b-4a07adfff8e9-config-data\") pod \"863c1175-864e-474e-a51b-4a07adfff8e9\" (UID: \"863c1175-864e-474e-a51b-4a07adfff8e9\") " Oct 07 13:58:27 crc kubenswrapper[4854]: I1007 13:58:27.493798 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/863c1175-864e-474e-a51b-4a07adfff8e9-combined-ca-bundle\") pod \"863c1175-864e-474e-a51b-4a07adfff8e9\" (UID: \"863c1175-864e-474e-a51b-4a07adfff8e9\") " Oct 07 13:58:27 crc kubenswrapper[4854]: I1007 13:58:27.501288 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/863c1175-864e-474e-a51b-4a07adfff8e9-kube-api-access-cvhl4" (OuterVolumeSpecName: "kube-api-access-cvhl4") pod "863c1175-864e-474e-a51b-4a07adfff8e9" (UID: "863c1175-864e-474e-a51b-4a07adfff8e9"). InnerVolumeSpecName "kube-api-access-cvhl4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:58:27 crc kubenswrapper[4854]: I1007 13:58:27.519475 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/863c1175-864e-474e-a51b-4a07adfff8e9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "863c1175-864e-474e-a51b-4a07adfff8e9" (UID: "863c1175-864e-474e-a51b-4a07adfff8e9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:58:27 crc kubenswrapper[4854]: I1007 13:58:27.525275 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/863c1175-864e-474e-a51b-4a07adfff8e9-config-data" (OuterVolumeSpecName: "config-data") pod "863c1175-864e-474e-a51b-4a07adfff8e9" (UID: "863c1175-864e-474e-a51b-4a07adfff8e9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:58:27 crc kubenswrapper[4854]: I1007 13:58:27.595953 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvhl4\" (UniqueName: \"kubernetes.io/projected/863c1175-864e-474e-a51b-4a07adfff8e9-kube-api-access-cvhl4\") on node \"crc\" DevicePath \"\"" Oct 07 13:58:27 crc kubenswrapper[4854]: I1007 13:58:27.595992 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/863c1175-864e-474e-a51b-4a07adfff8e9-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:58:27 crc kubenswrapper[4854]: I1007 13:58:27.596002 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/863c1175-864e-474e-a51b-4a07adfff8e9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:58:27 crc kubenswrapper[4854]: I1007 13:58:27.657168 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 13:58:27 crc kubenswrapper[4854]: I1007 13:58:27.699700 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dee1e3fd-c301-45c0-b555-a70c1f3c86e7-config-data\") pod \"dee1e3fd-c301-45c0-b555-a70c1f3c86e7\" (UID: \"dee1e3fd-c301-45c0-b555-a70c1f3c86e7\") " Oct 07 13:58:27 crc kubenswrapper[4854]: I1007 13:58:27.699762 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dee1e3fd-c301-45c0-b555-a70c1f3c86e7-logs\") pod \"dee1e3fd-c301-45c0-b555-a70c1f3c86e7\" (UID: \"dee1e3fd-c301-45c0-b555-a70c1f3c86e7\") " Oct 07 13:58:27 crc kubenswrapper[4854]: I1007 13:58:27.699809 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5n9t4\" (UniqueName: \"kubernetes.io/projected/dee1e3fd-c301-45c0-b555-a70c1f3c86e7-kube-api-access-5n9t4\") pod \"dee1e3fd-c301-45c0-b555-a70c1f3c86e7\" (UID: \"dee1e3fd-c301-45c0-b555-a70c1f3c86e7\") " Oct 07 13:58:27 crc kubenswrapper[4854]: I1007 13:58:27.699851 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dee1e3fd-c301-45c0-b555-a70c1f3c86e7-combined-ca-bundle\") pod \"dee1e3fd-c301-45c0-b555-a70c1f3c86e7\" (UID: \"dee1e3fd-c301-45c0-b555-a70c1f3c86e7\") " Oct 07 13:58:27 crc kubenswrapper[4854]: I1007 13:58:27.702362 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dee1e3fd-c301-45c0-b555-a70c1f3c86e7-logs" (OuterVolumeSpecName: "logs") pod "dee1e3fd-c301-45c0-b555-a70c1f3c86e7" (UID: "dee1e3fd-c301-45c0-b555-a70c1f3c86e7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:58:27 crc kubenswrapper[4854]: I1007 13:58:27.710578 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dee1e3fd-c301-45c0-b555-a70c1f3c86e7-kube-api-access-5n9t4" (OuterVolumeSpecName: "kube-api-access-5n9t4") pod "dee1e3fd-c301-45c0-b555-a70c1f3c86e7" (UID: "dee1e3fd-c301-45c0-b555-a70c1f3c86e7"). InnerVolumeSpecName "kube-api-access-5n9t4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:58:27 crc kubenswrapper[4854]: I1007 13:58:27.734876 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dee1e3fd-c301-45c0-b555-a70c1f3c86e7-config-data" (OuterVolumeSpecName: "config-data") pod "dee1e3fd-c301-45c0-b555-a70c1f3c86e7" (UID: "dee1e3fd-c301-45c0-b555-a70c1f3c86e7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:58:27 crc kubenswrapper[4854]: I1007 13:58:27.737404 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dee1e3fd-c301-45c0-b555-a70c1f3c86e7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dee1e3fd-c301-45c0-b555-a70c1f3c86e7" (UID: "dee1e3fd-c301-45c0-b555-a70c1f3c86e7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:58:27 crc kubenswrapper[4854]: I1007 13:58:27.801740 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dee1e3fd-c301-45c0-b555-a70c1f3c86e7-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:58:27 crc kubenswrapper[4854]: I1007 13:58:27.801792 4854 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dee1e3fd-c301-45c0-b555-a70c1f3c86e7-logs\") on node \"crc\" DevicePath \"\"" Oct 07 13:58:27 crc kubenswrapper[4854]: I1007 13:58:27.801806 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5n9t4\" (UniqueName: \"kubernetes.io/projected/dee1e3fd-c301-45c0-b555-a70c1f3c86e7-kube-api-access-5n9t4\") on node \"crc\" DevicePath \"\"" Oct 07 13:58:27 crc kubenswrapper[4854]: I1007 13:58:27.801819 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dee1e3fd-c301-45c0-b555-a70c1f3c86e7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:58:27 crc kubenswrapper[4854]: I1007 13:58:27.895253 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 07 13:58:27 crc kubenswrapper[4854]: I1007 13:58:27.903329 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4be194e-57aa-4419-9561-11bd83a70a6b-config-data\") pod \"e4be194e-57aa-4419-9561-11bd83a70a6b\" (UID: \"e4be194e-57aa-4419-9561-11bd83a70a6b\") " Oct 07 13:58:27 crc kubenswrapper[4854]: I1007 13:58:27.903488 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8fgj\" (UniqueName: \"kubernetes.io/projected/e4be194e-57aa-4419-9561-11bd83a70a6b-kube-api-access-j8fgj\") pod \"e4be194e-57aa-4419-9561-11bd83a70a6b\" (UID: \"e4be194e-57aa-4419-9561-11bd83a70a6b\") " Oct 07 13:58:27 crc kubenswrapper[4854]: I1007 13:58:27.903880 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4be194e-57aa-4419-9561-11bd83a70a6b-combined-ca-bundle\") pod \"e4be194e-57aa-4419-9561-11bd83a70a6b\" (UID: \"e4be194e-57aa-4419-9561-11bd83a70a6b\") " Oct 07 13:58:27 crc kubenswrapper[4854]: I1007 13:58:27.907063 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4be194e-57aa-4419-9561-11bd83a70a6b-kube-api-access-j8fgj" (OuterVolumeSpecName: "kube-api-access-j8fgj") pod "e4be194e-57aa-4419-9561-11bd83a70a6b" (UID: "e4be194e-57aa-4419-9561-11bd83a70a6b"). InnerVolumeSpecName "kube-api-access-j8fgj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:58:27 crc kubenswrapper[4854]: I1007 13:58:27.930842 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"863c1175-864e-474e-a51b-4a07adfff8e9","Type":"ContainerDied","Data":"7c12fd9fdecf6e7a7f68ef96ad9d307a8424439c545fddd3412aea71dbaf3093"} Oct 07 13:58:27 crc kubenswrapper[4854]: I1007 13:58:27.931033 4854 scope.go:117] "RemoveContainer" containerID="8a81d31b69b1232195467e0086a9a8fbf6237f343b47b990762ab6c3ecb9a23a" Oct 07 13:58:27 crc kubenswrapper[4854]: I1007 13:58:27.932375 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 13:58:27 crc kubenswrapper[4854]: I1007 13:58:27.936194 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4be194e-57aa-4419-9561-11bd83a70a6b-config-data" (OuterVolumeSpecName: "config-data") pod "e4be194e-57aa-4419-9561-11bd83a70a6b" (UID: "e4be194e-57aa-4419-9561-11bd83a70a6b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:58:27 crc kubenswrapper[4854]: I1007 13:58:27.953555 4854 generic.go:334] "Generic (PLEG): container finished" podID="dee1e3fd-c301-45c0-b555-a70c1f3c86e7" containerID="d5ea5a54496e838153824526480a99292395b4eea6263367b2914bf07b7c35b5" exitCode=0 Oct 07 13:58:27 crc kubenswrapper[4854]: I1007 13:58:27.953824 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dee1e3fd-c301-45c0-b555-a70c1f3c86e7","Type":"ContainerDied","Data":"d5ea5a54496e838153824526480a99292395b4eea6263367b2914bf07b7c35b5"} Oct 07 13:58:27 crc kubenswrapper[4854]: I1007 13:58:27.953931 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dee1e3fd-c301-45c0-b555-a70c1f3c86e7","Type":"ContainerDied","Data":"914ccd041df4ea7d26e9cc13402c84acf1e7c045e892ddce4a6604381cd360d7"} Oct 07 13:58:27 crc kubenswrapper[4854]: I1007 13:58:27.954048 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 13:58:27 crc kubenswrapper[4854]: I1007 13:58:27.954461 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4be194e-57aa-4419-9561-11bd83a70a6b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e4be194e-57aa-4419-9561-11bd83a70a6b" (UID: "e4be194e-57aa-4419-9561-11bd83a70a6b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:58:27 crc kubenswrapper[4854]: I1007 13:58:27.961596 4854 generic.go:334] "Generic (PLEG): container finished" podID="e4be194e-57aa-4419-9561-11bd83a70a6b" containerID="eca8e3be4b0ddfa1d24447041c7aa4ec4c7cd0e95817a57a30c933d5ebdc5f13" exitCode=0 Oct 07 13:58:27 crc kubenswrapper[4854]: I1007 13:58:27.961660 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e4be194e-57aa-4419-9561-11bd83a70a6b","Type":"ContainerDied","Data":"eca8e3be4b0ddfa1d24447041c7aa4ec4c7cd0e95817a57a30c933d5ebdc5f13"} Oct 07 13:58:27 crc kubenswrapper[4854]: I1007 13:58:27.961689 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e4be194e-57aa-4419-9561-11bd83a70a6b","Type":"ContainerDied","Data":"4d41535791212d11062ad3c8b288065f8f147310cbfc1c5a7707041849ae1f35"} Oct 07 13:58:27 crc kubenswrapper[4854]: I1007 13:58:27.961822 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.007249 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4be194e-57aa-4419-9561-11bd83a70a6b-logs\") pod \"e4be194e-57aa-4419-9561-11bd83a70a6b\" (UID: \"e4be194e-57aa-4419-9561-11bd83a70a6b\") " Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.007681 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4be194e-57aa-4419-9561-11bd83a70a6b-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.007702 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8fgj\" (UniqueName: \"kubernetes.io/projected/e4be194e-57aa-4419-9561-11bd83a70a6b-kube-api-access-j8fgj\") on node \"crc\" DevicePath \"\"" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.007716 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4be194e-57aa-4419-9561-11bd83a70a6b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.008100 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4be194e-57aa-4419-9561-11bd83a70a6b-logs" (OuterVolumeSpecName: "logs") pod "e4be194e-57aa-4419-9561-11bd83a70a6b" (UID: "e4be194e-57aa-4419-9561-11bd83a70a6b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.050785 4854 scope.go:117] "RemoveContainer" containerID="d5ea5a54496e838153824526480a99292395b4eea6263367b2914bf07b7c35b5" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.062932 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.091970 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.115075 4854 scope.go:117] "RemoveContainer" containerID="68b18e28ecd134481ddd133013f390e339bcb384a33f06a8f6642c70c7f66270" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.116265 4854 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4be194e-57aa-4419-9561-11bd83a70a6b-logs\") on node \"crc\" DevicePath \"\"" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.119161 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.132721 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.138079 4854 scope.go:117] "RemoveContainer" containerID="d5ea5a54496e838153824526480a99292395b4eea6263367b2914bf07b7c35b5" Oct 07 13:58:28 crc kubenswrapper[4854]: E1007 13:58:28.138896 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5ea5a54496e838153824526480a99292395b4eea6263367b2914bf07b7c35b5\": container with ID starting with d5ea5a54496e838153824526480a99292395b4eea6263367b2914bf07b7c35b5 not found: ID does not exist" containerID="d5ea5a54496e838153824526480a99292395b4eea6263367b2914bf07b7c35b5" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.139018 4854 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5ea5a54496e838153824526480a99292395b4eea6263367b2914bf07b7c35b5"} err="failed to get container status \"d5ea5a54496e838153824526480a99292395b4eea6263367b2914bf07b7c35b5\": rpc error: code = NotFound desc = could not find container \"d5ea5a54496e838153824526480a99292395b4eea6263367b2914bf07b7c35b5\": container with ID starting with d5ea5a54496e838153824526480a99292395b4eea6263367b2914bf07b7c35b5 not found: ID does not exist" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.139117 4854 scope.go:117] "RemoveContainer" containerID="68b18e28ecd134481ddd133013f390e339bcb384a33f06a8f6642c70c7f66270" Oct 07 13:58:28 crc kubenswrapper[4854]: E1007 13:58:28.140437 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68b18e28ecd134481ddd133013f390e339bcb384a33f06a8f6642c70c7f66270\": container with ID starting with 68b18e28ecd134481ddd133013f390e339bcb384a33f06a8f6642c70c7f66270 not found: ID does not exist" containerID="68b18e28ecd134481ddd133013f390e339bcb384a33f06a8f6642c70c7f66270" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.140677 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68b18e28ecd134481ddd133013f390e339bcb384a33f06a8f6642c70c7f66270"} err="failed to get container status \"68b18e28ecd134481ddd133013f390e339bcb384a33f06a8f6642c70c7f66270\": rpc error: code = NotFound desc = could not find container \"68b18e28ecd134481ddd133013f390e339bcb384a33f06a8f6642c70c7f66270\": container with ID starting with 68b18e28ecd134481ddd133013f390e339bcb384a33f06a8f6642c70c7f66270 not found: ID does not exist" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.140829 4854 scope.go:117] "RemoveContainer" containerID="eca8e3be4b0ddfa1d24447041c7aa4ec4c7cd0e95817a57a30c933d5ebdc5f13" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.141532 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 07 13:58:28 crc kubenswrapper[4854]: E1007 13:58:28.142190 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dee1e3fd-c301-45c0-b555-a70c1f3c86e7" containerName="nova-metadata-log" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.142216 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="dee1e3fd-c301-45c0-b555-a70c1f3c86e7" containerName="nova-metadata-log" Oct 07 13:58:28 crc kubenswrapper[4854]: E1007 13:58:28.142248 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4be194e-57aa-4419-9561-11bd83a70a6b" containerName="nova-api-log" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.142259 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4be194e-57aa-4419-9561-11bd83a70a6b" containerName="nova-api-log" Oct 07 13:58:28 crc kubenswrapper[4854]: E1007 13:58:28.142277 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="863c1175-864e-474e-a51b-4a07adfff8e9" containerName="nova-scheduler-scheduler" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.142286 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="863c1175-864e-474e-a51b-4a07adfff8e9" containerName="nova-scheduler-scheduler" Oct 07 13:58:28 crc kubenswrapper[4854]: E1007 13:58:28.142326 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dee1e3fd-c301-45c0-b555-a70c1f3c86e7" containerName="nova-metadata-metadata" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 
13:58:28.142337 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="dee1e3fd-c301-45c0-b555-a70c1f3c86e7" containerName="nova-metadata-metadata" Oct 07 13:58:28 crc kubenswrapper[4854]: E1007 13:58:28.142351 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4be194e-57aa-4419-9561-11bd83a70a6b" containerName="nova-api-api" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.142360 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4be194e-57aa-4419-9561-11bd83a70a6b" containerName="nova-api-api" Oct 07 13:58:28 crc kubenswrapper[4854]: E1007 13:58:28.142380 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24483653-65d3-4679-b34c-c25b9a30a6cc" containerName="nova-manage" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.142389 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="24483653-65d3-4679-b34c-c25b9a30a6cc" containerName="nova-manage" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.142634 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="863c1175-864e-474e-a51b-4a07adfff8e9" containerName="nova-scheduler-scheduler" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.142661 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="24483653-65d3-4679-b34c-c25b9a30a6cc" containerName="nova-manage" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.142677 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="dee1e3fd-c301-45c0-b555-a70c1f3c86e7" containerName="nova-metadata-metadata" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.142692 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4be194e-57aa-4419-9561-11bd83a70a6b" containerName="nova-api-log" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.142717 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="dee1e3fd-c301-45c0-b555-a70c1f3c86e7" containerName="nova-metadata-log" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.142732 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4be194e-57aa-4419-9561-11bd83a70a6b" containerName="nova-api-api" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.144475 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.146428 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.151536 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.152751 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.154250 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.168905 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.178550 4854 scope.go:117] "RemoveContainer" containerID="983101f09c64e776079735a0d4632e94cacbc8d6f7f3c09b54cccacf13ff51dc" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.179372 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.194975 4854 scope.go:117] "RemoveContainer" containerID="eca8e3be4b0ddfa1d24447041c7aa4ec4c7cd0e95817a57a30c933d5ebdc5f13" Oct 07 13:58:28 crc kubenswrapper[4854]: E1007 13:58:28.195451 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eca8e3be4b0ddfa1d24447041c7aa4ec4c7cd0e95817a57a30c933d5ebdc5f13\": container with ID starting with eca8e3be4b0ddfa1d24447041c7aa4ec4c7cd0e95817a57a30c933d5ebdc5f13 not found: ID does not exist" containerID="eca8e3be4b0ddfa1d24447041c7aa4ec4c7cd0e95817a57a30c933d5ebdc5f13" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.195489 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eca8e3be4b0ddfa1d24447041c7aa4ec4c7cd0e95817a57a30c933d5ebdc5f13"} err="failed to get container status \"eca8e3be4b0ddfa1d24447041c7aa4ec4c7cd0e95817a57a30c933d5ebdc5f13\": rpc error: code = NotFound desc = could not find container \"eca8e3be4b0ddfa1d24447041c7aa4ec4c7cd0e95817a57a30c933d5ebdc5f13\": container with ID starting with eca8e3be4b0ddfa1d24447041c7aa4ec4c7cd0e95817a57a30c933d5ebdc5f13 not found: ID does not exist" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.195518 4854 scope.go:117] "RemoveContainer" containerID="983101f09c64e776079735a0d4632e94cacbc8d6f7f3c09b54cccacf13ff51dc" Oct 07 13:58:28 crc kubenswrapper[4854]: E1007 13:58:28.195807 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"983101f09c64e776079735a0d4632e94cacbc8d6f7f3c09b54cccacf13ff51dc\": container with ID starting with 983101f09c64e776079735a0d4632e94cacbc8d6f7f3c09b54cccacf13ff51dc not found: ID does not exist" containerID="983101f09c64e776079735a0d4632e94cacbc8d6f7f3c09b54cccacf13ff51dc" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.195856 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"983101f09c64e776079735a0d4632e94cacbc8d6f7f3c09b54cccacf13ff51dc"} err="failed to get container status \"983101f09c64e776079735a0d4632e94cacbc8d6f7f3c09b54cccacf13ff51dc\": rpc error: code = NotFound desc = could not find container \"983101f09c64e776079735a0d4632e94cacbc8d6f7f3c09b54cccacf13ff51dc\": container with ID starting with 983101f09c64e776079735a0d4632e94cacbc8d6f7f3c09b54cccacf13ff51dc not found: ID does not exist" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.218334 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4b62572-0d75-4e00-8215-0efe097aa5ad-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d4b62572-0d75-4e00-8215-0efe097aa5ad\") " 
pod="openstack/nova-metadata-0" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.218404 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4b62572-0d75-4e00-8215-0efe097aa5ad-config-data\") pod \"nova-metadata-0\" (UID: \"d4b62572-0d75-4e00-8215-0efe097aa5ad\") " pod="openstack/nova-metadata-0" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.218482 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4b62572-0d75-4e00-8215-0efe097aa5ad-logs\") pod \"nova-metadata-0\" (UID: \"d4b62572-0d75-4e00-8215-0efe097aa5ad\") " pod="openstack/nova-metadata-0" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.218534 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2jsq\" (UniqueName: \"kubernetes.io/projected/a7139c6b-fad9-41da-ac5e-feff21db38c1-kube-api-access-t2jsq\") pod \"nova-scheduler-0\" (UID: \"a7139c6b-fad9-41da-ac5e-feff21db38c1\") " pod="openstack/nova-scheduler-0" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.218673 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vk4sq\" (UniqueName: \"kubernetes.io/projected/d4b62572-0d75-4e00-8215-0efe097aa5ad-kube-api-access-vk4sq\") pod \"nova-metadata-0\" (UID: \"d4b62572-0d75-4e00-8215-0efe097aa5ad\") " pod="openstack/nova-metadata-0" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.218905 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7139c6b-fad9-41da-ac5e-feff21db38c1-config-data\") pod \"nova-scheduler-0\" (UID: \"a7139c6b-fad9-41da-ac5e-feff21db38c1\") " pod="openstack/nova-scheduler-0" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.218935 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7139c6b-fad9-41da-ac5e-feff21db38c1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a7139c6b-fad9-41da-ac5e-feff21db38c1\") " pod="openstack/nova-scheduler-0" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.312226 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.320282 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7139c6b-fad9-41da-ac5e-feff21db38c1-config-data\") pod \"nova-scheduler-0\" (UID: \"a7139c6b-fad9-41da-ac5e-feff21db38c1\") " pod="openstack/nova-scheduler-0" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.320342 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7139c6b-fad9-41da-ac5e-feff21db38c1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a7139c6b-fad9-41da-ac5e-feff21db38c1\") " pod="openstack/nova-scheduler-0" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.320391 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4b62572-0d75-4e00-8215-0efe097aa5ad-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d4b62572-0d75-4e00-8215-0efe097aa5ad\") " 
pod="openstack/nova-metadata-0" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.320432 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4b62572-0d75-4e00-8215-0efe097aa5ad-config-data\") pod \"nova-metadata-0\" (UID: \"d4b62572-0d75-4e00-8215-0efe097aa5ad\") " pod="openstack/nova-metadata-0" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.320468 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4b62572-0d75-4e00-8215-0efe097aa5ad-logs\") pod \"nova-metadata-0\" (UID: \"d4b62572-0d75-4e00-8215-0efe097aa5ad\") " pod="openstack/nova-metadata-0" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.320510 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2jsq\" (UniqueName: \"kubernetes.io/projected/a7139c6b-fad9-41da-ac5e-feff21db38c1-kube-api-access-t2jsq\") pod \"nova-scheduler-0\" (UID: \"a7139c6b-fad9-41da-ac5e-feff21db38c1\") " pod="openstack/nova-scheduler-0" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.320564 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vk4sq\" (UniqueName: \"kubernetes.io/projected/d4b62572-0d75-4e00-8215-0efe097aa5ad-kube-api-access-vk4sq\") pod \"nova-metadata-0\" (UID: \"d4b62572-0d75-4e00-8215-0efe097aa5ad\") " pod="openstack/nova-metadata-0" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.321352 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4b62572-0d75-4e00-8215-0efe097aa5ad-logs\") pod \"nova-metadata-0\" (UID: \"d4b62572-0d75-4e00-8215-0efe097aa5ad\") " pod="openstack/nova-metadata-0" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.325133 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7139c6b-fad9-41da-ac5e-feff21db38c1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a7139c6b-fad9-41da-ac5e-feff21db38c1\") " pod="openstack/nova-scheduler-0" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.325203 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4b62572-0d75-4e00-8215-0efe097aa5ad-config-data\") pod \"nova-metadata-0\" (UID: \"d4b62572-0d75-4e00-8215-0efe097aa5ad\") " pod="openstack/nova-metadata-0" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.325222 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.325796 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7139c6b-fad9-41da-ac5e-feff21db38c1-config-data\") pod \"nova-scheduler-0\" (UID: \"a7139c6b-fad9-41da-ac5e-feff21db38c1\") " pod="openstack/nova-scheduler-0" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.328676 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4b62572-0d75-4e00-8215-0efe097aa5ad-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d4b62572-0d75-4e00-8215-0efe097aa5ad\") " pod="openstack/nova-metadata-0" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.354125 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 07 13:58:28 crc 
kubenswrapper[4854]: I1007 13:58:28.360779 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.369910 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.381030 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vk4sq\" (UniqueName: \"kubernetes.io/projected/d4b62572-0d75-4e00-8215-0efe097aa5ad-kube-api-access-vk4sq\") pod \"nova-metadata-0\" (UID: \"d4b62572-0d75-4e00-8215-0efe097aa5ad\") " pod="openstack/nova-metadata-0" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.383419 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.392991 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2jsq\" (UniqueName: \"kubernetes.io/projected/a7139c6b-fad9-41da-ac5e-feff21db38c1-kube-api-access-t2jsq\") pod \"nova-scheduler-0\" (UID: \"a7139c6b-fad9-41da-ac5e-feff21db38c1\") " pod="openstack/nova-scheduler-0" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.423418 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2378d51-a1ef-4417-9ecd-5bec044c7f9a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b2378d51-a1ef-4417-9ecd-5bec044c7f9a\") " pod="openstack/nova-api-0" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.423712 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8hnk\" (UniqueName: \"kubernetes.io/projected/b2378d51-a1ef-4417-9ecd-5bec044c7f9a-kube-api-access-l8hnk\") pod \"nova-api-0\" (UID: \"b2378d51-a1ef-4417-9ecd-5bec044c7f9a\") " pod="openstack/nova-api-0" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.423853 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2378d51-a1ef-4417-9ecd-5bec044c7f9a-logs\") pod \"nova-api-0\" (UID: \"b2378d51-a1ef-4417-9ecd-5bec044c7f9a\") " pod="openstack/nova-api-0" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.423888 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2378d51-a1ef-4417-9ecd-5bec044c7f9a-config-data\") pod \"nova-api-0\" (UID: \"b2378d51-a1ef-4417-9ecd-5bec044c7f9a\") " pod="openstack/nova-api-0" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.471770 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.484995 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.525650 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8hnk\" (UniqueName: \"kubernetes.io/projected/b2378d51-a1ef-4417-9ecd-5bec044c7f9a-kube-api-access-l8hnk\") pod \"nova-api-0\" (UID: \"b2378d51-a1ef-4417-9ecd-5bec044c7f9a\") " pod="openstack/nova-api-0" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.525721 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2378d51-a1ef-4417-9ecd-5bec044c7f9a-logs\") pod \"nova-api-0\" (UID: \"b2378d51-a1ef-4417-9ecd-5bec044c7f9a\") " pod="openstack/nova-api-0" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.525757 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2378d51-a1ef-4417-9ecd-5bec044c7f9a-config-data\") pod \"nova-api-0\" (UID: \"b2378d51-a1ef-4417-9ecd-5bec044c7f9a\") " pod="openstack/nova-api-0" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.525795 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2378d51-a1ef-4417-9ecd-5bec044c7f9a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b2378d51-a1ef-4417-9ecd-5bec044c7f9a\") " pod="openstack/nova-api-0" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.526258 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2378d51-a1ef-4417-9ecd-5bec044c7f9a-logs\") pod \"nova-api-0\" (UID: \"b2378d51-a1ef-4417-9ecd-5bec044c7f9a\") " pod="openstack/nova-api-0" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.530570 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2378d51-a1ef-4417-9ecd-5bec044c7f9a-config-data\") pod \"nova-api-0\" (UID: \"b2378d51-a1ef-4417-9ecd-5bec044c7f9a\") " pod="openstack/nova-api-0" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.530838 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2378d51-a1ef-4417-9ecd-5bec044c7f9a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b2378d51-a1ef-4417-9ecd-5bec044c7f9a\") " pod="openstack/nova-api-0" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.544077 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8hnk\" (UniqueName: \"kubernetes.io/projected/b2378d51-a1ef-4417-9ecd-5bec044c7f9a-kube-api-access-l8hnk\") pod \"nova-api-0\" (UID: \"b2378d51-a1ef-4417-9ecd-5bec044c7f9a\") " pod="openstack/nova-api-0" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.719961 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="863c1175-864e-474e-a51b-4a07adfff8e9" path="/var/lib/kubelet/pods/863c1175-864e-474e-a51b-4a07adfff8e9/volumes" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.720663 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dee1e3fd-c301-45c0-b555-a70c1f3c86e7" path="/var/lib/kubelet/pods/dee1e3fd-c301-45c0-b555-a70c1f3c86e7/volumes" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.721400 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4be194e-57aa-4419-9561-11bd83a70a6b" 
path="/var/lib/kubelet/pods/e4be194e-57aa-4419-9561-11bd83a70a6b/volumes" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.756179 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.928473 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 13:58:28 crc kubenswrapper[4854]: I1007 13:58:28.979384 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d4b62572-0d75-4e00-8215-0efe097aa5ad","Type":"ContainerStarted","Data":"aa74902a2f656ce8452abc3b1732d7299c1a7c34f5c09209a42ca96f15f2254f"} Oct 07 13:58:29 crc kubenswrapper[4854]: I1007 13:58:29.003797 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 13:58:29 crc kubenswrapper[4854]: I1007 13:58:29.042831 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 07 13:58:29 crc kubenswrapper[4854]: W1007 13:58:29.059637 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb2378d51_a1ef_4417_9ecd_5bec044c7f9a.slice/crio-a406d83dbcde43c15156932ce780bd75be18775b030754764c099fe1c405ce47 WatchSource:0}: Error finding container a406d83dbcde43c15156932ce780bd75be18775b030754764c099fe1c405ce47: Status 404 returned error can't find the container with id a406d83dbcde43c15156932ce780bd75be18775b030754764c099fe1c405ce47 Oct 07 13:58:29 crc kubenswrapper[4854]: I1007 13:58:29.996924 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a7139c6b-fad9-41da-ac5e-feff21db38c1","Type":"ContainerStarted","Data":"5ace9a77fcd32675abcbf2472d8da6f143300f2ff2d882d4f0ace0aa49752d15"} Oct 07 13:58:29 crc kubenswrapper[4854]: I1007 13:58:29.997299 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a7139c6b-fad9-41da-ac5e-feff21db38c1","Type":"ContainerStarted","Data":"ff915f63ee8e4338f958b2e60dbfd7f8b0ab99f5271cff109e4489ff417f2e25"} Oct 07 13:58:30 crc kubenswrapper[4854]: I1007 13:58:30.001772 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d4b62572-0d75-4e00-8215-0efe097aa5ad","Type":"ContainerStarted","Data":"fd14c247db401a5f40b87b4108ce86eb56416925a695eae766f22cff61c5971b"} Oct 07 13:58:30 crc kubenswrapper[4854]: I1007 13:58:30.001851 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d4b62572-0d75-4e00-8215-0efe097aa5ad","Type":"ContainerStarted","Data":"05f14f52023bcab9f989f9f9684117c21d02a70e42ba64c95a1844b38efedcb3"} Oct 07 13:58:30 crc kubenswrapper[4854]: I1007 13:58:30.007777 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b2378d51-a1ef-4417-9ecd-5bec044c7f9a","Type":"ContainerStarted","Data":"df1043ba50c6b1deb56442ad2422456e643cafab877a18e0393b3d673e6e2cb3"} Oct 07 13:58:30 crc kubenswrapper[4854]: I1007 13:58:30.007866 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b2378d51-a1ef-4417-9ecd-5bec044c7f9a","Type":"ContainerStarted","Data":"9042dd2e9539d42a711c3eb659343d3385a072a8789fde76452a4cf328458ddb"} Oct 07 13:58:30 crc kubenswrapper[4854]: I1007 13:58:30.007889 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"b2378d51-a1ef-4417-9ecd-5bec044c7f9a","Type":"ContainerStarted","Data":"a406d83dbcde43c15156932ce780bd75be18775b030754764c099fe1c405ce47"} Oct 07 13:58:30 crc kubenswrapper[4854]: I1007 13:58:30.062011 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.061982419 podStartE2EDuration="2.061982419s" podCreationTimestamp="2025-10-07 13:58:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:58:30.03155451 +0000 UTC m=+5626.019386795" watchObservedRunningTime="2025-10-07 13:58:30.061982419 +0000 UTC m=+5626.049814714" Oct 07 13:58:30 crc kubenswrapper[4854]: I1007 13:58:30.077781 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.07775046 podStartE2EDuration="2.07775046s" podCreationTimestamp="2025-10-07 13:58:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:58:30.059886149 +0000 UTC m=+5626.047718464" watchObservedRunningTime="2025-10-07 13:58:30.07775046 +0000 UTC m=+5626.065582755" Oct 07 13:58:30 crc kubenswrapper[4854]: I1007 13:58:30.112209 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.112182614 podStartE2EDuration="2.112182614s" podCreationTimestamp="2025-10-07 13:58:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:58:30.087342804 +0000 UTC m=+5626.075175089" watchObservedRunningTime="2025-10-07 13:58:30.112182614 +0000 UTC m=+5626.100014869" Oct 07 13:58:33 crc kubenswrapper[4854]: I1007 13:58:33.471897 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 07 13:58:33 crc kubenswrapper[4854]: I1007 13:58:33.472403 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 07 13:58:33 crc kubenswrapper[4854]: I1007 13:58:33.485523 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 07 13:58:38 crc kubenswrapper[4854]: I1007 13:58:38.472029 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 07 13:58:38 crc kubenswrapper[4854]: I1007 13:58:38.472687 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 07 13:58:38 crc kubenswrapper[4854]: I1007 13:58:38.485297 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 07 13:58:38 crc kubenswrapper[4854]: I1007 13:58:38.522074 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 07 13:58:38 crc kubenswrapper[4854]: I1007 13:58:38.757226 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 07 13:58:38 crc kubenswrapper[4854]: I1007 13:58:38.757918 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 07 13:58:39 crc kubenswrapper[4854]: I1007 13:58:39.155362 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 07 13:58:39 crc kubenswrapper[4854]: I1007 
13:58:39.554403 4854 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="d4b62572-0d75-4e00-8215-0efe097aa5ad" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.74:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 07 13:58:39 crc kubenswrapper[4854]: I1007 13:58:39.554457 4854 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="d4b62572-0d75-4e00-8215-0efe097aa5ad" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.74:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 07 13:58:39 crc kubenswrapper[4854]: I1007 13:58:39.798380 4854 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b2378d51-a1ef-4417-9ecd-5bec044c7f9a" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.76:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 07 13:58:39 crc kubenswrapper[4854]: I1007 13:58:39.798470 4854 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b2378d51-a1ef-4417-9ecd-5bec044c7f9a" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.76:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 07 13:58:48 crc kubenswrapper[4854]: I1007 13:58:48.474601 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 07 13:58:48 crc kubenswrapper[4854]: I1007 13:58:48.476712 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 07 13:58:48 crc kubenswrapper[4854]: I1007 13:58:48.478360 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 07 13:58:48 crc kubenswrapper[4854]: I1007 13:58:48.763270 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 07 13:58:48 crc kubenswrapper[4854]: I1007 13:58:48.763746 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 07 13:58:48 crc kubenswrapper[4854]: I1007 13:58:48.763856 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 07 13:58:48 crc kubenswrapper[4854]: I1007 13:58:48.767861 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 07 13:58:49 crc kubenswrapper[4854]: I1007 13:58:49.242600 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 07 13:58:49 crc kubenswrapper[4854]: I1007 13:58:49.244573 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 07 13:58:49 crc kubenswrapper[4854]: I1007 13:58:49.248393 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 07 13:58:49 crc kubenswrapper[4854]: I1007 13:58:49.510450 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7854858d47-tb5j6"] Oct 07 13:58:49 crc kubenswrapper[4854]: I1007 13:58:49.512071 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7854858d47-tb5j6" Oct 07 13:58:49 crc kubenswrapper[4854]: I1007 13:58:49.528863 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7854858d47-tb5j6"] Oct 07 13:58:49 crc kubenswrapper[4854]: I1007 13:58:49.648191 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e46c481-351c-4f63-aba4-31a89f278f1c-dns-svc\") pod \"dnsmasq-dns-7854858d47-tb5j6\" (UID: \"9e46c481-351c-4f63-aba4-31a89f278f1c\") " pod="openstack/dnsmasq-dns-7854858d47-tb5j6" Oct 07 13:58:49 crc kubenswrapper[4854]: I1007 13:58:49.648269 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9e46c481-351c-4f63-aba4-31a89f278f1c-ovsdbserver-sb\") pod \"dnsmasq-dns-7854858d47-tb5j6\" (UID: \"9e46c481-351c-4f63-aba4-31a89f278f1c\") " pod="openstack/dnsmasq-dns-7854858d47-tb5j6" Oct 07 13:58:49 crc kubenswrapper[4854]: I1007 13:58:49.648584 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5frcb\" (UniqueName: \"kubernetes.io/projected/9e46c481-351c-4f63-aba4-31a89f278f1c-kube-api-access-5frcb\") pod \"dnsmasq-dns-7854858d47-tb5j6\" (UID: \"9e46c481-351c-4f63-aba4-31a89f278f1c\") " pod="openstack/dnsmasq-dns-7854858d47-tb5j6" Oct 07 13:58:49 crc kubenswrapper[4854]: I1007 13:58:49.649034 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9e46c481-351c-4f63-aba4-31a89f278f1c-ovsdbserver-nb\") pod \"dnsmasq-dns-7854858d47-tb5j6\" (UID: \"9e46c481-351c-4f63-aba4-31a89f278f1c\") " pod="openstack/dnsmasq-dns-7854858d47-tb5j6" Oct 07 13:58:49 crc kubenswrapper[4854]: I1007 13:58:49.649116 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e46c481-351c-4f63-aba4-31a89f278f1c-config\") pod \"dnsmasq-dns-7854858d47-tb5j6\" (UID: \"9e46c481-351c-4f63-aba4-31a89f278f1c\") " pod="openstack/dnsmasq-dns-7854858d47-tb5j6" Oct 07 13:58:49 crc kubenswrapper[4854]: I1007 13:58:49.753819 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5frcb\" (UniqueName: \"kubernetes.io/projected/9e46c481-351c-4f63-aba4-31a89f278f1c-kube-api-access-5frcb\") pod \"dnsmasq-dns-7854858d47-tb5j6\" (UID: \"9e46c481-351c-4f63-aba4-31a89f278f1c\") " pod="openstack/dnsmasq-dns-7854858d47-tb5j6" Oct 07 13:58:49 crc kubenswrapper[4854]: I1007 13:58:49.754027 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9e46c481-351c-4f63-aba4-31a89f278f1c-ovsdbserver-nb\") pod \"dnsmasq-dns-7854858d47-tb5j6\" (UID: \"9e46c481-351c-4f63-aba4-31a89f278f1c\") " pod="openstack/dnsmasq-dns-7854858d47-tb5j6" Oct 07 13:58:49 crc kubenswrapper[4854]: I1007 13:58:49.754107 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e46c481-351c-4f63-aba4-31a89f278f1c-config\") pod \"dnsmasq-dns-7854858d47-tb5j6\" (UID: \"9e46c481-351c-4f63-aba4-31a89f278f1c\") " pod="openstack/dnsmasq-dns-7854858d47-tb5j6" Oct 07 13:58:49 crc kubenswrapper[4854]: I1007 13:58:49.754223 4854 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e46c481-351c-4f63-aba4-31a89f278f1c-dns-svc\") pod \"dnsmasq-dns-7854858d47-tb5j6\" (UID: \"9e46c481-351c-4f63-aba4-31a89f278f1c\") " pod="openstack/dnsmasq-dns-7854858d47-tb5j6" Oct 07 13:58:49 crc kubenswrapper[4854]: I1007 13:58:49.754290 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9e46c481-351c-4f63-aba4-31a89f278f1c-ovsdbserver-sb\") pod \"dnsmasq-dns-7854858d47-tb5j6\" (UID: \"9e46c481-351c-4f63-aba4-31a89f278f1c\") " pod="openstack/dnsmasq-dns-7854858d47-tb5j6" Oct 07 13:58:49 crc kubenswrapper[4854]: I1007 13:58:49.756000 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9e46c481-351c-4f63-aba4-31a89f278f1c-ovsdbserver-nb\") pod \"dnsmasq-dns-7854858d47-tb5j6\" (UID: \"9e46c481-351c-4f63-aba4-31a89f278f1c\") " pod="openstack/dnsmasq-dns-7854858d47-tb5j6" Oct 07 13:58:49 crc kubenswrapper[4854]: I1007 13:58:49.756652 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9e46c481-351c-4f63-aba4-31a89f278f1c-ovsdbserver-sb\") pod \"dnsmasq-dns-7854858d47-tb5j6\" (UID: \"9e46c481-351c-4f63-aba4-31a89f278f1c\") " pod="openstack/dnsmasq-dns-7854858d47-tb5j6" Oct 07 13:58:49 crc kubenswrapper[4854]: I1007 13:58:49.756682 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e46c481-351c-4f63-aba4-31a89f278f1c-config\") pod \"dnsmasq-dns-7854858d47-tb5j6\" (UID: \"9e46c481-351c-4f63-aba4-31a89f278f1c\") " pod="openstack/dnsmasq-dns-7854858d47-tb5j6" Oct 07 13:58:49 crc kubenswrapper[4854]: I1007 13:58:49.757383 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e46c481-351c-4f63-aba4-31a89f278f1c-dns-svc\") pod \"dnsmasq-dns-7854858d47-tb5j6\" (UID: \"9e46c481-351c-4f63-aba4-31a89f278f1c\") " pod="openstack/dnsmasq-dns-7854858d47-tb5j6" Oct 07 13:58:49 crc kubenswrapper[4854]: I1007 13:58:49.779446 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5frcb\" (UniqueName: \"kubernetes.io/projected/9e46c481-351c-4f63-aba4-31a89f278f1c-kube-api-access-5frcb\") pod \"dnsmasq-dns-7854858d47-tb5j6\" (UID: \"9e46c481-351c-4f63-aba4-31a89f278f1c\") " pod="openstack/dnsmasq-dns-7854858d47-tb5j6" Oct 07 13:58:49 crc kubenswrapper[4854]: I1007 13:58:49.832734 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7854858d47-tb5j6" Oct 07 13:58:50 crc kubenswrapper[4854]: W1007 13:58:50.392790 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9e46c481_351c_4f63_aba4_31a89f278f1c.slice/crio-0a3ce2c5601738f18adfac65788117166653b1115a5882263bc985e2a34a412f WatchSource:0}: Error finding container 0a3ce2c5601738f18adfac65788117166653b1115a5882263bc985e2a34a412f: Status 404 returned error can't find the container with id 0a3ce2c5601738f18adfac65788117166653b1115a5882263bc985e2a34a412f Oct 07 13:58:50 crc kubenswrapper[4854]: I1007 13:58:50.396534 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7854858d47-tb5j6"] Oct 07 13:58:51 crc kubenswrapper[4854]: I1007 13:58:51.265680 4854 generic.go:334] "Generic (PLEG): container finished" podID="9e46c481-351c-4f63-aba4-31a89f278f1c" containerID="0ae0eb5a3f9e855b10dfbcee498e8ed101beb4cb2ed1ac0c248351f7554976c6" exitCode=0 Oct 07 13:58:51 crc kubenswrapper[4854]: I1007 13:58:51.265766 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7854858d47-tb5j6" event={"ID":"9e46c481-351c-4f63-aba4-31a89f278f1c","Type":"ContainerDied","Data":"0ae0eb5a3f9e855b10dfbcee498e8ed101beb4cb2ed1ac0c248351f7554976c6"} Oct 07 13:58:51 crc kubenswrapper[4854]: I1007 13:58:51.265849 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7854858d47-tb5j6" event={"ID":"9e46c481-351c-4f63-aba4-31a89f278f1c","Type":"ContainerStarted","Data":"0a3ce2c5601738f18adfac65788117166653b1115a5882263bc985e2a34a412f"} Oct 07 13:58:52 crc kubenswrapper[4854]: I1007 13:58:52.277559 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7854858d47-tb5j6" event={"ID":"9e46c481-351c-4f63-aba4-31a89f278f1c","Type":"ContainerStarted","Data":"c9cc2919e579bda39274620d6c337300a8496e5548752ac02b7c13031530756b"} Oct 07 13:58:52 crc kubenswrapper[4854]: I1007 13:58:52.278988 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7854858d47-tb5j6" Oct 07 13:58:52 crc kubenswrapper[4854]: I1007 13:58:52.307770 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7854858d47-tb5j6" podStartSLOduration=3.307753216 podStartE2EDuration="3.307753216s" podCreationTimestamp="2025-10-07 13:58:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:58:52.305943534 +0000 UTC m=+5648.293775799" watchObservedRunningTime="2025-10-07 13:58:52.307753216 +0000 UTC m=+5648.295585471" Oct 07 13:58:59 crc kubenswrapper[4854]: I1007 13:58:59.835343 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7854858d47-tb5j6" Oct 07 13:58:59 crc kubenswrapper[4854]: I1007 13:58:59.930663 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d9676b587-h7nbq"] Oct 07 13:58:59 crc kubenswrapper[4854]: I1007 13:58:59.931845 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d9676b587-h7nbq" podUID="c0dea76b-4966-46bf-9b83-effccb6e8985" containerName="dnsmasq-dns" containerID="cri-o://29302c4a90f66ea8ad01c90f0ab237b01f0481834252226e4425f0d87050a455" gracePeriod=10 Oct 07 13:59:00 crc kubenswrapper[4854]: I1007 13:59:00.390869 4854 generic.go:334] "Generic (PLEG): container finished" 
podID="c0dea76b-4966-46bf-9b83-effccb6e8985" containerID="29302c4a90f66ea8ad01c90f0ab237b01f0481834252226e4425f0d87050a455" exitCode=0 Oct 07 13:59:00 crc kubenswrapper[4854]: I1007 13:59:00.390911 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d9676b587-h7nbq" event={"ID":"c0dea76b-4966-46bf-9b83-effccb6e8985","Type":"ContainerDied","Data":"29302c4a90f66ea8ad01c90f0ab237b01f0481834252226e4425f0d87050a455"} Oct 07 13:59:00 crc kubenswrapper[4854]: I1007 13:59:00.590251 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d9676b587-h7nbq" Oct 07 13:59:00 crc kubenswrapper[4854]: I1007 13:59:00.728572 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0dea76b-4966-46bf-9b83-effccb6e8985-dns-svc\") pod \"c0dea76b-4966-46bf-9b83-effccb6e8985\" (UID: \"c0dea76b-4966-46bf-9b83-effccb6e8985\") " Oct 07 13:59:00 crc kubenswrapper[4854]: I1007 13:59:00.728620 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glb7w\" (UniqueName: \"kubernetes.io/projected/c0dea76b-4966-46bf-9b83-effccb6e8985-kube-api-access-glb7w\") pod \"c0dea76b-4966-46bf-9b83-effccb6e8985\" (UID: \"c0dea76b-4966-46bf-9b83-effccb6e8985\") " Oct 07 13:59:00 crc kubenswrapper[4854]: I1007 13:59:00.728686 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0dea76b-4966-46bf-9b83-effccb6e8985-config\") pod \"c0dea76b-4966-46bf-9b83-effccb6e8985\" (UID: \"c0dea76b-4966-46bf-9b83-effccb6e8985\") " Oct 07 13:59:00 crc kubenswrapper[4854]: I1007 13:59:00.728730 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c0dea76b-4966-46bf-9b83-effccb6e8985-ovsdbserver-nb\") pod \"c0dea76b-4966-46bf-9b83-effccb6e8985\" (UID: \"c0dea76b-4966-46bf-9b83-effccb6e8985\") " Oct 07 13:59:00 crc kubenswrapper[4854]: I1007 13:59:00.728770 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c0dea76b-4966-46bf-9b83-effccb6e8985-ovsdbserver-sb\") pod \"c0dea76b-4966-46bf-9b83-effccb6e8985\" (UID: \"c0dea76b-4966-46bf-9b83-effccb6e8985\") " Oct 07 13:59:00 crc kubenswrapper[4854]: I1007 13:59:00.734991 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0dea76b-4966-46bf-9b83-effccb6e8985-kube-api-access-glb7w" (OuterVolumeSpecName: "kube-api-access-glb7w") pod "c0dea76b-4966-46bf-9b83-effccb6e8985" (UID: "c0dea76b-4966-46bf-9b83-effccb6e8985"). InnerVolumeSpecName "kube-api-access-glb7w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:59:00 crc kubenswrapper[4854]: I1007 13:59:00.778517 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0dea76b-4966-46bf-9b83-effccb6e8985-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c0dea76b-4966-46bf-9b83-effccb6e8985" (UID: "c0dea76b-4966-46bf-9b83-effccb6e8985"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:59:00 crc kubenswrapper[4854]: I1007 13:59:00.784851 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0dea76b-4966-46bf-9b83-effccb6e8985-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c0dea76b-4966-46bf-9b83-effccb6e8985" (UID: "c0dea76b-4966-46bf-9b83-effccb6e8985"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:59:00 crc kubenswrapper[4854]: I1007 13:59:00.787076 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0dea76b-4966-46bf-9b83-effccb6e8985-config" (OuterVolumeSpecName: "config") pod "c0dea76b-4966-46bf-9b83-effccb6e8985" (UID: "c0dea76b-4966-46bf-9b83-effccb6e8985"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:59:00 crc kubenswrapper[4854]: I1007 13:59:00.789279 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0dea76b-4966-46bf-9b83-effccb6e8985-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c0dea76b-4966-46bf-9b83-effccb6e8985" (UID: "c0dea76b-4966-46bf-9b83-effccb6e8985"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:59:00 crc kubenswrapper[4854]: I1007 13:59:00.830629 4854 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0dea76b-4966-46bf-9b83-effccb6e8985-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 13:59:00 crc kubenswrapper[4854]: I1007 13:59:00.830674 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-glb7w\" (UniqueName: \"kubernetes.io/projected/c0dea76b-4966-46bf-9b83-effccb6e8985-kube-api-access-glb7w\") on node \"crc\" DevicePath \"\"" Oct 07 13:59:00 crc kubenswrapper[4854]: I1007 13:59:00.830689 4854 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0dea76b-4966-46bf-9b83-effccb6e8985-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:59:00 crc kubenswrapper[4854]: I1007 13:59:00.830702 4854 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c0dea76b-4966-46bf-9b83-effccb6e8985-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 13:59:00 crc kubenswrapper[4854]: I1007 13:59:00.830714 4854 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c0dea76b-4966-46bf-9b83-effccb6e8985-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 13:59:01 crc kubenswrapper[4854]: I1007 13:59:01.414301 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d9676b587-h7nbq" event={"ID":"c0dea76b-4966-46bf-9b83-effccb6e8985","Type":"ContainerDied","Data":"0edfb0b74158573a75debde8c431be1e60e330dbc30709d6ca4a1277f781d28b"} Oct 07 13:59:01 crc kubenswrapper[4854]: I1007 13:59:01.414373 4854 scope.go:117] "RemoveContainer" containerID="29302c4a90f66ea8ad01c90f0ab237b01f0481834252226e4425f0d87050a455" Oct 07 13:59:01 crc kubenswrapper[4854]: I1007 13:59:01.414446 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d9676b587-h7nbq" Oct 07 13:59:01 crc kubenswrapper[4854]: I1007 13:59:01.463239 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d9676b587-h7nbq"] Oct 07 13:59:01 crc kubenswrapper[4854]: I1007 13:59:01.478127 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d9676b587-h7nbq"] Oct 07 13:59:01 crc kubenswrapper[4854]: I1007 13:59:01.479715 4854 scope.go:117] "RemoveContainer" containerID="04f85082a72c363b3f0d71f69b375648f22c563df8dd1b56055469d61c05b6eb" Oct 07 13:59:02 crc kubenswrapper[4854]: I1007 13:59:02.719375 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0dea76b-4966-46bf-9b83-effccb6e8985" path="/var/lib/kubelet/pods/c0dea76b-4966-46bf-9b83-effccb6e8985/volumes" Oct 07 13:59:03 crc kubenswrapper[4854]: I1007 13:59:03.703878 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-6p2jw"] Oct 07 13:59:03 crc kubenswrapper[4854]: E1007 13:59:03.704634 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0dea76b-4966-46bf-9b83-effccb6e8985" containerName="dnsmasq-dns" Oct 07 13:59:03 crc kubenswrapper[4854]: I1007 13:59:03.704681 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0dea76b-4966-46bf-9b83-effccb6e8985" containerName="dnsmasq-dns" Oct 07 13:59:03 crc kubenswrapper[4854]: E1007 13:59:03.704741 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0dea76b-4966-46bf-9b83-effccb6e8985" containerName="init" Oct 07 13:59:03 crc kubenswrapper[4854]: I1007 13:59:03.704761 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0dea76b-4966-46bf-9b83-effccb6e8985" containerName="init" Oct 07 13:59:03 crc kubenswrapper[4854]: I1007 13:59:03.705261 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0dea76b-4966-46bf-9b83-effccb6e8985" containerName="dnsmasq-dns" Oct 07 13:59:03 crc kubenswrapper[4854]: I1007 13:59:03.706648 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-6p2jw" Oct 07 13:59:03 crc kubenswrapper[4854]: I1007 13:59:03.710609 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-6p2jw"] Oct 07 13:59:03 crc kubenswrapper[4854]: I1007 13:59:03.792761 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2r69g\" (UniqueName: \"kubernetes.io/projected/6c2b6c86-281c-4a3d-9998-4bb308bea16c-kube-api-access-2r69g\") pod \"cinder-db-create-6p2jw\" (UID: \"6c2b6c86-281c-4a3d-9998-4bb308bea16c\") " pod="openstack/cinder-db-create-6p2jw" Oct 07 13:59:03 crc kubenswrapper[4854]: I1007 13:59:03.894141 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2r69g\" (UniqueName: \"kubernetes.io/projected/6c2b6c86-281c-4a3d-9998-4bb308bea16c-kube-api-access-2r69g\") pod \"cinder-db-create-6p2jw\" (UID: \"6c2b6c86-281c-4a3d-9998-4bb308bea16c\") " pod="openstack/cinder-db-create-6p2jw" Oct 07 13:59:03 crc kubenswrapper[4854]: I1007 13:59:03.916095 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2r69g\" (UniqueName: \"kubernetes.io/projected/6c2b6c86-281c-4a3d-9998-4bb308bea16c-kube-api-access-2r69g\") pod \"cinder-db-create-6p2jw\" (UID: \"6c2b6c86-281c-4a3d-9998-4bb308bea16c\") " pod="openstack/cinder-db-create-6p2jw" Oct 07 13:59:04 crc kubenswrapper[4854]: I1007 13:59:04.032049 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-6p2jw" Oct 07 13:59:04 crc kubenswrapper[4854]: I1007 13:59:04.367762 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-6p2jw"] Oct 07 13:59:04 crc kubenswrapper[4854]: W1007 13:59:04.380531 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c2b6c86_281c_4a3d_9998_4bb308bea16c.slice/crio-2e71b0064a013f96c1040f4c7317f637a7010b1f7f92389252956ff9ff7798c4 WatchSource:0}: Error finding container 2e71b0064a013f96c1040f4c7317f637a7010b1f7f92389252956ff9ff7798c4: Status 404 returned error can't find the container with id 2e71b0064a013f96c1040f4c7317f637a7010b1f7f92389252956ff9ff7798c4 Oct 07 13:59:04 crc kubenswrapper[4854]: I1007 13:59:04.453284 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-6p2jw" event={"ID":"6c2b6c86-281c-4a3d-9998-4bb308bea16c","Type":"ContainerStarted","Data":"2e71b0064a013f96c1040f4c7317f637a7010b1f7f92389252956ff9ff7798c4"} Oct 07 13:59:05 crc kubenswrapper[4854]: I1007 13:59:05.469397 4854 generic.go:334] "Generic (PLEG): container finished" podID="6c2b6c86-281c-4a3d-9998-4bb308bea16c" containerID="09136106b68329237faf5b4cd31b40ce6c296e85cfae8247c08017c039efa6b0" exitCode=0 Oct 07 13:59:05 crc kubenswrapper[4854]: I1007 13:59:05.469499 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-6p2jw" event={"ID":"6c2b6c86-281c-4a3d-9998-4bb308bea16c","Type":"ContainerDied","Data":"09136106b68329237faf5b4cd31b40ce6c296e85cfae8247c08017c039efa6b0"} Oct 07 13:59:06 crc kubenswrapper[4854]: I1007 13:59:06.894455 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-6p2jw" Oct 07 13:59:07 crc kubenswrapper[4854]: I1007 13:59:07.054668 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2r69g\" (UniqueName: \"kubernetes.io/projected/6c2b6c86-281c-4a3d-9998-4bb308bea16c-kube-api-access-2r69g\") pod \"6c2b6c86-281c-4a3d-9998-4bb308bea16c\" (UID: \"6c2b6c86-281c-4a3d-9998-4bb308bea16c\") " Oct 07 13:59:07 crc kubenswrapper[4854]: I1007 13:59:07.063805 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c2b6c86-281c-4a3d-9998-4bb308bea16c-kube-api-access-2r69g" (OuterVolumeSpecName: "kube-api-access-2r69g") pod "6c2b6c86-281c-4a3d-9998-4bb308bea16c" (UID: "6c2b6c86-281c-4a3d-9998-4bb308bea16c"). InnerVolumeSpecName "kube-api-access-2r69g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:59:07 crc kubenswrapper[4854]: I1007 13:59:07.157335 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2r69g\" (UniqueName: \"kubernetes.io/projected/6c2b6c86-281c-4a3d-9998-4bb308bea16c-kube-api-access-2r69g\") on node \"crc\" DevicePath \"\"" Oct 07 13:59:07 crc kubenswrapper[4854]: I1007 13:59:07.496330 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-6p2jw" event={"ID":"6c2b6c86-281c-4a3d-9998-4bb308bea16c","Type":"ContainerDied","Data":"2e71b0064a013f96c1040f4c7317f637a7010b1f7f92389252956ff9ff7798c4"} Oct 07 13:59:07 crc kubenswrapper[4854]: I1007 13:59:07.496386 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e71b0064a013f96c1040f4c7317f637a7010b1f7f92389252956ff9ff7798c4" Oct 07 13:59:07 crc kubenswrapper[4854]: I1007 13:59:07.496402 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-6p2jw" Oct 07 13:59:13 crc kubenswrapper[4854]: I1007 13:59:13.838198 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-6f06-account-create-52qjt"] Oct 07 13:59:13 crc kubenswrapper[4854]: E1007 13:59:13.839105 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c2b6c86-281c-4a3d-9998-4bb308bea16c" containerName="mariadb-database-create" Oct 07 13:59:13 crc kubenswrapper[4854]: I1007 13:59:13.839119 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c2b6c86-281c-4a3d-9998-4bb308bea16c" containerName="mariadb-database-create" Oct 07 13:59:13 crc kubenswrapper[4854]: I1007 13:59:13.839373 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c2b6c86-281c-4a3d-9998-4bb308bea16c" containerName="mariadb-database-create" Oct 07 13:59:13 crc kubenswrapper[4854]: I1007 13:59:13.840033 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-6f06-account-create-52qjt" Oct 07 13:59:13 crc kubenswrapper[4854]: I1007 13:59:13.850485 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Oct 07 13:59:13 crc kubenswrapper[4854]: I1007 13:59:13.854748 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-6f06-account-create-52qjt"] Oct 07 13:59:14 crc kubenswrapper[4854]: I1007 13:59:14.007747 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqhfv\" (UniqueName: \"kubernetes.io/projected/965e4c63-cd95-4f53-93df-3ad027fd5758-kube-api-access-kqhfv\") pod \"cinder-6f06-account-create-52qjt\" (UID: \"965e4c63-cd95-4f53-93df-3ad027fd5758\") " pod="openstack/cinder-6f06-account-create-52qjt" Oct 07 13:59:14 crc kubenswrapper[4854]: I1007 13:59:14.110090 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqhfv\" (UniqueName: \"kubernetes.io/projected/965e4c63-cd95-4f53-93df-3ad027fd5758-kube-api-access-kqhfv\") pod \"cinder-6f06-account-create-52qjt\" (UID: \"965e4c63-cd95-4f53-93df-3ad027fd5758\") " pod="openstack/cinder-6f06-account-create-52qjt" Oct 07 13:59:14 crc kubenswrapper[4854]: I1007 13:59:14.134807 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqhfv\" (UniqueName: \"kubernetes.io/projected/965e4c63-cd95-4f53-93df-3ad027fd5758-kube-api-access-kqhfv\") pod \"cinder-6f06-account-create-52qjt\" (UID: \"965e4c63-cd95-4f53-93df-3ad027fd5758\") " pod="openstack/cinder-6f06-account-create-52qjt" Oct 07 13:59:14 crc kubenswrapper[4854]: I1007 13:59:14.167115 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-6f06-account-create-52qjt" Oct 07 13:59:14 crc kubenswrapper[4854]: I1007 13:59:14.635863 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-6f06-account-create-52qjt"] Oct 07 13:59:14 crc kubenswrapper[4854]: W1007 13:59:14.639233 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod965e4c63_cd95_4f53_93df_3ad027fd5758.slice/crio-14fd98a88ea6a3a3dbb963d502d789c39d6698817dca8f534a351978d1d1cdb8 WatchSource:0}: Error finding container 14fd98a88ea6a3a3dbb963d502d789c39d6698817dca8f534a351978d1d1cdb8: Status 404 returned error can't find the container with id 14fd98a88ea6a3a3dbb963d502d789c39d6698817dca8f534a351978d1d1cdb8 Oct 07 13:59:15 crc kubenswrapper[4854]: I1007 13:59:15.612220 4854 generic.go:334] "Generic (PLEG): container finished" podID="965e4c63-cd95-4f53-93df-3ad027fd5758" containerID="388423d7188e6e69d16a1682193b76bf45f0a8ead74b37a6deb545b4d26b8afc" exitCode=0 Oct 07 13:59:15 crc kubenswrapper[4854]: I1007 13:59:15.612318 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-6f06-account-create-52qjt" event={"ID":"965e4c63-cd95-4f53-93df-3ad027fd5758","Type":"ContainerDied","Data":"388423d7188e6e69d16a1682193b76bf45f0a8ead74b37a6deb545b4d26b8afc"} Oct 07 13:59:15 crc kubenswrapper[4854]: I1007 13:59:15.612662 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-6f06-account-create-52qjt" event={"ID":"965e4c63-cd95-4f53-93df-3ad027fd5758","Type":"ContainerStarted","Data":"14fd98a88ea6a3a3dbb963d502d789c39d6698817dca8f534a351978d1d1cdb8"} Oct 07 13:59:17 crc kubenswrapper[4854]: I1007 13:59:17.058606 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-6f06-account-create-52qjt" Oct 07 13:59:17 crc kubenswrapper[4854]: I1007 13:59:17.186643 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqhfv\" (UniqueName: \"kubernetes.io/projected/965e4c63-cd95-4f53-93df-3ad027fd5758-kube-api-access-kqhfv\") pod \"965e4c63-cd95-4f53-93df-3ad027fd5758\" (UID: \"965e4c63-cd95-4f53-93df-3ad027fd5758\") " Oct 07 13:59:17 crc kubenswrapper[4854]: I1007 13:59:17.195077 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/965e4c63-cd95-4f53-93df-3ad027fd5758-kube-api-access-kqhfv" (OuterVolumeSpecName: "kube-api-access-kqhfv") pod "965e4c63-cd95-4f53-93df-3ad027fd5758" (UID: "965e4c63-cd95-4f53-93df-3ad027fd5758"). InnerVolumeSpecName "kube-api-access-kqhfv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:59:17 crc kubenswrapper[4854]: I1007 13:59:17.289493 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqhfv\" (UniqueName: \"kubernetes.io/projected/965e4c63-cd95-4f53-93df-3ad027fd5758-kube-api-access-kqhfv\") on node \"crc\" DevicePath \"\"" Oct 07 13:59:17 crc kubenswrapper[4854]: I1007 13:59:17.630945 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-6f06-account-create-52qjt" event={"ID":"965e4c63-cd95-4f53-93df-3ad027fd5758","Type":"ContainerDied","Data":"14fd98a88ea6a3a3dbb963d502d789c39d6698817dca8f534a351978d1d1cdb8"} Oct 07 13:59:17 crc kubenswrapper[4854]: I1007 13:59:17.630984 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14fd98a88ea6a3a3dbb963d502d789c39d6698817dca8f534a351978d1d1cdb8" Oct 07 13:59:17 crc kubenswrapper[4854]: I1007 13:59:17.631035 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-6f06-account-create-52qjt" Oct 07 13:59:19 crc kubenswrapper[4854]: I1007 13:59:19.005271 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-fccv9"] Oct 07 13:59:19 crc kubenswrapper[4854]: E1007 13:59:19.006043 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="965e4c63-cd95-4f53-93df-3ad027fd5758" containerName="mariadb-account-create" Oct 07 13:59:19 crc kubenswrapper[4854]: I1007 13:59:19.006062 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="965e4c63-cd95-4f53-93df-3ad027fd5758" containerName="mariadb-account-create" Oct 07 13:59:19 crc kubenswrapper[4854]: I1007 13:59:19.006415 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="965e4c63-cd95-4f53-93df-3ad027fd5758" containerName="mariadb-account-create" Oct 07 13:59:19 crc kubenswrapper[4854]: I1007 13:59:19.007218 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-fccv9" Oct 07 13:59:19 crc kubenswrapper[4854]: I1007 13:59:19.009619 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 07 13:59:19 crc kubenswrapper[4854]: I1007 13:59:19.009912 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-rjxnw" Oct 07 13:59:19 crc kubenswrapper[4854]: I1007 13:59:19.010705 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 07 13:59:19 crc kubenswrapper[4854]: I1007 13:59:19.035678 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-fccv9"] Oct 07 13:59:19 crc kubenswrapper[4854]: I1007 13:59:19.127457 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f36c7ef9-d715-4471-966d-e086733bc6a5-scripts\") pod \"cinder-db-sync-fccv9\" (UID: \"f36c7ef9-d715-4471-966d-e086733bc6a5\") " pod="openstack/cinder-db-sync-fccv9" Oct 07 13:59:19 crc kubenswrapper[4854]: I1007 13:59:19.127560 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmgpr\" (UniqueName: \"kubernetes.io/projected/f36c7ef9-d715-4471-966d-e086733bc6a5-kube-api-access-zmgpr\") pod \"cinder-db-sync-fccv9\" (UID: \"f36c7ef9-d715-4471-966d-e086733bc6a5\") " pod="openstack/cinder-db-sync-fccv9" Oct 07 13:59:19 crc kubenswrapper[4854]: I1007 13:59:19.127606 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f36c7ef9-d715-4471-966d-e086733bc6a5-db-sync-config-data\") pod \"cinder-db-sync-fccv9\" (UID: \"f36c7ef9-d715-4471-966d-e086733bc6a5\") " pod="openstack/cinder-db-sync-fccv9" Oct 07 13:59:19 crc kubenswrapper[4854]: I1007 13:59:19.127692 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f36c7ef9-d715-4471-966d-e086733bc6a5-etc-machine-id\") pod \"cinder-db-sync-fccv9\" (UID: \"f36c7ef9-d715-4471-966d-e086733bc6a5\") " pod="openstack/cinder-db-sync-fccv9" Oct 07 13:59:19 crc kubenswrapper[4854]: I1007 13:59:19.127735 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f36c7ef9-d715-4471-966d-e086733bc6a5-config-data\") pod \"cinder-db-sync-fccv9\" (UID: \"f36c7ef9-d715-4471-966d-e086733bc6a5\") " pod="openstack/cinder-db-sync-fccv9" Oct 07 13:59:19 crc kubenswrapper[4854]: I1007 13:59:19.129579 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f36c7ef9-d715-4471-966d-e086733bc6a5-combined-ca-bundle\") pod \"cinder-db-sync-fccv9\" (UID: \"f36c7ef9-d715-4471-966d-e086733bc6a5\") " pod="openstack/cinder-db-sync-fccv9" Oct 07 13:59:19 crc kubenswrapper[4854]: I1007 13:59:19.231985 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f36c7ef9-d715-4471-966d-e086733bc6a5-etc-machine-id\") pod \"cinder-db-sync-fccv9\" (UID: \"f36c7ef9-d715-4471-966d-e086733bc6a5\") " pod="openstack/cinder-db-sync-fccv9" Oct 07 13:59:19 crc kubenswrapper[4854]: I1007 13:59:19.232033 4854 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f36c7ef9-d715-4471-966d-e086733bc6a5-config-data\") pod \"cinder-db-sync-fccv9\" (UID: \"f36c7ef9-d715-4471-966d-e086733bc6a5\") " pod="openstack/cinder-db-sync-fccv9" Oct 07 13:59:19 crc kubenswrapper[4854]: I1007 13:59:19.232086 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f36c7ef9-d715-4471-966d-e086733bc6a5-combined-ca-bundle\") pod \"cinder-db-sync-fccv9\" (UID: \"f36c7ef9-d715-4471-966d-e086733bc6a5\") " pod="openstack/cinder-db-sync-fccv9" Oct 07 13:59:19 crc kubenswrapper[4854]: I1007 13:59:19.232166 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f36c7ef9-d715-4471-966d-e086733bc6a5-scripts\") pod \"cinder-db-sync-fccv9\" (UID: \"f36c7ef9-d715-4471-966d-e086733bc6a5\") " pod="openstack/cinder-db-sync-fccv9" Oct 07 13:59:19 crc kubenswrapper[4854]: I1007 13:59:19.232175 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f36c7ef9-d715-4471-966d-e086733bc6a5-etc-machine-id\") pod \"cinder-db-sync-fccv9\" (UID: \"f36c7ef9-d715-4471-966d-e086733bc6a5\") " pod="openstack/cinder-db-sync-fccv9" Oct 07 13:59:19 crc kubenswrapper[4854]: I1007 13:59:19.232212 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmgpr\" (UniqueName: \"kubernetes.io/projected/f36c7ef9-d715-4471-966d-e086733bc6a5-kube-api-access-zmgpr\") pod \"cinder-db-sync-fccv9\" (UID: \"f36c7ef9-d715-4471-966d-e086733bc6a5\") " pod="openstack/cinder-db-sync-fccv9" Oct 07 13:59:19 crc kubenswrapper[4854]: I1007 13:59:19.232250 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f36c7ef9-d715-4471-966d-e086733bc6a5-db-sync-config-data\") pod \"cinder-db-sync-fccv9\" (UID: \"f36c7ef9-d715-4471-966d-e086733bc6a5\") " pod="openstack/cinder-db-sync-fccv9" Oct 07 13:59:19 crc kubenswrapper[4854]: I1007 13:59:19.238315 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f36c7ef9-d715-4471-966d-e086733bc6a5-db-sync-config-data\") pod \"cinder-db-sync-fccv9\" (UID: \"f36c7ef9-d715-4471-966d-e086733bc6a5\") " pod="openstack/cinder-db-sync-fccv9" Oct 07 13:59:19 crc kubenswrapper[4854]: I1007 13:59:19.238388 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f36c7ef9-d715-4471-966d-e086733bc6a5-scripts\") pod \"cinder-db-sync-fccv9\" (UID: \"f36c7ef9-d715-4471-966d-e086733bc6a5\") " pod="openstack/cinder-db-sync-fccv9" Oct 07 13:59:19 crc kubenswrapper[4854]: I1007 13:59:19.240689 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f36c7ef9-d715-4471-966d-e086733bc6a5-combined-ca-bundle\") pod \"cinder-db-sync-fccv9\" (UID: \"f36c7ef9-d715-4471-966d-e086733bc6a5\") " pod="openstack/cinder-db-sync-fccv9" Oct 07 13:59:19 crc kubenswrapper[4854]: I1007 13:59:19.244900 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f36c7ef9-d715-4471-966d-e086733bc6a5-config-data\") pod \"cinder-db-sync-fccv9\" (UID: \"f36c7ef9-d715-4471-966d-e086733bc6a5\") " 
pod="openstack/cinder-db-sync-fccv9" Oct 07 13:59:19 crc kubenswrapper[4854]: I1007 13:59:19.247826 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmgpr\" (UniqueName: \"kubernetes.io/projected/f36c7ef9-d715-4471-966d-e086733bc6a5-kube-api-access-zmgpr\") pod \"cinder-db-sync-fccv9\" (UID: \"f36c7ef9-d715-4471-966d-e086733bc6a5\") " pod="openstack/cinder-db-sync-fccv9" Oct 07 13:59:19 crc kubenswrapper[4854]: I1007 13:59:19.341465 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-fccv9" Oct 07 13:59:19 crc kubenswrapper[4854]: I1007 13:59:19.850190 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-fccv9"] Oct 07 13:59:20 crc kubenswrapper[4854]: I1007 13:59:20.667793 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-fccv9" event={"ID":"f36c7ef9-d715-4471-966d-e086733bc6a5","Type":"ContainerStarted","Data":"6003185ab2a816d22883b0da860ffb6267b7e077e1428534400a61043b7d80eb"} Oct 07 13:59:20 crc kubenswrapper[4854]: I1007 13:59:20.668104 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-fccv9" event={"ID":"f36c7ef9-d715-4471-966d-e086733bc6a5","Type":"ContainerStarted","Data":"464ab7aa076ec3d68b8e1f5001265ebe9ebe0a68b2b5c37de076f852fe03b225"} Oct 07 13:59:20 crc kubenswrapper[4854]: I1007 13:59:20.692313 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-fccv9" podStartSLOduration=2.6922966759999998 podStartE2EDuration="2.692296676s" podCreationTimestamp="2025-10-07 13:59:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:59:20.686267753 +0000 UTC m=+5676.674100028" watchObservedRunningTime="2025-10-07 13:59:20.692296676 +0000 UTC m=+5676.680128931" Oct 07 13:59:23 crc kubenswrapper[4854]: I1007 13:59:23.710862 4854 generic.go:334] "Generic (PLEG): container finished" podID="f36c7ef9-d715-4471-966d-e086733bc6a5" containerID="6003185ab2a816d22883b0da860ffb6267b7e077e1428534400a61043b7d80eb" exitCode=0 Oct 07 13:59:23 crc kubenswrapper[4854]: I1007 13:59:23.710982 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-fccv9" event={"ID":"f36c7ef9-d715-4471-966d-e086733bc6a5","Type":"ContainerDied","Data":"6003185ab2a816d22883b0da860ffb6267b7e077e1428534400a61043b7d80eb"} Oct 07 13:59:25 crc kubenswrapper[4854]: I1007 13:59:25.054315 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-fccv9" Oct 07 13:59:25 crc kubenswrapper[4854]: I1007 13:59:25.258043 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f36c7ef9-d715-4471-966d-e086733bc6a5-combined-ca-bundle\") pod \"f36c7ef9-d715-4471-966d-e086733bc6a5\" (UID: \"f36c7ef9-d715-4471-966d-e086733bc6a5\") " Oct 07 13:59:25 crc kubenswrapper[4854]: I1007 13:59:25.258140 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmgpr\" (UniqueName: \"kubernetes.io/projected/f36c7ef9-d715-4471-966d-e086733bc6a5-kube-api-access-zmgpr\") pod \"f36c7ef9-d715-4471-966d-e086733bc6a5\" (UID: \"f36c7ef9-d715-4471-966d-e086733bc6a5\") " Oct 07 13:59:25 crc kubenswrapper[4854]: I1007 13:59:25.258269 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f36c7ef9-d715-4471-966d-e086733bc6a5-config-data\") pod \"f36c7ef9-d715-4471-966d-e086733bc6a5\" (UID: \"f36c7ef9-d715-4471-966d-e086733bc6a5\") " Oct 07 13:59:25 crc kubenswrapper[4854]: I1007 13:59:25.258308 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f36c7ef9-d715-4471-966d-e086733bc6a5-etc-machine-id\") pod \"f36c7ef9-d715-4471-966d-e086733bc6a5\" (UID: \"f36c7ef9-d715-4471-966d-e086733bc6a5\") " Oct 07 13:59:25 crc kubenswrapper[4854]: I1007 13:59:25.258420 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f36c7ef9-d715-4471-966d-e086733bc6a5-db-sync-config-data\") pod \"f36c7ef9-d715-4471-966d-e086733bc6a5\" (UID: \"f36c7ef9-d715-4471-966d-e086733bc6a5\") " Oct 07 13:59:25 crc kubenswrapper[4854]: I1007 13:59:25.258477 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f36c7ef9-d715-4471-966d-e086733bc6a5-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "f36c7ef9-d715-4471-966d-e086733bc6a5" (UID: "f36c7ef9-d715-4471-966d-e086733bc6a5"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 13:59:25 crc kubenswrapper[4854]: I1007 13:59:25.258500 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f36c7ef9-d715-4471-966d-e086733bc6a5-scripts\") pod \"f36c7ef9-d715-4471-966d-e086733bc6a5\" (UID: \"f36c7ef9-d715-4471-966d-e086733bc6a5\") " Oct 07 13:59:25 crc kubenswrapper[4854]: I1007 13:59:25.259203 4854 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f36c7ef9-d715-4471-966d-e086733bc6a5-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 07 13:59:25 crc kubenswrapper[4854]: I1007 13:59:25.266267 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f36c7ef9-d715-4471-966d-e086733bc6a5-kube-api-access-zmgpr" (OuterVolumeSpecName: "kube-api-access-zmgpr") pod "f36c7ef9-d715-4471-966d-e086733bc6a5" (UID: "f36c7ef9-d715-4471-966d-e086733bc6a5"). InnerVolumeSpecName "kube-api-access-zmgpr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:59:25 crc kubenswrapper[4854]: I1007 13:59:25.266436 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f36c7ef9-d715-4471-966d-e086733bc6a5-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "f36c7ef9-d715-4471-966d-e086733bc6a5" (UID: "f36c7ef9-d715-4471-966d-e086733bc6a5"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:59:25 crc kubenswrapper[4854]: I1007 13:59:25.267235 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f36c7ef9-d715-4471-966d-e086733bc6a5-scripts" (OuterVolumeSpecName: "scripts") pod "f36c7ef9-d715-4471-966d-e086733bc6a5" (UID: "f36c7ef9-d715-4471-966d-e086733bc6a5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:59:25 crc kubenswrapper[4854]: I1007 13:59:25.295359 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f36c7ef9-d715-4471-966d-e086733bc6a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f36c7ef9-d715-4471-966d-e086733bc6a5" (UID: "f36c7ef9-d715-4471-966d-e086733bc6a5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:59:25 crc kubenswrapper[4854]: I1007 13:59:25.328848 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f36c7ef9-d715-4471-966d-e086733bc6a5-config-data" (OuterVolumeSpecName: "config-data") pod "f36c7ef9-d715-4471-966d-e086733bc6a5" (UID: "f36c7ef9-d715-4471-966d-e086733bc6a5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:59:25 crc kubenswrapper[4854]: I1007 13:59:25.361272 4854 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f36c7ef9-d715-4471-966d-e086733bc6a5-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:59:25 crc kubenswrapper[4854]: I1007 13:59:25.361638 4854 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f36c7ef9-d715-4471-966d-e086733bc6a5-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 13:59:25 crc kubenswrapper[4854]: I1007 13:59:25.362606 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f36c7ef9-d715-4471-966d-e086733bc6a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:59:25 crc kubenswrapper[4854]: I1007 13:59:25.362631 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmgpr\" (UniqueName: \"kubernetes.io/projected/f36c7ef9-d715-4471-966d-e086733bc6a5-kube-api-access-zmgpr\") on node \"crc\" DevicePath \"\"" Oct 07 13:59:25 crc kubenswrapper[4854]: I1007 13:59:25.362642 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f36c7ef9-d715-4471-966d-e086733bc6a5-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:59:25 crc kubenswrapper[4854]: I1007 13:59:25.743880 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-fccv9" event={"ID":"f36c7ef9-d715-4471-966d-e086733bc6a5","Type":"ContainerDied","Data":"464ab7aa076ec3d68b8e1f5001265ebe9ebe0a68b2b5c37de076f852fe03b225"} Oct 07 13:59:25 crc kubenswrapper[4854]: I1007 13:59:25.744224 4854 pod_container_deletor.go:80] "Container not 
found in pod's containers" containerID="464ab7aa076ec3d68b8e1f5001265ebe9ebe0a68b2b5c37de076f852fe03b225" Oct 07 13:59:25 crc kubenswrapper[4854]: I1007 13:59:25.744333 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-fccv9" Oct 07 13:59:26 crc kubenswrapper[4854]: I1007 13:59:26.090256 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-549dd5b965-mr5x2"] Oct 07 13:59:26 crc kubenswrapper[4854]: E1007 13:59:26.090936 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f36c7ef9-d715-4471-966d-e086733bc6a5" containerName="cinder-db-sync" Oct 07 13:59:26 crc kubenswrapper[4854]: I1007 13:59:26.090954 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="f36c7ef9-d715-4471-966d-e086733bc6a5" containerName="cinder-db-sync" Oct 07 13:59:26 crc kubenswrapper[4854]: I1007 13:59:26.091746 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="f36c7ef9-d715-4471-966d-e086733bc6a5" containerName="cinder-db-sync" Oct 07 13:59:26 crc kubenswrapper[4854]: I1007 13:59:26.099209 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-549dd5b965-mr5x2" Oct 07 13:59:26 crc kubenswrapper[4854]: I1007 13:59:26.107833 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-549dd5b965-mr5x2"] Oct 07 13:59:26 crc kubenswrapper[4854]: I1007 13:59:26.175592 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mv97\" (UniqueName: \"kubernetes.io/projected/8823e4f7-cb5e-4f62-9db1-84f8d4f28b93-kube-api-access-5mv97\") pod \"dnsmasq-dns-549dd5b965-mr5x2\" (UID: \"8823e4f7-cb5e-4f62-9db1-84f8d4f28b93\") " pod="openstack/dnsmasq-dns-549dd5b965-mr5x2" Oct 07 13:59:26 crc kubenswrapper[4854]: I1007 13:59:26.175900 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8823e4f7-cb5e-4f62-9db1-84f8d4f28b93-ovsdbserver-nb\") pod \"dnsmasq-dns-549dd5b965-mr5x2\" (UID: \"8823e4f7-cb5e-4f62-9db1-84f8d4f28b93\") " pod="openstack/dnsmasq-dns-549dd5b965-mr5x2" Oct 07 13:59:26 crc kubenswrapper[4854]: I1007 13:59:26.176026 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8823e4f7-cb5e-4f62-9db1-84f8d4f28b93-dns-svc\") pod \"dnsmasq-dns-549dd5b965-mr5x2\" (UID: \"8823e4f7-cb5e-4f62-9db1-84f8d4f28b93\") " pod="openstack/dnsmasq-dns-549dd5b965-mr5x2" Oct 07 13:59:26 crc kubenswrapper[4854]: I1007 13:59:26.176237 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8823e4f7-cb5e-4f62-9db1-84f8d4f28b93-config\") pod \"dnsmasq-dns-549dd5b965-mr5x2\" (UID: \"8823e4f7-cb5e-4f62-9db1-84f8d4f28b93\") " pod="openstack/dnsmasq-dns-549dd5b965-mr5x2" Oct 07 13:59:26 crc kubenswrapper[4854]: I1007 13:59:26.176383 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8823e4f7-cb5e-4f62-9db1-84f8d4f28b93-ovsdbserver-sb\") pod \"dnsmasq-dns-549dd5b965-mr5x2\" (UID: \"8823e4f7-cb5e-4f62-9db1-84f8d4f28b93\") " pod="openstack/dnsmasq-dns-549dd5b965-mr5x2" Oct 07 13:59:26 crc kubenswrapper[4854]: I1007 13:59:26.277376 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/8823e4f7-cb5e-4f62-9db1-84f8d4f28b93-config\") pod \"dnsmasq-dns-549dd5b965-mr5x2\" (UID: \"8823e4f7-cb5e-4f62-9db1-84f8d4f28b93\") " pod="openstack/dnsmasq-dns-549dd5b965-mr5x2" Oct 07 13:59:26 crc kubenswrapper[4854]: I1007 13:59:26.277445 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8823e4f7-cb5e-4f62-9db1-84f8d4f28b93-ovsdbserver-sb\") pod \"dnsmasq-dns-549dd5b965-mr5x2\" (UID: \"8823e4f7-cb5e-4f62-9db1-84f8d4f28b93\") " pod="openstack/dnsmasq-dns-549dd5b965-mr5x2" Oct 07 13:59:26 crc kubenswrapper[4854]: I1007 13:59:26.277504 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mv97\" (UniqueName: \"kubernetes.io/projected/8823e4f7-cb5e-4f62-9db1-84f8d4f28b93-kube-api-access-5mv97\") pod \"dnsmasq-dns-549dd5b965-mr5x2\" (UID: \"8823e4f7-cb5e-4f62-9db1-84f8d4f28b93\") " pod="openstack/dnsmasq-dns-549dd5b965-mr5x2" Oct 07 13:59:26 crc kubenswrapper[4854]: I1007 13:59:26.277547 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8823e4f7-cb5e-4f62-9db1-84f8d4f28b93-ovsdbserver-nb\") pod \"dnsmasq-dns-549dd5b965-mr5x2\" (UID: \"8823e4f7-cb5e-4f62-9db1-84f8d4f28b93\") " pod="openstack/dnsmasq-dns-549dd5b965-mr5x2" Oct 07 13:59:26 crc kubenswrapper[4854]: I1007 13:59:26.277579 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8823e4f7-cb5e-4f62-9db1-84f8d4f28b93-dns-svc\") pod \"dnsmasq-dns-549dd5b965-mr5x2\" (UID: \"8823e4f7-cb5e-4f62-9db1-84f8d4f28b93\") " pod="openstack/dnsmasq-dns-549dd5b965-mr5x2" Oct 07 13:59:26 crc kubenswrapper[4854]: I1007 13:59:26.278672 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8823e4f7-cb5e-4f62-9db1-84f8d4f28b93-config\") pod \"dnsmasq-dns-549dd5b965-mr5x2\" (UID: \"8823e4f7-cb5e-4f62-9db1-84f8d4f28b93\") " pod="openstack/dnsmasq-dns-549dd5b965-mr5x2" Oct 07 13:59:26 crc kubenswrapper[4854]: I1007 13:59:26.278766 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8823e4f7-cb5e-4f62-9db1-84f8d4f28b93-dns-svc\") pod \"dnsmasq-dns-549dd5b965-mr5x2\" (UID: \"8823e4f7-cb5e-4f62-9db1-84f8d4f28b93\") " pod="openstack/dnsmasq-dns-549dd5b965-mr5x2" Oct 07 13:59:26 crc kubenswrapper[4854]: I1007 13:59:26.278949 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8823e4f7-cb5e-4f62-9db1-84f8d4f28b93-ovsdbserver-sb\") pod \"dnsmasq-dns-549dd5b965-mr5x2\" (UID: \"8823e4f7-cb5e-4f62-9db1-84f8d4f28b93\") " pod="openstack/dnsmasq-dns-549dd5b965-mr5x2" Oct 07 13:59:26 crc kubenswrapper[4854]: I1007 13:59:26.279388 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8823e4f7-cb5e-4f62-9db1-84f8d4f28b93-ovsdbserver-nb\") pod \"dnsmasq-dns-549dd5b965-mr5x2\" (UID: \"8823e4f7-cb5e-4f62-9db1-84f8d4f28b93\") " pod="openstack/dnsmasq-dns-549dd5b965-mr5x2" Oct 07 13:59:26 crc kubenswrapper[4854]: I1007 13:59:26.293381 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mv97\" (UniqueName: \"kubernetes.io/projected/8823e4f7-cb5e-4f62-9db1-84f8d4f28b93-kube-api-access-5mv97\") pod 
\"dnsmasq-dns-549dd5b965-mr5x2\" (UID: \"8823e4f7-cb5e-4f62-9db1-84f8d4f28b93\") " pod="openstack/dnsmasq-dns-549dd5b965-mr5x2" Oct 07 13:59:26 crc kubenswrapper[4854]: I1007 13:59:26.310812 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 07 13:59:26 crc kubenswrapper[4854]: I1007 13:59:26.312285 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 07 13:59:26 crc kubenswrapper[4854]: I1007 13:59:26.315031 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-rjxnw" Oct 07 13:59:26 crc kubenswrapper[4854]: I1007 13:59:26.315204 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 07 13:59:26 crc kubenswrapper[4854]: I1007 13:59:26.315355 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 07 13:59:26 crc kubenswrapper[4854]: I1007 13:59:26.315105 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 07 13:59:26 crc kubenswrapper[4854]: I1007 13:59:26.332062 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 07 13:59:26 crc kubenswrapper[4854]: I1007 13:59:26.378322 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5826d74d-f392-425f-8d56-7c04c7a67ed7-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"5826d74d-f392-425f-8d56-7c04c7a67ed7\") " pod="openstack/cinder-api-0" Oct 07 13:59:26 crc kubenswrapper[4854]: I1007 13:59:26.378368 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5826d74d-f392-425f-8d56-7c04c7a67ed7-config-data-custom\") pod \"cinder-api-0\" (UID: \"5826d74d-f392-425f-8d56-7c04c7a67ed7\") " pod="openstack/cinder-api-0" Oct 07 13:59:26 crc kubenswrapper[4854]: I1007 13:59:26.378417 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pjtf\" (UniqueName: \"kubernetes.io/projected/5826d74d-f392-425f-8d56-7c04c7a67ed7-kube-api-access-2pjtf\") pod \"cinder-api-0\" (UID: \"5826d74d-f392-425f-8d56-7c04c7a67ed7\") " pod="openstack/cinder-api-0" Oct 07 13:59:26 crc kubenswrapper[4854]: I1007 13:59:26.378584 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5826d74d-f392-425f-8d56-7c04c7a67ed7-config-data\") pod \"cinder-api-0\" (UID: \"5826d74d-f392-425f-8d56-7c04c7a67ed7\") " pod="openstack/cinder-api-0" Oct 07 13:59:26 crc kubenswrapper[4854]: I1007 13:59:26.378768 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5826d74d-f392-425f-8d56-7c04c7a67ed7-logs\") pod \"cinder-api-0\" (UID: \"5826d74d-f392-425f-8d56-7c04c7a67ed7\") " pod="openstack/cinder-api-0" Oct 07 13:59:26 crc kubenswrapper[4854]: I1007 13:59:26.378817 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5826d74d-f392-425f-8d56-7c04c7a67ed7-etc-machine-id\") pod \"cinder-api-0\" (UID: \"5826d74d-f392-425f-8d56-7c04c7a67ed7\") " pod="openstack/cinder-api-0" Oct 07 13:59:26 crc kubenswrapper[4854]: I1007 
13:59:26.378997 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5826d74d-f392-425f-8d56-7c04c7a67ed7-scripts\") pod \"cinder-api-0\" (UID: \"5826d74d-f392-425f-8d56-7c04c7a67ed7\") " pod="openstack/cinder-api-0" Oct 07 13:59:26 crc kubenswrapper[4854]: I1007 13:59:26.420223 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-549dd5b965-mr5x2" Oct 07 13:59:26 crc kubenswrapper[4854]: I1007 13:59:26.480537 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5826d74d-f392-425f-8d56-7c04c7a67ed7-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"5826d74d-f392-425f-8d56-7c04c7a67ed7\") " pod="openstack/cinder-api-0" Oct 07 13:59:26 crc kubenswrapper[4854]: I1007 13:59:26.480623 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5826d74d-f392-425f-8d56-7c04c7a67ed7-config-data-custom\") pod \"cinder-api-0\" (UID: \"5826d74d-f392-425f-8d56-7c04c7a67ed7\") " pod="openstack/cinder-api-0" Oct 07 13:59:26 crc kubenswrapper[4854]: I1007 13:59:26.480698 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pjtf\" (UniqueName: \"kubernetes.io/projected/5826d74d-f392-425f-8d56-7c04c7a67ed7-kube-api-access-2pjtf\") pod \"cinder-api-0\" (UID: \"5826d74d-f392-425f-8d56-7c04c7a67ed7\") " pod="openstack/cinder-api-0" Oct 07 13:59:26 crc kubenswrapper[4854]: I1007 13:59:26.480772 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5826d74d-f392-425f-8d56-7c04c7a67ed7-config-data\") pod \"cinder-api-0\" (UID: \"5826d74d-f392-425f-8d56-7c04c7a67ed7\") " pod="openstack/cinder-api-0" Oct 07 13:59:26 crc kubenswrapper[4854]: I1007 13:59:26.480855 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5826d74d-f392-425f-8d56-7c04c7a67ed7-logs\") pod \"cinder-api-0\" (UID: \"5826d74d-f392-425f-8d56-7c04c7a67ed7\") " pod="openstack/cinder-api-0" Oct 07 13:59:26 crc kubenswrapper[4854]: I1007 13:59:26.480906 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5826d74d-f392-425f-8d56-7c04c7a67ed7-etc-machine-id\") pod \"cinder-api-0\" (UID: \"5826d74d-f392-425f-8d56-7c04c7a67ed7\") " pod="openstack/cinder-api-0" Oct 07 13:59:26 crc kubenswrapper[4854]: I1007 13:59:26.481001 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5826d74d-f392-425f-8d56-7c04c7a67ed7-scripts\") pod \"cinder-api-0\" (UID: \"5826d74d-f392-425f-8d56-7c04c7a67ed7\") " pod="openstack/cinder-api-0" Oct 07 13:59:26 crc kubenswrapper[4854]: I1007 13:59:26.481983 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5826d74d-f392-425f-8d56-7c04c7a67ed7-etc-machine-id\") pod \"cinder-api-0\" (UID: \"5826d74d-f392-425f-8d56-7c04c7a67ed7\") " pod="openstack/cinder-api-0" Oct 07 13:59:26 crc kubenswrapper[4854]: I1007 13:59:26.482366 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5826d74d-f392-425f-8d56-7c04c7a67ed7-logs\") pod 
\"cinder-api-0\" (UID: \"5826d74d-f392-425f-8d56-7c04c7a67ed7\") " pod="openstack/cinder-api-0" Oct 07 13:59:26 crc kubenswrapper[4854]: I1007 13:59:26.488842 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5826d74d-f392-425f-8d56-7c04c7a67ed7-scripts\") pod \"cinder-api-0\" (UID: \"5826d74d-f392-425f-8d56-7c04c7a67ed7\") " pod="openstack/cinder-api-0" Oct 07 13:59:26 crc kubenswrapper[4854]: I1007 13:59:26.488929 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5826d74d-f392-425f-8d56-7c04c7a67ed7-config-data-custom\") pod \"cinder-api-0\" (UID: \"5826d74d-f392-425f-8d56-7c04c7a67ed7\") " pod="openstack/cinder-api-0" Oct 07 13:59:26 crc kubenswrapper[4854]: I1007 13:59:26.489992 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5826d74d-f392-425f-8d56-7c04c7a67ed7-config-data\") pod \"cinder-api-0\" (UID: \"5826d74d-f392-425f-8d56-7c04c7a67ed7\") " pod="openstack/cinder-api-0" Oct 07 13:59:26 crc kubenswrapper[4854]: I1007 13:59:26.496242 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5826d74d-f392-425f-8d56-7c04c7a67ed7-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"5826d74d-f392-425f-8d56-7c04c7a67ed7\") " pod="openstack/cinder-api-0" Oct 07 13:59:26 crc kubenswrapper[4854]: I1007 13:59:26.501486 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pjtf\" (UniqueName: \"kubernetes.io/projected/5826d74d-f392-425f-8d56-7c04c7a67ed7-kube-api-access-2pjtf\") pod \"cinder-api-0\" (UID: \"5826d74d-f392-425f-8d56-7c04c7a67ed7\") " pod="openstack/cinder-api-0" Oct 07 13:59:26 crc kubenswrapper[4854]: I1007 13:59:26.644663 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 07 13:59:26 crc kubenswrapper[4854]: I1007 13:59:26.990598 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-549dd5b965-mr5x2"] Oct 07 13:59:27 crc kubenswrapper[4854]: W1007 13:59:27.138893 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5826d74d_f392_425f_8d56_7c04c7a67ed7.slice/crio-184db8c5630bf3f8d199019bdb64e50e8ce8fe31feb12c7859d833971cc499bd WatchSource:0}: Error finding container 184db8c5630bf3f8d199019bdb64e50e8ce8fe31feb12c7859d833971cc499bd: Status 404 returned error can't find the container with id 184db8c5630bf3f8d199019bdb64e50e8ce8fe31feb12c7859d833971cc499bd Oct 07 13:59:27 crc kubenswrapper[4854]: I1007 13:59:27.143783 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 07 13:59:27 crc kubenswrapper[4854]: I1007 13:59:27.784193 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5826d74d-f392-425f-8d56-7c04c7a67ed7","Type":"ContainerStarted","Data":"39540665e851bdcde17936534d7635fc62c065a102d446e32338dfbc58e5ced0"} Oct 07 13:59:27 crc kubenswrapper[4854]: I1007 13:59:27.784596 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5826d74d-f392-425f-8d56-7c04c7a67ed7","Type":"ContainerStarted","Data":"184db8c5630bf3f8d199019bdb64e50e8ce8fe31feb12c7859d833971cc499bd"} Oct 07 13:59:27 crc kubenswrapper[4854]: I1007 13:59:27.788043 4854 generic.go:334] "Generic (PLEG): container finished" podID="8823e4f7-cb5e-4f62-9db1-84f8d4f28b93" containerID="9178e6a3cb41fd1be6f120fff0b3b3a3b47a61cc47d34bd8923e0f0eae0fa5b7" exitCode=0 Oct 07 13:59:27 crc kubenswrapper[4854]: I1007 13:59:27.788088 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-549dd5b965-mr5x2" event={"ID":"8823e4f7-cb5e-4f62-9db1-84f8d4f28b93","Type":"ContainerDied","Data":"9178e6a3cb41fd1be6f120fff0b3b3a3b47a61cc47d34bd8923e0f0eae0fa5b7"} Oct 07 13:59:27 crc kubenswrapper[4854]: I1007 13:59:27.788114 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-549dd5b965-mr5x2" event={"ID":"8823e4f7-cb5e-4f62-9db1-84f8d4f28b93","Type":"ContainerStarted","Data":"6fbd18681706096ccff3d1e93c8378fbe72c4ed8fd9d0491d6db0fc7472347ad"} Oct 07 13:59:28 crc kubenswrapper[4854]: I1007 13:59:28.798949 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5826d74d-f392-425f-8d56-7c04c7a67ed7","Type":"ContainerStarted","Data":"1871b936d0f6ff5f03623762fc356d0693e8dd110d47b103caa243cd30043372"} Oct 07 13:59:28 crc kubenswrapper[4854]: I1007 13:59:28.799320 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 07 13:59:28 crc kubenswrapper[4854]: I1007 13:59:28.801738 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-549dd5b965-mr5x2" event={"ID":"8823e4f7-cb5e-4f62-9db1-84f8d4f28b93","Type":"ContainerStarted","Data":"37bf59643938313b465d6a9f5849fa5e4e43ae91174ed85f53db0289760ffb7d"} Oct 07 13:59:28 crc kubenswrapper[4854]: I1007 13:59:28.801908 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-549dd5b965-mr5x2" Oct 07 13:59:28 crc kubenswrapper[4854]: I1007 13:59:28.824655 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=2.824633064 
podStartE2EDuration="2.824633064s" podCreationTimestamp="2025-10-07 13:59:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:59:28.816038128 +0000 UTC m=+5684.803870403" watchObservedRunningTime="2025-10-07 13:59:28.824633064 +0000 UTC m=+5684.812465319" Oct 07 13:59:28 crc kubenswrapper[4854]: I1007 13:59:28.836860 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-549dd5b965-mr5x2" podStartSLOduration=2.8368403730000002 podStartE2EDuration="2.836840373s" podCreationTimestamp="2025-10-07 13:59:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:59:28.831915702 +0000 UTC m=+5684.819747957" watchObservedRunningTime="2025-10-07 13:59:28.836840373 +0000 UTC m=+5684.824672628" Oct 07 13:59:36 crc kubenswrapper[4854]: I1007 13:59:36.421386 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-549dd5b965-mr5x2" Oct 07 13:59:36 crc kubenswrapper[4854]: I1007 13:59:36.477970 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7854858d47-tb5j6"] Oct 07 13:59:36 crc kubenswrapper[4854]: I1007 13:59:36.478351 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7854858d47-tb5j6" podUID="9e46c481-351c-4f63-aba4-31a89f278f1c" containerName="dnsmasq-dns" containerID="cri-o://c9cc2919e579bda39274620d6c337300a8496e5548752ac02b7c13031530756b" gracePeriod=10 Oct 07 13:59:36 crc kubenswrapper[4854]: I1007 13:59:36.907039 4854 generic.go:334] "Generic (PLEG): container finished" podID="9e46c481-351c-4f63-aba4-31a89f278f1c" containerID="c9cc2919e579bda39274620d6c337300a8496e5548752ac02b7c13031530756b" exitCode=0 Oct 07 13:59:36 crc kubenswrapper[4854]: I1007 13:59:36.907369 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7854858d47-tb5j6" event={"ID":"9e46c481-351c-4f63-aba4-31a89f278f1c","Type":"ContainerDied","Data":"c9cc2919e579bda39274620d6c337300a8496e5548752ac02b7c13031530756b"} Oct 07 13:59:36 crc kubenswrapper[4854]: I1007 13:59:36.907398 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7854858d47-tb5j6" event={"ID":"9e46c481-351c-4f63-aba4-31a89f278f1c","Type":"ContainerDied","Data":"0a3ce2c5601738f18adfac65788117166653b1115a5882263bc985e2a34a412f"} Oct 07 13:59:36 crc kubenswrapper[4854]: I1007 13:59:36.907408 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a3ce2c5601738f18adfac65788117166653b1115a5882263bc985e2a34a412f" Oct 07 13:59:36 crc kubenswrapper[4854]: I1007 13:59:36.938725 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7854858d47-tb5j6" Oct 07 13:59:37 crc kubenswrapper[4854]: I1007 13:59:37.089255 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9e46c481-351c-4f63-aba4-31a89f278f1c-ovsdbserver-nb\") pod \"9e46c481-351c-4f63-aba4-31a89f278f1c\" (UID: \"9e46c481-351c-4f63-aba4-31a89f278f1c\") " Oct 07 13:59:37 crc kubenswrapper[4854]: I1007 13:59:37.089657 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5frcb\" (UniqueName: \"kubernetes.io/projected/9e46c481-351c-4f63-aba4-31a89f278f1c-kube-api-access-5frcb\") pod \"9e46c481-351c-4f63-aba4-31a89f278f1c\" (UID: \"9e46c481-351c-4f63-aba4-31a89f278f1c\") " Oct 07 13:59:37 crc kubenswrapper[4854]: I1007 13:59:37.089868 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e46c481-351c-4f63-aba4-31a89f278f1c-config\") pod \"9e46c481-351c-4f63-aba4-31a89f278f1c\" (UID: \"9e46c481-351c-4f63-aba4-31a89f278f1c\") " Oct 07 13:59:37 crc kubenswrapper[4854]: I1007 13:59:37.090116 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e46c481-351c-4f63-aba4-31a89f278f1c-dns-svc\") pod \"9e46c481-351c-4f63-aba4-31a89f278f1c\" (UID: \"9e46c481-351c-4f63-aba4-31a89f278f1c\") " Oct 07 13:59:37 crc kubenswrapper[4854]: I1007 13:59:37.090340 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9e46c481-351c-4f63-aba4-31a89f278f1c-ovsdbserver-sb\") pod \"9e46c481-351c-4f63-aba4-31a89f278f1c\" (UID: \"9e46c481-351c-4f63-aba4-31a89f278f1c\") " Oct 07 13:59:37 crc kubenswrapper[4854]: I1007 13:59:37.108120 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e46c481-351c-4f63-aba4-31a89f278f1c-kube-api-access-5frcb" (OuterVolumeSpecName: "kube-api-access-5frcb") pod "9e46c481-351c-4f63-aba4-31a89f278f1c" (UID: "9e46c481-351c-4f63-aba4-31a89f278f1c"). InnerVolumeSpecName "kube-api-access-5frcb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:59:37 crc kubenswrapper[4854]: I1007 13:59:37.145179 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e46c481-351c-4f63-aba4-31a89f278f1c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9e46c481-351c-4f63-aba4-31a89f278f1c" (UID: "9e46c481-351c-4f63-aba4-31a89f278f1c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:59:37 crc kubenswrapper[4854]: I1007 13:59:37.150422 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e46c481-351c-4f63-aba4-31a89f278f1c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9e46c481-351c-4f63-aba4-31a89f278f1c" (UID: "9e46c481-351c-4f63-aba4-31a89f278f1c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:59:37 crc kubenswrapper[4854]: I1007 13:59:37.162961 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e46c481-351c-4f63-aba4-31a89f278f1c-config" (OuterVolumeSpecName: "config") pod "9e46c481-351c-4f63-aba4-31a89f278f1c" (UID: "9e46c481-351c-4f63-aba4-31a89f278f1c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:59:37 crc kubenswrapper[4854]: I1007 13:59:37.165391 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e46c481-351c-4f63-aba4-31a89f278f1c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9e46c481-351c-4f63-aba4-31a89f278f1c" (UID: "9e46c481-351c-4f63-aba4-31a89f278f1c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:59:37 crc kubenswrapper[4854]: I1007 13:59:37.192326 4854 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9e46c481-351c-4f63-aba4-31a89f278f1c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 13:59:37 crc kubenswrapper[4854]: I1007 13:59:37.192358 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5frcb\" (UniqueName: \"kubernetes.io/projected/9e46c481-351c-4f63-aba4-31a89f278f1c-kube-api-access-5frcb\") on node \"crc\" DevicePath \"\"" Oct 07 13:59:37 crc kubenswrapper[4854]: I1007 13:59:37.192371 4854 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e46c481-351c-4f63-aba4-31a89f278f1c-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:59:37 crc kubenswrapper[4854]: I1007 13:59:37.192380 4854 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e46c481-351c-4f63-aba4-31a89f278f1c-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 13:59:37 crc kubenswrapper[4854]: I1007 13:59:37.192388 4854 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9e46c481-351c-4f63-aba4-31a89f278f1c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 13:59:37 crc kubenswrapper[4854]: I1007 13:59:37.914730 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7854858d47-tb5j6" Oct 07 13:59:37 crc kubenswrapper[4854]: I1007 13:59:37.956550 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7854858d47-tb5j6"] Oct 07 13:59:37 crc kubenswrapper[4854]: I1007 13:59:37.967739 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7854858d47-tb5j6"] Oct 07 13:59:38 crc kubenswrapper[4854]: I1007 13:59:38.221875 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 13:59:38 crc kubenswrapper[4854]: I1007 13:59:38.222104 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="a7139c6b-fad9-41da-ac5e-feff21db38c1" containerName="nova-scheduler-scheduler" containerID="cri-o://5ace9a77fcd32675abcbf2472d8da6f143300f2ff2d882d4f0ace0aa49752d15" gracePeriod=30 Oct 07 13:59:38 crc kubenswrapper[4854]: I1007 13:59:38.230432 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 07 13:59:38 crc kubenswrapper[4854]: I1007 13:59:38.231320 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="f226ca89-20c7-4d17-a02b-43c1c8353353" containerName="nova-cell0-conductor-conductor" containerID="cri-o://ec455117753f3a6d9579cb1fb05d4848361e3fcf5d5bcd7580d80f6fbd7412fb" gracePeriod=30 Oct 07 13:59:38 crc kubenswrapper[4854]: I1007 13:59:38.243496 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 13:59:38 crc kubenswrapper[4854]: I1007 13:59:38.243831 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d4b62572-0d75-4e00-8215-0efe097aa5ad" containerName="nova-metadata-metadata" containerID="cri-o://fd14c247db401a5f40b87b4108ce86eb56416925a695eae766f22cff61c5971b" gracePeriod=30 Oct 07 13:59:38 crc kubenswrapper[4854]: I1007 13:59:38.244009 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d4b62572-0d75-4e00-8215-0efe097aa5ad" containerName="nova-metadata-log" containerID="cri-o://05f14f52023bcab9f989f9f9684117c21d02a70e42ba64c95a1844b38efedcb3" gracePeriod=30 Oct 07 13:59:38 crc kubenswrapper[4854]: I1007 13:59:38.279536 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 07 13:59:38 crc kubenswrapper[4854]: I1007 13:59:38.279901 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b2378d51-a1ef-4417-9ecd-5bec044c7f9a" containerName="nova-api-log" containerID="cri-o://9042dd2e9539d42a711c3eb659343d3385a072a8789fde76452a4cf328458ddb" gracePeriod=30 Oct 07 13:59:38 crc kubenswrapper[4854]: I1007 13:59:38.280408 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b2378d51-a1ef-4417-9ecd-5bec044c7f9a" containerName="nova-api-api" containerID="cri-o://df1043ba50c6b1deb56442ad2422456e643cafab877a18e0393b3d673e6e2cb3" gracePeriod=30 Oct 07 13:59:38 crc kubenswrapper[4854]: I1007 13:59:38.291951 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 13:59:38 crc kubenswrapper[4854]: I1007 13:59:38.292224 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="21a3f9e9-afda-4138-8c6a-c1351412b296" 
containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://d5fba4b59a0dad959f787307e1a7aa2f3f1aba64f6f701c3f3a42598db6b4776" gracePeriod=30 Oct 07 13:59:38 crc kubenswrapper[4854]: I1007 13:59:38.311987 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 07 13:59:38 crc kubenswrapper[4854]: I1007 13:59:38.312288 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="1c81fc08-b696-4c22-88fa-6842e8725af3" containerName="nova-cell1-conductor-conductor" containerID="cri-o://4f13e983b0dd409c06f1c98b96c956be1610e44a71af7d4186271aed239db28b" gracePeriod=30 Oct 07 13:59:38 crc kubenswrapper[4854]: E1007 13:59:38.490652 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5ace9a77fcd32675abcbf2472d8da6f143300f2ff2d882d4f0ace0aa49752d15" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 07 13:59:38 crc kubenswrapper[4854]: E1007 13:59:38.492579 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5ace9a77fcd32675abcbf2472d8da6f143300f2ff2d882d4f0ace0aa49752d15" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 07 13:59:38 crc kubenswrapper[4854]: E1007 13:59:38.503572 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5ace9a77fcd32675abcbf2472d8da6f143300f2ff2d882d4f0ace0aa49752d15" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 07 13:59:38 crc kubenswrapper[4854]: E1007 13:59:38.503662 4854 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="a7139c6b-fad9-41da-ac5e-feff21db38c1" containerName="nova-scheduler-scheduler" Oct 07 13:59:38 crc kubenswrapper[4854]: I1007 13:59:38.718254 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e46c481-351c-4f63-aba4-31a89f278f1c" path="/var/lib/kubelet/pods/9e46c481-351c-4f63-aba4-31a89f278f1c/volumes" Oct 07 13:59:38 crc kubenswrapper[4854]: I1007 13:59:38.859552 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 07 13:59:38 crc kubenswrapper[4854]: I1007 13:59:38.938908 4854 generic.go:334] "Generic (PLEG): container finished" podID="d4b62572-0d75-4e00-8215-0efe097aa5ad" containerID="05f14f52023bcab9f989f9f9684117c21d02a70e42ba64c95a1844b38efedcb3" exitCode=143 Oct 07 13:59:38 crc kubenswrapper[4854]: I1007 13:59:38.939013 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d4b62572-0d75-4e00-8215-0efe097aa5ad","Type":"ContainerDied","Data":"05f14f52023bcab9f989f9f9684117c21d02a70e42ba64c95a1844b38efedcb3"} Oct 07 13:59:38 crc kubenswrapper[4854]: I1007 13:59:38.943984 4854 generic.go:334] "Generic (PLEG): container finished" podID="b2378d51-a1ef-4417-9ecd-5bec044c7f9a" containerID="9042dd2e9539d42a711c3eb659343d3385a072a8789fde76452a4cf328458ddb" exitCode=143 Oct 07 13:59:38 crc kubenswrapper[4854]: I1007 13:59:38.944050 4854 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b2378d51-a1ef-4417-9ecd-5bec044c7f9a","Type":"ContainerDied","Data":"9042dd2e9539d42a711c3eb659343d3385a072a8789fde76452a4cf328458ddb"} Oct 07 13:59:38 crc kubenswrapper[4854]: I1007 13:59:38.957427 4854 generic.go:334] "Generic (PLEG): container finished" podID="21a3f9e9-afda-4138-8c6a-c1351412b296" containerID="d5fba4b59a0dad959f787307e1a7aa2f3f1aba64f6f701c3f3a42598db6b4776" exitCode=0 Oct 07 13:59:38 crc kubenswrapper[4854]: I1007 13:59:38.957462 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"21a3f9e9-afda-4138-8c6a-c1351412b296","Type":"ContainerDied","Data":"d5fba4b59a0dad959f787307e1a7aa2f3f1aba64f6f701c3f3a42598db6b4776"} Oct 07 13:59:39 crc kubenswrapper[4854]: I1007 13:59:39.271492 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 07 13:59:39 crc kubenswrapper[4854]: I1007 13:59:39.335674 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21a3f9e9-afda-4138-8c6a-c1351412b296-combined-ca-bundle\") pod \"21a3f9e9-afda-4138-8c6a-c1351412b296\" (UID: \"21a3f9e9-afda-4138-8c6a-c1351412b296\") " Oct 07 13:59:39 crc kubenswrapper[4854]: I1007 13:59:39.335873 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21a3f9e9-afda-4138-8c6a-c1351412b296-config-data\") pod \"21a3f9e9-afda-4138-8c6a-c1351412b296\" (UID: \"21a3f9e9-afda-4138-8c6a-c1351412b296\") " Oct 07 13:59:39 crc kubenswrapper[4854]: I1007 13:59:39.335899 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f95vx\" (UniqueName: \"kubernetes.io/projected/21a3f9e9-afda-4138-8c6a-c1351412b296-kube-api-access-f95vx\") pod \"21a3f9e9-afda-4138-8c6a-c1351412b296\" (UID: \"21a3f9e9-afda-4138-8c6a-c1351412b296\") " Oct 07 13:59:39 crc kubenswrapper[4854]: I1007 13:59:39.342365 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21a3f9e9-afda-4138-8c6a-c1351412b296-kube-api-access-f95vx" (OuterVolumeSpecName: "kube-api-access-f95vx") pod "21a3f9e9-afda-4138-8c6a-c1351412b296" (UID: "21a3f9e9-afda-4138-8c6a-c1351412b296"). InnerVolumeSpecName "kube-api-access-f95vx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:59:39 crc kubenswrapper[4854]: I1007 13:59:39.366175 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21a3f9e9-afda-4138-8c6a-c1351412b296-config-data" (OuterVolumeSpecName: "config-data") pod "21a3f9e9-afda-4138-8c6a-c1351412b296" (UID: "21a3f9e9-afda-4138-8c6a-c1351412b296"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:59:39 crc kubenswrapper[4854]: I1007 13:59:39.410359 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21a3f9e9-afda-4138-8c6a-c1351412b296-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "21a3f9e9-afda-4138-8c6a-c1351412b296" (UID: "21a3f9e9-afda-4138-8c6a-c1351412b296"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:59:39 crc kubenswrapper[4854]: I1007 13:59:39.437476 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21a3f9e9-afda-4138-8c6a-c1351412b296-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:59:39 crc kubenswrapper[4854]: I1007 13:59:39.437505 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f95vx\" (UniqueName: \"kubernetes.io/projected/21a3f9e9-afda-4138-8c6a-c1351412b296-kube-api-access-f95vx\") on node \"crc\" DevicePath \"\"" Oct 07 13:59:39 crc kubenswrapper[4854]: I1007 13:59:39.437516 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21a3f9e9-afda-4138-8c6a-c1351412b296-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:59:39 crc kubenswrapper[4854]: I1007 13:59:39.902519 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 07 13:59:39 crc kubenswrapper[4854]: I1007 13:59:39.950608 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qm4nr\" (UniqueName: \"kubernetes.io/projected/1c81fc08-b696-4c22-88fa-6842e8725af3-kube-api-access-qm4nr\") pod \"1c81fc08-b696-4c22-88fa-6842e8725af3\" (UID: \"1c81fc08-b696-4c22-88fa-6842e8725af3\") " Oct 07 13:59:39 crc kubenswrapper[4854]: I1007 13:59:39.950658 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c81fc08-b696-4c22-88fa-6842e8725af3-combined-ca-bundle\") pod \"1c81fc08-b696-4c22-88fa-6842e8725af3\" (UID: \"1c81fc08-b696-4c22-88fa-6842e8725af3\") " Oct 07 13:59:39 crc kubenswrapper[4854]: I1007 13:59:39.950767 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c81fc08-b696-4c22-88fa-6842e8725af3-config-data\") pod \"1c81fc08-b696-4c22-88fa-6842e8725af3\" (UID: \"1c81fc08-b696-4c22-88fa-6842e8725af3\") " Oct 07 13:59:39 crc kubenswrapper[4854]: I1007 13:59:39.960431 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c81fc08-b696-4c22-88fa-6842e8725af3-kube-api-access-qm4nr" (OuterVolumeSpecName: "kube-api-access-qm4nr") pod "1c81fc08-b696-4c22-88fa-6842e8725af3" (UID: "1c81fc08-b696-4c22-88fa-6842e8725af3"). InnerVolumeSpecName "kube-api-access-qm4nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:59:39 crc kubenswrapper[4854]: I1007 13:59:39.982675 4854 generic.go:334] "Generic (PLEG): container finished" podID="f226ca89-20c7-4d17-a02b-43c1c8353353" containerID="ec455117753f3a6d9579cb1fb05d4848361e3fcf5d5bcd7580d80f6fbd7412fb" exitCode=0 Oct 07 13:59:39 crc kubenswrapper[4854]: I1007 13:59:39.982751 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"f226ca89-20c7-4d17-a02b-43c1c8353353","Type":"ContainerDied","Data":"ec455117753f3a6d9579cb1fb05d4848361e3fcf5d5bcd7580d80f6fbd7412fb"} Oct 07 13:59:39 crc kubenswrapper[4854]: I1007 13:59:39.982771 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c81fc08-b696-4c22-88fa-6842e8725af3-config-data" (OuterVolumeSpecName: "config-data") pod "1c81fc08-b696-4c22-88fa-6842e8725af3" (UID: "1c81fc08-b696-4c22-88fa-6842e8725af3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:59:39 crc kubenswrapper[4854]: I1007 13:59:39.985896 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c81fc08-b696-4c22-88fa-6842e8725af3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1c81fc08-b696-4c22-88fa-6842e8725af3" (UID: "1c81fc08-b696-4c22-88fa-6842e8725af3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:59:39 crc kubenswrapper[4854]: I1007 13:59:39.986537 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"21a3f9e9-afda-4138-8c6a-c1351412b296","Type":"ContainerDied","Data":"8f5ff4e769bb107dfdf442113b6057dca6538422ad460fc64d3b3bb3db118502"} Oct 07 13:59:39 crc kubenswrapper[4854]: I1007 13:59:39.986572 4854 scope.go:117] "RemoveContainer" containerID="d5fba4b59a0dad959f787307e1a7aa2f3f1aba64f6f701c3f3a42598db6b4776" Oct 07 13:59:39 crc kubenswrapper[4854]: I1007 13:59:39.986687 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 07 13:59:39 crc kubenswrapper[4854]: I1007 13:59:39.992203 4854 generic.go:334] "Generic (PLEG): container finished" podID="1c81fc08-b696-4c22-88fa-6842e8725af3" containerID="4f13e983b0dd409c06f1c98b96c956be1610e44a71af7d4186271aed239db28b" exitCode=0 Oct 07 13:59:39 crc kubenswrapper[4854]: I1007 13:59:39.992239 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"1c81fc08-b696-4c22-88fa-6842e8725af3","Type":"ContainerDied","Data":"4f13e983b0dd409c06f1c98b96c956be1610e44a71af7d4186271aed239db28b"} Oct 07 13:59:39 crc kubenswrapper[4854]: I1007 13:59:39.992261 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"1c81fc08-b696-4c22-88fa-6842e8725af3","Type":"ContainerDied","Data":"9f607307c26511e2856d05a69e7b44b9b6e3a65f4cb46a85c5e9a62c5f0d150d"} Oct 07 13:59:39 crc kubenswrapper[4854]: I1007 13:59:39.992320 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 07 13:59:40 crc kubenswrapper[4854]: I1007 13:59:40.019787 4854 scope.go:117] "RemoveContainer" containerID="4f13e983b0dd409c06f1c98b96c956be1610e44a71af7d4186271aed239db28b" Oct 07 13:59:40 crc kubenswrapper[4854]: I1007 13:59:40.053218 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qm4nr\" (UniqueName: \"kubernetes.io/projected/1c81fc08-b696-4c22-88fa-6842e8725af3-kube-api-access-qm4nr\") on node \"crc\" DevicePath \"\"" Oct 07 13:59:40 crc kubenswrapper[4854]: I1007 13:59:40.053249 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c81fc08-b696-4c22-88fa-6842e8725af3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:59:40 crc kubenswrapper[4854]: I1007 13:59:40.053259 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c81fc08-b696-4c22-88fa-6842e8725af3-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:59:40 crc kubenswrapper[4854]: I1007 13:59:40.085745 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 13:59:40 crc kubenswrapper[4854]: I1007 13:59:40.086458 4854 scope.go:117] "RemoveContainer" containerID="4f13e983b0dd409c06f1c98b96c956be1610e44a71af7d4186271aed239db28b" Oct 07 13:59:40 crc kubenswrapper[4854]: E1007 13:59:40.093210 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f13e983b0dd409c06f1c98b96c956be1610e44a71af7d4186271aed239db28b\": container with ID starting with 4f13e983b0dd409c06f1c98b96c956be1610e44a71af7d4186271aed239db28b not found: ID does not exist" containerID="4f13e983b0dd409c06f1c98b96c956be1610e44a71af7d4186271aed239db28b" Oct 07 13:59:40 crc kubenswrapper[4854]: I1007 13:59:40.093271 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f13e983b0dd409c06f1c98b96c956be1610e44a71af7d4186271aed239db28b"} err="failed to get container status \"4f13e983b0dd409c06f1c98b96c956be1610e44a71af7d4186271aed239db28b\": rpc error: code = NotFound desc = could not find container \"4f13e983b0dd409c06f1c98b96c956be1610e44a71af7d4186271aed239db28b\": container with ID starting with 4f13e983b0dd409c06f1c98b96c956be1610e44a71af7d4186271aed239db28b not found: ID does not exist" Oct 07 13:59:40 crc kubenswrapper[4854]: I1007 13:59:40.096028 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 13:59:40 crc kubenswrapper[4854]: I1007 13:59:40.106933 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 07 13:59:40 crc kubenswrapper[4854]: I1007 13:59:40.114197 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 07 13:59:40 crc kubenswrapper[4854]: I1007 13:59:40.120203 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 13:59:40 crc kubenswrapper[4854]: E1007 13:59:40.120712 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c81fc08-b696-4c22-88fa-6842e8725af3" containerName="nova-cell1-conductor-conductor" Oct 07 13:59:40 crc kubenswrapper[4854]: I1007 13:59:40.120742 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c81fc08-b696-4c22-88fa-6842e8725af3" containerName="nova-cell1-conductor-conductor" Oct 07 13:59:40 crc kubenswrapper[4854]: 
E1007 13:59:40.120765 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21a3f9e9-afda-4138-8c6a-c1351412b296" containerName="nova-cell1-novncproxy-novncproxy" Oct 07 13:59:40 crc kubenswrapper[4854]: I1007 13:59:40.120774 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="21a3f9e9-afda-4138-8c6a-c1351412b296" containerName="nova-cell1-novncproxy-novncproxy" Oct 07 13:59:40 crc kubenswrapper[4854]: E1007 13:59:40.120811 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e46c481-351c-4f63-aba4-31a89f278f1c" containerName="init" Oct 07 13:59:40 crc kubenswrapper[4854]: I1007 13:59:40.120820 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e46c481-351c-4f63-aba4-31a89f278f1c" containerName="init" Oct 07 13:59:40 crc kubenswrapper[4854]: E1007 13:59:40.120833 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e46c481-351c-4f63-aba4-31a89f278f1c" containerName="dnsmasq-dns" Oct 07 13:59:40 crc kubenswrapper[4854]: I1007 13:59:40.120841 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e46c481-351c-4f63-aba4-31a89f278f1c" containerName="dnsmasq-dns" Oct 07 13:59:40 crc kubenswrapper[4854]: I1007 13:59:40.121054 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="21a3f9e9-afda-4138-8c6a-c1351412b296" containerName="nova-cell1-novncproxy-novncproxy" Oct 07 13:59:40 crc kubenswrapper[4854]: I1007 13:59:40.121090 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e46c481-351c-4f63-aba4-31a89f278f1c" containerName="dnsmasq-dns" Oct 07 13:59:40 crc kubenswrapper[4854]: I1007 13:59:40.121111 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c81fc08-b696-4c22-88fa-6842e8725af3" containerName="nova-cell1-conductor-conductor" Oct 07 13:59:40 crc kubenswrapper[4854]: I1007 13:59:40.121741 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 07 13:59:40 crc kubenswrapper[4854]: I1007 13:59:40.124127 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 07 13:59:40 crc kubenswrapper[4854]: I1007 13:59:40.132424 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 07 13:59:40 crc kubenswrapper[4854]: I1007 13:59:40.133911 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 07 13:59:40 crc kubenswrapper[4854]: I1007 13:59:40.139847 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 07 13:59:40 crc kubenswrapper[4854]: I1007 13:59:40.151322 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 13:59:40 crc kubenswrapper[4854]: I1007 13:59:40.154346 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fa2efaa-eac2-4dbe-8e97-0c053d3f3d92-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"5fa2efaa-eac2-4dbe-8e97-0c053d3f3d92\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 13:59:40 crc kubenswrapper[4854]: I1007 13:59:40.154394 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3619fac8-bf60-4908-a9ef-bb5af339f530-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"3619fac8-bf60-4908-a9ef-bb5af339f530\") " pod="openstack/nova-cell1-conductor-0" Oct 07 13:59:40 crc kubenswrapper[4854]: I1007 13:59:40.154423 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtph2\" (UniqueName: \"kubernetes.io/projected/3619fac8-bf60-4908-a9ef-bb5af339f530-kube-api-access-jtph2\") pod \"nova-cell1-conductor-0\" (UID: \"3619fac8-bf60-4908-a9ef-bb5af339f530\") " pod="openstack/nova-cell1-conductor-0" Oct 07 13:59:40 crc kubenswrapper[4854]: I1007 13:59:40.154505 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpqjv\" (UniqueName: \"kubernetes.io/projected/5fa2efaa-eac2-4dbe-8e97-0c053d3f3d92-kube-api-access-hpqjv\") pod \"nova-cell1-novncproxy-0\" (UID: \"5fa2efaa-eac2-4dbe-8e97-0c053d3f3d92\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 13:59:40 crc kubenswrapper[4854]: I1007 13:59:40.154535 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fa2efaa-eac2-4dbe-8e97-0c053d3f3d92-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"5fa2efaa-eac2-4dbe-8e97-0c053d3f3d92\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 13:59:40 crc kubenswrapper[4854]: I1007 13:59:40.154552 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3619fac8-bf60-4908-a9ef-bb5af339f530-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"3619fac8-bf60-4908-a9ef-bb5af339f530\") " pod="openstack/nova-cell1-conductor-0" Oct 07 13:59:40 crc kubenswrapper[4854]: I1007 13:59:40.165095 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 07 13:59:40 crc kubenswrapper[4854]: I1007 13:59:40.207548 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 07 13:59:40 crc kubenswrapper[4854]: I1007 13:59:40.255256 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f226ca89-20c7-4d17-a02b-43c1c8353353-config-data\") pod \"f226ca89-20c7-4d17-a02b-43c1c8353353\" (UID: \"f226ca89-20c7-4d17-a02b-43c1c8353353\") " Oct 07 13:59:40 crc kubenswrapper[4854]: I1007 13:59:40.255344 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f226ca89-20c7-4d17-a02b-43c1c8353353-combined-ca-bundle\") pod \"f226ca89-20c7-4d17-a02b-43c1c8353353\" (UID: \"f226ca89-20c7-4d17-a02b-43c1c8353353\") " Oct 07 13:59:40 crc kubenswrapper[4854]: I1007 13:59:40.255450 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdzd6\" (UniqueName: \"kubernetes.io/projected/f226ca89-20c7-4d17-a02b-43c1c8353353-kube-api-access-bdzd6\") pod \"f226ca89-20c7-4d17-a02b-43c1c8353353\" (UID: \"f226ca89-20c7-4d17-a02b-43c1c8353353\") " Oct 07 13:59:40 crc kubenswrapper[4854]: I1007 13:59:40.256789 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3619fac8-bf60-4908-a9ef-bb5af339f530-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"3619fac8-bf60-4908-a9ef-bb5af339f530\") " pod="openstack/nova-cell1-conductor-0" Oct 07 13:59:40 crc kubenswrapper[4854]: I1007 13:59:40.256884 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtph2\" (UniqueName: \"kubernetes.io/projected/3619fac8-bf60-4908-a9ef-bb5af339f530-kube-api-access-jtph2\") pod \"nova-cell1-conductor-0\" (UID: \"3619fac8-bf60-4908-a9ef-bb5af339f530\") " pod="openstack/nova-cell1-conductor-0" Oct 07 13:59:40 crc kubenswrapper[4854]: I1007 13:59:40.256992 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpqjv\" (UniqueName: \"kubernetes.io/projected/5fa2efaa-eac2-4dbe-8e97-0c053d3f3d92-kube-api-access-hpqjv\") pod \"nova-cell1-novncproxy-0\" (UID: \"5fa2efaa-eac2-4dbe-8e97-0c053d3f3d92\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 13:59:40 crc kubenswrapper[4854]: I1007 13:59:40.257044 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fa2efaa-eac2-4dbe-8e97-0c053d3f3d92-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"5fa2efaa-eac2-4dbe-8e97-0c053d3f3d92\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 13:59:40 crc kubenswrapper[4854]: I1007 13:59:40.257066 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3619fac8-bf60-4908-a9ef-bb5af339f530-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"3619fac8-bf60-4908-a9ef-bb5af339f530\") " pod="openstack/nova-cell1-conductor-0" Oct 07 13:59:40 crc kubenswrapper[4854]: I1007 13:59:40.257239 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fa2efaa-eac2-4dbe-8e97-0c053d3f3d92-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"5fa2efaa-eac2-4dbe-8e97-0c053d3f3d92\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 13:59:40 crc kubenswrapper[4854]: I1007 13:59:40.258848 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/projected/f226ca89-20c7-4d17-a02b-43c1c8353353-kube-api-access-bdzd6" (OuterVolumeSpecName: "kube-api-access-bdzd6") pod "f226ca89-20c7-4d17-a02b-43c1c8353353" (UID: "f226ca89-20c7-4d17-a02b-43c1c8353353"). InnerVolumeSpecName "kube-api-access-bdzd6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:59:40 crc kubenswrapper[4854]: I1007 13:59:40.261222 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3619fac8-bf60-4908-a9ef-bb5af339f530-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"3619fac8-bf60-4908-a9ef-bb5af339f530\") " pod="openstack/nova-cell1-conductor-0" Oct 07 13:59:40 crc kubenswrapper[4854]: I1007 13:59:40.261466 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fa2efaa-eac2-4dbe-8e97-0c053d3f3d92-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"5fa2efaa-eac2-4dbe-8e97-0c053d3f3d92\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 13:59:40 crc kubenswrapper[4854]: I1007 13:59:40.263765 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3619fac8-bf60-4908-a9ef-bb5af339f530-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"3619fac8-bf60-4908-a9ef-bb5af339f530\") " pod="openstack/nova-cell1-conductor-0" Oct 07 13:59:40 crc kubenswrapper[4854]: I1007 13:59:40.274948 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpqjv\" (UniqueName: \"kubernetes.io/projected/5fa2efaa-eac2-4dbe-8e97-0c053d3f3d92-kube-api-access-hpqjv\") pod \"nova-cell1-novncproxy-0\" (UID: \"5fa2efaa-eac2-4dbe-8e97-0c053d3f3d92\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 13:59:40 crc kubenswrapper[4854]: I1007 13:59:40.277842 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fa2efaa-eac2-4dbe-8e97-0c053d3f3d92-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"5fa2efaa-eac2-4dbe-8e97-0c053d3f3d92\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 13:59:40 crc kubenswrapper[4854]: I1007 13:59:40.288445 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtph2\" (UniqueName: \"kubernetes.io/projected/3619fac8-bf60-4908-a9ef-bb5af339f530-kube-api-access-jtph2\") pod \"nova-cell1-conductor-0\" (UID: \"3619fac8-bf60-4908-a9ef-bb5af339f530\") " pod="openstack/nova-cell1-conductor-0" Oct 07 13:59:40 crc kubenswrapper[4854]: I1007 13:59:40.291750 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f226ca89-20c7-4d17-a02b-43c1c8353353-config-data" (OuterVolumeSpecName: "config-data") pod "f226ca89-20c7-4d17-a02b-43c1c8353353" (UID: "f226ca89-20c7-4d17-a02b-43c1c8353353"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:59:40 crc kubenswrapper[4854]: I1007 13:59:40.296158 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f226ca89-20c7-4d17-a02b-43c1c8353353-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f226ca89-20c7-4d17-a02b-43c1c8353353" (UID: "f226ca89-20c7-4d17-a02b-43c1c8353353"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:59:40 crc kubenswrapper[4854]: I1007 13:59:40.358705 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f226ca89-20c7-4d17-a02b-43c1c8353353-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:59:40 crc kubenswrapper[4854]: I1007 13:59:40.359019 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f226ca89-20c7-4d17-a02b-43c1c8353353-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:59:40 crc kubenswrapper[4854]: I1007 13:59:40.359032 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdzd6\" (UniqueName: \"kubernetes.io/projected/f226ca89-20c7-4d17-a02b-43c1c8353353-kube-api-access-bdzd6\") on node \"crc\" DevicePath \"\"" Oct 07 13:59:40 crc kubenswrapper[4854]: I1007 13:59:40.451212 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 07 13:59:40 crc kubenswrapper[4854]: I1007 13:59:40.455819 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 07 13:59:40 crc kubenswrapper[4854]: I1007 13:59:40.712760 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c81fc08-b696-4c22-88fa-6842e8725af3" path="/var/lib/kubelet/pods/1c81fc08-b696-4c22-88fa-6842e8725af3/volumes" Oct 07 13:59:40 crc kubenswrapper[4854]: I1007 13:59:40.713495 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21a3f9e9-afda-4138-8c6a-c1351412b296" path="/var/lib/kubelet/pods/21a3f9e9-afda-4138-8c6a-c1351412b296/volumes" Oct 07 13:59:40 crc kubenswrapper[4854]: I1007 13:59:40.933883 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 13:59:40 crc kubenswrapper[4854]: I1007 13:59:40.995929 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 07 13:59:41 crc kubenswrapper[4854]: W1007 13:59:41.010228 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3619fac8_bf60_4908_a9ef_bb5af339f530.slice/crio-c412d7a263468f393580bb9a4d8377fcd8bbb60e81d2912fd0d0a39981e65b47 WatchSource:0}: Error finding container c412d7a263468f393580bb9a4d8377fcd8bbb60e81d2912fd0d0a39981e65b47: Status 404 returned error can't find the container with id c412d7a263468f393580bb9a4d8377fcd8bbb60e81d2912fd0d0a39981e65b47 Oct 07 13:59:41 crc kubenswrapper[4854]: I1007 13:59:41.012196 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 07 13:59:41 crc kubenswrapper[4854]: I1007 13:59:41.012139 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"f226ca89-20c7-4d17-a02b-43c1c8353353","Type":"ContainerDied","Data":"7863dcd968942cfaafd0199bf93c648d791b71b5608d66a2e662c66125e86c79"} Oct 07 13:59:41 crc kubenswrapper[4854]: I1007 13:59:41.012555 4854 scope.go:117] "RemoveContainer" containerID="ec455117753f3a6d9579cb1fb05d4848361e3fcf5d5bcd7580d80f6fbd7412fb" Oct 07 13:59:41 crc kubenswrapper[4854]: I1007 13:59:41.018264 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"5fa2efaa-eac2-4dbe-8e97-0c053d3f3d92","Type":"ContainerStarted","Data":"b54a42203925596af5121d1c9f0e40fe34a8b054fca716c422406a0cdc3824ab"} Oct 07 13:59:41 crc kubenswrapper[4854]: I1007 13:59:41.109321 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 07 13:59:41 crc kubenswrapper[4854]: I1007 13:59:41.136980 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 07 13:59:41 crc kubenswrapper[4854]: I1007 13:59:41.148284 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 07 13:59:41 crc kubenswrapper[4854]: E1007 13:59:41.148701 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f226ca89-20c7-4d17-a02b-43c1c8353353" containerName="nova-cell0-conductor-conductor" Oct 07 13:59:41 crc kubenswrapper[4854]: I1007 13:59:41.148720 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="f226ca89-20c7-4d17-a02b-43c1c8353353" containerName="nova-cell0-conductor-conductor" Oct 07 13:59:41 crc kubenswrapper[4854]: I1007 13:59:41.148962 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="f226ca89-20c7-4d17-a02b-43c1c8353353" containerName="nova-cell0-conductor-conductor" Oct 07 13:59:41 crc kubenswrapper[4854]: I1007 13:59:41.149630 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 07 13:59:41 crc kubenswrapper[4854]: I1007 13:59:41.153181 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 07 13:59:41 crc kubenswrapper[4854]: I1007 13:59:41.164852 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 07 13:59:41 crc kubenswrapper[4854]: I1007 13:59:41.173410 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/840cffec-0fba-4a84-8d36-4c0cc26cadff-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"840cffec-0fba-4a84-8d36-4c0cc26cadff\") " pod="openstack/nova-cell0-conductor-0" Oct 07 13:59:41 crc kubenswrapper[4854]: I1007 13:59:41.173552 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/840cffec-0fba-4a84-8d36-4c0cc26cadff-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"840cffec-0fba-4a84-8d36-4c0cc26cadff\") " pod="openstack/nova-cell0-conductor-0" Oct 07 13:59:41 crc kubenswrapper[4854]: I1007 13:59:41.173690 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xw5jq\" (UniqueName: \"kubernetes.io/projected/840cffec-0fba-4a84-8d36-4c0cc26cadff-kube-api-access-xw5jq\") pod \"nova-cell0-conductor-0\" (UID: \"840cffec-0fba-4a84-8d36-4c0cc26cadff\") " pod="openstack/nova-cell0-conductor-0" Oct 07 13:59:41 crc kubenswrapper[4854]: I1007 13:59:41.275500 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/840cffec-0fba-4a84-8d36-4c0cc26cadff-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"840cffec-0fba-4a84-8d36-4c0cc26cadff\") " pod="openstack/nova-cell0-conductor-0" Oct 07 13:59:41 crc kubenswrapper[4854]: I1007 13:59:41.275573 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/840cffec-0fba-4a84-8d36-4c0cc26cadff-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"840cffec-0fba-4a84-8d36-4c0cc26cadff\") " pod="openstack/nova-cell0-conductor-0" Oct 07 13:59:41 crc kubenswrapper[4854]: I1007 13:59:41.275624 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xw5jq\" (UniqueName: \"kubernetes.io/projected/840cffec-0fba-4a84-8d36-4c0cc26cadff-kube-api-access-xw5jq\") pod \"nova-cell0-conductor-0\" (UID: \"840cffec-0fba-4a84-8d36-4c0cc26cadff\") " pod="openstack/nova-cell0-conductor-0" Oct 07 13:59:41 crc kubenswrapper[4854]: I1007 13:59:41.279781 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/840cffec-0fba-4a84-8d36-4c0cc26cadff-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"840cffec-0fba-4a84-8d36-4c0cc26cadff\") " pod="openstack/nova-cell0-conductor-0" Oct 07 13:59:41 crc kubenswrapper[4854]: I1007 13:59:41.282510 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/840cffec-0fba-4a84-8d36-4c0cc26cadff-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"840cffec-0fba-4a84-8d36-4c0cc26cadff\") " pod="openstack/nova-cell0-conductor-0" Oct 07 13:59:41 crc kubenswrapper[4854]: I1007 13:59:41.292313 4854 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xw5jq\" (UniqueName: \"kubernetes.io/projected/840cffec-0fba-4a84-8d36-4c0cc26cadff-kube-api-access-xw5jq\") pod \"nova-cell0-conductor-0\" (UID: \"840cffec-0fba-4a84-8d36-4c0cc26cadff\") " pod="openstack/nova-cell0-conductor-0" Oct 07 13:59:41 crc kubenswrapper[4854]: I1007 13:59:41.468091 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 07 13:59:41 crc kubenswrapper[4854]: I1007 13:59:41.681422 4854 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="d4b62572-0d75-4e00-8215-0efe097aa5ad" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.74:8775/\": read tcp 10.217.0.2:38416->10.217.1.74:8775: read: connection reset by peer" Oct 07 13:59:41 crc kubenswrapper[4854]: I1007 13:59:41.681749 4854 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="d4b62572-0d75-4e00-8215-0efe097aa5ad" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.74:8775/\": read tcp 10.217.0.2:38420->10.217.1.74:8775: read: connection reset by peer" Oct 07 13:59:41 crc kubenswrapper[4854]: I1007 13:59:41.997088 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.061358 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"5fa2efaa-eac2-4dbe-8e97-0c053d3f3d92","Type":"ContainerStarted","Data":"164c7aabc3e5f39467bf0294fadfd4b010be9e75121df2c863ef4eff679b3157"} Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.094437 4854 generic.go:334] "Generic (PLEG): container finished" podID="d4b62572-0d75-4e00-8215-0efe097aa5ad" containerID="fd14c247db401a5f40b87b4108ce86eb56416925a695eae766f22cff61c5971b" exitCode=0 Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.094557 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d4b62572-0d75-4e00-8215-0efe097aa5ad","Type":"ContainerDied","Data":"fd14c247db401a5f40b87b4108ce86eb56416925a695eae766f22cff61c5971b"} Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.095434 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.095403117 podStartE2EDuration="2.095403117s" podCreationTimestamp="2025-10-07 13:59:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:59:42.08008637 +0000 UTC m=+5698.067918625" watchObservedRunningTime="2025-10-07 13:59:42.095403117 +0000 UTC m=+5698.083235373" Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.098504 4854 generic.go:334] "Generic (PLEG): container finished" podID="b2378d51-a1ef-4417-9ecd-5bec044c7f9a" containerID="df1043ba50c6b1deb56442ad2422456e643cafab877a18e0393b3d673e6e2cb3" exitCode=0 Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.098558 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b2378d51-a1ef-4417-9ecd-5bec044c7f9a","Type":"ContainerDied","Data":"df1043ba50c6b1deb56442ad2422456e643cafab877a18e0393b3d673e6e2cb3"} Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.098579 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"b2378d51-a1ef-4417-9ecd-5bec044c7f9a","Type":"ContainerDied","Data":"a406d83dbcde43c15156932ce780bd75be18775b030754764c099fe1c405ce47"} Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.098597 4854 scope.go:117] "RemoveContainer" containerID="df1043ba50c6b1deb56442ad2422456e643cafab877a18e0393b3d673e6e2cb3" Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.098704 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.099841 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2378d51-a1ef-4417-9ecd-5bec044c7f9a-logs\") pod \"b2378d51-a1ef-4417-9ecd-5bec044c7f9a\" (UID: \"b2378d51-a1ef-4417-9ecd-5bec044c7f9a\") " Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.099932 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2378d51-a1ef-4417-9ecd-5bec044c7f9a-combined-ca-bundle\") pod \"b2378d51-a1ef-4417-9ecd-5bec044c7f9a\" (UID: \"b2378d51-a1ef-4417-9ecd-5bec044c7f9a\") " Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.100023 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8hnk\" (UniqueName: \"kubernetes.io/projected/b2378d51-a1ef-4417-9ecd-5bec044c7f9a-kube-api-access-l8hnk\") pod \"b2378d51-a1ef-4417-9ecd-5bec044c7f9a\" (UID: \"b2378d51-a1ef-4417-9ecd-5bec044c7f9a\") " Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.100120 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2378d51-a1ef-4417-9ecd-5bec044c7f9a-config-data\") pod \"b2378d51-a1ef-4417-9ecd-5bec044c7f9a\" (UID: \"b2378d51-a1ef-4417-9ecd-5bec044c7f9a\") " Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.107782 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2378d51-a1ef-4417-9ecd-5bec044c7f9a-logs" (OuterVolumeSpecName: "logs") pod "b2378d51-a1ef-4417-9ecd-5bec044c7f9a" (UID: "b2378d51-a1ef-4417-9ecd-5bec044c7f9a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.114080 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.118478 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2378d51-a1ef-4417-9ecd-5bec044c7f9a-kube-api-access-l8hnk" (OuterVolumeSpecName: "kube-api-access-l8hnk") pod "b2378d51-a1ef-4417-9ecd-5bec044c7f9a" (UID: "b2378d51-a1ef-4417-9ecd-5bec044c7f9a"). InnerVolumeSpecName "kube-api-access-l8hnk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.119313 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"3619fac8-bf60-4908-a9ef-bb5af339f530","Type":"ContainerStarted","Data":"0748a0f6c6ff32f97816e5d32c085f155fba2d7e63486d88aea7d29d50169426"} Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.119386 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"3619fac8-bf60-4908-a9ef-bb5af339f530","Type":"ContainerStarted","Data":"c412d7a263468f393580bb9a4d8377fcd8bbb60e81d2912fd0d0a39981e65b47"} Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.120288 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.151578 4854 scope.go:117] "RemoveContainer" containerID="9042dd2e9539d42a711c3eb659343d3385a072a8789fde76452a4cf328458ddb" Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.159865 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2378d51-a1ef-4417-9ecd-5bec044c7f9a-config-data" (OuterVolumeSpecName: "config-data") pod "b2378d51-a1ef-4417-9ecd-5bec044c7f9a" (UID: "b2378d51-a1ef-4417-9ecd-5bec044c7f9a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.167622 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2378d51-a1ef-4417-9ecd-5bec044c7f9a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b2378d51-a1ef-4417-9ecd-5bec044c7f9a" (UID: "b2378d51-a1ef-4417-9ecd-5bec044c7f9a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.174761 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.174734415 podStartE2EDuration="2.174734415s" podCreationTimestamp="2025-10-07 13:59:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:59:42.15218947 +0000 UTC m=+5698.140021745" watchObservedRunningTime="2025-10-07 13:59:42.174734415 +0000 UTC m=+5698.162566660" Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.178589 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.194388 4854 scope.go:117] "RemoveContainer" containerID="df1043ba50c6b1deb56442ad2422456e643cafab877a18e0393b3d673e6e2cb3" Oct 07 13:59:42 crc kubenswrapper[4854]: E1007 13:59:42.196490 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df1043ba50c6b1deb56442ad2422456e643cafab877a18e0393b3d673e6e2cb3\": container with ID starting with df1043ba50c6b1deb56442ad2422456e643cafab877a18e0393b3d673e6e2cb3 not found: ID does not exist" containerID="df1043ba50c6b1deb56442ad2422456e643cafab877a18e0393b3d673e6e2cb3" Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.196528 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df1043ba50c6b1deb56442ad2422456e643cafab877a18e0393b3d673e6e2cb3"} err="failed to get container status \"df1043ba50c6b1deb56442ad2422456e643cafab877a18e0393b3d673e6e2cb3\": rpc error: code = NotFound desc = could not find container \"df1043ba50c6b1deb56442ad2422456e643cafab877a18e0393b3d673e6e2cb3\": container with ID starting with df1043ba50c6b1deb56442ad2422456e643cafab877a18e0393b3d673e6e2cb3 not found: ID does not exist" Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.196548 4854 scope.go:117] "RemoveContainer" containerID="9042dd2e9539d42a711c3eb659343d3385a072a8789fde76452a4cf328458ddb" Oct 07 13:59:42 crc kubenswrapper[4854]: E1007 13:59:42.201408 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9042dd2e9539d42a711c3eb659343d3385a072a8789fde76452a4cf328458ddb\": container with ID starting with 9042dd2e9539d42a711c3eb659343d3385a072a8789fde76452a4cf328458ddb not found: ID does not exist" containerID="9042dd2e9539d42a711c3eb659343d3385a072a8789fde76452a4cf328458ddb" Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.201455 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9042dd2e9539d42a711c3eb659343d3385a072a8789fde76452a4cf328458ddb"} err="failed to get container status \"9042dd2e9539d42a711c3eb659343d3385a072a8789fde76452a4cf328458ddb\": rpc error: code = NotFound desc = could not find container \"9042dd2e9539d42a711c3eb659343d3385a072a8789fde76452a4cf328458ddb\": container with ID starting with 9042dd2e9539d42a711c3eb659343d3385a072a8789fde76452a4cf328458ddb not found: ID does not exist" Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.202777 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vk4sq\" (UniqueName: \"kubernetes.io/projected/d4b62572-0d75-4e00-8215-0efe097aa5ad-kube-api-access-vk4sq\") pod \"d4b62572-0d75-4e00-8215-0efe097aa5ad\" (UID: \"d4b62572-0d75-4e00-8215-0efe097aa5ad\") " Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.202858 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4b62572-0d75-4e00-8215-0efe097aa5ad-config-data\") pod \"d4b62572-0d75-4e00-8215-0efe097aa5ad\" (UID: \"d4b62572-0d75-4e00-8215-0efe097aa5ad\") " Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.203010 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4b62572-0d75-4e00-8215-0efe097aa5ad-combined-ca-bundle\") pod \"d4b62572-0d75-4e00-8215-0efe097aa5ad\" 
(UID: \"d4b62572-0d75-4e00-8215-0efe097aa5ad\") " Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.203052 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4b62572-0d75-4e00-8215-0efe097aa5ad-logs\") pod \"d4b62572-0d75-4e00-8215-0efe097aa5ad\" (UID: \"d4b62572-0d75-4e00-8215-0efe097aa5ad\") " Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.203441 4854 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2378d51-a1ef-4417-9ecd-5bec044c7f9a-logs\") on node \"crc\" DevicePath \"\"" Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.203454 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2378d51-a1ef-4417-9ecd-5bec044c7f9a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.203464 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8hnk\" (UniqueName: \"kubernetes.io/projected/b2378d51-a1ef-4417-9ecd-5bec044c7f9a-kube-api-access-l8hnk\") on node \"crc\" DevicePath \"\"" Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.203477 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2378d51-a1ef-4417-9ecd-5bec044c7f9a-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.208025 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4b62572-0d75-4e00-8215-0efe097aa5ad-kube-api-access-vk4sq" (OuterVolumeSpecName: "kube-api-access-vk4sq") pod "d4b62572-0d75-4e00-8215-0efe097aa5ad" (UID: "d4b62572-0d75-4e00-8215-0efe097aa5ad"). InnerVolumeSpecName "kube-api-access-vk4sq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.220381 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4b62572-0d75-4e00-8215-0efe097aa5ad-logs" (OuterVolumeSpecName: "logs") pod "d4b62572-0d75-4e00-8215-0efe097aa5ad" (UID: "d4b62572-0d75-4e00-8215-0efe097aa5ad"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.230537 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4b62572-0d75-4e00-8215-0efe097aa5ad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d4b62572-0d75-4e00-8215-0efe097aa5ad" (UID: "d4b62572-0d75-4e00-8215-0efe097aa5ad"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.264187 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4b62572-0d75-4e00-8215-0efe097aa5ad-config-data" (OuterVolumeSpecName: "config-data") pod "d4b62572-0d75-4e00-8215-0efe097aa5ad" (UID: "d4b62572-0d75-4e00-8215-0efe097aa5ad"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.305504 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4b62572-0d75-4e00-8215-0efe097aa5ad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.305864 4854 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4b62572-0d75-4e00-8215-0efe097aa5ad-logs\") on node \"crc\" DevicePath \"\"" Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.305875 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vk4sq\" (UniqueName: \"kubernetes.io/projected/d4b62572-0d75-4e00-8215-0efe097aa5ad-kube-api-access-vk4sq\") on node \"crc\" DevicePath \"\"" Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.305886 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4b62572-0d75-4e00-8215-0efe097aa5ad-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.496030 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.522994 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.532502 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 07 13:59:42 crc kubenswrapper[4854]: E1007 13:59:42.532953 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2378d51-a1ef-4417-9ecd-5bec044c7f9a" containerName="nova-api-api" Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.532970 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2378d51-a1ef-4417-9ecd-5bec044c7f9a" containerName="nova-api-api" Oct 07 13:59:42 crc kubenswrapper[4854]: E1007 13:59:42.532993 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4b62572-0d75-4e00-8215-0efe097aa5ad" containerName="nova-metadata-log" Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.533001 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4b62572-0d75-4e00-8215-0efe097aa5ad" containerName="nova-metadata-log" Oct 07 13:59:42 crc kubenswrapper[4854]: E1007 13:59:42.533021 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4b62572-0d75-4e00-8215-0efe097aa5ad" containerName="nova-metadata-metadata" Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.533027 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4b62572-0d75-4e00-8215-0efe097aa5ad" containerName="nova-metadata-metadata" Oct 07 13:59:42 crc kubenswrapper[4854]: E1007 13:59:42.533042 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2378d51-a1ef-4417-9ecd-5bec044c7f9a" containerName="nova-api-log" Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.533050 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2378d51-a1ef-4417-9ecd-5bec044c7f9a" containerName="nova-api-log" Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.533231 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4b62572-0d75-4e00-8215-0efe097aa5ad" containerName="nova-metadata-log" Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.533255 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2378d51-a1ef-4417-9ecd-5bec044c7f9a" containerName="nova-api-api" Oct 
07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.533265 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4b62572-0d75-4e00-8215-0efe097aa5ad" containerName="nova-metadata-metadata" Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.533275 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2378d51-a1ef-4417-9ecd-5bec044c7f9a" containerName="nova-api-log" Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.534446 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.540972 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.543837 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.612116 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3779dac-2010-491e-b67c-9e54dc96802e-config-data\") pod \"nova-api-0\" (UID: \"f3779dac-2010-491e-b67c-9e54dc96802e\") " pod="openstack/nova-api-0" Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.612346 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3779dac-2010-491e-b67c-9e54dc96802e-logs\") pod \"nova-api-0\" (UID: \"f3779dac-2010-491e-b67c-9e54dc96802e\") " pod="openstack/nova-api-0" Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.612580 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3779dac-2010-491e-b67c-9e54dc96802e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f3779dac-2010-491e-b67c-9e54dc96802e\") " pod="openstack/nova-api-0" Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.612747 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lxl4\" (UniqueName: \"kubernetes.io/projected/f3779dac-2010-491e-b67c-9e54dc96802e-kube-api-access-5lxl4\") pod \"nova-api-0\" (UID: \"f3779dac-2010-491e-b67c-9e54dc96802e\") " pod="openstack/nova-api-0" Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.713535 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3779dac-2010-491e-b67c-9e54dc96802e-config-data\") pod \"nova-api-0\" (UID: \"f3779dac-2010-491e-b67c-9e54dc96802e\") " pod="openstack/nova-api-0" Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.713652 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3779dac-2010-491e-b67c-9e54dc96802e-logs\") pod \"nova-api-0\" (UID: \"f3779dac-2010-491e-b67c-9e54dc96802e\") " pod="openstack/nova-api-0" Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.713696 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3779dac-2010-491e-b67c-9e54dc96802e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f3779dac-2010-491e-b67c-9e54dc96802e\") " pod="openstack/nova-api-0" Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.713748 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-5lxl4\" (UniqueName: \"kubernetes.io/projected/f3779dac-2010-491e-b67c-9e54dc96802e-kube-api-access-5lxl4\") pod \"nova-api-0\" (UID: \"f3779dac-2010-491e-b67c-9e54dc96802e\") " pod="openstack/nova-api-0" Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.715385 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3779dac-2010-491e-b67c-9e54dc96802e-logs\") pod \"nova-api-0\" (UID: \"f3779dac-2010-491e-b67c-9e54dc96802e\") " pod="openstack/nova-api-0" Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.719947 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3779dac-2010-491e-b67c-9e54dc96802e-config-data\") pod \"nova-api-0\" (UID: \"f3779dac-2010-491e-b67c-9e54dc96802e\") " pod="openstack/nova-api-0" Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.721199 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2378d51-a1ef-4417-9ecd-5bec044c7f9a" path="/var/lib/kubelet/pods/b2378d51-a1ef-4417-9ecd-5bec044c7f9a/volumes" Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.721800 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f226ca89-20c7-4d17-a02b-43c1c8353353" path="/var/lib/kubelet/pods/f226ca89-20c7-4d17-a02b-43c1c8353353/volumes" Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.725821 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3779dac-2010-491e-b67c-9e54dc96802e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f3779dac-2010-491e-b67c-9e54dc96802e\") " pod="openstack/nova-api-0" Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.739075 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lxl4\" (UniqueName: \"kubernetes.io/projected/f3779dac-2010-491e-b67c-9e54dc96802e-kube-api-access-5lxl4\") pod \"nova-api-0\" (UID: \"f3779dac-2010-491e-b67c-9e54dc96802e\") " pod="openstack/nova-api-0" Oct 07 13:59:42 crc kubenswrapper[4854]: I1007 13:59:42.894049 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 07 13:59:43 crc kubenswrapper[4854]: I1007 13:59:43.198966 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d4b62572-0d75-4e00-8215-0efe097aa5ad","Type":"ContainerDied","Data":"aa74902a2f656ce8452abc3b1732d7299c1a7c34f5c09209a42ca96f15f2254f"} Oct 07 13:59:43 crc kubenswrapper[4854]: I1007 13:59:43.199028 4854 scope.go:117] "RemoveContainer" containerID="fd14c247db401a5f40b87b4108ce86eb56416925a695eae766f22cff61c5971b" Oct 07 13:59:43 crc kubenswrapper[4854]: I1007 13:59:43.199224 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 13:59:43 crc kubenswrapper[4854]: I1007 13:59:43.212390 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"840cffec-0fba-4a84-8d36-4c0cc26cadff","Type":"ContainerStarted","Data":"1c7fea179c8c721785e4fd31cd968128bd001e619047793a11f3cbc54a0a86f7"} Oct 07 13:59:43 crc kubenswrapper[4854]: I1007 13:59:43.212685 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"840cffec-0fba-4a84-8d36-4c0cc26cadff","Type":"ContainerStarted","Data":"38e96b0c3bf2de8356a585173b43064d5621456d59e140af636ecbd7eca371be"} Oct 07 13:59:43 crc kubenswrapper[4854]: I1007 13:59:43.213460 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 07 13:59:43 crc kubenswrapper[4854]: I1007 13:59:43.214865 4854 generic.go:334] "Generic (PLEG): container finished" podID="a7139c6b-fad9-41da-ac5e-feff21db38c1" containerID="5ace9a77fcd32675abcbf2472d8da6f143300f2ff2d882d4f0ace0aa49752d15" exitCode=0 Oct 07 13:59:43 crc kubenswrapper[4854]: I1007 13:59:43.215279 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a7139c6b-fad9-41da-ac5e-feff21db38c1","Type":"ContainerDied","Data":"5ace9a77fcd32675abcbf2472d8da6f143300f2ff2d882d4f0ace0aa49752d15"} Oct 07 13:59:43 crc kubenswrapper[4854]: I1007 13:59:43.245980 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 13:59:43 crc kubenswrapper[4854]: I1007 13:59:43.276414 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 13:59:43 crc kubenswrapper[4854]: I1007 13:59:43.283789 4854 scope.go:117] "RemoveContainer" containerID="05f14f52023bcab9f989f9f9684117c21d02a70e42ba64c95a1844b38efedcb3" Oct 07 13:59:43 crc kubenswrapper[4854]: I1007 13:59:43.294873 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.294845767 podStartE2EDuration="2.294845767s" podCreationTimestamp="2025-10-07 13:59:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:59:43.246812274 +0000 UTC m=+5699.234644529" watchObservedRunningTime="2025-10-07 13:59:43.294845767 +0000 UTC m=+5699.282678022" Oct 07 13:59:43 crc kubenswrapper[4854]: I1007 13:59:43.299338 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 07 13:59:43 crc kubenswrapper[4854]: I1007 13:59:43.301720 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 13:59:43 crc kubenswrapper[4854]: I1007 13:59:43.305551 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 07 13:59:43 crc kubenswrapper[4854]: I1007 13:59:43.320285 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 13:59:43 crc kubenswrapper[4854]: I1007 13:59:43.338363 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4znxl\" (UniqueName: \"kubernetes.io/projected/54138e60-1a8c-4f4d-8179-35716959b0b2-kube-api-access-4znxl\") pod \"nova-metadata-0\" (UID: \"54138e60-1a8c-4f4d-8179-35716959b0b2\") " pod="openstack/nova-metadata-0" Oct 07 13:59:43 crc kubenswrapper[4854]: I1007 13:59:43.338516 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54138e60-1a8c-4f4d-8179-35716959b0b2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"54138e60-1a8c-4f4d-8179-35716959b0b2\") " pod="openstack/nova-metadata-0" Oct 07 13:59:43 crc kubenswrapper[4854]: I1007 13:59:43.338549 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54138e60-1a8c-4f4d-8179-35716959b0b2-logs\") pod \"nova-metadata-0\" (UID: \"54138e60-1a8c-4f4d-8179-35716959b0b2\") " pod="openstack/nova-metadata-0" Oct 07 13:59:43 crc kubenswrapper[4854]: I1007 13:59:43.338577 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54138e60-1a8c-4f4d-8179-35716959b0b2-config-data\") pod \"nova-metadata-0\" (UID: \"54138e60-1a8c-4f4d-8179-35716959b0b2\") " pod="openstack/nova-metadata-0" Oct 07 13:59:43 crc kubenswrapper[4854]: I1007 13:59:43.363249 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 07 13:59:43 crc kubenswrapper[4854]: W1007 13:59:43.380526 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3779dac_2010_491e_b67c_9e54dc96802e.slice/crio-df9e3db0ac52b6bbc01b374026bfaa4eeb4cadfc26ac4e427952e0237bb2b3f7 WatchSource:0}: Error finding container df9e3db0ac52b6bbc01b374026bfaa4eeb4cadfc26ac4e427952e0237bb2b3f7: Status 404 returned error can't find the container with id df9e3db0ac52b6bbc01b374026bfaa4eeb4cadfc26ac4e427952e0237bb2b3f7 Oct 07 13:59:43 crc kubenswrapper[4854]: I1007 13:59:43.440089 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4znxl\" (UniqueName: \"kubernetes.io/projected/54138e60-1a8c-4f4d-8179-35716959b0b2-kube-api-access-4znxl\") pod \"nova-metadata-0\" (UID: \"54138e60-1a8c-4f4d-8179-35716959b0b2\") " pod="openstack/nova-metadata-0" Oct 07 13:59:43 crc kubenswrapper[4854]: I1007 13:59:43.440377 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54138e60-1a8c-4f4d-8179-35716959b0b2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"54138e60-1a8c-4f4d-8179-35716959b0b2\") " pod="openstack/nova-metadata-0" Oct 07 13:59:43 crc kubenswrapper[4854]: I1007 13:59:43.440457 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/54138e60-1a8c-4f4d-8179-35716959b0b2-logs\") pod \"nova-metadata-0\" (UID: \"54138e60-1a8c-4f4d-8179-35716959b0b2\") " pod="openstack/nova-metadata-0" Oct 07 13:59:43 crc kubenswrapper[4854]: I1007 13:59:43.440537 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54138e60-1a8c-4f4d-8179-35716959b0b2-config-data\") pod \"nova-metadata-0\" (UID: \"54138e60-1a8c-4f4d-8179-35716959b0b2\") " pod="openstack/nova-metadata-0" Oct 07 13:59:43 crc kubenswrapper[4854]: I1007 13:59:43.441436 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54138e60-1a8c-4f4d-8179-35716959b0b2-logs\") pod \"nova-metadata-0\" (UID: \"54138e60-1a8c-4f4d-8179-35716959b0b2\") " pod="openstack/nova-metadata-0" Oct 07 13:59:43 crc kubenswrapper[4854]: I1007 13:59:43.446124 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54138e60-1a8c-4f4d-8179-35716959b0b2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"54138e60-1a8c-4f4d-8179-35716959b0b2\") " pod="openstack/nova-metadata-0" Oct 07 13:59:43 crc kubenswrapper[4854]: I1007 13:59:43.446806 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54138e60-1a8c-4f4d-8179-35716959b0b2-config-data\") pod \"nova-metadata-0\" (UID: \"54138e60-1a8c-4f4d-8179-35716959b0b2\") " pod="openstack/nova-metadata-0" Oct 07 13:59:43 crc kubenswrapper[4854]: I1007 13:59:43.460255 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4znxl\" (UniqueName: \"kubernetes.io/projected/54138e60-1a8c-4f4d-8179-35716959b0b2-kube-api-access-4znxl\") pod \"nova-metadata-0\" (UID: \"54138e60-1a8c-4f4d-8179-35716959b0b2\") " pod="openstack/nova-metadata-0" Oct 07 13:59:43 crc kubenswrapper[4854]: E1007 13:59:43.486532 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5ace9a77fcd32675abcbf2472d8da6f143300f2ff2d882d4f0ace0aa49752d15 is running failed: container process not found" containerID="5ace9a77fcd32675abcbf2472d8da6f143300f2ff2d882d4f0ace0aa49752d15" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 07 13:59:43 crc kubenswrapper[4854]: E1007 13:59:43.486727 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5ace9a77fcd32675abcbf2472d8da6f143300f2ff2d882d4f0ace0aa49752d15 is running failed: container process not found" containerID="5ace9a77fcd32675abcbf2472d8da6f143300f2ff2d882d4f0ace0aa49752d15" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 07 13:59:43 crc kubenswrapper[4854]: E1007 13:59:43.487113 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5ace9a77fcd32675abcbf2472d8da6f143300f2ff2d882d4f0ace0aa49752d15 is running failed: container process not found" containerID="5ace9a77fcd32675abcbf2472d8da6f143300f2ff2d882d4f0ace0aa49752d15" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 07 13:59:43 crc kubenswrapper[4854]: E1007 13:59:43.487160 4854 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
5ace9a77fcd32675abcbf2472d8da6f143300f2ff2d882d4f0ace0aa49752d15 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="a7139c6b-fad9-41da-ac5e-feff21db38c1" containerName="nova-scheduler-scheduler" Oct 07 13:59:43 crc kubenswrapper[4854]: I1007 13:59:43.539089 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 13:59:43 crc kubenswrapper[4854]: I1007 13:59:43.623111 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 13:59:43 crc kubenswrapper[4854]: I1007 13:59:43.643826 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7139c6b-fad9-41da-ac5e-feff21db38c1-combined-ca-bundle\") pod \"a7139c6b-fad9-41da-ac5e-feff21db38c1\" (UID: \"a7139c6b-fad9-41da-ac5e-feff21db38c1\") " Oct 07 13:59:43 crc kubenswrapper[4854]: I1007 13:59:43.644051 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2jsq\" (UniqueName: \"kubernetes.io/projected/a7139c6b-fad9-41da-ac5e-feff21db38c1-kube-api-access-t2jsq\") pod \"a7139c6b-fad9-41da-ac5e-feff21db38c1\" (UID: \"a7139c6b-fad9-41da-ac5e-feff21db38c1\") " Oct 07 13:59:43 crc kubenswrapper[4854]: I1007 13:59:43.644127 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7139c6b-fad9-41da-ac5e-feff21db38c1-config-data\") pod \"a7139c6b-fad9-41da-ac5e-feff21db38c1\" (UID: \"a7139c6b-fad9-41da-ac5e-feff21db38c1\") " Oct 07 13:59:43 crc kubenswrapper[4854]: I1007 13:59:43.650103 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7139c6b-fad9-41da-ac5e-feff21db38c1-kube-api-access-t2jsq" (OuterVolumeSpecName: "kube-api-access-t2jsq") pod "a7139c6b-fad9-41da-ac5e-feff21db38c1" (UID: "a7139c6b-fad9-41da-ac5e-feff21db38c1"). InnerVolumeSpecName "kube-api-access-t2jsq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:59:43 crc kubenswrapper[4854]: I1007 13:59:43.678436 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7139c6b-fad9-41da-ac5e-feff21db38c1-config-data" (OuterVolumeSpecName: "config-data") pod "a7139c6b-fad9-41da-ac5e-feff21db38c1" (UID: "a7139c6b-fad9-41da-ac5e-feff21db38c1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:59:43 crc kubenswrapper[4854]: I1007 13:59:43.679414 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7139c6b-fad9-41da-ac5e-feff21db38c1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a7139c6b-fad9-41da-ac5e-feff21db38c1" (UID: "a7139c6b-fad9-41da-ac5e-feff21db38c1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:59:43 crc kubenswrapper[4854]: I1007 13:59:43.746477 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2jsq\" (UniqueName: \"kubernetes.io/projected/a7139c6b-fad9-41da-ac5e-feff21db38c1-kube-api-access-t2jsq\") on node \"crc\" DevicePath \"\"" Oct 07 13:59:43 crc kubenswrapper[4854]: I1007 13:59:43.746502 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7139c6b-fad9-41da-ac5e-feff21db38c1-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:59:43 crc kubenswrapper[4854]: I1007 13:59:43.746512 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7139c6b-fad9-41da-ac5e-feff21db38c1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:59:44 crc kubenswrapper[4854]: I1007 13:59:44.082554 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 13:59:44 crc kubenswrapper[4854]: W1007 13:59:44.110922 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod54138e60_1a8c_4f4d_8179_35716959b0b2.slice/crio-1d19d093543669e298353b699bdccde259e990288064bca6ece4b76897388f04 WatchSource:0}: Error finding container 1d19d093543669e298353b699bdccde259e990288064bca6ece4b76897388f04: Status 404 returned error can't find the container with id 1d19d093543669e298353b699bdccde259e990288064bca6ece4b76897388f04 Oct 07 13:59:44 crc kubenswrapper[4854]: I1007 13:59:44.230824 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a7139c6b-fad9-41da-ac5e-feff21db38c1","Type":"ContainerDied","Data":"ff915f63ee8e4338f958b2e60dbfd7f8b0ab99f5271cff109e4489ff417f2e25"} Oct 07 13:59:44 crc kubenswrapper[4854]: I1007 13:59:44.230887 4854 scope.go:117] "RemoveContainer" containerID="5ace9a77fcd32675abcbf2472d8da6f143300f2ff2d882d4f0ace0aa49752d15" Oct 07 13:59:44 crc kubenswrapper[4854]: I1007 13:59:44.231983 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 13:59:44 crc kubenswrapper[4854]: I1007 13:59:44.235201 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f3779dac-2010-491e-b67c-9e54dc96802e","Type":"ContainerStarted","Data":"5b7f6ddecbcef7acbabcf085221c456d859aa45dac3b562ade0d34ef256b53bc"} Oct 07 13:59:44 crc kubenswrapper[4854]: I1007 13:59:44.235243 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f3779dac-2010-491e-b67c-9e54dc96802e","Type":"ContainerStarted","Data":"11763ec9a98b94affeaab04f64e454d4111a5c368421f72492b914777447ad35"} Oct 07 13:59:44 crc kubenswrapper[4854]: I1007 13:59:44.235258 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f3779dac-2010-491e-b67c-9e54dc96802e","Type":"ContainerStarted","Data":"df9e3db0ac52b6bbc01b374026bfaa4eeb4cadfc26ac4e427952e0237bb2b3f7"} Oct 07 13:59:44 crc kubenswrapper[4854]: I1007 13:59:44.244486 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"54138e60-1a8c-4f4d-8179-35716959b0b2","Type":"ContainerStarted","Data":"1d19d093543669e298353b699bdccde259e990288064bca6ece4b76897388f04"} Oct 07 13:59:44 crc kubenswrapper[4854]: I1007 13:59:44.282497 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 13:59:44 crc kubenswrapper[4854]: I1007 13:59:44.290480 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 13:59:44 crc kubenswrapper[4854]: I1007 13:59:44.315737 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 13:59:44 crc kubenswrapper[4854]: E1007 13:59:44.316500 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7139c6b-fad9-41da-ac5e-feff21db38c1" containerName="nova-scheduler-scheduler" Oct 07 13:59:44 crc kubenswrapper[4854]: I1007 13:59:44.316520 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7139c6b-fad9-41da-ac5e-feff21db38c1" containerName="nova-scheduler-scheduler" Oct 07 13:59:44 crc kubenswrapper[4854]: I1007 13:59:44.316745 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7139c6b-fad9-41da-ac5e-feff21db38c1" containerName="nova-scheduler-scheduler" Oct 07 13:59:44 crc kubenswrapper[4854]: I1007 13:59:44.317844 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 13:59:44 crc kubenswrapper[4854]: I1007 13:59:44.323588 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 07 13:59:44 crc kubenswrapper[4854]: I1007 13:59:44.346557 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 13:59:44 crc kubenswrapper[4854]: I1007 13:59:44.459633 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ccb0960-6b05-4548-9d24-65538a53bac0-config-data\") pod \"nova-scheduler-0\" (UID: \"6ccb0960-6b05-4548-9d24-65538a53bac0\") " pod="openstack/nova-scheduler-0" Oct 07 13:59:44 crc kubenswrapper[4854]: I1007 13:59:44.459749 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ccb0960-6b05-4548-9d24-65538a53bac0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6ccb0960-6b05-4548-9d24-65538a53bac0\") " pod="openstack/nova-scheduler-0" Oct 07 13:59:44 crc kubenswrapper[4854]: I1007 13:59:44.459902 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6kww\" (UniqueName: \"kubernetes.io/projected/6ccb0960-6b05-4548-9d24-65538a53bac0-kube-api-access-m6kww\") pod \"nova-scheduler-0\" (UID: \"6ccb0960-6b05-4548-9d24-65538a53bac0\") " pod="openstack/nova-scheduler-0" Oct 07 13:59:44 crc kubenswrapper[4854]: I1007 13:59:44.561904 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ccb0960-6b05-4548-9d24-65538a53bac0-config-data\") pod \"nova-scheduler-0\" (UID: \"6ccb0960-6b05-4548-9d24-65538a53bac0\") " pod="openstack/nova-scheduler-0" Oct 07 13:59:44 crc kubenswrapper[4854]: I1007 13:59:44.562031 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ccb0960-6b05-4548-9d24-65538a53bac0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6ccb0960-6b05-4548-9d24-65538a53bac0\") " pod="openstack/nova-scheduler-0" Oct 07 13:59:44 crc kubenswrapper[4854]: I1007 13:59:44.562103 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6kww\" (UniqueName: \"kubernetes.io/projected/6ccb0960-6b05-4548-9d24-65538a53bac0-kube-api-access-m6kww\") pod \"nova-scheduler-0\" (UID: \"6ccb0960-6b05-4548-9d24-65538a53bac0\") " pod="openstack/nova-scheduler-0" Oct 07 13:59:44 crc kubenswrapper[4854]: I1007 13:59:44.569021 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ccb0960-6b05-4548-9d24-65538a53bac0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6ccb0960-6b05-4548-9d24-65538a53bac0\") " pod="openstack/nova-scheduler-0" Oct 07 13:59:44 crc kubenswrapper[4854]: I1007 13:59:44.572785 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ccb0960-6b05-4548-9d24-65538a53bac0-config-data\") pod \"nova-scheduler-0\" (UID: \"6ccb0960-6b05-4548-9d24-65538a53bac0\") " pod="openstack/nova-scheduler-0" Oct 07 13:59:44 crc kubenswrapper[4854]: I1007 13:59:44.582117 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6kww\" (UniqueName: 
\"kubernetes.io/projected/6ccb0960-6b05-4548-9d24-65538a53bac0-kube-api-access-m6kww\") pod \"nova-scheduler-0\" (UID: \"6ccb0960-6b05-4548-9d24-65538a53bac0\") " pod="openstack/nova-scheduler-0" Oct 07 13:59:44 crc kubenswrapper[4854]: I1007 13:59:44.648319 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 13:59:44 crc kubenswrapper[4854]: I1007 13:59:44.726578 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7139c6b-fad9-41da-ac5e-feff21db38c1" path="/var/lib/kubelet/pods/a7139c6b-fad9-41da-ac5e-feff21db38c1/volumes" Oct 07 13:59:44 crc kubenswrapper[4854]: I1007 13:59:44.727287 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4b62572-0d75-4e00-8215-0efe097aa5ad" path="/var/lib/kubelet/pods/d4b62572-0d75-4e00-8215-0efe097aa5ad/volumes" Oct 07 13:59:45 crc kubenswrapper[4854]: I1007 13:59:45.136061 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 13:59:45 crc kubenswrapper[4854]: I1007 13:59:45.266006 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6ccb0960-6b05-4548-9d24-65538a53bac0","Type":"ContainerStarted","Data":"25e06ef36a4269fdf10ef198957a8ade6ecc8a8f037d073a2c0979b49f08fe20"} Oct 07 13:59:45 crc kubenswrapper[4854]: I1007 13:59:45.267877 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"54138e60-1a8c-4f4d-8179-35716959b0b2","Type":"ContainerStarted","Data":"576dc60cb46b84846dadae86fbc29c26242b9882698ba05c1721911ee044989b"} Oct 07 13:59:45 crc kubenswrapper[4854]: I1007 13:59:45.292337 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.292050947 podStartE2EDuration="3.292050947s" podCreationTimestamp="2025-10-07 13:59:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:59:45.289516744 +0000 UTC m=+5701.277348999" watchObservedRunningTime="2025-10-07 13:59:45.292050947 +0000 UTC m=+5701.279883202" Oct 07 13:59:45 crc kubenswrapper[4854]: I1007 13:59:45.452135 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 07 13:59:46 crc kubenswrapper[4854]: I1007 13:59:46.285315 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"54138e60-1a8c-4f4d-8179-35716959b0b2","Type":"ContainerStarted","Data":"db3ad8cc3b3119aba55866de81a6fcb5996998bd79b7382a5d19db51b3aab5b7"} Oct 07 13:59:46 crc kubenswrapper[4854]: I1007 13:59:46.290117 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6ccb0960-6b05-4548-9d24-65538a53bac0","Type":"ContainerStarted","Data":"1388830712f06a409d1bb2a6a13fb394eba9c882cd3cf368287304ee41f9a32f"} Oct 07 13:59:46 crc kubenswrapper[4854]: I1007 13:59:46.325510 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.325479731 podStartE2EDuration="3.325479731s" podCreationTimestamp="2025-10-07 13:59:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:59:46.314574399 +0000 UTC m=+5702.302406724" watchObservedRunningTime="2025-10-07 13:59:46.325479731 +0000 UTC m=+5702.313312016" Oct 07 13:59:46 crc kubenswrapper[4854]: 
I1007 13:59:46.354888 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.354859381 podStartE2EDuration="2.354859381s" podCreationTimestamp="2025-10-07 13:59:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:59:46.34047771 +0000 UTC m=+5702.328310045" watchObservedRunningTime="2025-10-07 13:59:46.354859381 +0000 UTC m=+5702.342691646" Oct 07 13:59:48 crc kubenswrapper[4854]: I1007 13:59:48.624204 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 07 13:59:48 crc kubenswrapper[4854]: I1007 13:59:48.624527 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 07 13:59:49 crc kubenswrapper[4854]: I1007 13:59:49.654958 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 07 13:59:50 crc kubenswrapper[4854]: I1007 13:59:50.451620 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Oct 07 13:59:50 crc kubenswrapper[4854]: I1007 13:59:50.466570 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Oct 07 13:59:50 crc kubenswrapper[4854]: I1007 13:59:50.507268 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 07 13:59:51 crc kubenswrapper[4854]: I1007 13:59:51.360004 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Oct 07 13:59:51 crc kubenswrapper[4854]: I1007 13:59:51.509931 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 07 13:59:52 crc kubenswrapper[4854]: I1007 13:59:52.895138 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 07 13:59:52 crc kubenswrapper[4854]: I1007 13:59:52.895669 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 07 13:59:53 crc kubenswrapper[4854]: I1007 13:59:53.624201 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 07 13:59:53 crc kubenswrapper[4854]: I1007 13:59:53.624528 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 07 13:59:53 crc kubenswrapper[4854]: I1007 13:59:53.977382 4854 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f3779dac-2010-491e-b67c-9e54dc96802e" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.86:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 07 13:59:53 crc kubenswrapper[4854]: I1007 13:59:53.978240 4854 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f3779dac-2010-491e-b67c-9e54dc96802e" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.86:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 07 13:59:54 crc kubenswrapper[4854]: I1007 13:59:54.648855 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 07 13:59:54 crc kubenswrapper[4854]: I1007 13:59:54.693281 4854 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 07 13:59:54 crc kubenswrapper[4854]: I1007 13:59:54.706491 4854 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="54138e60-1a8c-4f4d-8179-35716959b0b2" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.87:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 07 13:59:54 crc kubenswrapper[4854]: I1007 13:59:54.706446 4854 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="54138e60-1a8c-4f4d-8179-35716959b0b2" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.87:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 07 13:59:55 crc kubenswrapper[4854]: I1007 13:59:55.432289 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 07 13:59:57 crc kubenswrapper[4854]: I1007 13:59:57.628247 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 13:59:57 crc kubenswrapper[4854]: I1007 13:59:57.630237 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 07 13:59:57 crc kubenswrapper[4854]: I1007 13:59:57.632710 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 07 13:59:57 crc kubenswrapper[4854]: I1007 13:59:57.724315 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 13:59:57 crc kubenswrapper[4854]: I1007 13:59:57.780376 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da397325-9681-4562-b3fe-22cf4ef798b1-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"da397325-9681-4562-b3fe-22cf4ef798b1\") " pod="openstack/cinder-scheduler-0" Oct 07 13:59:57 crc kubenswrapper[4854]: I1007 13:59:57.780444 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da397325-9681-4562-b3fe-22cf4ef798b1-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"da397325-9681-4562-b3fe-22cf4ef798b1\") " pod="openstack/cinder-scheduler-0" Oct 07 13:59:57 crc kubenswrapper[4854]: I1007 13:59:57.780477 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da397325-9681-4562-b3fe-22cf4ef798b1-config-data\") pod \"cinder-scheduler-0\" (UID: \"da397325-9681-4562-b3fe-22cf4ef798b1\") " pod="openstack/cinder-scheduler-0" Oct 07 13:59:57 crc kubenswrapper[4854]: I1007 13:59:57.780551 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gx7p\" (UniqueName: \"kubernetes.io/projected/da397325-9681-4562-b3fe-22cf4ef798b1-kube-api-access-9gx7p\") pod \"cinder-scheduler-0\" (UID: \"da397325-9681-4562-b3fe-22cf4ef798b1\") " pod="openstack/cinder-scheduler-0" Oct 07 13:59:57 crc kubenswrapper[4854]: I1007 13:59:57.780628 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da397325-9681-4562-b3fe-22cf4ef798b1-scripts\") pod \"cinder-scheduler-0\" (UID: \"da397325-9681-4562-b3fe-22cf4ef798b1\") " 
pod="openstack/cinder-scheduler-0" Oct 07 13:59:57 crc kubenswrapper[4854]: I1007 13:59:57.780720 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/da397325-9681-4562-b3fe-22cf4ef798b1-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"da397325-9681-4562-b3fe-22cf4ef798b1\") " pod="openstack/cinder-scheduler-0" Oct 07 13:59:57 crc kubenswrapper[4854]: I1007 13:59:57.882597 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/da397325-9681-4562-b3fe-22cf4ef798b1-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"da397325-9681-4562-b3fe-22cf4ef798b1\") " pod="openstack/cinder-scheduler-0" Oct 07 13:59:57 crc kubenswrapper[4854]: I1007 13:59:57.882653 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da397325-9681-4562-b3fe-22cf4ef798b1-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"da397325-9681-4562-b3fe-22cf4ef798b1\") " pod="openstack/cinder-scheduler-0" Oct 07 13:59:57 crc kubenswrapper[4854]: I1007 13:59:57.882694 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da397325-9681-4562-b3fe-22cf4ef798b1-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"da397325-9681-4562-b3fe-22cf4ef798b1\") " pod="openstack/cinder-scheduler-0" Oct 07 13:59:57 crc kubenswrapper[4854]: I1007 13:59:57.882727 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da397325-9681-4562-b3fe-22cf4ef798b1-config-data\") pod \"cinder-scheduler-0\" (UID: \"da397325-9681-4562-b3fe-22cf4ef798b1\") " pod="openstack/cinder-scheduler-0" Oct 07 13:59:57 crc kubenswrapper[4854]: I1007 13:59:57.882735 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/da397325-9681-4562-b3fe-22cf4ef798b1-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"da397325-9681-4562-b3fe-22cf4ef798b1\") " pod="openstack/cinder-scheduler-0" Oct 07 13:59:57 crc kubenswrapper[4854]: I1007 13:59:57.882759 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gx7p\" (UniqueName: \"kubernetes.io/projected/da397325-9681-4562-b3fe-22cf4ef798b1-kube-api-access-9gx7p\") pod \"cinder-scheduler-0\" (UID: \"da397325-9681-4562-b3fe-22cf4ef798b1\") " pod="openstack/cinder-scheduler-0" Oct 07 13:59:57 crc kubenswrapper[4854]: I1007 13:59:57.882983 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da397325-9681-4562-b3fe-22cf4ef798b1-scripts\") pod \"cinder-scheduler-0\" (UID: \"da397325-9681-4562-b3fe-22cf4ef798b1\") " pod="openstack/cinder-scheduler-0" Oct 07 13:59:57 crc kubenswrapper[4854]: I1007 13:59:57.889026 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da397325-9681-4562-b3fe-22cf4ef798b1-scripts\") pod \"cinder-scheduler-0\" (UID: \"da397325-9681-4562-b3fe-22cf4ef798b1\") " pod="openstack/cinder-scheduler-0" Oct 07 13:59:57 crc kubenswrapper[4854]: I1007 13:59:57.889188 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/da397325-9681-4562-b3fe-22cf4ef798b1-config-data\") pod \"cinder-scheduler-0\" (UID: \"da397325-9681-4562-b3fe-22cf4ef798b1\") " pod="openstack/cinder-scheduler-0" Oct 07 13:59:57 crc kubenswrapper[4854]: I1007 13:59:57.890636 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da397325-9681-4562-b3fe-22cf4ef798b1-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"da397325-9681-4562-b3fe-22cf4ef798b1\") " pod="openstack/cinder-scheduler-0" Oct 07 13:59:57 crc kubenswrapper[4854]: I1007 13:59:57.912895 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gx7p\" (UniqueName: \"kubernetes.io/projected/da397325-9681-4562-b3fe-22cf4ef798b1-kube-api-access-9gx7p\") pod \"cinder-scheduler-0\" (UID: \"da397325-9681-4562-b3fe-22cf4ef798b1\") " pod="openstack/cinder-scheduler-0" Oct 07 13:59:57 crc kubenswrapper[4854]: I1007 13:59:57.919852 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da397325-9681-4562-b3fe-22cf4ef798b1-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"da397325-9681-4562-b3fe-22cf4ef798b1\") " pod="openstack/cinder-scheduler-0" Oct 07 13:59:58 crc kubenswrapper[4854]: I1007 13:59:58.011258 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 07 13:59:58 crc kubenswrapper[4854]: I1007 13:59:58.512085 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 13:59:58 crc kubenswrapper[4854]: W1007 13:59:58.518849 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda397325_9681_4562_b3fe_22cf4ef798b1.slice/crio-530801e9a65308f1069c2e13e008e3e05f28a2c584cd69507c06c5f038e33347 WatchSource:0}: Error finding container 530801e9a65308f1069c2e13e008e3e05f28a2c584cd69507c06c5f038e33347: Status 404 returned error can't find the container with id 530801e9a65308f1069c2e13e008e3e05f28a2c584cd69507c06c5f038e33347 Oct 07 13:59:58 crc kubenswrapper[4854]: I1007 13:59:58.872380 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 07 13:59:58 crc kubenswrapper[4854]: I1007 13:59:58.873103 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="5826d74d-f392-425f-8d56-7c04c7a67ed7" containerName="cinder-api-log" containerID="cri-o://39540665e851bdcde17936534d7635fc62c065a102d446e32338dfbc58e5ced0" gracePeriod=30 Oct 07 13:59:58 crc kubenswrapper[4854]: I1007 13:59:58.873250 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="5826d74d-f392-425f-8d56-7c04c7a67ed7" containerName="cinder-api" containerID="cri-o://1871b936d0f6ff5f03623762fc356d0693e8dd110d47b103caa243cd30043372" gracePeriod=30 Oct 07 13:59:59 crc kubenswrapper[4854]: I1007 13:59:59.441202 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Oct 07 13:59:59 crc kubenswrapper[4854]: I1007 13:59:59.443644 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-volume1-0" Oct 07 13:59:59 crc kubenswrapper[4854]: I1007 13:59:59.445500 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Oct 07 13:59:59 crc kubenswrapper[4854]: I1007 13:59:59.445979 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"da397325-9681-4562-b3fe-22cf4ef798b1","Type":"ContainerStarted","Data":"1c910129a1e3be84a3fba506faef90c519cc4ec252bf5968fe99f7cb28006ba6"} Oct 07 13:59:59 crc kubenswrapper[4854]: I1007 13:59:59.446021 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"da397325-9681-4562-b3fe-22cf4ef798b1","Type":"ContainerStarted","Data":"530801e9a65308f1069c2e13e008e3e05f28a2c584cd69507c06c5f038e33347"} Oct 07 13:59:59 crc kubenswrapper[4854]: I1007 13:59:59.452420 4854 generic.go:334] "Generic (PLEG): container finished" podID="5826d74d-f392-425f-8d56-7c04c7a67ed7" containerID="39540665e851bdcde17936534d7635fc62c065a102d446e32338dfbc58e5ced0" exitCode=143 Oct 07 13:59:59 crc kubenswrapper[4854]: I1007 13:59:59.452468 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5826d74d-f392-425f-8d56-7c04c7a67ed7","Type":"ContainerDied","Data":"39540665e851bdcde17936534d7635fc62c065a102d446e32338dfbc58e5ced0"} Oct 07 13:59:59 crc kubenswrapper[4854]: I1007 13:59:59.492372 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Oct 07 13:59:59 crc kubenswrapper[4854]: I1007 13:59:59.640071 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c22041bc-4e60-4e0c-8209-856fb1e2ba7a-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"c22041bc-4e60-4e0c-8209-856fb1e2ba7a\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:59:59 crc kubenswrapper[4854]: I1007 13:59:59.640129 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c22041bc-4e60-4e0c-8209-856fb1e2ba7a-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"c22041bc-4e60-4e0c-8209-856fb1e2ba7a\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:59:59 crc kubenswrapper[4854]: I1007 13:59:59.640166 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c22041bc-4e60-4e0c-8209-856fb1e2ba7a-sys\") pod \"cinder-volume-volume1-0\" (UID: \"c22041bc-4e60-4e0c-8209-856fb1e2ba7a\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:59:59 crc kubenswrapper[4854]: I1007 13:59:59.640188 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c22041bc-4e60-4e0c-8209-856fb1e2ba7a-dev\") pod \"cinder-volume-volume1-0\" (UID: \"c22041bc-4e60-4e0c-8209-856fb1e2ba7a\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:59:59 crc kubenswrapper[4854]: I1007 13:59:59.640204 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c22041bc-4e60-4e0c-8209-856fb1e2ba7a-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"c22041bc-4e60-4e0c-8209-856fb1e2ba7a\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:59:59 crc 
kubenswrapper[4854]: I1007 13:59:59.640219 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c22041bc-4e60-4e0c-8209-856fb1e2ba7a-run\") pod \"cinder-volume-volume1-0\" (UID: \"c22041bc-4e60-4e0c-8209-856fb1e2ba7a\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:59:59 crc kubenswrapper[4854]: I1007 13:59:59.640239 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/c22041bc-4e60-4e0c-8209-856fb1e2ba7a-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"c22041bc-4e60-4e0c-8209-856fb1e2ba7a\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:59:59 crc kubenswrapper[4854]: I1007 13:59:59.640287 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c22041bc-4e60-4e0c-8209-856fb1e2ba7a-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"c22041bc-4e60-4e0c-8209-856fb1e2ba7a\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:59:59 crc kubenswrapper[4854]: I1007 13:59:59.640313 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r966p\" (UniqueName: \"kubernetes.io/projected/c22041bc-4e60-4e0c-8209-856fb1e2ba7a-kube-api-access-r966p\") pod \"cinder-volume-volume1-0\" (UID: \"c22041bc-4e60-4e0c-8209-856fb1e2ba7a\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:59:59 crc kubenswrapper[4854]: I1007 13:59:59.640330 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c22041bc-4e60-4e0c-8209-856fb1e2ba7a-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"c22041bc-4e60-4e0c-8209-856fb1e2ba7a\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:59:59 crc kubenswrapper[4854]: I1007 13:59:59.640366 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c22041bc-4e60-4e0c-8209-856fb1e2ba7a-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"c22041bc-4e60-4e0c-8209-856fb1e2ba7a\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:59:59 crc kubenswrapper[4854]: I1007 13:59:59.640385 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c22041bc-4e60-4e0c-8209-856fb1e2ba7a-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"c22041bc-4e60-4e0c-8209-856fb1e2ba7a\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:59:59 crc kubenswrapper[4854]: I1007 13:59:59.640403 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/c22041bc-4e60-4e0c-8209-856fb1e2ba7a-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"c22041bc-4e60-4e0c-8209-856fb1e2ba7a\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:59:59 crc kubenswrapper[4854]: I1007 13:59:59.640431 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c22041bc-4e60-4e0c-8209-856fb1e2ba7a-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"c22041bc-4e60-4e0c-8209-856fb1e2ba7a\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:59:59 crc kubenswrapper[4854]: 
I1007 13:59:59.640465 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c22041bc-4e60-4e0c-8209-856fb1e2ba7a-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"c22041bc-4e60-4e0c-8209-856fb1e2ba7a\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:59:59 crc kubenswrapper[4854]: I1007 13:59:59.640483 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c22041bc-4e60-4e0c-8209-856fb1e2ba7a-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"c22041bc-4e60-4e0c-8209-856fb1e2ba7a\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:59:59 crc kubenswrapper[4854]: I1007 13:59:59.742700 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c22041bc-4e60-4e0c-8209-856fb1e2ba7a-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"c22041bc-4e60-4e0c-8209-856fb1e2ba7a\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:59:59 crc kubenswrapper[4854]: I1007 13:59:59.742792 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c22041bc-4e60-4e0c-8209-856fb1e2ba7a-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"c22041bc-4e60-4e0c-8209-856fb1e2ba7a\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:59:59 crc kubenswrapper[4854]: I1007 13:59:59.742818 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c22041bc-4e60-4e0c-8209-856fb1e2ba7a-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"c22041bc-4e60-4e0c-8209-856fb1e2ba7a\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:59:59 crc kubenswrapper[4854]: I1007 13:59:59.742846 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c22041bc-4e60-4e0c-8209-856fb1e2ba7a-sys\") pod \"cinder-volume-volume1-0\" (UID: \"c22041bc-4e60-4e0c-8209-856fb1e2ba7a\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:59:59 crc kubenswrapper[4854]: I1007 13:59:59.742878 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c22041bc-4e60-4e0c-8209-856fb1e2ba7a-dev\") pod \"cinder-volume-volume1-0\" (UID: \"c22041bc-4e60-4e0c-8209-856fb1e2ba7a\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:59:59 crc kubenswrapper[4854]: I1007 13:59:59.742900 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c22041bc-4e60-4e0c-8209-856fb1e2ba7a-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"c22041bc-4e60-4e0c-8209-856fb1e2ba7a\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:59:59 crc kubenswrapper[4854]: I1007 13:59:59.742920 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c22041bc-4e60-4e0c-8209-856fb1e2ba7a-run\") pod \"cinder-volume-volume1-0\" (UID: \"c22041bc-4e60-4e0c-8209-856fb1e2ba7a\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:59:59 crc kubenswrapper[4854]: I1007 13:59:59.742945 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: 
\"kubernetes.io/host-path/c22041bc-4e60-4e0c-8209-856fb1e2ba7a-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"c22041bc-4e60-4e0c-8209-856fb1e2ba7a\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:59:59 crc kubenswrapper[4854]: I1007 13:59:59.742950 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c22041bc-4e60-4e0c-8209-856fb1e2ba7a-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"c22041bc-4e60-4e0c-8209-856fb1e2ba7a\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:59:59 crc kubenswrapper[4854]: I1007 13:59:59.742996 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c22041bc-4e60-4e0c-8209-856fb1e2ba7a-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"c22041bc-4e60-4e0c-8209-856fb1e2ba7a\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:59:59 crc kubenswrapper[4854]: I1007 13:59:59.743010 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c22041bc-4e60-4e0c-8209-856fb1e2ba7a-dev\") pod \"cinder-volume-volume1-0\" (UID: \"c22041bc-4e60-4e0c-8209-856fb1e2ba7a\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:59:59 crc kubenswrapper[4854]: I1007 13:59:59.743035 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r966p\" (UniqueName: \"kubernetes.io/projected/c22041bc-4e60-4e0c-8209-856fb1e2ba7a-kube-api-access-r966p\") pod \"cinder-volume-volume1-0\" (UID: \"c22041bc-4e60-4e0c-8209-856fb1e2ba7a\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:59:59 crc kubenswrapper[4854]: I1007 13:59:59.743025 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c22041bc-4e60-4e0c-8209-856fb1e2ba7a-run\") pod \"cinder-volume-volume1-0\" (UID: \"c22041bc-4e60-4e0c-8209-856fb1e2ba7a\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:59:59 crc kubenswrapper[4854]: I1007 13:59:59.743075 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c22041bc-4e60-4e0c-8209-856fb1e2ba7a-sys\") pod \"cinder-volume-volume1-0\" (UID: \"c22041bc-4e60-4e0c-8209-856fb1e2ba7a\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:59:59 crc kubenswrapper[4854]: I1007 13:59:59.743137 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c22041bc-4e60-4e0c-8209-856fb1e2ba7a-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"c22041bc-4e60-4e0c-8209-856fb1e2ba7a\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:59:59 crc kubenswrapper[4854]: I1007 13:59:59.743253 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c22041bc-4e60-4e0c-8209-856fb1e2ba7a-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"c22041bc-4e60-4e0c-8209-856fb1e2ba7a\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:59:59 crc kubenswrapper[4854]: I1007 13:59:59.743280 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c22041bc-4e60-4e0c-8209-856fb1e2ba7a-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"c22041bc-4e60-4e0c-8209-856fb1e2ba7a\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:59:59 crc kubenswrapper[4854]: I1007 13:59:59.743306 
4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c22041bc-4e60-4e0c-8209-856fb1e2ba7a-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"c22041bc-4e60-4e0c-8209-856fb1e2ba7a\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:59:59 crc kubenswrapper[4854]: I1007 13:59:59.743347 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/c22041bc-4e60-4e0c-8209-856fb1e2ba7a-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"c22041bc-4e60-4e0c-8209-856fb1e2ba7a\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:59:59 crc kubenswrapper[4854]: I1007 13:59:59.743351 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c22041bc-4e60-4e0c-8209-856fb1e2ba7a-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"c22041bc-4e60-4e0c-8209-856fb1e2ba7a\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:59:59 crc kubenswrapper[4854]: I1007 13:59:59.743397 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/c22041bc-4e60-4e0c-8209-856fb1e2ba7a-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"c22041bc-4e60-4e0c-8209-856fb1e2ba7a\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:59:59 crc kubenswrapper[4854]: I1007 13:59:59.743468 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c22041bc-4e60-4e0c-8209-856fb1e2ba7a-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"c22041bc-4e60-4e0c-8209-856fb1e2ba7a\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:59:59 crc kubenswrapper[4854]: I1007 13:59:59.743491 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c22041bc-4e60-4e0c-8209-856fb1e2ba7a-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"c22041bc-4e60-4e0c-8209-856fb1e2ba7a\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:59:59 crc kubenswrapper[4854]: I1007 13:59:59.743565 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/c22041bc-4e60-4e0c-8209-856fb1e2ba7a-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"c22041bc-4e60-4e0c-8209-856fb1e2ba7a\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:59:59 crc kubenswrapper[4854]: I1007 13:59:59.743598 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c22041bc-4e60-4e0c-8209-856fb1e2ba7a-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"c22041bc-4e60-4e0c-8209-856fb1e2ba7a\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:59:59 crc kubenswrapper[4854]: I1007 13:59:59.743787 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c22041bc-4e60-4e0c-8209-856fb1e2ba7a-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"c22041bc-4e60-4e0c-8209-856fb1e2ba7a\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:59:59 crc kubenswrapper[4854]: I1007 13:59:59.753258 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c22041bc-4e60-4e0c-8209-856fb1e2ba7a-ceph\") pod \"cinder-volume-volume1-0\" (UID: 
\"c22041bc-4e60-4e0c-8209-856fb1e2ba7a\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:59:59 crc kubenswrapper[4854]: I1007 13:59:59.762282 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c22041bc-4e60-4e0c-8209-856fb1e2ba7a-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"c22041bc-4e60-4e0c-8209-856fb1e2ba7a\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:59:59 crc kubenswrapper[4854]: I1007 13:59:59.762396 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c22041bc-4e60-4e0c-8209-856fb1e2ba7a-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"c22041bc-4e60-4e0c-8209-856fb1e2ba7a\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:59:59 crc kubenswrapper[4854]: I1007 13:59:59.767077 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c22041bc-4e60-4e0c-8209-856fb1e2ba7a-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"c22041bc-4e60-4e0c-8209-856fb1e2ba7a\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:59:59 crc kubenswrapper[4854]: I1007 13:59:59.773599 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c22041bc-4e60-4e0c-8209-856fb1e2ba7a-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"c22041bc-4e60-4e0c-8209-856fb1e2ba7a\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:59:59 crc kubenswrapper[4854]: I1007 13:59:59.775910 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r966p\" (UniqueName: \"kubernetes.io/projected/c22041bc-4e60-4e0c-8209-856fb1e2ba7a-kube-api-access-r966p\") pod \"cinder-volume-volume1-0\" (UID: \"c22041bc-4e60-4e0c-8209-856fb1e2ba7a\") " pod="openstack/cinder-volume-volume1-0" Oct 07 13:59:59 crc kubenswrapper[4854]: I1007 13:59:59.784203 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.016652 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.018815 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.021400 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.040981 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.054984 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c03852fe-4f34-4fff-b7a4-7063ce3d2f29-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"c03852fe-4f34-4fff-b7a4-7063ce3d2f29\") " pod="openstack/cinder-backup-0" Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.055035 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c03852fe-4f34-4fff-b7a4-7063ce3d2f29-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"c03852fe-4f34-4fff-b7a4-7063ce3d2f29\") " pod="openstack/cinder-backup-0" Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.055053 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/c03852fe-4f34-4fff-b7a4-7063ce3d2f29-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"c03852fe-4f34-4fff-b7a4-7063ce3d2f29\") " pod="openstack/cinder-backup-0" Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.055079 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c03852fe-4f34-4fff-b7a4-7063ce3d2f29-etc-nvme\") pod \"cinder-backup-0\" (UID: \"c03852fe-4f34-4fff-b7a4-7063ce3d2f29\") " pod="openstack/cinder-backup-0" Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.055108 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c03852fe-4f34-4fff-b7a4-7063ce3d2f29-scripts\") pod \"cinder-backup-0\" (UID: \"c03852fe-4f34-4fff-b7a4-7063ce3d2f29\") " pod="openstack/cinder-backup-0" Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.055160 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c03852fe-4f34-4fff-b7a4-7063ce3d2f29-sys\") pod \"cinder-backup-0\" (UID: \"c03852fe-4f34-4fff-b7a4-7063ce3d2f29\") " pod="openstack/cinder-backup-0" Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.055201 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c03852fe-4f34-4fff-b7a4-7063ce3d2f29-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"c03852fe-4f34-4fff-b7a4-7063ce3d2f29\") " pod="openstack/cinder-backup-0" Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.055224 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c03852fe-4f34-4fff-b7a4-7063ce3d2f29-ceph\") pod \"cinder-backup-0\" (UID: \"c03852fe-4f34-4fff-b7a4-7063ce3d2f29\") " pod="openstack/cinder-backup-0" Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.055247 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c03852fe-4f34-4fff-b7a4-7063ce3d2f29-config-data\") pod \"cinder-backup-0\" (UID: \"c03852fe-4f34-4fff-b7a4-7063ce3d2f29\") " pod="openstack/cinder-backup-0" Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.055273 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/c03852fe-4f34-4fff-b7a4-7063ce3d2f29-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"c03852fe-4f34-4fff-b7a4-7063ce3d2f29\") " pod="openstack/cinder-backup-0" Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.055474 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c03852fe-4f34-4fff-b7a4-7063ce3d2f29-config-data-custom\") pod \"cinder-backup-0\" (UID: \"c03852fe-4f34-4fff-b7a4-7063ce3d2f29\") " pod="openstack/cinder-backup-0" Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.055508 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c03852fe-4f34-4fff-b7a4-7063ce3d2f29-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"c03852fe-4f34-4fff-b7a4-7063ce3d2f29\") " pod="openstack/cinder-backup-0" Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.055541 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c03852fe-4f34-4fff-b7a4-7063ce3d2f29-lib-modules\") pod \"cinder-backup-0\" (UID: \"c03852fe-4f34-4fff-b7a4-7063ce3d2f29\") " pod="openstack/cinder-backup-0" Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.055575 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c03852fe-4f34-4fff-b7a4-7063ce3d2f29-dev\") pod \"cinder-backup-0\" (UID: \"c03852fe-4f34-4fff-b7a4-7063ce3d2f29\") " pod="openstack/cinder-backup-0" Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.055604 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c03852fe-4f34-4fff-b7a4-7063ce3d2f29-run\") pod \"cinder-backup-0\" (UID: \"c03852fe-4f34-4fff-b7a4-7063ce3d2f29\") " pod="openstack/cinder-backup-0" Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.055641 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrntw\" (UniqueName: \"kubernetes.io/projected/c03852fe-4f34-4fff-b7a4-7063ce3d2f29-kube-api-access-hrntw\") pod \"cinder-backup-0\" (UID: \"c03852fe-4f34-4fff-b7a4-7063ce3d2f29\") " pod="openstack/cinder-backup-0" Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.145385 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330760-vtxjh"] Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.146892 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330760-vtxjh" Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.151456 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.151753 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.154701 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330760-vtxjh"] Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.157553 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c03852fe-4f34-4fff-b7a4-7063ce3d2f29-config-data-custom\") pod \"cinder-backup-0\" (UID: \"c03852fe-4f34-4fff-b7a4-7063ce3d2f29\") " pod="openstack/cinder-backup-0" Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.157590 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c03852fe-4f34-4fff-b7a4-7063ce3d2f29-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"c03852fe-4f34-4fff-b7a4-7063ce3d2f29\") " pod="openstack/cinder-backup-0" Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.157614 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c03852fe-4f34-4fff-b7a4-7063ce3d2f29-lib-modules\") pod \"cinder-backup-0\" (UID: \"c03852fe-4f34-4fff-b7a4-7063ce3d2f29\") " pod="openstack/cinder-backup-0" Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.157636 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjbb7\" (UniqueName: \"kubernetes.io/projected/6a28dc48-91f4-4fc6-854d-058aab9daf21-kube-api-access-zjbb7\") pod \"collect-profiles-29330760-vtxjh\" (UID: \"6a28dc48-91f4-4fc6-854d-058aab9daf21\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330760-vtxjh" Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.157654 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6a28dc48-91f4-4fc6-854d-058aab9daf21-secret-volume\") pod \"collect-profiles-29330760-vtxjh\" (UID: \"6a28dc48-91f4-4fc6-854d-058aab9daf21\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330760-vtxjh" Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.157677 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c03852fe-4f34-4fff-b7a4-7063ce3d2f29-dev\") pod \"cinder-backup-0\" (UID: \"c03852fe-4f34-4fff-b7a4-7063ce3d2f29\") " pod="openstack/cinder-backup-0" Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.157699 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c03852fe-4f34-4fff-b7a4-7063ce3d2f29-run\") pod \"cinder-backup-0\" (UID: \"c03852fe-4f34-4fff-b7a4-7063ce3d2f29\") " pod="openstack/cinder-backup-0" Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.157722 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrntw\" (UniqueName: 
\"kubernetes.io/projected/c03852fe-4f34-4fff-b7a4-7063ce3d2f29-kube-api-access-hrntw\") pod \"cinder-backup-0\" (UID: \"c03852fe-4f34-4fff-b7a4-7063ce3d2f29\") " pod="openstack/cinder-backup-0" Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.157748 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c03852fe-4f34-4fff-b7a4-7063ce3d2f29-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"c03852fe-4f34-4fff-b7a4-7063ce3d2f29\") " pod="openstack/cinder-backup-0" Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.157767 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c03852fe-4f34-4fff-b7a4-7063ce3d2f29-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"c03852fe-4f34-4fff-b7a4-7063ce3d2f29\") " pod="openstack/cinder-backup-0" Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.157782 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/c03852fe-4f34-4fff-b7a4-7063ce3d2f29-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"c03852fe-4f34-4fff-b7a4-7063ce3d2f29\") " pod="openstack/cinder-backup-0" Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.157800 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c03852fe-4f34-4fff-b7a4-7063ce3d2f29-etc-nvme\") pod \"cinder-backup-0\" (UID: \"c03852fe-4f34-4fff-b7a4-7063ce3d2f29\") " pod="openstack/cinder-backup-0" Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.157820 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c03852fe-4f34-4fff-b7a4-7063ce3d2f29-scripts\") pod \"cinder-backup-0\" (UID: \"c03852fe-4f34-4fff-b7a4-7063ce3d2f29\") " pod="openstack/cinder-backup-0" Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.157839 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6a28dc48-91f4-4fc6-854d-058aab9daf21-config-volume\") pod \"collect-profiles-29330760-vtxjh\" (UID: \"6a28dc48-91f4-4fc6-854d-058aab9daf21\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330760-vtxjh" Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.157864 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c03852fe-4f34-4fff-b7a4-7063ce3d2f29-sys\") pod \"cinder-backup-0\" (UID: \"c03852fe-4f34-4fff-b7a4-7063ce3d2f29\") " pod="openstack/cinder-backup-0" Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.157898 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c03852fe-4f34-4fff-b7a4-7063ce3d2f29-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"c03852fe-4f34-4fff-b7a4-7063ce3d2f29\") " pod="openstack/cinder-backup-0" Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.157915 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c03852fe-4f34-4fff-b7a4-7063ce3d2f29-ceph\") pod \"cinder-backup-0\" (UID: \"c03852fe-4f34-4fff-b7a4-7063ce3d2f29\") " pod="openstack/cinder-backup-0" Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.157936 4854 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c03852fe-4f34-4fff-b7a4-7063ce3d2f29-config-data\") pod \"cinder-backup-0\" (UID: \"c03852fe-4f34-4fff-b7a4-7063ce3d2f29\") " pod="openstack/cinder-backup-0" Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.157939 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c03852fe-4f34-4fff-b7a4-7063ce3d2f29-run\") pod \"cinder-backup-0\" (UID: \"c03852fe-4f34-4fff-b7a4-7063ce3d2f29\") " pod="openstack/cinder-backup-0" Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.157954 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/c03852fe-4f34-4fff-b7a4-7063ce3d2f29-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"c03852fe-4f34-4fff-b7a4-7063ce3d2f29\") " pod="openstack/cinder-backup-0" Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.158005 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/c03852fe-4f34-4fff-b7a4-7063ce3d2f29-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"c03852fe-4f34-4fff-b7a4-7063ce3d2f29\") " pod="openstack/cinder-backup-0" Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.158318 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c03852fe-4f34-4fff-b7a4-7063ce3d2f29-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"c03852fe-4f34-4fff-b7a4-7063ce3d2f29\") " pod="openstack/cinder-backup-0" Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.158345 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c03852fe-4f34-4fff-b7a4-7063ce3d2f29-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"c03852fe-4f34-4fff-b7a4-7063ce3d2f29\") " pod="openstack/cinder-backup-0" Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.158374 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/c03852fe-4f34-4fff-b7a4-7063ce3d2f29-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"c03852fe-4f34-4fff-b7a4-7063ce3d2f29\") " pod="openstack/cinder-backup-0" Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.159996 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c03852fe-4f34-4fff-b7a4-7063ce3d2f29-etc-nvme\") pod \"cinder-backup-0\" (UID: \"c03852fe-4f34-4fff-b7a4-7063ce3d2f29\") " pod="openstack/cinder-backup-0" Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.160489 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c03852fe-4f34-4fff-b7a4-7063ce3d2f29-lib-modules\") pod \"cinder-backup-0\" (UID: \"c03852fe-4f34-4fff-b7a4-7063ce3d2f29\") " pod="openstack/cinder-backup-0" Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.160527 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c03852fe-4f34-4fff-b7a4-7063ce3d2f29-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"c03852fe-4f34-4fff-b7a4-7063ce3d2f29\") " pod="openstack/cinder-backup-0" Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.160557 4854 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c03852fe-4f34-4fff-b7a4-7063ce3d2f29-sys\") pod \"cinder-backup-0\" (UID: \"c03852fe-4f34-4fff-b7a4-7063ce3d2f29\") " pod="openstack/cinder-backup-0" Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.160578 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c03852fe-4f34-4fff-b7a4-7063ce3d2f29-dev\") pod \"cinder-backup-0\" (UID: \"c03852fe-4f34-4fff-b7a4-7063ce3d2f29\") " pod="openstack/cinder-backup-0" Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.166245 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c03852fe-4f34-4fff-b7a4-7063ce3d2f29-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"c03852fe-4f34-4fff-b7a4-7063ce3d2f29\") " pod="openstack/cinder-backup-0" Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.176044 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c03852fe-4f34-4fff-b7a4-7063ce3d2f29-scripts\") pod \"cinder-backup-0\" (UID: \"c03852fe-4f34-4fff-b7a4-7063ce3d2f29\") " pod="openstack/cinder-backup-0" Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.176111 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c03852fe-4f34-4fff-b7a4-7063ce3d2f29-config-data\") pod \"cinder-backup-0\" (UID: \"c03852fe-4f34-4fff-b7a4-7063ce3d2f29\") " pod="openstack/cinder-backup-0" Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.176389 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c03852fe-4f34-4fff-b7a4-7063ce3d2f29-ceph\") pod \"cinder-backup-0\" (UID: \"c03852fe-4f34-4fff-b7a4-7063ce3d2f29\") " pod="openstack/cinder-backup-0" Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.187548 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrntw\" (UniqueName: \"kubernetes.io/projected/c03852fe-4f34-4fff-b7a4-7063ce3d2f29-kube-api-access-hrntw\") pod \"cinder-backup-0\" (UID: \"c03852fe-4f34-4fff-b7a4-7063ce3d2f29\") " pod="openstack/cinder-backup-0" Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.192272 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c03852fe-4f34-4fff-b7a4-7063ce3d2f29-config-data-custom\") pod \"cinder-backup-0\" (UID: \"c03852fe-4f34-4fff-b7a4-7063ce3d2f29\") " pod="openstack/cinder-backup-0" Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.259788 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6a28dc48-91f4-4fc6-854d-058aab9daf21-secret-volume\") pod \"collect-profiles-29330760-vtxjh\" (UID: \"6a28dc48-91f4-4fc6-854d-058aab9daf21\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330760-vtxjh" Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.260246 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjbb7\" (UniqueName: \"kubernetes.io/projected/6a28dc48-91f4-4fc6-854d-058aab9daf21-kube-api-access-zjbb7\") pod \"collect-profiles-29330760-vtxjh\" (UID: \"6a28dc48-91f4-4fc6-854d-058aab9daf21\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330760-vtxjh" Oct 07 14:00:00 crc 
kubenswrapper[4854]: I1007 14:00:00.260347 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6a28dc48-91f4-4fc6-854d-058aab9daf21-config-volume\") pod \"collect-profiles-29330760-vtxjh\" (UID: \"6a28dc48-91f4-4fc6-854d-058aab9daf21\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330760-vtxjh" Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.261343 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6a28dc48-91f4-4fc6-854d-058aab9daf21-config-volume\") pod \"collect-profiles-29330760-vtxjh\" (UID: \"6a28dc48-91f4-4fc6-854d-058aab9daf21\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330760-vtxjh" Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.265587 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6a28dc48-91f4-4fc6-854d-058aab9daf21-secret-volume\") pod \"collect-profiles-29330760-vtxjh\" (UID: \"6a28dc48-91f4-4fc6-854d-058aab9daf21\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330760-vtxjh" Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.277858 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjbb7\" (UniqueName: \"kubernetes.io/projected/6a28dc48-91f4-4fc6-854d-058aab9daf21-kube-api-access-zjbb7\") pod \"collect-profiles-29330760-vtxjh\" (UID: \"6a28dc48-91f4-4fc6-854d-058aab9daf21\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330760-vtxjh" Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.364008 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.383357 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330760-vtxjh" Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.509518 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.537443 4854 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 14:00:00 crc kubenswrapper[4854]: W1007 14:00:00.874901 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc03852fe_4f34_4fff_b7a4_7063ce3d2f29.slice/crio-8db9dd51ca497f0261847d4f5f809e1ab3f408f9de4156a09fc86ad86d204fed WatchSource:0}: Error finding container 8db9dd51ca497f0261847d4f5f809e1ab3f408f9de4156a09fc86ad86d204fed: Status 404 returned error can't find the container with id 8db9dd51ca497f0261847d4f5f809e1ab3f408f9de4156a09fc86ad86d204fed Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.876967 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Oct 07 14:00:00 crc kubenswrapper[4854]: I1007 14:00:00.908953 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330760-vtxjh"] Oct 07 14:00:00 crc kubenswrapper[4854]: W1007 14:00:00.912731 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a28dc48_91f4_4fc6_854d_058aab9daf21.slice/crio-7cafdb11cadb9193983dc702e7d10083fe84edf558f29a610ee2fa6198f27ab8 WatchSource:0}: Error finding container 7cafdb11cadb9193983dc702e7d10083fe84edf558f29a610ee2fa6198f27ab8: Status 404 returned error can't find the container with id 7cafdb11cadb9193983dc702e7d10083fe84edf558f29a610ee2fa6198f27ab8 Oct 07 14:00:01 crc kubenswrapper[4854]: I1007 14:00:01.485023 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"c03852fe-4f34-4fff-b7a4-7063ce3d2f29","Type":"ContainerStarted","Data":"8db9dd51ca497f0261847d4f5f809e1ab3f408f9de4156a09fc86ad86d204fed"} Oct 07 14:00:01 crc kubenswrapper[4854]: I1007 14:00:01.497842 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"da397325-9681-4562-b3fe-22cf4ef798b1","Type":"ContainerStarted","Data":"d240a18319c953df222887cbd670e5a624cc6767133fe11c0d232abcbf7a9b22"} Oct 07 14:00:01 crc kubenswrapper[4854]: I1007 14:00:01.502298 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330760-vtxjh" event={"ID":"6a28dc48-91f4-4fc6-854d-058aab9daf21","Type":"ContainerStarted","Data":"f84e1c192cd00627c4033c462e4e78c3cd6da4dcf31eeb4d0396300e389306fd"} Oct 07 14:00:01 crc kubenswrapper[4854]: I1007 14:00:01.502356 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330760-vtxjh" event={"ID":"6a28dc48-91f4-4fc6-854d-058aab9daf21","Type":"ContainerStarted","Data":"7cafdb11cadb9193983dc702e7d10083fe84edf558f29a610ee2fa6198f27ab8"} Oct 07 14:00:01 crc kubenswrapper[4854]: I1007 14:00:01.507368 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"c22041bc-4e60-4e0c-8209-856fb1e2ba7a","Type":"ContainerStarted","Data":"22b7adc11e226e0b0fd736382b0c87be597c7f69bba8c7d48249658e979b0ac4"} Oct 07 14:00:01 crc kubenswrapper[4854]: I1007 14:00:01.524076 4854 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.524050012 podStartE2EDuration="4.524050012s" podCreationTimestamp="2025-10-07 13:59:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:00:01.521525889 +0000 UTC m=+5717.509358144" watchObservedRunningTime="2025-10-07 14:00:01.524050012 +0000 UTC m=+5717.511882267" Oct 07 14:00:01 crc kubenswrapper[4854]: I1007 14:00:01.963418 4854 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="5826d74d-f392-425f-8d56-7c04c7a67ed7" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.1.82:8776/healthcheck\": read tcp 10.217.0.2:33286->10.217.1.82:8776: read: connection reset by peer" Oct 07 14:00:02 crc kubenswrapper[4854]: I1007 14:00:02.527526 4854 generic.go:334] "Generic (PLEG): container finished" podID="6a28dc48-91f4-4fc6-854d-058aab9daf21" containerID="f84e1c192cd00627c4033c462e4e78c3cd6da4dcf31eeb4d0396300e389306fd" exitCode=0 Oct 07 14:00:02 crc kubenswrapper[4854]: I1007 14:00:02.527601 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330760-vtxjh" event={"ID":"6a28dc48-91f4-4fc6-854d-058aab9daf21","Type":"ContainerDied","Data":"f84e1c192cd00627c4033c462e4e78c3cd6da4dcf31eeb4d0396300e389306fd"} Oct 07 14:00:02 crc kubenswrapper[4854]: I1007 14:00:02.537996 4854 generic.go:334] "Generic (PLEG): container finished" podID="5826d74d-f392-425f-8d56-7c04c7a67ed7" containerID="1871b936d0f6ff5f03623762fc356d0693e8dd110d47b103caa243cd30043372" exitCode=0 Oct 07 14:00:02 crc kubenswrapper[4854]: I1007 14:00:02.538263 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5826d74d-f392-425f-8d56-7c04c7a67ed7","Type":"ContainerDied","Data":"1871b936d0f6ff5f03623762fc356d0693e8dd110d47b103caa243cd30043372"} Oct 07 14:00:02 crc kubenswrapper[4854]: I1007 14:00:02.945246 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 07 14:00:02 crc kubenswrapper[4854]: I1007 14:00:02.945771 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 07 14:00:02 crc kubenswrapper[4854]: I1007 14:00:02.946322 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 07 14:00:02 crc kubenswrapper[4854]: I1007 14:00:02.946554 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 07 14:00:02 crc kubenswrapper[4854]: I1007 14:00:02.951414 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 07 14:00:02 crc kubenswrapper[4854]: I1007 14:00:02.952037 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 07 14:00:03 crc kubenswrapper[4854]: I1007 14:00:03.011813 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 07 14:00:03 crc kubenswrapper[4854]: I1007 14:00:03.629772 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 07 14:00:03 crc kubenswrapper[4854]: I1007 14:00:03.631960 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 07 14:00:03 crc kubenswrapper[4854]: I1007 
14:00:03.634370 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 07 14:00:04 crc kubenswrapper[4854]: I1007 14:00:04.565611 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330760-vtxjh" event={"ID":"6a28dc48-91f4-4fc6-854d-058aab9daf21","Type":"ContainerDied","Data":"7cafdb11cadb9193983dc702e7d10083fe84edf558f29a610ee2fa6198f27ab8"} Oct 07 14:00:04 crc kubenswrapper[4854]: I1007 14:00:04.565989 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7cafdb11cadb9193983dc702e7d10083fe84edf558f29a610ee2fa6198f27ab8" Oct 07 14:00:04 crc kubenswrapper[4854]: I1007 14:00:04.569399 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5826d74d-f392-425f-8d56-7c04c7a67ed7","Type":"ContainerDied","Data":"184db8c5630bf3f8d199019bdb64e50e8ce8fe31feb12c7859d833971cc499bd"} Oct 07 14:00:04 crc kubenswrapper[4854]: I1007 14:00:04.569462 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="184db8c5630bf3f8d199019bdb64e50e8ce8fe31feb12c7859d833971cc499bd" Oct 07 14:00:04 crc kubenswrapper[4854]: I1007 14:00:04.575593 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 07 14:00:04 crc kubenswrapper[4854]: I1007 14:00:04.649986 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330760-vtxjh" Oct 07 14:00:04 crc kubenswrapper[4854]: I1007 14:00:04.659061 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 07 14:00:04 crc kubenswrapper[4854]: I1007 14:00:04.809519 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5826d74d-f392-425f-8d56-7c04c7a67ed7-combined-ca-bundle\") pod \"5826d74d-f392-425f-8d56-7c04c7a67ed7\" (UID: \"5826d74d-f392-425f-8d56-7c04c7a67ed7\") " Oct 07 14:00:04 crc kubenswrapper[4854]: I1007 14:00:04.809789 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5826d74d-f392-425f-8d56-7c04c7a67ed7-scripts\") pod \"5826d74d-f392-425f-8d56-7c04c7a67ed7\" (UID: \"5826d74d-f392-425f-8d56-7c04c7a67ed7\") " Oct 07 14:00:04 crc kubenswrapper[4854]: I1007 14:00:04.809840 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjbb7\" (UniqueName: \"kubernetes.io/projected/6a28dc48-91f4-4fc6-854d-058aab9daf21-kube-api-access-zjbb7\") pod \"6a28dc48-91f4-4fc6-854d-058aab9daf21\" (UID: \"6a28dc48-91f4-4fc6-854d-058aab9daf21\") " Oct 07 14:00:04 crc kubenswrapper[4854]: I1007 14:00:04.809863 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6a28dc48-91f4-4fc6-854d-058aab9daf21-config-volume\") pod \"6a28dc48-91f4-4fc6-854d-058aab9daf21\" (UID: \"6a28dc48-91f4-4fc6-854d-058aab9daf21\") " Oct 07 14:00:04 crc kubenswrapper[4854]: I1007 14:00:04.809907 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5826d74d-f392-425f-8d56-7c04c7a67ed7-logs\") pod \"5826d74d-f392-425f-8d56-7c04c7a67ed7\" (UID: \"5826d74d-f392-425f-8d56-7c04c7a67ed7\") " Oct 07 14:00:04 crc kubenswrapper[4854]: 
I1007 14:00:04.809975 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5826d74d-f392-425f-8d56-7c04c7a67ed7-etc-machine-id\") pod \"5826d74d-f392-425f-8d56-7c04c7a67ed7\" (UID: \"5826d74d-f392-425f-8d56-7c04c7a67ed7\") " Oct 07 14:00:04 crc kubenswrapper[4854]: I1007 14:00:04.810006 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pjtf\" (UniqueName: \"kubernetes.io/projected/5826d74d-f392-425f-8d56-7c04c7a67ed7-kube-api-access-2pjtf\") pod \"5826d74d-f392-425f-8d56-7c04c7a67ed7\" (UID: \"5826d74d-f392-425f-8d56-7c04c7a67ed7\") " Oct 07 14:00:04 crc kubenswrapper[4854]: I1007 14:00:04.810063 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6a28dc48-91f4-4fc6-854d-058aab9daf21-secret-volume\") pod \"6a28dc48-91f4-4fc6-854d-058aab9daf21\" (UID: \"6a28dc48-91f4-4fc6-854d-058aab9daf21\") " Oct 07 14:00:04 crc kubenswrapper[4854]: I1007 14:00:04.810105 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5826d74d-f392-425f-8d56-7c04c7a67ed7-config-data-custom\") pod \"5826d74d-f392-425f-8d56-7c04c7a67ed7\" (UID: \"5826d74d-f392-425f-8d56-7c04c7a67ed7\") " Oct 07 14:00:04 crc kubenswrapper[4854]: I1007 14:00:04.810123 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5826d74d-f392-425f-8d56-7c04c7a67ed7-config-data\") pod \"5826d74d-f392-425f-8d56-7c04c7a67ed7\" (UID: \"5826d74d-f392-425f-8d56-7c04c7a67ed7\") " Oct 07 14:00:04 crc kubenswrapper[4854]: I1007 14:00:04.810632 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5826d74d-f392-425f-8d56-7c04c7a67ed7-logs" (OuterVolumeSpecName: "logs") pod "5826d74d-f392-425f-8d56-7c04c7a67ed7" (UID: "5826d74d-f392-425f-8d56-7c04c7a67ed7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:00:04 crc kubenswrapper[4854]: I1007 14:00:04.811086 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a28dc48-91f4-4fc6-854d-058aab9daf21-config-volume" (OuterVolumeSpecName: "config-volume") pod "6a28dc48-91f4-4fc6-854d-058aab9daf21" (UID: "6a28dc48-91f4-4fc6-854d-058aab9daf21"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:00:04 crc kubenswrapper[4854]: I1007 14:00:04.811091 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5826d74d-f392-425f-8d56-7c04c7a67ed7-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "5826d74d-f392-425f-8d56-7c04c7a67ed7" (UID: "5826d74d-f392-425f-8d56-7c04c7a67ed7"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 14:00:04 crc kubenswrapper[4854]: I1007 14:00:04.816042 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a28dc48-91f4-4fc6-854d-058aab9daf21-kube-api-access-zjbb7" (OuterVolumeSpecName: "kube-api-access-zjbb7") pod "6a28dc48-91f4-4fc6-854d-058aab9daf21" (UID: "6a28dc48-91f4-4fc6-854d-058aab9daf21"). InnerVolumeSpecName "kube-api-access-zjbb7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:00:04 crc kubenswrapper[4854]: I1007 14:00:04.820936 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5826d74d-f392-425f-8d56-7c04c7a67ed7-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5826d74d-f392-425f-8d56-7c04c7a67ed7" (UID: "5826d74d-f392-425f-8d56-7c04c7a67ed7"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:00:04 crc kubenswrapper[4854]: I1007 14:00:04.821560 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5826d74d-f392-425f-8d56-7c04c7a67ed7-scripts" (OuterVolumeSpecName: "scripts") pod "5826d74d-f392-425f-8d56-7c04c7a67ed7" (UID: "5826d74d-f392-425f-8d56-7c04c7a67ed7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:00:04 crc kubenswrapper[4854]: I1007 14:00:04.821745 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a28dc48-91f4-4fc6-854d-058aab9daf21-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6a28dc48-91f4-4fc6-854d-058aab9daf21" (UID: "6a28dc48-91f4-4fc6-854d-058aab9daf21"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:00:04 crc kubenswrapper[4854]: I1007 14:00:04.826421 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5826d74d-f392-425f-8d56-7c04c7a67ed7-kube-api-access-2pjtf" (OuterVolumeSpecName: "kube-api-access-2pjtf") pod "5826d74d-f392-425f-8d56-7c04c7a67ed7" (UID: "5826d74d-f392-425f-8d56-7c04c7a67ed7"). InnerVolumeSpecName "kube-api-access-2pjtf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:00:04 crc kubenswrapper[4854]: I1007 14:00:04.843558 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5826d74d-f392-425f-8d56-7c04c7a67ed7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5826d74d-f392-425f-8d56-7c04c7a67ed7" (UID: "5826d74d-f392-425f-8d56-7c04c7a67ed7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:00:04 crc kubenswrapper[4854]: I1007 14:00:04.867826 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5826d74d-f392-425f-8d56-7c04c7a67ed7-config-data" (OuterVolumeSpecName: "config-data") pod "5826d74d-f392-425f-8d56-7c04c7a67ed7" (UID: "5826d74d-f392-425f-8d56-7c04c7a67ed7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:00:04 crc kubenswrapper[4854]: I1007 14:00:04.912521 4854 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6a28dc48-91f4-4fc6-854d-058aab9daf21-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 07 14:00:04 crc kubenswrapper[4854]: I1007 14:00:04.912559 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5826d74d-f392-425f-8d56-7c04c7a67ed7-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:00:04 crc kubenswrapper[4854]: I1007 14:00:04.912570 4854 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5826d74d-f392-425f-8d56-7c04c7a67ed7-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 07 14:00:04 crc kubenswrapper[4854]: I1007 14:00:04.912579 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5826d74d-f392-425f-8d56-7c04c7a67ed7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:00:04 crc kubenswrapper[4854]: I1007 14:00:04.912587 4854 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5826d74d-f392-425f-8d56-7c04c7a67ed7-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 14:00:04 crc kubenswrapper[4854]: I1007 14:00:04.912596 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjbb7\" (UniqueName: \"kubernetes.io/projected/6a28dc48-91f4-4fc6-854d-058aab9daf21-kube-api-access-zjbb7\") on node \"crc\" DevicePath \"\"" Oct 07 14:00:04 crc kubenswrapper[4854]: I1007 14:00:04.912604 4854 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6a28dc48-91f4-4fc6-854d-058aab9daf21-config-volume\") on node \"crc\" DevicePath \"\"" Oct 07 14:00:04 crc kubenswrapper[4854]: I1007 14:00:04.912612 4854 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5826d74d-f392-425f-8d56-7c04c7a67ed7-logs\") on node \"crc\" DevicePath \"\"" Oct 07 14:00:04 crc kubenswrapper[4854]: I1007 14:00:04.912620 4854 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5826d74d-f392-425f-8d56-7c04c7a67ed7-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 07 14:00:04 crc kubenswrapper[4854]: I1007 14:00:04.912628 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pjtf\" (UniqueName: \"kubernetes.io/projected/5826d74d-f392-425f-8d56-7c04c7a67ed7-kube-api-access-2pjtf\") on node \"crc\" DevicePath \"\"" Oct 07 14:00:05 crc kubenswrapper[4854]: I1007 14:00:05.577751 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330760-vtxjh" Oct 07 14:00:05 crc kubenswrapper[4854]: I1007 14:00:05.577844 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 07 14:00:05 crc kubenswrapper[4854]: I1007 14:00:05.634064 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 07 14:00:05 crc kubenswrapper[4854]: I1007 14:00:05.647549 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 07 14:00:05 crc kubenswrapper[4854]: I1007 14:00:05.662270 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 07 14:00:05 crc kubenswrapper[4854]: E1007 14:00:05.663542 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a28dc48-91f4-4fc6-854d-058aab9daf21" containerName="collect-profiles" Oct 07 14:00:05 crc kubenswrapper[4854]: I1007 14:00:05.663681 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a28dc48-91f4-4fc6-854d-058aab9daf21" containerName="collect-profiles" Oct 07 14:00:05 crc kubenswrapper[4854]: E1007 14:00:05.663784 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5826d74d-f392-425f-8d56-7c04c7a67ed7" containerName="cinder-api-log" Oct 07 14:00:05 crc kubenswrapper[4854]: I1007 14:00:05.663854 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="5826d74d-f392-425f-8d56-7c04c7a67ed7" containerName="cinder-api-log" Oct 07 14:00:05 crc kubenswrapper[4854]: E1007 14:00:05.663943 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5826d74d-f392-425f-8d56-7c04c7a67ed7" containerName="cinder-api" Oct 07 14:00:05 crc kubenswrapper[4854]: I1007 14:00:05.664015 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="5826d74d-f392-425f-8d56-7c04c7a67ed7" containerName="cinder-api" Oct 07 14:00:05 crc kubenswrapper[4854]: I1007 14:00:05.664380 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="5826d74d-f392-425f-8d56-7c04c7a67ed7" containerName="cinder-api" Oct 07 14:00:05 crc kubenswrapper[4854]: I1007 14:00:05.664488 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="5826d74d-f392-425f-8d56-7c04c7a67ed7" containerName="cinder-api-log" Oct 07 14:00:05 crc kubenswrapper[4854]: I1007 14:00:05.664675 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a28dc48-91f4-4fc6-854d-058aab9daf21" containerName="collect-profiles" Oct 07 14:00:05 crc kubenswrapper[4854]: I1007 14:00:05.666036 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 07 14:00:05 crc kubenswrapper[4854]: I1007 14:00:05.669179 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 07 14:00:05 crc kubenswrapper[4854]: I1007 14:00:05.671923 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 07 14:00:05 crc kubenswrapper[4854]: I1007 14:00:05.734180 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330715-6dw87"] Oct 07 14:00:05 crc kubenswrapper[4854]: I1007 14:00:05.742779 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330715-6dw87"] Oct 07 14:00:05 crc kubenswrapper[4854]: I1007 14:00:05.832285 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ebc9353-0fbe-4d1f-8e95-7b3a716adc28-config-data\") pod \"cinder-api-0\" (UID: \"9ebc9353-0fbe-4d1f-8e95-7b3a716adc28\") " pod="openstack/cinder-api-0" Oct 07 14:00:05 crc kubenswrapper[4854]: I1007 14:00:05.832377 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9ebc9353-0fbe-4d1f-8e95-7b3a716adc28-config-data-custom\") pod \"cinder-api-0\" (UID: \"9ebc9353-0fbe-4d1f-8e95-7b3a716adc28\") " pod="openstack/cinder-api-0" Oct 07 14:00:05 crc kubenswrapper[4854]: I1007 14:00:05.832417 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9ebc9353-0fbe-4d1f-8e95-7b3a716adc28-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9ebc9353-0fbe-4d1f-8e95-7b3a716adc28\") " pod="openstack/cinder-api-0" Oct 07 14:00:05 crc kubenswrapper[4854]: I1007 14:00:05.832705 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ebc9353-0fbe-4d1f-8e95-7b3a716adc28-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9ebc9353-0fbe-4d1f-8e95-7b3a716adc28\") " pod="openstack/cinder-api-0" Oct 07 14:00:05 crc kubenswrapper[4854]: I1007 14:00:05.832847 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ebc9353-0fbe-4d1f-8e95-7b3a716adc28-logs\") pod \"cinder-api-0\" (UID: \"9ebc9353-0fbe-4d1f-8e95-7b3a716adc28\") " pod="openstack/cinder-api-0" Oct 07 14:00:05 crc kubenswrapper[4854]: I1007 14:00:05.832958 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ebc9353-0fbe-4d1f-8e95-7b3a716adc28-scripts\") pod \"cinder-api-0\" (UID: \"9ebc9353-0fbe-4d1f-8e95-7b3a716adc28\") " pod="openstack/cinder-api-0" Oct 07 14:00:05 crc kubenswrapper[4854]: I1007 14:00:05.833021 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwdn2\" (UniqueName: \"kubernetes.io/projected/9ebc9353-0fbe-4d1f-8e95-7b3a716adc28-kube-api-access-lwdn2\") pod \"cinder-api-0\" (UID: \"9ebc9353-0fbe-4d1f-8e95-7b3a716adc28\") " pod="openstack/cinder-api-0" Oct 07 14:00:05 crc kubenswrapper[4854]: I1007 14:00:05.935331 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/9ebc9353-0fbe-4d1f-8e95-7b3a716adc28-scripts\") pod \"cinder-api-0\" (UID: \"9ebc9353-0fbe-4d1f-8e95-7b3a716adc28\") " pod="openstack/cinder-api-0" Oct 07 14:00:05 crc kubenswrapper[4854]: I1007 14:00:05.935397 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwdn2\" (UniqueName: \"kubernetes.io/projected/9ebc9353-0fbe-4d1f-8e95-7b3a716adc28-kube-api-access-lwdn2\") pod \"cinder-api-0\" (UID: \"9ebc9353-0fbe-4d1f-8e95-7b3a716adc28\") " pod="openstack/cinder-api-0" Oct 07 14:00:05 crc kubenswrapper[4854]: I1007 14:00:05.935468 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ebc9353-0fbe-4d1f-8e95-7b3a716adc28-config-data\") pod \"cinder-api-0\" (UID: \"9ebc9353-0fbe-4d1f-8e95-7b3a716adc28\") " pod="openstack/cinder-api-0" Oct 07 14:00:05 crc kubenswrapper[4854]: I1007 14:00:05.935495 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9ebc9353-0fbe-4d1f-8e95-7b3a716adc28-config-data-custom\") pod \"cinder-api-0\" (UID: \"9ebc9353-0fbe-4d1f-8e95-7b3a716adc28\") " pod="openstack/cinder-api-0" Oct 07 14:00:05 crc kubenswrapper[4854]: I1007 14:00:05.935531 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9ebc9353-0fbe-4d1f-8e95-7b3a716adc28-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9ebc9353-0fbe-4d1f-8e95-7b3a716adc28\") " pod="openstack/cinder-api-0" Oct 07 14:00:05 crc kubenswrapper[4854]: I1007 14:00:05.935640 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ebc9353-0fbe-4d1f-8e95-7b3a716adc28-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9ebc9353-0fbe-4d1f-8e95-7b3a716adc28\") " pod="openstack/cinder-api-0" Oct 07 14:00:05 crc kubenswrapper[4854]: I1007 14:00:05.935676 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ebc9353-0fbe-4d1f-8e95-7b3a716adc28-logs\") pod \"cinder-api-0\" (UID: \"9ebc9353-0fbe-4d1f-8e95-7b3a716adc28\") " pod="openstack/cinder-api-0" Oct 07 14:00:05 crc kubenswrapper[4854]: I1007 14:00:05.936184 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ebc9353-0fbe-4d1f-8e95-7b3a716adc28-logs\") pod \"cinder-api-0\" (UID: \"9ebc9353-0fbe-4d1f-8e95-7b3a716adc28\") " pod="openstack/cinder-api-0" Oct 07 14:00:05 crc kubenswrapper[4854]: I1007 14:00:05.936226 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9ebc9353-0fbe-4d1f-8e95-7b3a716adc28-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9ebc9353-0fbe-4d1f-8e95-7b3a716adc28\") " pod="openstack/cinder-api-0" Oct 07 14:00:05 crc kubenswrapper[4854]: I1007 14:00:05.940952 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9ebc9353-0fbe-4d1f-8e95-7b3a716adc28-config-data-custom\") pod \"cinder-api-0\" (UID: \"9ebc9353-0fbe-4d1f-8e95-7b3a716adc28\") " pod="openstack/cinder-api-0" Oct 07 14:00:05 crc kubenswrapper[4854]: I1007 14:00:05.941019 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9ebc9353-0fbe-4d1f-8e95-7b3a716adc28-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9ebc9353-0fbe-4d1f-8e95-7b3a716adc28\") " pod="openstack/cinder-api-0" Oct 07 14:00:05 crc kubenswrapper[4854]: I1007 14:00:05.941027 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ebc9353-0fbe-4d1f-8e95-7b3a716adc28-scripts\") pod \"cinder-api-0\" (UID: \"9ebc9353-0fbe-4d1f-8e95-7b3a716adc28\") " pod="openstack/cinder-api-0" Oct 07 14:00:05 crc kubenswrapper[4854]: I1007 14:00:05.941438 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ebc9353-0fbe-4d1f-8e95-7b3a716adc28-config-data\") pod \"cinder-api-0\" (UID: \"9ebc9353-0fbe-4d1f-8e95-7b3a716adc28\") " pod="openstack/cinder-api-0" Oct 07 14:00:05 crc kubenswrapper[4854]: I1007 14:00:05.960598 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwdn2\" (UniqueName: \"kubernetes.io/projected/9ebc9353-0fbe-4d1f-8e95-7b3a716adc28-kube-api-access-lwdn2\") pod \"cinder-api-0\" (UID: \"9ebc9353-0fbe-4d1f-8e95-7b3a716adc28\") " pod="openstack/cinder-api-0" Oct 07 14:00:05 crc kubenswrapper[4854]: I1007 14:00:05.985825 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 07 14:00:06 crc kubenswrapper[4854]: I1007 14:00:06.722259 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42109e34-6105-420f-9841-80ce763db33c" path="/var/lib/kubelet/pods/42109e34-6105-420f-9841-80ce763db33c/volumes" Oct 07 14:00:06 crc kubenswrapper[4854]: I1007 14:00:06.724889 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5826d74d-f392-425f-8d56-7c04c7a67ed7" path="/var/lib/kubelet/pods/5826d74d-f392-425f-8d56-7c04c7a67ed7/volumes" Oct 07 14:00:06 crc kubenswrapper[4854]: I1007 14:00:06.788166 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 07 14:00:06 crc kubenswrapper[4854]: W1007 14:00:06.796538 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ebc9353_0fbe_4d1f_8e95_7b3a716adc28.slice/crio-153f1ede572644f9638bc65dd5f42910b4d8f1beee6ec19c17f1268101cd833a WatchSource:0}: Error finding container 153f1ede572644f9638bc65dd5f42910b4d8f1beee6ec19c17f1268101cd833a: Status 404 returned error can't find the container with id 153f1ede572644f9638bc65dd5f42910b4d8f1beee6ec19c17f1268101cd833a Oct 07 14:00:07 crc kubenswrapper[4854]: I1007 14:00:07.599873 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9ebc9353-0fbe-4d1f-8e95-7b3a716adc28","Type":"ContainerStarted","Data":"153f1ede572644f9638bc65dd5f42910b4d8f1beee6ec19c17f1268101cd833a"} Oct 07 14:00:08 crc kubenswrapper[4854]: I1007 14:00:08.241973 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 07 14:00:08 crc kubenswrapper[4854]: I1007 14:00:08.284660 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 14:00:08 crc kubenswrapper[4854]: I1007 14:00:08.622746 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9ebc9353-0fbe-4d1f-8e95-7b3a716adc28","Type":"ContainerStarted","Data":"2bec75de9e3b3d35f1607b7651b05f76996bd02868dd88e13364bc7e941611c2"} Oct 07 14:00:08 crc kubenswrapper[4854]: I1007 
14:00:08.623023 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="da397325-9681-4562-b3fe-22cf4ef798b1" containerName="cinder-scheduler" containerID="cri-o://1c910129a1e3be84a3fba506faef90c519cc4ec252bf5968fe99f7cb28006ba6" gracePeriod=30 Oct 07 14:00:08 crc kubenswrapper[4854]: I1007 14:00:08.623081 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="da397325-9681-4562-b3fe-22cf4ef798b1" containerName="probe" containerID="cri-o://d240a18319c953df222887cbd670e5a624cc6767133fe11c0d232abcbf7a9b22" gracePeriod=30 Oct 07 14:00:10 crc kubenswrapper[4854]: I1007 14:00:10.651574 4854 generic.go:334] "Generic (PLEG): container finished" podID="da397325-9681-4562-b3fe-22cf4ef798b1" containerID="d240a18319c953df222887cbd670e5a624cc6767133fe11c0d232abcbf7a9b22" exitCode=0 Oct 07 14:00:10 crc kubenswrapper[4854]: I1007 14:00:10.652277 4854 generic.go:334] "Generic (PLEG): container finished" podID="da397325-9681-4562-b3fe-22cf4ef798b1" containerID="1c910129a1e3be84a3fba506faef90c519cc4ec252bf5968fe99f7cb28006ba6" exitCode=0 Oct 07 14:00:10 crc kubenswrapper[4854]: I1007 14:00:10.651672 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"da397325-9681-4562-b3fe-22cf4ef798b1","Type":"ContainerDied","Data":"d240a18319c953df222887cbd670e5a624cc6767133fe11c0d232abcbf7a9b22"} Oct 07 14:00:10 crc kubenswrapper[4854]: I1007 14:00:10.652337 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"da397325-9681-4562-b3fe-22cf4ef798b1","Type":"ContainerDied","Data":"1c910129a1e3be84a3fba506faef90c519cc4ec252bf5968fe99f7cb28006ba6"} Oct 07 14:00:10 crc kubenswrapper[4854]: I1007 14:00:10.807753 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:00:10 crc kubenswrapper[4854]: I1007 14:00:10.808011 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:00:11 crc kubenswrapper[4854]: I1007 14:00:11.177123 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 07 14:00:11 crc kubenswrapper[4854]: I1007 14:00:11.255020 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da397325-9681-4562-b3fe-22cf4ef798b1-config-data\") pod \"da397325-9681-4562-b3fe-22cf4ef798b1\" (UID: \"da397325-9681-4562-b3fe-22cf4ef798b1\") " Oct 07 14:00:11 crc kubenswrapper[4854]: I1007 14:00:11.255950 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/da397325-9681-4562-b3fe-22cf4ef798b1-etc-machine-id\") pod \"da397325-9681-4562-b3fe-22cf4ef798b1\" (UID: \"da397325-9681-4562-b3fe-22cf4ef798b1\") " Oct 07 14:00:11 crc kubenswrapper[4854]: I1007 14:00:11.256026 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da397325-9681-4562-b3fe-22cf4ef798b1-scripts\") pod \"da397325-9681-4562-b3fe-22cf4ef798b1\" (UID: \"da397325-9681-4562-b3fe-22cf4ef798b1\") " Oct 07 14:00:11 crc kubenswrapper[4854]: I1007 14:00:11.256081 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gx7p\" (UniqueName: \"kubernetes.io/projected/da397325-9681-4562-b3fe-22cf4ef798b1-kube-api-access-9gx7p\") pod \"da397325-9681-4562-b3fe-22cf4ef798b1\" (UID: \"da397325-9681-4562-b3fe-22cf4ef798b1\") " Oct 07 14:00:11 crc kubenswrapper[4854]: I1007 14:00:11.256229 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da397325-9681-4562-b3fe-22cf4ef798b1-combined-ca-bundle\") pod \"da397325-9681-4562-b3fe-22cf4ef798b1\" (UID: \"da397325-9681-4562-b3fe-22cf4ef798b1\") " Oct 07 14:00:11 crc kubenswrapper[4854]: I1007 14:00:11.256280 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da397325-9681-4562-b3fe-22cf4ef798b1-config-data-custom\") pod \"da397325-9681-4562-b3fe-22cf4ef798b1\" (UID: \"da397325-9681-4562-b3fe-22cf4ef798b1\") " Oct 07 14:00:11 crc kubenswrapper[4854]: I1007 14:00:11.259698 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da397325-9681-4562-b3fe-22cf4ef798b1-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "da397325-9681-4562-b3fe-22cf4ef798b1" (UID: "da397325-9681-4562-b3fe-22cf4ef798b1"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:00:11 crc kubenswrapper[4854]: I1007 14:00:11.260465 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da397325-9681-4562-b3fe-22cf4ef798b1-scripts" (OuterVolumeSpecName: "scripts") pod "da397325-9681-4562-b3fe-22cf4ef798b1" (UID: "da397325-9681-4562-b3fe-22cf4ef798b1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:00:11 crc kubenswrapper[4854]: I1007 14:00:11.260568 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/da397325-9681-4562-b3fe-22cf4ef798b1-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "da397325-9681-4562-b3fe-22cf4ef798b1" (UID: "da397325-9681-4562-b3fe-22cf4ef798b1"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 14:00:11 crc kubenswrapper[4854]: I1007 14:00:11.266344 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da397325-9681-4562-b3fe-22cf4ef798b1-kube-api-access-9gx7p" (OuterVolumeSpecName: "kube-api-access-9gx7p") pod "da397325-9681-4562-b3fe-22cf4ef798b1" (UID: "da397325-9681-4562-b3fe-22cf4ef798b1"). InnerVolumeSpecName "kube-api-access-9gx7p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:00:11 crc kubenswrapper[4854]: I1007 14:00:11.314889 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da397325-9681-4562-b3fe-22cf4ef798b1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "da397325-9681-4562-b3fe-22cf4ef798b1" (UID: "da397325-9681-4562-b3fe-22cf4ef798b1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:00:11 crc kubenswrapper[4854]: I1007 14:00:11.358637 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da397325-9681-4562-b3fe-22cf4ef798b1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:00:11 crc kubenswrapper[4854]: I1007 14:00:11.358687 4854 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da397325-9681-4562-b3fe-22cf4ef798b1-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 07 14:00:11 crc kubenswrapper[4854]: I1007 14:00:11.358697 4854 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/da397325-9681-4562-b3fe-22cf4ef798b1-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 07 14:00:11 crc kubenswrapper[4854]: I1007 14:00:11.358707 4854 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da397325-9681-4562-b3fe-22cf4ef798b1-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 14:00:11 crc kubenswrapper[4854]: I1007 14:00:11.358716 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gx7p\" (UniqueName: \"kubernetes.io/projected/da397325-9681-4562-b3fe-22cf4ef798b1-kube-api-access-9gx7p\") on node \"crc\" DevicePath \"\"" Oct 07 14:00:11 crc kubenswrapper[4854]: I1007 14:00:11.375763 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da397325-9681-4562-b3fe-22cf4ef798b1-config-data" (OuterVolumeSpecName: "config-data") pod "da397325-9681-4562-b3fe-22cf4ef798b1" (UID: "da397325-9681-4562-b3fe-22cf4ef798b1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:00:11 crc kubenswrapper[4854]: I1007 14:00:11.460500 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da397325-9681-4562-b3fe-22cf4ef798b1-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:00:11 crc kubenswrapper[4854]: I1007 14:00:11.678791 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"c03852fe-4f34-4fff-b7a4-7063ce3d2f29","Type":"ContainerStarted","Data":"e1dc2c11075b96a67c7bf6a1b23083aae1ceeba690c9f2f433fd29d29207f6e2"} Oct 07 14:00:11 crc kubenswrapper[4854]: I1007 14:00:11.683199 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"da397325-9681-4562-b3fe-22cf4ef798b1","Type":"ContainerDied","Data":"530801e9a65308f1069c2e13e008e3e05f28a2c584cd69507c06c5f038e33347"} Oct 07 14:00:11 crc kubenswrapper[4854]: I1007 14:00:11.683312 4854 scope.go:117] "RemoveContainer" containerID="d240a18319c953df222887cbd670e5a624cc6767133fe11c0d232abcbf7a9b22" Oct 07 14:00:11 crc kubenswrapper[4854]: I1007 14:00:11.683451 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 07 14:00:11 crc kubenswrapper[4854]: I1007 14:00:11.744395 4854 scope.go:117] "RemoveContainer" containerID="1c910129a1e3be84a3fba506faef90c519cc4ec252bf5968fe99f7cb28006ba6" Oct 07 14:00:11 crc kubenswrapper[4854]: I1007 14:00:11.746958 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 14:00:11 crc kubenswrapper[4854]: I1007 14:00:11.764860 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 14:00:11 crc kubenswrapper[4854]: I1007 14:00:11.779683 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 14:00:11 crc kubenswrapper[4854]: E1007 14:00:11.780128 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da397325-9681-4562-b3fe-22cf4ef798b1" containerName="probe" Oct 07 14:00:11 crc kubenswrapper[4854]: I1007 14:00:11.780157 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="da397325-9681-4562-b3fe-22cf4ef798b1" containerName="probe" Oct 07 14:00:11 crc kubenswrapper[4854]: E1007 14:00:11.780186 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da397325-9681-4562-b3fe-22cf4ef798b1" containerName="cinder-scheduler" Oct 07 14:00:11 crc kubenswrapper[4854]: I1007 14:00:11.780194 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="da397325-9681-4562-b3fe-22cf4ef798b1" containerName="cinder-scheduler" Oct 07 14:00:11 crc kubenswrapper[4854]: I1007 14:00:11.780360 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="da397325-9681-4562-b3fe-22cf4ef798b1" containerName="cinder-scheduler" Oct 07 14:00:11 crc kubenswrapper[4854]: I1007 14:00:11.780391 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="da397325-9681-4562-b3fe-22cf4ef798b1" containerName="probe" Oct 07 14:00:11 crc kubenswrapper[4854]: I1007 14:00:11.781475 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 07 14:00:11 crc kubenswrapper[4854]: I1007 14:00:11.783410 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 07 14:00:11 crc kubenswrapper[4854]: I1007 14:00:11.787317 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 14:00:11 crc kubenswrapper[4854]: I1007 14:00:11.867932 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5a6c31e6-c617-401c-a6c5-c76f945460b7-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5a6c31e6-c617-401c-a6c5-c76f945460b7\") " pod="openstack/cinder-scheduler-0" Oct 07 14:00:11 crc kubenswrapper[4854]: I1007 14:00:11.867981 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a6c31e6-c617-401c-a6c5-c76f945460b7-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5a6c31e6-c617-401c-a6c5-c76f945460b7\") " pod="openstack/cinder-scheduler-0" Oct 07 14:00:11 crc kubenswrapper[4854]: I1007 14:00:11.868018 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a6c31e6-c617-401c-a6c5-c76f945460b7-scripts\") pod \"cinder-scheduler-0\" (UID: \"5a6c31e6-c617-401c-a6c5-c76f945460b7\") " pod="openstack/cinder-scheduler-0" Oct 07 14:00:11 crc kubenswrapper[4854]: I1007 14:00:11.868043 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a6c31e6-c617-401c-a6c5-c76f945460b7-config-data\") pod \"cinder-scheduler-0\" (UID: \"5a6c31e6-c617-401c-a6c5-c76f945460b7\") " pod="openstack/cinder-scheduler-0" Oct 07 14:00:11 crc kubenswrapper[4854]: I1007 14:00:11.868137 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5a6c31e6-c617-401c-a6c5-c76f945460b7-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5a6c31e6-c617-401c-a6c5-c76f945460b7\") " pod="openstack/cinder-scheduler-0" Oct 07 14:00:11 crc kubenswrapper[4854]: I1007 14:00:11.868511 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsljl\" (UniqueName: \"kubernetes.io/projected/5a6c31e6-c617-401c-a6c5-c76f945460b7-kube-api-access-wsljl\") pod \"cinder-scheduler-0\" (UID: \"5a6c31e6-c617-401c-a6c5-c76f945460b7\") " pod="openstack/cinder-scheduler-0" Oct 07 14:00:11 crc kubenswrapper[4854]: I1007 14:00:11.970742 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a6c31e6-c617-401c-a6c5-c76f945460b7-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5a6c31e6-c617-401c-a6c5-c76f945460b7\") " pod="openstack/cinder-scheduler-0" Oct 07 14:00:11 crc kubenswrapper[4854]: I1007 14:00:11.971170 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a6c31e6-c617-401c-a6c5-c76f945460b7-scripts\") pod \"cinder-scheduler-0\" (UID: \"5a6c31e6-c617-401c-a6c5-c76f945460b7\") " pod="openstack/cinder-scheduler-0" Oct 07 14:00:11 crc kubenswrapper[4854]: I1007 14:00:11.971205 4854 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a6c31e6-c617-401c-a6c5-c76f945460b7-config-data\") pod \"cinder-scheduler-0\" (UID: \"5a6c31e6-c617-401c-a6c5-c76f945460b7\") " pod="openstack/cinder-scheduler-0" Oct 07 14:00:11 crc kubenswrapper[4854]: I1007 14:00:11.971253 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5a6c31e6-c617-401c-a6c5-c76f945460b7-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5a6c31e6-c617-401c-a6c5-c76f945460b7\") " pod="openstack/cinder-scheduler-0" Oct 07 14:00:11 crc kubenswrapper[4854]: I1007 14:00:11.971364 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsljl\" (UniqueName: \"kubernetes.io/projected/5a6c31e6-c617-401c-a6c5-c76f945460b7-kube-api-access-wsljl\") pod \"cinder-scheduler-0\" (UID: \"5a6c31e6-c617-401c-a6c5-c76f945460b7\") " pod="openstack/cinder-scheduler-0" Oct 07 14:00:11 crc kubenswrapper[4854]: I1007 14:00:11.971431 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5a6c31e6-c617-401c-a6c5-c76f945460b7-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5a6c31e6-c617-401c-a6c5-c76f945460b7\") " pod="openstack/cinder-scheduler-0" Oct 07 14:00:11 crc kubenswrapper[4854]: I1007 14:00:11.971510 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5a6c31e6-c617-401c-a6c5-c76f945460b7-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5a6c31e6-c617-401c-a6c5-c76f945460b7\") " pod="openstack/cinder-scheduler-0" Oct 07 14:00:11 crc kubenswrapper[4854]: I1007 14:00:11.978958 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a6c31e6-c617-401c-a6c5-c76f945460b7-scripts\") pod \"cinder-scheduler-0\" (UID: \"5a6c31e6-c617-401c-a6c5-c76f945460b7\") " pod="openstack/cinder-scheduler-0" Oct 07 14:00:11 crc kubenswrapper[4854]: I1007 14:00:11.979161 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5a6c31e6-c617-401c-a6c5-c76f945460b7-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5a6c31e6-c617-401c-a6c5-c76f945460b7\") " pod="openstack/cinder-scheduler-0" Oct 07 14:00:11 crc kubenswrapper[4854]: I1007 14:00:11.979969 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a6c31e6-c617-401c-a6c5-c76f945460b7-config-data\") pod \"cinder-scheduler-0\" (UID: \"5a6c31e6-c617-401c-a6c5-c76f945460b7\") " pod="openstack/cinder-scheduler-0" Oct 07 14:00:11 crc kubenswrapper[4854]: I1007 14:00:11.985883 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a6c31e6-c617-401c-a6c5-c76f945460b7-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5a6c31e6-c617-401c-a6c5-c76f945460b7\") " pod="openstack/cinder-scheduler-0" Oct 07 14:00:11 crc kubenswrapper[4854]: I1007 14:00:11.991950 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsljl\" (UniqueName: \"kubernetes.io/projected/5a6c31e6-c617-401c-a6c5-c76f945460b7-kube-api-access-wsljl\") pod \"cinder-scheduler-0\" (UID: \"5a6c31e6-c617-401c-a6c5-c76f945460b7\") " 
pod="openstack/cinder-scheduler-0" Oct 07 14:00:12 crc kubenswrapper[4854]: I1007 14:00:12.181913 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 07 14:00:12 crc kubenswrapper[4854]: I1007 14:00:12.698355 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 14:00:12 crc kubenswrapper[4854]: I1007 14:00:12.700466 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"c03852fe-4f34-4fff-b7a4-7063ce3d2f29","Type":"ContainerStarted","Data":"aabe489a8d0c23630b3f5ceb9f22aa22de200f1ded380658d897dbdf8dbe2adf"} Oct 07 14:00:12 crc kubenswrapper[4854]: I1007 14:00:12.730782 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=3.661375676 podStartE2EDuration="13.730564001s" podCreationTimestamp="2025-10-07 13:59:59 +0000 UTC" firstStartedPulling="2025-10-07 14:00:00.877769141 +0000 UTC m=+5716.865601396" lastFinishedPulling="2025-10-07 14:00:10.946957426 +0000 UTC m=+5726.934789721" observedRunningTime="2025-10-07 14:00:12.726787453 +0000 UTC m=+5728.714619708" watchObservedRunningTime="2025-10-07 14:00:12.730564001 +0000 UTC m=+5728.718396256" Oct 07 14:00:12 crc kubenswrapper[4854]: I1007 14:00:12.731800 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da397325-9681-4562-b3fe-22cf4ef798b1" path="/var/lib/kubelet/pods/da397325-9681-4562-b3fe-22cf4ef798b1/volumes" Oct 07 14:00:12 crc kubenswrapper[4854]: I1007 14:00:12.733876 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 07 14:00:12 crc kubenswrapper[4854]: I1007 14:00:12.733923 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9ebc9353-0fbe-4d1f-8e95-7b3a716adc28","Type":"ContainerStarted","Data":"f092cb96359abe2de77b7f6e1681a1b0a7636013ab3a31fe48052aef7ddf419d"} Oct 07 14:00:12 crc kubenswrapper[4854]: I1007 14:00:12.733948 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"c22041bc-4e60-4e0c-8209-856fb1e2ba7a","Type":"ContainerStarted","Data":"7861d4269b5ed90bd203eefa806a0f851c6b6e421764ed63d0c4a41131586ca2"} Oct 07 14:00:12 crc kubenswrapper[4854]: I1007 14:00:12.733968 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"c22041bc-4e60-4e0c-8209-856fb1e2ba7a","Type":"ContainerStarted","Data":"282198e65af9a76ce876bab3724d752a5c1414b359a1a8ea97795f645187a222"} Oct 07 14:00:12 crc kubenswrapper[4854]: I1007 14:00:12.761571 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=7.761535636 podStartE2EDuration="7.761535636s" podCreationTimestamp="2025-10-07 14:00:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:00:12.74419406 +0000 UTC m=+5728.732026355" watchObservedRunningTime="2025-10-07 14:00:12.761535636 +0000 UTC m=+5728.749367891" Oct 07 14:00:12 crc kubenswrapper[4854]: I1007 14:00:12.794778 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=3.411754051 podStartE2EDuration="13.794757925s" podCreationTimestamp="2025-10-07 13:59:59 +0000 UTC" firstStartedPulling="2025-10-07 14:00:00.537069234 +0000 UTC m=+5716.524901489" 
lastFinishedPulling="2025-10-07 14:00:10.920073068 +0000 UTC m=+5726.907905363" observedRunningTime="2025-10-07 14:00:12.77777848 +0000 UTC m=+5728.765610785" watchObservedRunningTime="2025-10-07 14:00:12.794757925 +0000 UTC m=+5728.782590170" Oct 07 14:00:13 crc kubenswrapper[4854]: I1007 14:00:13.735067 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5a6c31e6-c617-401c-a6c5-c76f945460b7","Type":"ContainerStarted","Data":"3f15ef2a5775bb4c44482a704df5b300b8e4a46878c0d2e0eaa7b893f7cb76cf"} Oct 07 14:00:13 crc kubenswrapper[4854]: I1007 14:00:13.735677 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5a6c31e6-c617-401c-a6c5-c76f945460b7","Type":"ContainerStarted","Data":"97c1267c00445ababfdf4db153178eb88a21a89b46324da3969d1c68270c6aa5"} Oct 07 14:00:14 crc kubenswrapper[4854]: I1007 14:00:14.785074 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Oct 07 14:00:15 crc kubenswrapper[4854]: I1007 14:00:15.364388 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Oct 07 14:00:15 crc kubenswrapper[4854]: I1007 14:00:15.755993 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5a6c31e6-c617-401c-a6c5-c76f945460b7","Type":"ContainerStarted","Data":"9a6c245f95edaa98c032dc7bfac10f5dc27cb12d193cb3e3bd64f4781d9c2ac2"} Oct 07 14:00:15 crc kubenswrapper[4854]: I1007 14:00:15.784408 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.784379368 podStartE2EDuration="4.784379368s" podCreationTimestamp="2025-10-07 14:00:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:00:15.780869897 +0000 UTC m=+5731.768702152" watchObservedRunningTime="2025-10-07 14:00:15.784379368 +0000 UTC m=+5731.772211643" Oct 07 14:00:17 crc kubenswrapper[4854]: I1007 14:00:17.182066 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 07 14:00:20 crc kubenswrapper[4854]: I1007 14:00:20.047596 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Oct 07 14:00:20 crc kubenswrapper[4854]: I1007 14:00:20.574744 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Oct 07 14:00:20 crc kubenswrapper[4854]: I1007 14:00:20.971510 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ltlgp"] Oct 07 14:00:20 crc kubenswrapper[4854]: I1007 14:00:20.975983 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ltlgp" Oct 07 14:00:20 crc kubenswrapper[4854]: I1007 14:00:20.981703 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ltlgp"] Oct 07 14:00:21 crc kubenswrapper[4854]: I1007 14:00:21.080841 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd67c5c4-b532-4b9c-9524-425f70cc5a36-utilities\") pod \"redhat-operators-ltlgp\" (UID: \"bd67c5c4-b532-4b9c-9524-425f70cc5a36\") " pod="openshift-marketplace/redhat-operators-ltlgp" Oct 07 14:00:21 crc kubenswrapper[4854]: I1007 14:00:21.081084 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4bvc\" (UniqueName: \"kubernetes.io/projected/bd67c5c4-b532-4b9c-9524-425f70cc5a36-kube-api-access-z4bvc\") pod \"redhat-operators-ltlgp\" (UID: \"bd67c5c4-b532-4b9c-9524-425f70cc5a36\") " pod="openshift-marketplace/redhat-operators-ltlgp" Oct 07 14:00:21 crc kubenswrapper[4854]: I1007 14:00:21.081189 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd67c5c4-b532-4b9c-9524-425f70cc5a36-catalog-content\") pod \"redhat-operators-ltlgp\" (UID: \"bd67c5c4-b532-4b9c-9524-425f70cc5a36\") " pod="openshift-marketplace/redhat-operators-ltlgp" Oct 07 14:00:21 crc kubenswrapper[4854]: I1007 14:00:21.182560 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd67c5c4-b532-4b9c-9524-425f70cc5a36-utilities\") pod \"redhat-operators-ltlgp\" (UID: \"bd67c5c4-b532-4b9c-9524-425f70cc5a36\") " pod="openshift-marketplace/redhat-operators-ltlgp" Oct 07 14:00:21 crc kubenswrapper[4854]: I1007 14:00:21.182696 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4bvc\" (UniqueName: \"kubernetes.io/projected/bd67c5c4-b532-4b9c-9524-425f70cc5a36-kube-api-access-z4bvc\") pod \"redhat-operators-ltlgp\" (UID: \"bd67c5c4-b532-4b9c-9524-425f70cc5a36\") " pod="openshift-marketplace/redhat-operators-ltlgp" Oct 07 14:00:21 crc kubenswrapper[4854]: I1007 14:00:21.182731 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd67c5c4-b532-4b9c-9524-425f70cc5a36-catalog-content\") pod \"redhat-operators-ltlgp\" (UID: \"bd67c5c4-b532-4b9c-9524-425f70cc5a36\") " pod="openshift-marketplace/redhat-operators-ltlgp" Oct 07 14:00:21 crc kubenswrapper[4854]: I1007 14:00:21.183363 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd67c5c4-b532-4b9c-9524-425f70cc5a36-utilities\") pod \"redhat-operators-ltlgp\" (UID: \"bd67c5c4-b532-4b9c-9524-425f70cc5a36\") " pod="openshift-marketplace/redhat-operators-ltlgp" Oct 07 14:00:21 crc kubenswrapper[4854]: I1007 14:00:21.183678 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd67c5c4-b532-4b9c-9524-425f70cc5a36-catalog-content\") pod \"redhat-operators-ltlgp\" (UID: \"bd67c5c4-b532-4b9c-9524-425f70cc5a36\") " pod="openshift-marketplace/redhat-operators-ltlgp" Oct 07 14:00:21 crc kubenswrapper[4854]: I1007 14:00:21.203556 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-z4bvc\" (UniqueName: \"kubernetes.io/projected/bd67c5c4-b532-4b9c-9524-425f70cc5a36-kube-api-access-z4bvc\") pod \"redhat-operators-ltlgp\" (UID: \"bd67c5c4-b532-4b9c-9524-425f70cc5a36\") " pod="openshift-marketplace/redhat-operators-ltlgp" Oct 07 14:00:21 crc kubenswrapper[4854]: I1007 14:00:21.306781 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ltlgp" Oct 07 14:00:21 crc kubenswrapper[4854]: I1007 14:00:21.817488 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ltlgp"] Oct 07 14:00:22 crc kubenswrapper[4854]: I1007 14:00:22.440572 4854 scope.go:117] "RemoveContainer" containerID="2d664f6a62b3f053630760e41eabc66518e87658cc380cb1496f3eda655566b3" Oct 07 14:00:22 crc kubenswrapper[4854]: I1007 14:00:22.449358 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 07 14:00:22 crc kubenswrapper[4854]: I1007 14:00:22.826247 4854 generic.go:334] "Generic (PLEG): container finished" podID="bd67c5c4-b532-4b9c-9524-425f70cc5a36" containerID="fc3de4f03c564ddcc74822533139b27224ecfe197b5ec54e10d7e5ea398190fa" exitCode=0 Oct 07 14:00:22 crc kubenswrapper[4854]: I1007 14:00:22.826713 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ltlgp" event={"ID":"bd67c5c4-b532-4b9c-9524-425f70cc5a36","Type":"ContainerDied","Data":"fc3de4f03c564ddcc74822533139b27224ecfe197b5ec54e10d7e5ea398190fa"} Oct 07 14:00:22 crc kubenswrapper[4854]: I1007 14:00:22.826901 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ltlgp" event={"ID":"bd67c5c4-b532-4b9c-9524-425f70cc5a36","Type":"ContainerStarted","Data":"79173cbf3ae545896d34f2392752e3b650675d96339b2908535903ef8c862f89"} Oct 07 14:00:23 crc kubenswrapper[4854]: I1007 14:00:23.114452 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 07 14:00:24 crc kubenswrapper[4854]: I1007 14:00:24.845908 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ltlgp" event={"ID":"bd67c5c4-b532-4b9c-9524-425f70cc5a36","Type":"ContainerStarted","Data":"2d7c597a54ca2288c29b00d1590765075a7290f56656fa6bb1ef66bce0a95069"} Oct 07 14:00:26 crc kubenswrapper[4854]: I1007 14:00:26.867844 4854 generic.go:334] "Generic (PLEG): container finished" podID="bd67c5c4-b532-4b9c-9524-425f70cc5a36" containerID="2d7c597a54ca2288c29b00d1590765075a7290f56656fa6bb1ef66bce0a95069" exitCode=0 Oct 07 14:00:26 crc kubenswrapper[4854]: I1007 14:00:26.867933 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ltlgp" event={"ID":"bd67c5c4-b532-4b9c-9524-425f70cc5a36","Type":"ContainerDied","Data":"2d7c597a54ca2288c29b00d1590765075a7290f56656fa6bb1ef66bce0a95069"} Oct 07 14:00:27 crc kubenswrapper[4854]: I1007 14:00:27.880643 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ltlgp" event={"ID":"bd67c5c4-b532-4b9c-9524-425f70cc5a36","Type":"ContainerStarted","Data":"6c2bba93520c18957928ea7c12c0a526873f50eba401372fa52f97cf35b8fff2"} Oct 07 14:00:27 crc kubenswrapper[4854]: I1007 14:00:27.903375 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ltlgp" podStartSLOduration=3.39370881 podStartE2EDuration="7.903345884s" podCreationTimestamp="2025-10-07 14:00:20 +0000 UTC" 
firstStartedPulling="2025-10-07 14:00:22.828349322 +0000 UTC m=+5738.816181577" lastFinishedPulling="2025-10-07 14:00:27.337986396 +0000 UTC m=+5743.325818651" observedRunningTime="2025-10-07 14:00:27.899224116 +0000 UTC m=+5743.887056411" watchObservedRunningTime="2025-10-07 14:00:27.903345884 +0000 UTC m=+5743.891178179" Oct 07 14:00:31 crc kubenswrapper[4854]: I1007 14:00:31.307374 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ltlgp" Oct 07 14:00:31 crc kubenswrapper[4854]: I1007 14:00:31.307854 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ltlgp" Oct 07 14:00:32 crc kubenswrapper[4854]: I1007 14:00:32.371143 4854 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ltlgp" podUID="bd67c5c4-b532-4b9c-9524-425f70cc5a36" containerName="registry-server" probeResult="failure" output=< Oct 07 14:00:32 crc kubenswrapper[4854]: timeout: failed to connect service ":50051" within 1s Oct 07 14:00:32 crc kubenswrapper[4854]: > Oct 07 14:00:37 crc kubenswrapper[4854]: I1007 14:00:37.085926 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-pblnp"] Oct 07 14:00:37 crc kubenswrapper[4854]: I1007 14:00:37.100501 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-pblnp"] Oct 07 14:00:38 crc kubenswrapper[4854]: I1007 14:00:38.722538 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d502c4b-5a47-4c65-846d-fb7256c0124f" path="/var/lib/kubelet/pods/9d502c4b-5a47-4c65-846d-fb7256c0124f/volumes" Oct 07 14:00:40 crc kubenswrapper[4854]: I1007 14:00:40.808261 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:00:40 crc kubenswrapper[4854]: I1007 14:00:40.808390 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:00:41 crc kubenswrapper[4854]: I1007 14:00:41.390491 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ltlgp" Oct 07 14:00:41 crc kubenswrapper[4854]: I1007 14:00:41.454021 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ltlgp" Oct 07 14:00:41 crc kubenswrapper[4854]: I1007 14:00:41.646287 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ltlgp"] Oct 07 14:00:43 crc kubenswrapper[4854]: I1007 14:00:43.070100 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ltlgp" podUID="bd67c5c4-b532-4b9c-9524-425f70cc5a36" containerName="registry-server" containerID="cri-o://6c2bba93520c18957928ea7c12c0a526873f50eba401372fa52f97cf35b8fff2" gracePeriod=2 Oct 07 14:00:43 crc kubenswrapper[4854]: I1007 14:00:43.535977 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ltlgp" Oct 07 14:00:43 crc kubenswrapper[4854]: I1007 14:00:43.611453 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4bvc\" (UniqueName: \"kubernetes.io/projected/bd67c5c4-b532-4b9c-9524-425f70cc5a36-kube-api-access-z4bvc\") pod \"bd67c5c4-b532-4b9c-9524-425f70cc5a36\" (UID: \"bd67c5c4-b532-4b9c-9524-425f70cc5a36\") " Oct 07 14:00:43 crc kubenswrapper[4854]: I1007 14:00:43.611645 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd67c5c4-b532-4b9c-9524-425f70cc5a36-utilities\") pod \"bd67c5c4-b532-4b9c-9524-425f70cc5a36\" (UID: \"bd67c5c4-b532-4b9c-9524-425f70cc5a36\") " Oct 07 14:00:43 crc kubenswrapper[4854]: I1007 14:00:43.611716 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd67c5c4-b532-4b9c-9524-425f70cc5a36-catalog-content\") pod \"bd67c5c4-b532-4b9c-9524-425f70cc5a36\" (UID: \"bd67c5c4-b532-4b9c-9524-425f70cc5a36\") " Oct 07 14:00:43 crc kubenswrapper[4854]: I1007 14:00:43.612803 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd67c5c4-b532-4b9c-9524-425f70cc5a36-utilities" (OuterVolumeSpecName: "utilities") pod "bd67c5c4-b532-4b9c-9524-425f70cc5a36" (UID: "bd67c5c4-b532-4b9c-9524-425f70cc5a36"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:00:43 crc kubenswrapper[4854]: I1007 14:00:43.617914 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd67c5c4-b532-4b9c-9524-425f70cc5a36-kube-api-access-z4bvc" (OuterVolumeSpecName: "kube-api-access-z4bvc") pod "bd67c5c4-b532-4b9c-9524-425f70cc5a36" (UID: "bd67c5c4-b532-4b9c-9524-425f70cc5a36"). InnerVolumeSpecName "kube-api-access-z4bvc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:00:43 crc kubenswrapper[4854]: I1007 14:00:43.711005 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd67c5c4-b532-4b9c-9524-425f70cc5a36-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bd67c5c4-b532-4b9c-9524-425f70cc5a36" (UID: "bd67c5c4-b532-4b9c-9524-425f70cc5a36"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:00:43 crc kubenswrapper[4854]: I1007 14:00:43.713505 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd67c5c4-b532-4b9c-9524-425f70cc5a36-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 14:00:43 crc kubenswrapper[4854]: I1007 14:00:43.713532 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4bvc\" (UniqueName: \"kubernetes.io/projected/bd67c5c4-b532-4b9c-9524-425f70cc5a36-kube-api-access-z4bvc\") on node \"crc\" DevicePath \"\"" Oct 07 14:00:43 crc kubenswrapper[4854]: I1007 14:00:43.713544 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd67c5c4-b532-4b9c-9524-425f70cc5a36-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 14:00:44 crc kubenswrapper[4854]: I1007 14:00:44.083968 4854 generic.go:334] "Generic (PLEG): container finished" podID="bd67c5c4-b532-4b9c-9524-425f70cc5a36" containerID="6c2bba93520c18957928ea7c12c0a526873f50eba401372fa52f97cf35b8fff2" exitCode=0 Oct 07 14:00:44 crc kubenswrapper[4854]: I1007 14:00:44.084047 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ltlgp" event={"ID":"bd67c5c4-b532-4b9c-9524-425f70cc5a36","Type":"ContainerDied","Data":"6c2bba93520c18957928ea7c12c0a526873f50eba401372fa52f97cf35b8fff2"} Oct 07 14:00:44 crc kubenswrapper[4854]: I1007 14:00:44.084109 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ltlgp" event={"ID":"bd67c5c4-b532-4b9c-9524-425f70cc5a36","Type":"ContainerDied","Data":"79173cbf3ae545896d34f2392752e3b650675d96339b2908535903ef8c862f89"} Oct 07 14:00:44 crc kubenswrapper[4854]: I1007 14:00:44.084140 4854 scope.go:117] "RemoveContainer" containerID="6c2bba93520c18957928ea7c12c0a526873f50eba401372fa52f97cf35b8fff2" Oct 07 14:00:44 crc kubenswrapper[4854]: I1007 14:00:44.084202 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ltlgp" Oct 07 14:00:44 crc kubenswrapper[4854]: I1007 14:00:44.117666 4854 scope.go:117] "RemoveContainer" containerID="2d7c597a54ca2288c29b00d1590765075a7290f56656fa6bb1ef66bce0a95069" Oct 07 14:00:44 crc kubenswrapper[4854]: I1007 14:00:44.126089 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ltlgp"] Oct 07 14:00:44 crc kubenswrapper[4854]: I1007 14:00:44.143496 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ltlgp"] Oct 07 14:00:44 crc kubenswrapper[4854]: I1007 14:00:44.149991 4854 scope.go:117] "RemoveContainer" containerID="fc3de4f03c564ddcc74822533139b27224ecfe197b5ec54e10d7e5ea398190fa" Oct 07 14:00:44 crc kubenswrapper[4854]: I1007 14:00:44.177940 4854 scope.go:117] "RemoveContainer" containerID="6c2bba93520c18957928ea7c12c0a526873f50eba401372fa52f97cf35b8fff2" Oct 07 14:00:44 crc kubenswrapper[4854]: E1007 14:00:44.178929 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c2bba93520c18957928ea7c12c0a526873f50eba401372fa52f97cf35b8fff2\": container with ID starting with 6c2bba93520c18957928ea7c12c0a526873f50eba401372fa52f97cf35b8fff2 not found: ID does not exist" containerID="6c2bba93520c18957928ea7c12c0a526873f50eba401372fa52f97cf35b8fff2" Oct 07 14:00:44 crc kubenswrapper[4854]: I1007 14:00:44.179044 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c2bba93520c18957928ea7c12c0a526873f50eba401372fa52f97cf35b8fff2"} err="failed to get container status \"6c2bba93520c18957928ea7c12c0a526873f50eba401372fa52f97cf35b8fff2\": rpc error: code = NotFound desc = could not find container \"6c2bba93520c18957928ea7c12c0a526873f50eba401372fa52f97cf35b8fff2\": container with ID starting with 6c2bba93520c18957928ea7c12c0a526873f50eba401372fa52f97cf35b8fff2 not found: ID does not exist" Oct 07 14:00:44 crc kubenswrapper[4854]: I1007 14:00:44.179125 4854 scope.go:117] "RemoveContainer" containerID="2d7c597a54ca2288c29b00d1590765075a7290f56656fa6bb1ef66bce0a95069" Oct 07 14:00:44 crc kubenswrapper[4854]: E1007 14:00:44.179519 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d7c597a54ca2288c29b00d1590765075a7290f56656fa6bb1ef66bce0a95069\": container with ID starting with 2d7c597a54ca2288c29b00d1590765075a7290f56656fa6bb1ef66bce0a95069 not found: ID does not exist" containerID="2d7c597a54ca2288c29b00d1590765075a7290f56656fa6bb1ef66bce0a95069" Oct 07 14:00:44 crc kubenswrapper[4854]: I1007 14:00:44.179552 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d7c597a54ca2288c29b00d1590765075a7290f56656fa6bb1ef66bce0a95069"} err="failed to get container status \"2d7c597a54ca2288c29b00d1590765075a7290f56656fa6bb1ef66bce0a95069\": rpc error: code = NotFound desc = could not find container \"2d7c597a54ca2288c29b00d1590765075a7290f56656fa6bb1ef66bce0a95069\": container with ID starting with 2d7c597a54ca2288c29b00d1590765075a7290f56656fa6bb1ef66bce0a95069 not found: ID does not exist" Oct 07 14:00:44 crc kubenswrapper[4854]: I1007 14:00:44.179576 4854 scope.go:117] "RemoveContainer" containerID="fc3de4f03c564ddcc74822533139b27224ecfe197b5ec54e10d7e5ea398190fa" Oct 07 14:00:44 crc kubenswrapper[4854]: E1007 14:00:44.179950 4854 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"fc3de4f03c564ddcc74822533139b27224ecfe197b5ec54e10d7e5ea398190fa\": container with ID starting with fc3de4f03c564ddcc74822533139b27224ecfe197b5ec54e10d7e5ea398190fa not found: ID does not exist" containerID="fc3de4f03c564ddcc74822533139b27224ecfe197b5ec54e10d7e5ea398190fa" Oct 07 14:00:44 crc kubenswrapper[4854]: I1007 14:00:44.179973 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc3de4f03c564ddcc74822533139b27224ecfe197b5ec54e10d7e5ea398190fa"} err="failed to get container status \"fc3de4f03c564ddcc74822533139b27224ecfe197b5ec54e10d7e5ea398190fa\": rpc error: code = NotFound desc = could not find container \"fc3de4f03c564ddcc74822533139b27224ecfe197b5ec54e10d7e5ea398190fa\": container with ID starting with fc3de4f03c564ddcc74822533139b27224ecfe197b5ec54e10d7e5ea398190fa not found: ID does not exist" Oct 07 14:00:44 crc kubenswrapper[4854]: I1007 14:00:44.721834 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd67c5c4-b532-4b9c-9524-425f70cc5a36" path="/var/lib/kubelet/pods/bd67c5c4-b532-4b9c-9524-425f70cc5a36/volumes" Oct 07 14:00:47 crc kubenswrapper[4854]: I1007 14:00:47.028540 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-4e30-account-create-8dlnn"] Oct 07 14:00:47 crc kubenswrapper[4854]: I1007 14:00:47.038381 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-4e30-account-create-8dlnn"] Oct 07 14:00:48 crc kubenswrapper[4854]: I1007 14:00:48.722469 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cabac5b5-1b98-4966-8046-c3c6a34d5c06" path="/var/lib/kubelet/pods/cabac5b5-1b98-4966-8046-c3c6a34d5c06/volumes" Oct 07 14:00:54 crc kubenswrapper[4854]: I1007 14:00:54.052663 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-q66ps"] Oct 07 14:00:54 crc kubenswrapper[4854]: I1007 14:00:54.067517 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-q66ps"] Oct 07 14:00:54 crc kubenswrapper[4854]: I1007 14:00:54.716303 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="faca3163-f21e-446d-9672-6273c04b186d" path="/var/lib/kubelet/pods/faca3163-f21e-446d-9672-6273c04b186d/volumes" Oct 07 14:01:00 crc kubenswrapper[4854]: I1007 14:01:00.176326 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29330761-b8lkv"] Oct 07 14:01:00 crc kubenswrapper[4854]: E1007 14:01:00.177903 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd67c5c4-b532-4b9c-9524-425f70cc5a36" containerName="extract-utilities" Oct 07 14:01:00 crc kubenswrapper[4854]: I1007 14:01:00.177941 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd67c5c4-b532-4b9c-9524-425f70cc5a36" containerName="extract-utilities" Oct 07 14:01:00 crc kubenswrapper[4854]: E1007 14:01:00.177989 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd67c5c4-b532-4b9c-9524-425f70cc5a36" containerName="extract-content" Oct 07 14:01:00 crc kubenswrapper[4854]: I1007 14:01:00.178009 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd67c5c4-b532-4b9c-9524-425f70cc5a36" containerName="extract-content" Oct 07 14:01:00 crc kubenswrapper[4854]: E1007 14:01:00.178076 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd67c5c4-b532-4b9c-9524-425f70cc5a36" containerName="registry-server" Oct 07 14:01:00 crc kubenswrapper[4854]: I1007 
14:01:00.178097 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd67c5c4-b532-4b9c-9524-425f70cc5a36" containerName="registry-server" Oct 07 14:01:00 crc kubenswrapper[4854]: I1007 14:01:00.178642 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd67c5c4-b532-4b9c-9524-425f70cc5a36" containerName="registry-server" Oct 07 14:01:00 crc kubenswrapper[4854]: I1007 14:01:00.185430 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29330761-b8lkv" Oct 07 14:01:00 crc kubenswrapper[4854]: I1007 14:01:00.188108 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29330761-b8lkv"] Oct 07 14:01:00 crc kubenswrapper[4854]: I1007 14:01:00.279247 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6ghx\" (UniqueName: \"kubernetes.io/projected/52399450-0564-41c2-86d5-f9b533025254-kube-api-access-r6ghx\") pod \"keystone-cron-29330761-b8lkv\" (UID: \"52399450-0564-41c2-86d5-f9b533025254\") " pod="openstack/keystone-cron-29330761-b8lkv" Oct 07 14:01:00 crc kubenswrapper[4854]: I1007 14:01:00.279438 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52399450-0564-41c2-86d5-f9b533025254-combined-ca-bundle\") pod \"keystone-cron-29330761-b8lkv\" (UID: \"52399450-0564-41c2-86d5-f9b533025254\") " pod="openstack/keystone-cron-29330761-b8lkv" Oct 07 14:01:00 crc kubenswrapper[4854]: I1007 14:01:00.279489 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/52399450-0564-41c2-86d5-f9b533025254-fernet-keys\") pod \"keystone-cron-29330761-b8lkv\" (UID: \"52399450-0564-41c2-86d5-f9b533025254\") " pod="openstack/keystone-cron-29330761-b8lkv" Oct 07 14:01:00 crc kubenswrapper[4854]: I1007 14:01:00.279568 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52399450-0564-41c2-86d5-f9b533025254-config-data\") pod \"keystone-cron-29330761-b8lkv\" (UID: \"52399450-0564-41c2-86d5-f9b533025254\") " pod="openstack/keystone-cron-29330761-b8lkv" Oct 07 14:01:00 crc kubenswrapper[4854]: I1007 14:01:00.381396 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52399450-0564-41c2-86d5-f9b533025254-combined-ca-bundle\") pod \"keystone-cron-29330761-b8lkv\" (UID: \"52399450-0564-41c2-86d5-f9b533025254\") " pod="openstack/keystone-cron-29330761-b8lkv" Oct 07 14:01:00 crc kubenswrapper[4854]: I1007 14:01:00.381464 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/52399450-0564-41c2-86d5-f9b533025254-fernet-keys\") pod \"keystone-cron-29330761-b8lkv\" (UID: \"52399450-0564-41c2-86d5-f9b533025254\") " pod="openstack/keystone-cron-29330761-b8lkv" Oct 07 14:01:00 crc kubenswrapper[4854]: I1007 14:01:00.381534 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52399450-0564-41c2-86d5-f9b533025254-config-data\") pod \"keystone-cron-29330761-b8lkv\" (UID: \"52399450-0564-41c2-86d5-f9b533025254\") " pod="openstack/keystone-cron-29330761-b8lkv" Oct 07 14:01:00 crc kubenswrapper[4854]: I1007 14:01:00.381605 4854 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6ghx\" (UniqueName: \"kubernetes.io/projected/52399450-0564-41c2-86d5-f9b533025254-kube-api-access-r6ghx\") pod \"keystone-cron-29330761-b8lkv\" (UID: \"52399450-0564-41c2-86d5-f9b533025254\") " pod="openstack/keystone-cron-29330761-b8lkv" Oct 07 14:01:00 crc kubenswrapper[4854]: I1007 14:01:00.399580 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/52399450-0564-41c2-86d5-f9b533025254-fernet-keys\") pod \"keystone-cron-29330761-b8lkv\" (UID: \"52399450-0564-41c2-86d5-f9b533025254\") " pod="openstack/keystone-cron-29330761-b8lkv" Oct 07 14:01:00 crc kubenswrapper[4854]: I1007 14:01:00.399929 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52399450-0564-41c2-86d5-f9b533025254-config-data\") pod \"keystone-cron-29330761-b8lkv\" (UID: \"52399450-0564-41c2-86d5-f9b533025254\") " pod="openstack/keystone-cron-29330761-b8lkv" Oct 07 14:01:00 crc kubenswrapper[4854]: I1007 14:01:00.400492 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52399450-0564-41c2-86d5-f9b533025254-combined-ca-bundle\") pod \"keystone-cron-29330761-b8lkv\" (UID: \"52399450-0564-41c2-86d5-f9b533025254\") " pod="openstack/keystone-cron-29330761-b8lkv" Oct 07 14:01:00 crc kubenswrapper[4854]: I1007 14:01:00.404247 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6ghx\" (UniqueName: \"kubernetes.io/projected/52399450-0564-41c2-86d5-f9b533025254-kube-api-access-r6ghx\") pod \"keystone-cron-29330761-b8lkv\" (UID: \"52399450-0564-41c2-86d5-f9b533025254\") " pod="openstack/keystone-cron-29330761-b8lkv" Oct 07 14:01:00 crc kubenswrapper[4854]: I1007 14:01:00.516827 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29330761-b8lkv" Oct 07 14:01:00 crc kubenswrapper[4854]: W1007 14:01:00.994511 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52399450_0564_41c2_86d5_f9b533025254.slice/crio-30480965cb7a03f85985752f31555c8f0feb6efbca7b01f7189ae0156b19f383 WatchSource:0}: Error finding container 30480965cb7a03f85985752f31555c8f0feb6efbca7b01f7189ae0156b19f383: Status 404 returned error can't find the container with id 30480965cb7a03f85985752f31555c8f0feb6efbca7b01f7189ae0156b19f383 Oct 07 14:01:00 crc kubenswrapper[4854]: I1007 14:01:00.998485 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29330761-b8lkv"] Oct 07 14:01:01 crc kubenswrapper[4854]: I1007 14:01:01.274659 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29330761-b8lkv" event={"ID":"52399450-0564-41c2-86d5-f9b533025254","Type":"ContainerStarted","Data":"2fe8a237ec6440c2028c18e671107336285b4ebda70da3fc459408ee30d099d2"} Oct 07 14:01:01 crc kubenswrapper[4854]: I1007 14:01:01.275895 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29330761-b8lkv" event={"ID":"52399450-0564-41c2-86d5-f9b533025254","Type":"ContainerStarted","Data":"30480965cb7a03f85985752f31555c8f0feb6efbca7b01f7189ae0156b19f383"} Oct 07 14:01:01 crc kubenswrapper[4854]: I1007 14:01:01.294860 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29330761-b8lkv" podStartSLOduration=1.294840051 podStartE2EDuration="1.294840051s" podCreationTimestamp="2025-10-07 14:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:01:01.290092575 +0000 UTC m=+5777.277924830" watchObservedRunningTime="2025-10-07 14:01:01.294840051 +0000 UTC m=+5777.282672306" Oct 07 14:01:03 crc kubenswrapper[4854]: I1007 14:01:03.298277 4854 generic.go:334] "Generic (PLEG): container finished" podID="52399450-0564-41c2-86d5-f9b533025254" containerID="2fe8a237ec6440c2028c18e671107336285b4ebda70da3fc459408ee30d099d2" exitCode=0 Oct 07 14:01:03 crc kubenswrapper[4854]: I1007 14:01:03.298386 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29330761-b8lkv" event={"ID":"52399450-0564-41c2-86d5-f9b533025254","Type":"ContainerDied","Data":"2fe8a237ec6440c2028c18e671107336285b4ebda70da3fc459408ee30d099d2"} Oct 07 14:01:04 crc kubenswrapper[4854]: I1007 14:01:04.745966 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29330761-b8lkv" Oct 07 14:01:04 crc kubenswrapper[4854]: I1007 14:01:04.884257 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6ghx\" (UniqueName: \"kubernetes.io/projected/52399450-0564-41c2-86d5-f9b533025254-kube-api-access-r6ghx\") pod \"52399450-0564-41c2-86d5-f9b533025254\" (UID: \"52399450-0564-41c2-86d5-f9b533025254\") " Oct 07 14:01:04 crc kubenswrapper[4854]: I1007 14:01:04.884309 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/52399450-0564-41c2-86d5-f9b533025254-fernet-keys\") pod \"52399450-0564-41c2-86d5-f9b533025254\" (UID: \"52399450-0564-41c2-86d5-f9b533025254\") " Oct 07 14:01:04 crc kubenswrapper[4854]: I1007 14:01:04.884440 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52399450-0564-41c2-86d5-f9b533025254-combined-ca-bundle\") pod \"52399450-0564-41c2-86d5-f9b533025254\" (UID: \"52399450-0564-41c2-86d5-f9b533025254\") " Oct 07 14:01:04 crc kubenswrapper[4854]: I1007 14:01:04.884583 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52399450-0564-41c2-86d5-f9b533025254-config-data\") pod \"52399450-0564-41c2-86d5-f9b533025254\" (UID: \"52399450-0564-41c2-86d5-f9b533025254\") " Oct 07 14:01:04 crc kubenswrapper[4854]: I1007 14:01:04.891289 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52399450-0564-41c2-86d5-f9b533025254-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "52399450-0564-41c2-86d5-f9b533025254" (UID: "52399450-0564-41c2-86d5-f9b533025254"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:01:04 crc kubenswrapper[4854]: I1007 14:01:04.894422 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52399450-0564-41c2-86d5-f9b533025254-kube-api-access-r6ghx" (OuterVolumeSpecName: "kube-api-access-r6ghx") pod "52399450-0564-41c2-86d5-f9b533025254" (UID: "52399450-0564-41c2-86d5-f9b533025254"). InnerVolumeSpecName "kube-api-access-r6ghx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:01:04 crc kubenswrapper[4854]: I1007 14:01:04.935226 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52399450-0564-41c2-86d5-f9b533025254-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "52399450-0564-41c2-86d5-f9b533025254" (UID: "52399450-0564-41c2-86d5-f9b533025254"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:01:04 crc kubenswrapper[4854]: I1007 14:01:04.941360 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52399450-0564-41c2-86d5-f9b533025254-config-data" (OuterVolumeSpecName: "config-data") pod "52399450-0564-41c2-86d5-f9b533025254" (UID: "52399450-0564-41c2-86d5-f9b533025254"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:01:04 crc kubenswrapper[4854]: I1007 14:01:04.986752 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52399450-0564-41c2-86d5-f9b533025254-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:01:04 crc kubenswrapper[4854]: I1007 14:01:04.986786 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6ghx\" (UniqueName: \"kubernetes.io/projected/52399450-0564-41c2-86d5-f9b533025254-kube-api-access-r6ghx\") on node \"crc\" DevicePath \"\"" Oct 07 14:01:04 crc kubenswrapper[4854]: I1007 14:01:04.986800 4854 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/52399450-0564-41c2-86d5-f9b533025254-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 07 14:01:04 crc kubenswrapper[4854]: I1007 14:01:04.986810 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52399450-0564-41c2-86d5-f9b533025254-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:01:05 crc kubenswrapper[4854]: I1007 14:01:05.325528 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29330761-b8lkv" event={"ID":"52399450-0564-41c2-86d5-f9b533025254","Type":"ContainerDied","Data":"30480965cb7a03f85985752f31555c8f0feb6efbca7b01f7189ae0156b19f383"} Oct 07 14:01:05 crc kubenswrapper[4854]: I1007 14:01:05.325611 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30480965cb7a03f85985752f31555c8f0feb6efbca7b01f7189ae0156b19f383" Oct 07 14:01:05 crc kubenswrapper[4854]: I1007 14:01:05.326048 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29330761-b8lkv" Oct 07 14:01:07 crc kubenswrapper[4854]: I1007 14:01:07.068860 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-257hr"] Oct 07 14:01:07 crc kubenswrapper[4854]: I1007 14:01:07.092562 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-257hr"] Oct 07 14:01:08 crc kubenswrapper[4854]: I1007 14:01:08.734377 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad13166b-d165-4483-9e36-967308c3f93b" path="/var/lib/kubelet/pods/ad13166b-d165-4483-9e36-967308c3f93b/volumes" Oct 07 14:01:10 crc kubenswrapper[4854]: I1007 14:01:10.807462 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:01:10 crc kubenswrapper[4854]: I1007 14:01:10.807557 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:01:10 crc kubenswrapper[4854]: I1007 14:01:10.807622 4854 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" Oct 07 14:01:10 crc kubenswrapper[4854]: I1007 14:01:10.808658 4854 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"faf9abd5a5e7753883aa95fb4b691a8be7baef96f7df5d4fa745c25c86f27153"} pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 14:01:10 crc kubenswrapper[4854]: I1007 14:01:10.808821 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" containerID="cri-o://faf9abd5a5e7753883aa95fb4b691a8be7baef96f7df5d4fa745c25c86f27153" gracePeriod=600 Oct 07 14:01:11 crc kubenswrapper[4854]: I1007 14:01:11.397879 4854 generic.go:334] "Generic (PLEG): container finished" podID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerID="faf9abd5a5e7753883aa95fb4b691a8be7baef96f7df5d4fa745c25c86f27153" exitCode=0 Oct 07 14:01:11 crc kubenswrapper[4854]: I1007 14:01:11.397977 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" event={"ID":"40b8b82d-cfd5-41d7-8673-5774db092c85","Type":"ContainerDied","Data":"faf9abd5a5e7753883aa95fb4b691a8be7baef96f7df5d4fa745c25c86f27153"} Oct 07 14:01:11 crc kubenswrapper[4854]: I1007 14:01:11.399141 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" event={"ID":"40b8b82d-cfd5-41d7-8673-5774db092c85","Type":"ContainerStarted","Data":"4fe613980076ca3ed8559fe0e286e39c2d0d1c3badb8354aa3600a7ada37572b"} Oct 07 14:01:11 crc kubenswrapper[4854]: I1007 14:01:11.399408 4854 scope.go:117] "RemoveContainer" containerID="8f3095d8e8fcb8dde419577c13f52762f3d4a6c040f7266414571999ecd4046e" Oct 07 14:01:22 crc kubenswrapper[4854]: I1007 14:01:22.719398 4854 scope.go:117] "RemoveContainer" containerID="0d2342f0757e582d0325164ed00c5243aa7aa673ca6407273fe28faa0f3874d4" Oct 07 14:01:22 crc kubenswrapper[4854]: I1007 14:01:22.756950 4854 scope.go:117] "RemoveContainer" containerID="39512b3157f2fc83c7c64b3331b61692b937f98a439b3da1fb9b2d783bb5ab8d" Oct 07 14:01:22 crc kubenswrapper[4854]: I1007 14:01:22.816878 4854 scope.go:117] "RemoveContainer" containerID="48a55825ee74722bc5f78d76b3d4b17f0e142addc7a31a7c32a9a0dddc32f3e9" Oct 07 14:01:22 crc kubenswrapper[4854]: I1007 14:01:22.860855 4854 scope.go:117] "RemoveContainer" containerID="65bbe81ddb3ca6061d2d7b38db772081fe329ef5506ca532e06711bc89ff752b" Oct 07 14:02:14 crc kubenswrapper[4854]: I1007 14:02:14.168664 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-5cddp"] Oct 07 14:02:14 crc kubenswrapper[4854]: E1007 14:02:14.169920 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52399450-0564-41c2-86d5-f9b533025254" containerName="keystone-cron" Oct 07 14:02:14 crc kubenswrapper[4854]: I1007 14:02:14.169943 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="52399450-0564-41c2-86d5-f9b533025254" containerName="keystone-cron" Oct 07 14:02:14 crc kubenswrapper[4854]: I1007 14:02:14.170362 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="52399450-0564-41c2-86d5-f9b533025254" containerName="keystone-cron" Oct 07 14:02:14 crc kubenswrapper[4854]: I1007 14:02:14.171420 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-5cddp" Oct 07 14:02:14 crc kubenswrapper[4854]: I1007 14:02:14.174770 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-zj4pn" Oct 07 14:02:14 crc kubenswrapper[4854]: I1007 14:02:14.176490 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-5cddp"] Oct 07 14:02:14 crc kubenswrapper[4854]: I1007 14:02:14.177517 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Oct 07 14:02:14 crc kubenswrapper[4854]: I1007 14:02:14.232066 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-xvl8g"] Oct 07 14:02:14 crc kubenswrapper[4854]: I1007 14:02:14.234013 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-xvl8g" Oct 07 14:02:14 crc kubenswrapper[4854]: I1007 14:02:14.248340 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-xvl8g"] Oct 07 14:02:14 crc kubenswrapper[4854]: I1007 14:02:14.292803 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qs44w\" (UniqueName: \"kubernetes.io/projected/3d9803b6-b7f2-461c-9d3b-1fb1f39839e9-kube-api-access-qs44w\") pod \"ovn-controller-5cddp\" (UID: \"3d9803b6-b7f2-461c-9d3b-1fb1f39839e9\") " pod="openstack/ovn-controller-5cddp" Oct 07 14:02:14 crc kubenswrapper[4854]: I1007 14:02:14.292899 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3d9803b6-b7f2-461c-9d3b-1fb1f39839e9-var-run-ovn\") pod \"ovn-controller-5cddp\" (UID: \"3d9803b6-b7f2-461c-9d3b-1fb1f39839e9\") " pod="openstack/ovn-controller-5cddp" Oct 07 14:02:14 crc kubenswrapper[4854]: I1007 14:02:14.292941 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3d9803b6-b7f2-461c-9d3b-1fb1f39839e9-var-log-ovn\") pod \"ovn-controller-5cddp\" (UID: \"3d9803b6-b7f2-461c-9d3b-1fb1f39839e9\") " pod="openstack/ovn-controller-5cddp" Oct 07 14:02:14 crc kubenswrapper[4854]: I1007 14:02:14.292967 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3d9803b6-b7f2-461c-9d3b-1fb1f39839e9-var-run\") pod \"ovn-controller-5cddp\" (UID: \"3d9803b6-b7f2-461c-9d3b-1fb1f39839e9\") " pod="openstack/ovn-controller-5cddp" Oct 07 14:02:14 crc kubenswrapper[4854]: I1007 14:02:14.293331 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3d9803b6-b7f2-461c-9d3b-1fb1f39839e9-scripts\") pod \"ovn-controller-5cddp\" (UID: \"3d9803b6-b7f2-461c-9d3b-1fb1f39839e9\") " pod="openstack/ovn-controller-5cddp" Oct 07 14:02:14 crc kubenswrapper[4854]: I1007 14:02:14.394738 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/953c368f-f670-44f8-b0ee-62a86bb2f5a9-var-lib\") pod \"ovn-controller-ovs-xvl8g\" (UID: \"953c368f-f670-44f8-b0ee-62a86bb2f5a9\") " pod="openstack/ovn-controller-ovs-xvl8g" Oct 07 14:02:14 crc kubenswrapper[4854]: I1007 14:02:14.395060 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/3d9803b6-b7f2-461c-9d3b-1fb1f39839e9-scripts\") pod \"ovn-controller-5cddp\" (UID: \"3d9803b6-b7f2-461c-9d3b-1fb1f39839e9\") " pod="openstack/ovn-controller-5cddp" Oct 07 14:02:14 crc kubenswrapper[4854]: I1007 14:02:14.395094 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/953c368f-f670-44f8-b0ee-62a86bb2f5a9-var-run\") pod \"ovn-controller-ovs-xvl8g\" (UID: \"953c368f-f670-44f8-b0ee-62a86bb2f5a9\") " pod="openstack/ovn-controller-ovs-xvl8g" Oct 07 14:02:14 crc kubenswrapper[4854]: I1007 14:02:14.395122 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qs44w\" (UniqueName: \"kubernetes.io/projected/3d9803b6-b7f2-461c-9d3b-1fb1f39839e9-kube-api-access-qs44w\") pod \"ovn-controller-5cddp\" (UID: \"3d9803b6-b7f2-461c-9d3b-1fb1f39839e9\") " pod="openstack/ovn-controller-5cddp" Oct 07 14:02:14 crc kubenswrapper[4854]: I1007 14:02:14.395178 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/953c368f-f670-44f8-b0ee-62a86bb2f5a9-var-log\") pod \"ovn-controller-ovs-xvl8g\" (UID: \"953c368f-f670-44f8-b0ee-62a86bb2f5a9\") " pod="openstack/ovn-controller-ovs-xvl8g" Oct 07 14:02:14 crc kubenswrapper[4854]: I1007 14:02:14.395326 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plt5q\" (UniqueName: \"kubernetes.io/projected/953c368f-f670-44f8-b0ee-62a86bb2f5a9-kube-api-access-plt5q\") pod \"ovn-controller-ovs-xvl8g\" (UID: \"953c368f-f670-44f8-b0ee-62a86bb2f5a9\") " pod="openstack/ovn-controller-ovs-xvl8g" Oct 07 14:02:14 crc kubenswrapper[4854]: I1007 14:02:14.395432 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/953c368f-f670-44f8-b0ee-62a86bb2f5a9-scripts\") pod \"ovn-controller-ovs-xvl8g\" (UID: \"953c368f-f670-44f8-b0ee-62a86bb2f5a9\") " pod="openstack/ovn-controller-ovs-xvl8g" Oct 07 14:02:14 crc kubenswrapper[4854]: I1007 14:02:14.395547 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3d9803b6-b7f2-461c-9d3b-1fb1f39839e9-var-run-ovn\") pod \"ovn-controller-5cddp\" (UID: \"3d9803b6-b7f2-461c-9d3b-1fb1f39839e9\") " pod="openstack/ovn-controller-5cddp" Oct 07 14:02:14 crc kubenswrapper[4854]: I1007 14:02:14.395585 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/953c368f-f670-44f8-b0ee-62a86bb2f5a9-etc-ovs\") pod \"ovn-controller-ovs-xvl8g\" (UID: \"953c368f-f670-44f8-b0ee-62a86bb2f5a9\") " pod="openstack/ovn-controller-ovs-xvl8g" Oct 07 14:02:14 crc kubenswrapper[4854]: I1007 14:02:14.395701 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3d9803b6-b7f2-461c-9d3b-1fb1f39839e9-var-log-ovn\") pod \"ovn-controller-5cddp\" (UID: \"3d9803b6-b7f2-461c-9d3b-1fb1f39839e9\") " pod="openstack/ovn-controller-5cddp" Oct 07 14:02:14 crc kubenswrapper[4854]: I1007 14:02:14.395779 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3d9803b6-b7f2-461c-9d3b-1fb1f39839e9-var-run\") pod \"ovn-controller-5cddp\" 
(UID: \"3d9803b6-b7f2-461c-9d3b-1fb1f39839e9\") " pod="openstack/ovn-controller-5cddp" Oct 07 14:02:14 crc kubenswrapper[4854]: I1007 14:02:14.395869 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3d9803b6-b7f2-461c-9d3b-1fb1f39839e9-var-run-ovn\") pod \"ovn-controller-5cddp\" (UID: \"3d9803b6-b7f2-461c-9d3b-1fb1f39839e9\") " pod="openstack/ovn-controller-5cddp" Oct 07 14:02:14 crc kubenswrapper[4854]: I1007 14:02:14.395878 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3d9803b6-b7f2-461c-9d3b-1fb1f39839e9-var-log-ovn\") pod \"ovn-controller-5cddp\" (UID: \"3d9803b6-b7f2-461c-9d3b-1fb1f39839e9\") " pod="openstack/ovn-controller-5cddp" Oct 07 14:02:14 crc kubenswrapper[4854]: I1007 14:02:14.396002 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3d9803b6-b7f2-461c-9d3b-1fb1f39839e9-var-run\") pod \"ovn-controller-5cddp\" (UID: \"3d9803b6-b7f2-461c-9d3b-1fb1f39839e9\") " pod="openstack/ovn-controller-5cddp" Oct 07 14:02:14 crc kubenswrapper[4854]: I1007 14:02:14.397636 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3d9803b6-b7f2-461c-9d3b-1fb1f39839e9-scripts\") pod \"ovn-controller-5cddp\" (UID: \"3d9803b6-b7f2-461c-9d3b-1fb1f39839e9\") " pod="openstack/ovn-controller-5cddp" Oct 07 14:02:14 crc kubenswrapper[4854]: I1007 14:02:14.426380 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qs44w\" (UniqueName: \"kubernetes.io/projected/3d9803b6-b7f2-461c-9d3b-1fb1f39839e9-kube-api-access-qs44w\") pod \"ovn-controller-5cddp\" (UID: \"3d9803b6-b7f2-461c-9d3b-1fb1f39839e9\") " pod="openstack/ovn-controller-5cddp" Oct 07 14:02:14 crc kubenswrapper[4854]: I1007 14:02:14.494544 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-5cddp" Oct 07 14:02:14 crc kubenswrapper[4854]: I1007 14:02:14.498270 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/953c368f-f670-44f8-b0ee-62a86bb2f5a9-var-run\") pod \"ovn-controller-ovs-xvl8g\" (UID: \"953c368f-f670-44f8-b0ee-62a86bb2f5a9\") " pod="openstack/ovn-controller-ovs-xvl8g" Oct 07 14:02:14 crc kubenswrapper[4854]: I1007 14:02:14.498358 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/953c368f-f670-44f8-b0ee-62a86bb2f5a9-var-log\") pod \"ovn-controller-ovs-xvl8g\" (UID: \"953c368f-f670-44f8-b0ee-62a86bb2f5a9\") " pod="openstack/ovn-controller-ovs-xvl8g" Oct 07 14:02:14 crc kubenswrapper[4854]: I1007 14:02:14.498385 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plt5q\" (UniqueName: \"kubernetes.io/projected/953c368f-f670-44f8-b0ee-62a86bb2f5a9-kube-api-access-plt5q\") pod \"ovn-controller-ovs-xvl8g\" (UID: \"953c368f-f670-44f8-b0ee-62a86bb2f5a9\") " pod="openstack/ovn-controller-ovs-xvl8g" Oct 07 14:02:14 crc kubenswrapper[4854]: I1007 14:02:14.498423 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/953c368f-f670-44f8-b0ee-62a86bb2f5a9-scripts\") pod \"ovn-controller-ovs-xvl8g\" (UID: \"953c368f-f670-44f8-b0ee-62a86bb2f5a9\") " pod="openstack/ovn-controller-ovs-xvl8g" Oct 07 14:02:14 crc kubenswrapper[4854]: I1007 14:02:14.498464 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/953c368f-f670-44f8-b0ee-62a86bb2f5a9-etc-ovs\") pod \"ovn-controller-ovs-xvl8g\" (UID: \"953c368f-f670-44f8-b0ee-62a86bb2f5a9\") " pod="openstack/ovn-controller-ovs-xvl8g" Oct 07 14:02:14 crc kubenswrapper[4854]: I1007 14:02:14.498468 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/953c368f-f670-44f8-b0ee-62a86bb2f5a9-var-log\") pod \"ovn-controller-ovs-xvl8g\" (UID: \"953c368f-f670-44f8-b0ee-62a86bb2f5a9\") " pod="openstack/ovn-controller-ovs-xvl8g" Oct 07 14:02:14 crc kubenswrapper[4854]: I1007 14:02:14.498459 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/953c368f-f670-44f8-b0ee-62a86bb2f5a9-var-run\") pod \"ovn-controller-ovs-xvl8g\" (UID: \"953c368f-f670-44f8-b0ee-62a86bb2f5a9\") " pod="openstack/ovn-controller-ovs-xvl8g" Oct 07 14:02:14 crc kubenswrapper[4854]: I1007 14:02:14.498562 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/953c368f-f670-44f8-b0ee-62a86bb2f5a9-etc-ovs\") pod \"ovn-controller-ovs-xvl8g\" (UID: \"953c368f-f670-44f8-b0ee-62a86bb2f5a9\") " pod="openstack/ovn-controller-ovs-xvl8g" Oct 07 14:02:14 crc kubenswrapper[4854]: I1007 14:02:14.498605 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/953c368f-f670-44f8-b0ee-62a86bb2f5a9-var-lib\") pod \"ovn-controller-ovs-xvl8g\" (UID: \"953c368f-f670-44f8-b0ee-62a86bb2f5a9\") " pod="openstack/ovn-controller-ovs-xvl8g" Oct 07 14:02:14 crc kubenswrapper[4854]: I1007 14:02:14.498721 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: 
\"kubernetes.io/host-path/953c368f-f670-44f8-b0ee-62a86bb2f5a9-var-lib\") pod \"ovn-controller-ovs-xvl8g\" (UID: \"953c368f-f670-44f8-b0ee-62a86bb2f5a9\") " pod="openstack/ovn-controller-ovs-xvl8g" Oct 07 14:02:14 crc kubenswrapper[4854]: I1007 14:02:14.500898 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/953c368f-f670-44f8-b0ee-62a86bb2f5a9-scripts\") pod \"ovn-controller-ovs-xvl8g\" (UID: \"953c368f-f670-44f8-b0ee-62a86bb2f5a9\") " pod="openstack/ovn-controller-ovs-xvl8g" Oct 07 14:02:14 crc kubenswrapper[4854]: I1007 14:02:14.525972 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plt5q\" (UniqueName: \"kubernetes.io/projected/953c368f-f670-44f8-b0ee-62a86bb2f5a9-kube-api-access-plt5q\") pod \"ovn-controller-ovs-xvl8g\" (UID: \"953c368f-f670-44f8-b0ee-62a86bb2f5a9\") " pod="openstack/ovn-controller-ovs-xvl8g" Oct 07 14:02:14 crc kubenswrapper[4854]: I1007 14:02:14.558160 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-xvl8g" Oct 07 14:02:15 crc kubenswrapper[4854]: I1007 14:02:15.032078 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-5cddp"] Oct 07 14:02:15 crc kubenswrapper[4854]: I1007 14:02:15.177136 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-5cddp" event={"ID":"3d9803b6-b7f2-461c-9d3b-1fb1f39839e9","Type":"ContainerStarted","Data":"9357c7723f47b079bb3424dc1812b560da023024c0c0469daff53d83a14d043a"} Oct 07 14:02:15 crc kubenswrapper[4854]: I1007 14:02:15.565548 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-xvl8g"] Oct 07 14:02:15 crc kubenswrapper[4854]: I1007 14:02:15.691811 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-chjfz"] Oct 07 14:02:15 crc kubenswrapper[4854]: I1007 14:02:15.693796 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-chjfz" Oct 07 14:02:15 crc kubenswrapper[4854]: I1007 14:02:15.698582 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Oct 07 14:02:15 crc kubenswrapper[4854]: I1007 14:02:15.700511 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-chjfz"] Oct 07 14:02:15 crc kubenswrapper[4854]: I1007 14:02:15.839157 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/149c8d6e-1ee3-4211-af06-36a5eea10742-ovn-rundir\") pod \"ovn-controller-metrics-chjfz\" (UID: \"149c8d6e-1ee3-4211-af06-36a5eea10742\") " pod="openstack/ovn-controller-metrics-chjfz" Oct 07 14:02:15 crc kubenswrapper[4854]: I1007 14:02:15.839321 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xtq5\" (UniqueName: \"kubernetes.io/projected/149c8d6e-1ee3-4211-af06-36a5eea10742-kube-api-access-7xtq5\") pod \"ovn-controller-metrics-chjfz\" (UID: \"149c8d6e-1ee3-4211-af06-36a5eea10742\") " pod="openstack/ovn-controller-metrics-chjfz" Oct 07 14:02:15 crc kubenswrapper[4854]: I1007 14:02:15.839353 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/149c8d6e-1ee3-4211-af06-36a5eea10742-config\") pod \"ovn-controller-metrics-chjfz\" (UID: \"149c8d6e-1ee3-4211-af06-36a5eea10742\") " pod="openstack/ovn-controller-metrics-chjfz" Oct 07 14:02:15 crc kubenswrapper[4854]: I1007 14:02:15.839380 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/149c8d6e-1ee3-4211-af06-36a5eea10742-ovs-rundir\") pod \"ovn-controller-metrics-chjfz\" (UID: \"149c8d6e-1ee3-4211-af06-36a5eea10742\") " pod="openstack/ovn-controller-metrics-chjfz" Oct 07 14:02:15 crc kubenswrapper[4854]: I1007 14:02:15.940551 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xtq5\" (UniqueName: \"kubernetes.io/projected/149c8d6e-1ee3-4211-af06-36a5eea10742-kube-api-access-7xtq5\") pod \"ovn-controller-metrics-chjfz\" (UID: \"149c8d6e-1ee3-4211-af06-36a5eea10742\") " pod="openstack/ovn-controller-metrics-chjfz" Oct 07 14:02:15 crc kubenswrapper[4854]: I1007 14:02:15.940611 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/149c8d6e-1ee3-4211-af06-36a5eea10742-config\") pod \"ovn-controller-metrics-chjfz\" (UID: \"149c8d6e-1ee3-4211-af06-36a5eea10742\") " pod="openstack/ovn-controller-metrics-chjfz" Oct 07 14:02:15 crc kubenswrapper[4854]: I1007 14:02:15.940645 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/149c8d6e-1ee3-4211-af06-36a5eea10742-ovs-rundir\") pod \"ovn-controller-metrics-chjfz\" (UID: \"149c8d6e-1ee3-4211-af06-36a5eea10742\") " pod="openstack/ovn-controller-metrics-chjfz" Oct 07 14:02:15 crc kubenswrapper[4854]: I1007 14:02:15.940690 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/149c8d6e-1ee3-4211-af06-36a5eea10742-ovn-rundir\") pod \"ovn-controller-metrics-chjfz\" (UID: \"149c8d6e-1ee3-4211-af06-36a5eea10742\") " 
pod="openstack/ovn-controller-metrics-chjfz" Oct 07 14:02:15 crc kubenswrapper[4854]: I1007 14:02:15.940954 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/149c8d6e-1ee3-4211-af06-36a5eea10742-ovn-rundir\") pod \"ovn-controller-metrics-chjfz\" (UID: \"149c8d6e-1ee3-4211-af06-36a5eea10742\") " pod="openstack/ovn-controller-metrics-chjfz" Oct 07 14:02:15 crc kubenswrapper[4854]: I1007 14:02:15.941097 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/149c8d6e-1ee3-4211-af06-36a5eea10742-ovs-rundir\") pod \"ovn-controller-metrics-chjfz\" (UID: \"149c8d6e-1ee3-4211-af06-36a5eea10742\") " pod="openstack/ovn-controller-metrics-chjfz" Oct 07 14:02:15 crc kubenswrapper[4854]: I1007 14:02:15.941594 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/149c8d6e-1ee3-4211-af06-36a5eea10742-config\") pod \"ovn-controller-metrics-chjfz\" (UID: \"149c8d6e-1ee3-4211-af06-36a5eea10742\") " pod="openstack/ovn-controller-metrics-chjfz" Oct 07 14:02:15 crc kubenswrapper[4854]: I1007 14:02:15.960557 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xtq5\" (UniqueName: \"kubernetes.io/projected/149c8d6e-1ee3-4211-af06-36a5eea10742-kube-api-access-7xtq5\") pod \"ovn-controller-metrics-chjfz\" (UID: \"149c8d6e-1ee3-4211-af06-36a5eea10742\") " pod="openstack/ovn-controller-metrics-chjfz" Oct 07 14:02:16 crc kubenswrapper[4854]: I1007 14:02:16.053697 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-chjfz" Oct 07 14:02:16 crc kubenswrapper[4854]: I1007 14:02:16.191080 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-5cddp" event={"ID":"3d9803b6-b7f2-461c-9d3b-1fb1f39839e9","Type":"ContainerStarted","Data":"2aa4a986afff0ccf054887376a43b8f08db653706c2baaaaa77ec8c42b26610e"} Oct 07 14:02:16 crc kubenswrapper[4854]: I1007 14:02:16.191399 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-5cddp" Oct 07 14:02:16 crc kubenswrapper[4854]: I1007 14:02:16.193304 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-xvl8g" event={"ID":"953c368f-f670-44f8-b0ee-62a86bb2f5a9","Type":"ContainerStarted","Data":"74e62367112e39c74ddfbc7389a7708adac51f1e47f4b885ea1997133e08fe97"} Oct 07 14:02:16 crc kubenswrapper[4854]: I1007 14:02:16.193354 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-xvl8g" event={"ID":"953c368f-f670-44f8-b0ee-62a86bb2f5a9","Type":"ContainerStarted","Data":"2a4a16bf4a87bae23a6c91b54d0249fc32f4d5cb9bc3377f68ccd759af0a305b"} Oct 07 14:02:16 crc kubenswrapper[4854]: I1007 14:02:16.210899 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-5cddp" podStartSLOduration=2.210881585 podStartE2EDuration="2.210881585s" podCreationTimestamp="2025-10-07 14:02:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:02:16.206046237 +0000 UTC m=+5852.193878482" watchObservedRunningTime="2025-10-07 14:02:16.210881585 +0000 UTC m=+5852.198713840" Oct 07 14:02:16 crc kubenswrapper[4854]: I1007 14:02:16.502244 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/ovn-controller-metrics-chjfz"] Oct 07 14:02:16 crc kubenswrapper[4854]: W1007 14:02:16.507725 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod149c8d6e_1ee3_4211_af06_36a5eea10742.slice/crio-17920f0cb1010ec9941384c10310346a405d2fc3be5a9ff33546aa8a4fa67a80 WatchSource:0}: Error finding container 17920f0cb1010ec9941384c10310346a405d2fc3be5a9ff33546aa8a4fa67a80: Status 404 returned error can't find the container with id 17920f0cb1010ec9941384c10310346a405d2fc3be5a9ff33546aa8a4fa67a80 Oct 07 14:02:17 crc kubenswrapper[4854]: I1007 14:02:17.121977 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-create-wjrd7"] Oct 07 14:02:17 crc kubenswrapper[4854]: I1007 14:02:17.124009 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-wjrd7" Oct 07 14:02:17 crc kubenswrapper[4854]: I1007 14:02:17.132725 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-wjrd7"] Oct 07 14:02:17 crc kubenswrapper[4854]: I1007 14:02:17.170461 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2x7x\" (UniqueName: \"kubernetes.io/projected/4a661f17-e028-427d-8a00-76cdffdea5ba-kube-api-access-q2x7x\") pod \"octavia-db-create-wjrd7\" (UID: \"4a661f17-e028-427d-8a00-76cdffdea5ba\") " pod="openstack/octavia-db-create-wjrd7" Oct 07 14:02:17 crc kubenswrapper[4854]: I1007 14:02:17.205385 4854 generic.go:334] "Generic (PLEG): container finished" podID="953c368f-f670-44f8-b0ee-62a86bb2f5a9" containerID="74e62367112e39c74ddfbc7389a7708adac51f1e47f4b885ea1997133e08fe97" exitCode=0 Oct 07 14:02:17 crc kubenswrapper[4854]: I1007 14:02:17.205458 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-xvl8g" event={"ID":"953c368f-f670-44f8-b0ee-62a86bb2f5a9","Type":"ContainerDied","Data":"74e62367112e39c74ddfbc7389a7708adac51f1e47f4b885ea1997133e08fe97"} Oct 07 14:02:17 crc kubenswrapper[4854]: I1007 14:02:17.209757 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-chjfz" event={"ID":"149c8d6e-1ee3-4211-af06-36a5eea10742","Type":"ContainerStarted","Data":"aa5b3a3fa4d7f0f43d19a0ed811ec189359cb445bebb12a6542bde9acd967279"} Oct 07 14:02:17 crc kubenswrapper[4854]: I1007 14:02:17.209796 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-chjfz" event={"ID":"149c8d6e-1ee3-4211-af06-36a5eea10742","Type":"ContainerStarted","Data":"17920f0cb1010ec9941384c10310346a405d2fc3be5a9ff33546aa8a4fa67a80"} Oct 07 14:02:17 crc kubenswrapper[4854]: I1007 14:02:17.261650 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-chjfz" podStartSLOduration=2.261623994 podStartE2EDuration="2.261623994s" podCreationTimestamp="2025-10-07 14:02:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:02:17.249447086 +0000 UTC m=+5853.237279361" watchObservedRunningTime="2025-10-07 14:02:17.261623994 +0000 UTC m=+5853.249456259" Oct 07 14:02:17 crc kubenswrapper[4854]: I1007 14:02:17.275832 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2x7x\" (UniqueName: \"kubernetes.io/projected/4a661f17-e028-427d-8a00-76cdffdea5ba-kube-api-access-q2x7x\") pod 
\"octavia-db-create-wjrd7\" (UID: \"4a661f17-e028-427d-8a00-76cdffdea5ba\") " pod="openstack/octavia-db-create-wjrd7" Oct 07 14:02:17 crc kubenswrapper[4854]: I1007 14:02:17.298142 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2x7x\" (UniqueName: \"kubernetes.io/projected/4a661f17-e028-427d-8a00-76cdffdea5ba-kube-api-access-q2x7x\") pod \"octavia-db-create-wjrd7\" (UID: \"4a661f17-e028-427d-8a00-76cdffdea5ba\") " pod="openstack/octavia-db-create-wjrd7" Oct 07 14:02:17 crc kubenswrapper[4854]: I1007 14:02:17.454936 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-wjrd7" Oct 07 14:02:17 crc kubenswrapper[4854]: I1007 14:02:17.922356 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-wjrd7"] Oct 07 14:02:18 crc kubenswrapper[4854]: I1007 14:02:18.220349 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-xvl8g" event={"ID":"953c368f-f670-44f8-b0ee-62a86bb2f5a9","Type":"ContainerStarted","Data":"3c00893df10f11689a45c9963f9f0f73b59c3aa24e0d698c4a8b2d93f458e42f"} Oct 07 14:02:18 crc kubenswrapper[4854]: I1007 14:02:18.220415 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-xvl8g" event={"ID":"953c368f-f670-44f8-b0ee-62a86bb2f5a9","Type":"ContainerStarted","Data":"4fb0c04dd1dd5ab444f3c0a502fe6de7ab75597c112110cc08a403b3d097bdc2"} Oct 07 14:02:18 crc kubenswrapper[4854]: I1007 14:02:18.220544 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-xvl8g" Oct 07 14:02:18 crc kubenswrapper[4854]: I1007 14:02:18.220572 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-xvl8g" Oct 07 14:02:18 crc kubenswrapper[4854]: I1007 14:02:18.222680 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-wjrd7" event={"ID":"4a661f17-e028-427d-8a00-76cdffdea5ba","Type":"ContainerStarted","Data":"f93d0b34ee32037545a664909f46902df7c999eb1881c553523483e37e858157"} Oct 07 14:02:18 crc kubenswrapper[4854]: I1007 14:02:18.222716 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-wjrd7" event={"ID":"4a661f17-e028-427d-8a00-76cdffdea5ba","Type":"ContainerStarted","Data":"87f28eeff9eb4d6776601f5243e9a56f86b6bca290e69acee02b7e739205d7ee"} Oct 07 14:02:18 crc kubenswrapper[4854]: I1007 14:02:18.283479 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-xvl8g" podStartSLOduration=4.283369955 podStartE2EDuration="4.283369955s" podCreationTimestamp="2025-10-07 14:02:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:02:18.270363384 +0000 UTC m=+5854.258195649" watchObservedRunningTime="2025-10-07 14:02:18.283369955 +0000 UTC m=+5854.271202230" Oct 07 14:02:18 crc kubenswrapper[4854]: I1007 14:02:18.296912 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-db-create-wjrd7" podStartSLOduration=1.296891922 podStartE2EDuration="1.296891922s" podCreationTimestamp="2025-10-07 14:02:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:02:18.286749882 +0000 UTC m=+5854.274582137" watchObservedRunningTime="2025-10-07 14:02:18.296891922 +0000 UTC 
m=+5854.284724187" Oct 07 14:02:19 crc kubenswrapper[4854]: I1007 14:02:19.236534 4854 generic.go:334] "Generic (PLEG): container finished" podID="4a661f17-e028-427d-8a00-76cdffdea5ba" containerID="f93d0b34ee32037545a664909f46902df7c999eb1881c553523483e37e858157" exitCode=0 Oct 07 14:02:19 crc kubenswrapper[4854]: I1007 14:02:19.236610 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-wjrd7" event={"ID":"4a661f17-e028-427d-8a00-76cdffdea5ba","Type":"ContainerDied","Data":"f93d0b34ee32037545a664909f46902df7c999eb1881c553523483e37e858157"} Oct 07 14:02:20 crc kubenswrapper[4854]: I1007 14:02:20.664390 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-wjrd7" Oct 07 14:02:20 crc kubenswrapper[4854]: I1007 14:02:20.739465 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2x7x\" (UniqueName: \"kubernetes.io/projected/4a661f17-e028-427d-8a00-76cdffdea5ba-kube-api-access-q2x7x\") pod \"4a661f17-e028-427d-8a00-76cdffdea5ba\" (UID: \"4a661f17-e028-427d-8a00-76cdffdea5ba\") " Oct 07 14:02:20 crc kubenswrapper[4854]: I1007 14:02:20.745284 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a661f17-e028-427d-8a00-76cdffdea5ba-kube-api-access-q2x7x" (OuterVolumeSpecName: "kube-api-access-q2x7x") pod "4a661f17-e028-427d-8a00-76cdffdea5ba" (UID: "4a661f17-e028-427d-8a00-76cdffdea5ba"). InnerVolumeSpecName "kube-api-access-q2x7x". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:02:20 crc kubenswrapper[4854]: I1007 14:02:20.841278 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2x7x\" (UniqueName: \"kubernetes.io/projected/4a661f17-e028-427d-8a00-76cdffdea5ba-kube-api-access-q2x7x\") on node \"crc\" DevicePath \"\"" Oct 07 14:02:21 crc kubenswrapper[4854]: I1007 14:02:21.260832 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-wjrd7" event={"ID":"4a661f17-e028-427d-8a00-76cdffdea5ba","Type":"ContainerDied","Data":"87f28eeff9eb4d6776601f5243e9a56f86b6bca290e69acee02b7e739205d7ee"} Oct 07 14:02:21 crc kubenswrapper[4854]: I1007 14:02:21.260886 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87f28eeff9eb4d6776601f5243e9a56f86b6bca290e69acee02b7e739205d7ee" Oct 07 14:02:21 crc kubenswrapper[4854]: I1007 14:02:21.260956 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-wjrd7" Oct 07 14:02:29 crc kubenswrapper[4854]: I1007 14:02:29.042791 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-4d76-account-create-825gq"] Oct 07 14:02:29 crc kubenswrapper[4854]: E1007 14:02:29.044274 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a661f17-e028-427d-8a00-76cdffdea5ba" containerName="mariadb-database-create" Oct 07 14:02:29 crc kubenswrapper[4854]: I1007 14:02:29.044311 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a661f17-e028-427d-8a00-76cdffdea5ba" containerName="mariadb-database-create" Oct 07 14:02:29 crc kubenswrapper[4854]: I1007 14:02:29.044723 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a661f17-e028-427d-8a00-76cdffdea5ba" containerName="mariadb-database-create" Oct 07 14:02:29 crc kubenswrapper[4854]: I1007 14:02:29.045964 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-4d76-account-create-825gq" Oct 07 14:02:29 crc kubenswrapper[4854]: I1007 14:02:29.048516 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-db-secret" Oct 07 14:02:29 crc kubenswrapper[4854]: I1007 14:02:29.052665 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-4d76-account-create-825gq"] Oct 07 14:02:29 crc kubenswrapper[4854]: I1007 14:02:29.129541 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5t67r\" (UniqueName: \"kubernetes.io/projected/f3b22209-9bb6-4ae3-bcf8-4530fee452c5-kube-api-access-5t67r\") pod \"octavia-4d76-account-create-825gq\" (UID: \"f3b22209-9bb6-4ae3-bcf8-4530fee452c5\") " pod="openstack/octavia-4d76-account-create-825gq" Oct 07 14:02:29 crc kubenswrapper[4854]: I1007 14:02:29.232115 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5t67r\" (UniqueName: \"kubernetes.io/projected/f3b22209-9bb6-4ae3-bcf8-4530fee452c5-kube-api-access-5t67r\") pod \"octavia-4d76-account-create-825gq\" (UID: \"f3b22209-9bb6-4ae3-bcf8-4530fee452c5\") " pod="openstack/octavia-4d76-account-create-825gq" Oct 07 14:02:29 crc kubenswrapper[4854]: I1007 14:02:29.256757 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5t67r\" (UniqueName: \"kubernetes.io/projected/f3b22209-9bb6-4ae3-bcf8-4530fee452c5-kube-api-access-5t67r\") pod \"octavia-4d76-account-create-825gq\" (UID: \"f3b22209-9bb6-4ae3-bcf8-4530fee452c5\") " pod="openstack/octavia-4d76-account-create-825gq" Oct 07 14:02:29 crc kubenswrapper[4854]: I1007 14:02:29.381564 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-4d76-account-create-825gq" Oct 07 14:02:29 crc kubenswrapper[4854]: I1007 14:02:29.858839 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-4d76-account-create-825gq"] Oct 07 14:02:30 crc kubenswrapper[4854]: I1007 14:02:30.362009 4854 generic.go:334] "Generic (PLEG): container finished" podID="f3b22209-9bb6-4ae3-bcf8-4530fee452c5" containerID="8a2d62b0e54763882d622240e5dc7552851155a784e35405c9243ecb3a76f780" exitCode=0 Oct 07 14:02:30 crc kubenswrapper[4854]: I1007 14:02:30.362133 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-4d76-account-create-825gq" event={"ID":"f3b22209-9bb6-4ae3-bcf8-4530fee452c5","Type":"ContainerDied","Data":"8a2d62b0e54763882d622240e5dc7552851155a784e35405c9243ecb3a76f780"} Oct 07 14:02:30 crc kubenswrapper[4854]: I1007 14:02:30.362384 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-4d76-account-create-825gq" event={"ID":"f3b22209-9bb6-4ae3-bcf8-4530fee452c5","Type":"ContainerStarted","Data":"4edc8dd99a0b8dcf1767eb354d3dd86471b98ae4c2d84fbd0e9a36aabad077e6"} Oct 07 14:02:31 crc kubenswrapper[4854]: I1007 14:02:31.780176 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-4d76-account-create-825gq" Oct 07 14:02:31 crc kubenswrapper[4854]: I1007 14:02:31.885926 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5t67r\" (UniqueName: \"kubernetes.io/projected/f3b22209-9bb6-4ae3-bcf8-4530fee452c5-kube-api-access-5t67r\") pod \"f3b22209-9bb6-4ae3-bcf8-4530fee452c5\" (UID: \"f3b22209-9bb6-4ae3-bcf8-4530fee452c5\") " Oct 07 14:02:31 crc kubenswrapper[4854]: I1007 14:02:31.894534 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3b22209-9bb6-4ae3-bcf8-4530fee452c5-kube-api-access-5t67r" (OuterVolumeSpecName: "kube-api-access-5t67r") pod "f3b22209-9bb6-4ae3-bcf8-4530fee452c5" (UID: "f3b22209-9bb6-4ae3-bcf8-4530fee452c5"). InnerVolumeSpecName "kube-api-access-5t67r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:02:31 crc kubenswrapper[4854]: I1007 14:02:31.988334 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5t67r\" (UniqueName: \"kubernetes.io/projected/f3b22209-9bb6-4ae3-bcf8-4530fee452c5-kube-api-access-5t67r\") on node \"crc\" DevicePath \"\"" Oct 07 14:02:32 crc kubenswrapper[4854]: I1007 14:02:32.392849 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-4d76-account-create-825gq" event={"ID":"f3b22209-9bb6-4ae3-bcf8-4530fee452c5","Type":"ContainerDied","Data":"4edc8dd99a0b8dcf1767eb354d3dd86471b98ae4c2d84fbd0e9a36aabad077e6"} Oct 07 14:02:32 crc kubenswrapper[4854]: I1007 14:02:32.393471 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4edc8dd99a0b8dcf1767eb354d3dd86471b98ae4c2d84fbd0e9a36aabad077e6" Oct 07 14:02:32 crc kubenswrapper[4854]: I1007 14:02:32.392935 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-4d76-account-create-825gq" Oct 07 14:02:36 crc kubenswrapper[4854]: I1007 14:02:36.096872 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-persistence-db-create-q6n4n"] Oct 07 14:02:36 crc kubenswrapper[4854]: E1007 14:02:36.097813 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3b22209-9bb6-4ae3-bcf8-4530fee452c5" containerName="mariadb-account-create" Oct 07 14:02:36 crc kubenswrapper[4854]: I1007 14:02:36.097825 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3b22209-9bb6-4ae3-bcf8-4530fee452c5" containerName="mariadb-account-create" Oct 07 14:02:36 crc kubenswrapper[4854]: I1007 14:02:36.098009 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3b22209-9bb6-4ae3-bcf8-4530fee452c5" containerName="mariadb-account-create" Oct 07 14:02:36 crc kubenswrapper[4854]: I1007 14:02:36.098707 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-persistence-db-create-q6n4n" Oct 07 14:02:36 crc kubenswrapper[4854]: I1007 14:02:36.111119 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-q6n4n"] Oct 07 14:02:36 crc kubenswrapper[4854]: I1007 14:02:36.169097 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hd8tf\" (UniqueName: \"kubernetes.io/projected/d99fd817-df4d-4f1f-8915-a2e87d2266a9-kube-api-access-hd8tf\") pod \"octavia-persistence-db-create-q6n4n\" (UID: \"d99fd817-df4d-4f1f-8915-a2e87d2266a9\") " pod="openstack/octavia-persistence-db-create-q6n4n" Oct 07 14:02:36 crc kubenswrapper[4854]: I1007 14:02:36.270936 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hd8tf\" (UniqueName: \"kubernetes.io/projected/d99fd817-df4d-4f1f-8915-a2e87d2266a9-kube-api-access-hd8tf\") pod \"octavia-persistence-db-create-q6n4n\" (UID: \"d99fd817-df4d-4f1f-8915-a2e87d2266a9\") " pod="openstack/octavia-persistence-db-create-q6n4n" Oct 07 14:02:36 crc kubenswrapper[4854]: I1007 14:02:36.312340 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hd8tf\" (UniqueName: \"kubernetes.io/projected/d99fd817-df4d-4f1f-8915-a2e87d2266a9-kube-api-access-hd8tf\") pod \"octavia-persistence-db-create-q6n4n\" (UID: \"d99fd817-df4d-4f1f-8915-a2e87d2266a9\") " pod="openstack/octavia-persistence-db-create-q6n4n" Oct 07 14:02:36 crc kubenswrapper[4854]: I1007 14:02:36.430049 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-q6n4n" Oct 07 14:02:36 crc kubenswrapper[4854]: I1007 14:02:36.916054 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-q6n4n"] Oct 07 14:02:37 crc kubenswrapper[4854]: I1007 14:02:37.447536 4854 generic.go:334] "Generic (PLEG): container finished" podID="d99fd817-df4d-4f1f-8915-a2e87d2266a9" containerID="85f0ed58c1f0270d454f5b478f6549461d576fed90482ad5e3de7205eeaa3af0" exitCode=0 Oct 07 14:02:37 crc kubenswrapper[4854]: I1007 14:02:37.447655 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-q6n4n" event={"ID":"d99fd817-df4d-4f1f-8915-a2e87d2266a9","Type":"ContainerDied","Data":"85f0ed58c1f0270d454f5b478f6549461d576fed90482ad5e3de7205eeaa3af0"} Oct 07 14:02:37 crc kubenswrapper[4854]: I1007 14:02:37.447880 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-q6n4n" event={"ID":"d99fd817-df4d-4f1f-8915-a2e87d2266a9","Type":"ContainerStarted","Data":"4bf29c2d5746e95d8926fe4d867029669efad2aab464cf743495de5ebd819ea2"} Oct 07 14:02:38 crc kubenswrapper[4854]: I1007 14:02:38.835089 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-persistence-db-create-q6n4n" Oct 07 14:02:38 crc kubenswrapper[4854]: I1007 14:02:38.926706 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hd8tf\" (UniqueName: \"kubernetes.io/projected/d99fd817-df4d-4f1f-8915-a2e87d2266a9-kube-api-access-hd8tf\") pod \"d99fd817-df4d-4f1f-8915-a2e87d2266a9\" (UID: \"d99fd817-df4d-4f1f-8915-a2e87d2266a9\") " Oct 07 14:02:38 crc kubenswrapper[4854]: I1007 14:02:38.931374 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d99fd817-df4d-4f1f-8915-a2e87d2266a9-kube-api-access-hd8tf" (OuterVolumeSpecName: "kube-api-access-hd8tf") pod "d99fd817-df4d-4f1f-8915-a2e87d2266a9" (UID: "d99fd817-df4d-4f1f-8915-a2e87d2266a9"). InnerVolumeSpecName "kube-api-access-hd8tf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:02:39 crc kubenswrapper[4854]: I1007 14:02:39.029806 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hd8tf\" (UniqueName: \"kubernetes.io/projected/d99fd817-df4d-4f1f-8915-a2e87d2266a9-kube-api-access-hd8tf\") on node \"crc\" DevicePath \"\"" Oct 07 14:02:39 crc kubenswrapper[4854]: I1007 14:02:39.475180 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-q6n4n" event={"ID":"d99fd817-df4d-4f1f-8915-a2e87d2266a9","Type":"ContainerDied","Data":"4bf29c2d5746e95d8926fe4d867029669efad2aab464cf743495de5ebd819ea2"} Oct 07 14:02:39 crc kubenswrapper[4854]: I1007 14:02:39.475532 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4bf29c2d5746e95d8926fe4d867029669efad2aab464cf743495de5ebd819ea2" Oct 07 14:02:39 crc kubenswrapper[4854]: I1007 14:02:39.475304 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-q6n4n" Oct 07 14:02:47 crc kubenswrapper[4854]: I1007 14:02:47.214777 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-414c-account-create-fwl6z"] Oct 07 14:02:47 crc kubenswrapper[4854]: E1007 14:02:47.215753 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d99fd817-df4d-4f1f-8915-a2e87d2266a9" containerName="mariadb-database-create" Oct 07 14:02:47 crc kubenswrapper[4854]: I1007 14:02:47.215772 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="d99fd817-df4d-4f1f-8915-a2e87d2266a9" containerName="mariadb-database-create" Oct 07 14:02:47 crc kubenswrapper[4854]: I1007 14:02:47.216047 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="d99fd817-df4d-4f1f-8915-a2e87d2266a9" containerName="mariadb-database-create" Oct 07 14:02:47 crc kubenswrapper[4854]: I1007 14:02:47.216957 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-414c-account-create-fwl6z" Oct 07 14:02:47 crc kubenswrapper[4854]: I1007 14:02:47.221106 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-persistence-db-secret" Oct 07 14:02:47 crc kubenswrapper[4854]: I1007 14:02:47.226087 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-414c-account-create-fwl6z"] Oct 07 14:02:47 crc kubenswrapper[4854]: I1007 14:02:47.311962 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxkv6\" (UniqueName: \"kubernetes.io/projected/b2c854b9-6e47-40df-ac06-bcce0489547b-kube-api-access-dxkv6\") pod \"octavia-414c-account-create-fwl6z\" (UID: \"b2c854b9-6e47-40df-ac06-bcce0489547b\") " pod="openstack/octavia-414c-account-create-fwl6z" Oct 07 14:02:47 crc kubenswrapper[4854]: I1007 14:02:47.414339 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxkv6\" (UniqueName: \"kubernetes.io/projected/b2c854b9-6e47-40df-ac06-bcce0489547b-kube-api-access-dxkv6\") pod \"octavia-414c-account-create-fwl6z\" (UID: \"b2c854b9-6e47-40df-ac06-bcce0489547b\") " pod="openstack/octavia-414c-account-create-fwl6z" Oct 07 14:02:47 crc kubenswrapper[4854]: I1007 14:02:47.442481 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxkv6\" (UniqueName: \"kubernetes.io/projected/b2c854b9-6e47-40df-ac06-bcce0489547b-kube-api-access-dxkv6\") pod \"octavia-414c-account-create-fwl6z\" (UID: \"b2c854b9-6e47-40df-ac06-bcce0489547b\") " pod="openstack/octavia-414c-account-create-fwl6z" Oct 07 14:02:47 crc kubenswrapper[4854]: I1007 14:02:47.540684 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-414c-account-create-fwl6z" Oct 07 14:02:48 crc kubenswrapper[4854]: W1007 14:02:48.033697 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb2c854b9_6e47_40df_ac06_bcce0489547b.slice/crio-6fb04759c592141807f10bb13f80aef5ba82b3a9af8685f9ea1f296d6ffe6a06 WatchSource:0}: Error finding container 6fb04759c592141807f10bb13f80aef5ba82b3a9af8685f9ea1f296d6ffe6a06: Status 404 returned error can't find the container with id 6fb04759c592141807f10bb13f80aef5ba82b3a9af8685f9ea1f296d6ffe6a06 Oct 07 14:02:48 crc kubenswrapper[4854]: I1007 14:02:48.035658 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-414c-account-create-fwl6z"] Oct 07 14:02:48 crc kubenswrapper[4854]: I1007 14:02:48.571866 4854 generic.go:334] "Generic (PLEG): container finished" podID="b2c854b9-6e47-40df-ac06-bcce0489547b" containerID="784c0b89017c39a90ac2e572fe3b83c0969a8eb7e49d6082619c0859201f921a" exitCode=0 Oct 07 14:02:48 crc kubenswrapper[4854]: I1007 14:02:48.572069 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-414c-account-create-fwl6z" event={"ID":"b2c854b9-6e47-40df-ac06-bcce0489547b","Type":"ContainerDied","Data":"784c0b89017c39a90ac2e572fe3b83c0969a8eb7e49d6082619c0859201f921a"} Oct 07 14:02:48 crc kubenswrapper[4854]: I1007 14:02:48.572243 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-414c-account-create-fwl6z" event={"ID":"b2c854b9-6e47-40df-ac06-bcce0489547b","Type":"ContainerStarted","Data":"6fb04759c592141807f10bb13f80aef5ba82b3a9af8685f9ea1f296d6ffe6a06"} Oct 07 14:02:49 crc kubenswrapper[4854]: I1007 14:02:49.553351 4854 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-5cddp" podUID="3d9803b6-b7f2-461c-9d3b-1fb1f39839e9" containerName="ovn-controller" probeResult="failure" output=< Oct 07 14:02:49 crc kubenswrapper[4854]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 07 14:02:49 crc kubenswrapper[4854]: > Oct 07 14:02:49 crc kubenswrapper[4854]: I1007 14:02:49.644472 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-xvl8g" Oct 07 14:02:49 crc kubenswrapper[4854]: I1007 14:02:49.659957 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-xvl8g" Oct 07 14:02:49 crc kubenswrapper[4854]: I1007 14:02:49.794194 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-5cddp-config-v55wm"] Oct 07 14:02:49 crc kubenswrapper[4854]: I1007 14:02:49.795335 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-5cddp-config-v55wm" Oct 07 14:02:49 crc kubenswrapper[4854]: I1007 14:02:49.799492 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 07 14:02:49 crc kubenswrapper[4854]: I1007 14:02:49.806798 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-5cddp-config-v55wm"] Oct 07 14:02:49 crc kubenswrapper[4854]: I1007 14:02:49.968211 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/985cd74c-8a17-440c-b3bc-533447a4edd4-var-run\") pod \"ovn-controller-5cddp-config-v55wm\" (UID: \"985cd74c-8a17-440c-b3bc-533447a4edd4\") " pod="openstack/ovn-controller-5cddp-config-v55wm" Oct 07 14:02:49 crc kubenswrapper[4854]: I1007 14:02:49.968273 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jprwz\" (UniqueName: \"kubernetes.io/projected/985cd74c-8a17-440c-b3bc-533447a4edd4-kube-api-access-jprwz\") pod \"ovn-controller-5cddp-config-v55wm\" (UID: \"985cd74c-8a17-440c-b3bc-533447a4edd4\") " pod="openstack/ovn-controller-5cddp-config-v55wm" Oct 07 14:02:49 crc kubenswrapper[4854]: I1007 14:02:49.968323 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/985cd74c-8a17-440c-b3bc-533447a4edd4-var-run-ovn\") pod \"ovn-controller-5cddp-config-v55wm\" (UID: \"985cd74c-8a17-440c-b3bc-533447a4edd4\") " pod="openstack/ovn-controller-5cddp-config-v55wm" Oct 07 14:02:49 crc kubenswrapper[4854]: I1007 14:02:49.968365 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/985cd74c-8a17-440c-b3bc-533447a4edd4-var-log-ovn\") pod \"ovn-controller-5cddp-config-v55wm\" (UID: \"985cd74c-8a17-440c-b3bc-533447a4edd4\") " pod="openstack/ovn-controller-5cddp-config-v55wm" Oct 07 14:02:49 crc kubenswrapper[4854]: I1007 14:02:49.968513 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/985cd74c-8a17-440c-b3bc-533447a4edd4-additional-scripts\") pod \"ovn-controller-5cddp-config-v55wm\" (UID: \"985cd74c-8a17-440c-b3bc-533447a4edd4\") " pod="openstack/ovn-controller-5cddp-config-v55wm" Oct 07 14:02:49 crc kubenswrapper[4854]: I1007 14:02:49.968711 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/985cd74c-8a17-440c-b3bc-533447a4edd4-scripts\") pod \"ovn-controller-5cddp-config-v55wm\" (UID: \"985cd74c-8a17-440c-b3bc-533447a4edd4\") " pod="openstack/ovn-controller-5cddp-config-v55wm" Oct 07 14:02:49 crc kubenswrapper[4854]: I1007 14:02:49.985579 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-414c-account-create-fwl6z" Oct 07 14:02:50 crc kubenswrapper[4854]: I1007 14:02:50.070724 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxkv6\" (UniqueName: \"kubernetes.io/projected/b2c854b9-6e47-40df-ac06-bcce0489547b-kube-api-access-dxkv6\") pod \"b2c854b9-6e47-40df-ac06-bcce0489547b\" (UID: \"b2c854b9-6e47-40df-ac06-bcce0489547b\") " Oct 07 14:02:50 crc kubenswrapper[4854]: I1007 14:02:50.071644 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/985cd74c-8a17-440c-b3bc-533447a4edd4-scripts\") pod \"ovn-controller-5cddp-config-v55wm\" (UID: \"985cd74c-8a17-440c-b3bc-533447a4edd4\") " pod="openstack/ovn-controller-5cddp-config-v55wm" Oct 07 14:02:50 crc kubenswrapper[4854]: I1007 14:02:50.071769 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/985cd74c-8a17-440c-b3bc-533447a4edd4-var-run\") pod \"ovn-controller-5cddp-config-v55wm\" (UID: \"985cd74c-8a17-440c-b3bc-533447a4edd4\") " pod="openstack/ovn-controller-5cddp-config-v55wm" Oct 07 14:02:50 crc kubenswrapper[4854]: I1007 14:02:50.071809 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jprwz\" (UniqueName: \"kubernetes.io/projected/985cd74c-8a17-440c-b3bc-533447a4edd4-kube-api-access-jprwz\") pod \"ovn-controller-5cddp-config-v55wm\" (UID: \"985cd74c-8a17-440c-b3bc-533447a4edd4\") " pod="openstack/ovn-controller-5cddp-config-v55wm" Oct 07 14:02:50 crc kubenswrapper[4854]: I1007 14:02:50.071864 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/985cd74c-8a17-440c-b3bc-533447a4edd4-var-run-ovn\") pod \"ovn-controller-5cddp-config-v55wm\" (UID: \"985cd74c-8a17-440c-b3bc-533447a4edd4\") " pod="openstack/ovn-controller-5cddp-config-v55wm" Oct 07 14:02:50 crc kubenswrapper[4854]: I1007 14:02:50.071907 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/985cd74c-8a17-440c-b3bc-533447a4edd4-var-log-ovn\") pod \"ovn-controller-5cddp-config-v55wm\" (UID: \"985cd74c-8a17-440c-b3bc-533447a4edd4\") " pod="openstack/ovn-controller-5cddp-config-v55wm" Oct 07 14:02:50 crc kubenswrapper[4854]: I1007 14:02:50.071960 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/985cd74c-8a17-440c-b3bc-533447a4edd4-additional-scripts\") pod \"ovn-controller-5cddp-config-v55wm\" (UID: \"985cd74c-8a17-440c-b3bc-533447a4edd4\") " pod="openstack/ovn-controller-5cddp-config-v55wm" Oct 07 14:02:50 crc kubenswrapper[4854]: I1007 14:02:50.072025 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/985cd74c-8a17-440c-b3bc-533447a4edd4-var-run\") pod \"ovn-controller-5cddp-config-v55wm\" (UID: \"985cd74c-8a17-440c-b3bc-533447a4edd4\") " pod="openstack/ovn-controller-5cddp-config-v55wm" Oct 07 14:02:50 crc kubenswrapper[4854]: I1007 14:02:50.072109 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/985cd74c-8a17-440c-b3bc-533447a4edd4-var-run-ovn\") pod \"ovn-controller-5cddp-config-v55wm\" (UID: \"985cd74c-8a17-440c-b3bc-533447a4edd4\") " 
pod="openstack/ovn-controller-5cddp-config-v55wm" Oct 07 14:02:50 crc kubenswrapper[4854]: I1007 14:02:50.072437 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/985cd74c-8a17-440c-b3bc-533447a4edd4-var-log-ovn\") pod \"ovn-controller-5cddp-config-v55wm\" (UID: \"985cd74c-8a17-440c-b3bc-533447a4edd4\") " pod="openstack/ovn-controller-5cddp-config-v55wm" Oct 07 14:02:50 crc kubenswrapper[4854]: I1007 14:02:50.072801 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/985cd74c-8a17-440c-b3bc-533447a4edd4-additional-scripts\") pod \"ovn-controller-5cddp-config-v55wm\" (UID: \"985cd74c-8a17-440c-b3bc-533447a4edd4\") " pod="openstack/ovn-controller-5cddp-config-v55wm" Oct 07 14:02:50 crc kubenswrapper[4854]: I1007 14:02:50.073601 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/985cd74c-8a17-440c-b3bc-533447a4edd4-scripts\") pod \"ovn-controller-5cddp-config-v55wm\" (UID: \"985cd74c-8a17-440c-b3bc-533447a4edd4\") " pod="openstack/ovn-controller-5cddp-config-v55wm" Oct 07 14:02:50 crc kubenswrapper[4854]: I1007 14:02:50.077033 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2c854b9-6e47-40df-ac06-bcce0489547b-kube-api-access-dxkv6" (OuterVolumeSpecName: "kube-api-access-dxkv6") pod "b2c854b9-6e47-40df-ac06-bcce0489547b" (UID: "b2c854b9-6e47-40df-ac06-bcce0489547b"). InnerVolumeSpecName "kube-api-access-dxkv6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:02:50 crc kubenswrapper[4854]: I1007 14:02:50.089831 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jprwz\" (UniqueName: \"kubernetes.io/projected/985cd74c-8a17-440c-b3bc-533447a4edd4-kube-api-access-jprwz\") pod \"ovn-controller-5cddp-config-v55wm\" (UID: \"985cd74c-8a17-440c-b3bc-533447a4edd4\") " pod="openstack/ovn-controller-5cddp-config-v55wm" Oct 07 14:02:50 crc kubenswrapper[4854]: I1007 14:02:50.123501 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-5cddp-config-v55wm" Oct 07 14:02:50 crc kubenswrapper[4854]: I1007 14:02:50.173612 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxkv6\" (UniqueName: \"kubernetes.io/projected/b2c854b9-6e47-40df-ac06-bcce0489547b-kube-api-access-dxkv6\") on node \"crc\" DevicePath \"\"" Oct 07 14:02:50 crc kubenswrapper[4854]: I1007 14:02:50.579950 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-5cddp-config-v55wm"] Oct 07 14:02:50 crc kubenswrapper[4854]: W1007 14:02:50.579957 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod985cd74c_8a17_440c_b3bc_533447a4edd4.slice/crio-434eff6eca7dcece5ad79c23f0965419e484c305dd73076d564835254842beb2 WatchSource:0}: Error finding container 434eff6eca7dcece5ad79c23f0965419e484c305dd73076d564835254842beb2: Status 404 returned error can't find the container with id 434eff6eca7dcece5ad79c23f0965419e484c305dd73076d564835254842beb2 Oct 07 14:02:50 crc kubenswrapper[4854]: I1007 14:02:50.593556 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-5cddp-config-v55wm" event={"ID":"985cd74c-8a17-440c-b3bc-533447a4edd4","Type":"ContainerStarted","Data":"434eff6eca7dcece5ad79c23f0965419e484c305dd73076d564835254842beb2"} Oct 07 14:02:50 crc kubenswrapper[4854]: I1007 14:02:50.595636 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-414c-account-create-fwl6z" event={"ID":"b2c854b9-6e47-40df-ac06-bcce0489547b","Type":"ContainerDied","Data":"6fb04759c592141807f10bb13f80aef5ba82b3a9af8685f9ea1f296d6ffe6a06"} Oct 07 14:02:50 crc kubenswrapper[4854]: I1007 14:02:50.595664 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-414c-account-create-fwl6z" Oct 07 14:02:50 crc kubenswrapper[4854]: I1007 14:02:50.595686 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6fb04759c592141807f10bb13f80aef5ba82b3a9af8685f9ea1f296d6ffe6a06" Oct 07 14:02:51 crc kubenswrapper[4854]: I1007 14:02:51.611883 4854 generic.go:334] "Generic (PLEG): container finished" podID="985cd74c-8a17-440c-b3bc-533447a4edd4" containerID="bfb141acdac6641fdd80c3f01c194ca941338295668bc6f98d8ea4a469471a17" exitCode=0 Oct 07 14:02:51 crc kubenswrapper[4854]: I1007 14:02:51.612435 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-5cddp-config-v55wm" event={"ID":"985cd74c-8a17-440c-b3bc-533447a4edd4","Type":"ContainerDied","Data":"bfb141acdac6641fdd80c3f01c194ca941338295668bc6f98d8ea4a469471a17"} Oct 07 14:02:52 crc kubenswrapper[4854]: I1007 14:02:52.983736 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-5cddp-config-v55wm" Oct 07 14:02:53 crc kubenswrapper[4854]: I1007 14:02:53.134396 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/985cd74c-8a17-440c-b3bc-533447a4edd4-var-log-ovn\") pod \"985cd74c-8a17-440c-b3bc-533447a4edd4\" (UID: \"985cd74c-8a17-440c-b3bc-533447a4edd4\") " Oct 07 14:02:53 crc kubenswrapper[4854]: I1007 14:02:53.134468 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/985cd74c-8a17-440c-b3bc-533447a4edd4-additional-scripts\") pod \"985cd74c-8a17-440c-b3bc-533447a4edd4\" (UID: \"985cd74c-8a17-440c-b3bc-533447a4edd4\") " Oct 07 14:02:53 crc kubenswrapper[4854]: I1007 14:02:53.134667 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/985cd74c-8a17-440c-b3bc-533447a4edd4-scripts\") pod \"985cd74c-8a17-440c-b3bc-533447a4edd4\" (UID: \"985cd74c-8a17-440c-b3bc-533447a4edd4\") " Oct 07 14:02:53 crc kubenswrapper[4854]: I1007 14:02:53.134701 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/985cd74c-8a17-440c-b3bc-533447a4edd4-var-run\") pod \"985cd74c-8a17-440c-b3bc-533447a4edd4\" (UID: \"985cd74c-8a17-440c-b3bc-533447a4edd4\") " Oct 07 14:02:53 crc kubenswrapper[4854]: I1007 14:02:53.134741 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/985cd74c-8a17-440c-b3bc-533447a4edd4-var-run-ovn\") pod \"985cd74c-8a17-440c-b3bc-533447a4edd4\" (UID: \"985cd74c-8a17-440c-b3bc-533447a4edd4\") " Oct 07 14:02:53 crc kubenswrapper[4854]: I1007 14:02:53.134775 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jprwz\" (UniqueName: \"kubernetes.io/projected/985cd74c-8a17-440c-b3bc-533447a4edd4-kube-api-access-jprwz\") pod \"985cd74c-8a17-440c-b3bc-533447a4edd4\" (UID: \"985cd74c-8a17-440c-b3bc-533447a4edd4\") " Oct 07 14:02:53 crc kubenswrapper[4854]: I1007 14:02:53.135909 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/985cd74c-8a17-440c-b3bc-533447a4edd4-scripts" (OuterVolumeSpecName: "scripts") pod "985cd74c-8a17-440c-b3bc-533447a4edd4" (UID: "985cd74c-8a17-440c-b3bc-533447a4edd4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:02:53 crc kubenswrapper[4854]: I1007 14:02:53.136101 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/985cd74c-8a17-440c-b3bc-533447a4edd4-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "985cd74c-8a17-440c-b3bc-533447a4edd4" (UID: "985cd74c-8a17-440c-b3bc-533447a4edd4"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 14:02:53 crc kubenswrapper[4854]: I1007 14:02:53.136089 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/985cd74c-8a17-440c-b3bc-533447a4edd4-var-run" (OuterVolumeSpecName: "var-run") pod "985cd74c-8a17-440c-b3bc-533447a4edd4" (UID: "985cd74c-8a17-440c-b3bc-533447a4edd4"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 14:02:53 crc kubenswrapper[4854]: I1007 14:02:53.136322 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/985cd74c-8a17-440c-b3bc-533447a4edd4-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "985cd74c-8a17-440c-b3bc-533447a4edd4" (UID: "985cd74c-8a17-440c-b3bc-533447a4edd4"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:02:53 crc kubenswrapper[4854]: I1007 14:02:53.136372 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/985cd74c-8a17-440c-b3bc-533447a4edd4-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "985cd74c-8a17-440c-b3bc-533447a4edd4" (UID: "985cd74c-8a17-440c-b3bc-533447a4edd4"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 14:02:53 crc kubenswrapper[4854]: I1007 14:02:53.139919 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/985cd74c-8a17-440c-b3bc-533447a4edd4-kube-api-access-jprwz" (OuterVolumeSpecName: "kube-api-access-jprwz") pod "985cd74c-8a17-440c-b3bc-533447a4edd4" (UID: "985cd74c-8a17-440c-b3bc-533447a4edd4"). InnerVolumeSpecName "kube-api-access-jprwz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:02:53 crc kubenswrapper[4854]: I1007 14:02:53.236637 4854 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/985cd74c-8a17-440c-b3bc-533447a4edd4-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 07 14:02:53 crc kubenswrapper[4854]: I1007 14:02:53.236667 4854 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/985cd74c-8a17-440c-b3bc-533447a4edd4-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 14:02:53 crc kubenswrapper[4854]: I1007 14:02:53.236677 4854 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/985cd74c-8a17-440c-b3bc-533447a4edd4-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 14:02:53 crc kubenswrapper[4854]: I1007 14:02:53.236710 4854 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/985cd74c-8a17-440c-b3bc-533447a4edd4-var-run\") on node \"crc\" DevicePath \"\"" Oct 07 14:02:53 crc kubenswrapper[4854]: I1007 14:02:53.236719 4854 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/985cd74c-8a17-440c-b3bc-533447a4edd4-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 07 14:02:53 crc kubenswrapper[4854]: I1007 14:02:53.236735 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jprwz\" (UniqueName: \"kubernetes.io/projected/985cd74c-8a17-440c-b3bc-533447a4edd4-kube-api-access-jprwz\") on node \"crc\" DevicePath \"\"" Oct 07 14:02:53 crc kubenswrapper[4854]: I1007 14:02:53.254445 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5d6s8"] Oct 07 14:02:53 crc kubenswrapper[4854]: E1007 14:02:53.254897 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2c854b9-6e47-40df-ac06-bcce0489547b" containerName="mariadb-account-create" Oct 07 14:02:53 crc kubenswrapper[4854]: I1007 14:02:53.254917 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2c854b9-6e47-40df-ac06-bcce0489547b" 
containerName="mariadb-account-create" Oct 07 14:02:53 crc kubenswrapper[4854]: E1007 14:02:53.254959 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="985cd74c-8a17-440c-b3bc-533447a4edd4" containerName="ovn-config" Oct 07 14:02:53 crc kubenswrapper[4854]: I1007 14:02:53.254965 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="985cd74c-8a17-440c-b3bc-533447a4edd4" containerName="ovn-config" Oct 07 14:02:53 crc kubenswrapper[4854]: I1007 14:02:53.255135 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="985cd74c-8a17-440c-b3bc-533447a4edd4" containerName="ovn-config" Oct 07 14:02:53 crc kubenswrapper[4854]: I1007 14:02:53.255170 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2c854b9-6e47-40df-ac06-bcce0489547b" containerName="mariadb-account-create" Oct 07 14:02:53 crc kubenswrapper[4854]: I1007 14:02:53.256696 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5d6s8" Oct 07 14:02:53 crc kubenswrapper[4854]: I1007 14:02:53.269722 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5d6s8"] Oct 07 14:02:53 crc kubenswrapper[4854]: I1007 14:02:53.337876 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adc47d70-c658-494d-a6c4-6bb88e047b75-utilities\") pod \"community-operators-5d6s8\" (UID: \"adc47d70-c658-494d-a6c4-6bb88e047b75\") " pod="openshift-marketplace/community-operators-5d6s8" Oct 07 14:02:53 crc kubenswrapper[4854]: I1007 14:02:53.338209 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adc47d70-c658-494d-a6c4-6bb88e047b75-catalog-content\") pod \"community-operators-5d6s8\" (UID: \"adc47d70-c658-494d-a6c4-6bb88e047b75\") " pod="openshift-marketplace/community-operators-5d6s8" Oct 07 14:02:53 crc kubenswrapper[4854]: I1007 14:02:53.338239 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkg79\" (UniqueName: \"kubernetes.io/projected/adc47d70-c658-494d-a6c4-6bb88e047b75-kube-api-access-jkg79\") pod \"community-operators-5d6s8\" (UID: \"adc47d70-c658-494d-a6c4-6bb88e047b75\") " pod="openshift-marketplace/community-operators-5d6s8" Oct 07 14:02:53 crc kubenswrapper[4854]: I1007 14:02:53.439985 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adc47d70-c658-494d-a6c4-6bb88e047b75-utilities\") pod \"community-operators-5d6s8\" (UID: \"adc47d70-c658-494d-a6c4-6bb88e047b75\") " pod="openshift-marketplace/community-operators-5d6s8" Oct 07 14:02:53 crc kubenswrapper[4854]: I1007 14:02:53.440040 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adc47d70-c658-494d-a6c4-6bb88e047b75-catalog-content\") pod \"community-operators-5d6s8\" (UID: \"adc47d70-c658-494d-a6c4-6bb88e047b75\") " pod="openshift-marketplace/community-operators-5d6s8" Oct 07 14:02:53 crc kubenswrapper[4854]: I1007 14:02:53.440072 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkg79\" (UniqueName: \"kubernetes.io/projected/adc47d70-c658-494d-a6c4-6bb88e047b75-kube-api-access-jkg79\") pod \"community-operators-5d6s8\" (UID: 
\"adc47d70-c658-494d-a6c4-6bb88e047b75\") " pod="openshift-marketplace/community-operators-5d6s8" Oct 07 14:02:53 crc kubenswrapper[4854]: I1007 14:02:53.440760 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adc47d70-c658-494d-a6c4-6bb88e047b75-utilities\") pod \"community-operators-5d6s8\" (UID: \"adc47d70-c658-494d-a6c4-6bb88e047b75\") " pod="openshift-marketplace/community-operators-5d6s8" Oct 07 14:02:53 crc kubenswrapper[4854]: I1007 14:02:53.440831 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adc47d70-c658-494d-a6c4-6bb88e047b75-catalog-content\") pod \"community-operators-5d6s8\" (UID: \"adc47d70-c658-494d-a6c4-6bb88e047b75\") " pod="openshift-marketplace/community-operators-5d6s8" Oct 07 14:02:53 crc kubenswrapper[4854]: I1007 14:02:53.483220 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkg79\" (UniqueName: \"kubernetes.io/projected/adc47d70-c658-494d-a6c4-6bb88e047b75-kube-api-access-jkg79\") pod \"community-operators-5d6s8\" (UID: \"adc47d70-c658-494d-a6c4-6bb88e047b75\") " pod="openshift-marketplace/community-operators-5d6s8" Oct 07 14:02:53 crc kubenswrapper[4854]: I1007 14:02:53.546902 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-api-7ff49dfc98-fd4k5"] Oct 07 14:02:53 crc kubenswrapper[4854]: I1007 14:02:53.548849 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-7ff49dfc98-fd4k5" Oct 07 14:02:53 crc kubenswrapper[4854]: I1007 14:02:53.552189 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-octavia-dockercfg-tmnj8" Oct 07 14:02:53 crc kubenswrapper[4854]: I1007 14:02:53.553680 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-config-data" Oct 07 14:02:53 crc kubenswrapper[4854]: I1007 14:02:53.563058 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-scripts" Oct 07 14:02:53 crc kubenswrapper[4854]: I1007 14:02:53.567820 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-7ff49dfc98-fd4k5"] Oct 07 14:02:53 crc kubenswrapper[4854]: I1007 14:02:53.612955 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5d6s8" Oct 07 14:02:53 crc kubenswrapper[4854]: I1007 14:02:53.647445 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff25e0e6-68d6-4c15-8ba6-8582764830ce-combined-ca-bundle\") pod \"octavia-api-7ff49dfc98-fd4k5\" (UID: \"ff25e0e6-68d6-4c15-8ba6-8582764830ce\") " pod="openstack/octavia-api-7ff49dfc98-fd4k5" Oct 07 14:02:53 crc kubenswrapper[4854]: I1007 14:02:53.647507 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/ff25e0e6-68d6-4c15-8ba6-8582764830ce-octavia-run\") pod \"octavia-api-7ff49dfc98-fd4k5\" (UID: \"ff25e0e6-68d6-4c15-8ba6-8582764830ce\") " pod="openstack/octavia-api-7ff49dfc98-fd4k5" Oct 07 14:02:53 crc kubenswrapper[4854]: I1007 14:02:53.647593 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff25e0e6-68d6-4c15-8ba6-8582764830ce-scripts\") pod \"octavia-api-7ff49dfc98-fd4k5\" (UID: \"ff25e0e6-68d6-4c15-8ba6-8582764830ce\") " pod="openstack/octavia-api-7ff49dfc98-fd4k5" Oct 07 14:02:53 crc kubenswrapper[4854]: I1007 14:02:53.647688 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff25e0e6-68d6-4c15-8ba6-8582764830ce-config-data\") pod \"octavia-api-7ff49dfc98-fd4k5\" (UID: \"ff25e0e6-68d6-4c15-8ba6-8582764830ce\") " pod="openstack/octavia-api-7ff49dfc98-fd4k5" Oct 07 14:02:53 crc kubenswrapper[4854]: I1007 14:02:53.647727 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/ff25e0e6-68d6-4c15-8ba6-8582764830ce-config-data-merged\") pod \"octavia-api-7ff49dfc98-fd4k5\" (UID: \"ff25e0e6-68d6-4c15-8ba6-8582764830ce\") " pod="openstack/octavia-api-7ff49dfc98-fd4k5" Oct 07 14:02:53 crc kubenswrapper[4854]: I1007 14:02:53.653491 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-5cddp-config-v55wm" event={"ID":"985cd74c-8a17-440c-b3bc-533447a4edd4","Type":"ContainerDied","Data":"434eff6eca7dcece5ad79c23f0965419e484c305dd73076d564835254842beb2"} Oct 07 14:02:53 crc kubenswrapper[4854]: I1007 14:02:53.653535 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="434eff6eca7dcece5ad79c23f0965419e484c305dd73076d564835254842beb2" Oct 07 14:02:53 crc kubenswrapper[4854]: I1007 14:02:53.653598 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-5cddp-config-v55wm" Oct 07 14:02:53 crc kubenswrapper[4854]: I1007 14:02:53.749223 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff25e0e6-68d6-4c15-8ba6-8582764830ce-scripts\") pod \"octavia-api-7ff49dfc98-fd4k5\" (UID: \"ff25e0e6-68d6-4c15-8ba6-8582764830ce\") " pod="openstack/octavia-api-7ff49dfc98-fd4k5" Oct 07 14:02:53 crc kubenswrapper[4854]: I1007 14:02:53.749321 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff25e0e6-68d6-4c15-8ba6-8582764830ce-config-data\") pod \"octavia-api-7ff49dfc98-fd4k5\" (UID: \"ff25e0e6-68d6-4c15-8ba6-8582764830ce\") " pod="openstack/octavia-api-7ff49dfc98-fd4k5" Oct 07 14:02:53 crc kubenswrapper[4854]: I1007 14:02:53.749350 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/ff25e0e6-68d6-4c15-8ba6-8582764830ce-config-data-merged\") pod \"octavia-api-7ff49dfc98-fd4k5\" (UID: \"ff25e0e6-68d6-4c15-8ba6-8582764830ce\") " pod="openstack/octavia-api-7ff49dfc98-fd4k5" Oct 07 14:02:53 crc kubenswrapper[4854]: I1007 14:02:53.749397 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff25e0e6-68d6-4c15-8ba6-8582764830ce-combined-ca-bundle\") pod \"octavia-api-7ff49dfc98-fd4k5\" (UID: \"ff25e0e6-68d6-4c15-8ba6-8582764830ce\") " pod="openstack/octavia-api-7ff49dfc98-fd4k5" Oct 07 14:02:53 crc kubenswrapper[4854]: I1007 14:02:53.749416 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/ff25e0e6-68d6-4c15-8ba6-8582764830ce-octavia-run\") pod \"octavia-api-7ff49dfc98-fd4k5\" (UID: \"ff25e0e6-68d6-4c15-8ba6-8582764830ce\") " pod="openstack/octavia-api-7ff49dfc98-fd4k5" Oct 07 14:02:53 crc kubenswrapper[4854]: I1007 14:02:53.749824 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/ff25e0e6-68d6-4c15-8ba6-8582764830ce-octavia-run\") pod \"octavia-api-7ff49dfc98-fd4k5\" (UID: \"ff25e0e6-68d6-4c15-8ba6-8582764830ce\") " pod="openstack/octavia-api-7ff49dfc98-fd4k5" Oct 07 14:02:53 crc kubenswrapper[4854]: I1007 14:02:53.751980 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/ff25e0e6-68d6-4c15-8ba6-8582764830ce-config-data-merged\") pod \"octavia-api-7ff49dfc98-fd4k5\" (UID: \"ff25e0e6-68d6-4c15-8ba6-8582764830ce\") " pod="openstack/octavia-api-7ff49dfc98-fd4k5" Oct 07 14:02:53 crc kubenswrapper[4854]: I1007 14:02:53.760696 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff25e0e6-68d6-4c15-8ba6-8582764830ce-config-data\") pod \"octavia-api-7ff49dfc98-fd4k5\" (UID: \"ff25e0e6-68d6-4c15-8ba6-8582764830ce\") " pod="openstack/octavia-api-7ff49dfc98-fd4k5" Oct 07 14:02:53 crc kubenswrapper[4854]: I1007 14:02:53.760907 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff25e0e6-68d6-4c15-8ba6-8582764830ce-scripts\") pod \"octavia-api-7ff49dfc98-fd4k5\" (UID: \"ff25e0e6-68d6-4c15-8ba6-8582764830ce\") " pod="openstack/octavia-api-7ff49dfc98-fd4k5" Oct 07 14:02:53 crc kubenswrapper[4854]: I1007 
14:02:53.767000 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff25e0e6-68d6-4c15-8ba6-8582764830ce-combined-ca-bundle\") pod \"octavia-api-7ff49dfc98-fd4k5\" (UID: \"ff25e0e6-68d6-4c15-8ba6-8582764830ce\") " pod="openstack/octavia-api-7ff49dfc98-fd4k5" Oct 07 14:02:53 crc kubenswrapper[4854]: I1007 14:02:53.874887 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-7ff49dfc98-fd4k5" Oct 07 14:02:54 crc kubenswrapper[4854]: I1007 14:02:54.078938 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-5cddp-config-v55wm"] Oct 07 14:02:54 crc kubenswrapper[4854]: I1007 14:02:54.089468 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-5cddp-config-v55wm"] Oct 07 14:02:54 crc kubenswrapper[4854]: I1007 14:02:54.103483 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5d6s8"] Oct 07 14:02:54 crc kubenswrapper[4854]: I1007 14:02:54.298699 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-7ff49dfc98-fd4k5"] Oct 07 14:02:54 crc kubenswrapper[4854]: W1007 14:02:54.306358 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff25e0e6_68d6_4c15_8ba6_8582764830ce.slice/crio-7c3bd707e8e31b2ec44c5ecf14a7916de192083c59c68b856e46f75d160c95cf WatchSource:0}: Error finding container 7c3bd707e8e31b2ec44c5ecf14a7916de192083c59c68b856e46f75d160c95cf: Status 404 returned error can't find the container with id 7c3bd707e8e31b2ec44c5ecf14a7916de192083c59c68b856e46f75d160c95cf Oct 07 14:02:54 crc kubenswrapper[4854]: I1007 14:02:54.559448 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-5cddp" Oct 07 14:02:54 crc kubenswrapper[4854]: I1007 14:02:54.665987 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-7ff49dfc98-fd4k5" event={"ID":"ff25e0e6-68d6-4c15-8ba6-8582764830ce","Type":"ContainerStarted","Data":"7c3bd707e8e31b2ec44c5ecf14a7916de192083c59c68b856e46f75d160c95cf"} Oct 07 14:02:54 crc kubenswrapper[4854]: I1007 14:02:54.668108 4854 generic.go:334] "Generic (PLEG): container finished" podID="adc47d70-c658-494d-a6c4-6bb88e047b75" containerID="a45308c3ff9a77517c6ec6d3429cba20083bf118d9dbf67695ad611d65821bdb" exitCode=0 Oct 07 14:02:54 crc kubenswrapper[4854]: I1007 14:02:54.668183 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5d6s8" event={"ID":"adc47d70-c658-494d-a6c4-6bb88e047b75","Type":"ContainerDied","Data":"a45308c3ff9a77517c6ec6d3429cba20083bf118d9dbf67695ad611d65821bdb"} Oct 07 14:02:54 crc kubenswrapper[4854]: I1007 14:02:54.668211 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5d6s8" event={"ID":"adc47d70-c658-494d-a6c4-6bb88e047b75","Type":"ContainerStarted","Data":"6cd245f069fc913c20fb15b42f6b4a26e28a2e96cb23d2d94e88ff6272126dd3"} Oct 07 14:02:54 crc kubenswrapper[4854]: I1007 14:02:54.717333 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="985cd74c-8a17-440c-b3bc-533447a4edd4" path="/var/lib/kubelet/pods/985cd74c-8a17-440c-b3bc-533447a4edd4/volumes" Oct 07 14:02:56 crc kubenswrapper[4854]: I1007 14:02:56.689847 4854 generic.go:334] "Generic (PLEG): container finished" podID="adc47d70-c658-494d-a6c4-6bb88e047b75" 
containerID="9c290d051539376fb53abe3db5abc3366e66645e5b69d9b2f04c229129307120" exitCode=0 Oct 07 14:02:56 crc kubenswrapper[4854]: I1007 14:02:56.689921 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5d6s8" event={"ID":"adc47d70-c658-494d-a6c4-6bb88e047b75","Type":"ContainerDied","Data":"9c290d051539376fb53abe3db5abc3366e66645e5b69d9b2f04c229129307120"} Oct 07 14:03:03 crc kubenswrapper[4854]: I1007 14:03:03.766130 4854 generic.go:334] "Generic (PLEG): container finished" podID="ff25e0e6-68d6-4c15-8ba6-8582764830ce" containerID="210b7321c66055146eca804bb4b2a8ecae0795cd0f4897dfa62b24b35f4271d9" exitCode=0 Oct 07 14:03:03 crc kubenswrapper[4854]: I1007 14:03:03.766237 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-7ff49dfc98-fd4k5" event={"ID":"ff25e0e6-68d6-4c15-8ba6-8582764830ce","Type":"ContainerDied","Data":"210b7321c66055146eca804bb4b2a8ecae0795cd0f4897dfa62b24b35f4271d9"} Oct 07 14:03:03 crc kubenswrapper[4854]: I1007 14:03:03.771054 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5d6s8" event={"ID":"adc47d70-c658-494d-a6c4-6bb88e047b75","Type":"ContainerStarted","Data":"b9bca530b2253a9e35354d62081489edce2a783ec869d18ef429754c1e8fbd71"} Oct 07 14:03:03 crc kubenswrapper[4854]: I1007 14:03:03.824208 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5d6s8" podStartSLOduration=8.343139168 podStartE2EDuration="10.824185294s" podCreationTimestamp="2025-10-07 14:02:53 +0000 UTC" firstStartedPulling="2025-10-07 14:02:54.670767964 +0000 UTC m=+5890.658600229" lastFinishedPulling="2025-10-07 14:02:57.1518141 +0000 UTC m=+5893.139646355" observedRunningTime="2025-10-07 14:03:03.821859508 +0000 UTC m=+5899.809691773" watchObservedRunningTime="2025-10-07 14:03:03.824185294 +0000 UTC m=+5899.812017549" Oct 07 14:03:04 crc kubenswrapper[4854]: I1007 14:03:04.798893 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-7ff49dfc98-fd4k5" event={"ID":"ff25e0e6-68d6-4c15-8ba6-8582764830ce","Type":"ContainerStarted","Data":"89dc54f8701aa63500036a926751f1d39bfbff70eba23dadeea3efe5a0788efc"} Oct 07 14:03:04 crc kubenswrapper[4854]: I1007 14:03:04.799353 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-7ff49dfc98-fd4k5" event={"ID":"ff25e0e6-68d6-4c15-8ba6-8582764830ce","Type":"ContainerStarted","Data":"d9cdf3b05d974c554204063d3d0c118b238189b2df9e558bfd25017e16582dd0"} Oct 07 14:03:04 crc kubenswrapper[4854]: I1007 14:03:04.820821 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-api-7ff49dfc98-fd4k5" podStartSLOduration=3.177441055 podStartE2EDuration="11.820803428s" podCreationTimestamp="2025-10-07 14:02:53 +0000 UTC" firstStartedPulling="2025-10-07 14:02:54.307945354 +0000 UTC m=+5890.295777609" lastFinishedPulling="2025-10-07 14:03:02.951307717 +0000 UTC m=+5898.939139982" observedRunningTime="2025-10-07 14:03:04.817494053 +0000 UTC m=+5900.805326318" watchObservedRunningTime="2025-10-07 14:03:04.820803428 +0000 UTC m=+5900.808635683" Oct 07 14:03:05 crc kubenswrapper[4854]: I1007 14:03:05.810478 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-7ff49dfc98-fd4k5" Oct 07 14:03:05 crc kubenswrapper[4854]: I1007 14:03:05.810771 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-7ff49dfc98-fd4k5" Oct 07 
14:03:13 crc kubenswrapper[4854]: I1007 14:03:13.613684 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5d6s8" Oct 07 14:03:13 crc kubenswrapper[4854]: I1007 14:03:13.614195 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5d6s8" Oct 07 14:03:14 crc kubenswrapper[4854]: I1007 14:03:14.680919 4854 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-5d6s8" podUID="adc47d70-c658-494d-a6c4-6bb88e047b75" containerName="registry-server" probeResult="failure" output=< Oct 07 14:03:14 crc kubenswrapper[4854]: timeout: failed to connect service ":50051" within 1s Oct 07 14:03:14 crc kubenswrapper[4854]: > Oct 07 14:03:21 crc kubenswrapper[4854]: I1007 14:03:21.918332 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-rsyslog-c8jtv"] Oct 07 14:03:21 crc kubenswrapper[4854]: I1007 14:03:21.922014 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-rsyslog-c8jtv" Oct 07 14:03:21 crc kubenswrapper[4854]: I1007 14:03:21.928539 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-config-data" Oct 07 14:03:21 crc kubenswrapper[4854]: I1007 14:03:21.928785 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"octavia-hmport-map" Oct 07 14:03:21 crc kubenswrapper[4854]: I1007 14:03:21.928977 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-scripts" Oct 07 14:03:21 crc kubenswrapper[4854]: I1007 14:03:21.931589 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-c8jtv"] Oct 07 14:03:21 crc kubenswrapper[4854]: I1007 14:03:21.963358 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd7a15ef-3061-4eef-b5ec-f5933a93a797-scripts\") pod \"octavia-rsyslog-c8jtv\" (UID: \"cd7a15ef-3061-4eef-b5ec-f5933a93a797\") " pod="openstack/octavia-rsyslog-c8jtv" Oct 07 14:03:21 crc kubenswrapper[4854]: I1007 14:03:21.963526 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/cd7a15ef-3061-4eef-b5ec-f5933a93a797-config-data-merged\") pod \"octavia-rsyslog-c8jtv\" (UID: \"cd7a15ef-3061-4eef-b5ec-f5933a93a797\") " pod="openstack/octavia-rsyslog-c8jtv" Oct 07 14:03:21 crc kubenswrapper[4854]: I1007 14:03:21.963602 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/cd7a15ef-3061-4eef-b5ec-f5933a93a797-hm-ports\") pod \"octavia-rsyslog-c8jtv\" (UID: \"cd7a15ef-3061-4eef-b5ec-f5933a93a797\") " pod="openstack/octavia-rsyslog-c8jtv" Oct 07 14:03:21 crc kubenswrapper[4854]: I1007 14:03:21.963708 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd7a15ef-3061-4eef-b5ec-f5933a93a797-config-data\") pod \"octavia-rsyslog-c8jtv\" (UID: \"cd7a15ef-3061-4eef-b5ec-f5933a93a797\") " pod="openstack/octavia-rsyslog-c8jtv" Oct 07 14:03:22 crc kubenswrapper[4854]: I1007 14:03:22.065364 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: 
\"kubernetes.io/empty-dir/cd7a15ef-3061-4eef-b5ec-f5933a93a797-config-data-merged\") pod \"octavia-rsyslog-c8jtv\" (UID: \"cd7a15ef-3061-4eef-b5ec-f5933a93a797\") " pod="openstack/octavia-rsyslog-c8jtv" Oct 07 14:03:22 crc kubenswrapper[4854]: I1007 14:03:22.065451 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/cd7a15ef-3061-4eef-b5ec-f5933a93a797-hm-ports\") pod \"octavia-rsyslog-c8jtv\" (UID: \"cd7a15ef-3061-4eef-b5ec-f5933a93a797\") " pod="openstack/octavia-rsyslog-c8jtv" Oct 07 14:03:22 crc kubenswrapper[4854]: I1007 14:03:22.065518 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd7a15ef-3061-4eef-b5ec-f5933a93a797-config-data\") pod \"octavia-rsyslog-c8jtv\" (UID: \"cd7a15ef-3061-4eef-b5ec-f5933a93a797\") " pod="openstack/octavia-rsyslog-c8jtv" Oct 07 14:03:22 crc kubenswrapper[4854]: I1007 14:03:22.065623 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd7a15ef-3061-4eef-b5ec-f5933a93a797-scripts\") pod \"octavia-rsyslog-c8jtv\" (UID: \"cd7a15ef-3061-4eef-b5ec-f5933a93a797\") " pod="openstack/octavia-rsyslog-c8jtv" Oct 07 14:03:22 crc kubenswrapper[4854]: I1007 14:03:22.066058 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/cd7a15ef-3061-4eef-b5ec-f5933a93a797-config-data-merged\") pod \"octavia-rsyslog-c8jtv\" (UID: \"cd7a15ef-3061-4eef-b5ec-f5933a93a797\") " pod="openstack/octavia-rsyslog-c8jtv" Oct 07 14:03:22 crc kubenswrapper[4854]: I1007 14:03:22.066982 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/cd7a15ef-3061-4eef-b5ec-f5933a93a797-hm-ports\") pod \"octavia-rsyslog-c8jtv\" (UID: \"cd7a15ef-3061-4eef-b5ec-f5933a93a797\") " pod="openstack/octavia-rsyslog-c8jtv" Oct 07 14:03:22 crc kubenswrapper[4854]: I1007 14:03:22.070806 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd7a15ef-3061-4eef-b5ec-f5933a93a797-config-data\") pod \"octavia-rsyslog-c8jtv\" (UID: \"cd7a15ef-3061-4eef-b5ec-f5933a93a797\") " pod="openstack/octavia-rsyslog-c8jtv" Oct 07 14:03:22 crc kubenswrapper[4854]: I1007 14:03:22.071497 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd7a15ef-3061-4eef-b5ec-f5933a93a797-scripts\") pod \"octavia-rsyslog-c8jtv\" (UID: \"cd7a15ef-3061-4eef-b5ec-f5933a93a797\") " pod="openstack/octavia-rsyslog-c8jtv" Oct 07 14:03:22 crc kubenswrapper[4854]: I1007 14:03:22.250279 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-rsyslog-c8jtv" Oct 07 14:03:22 crc kubenswrapper[4854]: I1007 14:03:22.814941 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-c8jtv"] Oct 07 14:03:22 crc kubenswrapper[4854]: I1007 14:03:22.937794 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-image-upload-59f8cff499-c464q"] Oct 07 14:03:22 crc kubenswrapper[4854]: I1007 14:03:22.939327 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-c464q" Oct 07 14:03:22 crc kubenswrapper[4854]: I1007 14:03:22.949840 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-config-data" Oct 07 14:03:22 crc kubenswrapper[4854]: I1007 14:03:22.952207 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-c464q"] Oct 07 14:03:22 crc kubenswrapper[4854]: I1007 14:03:22.982294 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/85bbef8f-9a78-4c68-b0d0-6eb59418a6f6-httpd-config\") pod \"octavia-image-upload-59f8cff499-c464q\" (UID: \"85bbef8f-9a78-4c68-b0d0-6eb59418a6f6\") " pod="openstack/octavia-image-upload-59f8cff499-c464q" Oct 07 14:03:22 crc kubenswrapper[4854]: I1007 14:03:22.982507 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/85bbef8f-9a78-4c68-b0d0-6eb59418a6f6-amphora-image\") pod \"octavia-image-upload-59f8cff499-c464q\" (UID: \"85bbef8f-9a78-4c68-b0d0-6eb59418a6f6\") " pod="openstack/octavia-image-upload-59f8cff499-c464q" Oct 07 14:03:23 crc kubenswrapper[4854]: I1007 14:03:23.003946 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-c8jtv" event={"ID":"cd7a15ef-3061-4eef-b5ec-f5933a93a797","Type":"ContainerStarted","Data":"6f89d4b6507ab6a6b84c38fe1364828de7b099d5fcc3f6cbcc19f8c545b94bde"} Oct 07 14:03:23 crc kubenswrapper[4854]: I1007 14:03:23.083979 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/85bbef8f-9a78-4c68-b0d0-6eb59418a6f6-httpd-config\") pod \"octavia-image-upload-59f8cff499-c464q\" (UID: \"85bbef8f-9a78-4c68-b0d0-6eb59418a6f6\") " pod="openstack/octavia-image-upload-59f8cff499-c464q" Oct 07 14:03:23 crc kubenswrapper[4854]: I1007 14:03:23.084092 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/85bbef8f-9a78-4c68-b0d0-6eb59418a6f6-amphora-image\") pod \"octavia-image-upload-59f8cff499-c464q\" (UID: \"85bbef8f-9a78-4c68-b0d0-6eb59418a6f6\") " pod="openstack/octavia-image-upload-59f8cff499-c464q" Oct 07 14:03:23 crc kubenswrapper[4854]: I1007 14:03:23.084573 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/85bbef8f-9a78-4c68-b0d0-6eb59418a6f6-amphora-image\") pod \"octavia-image-upload-59f8cff499-c464q\" (UID: \"85bbef8f-9a78-4c68-b0d0-6eb59418a6f6\") " pod="openstack/octavia-image-upload-59f8cff499-c464q" Oct 07 14:03:23 crc kubenswrapper[4854]: I1007 14:03:23.093449 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/85bbef8f-9a78-4c68-b0d0-6eb59418a6f6-httpd-config\") pod \"octavia-image-upload-59f8cff499-c464q\" (UID: \"85bbef8f-9a78-4c68-b0d0-6eb59418a6f6\") " pod="openstack/octavia-image-upload-59f8cff499-c464q" Oct 07 14:03:23 crc kubenswrapper[4854]: I1007 14:03:23.273795 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-c464q" Oct 07 14:03:23 crc kubenswrapper[4854]: I1007 14:03:23.555950 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-c464q"] Oct 07 14:03:23 crc kubenswrapper[4854]: W1007 14:03:23.571652 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85bbef8f_9a78_4c68_b0d0_6eb59418a6f6.slice/crio-d2d6918cdea23171cd5eaca9e48f19f15d29b5293830867123878fe54e30cb98 WatchSource:0}: Error finding container d2d6918cdea23171cd5eaca9e48f19f15d29b5293830867123878fe54e30cb98: Status 404 returned error can't find the container with id d2d6918cdea23171cd5eaca9e48f19f15d29b5293830867123878fe54e30cb98 Oct 07 14:03:24 crc kubenswrapper[4854]: I1007 14:03:24.024381 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-c464q" event={"ID":"85bbef8f-9a78-4c68-b0d0-6eb59418a6f6","Type":"ContainerStarted","Data":"d2d6918cdea23171cd5eaca9e48f19f15d29b5293830867123878fe54e30cb98"} Oct 07 14:03:24 crc kubenswrapper[4854]: I1007 14:03:24.660536 4854 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-5d6s8" podUID="adc47d70-c658-494d-a6c4-6bb88e047b75" containerName="registry-server" probeResult="failure" output=< Oct 07 14:03:24 crc kubenswrapper[4854]: timeout: failed to connect service ":50051" within 1s Oct 07 14:03:24 crc kubenswrapper[4854]: > Oct 07 14:03:26 crc kubenswrapper[4854]: I1007 14:03:26.049366 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-c8jtv" event={"ID":"cd7a15ef-3061-4eef-b5ec-f5933a93a797","Type":"ContainerStarted","Data":"060f151a1b3f96a8994d38447e0950ad3952629aceb7d67dc84bc8f98f86a374"} Oct 07 14:03:28 crc kubenswrapper[4854]: I1007 14:03:28.072031 4854 generic.go:334] "Generic (PLEG): container finished" podID="cd7a15ef-3061-4eef-b5ec-f5933a93a797" containerID="060f151a1b3f96a8994d38447e0950ad3952629aceb7d67dc84bc8f98f86a374" exitCode=0 Oct 07 14:03:28 crc kubenswrapper[4854]: I1007 14:03:28.072392 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-c8jtv" event={"ID":"cd7a15ef-3061-4eef-b5ec-f5933a93a797","Type":"ContainerDied","Data":"060f151a1b3f96a8994d38447e0950ad3952629aceb7d67dc84bc8f98f86a374"} Oct 07 14:03:28 crc kubenswrapper[4854]: I1007 14:03:28.424009 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-7ff49dfc98-fd4k5" Oct 07 14:03:28 crc kubenswrapper[4854]: I1007 14:03:28.453381 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-7ff49dfc98-fd4k5" Oct 07 14:03:33 crc kubenswrapper[4854]: I1007 14:03:33.089235 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-sync-l6pgg"] Oct 07 14:03:33 crc kubenswrapper[4854]: I1007 14:03:33.091771 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-l6pgg" Oct 07 14:03:33 crc kubenswrapper[4854]: I1007 14:03:33.094086 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-scripts" Oct 07 14:03:33 crc kubenswrapper[4854]: I1007 14:03:33.101816 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-l6pgg"] Oct 07 14:03:33 crc kubenswrapper[4854]: I1007 14:03:33.206277 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44e40bbe-6d98-4e62-8fc1-79dfd4ccff94-combined-ca-bundle\") pod \"octavia-db-sync-l6pgg\" (UID: \"44e40bbe-6d98-4e62-8fc1-79dfd4ccff94\") " pod="openstack/octavia-db-sync-l6pgg" Oct 07 14:03:33 crc kubenswrapper[4854]: I1007 14:03:33.206380 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44e40bbe-6d98-4e62-8fc1-79dfd4ccff94-config-data\") pod \"octavia-db-sync-l6pgg\" (UID: \"44e40bbe-6d98-4e62-8fc1-79dfd4ccff94\") " pod="openstack/octavia-db-sync-l6pgg" Oct 07 14:03:33 crc kubenswrapper[4854]: I1007 14:03:33.206684 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44e40bbe-6d98-4e62-8fc1-79dfd4ccff94-scripts\") pod \"octavia-db-sync-l6pgg\" (UID: \"44e40bbe-6d98-4e62-8fc1-79dfd4ccff94\") " pod="openstack/octavia-db-sync-l6pgg" Oct 07 14:03:33 crc kubenswrapper[4854]: I1007 14:03:33.206912 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/44e40bbe-6d98-4e62-8fc1-79dfd4ccff94-config-data-merged\") pod \"octavia-db-sync-l6pgg\" (UID: \"44e40bbe-6d98-4e62-8fc1-79dfd4ccff94\") " pod="openstack/octavia-db-sync-l6pgg" Oct 07 14:03:33 crc kubenswrapper[4854]: I1007 14:03:33.308417 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44e40bbe-6d98-4e62-8fc1-79dfd4ccff94-combined-ca-bundle\") pod \"octavia-db-sync-l6pgg\" (UID: \"44e40bbe-6d98-4e62-8fc1-79dfd4ccff94\") " pod="openstack/octavia-db-sync-l6pgg" Oct 07 14:03:33 crc kubenswrapper[4854]: I1007 14:03:33.308509 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44e40bbe-6d98-4e62-8fc1-79dfd4ccff94-config-data\") pod \"octavia-db-sync-l6pgg\" (UID: \"44e40bbe-6d98-4e62-8fc1-79dfd4ccff94\") " pod="openstack/octavia-db-sync-l6pgg" Oct 07 14:03:33 crc kubenswrapper[4854]: I1007 14:03:33.308561 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44e40bbe-6d98-4e62-8fc1-79dfd4ccff94-scripts\") pod \"octavia-db-sync-l6pgg\" (UID: \"44e40bbe-6d98-4e62-8fc1-79dfd4ccff94\") " pod="openstack/octavia-db-sync-l6pgg" Oct 07 14:03:33 crc kubenswrapper[4854]: I1007 14:03:33.308609 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/44e40bbe-6d98-4e62-8fc1-79dfd4ccff94-config-data-merged\") pod \"octavia-db-sync-l6pgg\" (UID: \"44e40bbe-6d98-4e62-8fc1-79dfd4ccff94\") " pod="openstack/octavia-db-sync-l6pgg" Oct 07 14:03:33 crc kubenswrapper[4854]: I1007 14:03:33.309036 4854 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/44e40bbe-6d98-4e62-8fc1-79dfd4ccff94-config-data-merged\") pod \"octavia-db-sync-l6pgg\" (UID: \"44e40bbe-6d98-4e62-8fc1-79dfd4ccff94\") " pod="openstack/octavia-db-sync-l6pgg" Oct 07 14:03:33 crc kubenswrapper[4854]: I1007 14:03:33.314669 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44e40bbe-6d98-4e62-8fc1-79dfd4ccff94-combined-ca-bundle\") pod \"octavia-db-sync-l6pgg\" (UID: \"44e40bbe-6d98-4e62-8fc1-79dfd4ccff94\") " pod="openstack/octavia-db-sync-l6pgg" Oct 07 14:03:33 crc kubenswrapper[4854]: I1007 14:03:33.316068 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44e40bbe-6d98-4e62-8fc1-79dfd4ccff94-config-data\") pod \"octavia-db-sync-l6pgg\" (UID: \"44e40bbe-6d98-4e62-8fc1-79dfd4ccff94\") " pod="openstack/octavia-db-sync-l6pgg" Oct 07 14:03:33 crc kubenswrapper[4854]: I1007 14:03:33.326750 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44e40bbe-6d98-4e62-8fc1-79dfd4ccff94-scripts\") pod \"octavia-db-sync-l6pgg\" (UID: \"44e40bbe-6d98-4e62-8fc1-79dfd4ccff94\") " pod="openstack/octavia-db-sync-l6pgg" Oct 07 14:03:33 crc kubenswrapper[4854]: I1007 14:03:33.424591 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-l6pgg" Oct 07 14:03:33 crc kubenswrapper[4854]: I1007 14:03:33.664268 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5d6s8" Oct 07 14:03:33 crc kubenswrapper[4854]: I1007 14:03:33.718074 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5d6s8" Oct 07 14:03:33 crc kubenswrapper[4854]: I1007 14:03:33.900453 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5d6s8"] Oct 07 14:03:35 crc kubenswrapper[4854]: I1007 14:03:35.055953 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-r5dpl"] Oct 07 14:03:35 crc kubenswrapper[4854]: I1007 14:03:35.070059 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-r5dpl"] Oct 07 14:03:35 crc kubenswrapper[4854]: I1007 14:03:35.152303 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5d6s8" podUID="adc47d70-c658-494d-a6c4-6bb88e047b75" containerName="registry-server" containerID="cri-o://b9bca530b2253a9e35354d62081489edce2a783ec869d18ef429754c1e8fbd71" gracePeriod=2 Oct 07 14:03:35 crc kubenswrapper[4854]: I1007 14:03:35.748346 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5d6s8" Oct 07 14:03:35 crc kubenswrapper[4854]: W1007 14:03:35.853821 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44e40bbe_6d98_4e62_8fc1_79dfd4ccff94.slice/crio-e2cef896d6a7803294f3bd1ccc2f0866cb6e10e0068d3e2648b83f1959093bc6 WatchSource:0}: Error finding container e2cef896d6a7803294f3bd1ccc2f0866cb6e10e0068d3e2648b83f1959093bc6: Status 404 returned error can't find the container with id e2cef896d6a7803294f3bd1ccc2f0866cb6e10e0068d3e2648b83f1959093bc6 Oct 07 14:03:35 crc kubenswrapper[4854]: I1007 14:03:35.855116 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-l6pgg"] Oct 07 14:03:35 crc kubenswrapper[4854]: I1007 14:03:35.878106 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adc47d70-c658-494d-a6c4-6bb88e047b75-catalog-content\") pod \"adc47d70-c658-494d-a6c4-6bb88e047b75\" (UID: \"adc47d70-c658-494d-a6c4-6bb88e047b75\") " Oct 07 14:03:35 crc kubenswrapper[4854]: I1007 14:03:35.878553 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkg79\" (UniqueName: \"kubernetes.io/projected/adc47d70-c658-494d-a6c4-6bb88e047b75-kube-api-access-jkg79\") pod \"adc47d70-c658-494d-a6c4-6bb88e047b75\" (UID: \"adc47d70-c658-494d-a6c4-6bb88e047b75\") " Oct 07 14:03:35 crc kubenswrapper[4854]: I1007 14:03:35.878634 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adc47d70-c658-494d-a6c4-6bb88e047b75-utilities\") pod \"adc47d70-c658-494d-a6c4-6bb88e047b75\" (UID: \"adc47d70-c658-494d-a6c4-6bb88e047b75\") " Oct 07 14:03:35 crc kubenswrapper[4854]: I1007 14:03:35.882189 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/adc47d70-c658-494d-a6c4-6bb88e047b75-utilities" (OuterVolumeSpecName: "utilities") pod "adc47d70-c658-494d-a6c4-6bb88e047b75" (UID: "adc47d70-c658-494d-a6c4-6bb88e047b75"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:03:35 crc kubenswrapper[4854]: I1007 14:03:35.901789 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adc47d70-c658-494d-a6c4-6bb88e047b75-kube-api-access-jkg79" (OuterVolumeSpecName: "kube-api-access-jkg79") pod "adc47d70-c658-494d-a6c4-6bb88e047b75" (UID: "adc47d70-c658-494d-a6c4-6bb88e047b75"). InnerVolumeSpecName "kube-api-access-jkg79". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:03:35 crc kubenswrapper[4854]: I1007 14:03:35.939982 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/adc47d70-c658-494d-a6c4-6bb88e047b75-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "adc47d70-c658-494d-a6c4-6bb88e047b75" (UID: "adc47d70-c658-494d-a6c4-6bb88e047b75"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:03:35 crc kubenswrapper[4854]: I1007 14:03:35.980848 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adc47d70-c658-494d-a6c4-6bb88e047b75-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 14:03:35 crc kubenswrapper[4854]: I1007 14:03:35.980895 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkg79\" (UniqueName: \"kubernetes.io/projected/adc47d70-c658-494d-a6c4-6bb88e047b75-kube-api-access-jkg79\") on node \"crc\" DevicePath \"\"" Oct 07 14:03:35 crc kubenswrapper[4854]: I1007 14:03:35.980912 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adc47d70-c658-494d-a6c4-6bb88e047b75-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 14:03:36 crc kubenswrapper[4854]: I1007 14:03:36.161487 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-l6pgg" event={"ID":"44e40bbe-6d98-4e62-8fc1-79dfd4ccff94","Type":"ContainerStarted","Data":"248341ff6a952ec5666566969828ab89551b3ce71b595c4df9821322f0db3732"} Oct 07 14:03:36 crc kubenswrapper[4854]: I1007 14:03:36.161534 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-l6pgg" event={"ID":"44e40bbe-6d98-4e62-8fc1-79dfd4ccff94","Type":"ContainerStarted","Data":"e2cef896d6a7803294f3bd1ccc2f0866cb6e10e0068d3e2648b83f1959093bc6"} Oct 07 14:03:36 crc kubenswrapper[4854]: I1007 14:03:36.165180 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-c8jtv" event={"ID":"cd7a15ef-3061-4eef-b5ec-f5933a93a797","Type":"ContainerStarted","Data":"f0391c62e422c88c8e98d79c3dfd49754da18e3bf97dd21170f2bbd541f8d463"} Oct 07 14:03:36 crc kubenswrapper[4854]: I1007 14:03:36.165678 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-rsyslog-c8jtv" Oct 07 14:03:36 crc kubenswrapper[4854]: I1007 14:03:36.167007 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-c464q" event={"ID":"85bbef8f-9a78-4c68-b0d0-6eb59418a6f6","Type":"ContainerStarted","Data":"ffb4b1423a76d4710740c43456be2e3213f08f58f8b288a7c35ea86e7a42bdf5"} Oct 07 14:03:36 crc kubenswrapper[4854]: I1007 14:03:36.169059 4854 generic.go:334] "Generic (PLEG): container finished" podID="adc47d70-c658-494d-a6c4-6bb88e047b75" containerID="b9bca530b2253a9e35354d62081489edce2a783ec869d18ef429754c1e8fbd71" exitCode=0 Oct 07 14:03:36 crc kubenswrapper[4854]: I1007 14:03:36.169090 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5d6s8" event={"ID":"adc47d70-c658-494d-a6c4-6bb88e047b75","Type":"ContainerDied","Data":"b9bca530b2253a9e35354d62081489edce2a783ec869d18ef429754c1e8fbd71"} Oct 07 14:03:36 crc kubenswrapper[4854]: I1007 14:03:36.169105 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5d6s8" event={"ID":"adc47d70-c658-494d-a6c4-6bb88e047b75","Type":"ContainerDied","Data":"6cd245f069fc913c20fb15b42f6b4a26e28a2e96cb23d2d94e88ff6272126dd3"} Oct 07 14:03:36 crc kubenswrapper[4854]: I1007 14:03:36.169122 4854 scope.go:117] "RemoveContainer" containerID="b9bca530b2253a9e35354d62081489edce2a783ec869d18ef429754c1e8fbd71" Oct 07 14:03:36 crc kubenswrapper[4854]: I1007 14:03:36.169227 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5d6s8" Oct 07 14:03:36 crc kubenswrapper[4854]: I1007 14:03:36.193667 4854 scope.go:117] "RemoveContainer" containerID="9c290d051539376fb53abe3db5abc3366e66645e5b69d9b2f04c229129307120" Oct 07 14:03:36 crc kubenswrapper[4854]: I1007 14:03:36.203353 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-rsyslog-c8jtv" podStartSLOduration=2.66888127 podStartE2EDuration="15.20333584s" podCreationTimestamp="2025-10-07 14:03:21 +0000 UTC" firstStartedPulling="2025-10-07 14:03:22.798837164 +0000 UTC m=+5918.786669419" lastFinishedPulling="2025-10-07 14:03:35.333291724 +0000 UTC m=+5931.321123989" observedRunningTime="2025-10-07 14:03:36.196028441 +0000 UTC m=+5932.183860696" watchObservedRunningTime="2025-10-07 14:03:36.20333584 +0000 UTC m=+5932.191168095" Oct 07 14:03:36 crc kubenswrapper[4854]: I1007 14:03:36.233809 4854 scope.go:117] "RemoveContainer" containerID="a45308c3ff9a77517c6ec6d3429cba20083bf118d9dbf67695ad611d65821bdb" Oct 07 14:03:36 crc kubenswrapper[4854]: I1007 14:03:36.236580 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5d6s8"] Oct 07 14:03:36 crc kubenswrapper[4854]: I1007 14:03:36.243811 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5d6s8"] Oct 07 14:03:36 crc kubenswrapper[4854]: I1007 14:03:36.314560 4854 scope.go:117] "RemoveContainer" containerID="b9bca530b2253a9e35354d62081489edce2a783ec869d18ef429754c1e8fbd71" Oct 07 14:03:36 crc kubenswrapper[4854]: E1007 14:03:36.315071 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9bca530b2253a9e35354d62081489edce2a783ec869d18ef429754c1e8fbd71\": container with ID starting with b9bca530b2253a9e35354d62081489edce2a783ec869d18ef429754c1e8fbd71 not found: ID does not exist" containerID="b9bca530b2253a9e35354d62081489edce2a783ec869d18ef429754c1e8fbd71" Oct 07 14:03:36 crc kubenswrapper[4854]: I1007 14:03:36.315131 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9bca530b2253a9e35354d62081489edce2a783ec869d18ef429754c1e8fbd71"} err="failed to get container status \"b9bca530b2253a9e35354d62081489edce2a783ec869d18ef429754c1e8fbd71\": rpc error: code = NotFound desc = could not find container \"b9bca530b2253a9e35354d62081489edce2a783ec869d18ef429754c1e8fbd71\": container with ID starting with b9bca530b2253a9e35354d62081489edce2a783ec869d18ef429754c1e8fbd71 not found: ID does not exist" Oct 07 14:03:36 crc kubenswrapper[4854]: I1007 14:03:36.315185 4854 scope.go:117] "RemoveContainer" containerID="9c290d051539376fb53abe3db5abc3366e66645e5b69d9b2f04c229129307120" Oct 07 14:03:36 crc kubenswrapper[4854]: E1007 14:03:36.315707 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c290d051539376fb53abe3db5abc3366e66645e5b69d9b2f04c229129307120\": container with ID starting with 9c290d051539376fb53abe3db5abc3366e66645e5b69d9b2f04c229129307120 not found: ID does not exist" containerID="9c290d051539376fb53abe3db5abc3366e66645e5b69d9b2f04c229129307120" Oct 07 14:03:36 crc kubenswrapper[4854]: I1007 14:03:36.315752 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c290d051539376fb53abe3db5abc3366e66645e5b69d9b2f04c229129307120"} err="failed to get container status 
\"9c290d051539376fb53abe3db5abc3366e66645e5b69d9b2f04c229129307120\": rpc error: code = NotFound desc = could not find container \"9c290d051539376fb53abe3db5abc3366e66645e5b69d9b2f04c229129307120\": container with ID starting with 9c290d051539376fb53abe3db5abc3366e66645e5b69d9b2f04c229129307120 not found: ID does not exist" Oct 07 14:03:36 crc kubenswrapper[4854]: I1007 14:03:36.315779 4854 scope.go:117] "RemoveContainer" containerID="a45308c3ff9a77517c6ec6d3429cba20083bf118d9dbf67695ad611d65821bdb" Oct 07 14:03:36 crc kubenswrapper[4854]: E1007 14:03:36.316075 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a45308c3ff9a77517c6ec6d3429cba20083bf118d9dbf67695ad611d65821bdb\": container with ID starting with a45308c3ff9a77517c6ec6d3429cba20083bf118d9dbf67695ad611d65821bdb not found: ID does not exist" containerID="a45308c3ff9a77517c6ec6d3429cba20083bf118d9dbf67695ad611d65821bdb" Oct 07 14:03:36 crc kubenswrapper[4854]: I1007 14:03:36.316130 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a45308c3ff9a77517c6ec6d3429cba20083bf118d9dbf67695ad611d65821bdb"} err="failed to get container status \"a45308c3ff9a77517c6ec6d3429cba20083bf118d9dbf67695ad611d65821bdb\": rpc error: code = NotFound desc = could not find container \"a45308c3ff9a77517c6ec6d3429cba20083bf118d9dbf67695ad611d65821bdb\": container with ID starting with a45308c3ff9a77517c6ec6d3429cba20083bf118d9dbf67695ad611d65821bdb not found: ID does not exist" Oct 07 14:03:36 crc kubenswrapper[4854]: I1007 14:03:36.715481 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16b73e39-9c15-4fa3-9e93-896d656a9d48" path="/var/lib/kubelet/pods/16b73e39-9c15-4fa3-9e93-896d656a9d48/volumes" Oct 07 14:03:36 crc kubenswrapper[4854]: I1007 14:03:36.716671 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="adc47d70-c658-494d-a6c4-6bb88e047b75" path="/var/lib/kubelet/pods/adc47d70-c658-494d-a6c4-6bb88e047b75/volumes" Oct 07 14:03:37 crc kubenswrapper[4854]: I1007 14:03:37.182422 4854 generic.go:334] "Generic (PLEG): container finished" podID="44e40bbe-6d98-4e62-8fc1-79dfd4ccff94" containerID="248341ff6a952ec5666566969828ab89551b3ce71b595c4df9821322f0db3732" exitCode=0 Oct 07 14:03:37 crc kubenswrapper[4854]: I1007 14:03:37.183536 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-l6pgg" event={"ID":"44e40bbe-6d98-4e62-8fc1-79dfd4ccff94","Type":"ContainerDied","Data":"248341ff6a952ec5666566969828ab89551b3ce71b595c4df9821322f0db3732"} Oct 07 14:03:38 crc kubenswrapper[4854]: I1007 14:03:38.201076 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-l6pgg" event={"ID":"44e40bbe-6d98-4e62-8fc1-79dfd4ccff94","Type":"ContainerStarted","Data":"435606fc6df192f50dbc67a2421f1ab2e060d79ad120c9643fe0d6a7abb29001"} Oct 07 14:03:38 crc kubenswrapper[4854]: I1007 14:03:38.220459 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-db-sync-l6pgg" podStartSLOduration=5.220442649 podStartE2EDuration="5.220442649s" podCreationTimestamp="2025-10-07 14:03:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:03:38.217488934 +0000 UTC m=+5934.205321209" watchObservedRunningTime="2025-10-07 14:03:38.220442649 +0000 UTC m=+5934.208274904" Oct 07 14:03:40 crc kubenswrapper[4854]: I1007 
14:03:40.808314 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:03:40 crc kubenswrapper[4854]: I1007 14:03:40.808847 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:03:41 crc kubenswrapper[4854]: I1007 14:03:41.237801 4854 generic.go:334] "Generic (PLEG): container finished" podID="85bbef8f-9a78-4c68-b0d0-6eb59418a6f6" containerID="ffb4b1423a76d4710740c43456be2e3213f08f58f8b288a7c35ea86e7a42bdf5" exitCode=0 Oct 07 14:03:41 crc kubenswrapper[4854]: I1007 14:03:41.237951 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-c464q" event={"ID":"85bbef8f-9a78-4c68-b0d0-6eb59418a6f6","Type":"ContainerDied","Data":"ffb4b1423a76d4710740c43456be2e3213f08f58f8b288a7c35ea86e7a42bdf5"} Oct 07 14:03:41 crc kubenswrapper[4854]: I1007 14:03:41.241623 4854 generic.go:334] "Generic (PLEG): container finished" podID="44e40bbe-6d98-4e62-8fc1-79dfd4ccff94" containerID="435606fc6df192f50dbc67a2421f1ab2e060d79ad120c9643fe0d6a7abb29001" exitCode=0 Oct 07 14:03:41 crc kubenswrapper[4854]: I1007 14:03:41.241681 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-l6pgg" event={"ID":"44e40bbe-6d98-4e62-8fc1-79dfd4ccff94","Type":"ContainerDied","Data":"435606fc6df192f50dbc67a2421f1ab2e060d79ad120c9643fe0d6a7abb29001"} Oct 07 14:03:42 crc kubenswrapper[4854]: I1007 14:03:42.747543 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-l6pgg" Oct 07 14:03:42 crc kubenswrapper[4854]: I1007 14:03:42.939402 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44e40bbe-6d98-4e62-8fc1-79dfd4ccff94-combined-ca-bundle\") pod \"44e40bbe-6d98-4e62-8fc1-79dfd4ccff94\" (UID: \"44e40bbe-6d98-4e62-8fc1-79dfd4ccff94\") " Oct 07 14:03:42 crc kubenswrapper[4854]: I1007 14:03:42.940465 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44e40bbe-6d98-4e62-8fc1-79dfd4ccff94-scripts\") pod \"44e40bbe-6d98-4e62-8fc1-79dfd4ccff94\" (UID: \"44e40bbe-6d98-4e62-8fc1-79dfd4ccff94\") " Oct 07 14:03:42 crc kubenswrapper[4854]: I1007 14:03:42.940685 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44e40bbe-6d98-4e62-8fc1-79dfd4ccff94-config-data\") pod \"44e40bbe-6d98-4e62-8fc1-79dfd4ccff94\" (UID: \"44e40bbe-6d98-4e62-8fc1-79dfd4ccff94\") " Oct 07 14:03:42 crc kubenswrapper[4854]: I1007 14:03:42.940873 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/44e40bbe-6d98-4e62-8fc1-79dfd4ccff94-config-data-merged\") pod \"44e40bbe-6d98-4e62-8fc1-79dfd4ccff94\" (UID: \"44e40bbe-6d98-4e62-8fc1-79dfd4ccff94\") " Oct 07 14:03:42 crc kubenswrapper[4854]: I1007 14:03:42.946345 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44e40bbe-6d98-4e62-8fc1-79dfd4ccff94-scripts" (OuterVolumeSpecName: "scripts") pod "44e40bbe-6d98-4e62-8fc1-79dfd4ccff94" (UID: "44e40bbe-6d98-4e62-8fc1-79dfd4ccff94"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:03:42 crc kubenswrapper[4854]: I1007 14:03:42.947009 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44e40bbe-6d98-4e62-8fc1-79dfd4ccff94-config-data" (OuterVolumeSpecName: "config-data") pod "44e40bbe-6d98-4e62-8fc1-79dfd4ccff94" (UID: "44e40bbe-6d98-4e62-8fc1-79dfd4ccff94"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:03:42 crc kubenswrapper[4854]: I1007 14:03:42.969891 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44e40bbe-6d98-4e62-8fc1-79dfd4ccff94-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "44e40bbe-6d98-4e62-8fc1-79dfd4ccff94" (UID: "44e40bbe-6d98-4e62-8fc1-79dfd4ccff94"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:03:42 crc kubenswrapper[4854]: I1007 14:03:42.978807 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44e40bbe-6d98-4e62-8fc1-79dfd4ccff94-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "44e40bbe-6d98-4e62-8fc1-79dfd4ccff94" (UID: "44e40bbe-6d98-4e62-8fc1-79dfd4ccff94"). InnerVolumeSpecName "config-data-merged". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:03:43 crc kubenswrapper[4854]: I1007 14:03:43.043335 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44e40bbe-6d98-4e62-8fc1-79dfd4ccff94-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:03:43 crc kubenswrapper[4854]: I1007 14:03:43.043368 4854 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44e40bbe-6d98-4e62-8fc1-79dfd4ccff94-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 14:03:43 crc kubenswrapper[4854]: I1007 14:03:43.043377 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44e40bbe-6d98-4e62-8fc1-79dfd4ccff94-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:03:43 crc kubenswrapper[4854]: I1007 14:03:43.043386 4854 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/44e40bbe-6d98-4e62-8fc1-79dfd4ccff94-config-data-merged\") on node \"crc\" DevicePath \"\"" Oct 07 14:03:43 crc kubenswrapper[4854]: I1007 14:03:43.264287 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-c464q" event={"ID":"85bbef8f-9a78-4c68-b0d0-6eb59418a6f6","Type":"ContainerStarted","Data":"e30355fad00cfe60148f9f16bfb89f71a1e68360f541c225afb696d00ad361c4"} Oct 07 14:03:43 crc kubenswrapper[4854]: I1007 14:03:43.267581 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-l6pgg" event={"ID":"44e40bbe-6d98-4e62-8fc1-79dfd4ccff94","Type":"ContainerDied","Data":"e2cef896d6a7803294f3bd1ccc2f0866cb6e10e0068d3e2648b83f1959093bc6"} Oct 07 14:03:43 crc kubenswrapper[4854]: I1007 14:03:43.267631 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e2cef896d6a7803294f3bd1ccc2f0866cb6e10e0068d3e2648b83f1959093bc6" Oct 07 14:03:43 crc kubenswrapper[4854]: I1007 14:03:43.267671 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-l6pgg" Oct 07 14:03:43 crc kubenswrapper[4854]: I1007 14:03:43.328328 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-image-upload-59f8cff499-c464q" podStartSLOduration=2.507007803 podStartE2EDuration="21.328307199s" podCreationTimestamp="2025-10-07 14:03:22 +0000 UTC" firstStartedPulling="2025-10-07 14:03:23.57476417 +0000 UTC m=+5919.562596425" lastFinishedPulling="2025-10-07 14:03:42.396063556 +0000 UTC m=+5938.383895821" observedRunningTime="2025-10-07 14:03:43.297453218 +0000 UTC m=+5939.285285483" watchObservedRunningTime="2025-10-07 14:03:43.328307199 +0000 UTC m=+5939.316139444" Oct 07 14:03:45 crc kubenswrapper[4854]: I1007 14:03:45.032198 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-540c-account-create-dvvgd"] Oct 07 14:03:45 crc kubenswrapper[4854]: I1007 14:03:45.040714 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-540c-account-create-dvvgd"] Oct 07 14:03:46 crc kubenswrapper[4854]: I1007 14:03:46.725095 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ec29f36-a768-4b39-beba-e680db595dbf" path="/var/lib/kubelet/pods/5ec29f36-a768-4b39-beba-e680db595dbf/volumes" Oct 07 14:03:51 crc kubenswrapper[4854]: I1007 14:03:51.025844 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-npzds"] Oct 07 14:03:51 crc kubenswrapper[4854]: I1007 14:03:51.032739 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-npzds"] Oct 07 14:03:52 crc kubenswrapper[4854]: I1007 14:03:52.331870 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-rsyslog-c8jtv" Oct 07 14:03:52 crc kubenswrapper[4854]: I1007 14:03:52.715982 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da560e8e-470c-4f4e-b14a-e90b4b0a40fc" path="/var/lib/kubelet/pods/da560e8e-470c-4f4e-b14a-e90b4b0a40fc/volumes" Oct 07 14:04:07 crc kubenswrapper[4854]: I1007 14:04:07.583263 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-c464q"] Oct 07 14:04:07 crc kubenswrapper[4854]: I1007 14:04:07.583972 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/octavia-image-upload-59f8cff499-c464q" podUID="85bbef8f-9a78-4c68-b0d0-6eb59418a6f6" containerName="octavia-amphora-httpd" containerID="cri-o://e30355fad00cfe60148f9f16bfb89f71a1e68360f541c225afb696d00ad361c4" gracePeriod=30 Oct 07 14:04:08 crc kubenswrapper[4854]: I1007 14:04:08.233732 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-c464q" Oct 07 14:04:08 crc kubenswrapper[4854]: I1007 14:04:08.361695 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/85bbef8f-9a78-4c68-b0d0-6eb59418a6f6-amphora-image\") pod \"85bbef8f-9a78-4c68-b0d0-6eb59418a6f6\" (UID: \"85bbef8f-9a78-4c68-b0d0-6eb59418a6f6\") " Oct 07 14:04:08 crc kubenswrapper[4854]: I1007 14:04:08.361851 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/85bbef8f-9a78-4c68-b0d0-6eb59418a6f6-httpd-config\") pod \"85bbef8f-9a78-4c68-b0d0-6eb59418a6f6\" (UID: \"85bbef8f-9a78-4c68-b0d0-6eb59418a6f6\") " Oct 07 14:04:08 crc kubenswrapper[4854]: I1007 14:04:08.395568 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85bbef8f-9a78-4c68-b0d0-6eb59418a6f6-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "85bbef8f-9a78-4c68-b0d0-6eb59418a6f6" (UID: "85bbef8f-9a78-4c68-b0d0-6eb59418a6f6"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:04:08 crc kubenswrapper[4854]: I1007 14:04:08.443620 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85bbef8f-9a78-4c68-b0d0-6eb59418a6f6-amphora-image" (OuterVolumeSpecName: "amphora-image") pod "85bbef8f-9a78-4c68-b0d0-6eb59418a6f6" (UID: "85bbef8f-9a78-4c68-b0d0-6eb59418a6f6"). InnerVolumeSpecName "amphora-image". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:04:08 crc kubenswrapper[4854]: I1007 14:04:08.464457 4854 reconciler_common.go:293] "Volume detached for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/85bbef8f-9a78-4c68-b0d0-6eb59418a6f6-amphora-image\") on node \"crc\" DevicePath \"\"" Oct 07 14:04:08 crc kubenswrapper[4854]: I1007 14:04:08.464504 4854 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/85bbef8f-9a78-4c68-b0d0-6eb59418a6f6-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 07 14:04:08 crc kubenswrapper[4854]: I1007 14:04:08.527004 4854 generic.go:334] "Generic (PLEG): container finished" podID="85bbef8f-9a78-4c68-b0d0-6eb59418a6f6" containerID="e30355fad00cfe60148f9f16bfb89f71a1e68360f541c225afb696d00ad361c4" exitCode=0 Oct 07 14:04:08 crc kubenswrapper[4854]: I1007 14:04:08.527051 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-c464q" event={"ID":"85bbef8f-9a78-4c68-b0d0-6eb59418a6f6","Type":"ContainerDied","Data":"e30355fad00cfe60148f9f16bfb89f71a1e68360f541c225afb696d00ad361c4"} Oct 07 14:04:08 crc kubenswrapper[4854]: I1007 14:04:08.527083 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-c464q" event={"ID":"85bbef8f-9a78-4c68-b0d0-6eb59418a6f6","Type":"ContainerDied","Data":"d2d6918cdea23171cd5eaca9e48f19f15d29b5293830867123878fe54e30cb98"} Oct 07 14:04:08 crc kubenswrapper[4854]: I1007 14:04:08.527100 4854 scope.go:117] "RemoveContainer" containerID="e30355fad00cfe60148f9f16bfb89f71a1e68360f541c225afb696d00ad361c4" Oct 07 14:04:08 crc kubenswrapper[4854]: I1007 14:04:08.527219 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-c464q" Oct 07 14:04:08 crc kubenswrapper[4854]: I1007 14:04:08.553091 4854 scope.go:117] "RemoveContainer" containerID="ffb4b1423a76d4710740c43456be2e3213f08f58f8b288a7c35ea86e7a42bdf5" Oct 07 14:04:08 crc kubenswrapper[4854]: I1007 14:04:08.571576 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-c464q"] Oct 07 14:04:08 crc kubenswrapper[4854]: I1007 14:04:08.576383 4854 scope.go:117] "RemoveContainer" containerID="e30355fad00cfe60148f9f16bfb89f71a1e68360f541c225afb696d00ad361c4" Oct 07 14:04:08 crc kubenswrapper[4854]: E1007 14:04:08.576766 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e30355fad00cfe60148f9f16bfb89f71a1e68360f541c225afb696d00ad361c4\": container with ID starting with e30355fad00cfe60148f9f16bfb89f71a1e68360f541c225afb696d00ad361c4 not found: ID does not exist" containerID="e30355fad00cfe60148f9f16bfb89f71a1e68360f541c225afb696d00ad361c4" Oct 07 14:04:08 crc kubenswrapper[4854]: I1007 14:04:08.576801 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e30355fad00cfe60148f9f16bfb89f71a1e68360f541c225afb696d00ad361c4"} err="failed to get container status \"e30355fad00cfe60148f9f16bfb89f71a1e68360f541c225afb696d00ad361c4\": rpc error: code = NotFound desc = could not find container \"e30355fad00cfe60148f9f16bfb89f71a1e68360f541c225afb696d00ad361c4\": container with ID starting with e30355fad00cfe60148f9f16bfb89f71a1e68360f541c225afb696d00ad361c4 not found: ID does not exist" Oct 07 14:04:08 crc kubenswrapper[4854]: I1007 14:04:08.576827 4854 scope.go:117] "RemoveContainer" containerID="ffb4b1423a76d4710740c43456be2e3213f08f58f8b288a7c35ea86e7a42bdf5" Oct 07 14:04:08 crc kubenswrapper[4854]: E1007 14:04:08.577183 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffb4b1423a76d4710740c43456be2e3213f08f58f8b288a7c35ea86e7a42bdf5\": container with ID starting with ffb4b1423a76d4710740c43456be2e3213f08f58f8b288a7c35ea86e7a42bdf5 not found: ID does not exist" containerID="ffb4b1423a76d4710740c43456be2e3213f08f58f8b288a7c35ea86e7a42bdf5" Oct 07 14:04:08 crc kubenswrapper[4854]: I1007 14:04:08.577223 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffb4b1423a76d4710740c43456be2e3213f08f58f8b288a7c35ea86e7a42bdf5"} err="failed to get container status \"ffb4b1423a76d4710740c43456be2e3213f08f58f8b288a7c35ea86e7a42bdf5\": rpc error: code = NotFound desc = could not find container \"ffb4b1423a76d4710740c43456be2e3213f08f58f8b288a7c35ea86e7a42bdf5\": container with ID starting with ffb4b1423a76d4710740c43456be2e3213f08f58f8b288a7c35ea86e7a42bdf5 not found: ID does not exist" Oct 07 14:04:08 crc kubenswrapper[4854]: I1007 14:04:08.582613 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-c464q"] Oct 07 14:04:08 crc kubenswrapper[4854]: I1007 14:04:08.714610 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85bbef8f-9a78-4c68-b0d0-6eb59418a6f6" path="/var/lib/kubelet/pods/85bbef8f-9a78-4c68-b0d0-6eb59418a6f6/volumes" Oct 07 14:04:10 crc kubenswrapper[4854]: I1007 14:04:10.807838 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:04:10 crc kubenswrapper[4854]: I1007 14:04:10.808331 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:04:22 crc kubenswrapper[4854]: I1007 14:04:22.032515 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-n2lx2"] Oct 07 14:04:22 crc kubenswrapper[4854]: I1007 14:04:22.041373 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-n2lx2"] Oct 07 14:04:22 crc kubenswrapper[4854]: I1007 14:04:22.715192 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="977c4588-d10e-4792-a3e5-111bbf45f37f" path="/var/lib/kubelet/pods/977c4588-d10e-4792-a3e5-111bbf45f37f/volumes" Oct 07 14:04:23 crc kubenswrapper[4854]: I1007 14:04:23.063209 4854 scope.go:117] "RemoveContainer" containerID="c6730326036e7cf76206447bfc9286ccf192805e8a7ebd87432d6869c2f90e8c" Oct 07 14:04:23 crc kubenswrapper[4854]: I1007 14:04:23.114541 4854 scope.go:117] "RemoveContainer" containerID="a9f7a9e2f357dedfca8c3ddacc7e07c3ff5a3263c7476c830c5ac6002d376f66" Oct 07 14:04:23 crc kubenswrapper[4854]: I1007 14:04:23.153546 4854 scope.go:117] "RemoveContainer" containerID="94898441204b9ad0926883493e28808ab3574e7a59ae8f51998dd9feb030da96" Oct 07 14:04:23 crc kubenswrapper[4854]: I1007 14:04:23.235288 4854 scope.go:117] "RemoveContainer" containerID="c54d529ac6202c0e1e2098f8fd876e8a2c2c23ac1c6e793a62b7236a8e9db0c4" Oct 07 14:04:28 crc kubenswrapper[4854]: I1007 14:04:28.520093 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-healthmanager-z8lh4"] Oct 07 14:04:28 crc kubenswrapper[4854]: E1007 14:04:28.521272 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adc47d70-c658-494d-a6c4-6bb88e047b75" containerName="registry-server" Oct 07 14:04:28 crc kubenswrapper[4854]: I1007 14:04:28.521294 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="adc47d70-c658-494d-a6c4-6bb88e047b75" containerName="registry-server" Oct 07 14:04:28 crc kubenswrapper[4854]: E1007 14:04:28.521312 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44e40bbe-6d98-4e62-8fc1-79dfd4ccff94" containerName="octavia-db-sync" Oct 07 14:04:28 crc kubenswrapper[4854]: I1007 14:04:28.521319 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="44e40bbe-6d98-4e62-8fc1-79dfd4ccff94" containerName="octavia-db-sync" Oct 07 14:04:28 crc kubenswrapper[4854]: E1007 14:04:28.521335 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adc47d70-c658-494d-a6c4-6bb88e047b75" containerName="extract-utilities" Oct 07 14:04:28 crc kubenswrapper[4854]: I1007 14:04:28.521343 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="adc47d70-c658-494d-a6c4-6bb88e047b75" containerName="extract-utilities" Oct 07 14:04:28 crc kubenswrapper[4854]: E1007 14:04:28.521359 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44e40bbe-6d98-4e62-8fc1-79dfd4ccff94" containerName="init" Oct 07 14:04:28 crc kubenswrapper[4854]: I1007 14:04:28.521366 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="44e40bbe-6d98-4e62-8fc1-79dfd4ccff94" containerName="init" Oct 07 14:04:28 
crc kubenswrapper[4854]: E1007 14:04:28.521387 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85bbef8f-9a78-4c68-b0d0-6eb59418a6f6" containerName="octavia-amphora-httpd" Oct 07 14:04:28 crc kubenswrapper[4854]: I1007 14:04:28.521395 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="85bbef8f-9a78-4c68-b0d0-6eb59418a6f6" containerName="octavia-amphora-httpd" Oct 07 14:04:28 crc kubenswrapper[4854]: E1007 14:04:28.521423 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85bbef8f-9a78-4c68-b0d0-6eb59418a6f6" containerName="init" Oct 07 14:04:28 crc kubenswrapper[4854]: I1007 14:04:28.521431 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="85bbef8f-9a78-4c68-b0d0-6eb59418a6f6" containerName="init" Oct 07 14:04:28 crc kubenswrapper[4854]: E1007 14:04:28.521456 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adc47d70-c658-494d-a6c4-6bb88e047b75" containerName="extract-content" Oct 07 14:04:28 crc kubenswrapper[4854]: I1007 14:04:28.521465 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="adc47d70-c658-494d-a6c4-6bb88e047b75" containerName="extract-content" Oct 07 14:04:28 crc kubenswrapper[4854]: I1007 14:04:28.521683 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="85bbef8f-9a78-4c68-b0d0-6eb59418a6f6" containerName="octavia-amphora-httpd" Oct 07 14:04:28 crc kubenswrapper[4854]: I1007 14:04:28.521706 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="adc47d70-c658-494d-a6c4-6bb88e047b75" containerName="registry-server" Oct 07 14:04:28 crc kubenswrapper[4854]: I1007 14:04:28.521735 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="44e40bbe-6d98-4e62-8fc1-79dfd4ccff94" containerName="octavia-db-sync" Oct 07 14:04:28 crc kubenswrapper[4854]: I1007 14:04:28.523033 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-healthmanager-z8lh4" Oct 07 14:04:28 crc kubenswrapper[4854]: I1007 14:04:28.528165 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-config-data" Oct 07 14:04:28 crc kubenswrapper[4854]: I1007 14:04:28.528609 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-certs-secret" Oct 07 14:04:28 crc kubenswrapper[4854]: I1007 14:04:28.533482 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/7ec333a0-79da-4419-b190-d49fe761f40e-amphora-certs\") pod \"octavia-healthmanager-z8lh4\" (UID: \"7ec333a0-79da-4419-b190-d49fe761f40e\") " pod="openstack/octavia-healthmanager-z8lh4" Oct 07 14:04:28 crc kubenswrapper[4854]: I1007 14:04:28.533529 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ec333a0-79da-4419-b190-d49fe761f40e-scripts\") pod \"octavia-healthmanager-z8lh4\" (UID: \"7ec333a0-79da-4419-b190-d49fe761f40e\") " pod="openstack/octavia-healthmanager-z8lh4" Oct 07 14:04:28 crc kubenswrapper[4854]: I1007 14:04:28.533555 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ec333a0-79da-4419-b190-d49fe761f40e-config-data\") pod \"octavia-healthmanager-z8lh4\" (UID: \"7ec333a0-79da-4419-b190-d49fe761f40e\") " pod="openstack/octavia-healthmanager-z8lh4" Oct 07 14:04:28 crc kubenswrapper[4854]: I1007 14:04:28.533575 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/7ec333a0-79da-4419-b190-d49fe761f40e-hm-ports\") pod \"octavia-healthmanager-z8lh4\" (UID: \"7ec333a0-79da-4419-b190-d49fe761f40e\") " pod="openstack/octavia-healthmanager-z8lh4" Oct 07 14:04:28 crc kubenswrapper[4854]: I1007 14:04:28.533708 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ec333a0-79da-4419-b190-d49fe761f40e-combined-ca-bundle\") pod \"octavia-healthmanager-z8lh4\" (UID: \"7ec333a0-79da-4419-b190-d49fe761f40e\") " pod="openstack/octavia-healthmanager-z8lh4" Oct 07 14:04:28 crc kubenswrapper[4854]: I1007 14:04:28.533735 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/7ec333a0-79da-4419-b190-d49fe761f40e-config-data-merged\") pod \"octavia-healthmanager-z8lh4\" (UID: \"7ec333a0-79da-4419-b190-d49fe761f40e\") " pod="openstack/octavia-healthmanager-z8lh4" Oct 07 14:04:28 crc kubenswrapper[4854]: I1007 14:04:28.535514 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-scripts" Oct 07 14:04:28 crc kubenswrapper[4854]: I1007 14:04:28.540567 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-z8lh4"] Oct 07 14:04:28 crc kubenswrapper[4854]: I1007 14:04:28.635478 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ec333a0-79da-4419-b190-d49fe761f40e-combined-ca-bundle\") pod \"octavia-healthmanager-z8lh4\" (UID: \"7ec333a0-79da-4419-b190-d49fe761f40e\") " pod="openstack/octavia-healthmanager-z8lh4" Oct 07 
14:04:28 crc kubenswrapper[4854]: I1007 14:04:28.635890 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/7ec333a0-79da-4419-b190-d49fe761f40e-config-data-merged\") pod \"octavia-healthmanager-z8lh4\" (UID: \"7ec333a0-79da-4419-b190-d49fe761f40e\") " pod="openstack/octavia-healthmanager-z8lh4" Oct 07 14:04:28 crc kubenswrapper[4854]: I1007 14:04:28.635947 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/7ec333a0-79da-4419-b190-d49fe761f40e-amphora-certs\") pod \"octavia-healthmanager-z8lh4\" (UID: \"7ec333a0-79da-4419-b190-d49fe761f40e\") " pod="openstack/octavia-healthmanager-z8lh4" Oct 07 14:04:28 crc kubenswrapper[4854]: I1007 14:04:28.635987 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ec333a0-79da-4419-b190-d49fe761f40e-scripts\") pod \"octavia-healthmanager-z8lh4\" (UID: \"7ec333a0-79da-4419-b190-d49fe761f40e\") " pod="openstack/octavia-healthmanager-z8lh4" Oct 07 14:04:28 crc kubenswrapper[4854]: I1007 14:04:28.636020 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ec333a0-79da-4419-b190-d49fe761f40e-config-data\") pod \"octavia-healthmanager-z8lh4\" (UID: \"7ec333a0-79da-4419-b190-d49fe761f40e\") " pod="openstack/octavia-healthmanager-z8lh4" Oct 07 14:04:28 crc kubenswrapper[4854]: I1007 14:04:28.636042 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/7ec333a0-79da-4419-b190-d49fe761f40e-hm-ports\") pod \"octavia-healthmanager-z8lh4\" (UID: \"7ec333a0-79da-4419-b190-d49fe761f40e\") " pod="openstack/octavia-healthmanager-z8lh4" Oct 07 14:04:28 crc kubenswrapper[4854]: I1007 14:04:28.636780 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/7ec333a0-79da-4419-b190-d49fe761f40e-config-data-merged\") pod \"octavia-healthmanager-z8lh4\" (UID: \"7ec333a0-79da-4419-b190-d49fe761f40e\") " pod="openstack/octavia-healthmanager-z8lh4" Oct 07 14:04:28 crc kubenswrapper[4854]: I1007 14:04:28.637432 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/7ec333a0-79da-4419-b190-d49fe761f40e-hm-ports\") pod \"octavia-healthmanager-z8lh4\" (UID: \"7ec333a0-79da-4419-b190-d49fe761f40e\") " pod="openstack/octavia-healthmanager-z8lh4" Oct 07 14:04:28 crc kubenswrapper[4854]: I1007 14:04:28.641968 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ec333a0-79da-4419-b190-d49fe761f40e-scripts\") pod \"octavia-healthmanager-z8lh4\" (UID: \"7ec333a0-79da-4419-b190-d49fe761f40e\") " pod="openstack/octavia-healthmanager-z8lh4" Oct 07 14:04:28 crc kubenswrapper[4854]: I1007 14:04:28.642887 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ec333a0-79da-4419-b190-d49fe761f40e-combined-ca-bundle\") pod \"octavia-healthmanager-z8lh4\" (UID: \"7ec333a0-79da-4419-b190-d49fe761f40e\") " pod="openstack/octavia-healthmanager-z8lh4" Oct 07 14:04:28 crc kubenswrapper[4854]: I1007 14:04:28.642923 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" 
(UniqueName: \"kubernetes.io/secret/7ec333a0-79da-4419-b190-d49fe761f40e-amphora-certs\") pod \"octavia-healthmanager-z8lh4\" (UID: \"7ec333a0-79da-4419-b190-d49fe761f40e\") " pod="openstack/octavia-healthmanager-z8lh4" Oct 07 14:04:28 crc kubenswrapper[4854]: I1007 14:04:28.643965 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ec333a0-79da-4419-b190-d49fe761f40e-config-data\") pod \"octavia-healthmanager-z8lh4\" (UID: \"7ec333a0-79da-4419-b190-d49fe761f40e\") " pod="openstack/octavia-healthmanager-z8lh4" Oct 07 14:04:28 crc kubenswrapper[4854]: I1007 14:04:28.883747 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-healthmanager-z8lh4" Oct 07 14:04:29 crc kubenswrapper[4854]: I1007 14:04:29.449696 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-z8lh4"] Oct 07 14:04:29 crc kubenswrapper[4854]: I1007 14:04:29.761534 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-z8lh4" event={"ID":"7ec333a0-79da-4419-b190-d49fe761f40e","Type":"ContainerStarted","Data":"b401fa40f04e243a1bead56ca0c28d1c448d1c407bd36fc383999fe79feab1b0"} Oct 07 14:04:30 crc kubenswrapper[4854]: I1007 14:04:30.771241 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-z8lh4" event={"ID":"7ec333a0-79da-4419-b190-d49fe761f40e","Type":"ContainerStarted","Data":"f74fb75c71322f0ab5df28f2b1049c5c06edbef83702d9b0ef3b519a483418b1"} Oct 07 14:04:31 crc kubenswrapper[4854]: I1007 14:04:31.154787 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-housekeeping-qgrhl"] Oct 07 14:04:31 crc kubenswrapper[4854]: I1007 14:04:31.156501 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-housekeeping-qgrhl" Oct 07 14:04:31 crc kubenswrapper[4854]: I1007 14:04:31.158702 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-config-data" Oct 07 14:04:31 crc kubenswrapper[4854]: I1007 14:04:31.158802 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-scripts" Oct 07 14:04:31 crc kubenswrapper[4854]: I1007 14:04:31.178252 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-qgrhl"] Oct 07 14:04:31 crc kubenswrapper[4854]: I1007 14:04:31.295623 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6d5f8c4-8b4c-4691-81c1-22a3e9985ba0-scripts\") pod \"octavia-housekeeping-qgrhl\" (UID: \"d6d5f8c4-8b4c-4691-81c1-22a3e9985ba0\") " pod="openstack/octavia-housekeeping-qgrhl" Oct 07 14:04:31 crc kubenswrapper[4854]: I1007 14:04:31.295701 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6d5f8c4-8b4c-4691-81c1-22a3e9985ba0-config-data\") pod \"octavia-housekeeping-qgrhl\" (UID: \"d6d5f8c4-8b4c-4691-81c1-22a3e9985ba0\") " pod="openstack/octavia-housekeeping-qgrhl" Oct 07 14:04:31 crc kubenswrapper[4854]: I1007 14:04:31.295721 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/d6d5f8c4-8b4c-4691-81c1-22a3e9985ba0-amphora-certs\") pod \"octavia-housekeeping-qgrhl\" (UID: \"d6d5f8c4-8b4c-4691-81c1-22a3e9985ba0\") " pod="openstack/octavia-housekeeping-qgrhl" Oct 07 14:04:31 crc kubenswrapper[4854]: I1007 14:04:31.295793 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6d5f8c4-8b4c-4691-81c1-22a3e9985ba0-combined-ca-bundle\") pod \"octavia-housekeeping-qgrhl\" (UID: \"d6d5f8c4-8b4c-4691-81c1-22a3e9985ba0\") " pod="openstack/octavia-housekeeping-qgrhl" Oct 07 14:04:31 crc kubenswrapper[4854]: I1007 14:04:31.295831 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/d6d5f8c4-8b4c-4691-81c1-22a3e9985ba0-config-data-merged\") pod \"octavia-housekeeping-qgrhl\" (UID: \"d6d5f8c4-8b4c-4691-81c1-22a3e9985ba0\") " pod="openstack/octavia-housekeeping-qgrhl" Oct 07 14:04:31 crc kubenswrapper[4854]: I1007 14:04:31.295877 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/d6d5f8c4-8b4c-4691-81c1-22a3e9985ba0-hm-ports\") pod \"octavia-housekeeping-qgrhl\" (UID: \"d6d5f8c4-8b4c-4691-81c1-22a3e9985ba0\") " pod="openstack/octavia-housekeeping-qgrhl" Oct 07 14:04:31 crc kubenswrapper[4854]: I1007 14:04:31.398905 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/d6d5f8c4-8b4c-4691-81c1-22a3e9985ba0-hm-ports\") pod \"octavia-housekeeping-qgrhl\" (UID: \"d6d5f8c4-8b4c-4691-81c1-22a3e9985ba0\") " pod="openstack/octavia-housekeeping-qgrhl" Oct 07 14:04:31 crc kubenswrapper[4854]: I1007 14:04:31.399051 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/d6d5f8c4-8b4c-4691-81c1-22a3e9985ba0-scripts\") pod \"octavia-housekeeping-qgrhl\" (UID: \"d6d5f8c4-8b4c-4691-81c1-22a3e9985ba0\") " pod="openstack/octavia-housekeeping-qgrhl" Oct 07 14:04:31 crc kubenswrapper[4854]: I1007 14:04:31.399084 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/d6d5f8c4-8b4c-4691-81c1-22a3e9985ba0-amphora-certs\") pod \"octavia-housekeeping-qgrhl\" (UID: \"d6d5f8c4-8b4c-4691-81c1-22a3e9985ba0\") " pod="openstack/octavia-housekeeping-qgrhl" Oct 07 14:04:31 crc kubenswrapper[4854]: I1007 14:04:31.399121 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6d5f8c4-8b4c-4691-81c1-22a3e9985ba0-config-data\") pod \"octavia-housekeeping-qgrhl\" (UID: \"d6d5f8c4-8b4c-4691-81c1-22a3e9985ba0\") " pod="openstack/octavia-housekeeping-qgrhl" Oct 07 14:04:31 crc kubenswrapper[4854]: I1007 14:04:31.399238 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6d5f8c4-8b4c-4691-81c1-22a3e9985ba0-combined-ca-bundle\") pod \"octavia-housekeeping-qgrhl\" (UID: \"d6d5f8c4-8b4c-4691-81c1-22a3e9985ba0\") " pod="openstack/octavia-housekeeping-qgrhl" Oct 07 14:04:31 crc kubenswrapper[4854]: I1007 14:04:31.399285 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/d6d5f8c4-8b4c-4691-81c1-22a3e9985ba0-config-data-merged\") pod \"octavia-housekeeping-qgrhl\" (UID: \"d6d5f8c4-8b4c-4691-81c1-22a3e9985ba0\") " pod="openstack/octavia-housekeeping-qgrhl" Oct 07 14:04:31 crc kubenswrapper[4854]: I1007 14:04:31.399894 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/d6d5f8c4-8b4c-4691-81c1-22a3e9985ba0-config-data-merged\") pod \"octavia-housekeeping-qgrhl\" (UID: \"d6d5f8c4-8b4c-4691-81c1-22a3e9985ba0\") " pod="openstack/octavia-housekeeping-qgrhl" Oct 07 14:04:31 crc kubenswrapper[4854]: I1007 14:04:31.402605 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/d6d5f8c4-8b4c-4691-81c1-22a3e9985ba0-hm-ports\") pod \"octavia-housekeeping-qgrhl\" (UID: \"d6d5f8c4-8b4c-4691-81c1-22a3e9985ba0\") " pod="openstack/octavia-housekeeping-qgrhl" Oct 07 14:04:31 crc kubenswrapper[4854]: I1007 14:04:31.409936 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6d5f8c4-8b4c-4691-81c1-22a3e9985ba0-combined-ca-bundle\") pod \"octavia-housekeeping-qgrhl\" (UID: \"d6d5f8c4-8b4c-4691-81c1-22a3e9985ba0\") " pod="openstack/octavia-housekeeping-qgrhl" Oct 07 14:04:31 crc kubenswrapper[4854]: I1007 14:04:31.411933 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6d5f8c4-8b4c-4691-81c1-22a3e9985ba0-config-data\") pod \"octavia-housekeeping-qgrhl\" (UID: \"d6d5f8c4-8b4c-4691-81c1-22a3e9985ba0\") " pod="openstack/octavia-housekeeping-qgrhl" Oct 07 14:04:31 crc kubenswrapper[4854]: I1007 14:04:31.412434 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/d6d5f8c4-8b4c-4691-81c1-22a3e9985ba0-amphora-certs\") pod \"octavia-housekeeping-qgrhl\" (UID: 
\"d6d5f8c4-8b4c-4691-81c1-22a3e9985ba0\") " pod="openstack/octavia-housekeeping-qgrhl" Oct 07 14:04:31 crc kubenswrapper[4854]: I1007 14:04:31.422077 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6d5f8c4-8b4c-4691-81c1-22a3e9985ba0-scripts\") pod \"octavia-housekeeping-qgrhl\" (UID: \"d6d5f8c4-8b4c-4691-81c1-22a3e9985ba0\") " pod="openstack/octavia-housekeeping-qgrhl" Oct 07 14:04:31 crc kubenswrapper[4854]: I1007 14:04:31.476859 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-housekeeping-qgrhl" Oct 07 14:04:32 crc kubenswrapper[4854]: I1007 14:04:32.031569 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-d74c-account-create-kchkm"] Oct 07 14:04:32 crc kubenswrapper[4854]: I1007 14:04:32.042469 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-d74c-account-create-kchkm"] Oct 07 14:04:32 crc kubenswrapper[4854]: I1007 14:04:32.096547 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-qgrhl"] Oct 07 14:04:32 crc kubenswrapper[4854]: I1007 14:04:32.714990 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be5bf13e-a91f-45c2-b7b7-5acaa12b0c49" path="/var/lib/kubelet/pods/be5bf13e-a91f-45c2-b7b7-5acaa12b0c49/volumes" Oct 07 14:04:32 crc kubenswrapper[4854]: I1007 14:04:32.791264 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-qgrhl" event={"ID":"d6d5f8c4-8b4c-4691-81c1-22a3e9985ba0","Type":"ContainerStarted","Data":"6ceb92d938f17528600e41cd905124526aa13316867108bb3d96d9985855474a"} Oct 07 14:04:32 crc kubenswrapper[4854]: I1007 14:04:32.792900 4854 generic.go:334] "Generic (PLEG): container finished" podID="7ec333a0-79da-4419-b190-d49fe761f40e" containerID="f74fb75c71322f0ab5df28f2b1049c5c06edbef83702d9b0ef3b519a483418b1" exitCode=0 Oct 07 14:04:32 crc kubenswrapper[4854]: I1007 14:04:32.792962 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-z8lh4" event={"ID":"7ec333a0-79da-4419-b190-d49fe761f40e","Type":"ContainerDied","Data":"f74fb75c71322f0ab5df28f2b1049c5c06edbef83702d9b0ef3b519a483418b1"} Oct 07 14:04:33 crc kubenswrapper[4854]: I1007 14:04:33.805662 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-z8lh4" event={"ID":"7ec333a0-79da-4419-b190-d49fe761f40e","Type":"ContainerStarted","Data":"0a74a9c46599f3e161bc32f2b9c855644d32f4d8625500fbc306ff1aac4a3cd7"} Oct 07 14:04:33 crc kubenswrapper[4854]: I1007 14:04:33.807451 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-healthmanager-z8lh4" Oct 07 14:04:33 crc kubenswrapper[4854]: I1007 14:04:33.844281 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-healthmanager-z8lh4" podStartSLOduration=5.844260028 podStartE2EDuration="5.844260028s" podCreationTimestamp="2025-10-07 14:04:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:04:33.83944668 +0000 UTC m=+5989.827278945" watchObservedRunningTime="2025-10-07 14:04:33.844260028 +0000 UTC m=+5989.832092283" Oct 07 14:04:33 crc kubenswrapper[4854]: I1007 14:04:33.958963 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-worker-9dv8x"] Oct 07 14:04:33 crc kubenswrapper[4854]: I1007 14:04:33.960684 4854 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-worker-9dv8x" Oct 07 14:04:33 crc kubenswrapper[4854]: I1007 14:04:33.964277 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-config-data" Oct 07 14:04:33 crc kubenswrapper[4854]: I1007 14:04:33.964508 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-scripts" Oct 07 14:04:33 crc kubenswrapper[4854]: I1007 14:04:33.977460 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-9dv8x"] Oct 07 14:04:34 crc kubenswrapper[4854]: I1007 14:04:34.157435 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/c4b77d11-39cd-4b6b-bfe9-39c06b5ac986-hm-ports\") pod \"octavia-worker-9dv8x\" (UID: \"c4b77d11-39cd-4b6b-bfe9-39c06b5ac986\") " pod="openstack/octavia-worker-9dv8x" Oct 07 14:04:34 crc kubenswrapper[4854]: I1007 14:04:34.157517 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4b77d11-39cd-4b6b-bfe9-39c06b5ac986-scripts\") pod \"octavia-worker-9dv8x\" (UID: \"c4b77d11-39cd-4b6b-bfe9-39c06b5ac986\") " pod="openstack/octavia-worker-9dv8x" Oct 07 14:04:34 crc kubenswrapper[4854]: I1007 14:04:34.157547 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4b77d11-39cd-4b6b-bfe9-39c06b5ac986-config-data\") pod \"octavia-worker-9dv8x\" (UID: \"c4b77d11-39cd-4b6b-bfe9-39c06b5ac986\") " pod="openstack/octavia-worker-9dv8x" Oct 07 14:04:34 crc kubenswrapper[4854]: I1007 14:04:34.157578 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/c4b77d11-39cd-4b6b-bfe9-39c06b5ac986-config-data-merged\") pod \"octavia-worker-9dv8x\" (UID: \"c4b77d11-39cd-4b6b-bfe9-39c06b5ac986\") " pod="openstack/octavia-worker-9dv8x" Oct 07 14:04:34 crc kubenswrapper[4854]: I1007 14:04:34.157619 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/c4b77d11-39cd-4b6b-bfe9-39c06b5ac986-amphora-certs\") pod \"octavia-worker-9dv8x\" (UID: \"c4b77d11-39cd-4b6b-bfe9-39c06b5ac986\") " pod="openstack/octavia-worker-9dv8x" Oct 07 14:04:34 crc kubenswrapper[4854]: I1007 14:04:34.157653 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4b77d11-39cd-4b6b-bfe9-39c06b5ac986-combined-ca-bundle\") pod \"octavia-worker-9dv8x\" (UID: \"c4b77d11-39cd-4b6b-bfe9-39c06b5ac986\") " pod="openstack/octavia-worker-9dv8x" Oct 07 14:04:34 crc kubenswrapper[4854]: I1007 14:04:34.259761 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/c4b77d11-39cd-4b6b-bfe9-39c06b5ac986-hm-ports\") pod \"octavia-worker-9dv8x\" (UID: \"c4b77d11-39cd-4b6b-bfe9-39c06b5ac986\") " pod="openstack/octavia-worker-9dv8x" Oct 07 14:04:34 crc kubenswrapper[4854]: I1007 14:04:34.259836 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4b77d11-39cd-4b6b-bfe9-39c06b5ac986-scripts\") pod \"octavia-worker-9dv8x\" (UID: 
\"c4b77d11-39cd-4b6b-bfe9-39c06b5ac986\") " pod="openstack/octavia-worker-9dv8x" Oct 07 14:04:34 crc kubenswrapper[4854]: I1007 14:04:34.259878 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4b77d11-39cd-4b6b-bfe9-39c06b5ac986-config-data\") pod \"octavia-worker-9dv8x\" (UID: \"c4b77d11-39cd-4b6b-bfe9-39c06b5ac986\") " pod="openstack/octavia-worker-9dv8x" Oct 07 14:04:34 crc kubenswrapper[4854]: I1007 14:04:34.259920 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/c4b77d11-39cd-4b6b-bfe9-39c06b5ac986-config-data-merged\") pod \"octavia-worker-9dv8x\" (UID: \"c4b77d11-39cd-4b6b-bfe9-39c06b5ac986\") " pod="openstack/octavia-worker-9dv8x" Oct 07 14:04:34 crc kubenswrapper[4854]: I1007 14:04:34.259966 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/c4b77d11-39cd-4b6b-bfe9-39c06b5ac986-amphora-certs\") pod \"octavia-worker-9dv8x\" (UID: \"c4b77d11-39cd-4b6b-bfe9-39c06b5ac986\") " pod="openstack/octavia-worker-9dv8x" Oct 07 14:04:34 crc kubenswrapper[4854]: I1007 14:04:34.260010 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4b77d11-39cd-4b6b-bfe9-39c06b5ac986-combined-ca-bundle\") pod \"octavia-worker-9dv8x\" (UID: \"c4b77d11-39cd-4b6b-bfe9-39c06b5ac986\") " pod="openstack/octavia-worker-9dv8x" Oct 07 14:04:34 crc kubenswrapper[4854]: I1007 14:04:34.260725 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/c4b77d11-39cd-4b6b-bfe9-39c06b5ac986-config-data-merged\") pod \"octavia-worker-9dv8x\" (UID: \"c4b77d11-39cd-4b6b-bfe9-39c06b5ac986\") " pod="openstack/octavia-worker-9dv8x" Oct 07 14:04:34 crc kubenswrapper[4854]: I1007 14:04:34.261113 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/c4b77d11-39cd-4b6b-bfe9-39c06b5ac986-hm-ports\") pod \"octavia-worker-9dv8x\" (UID: \"c4b77d11-39cd-4b6b-bfe9-39c06b5ac986\") " pod="openstack/octavia-worker-9dv8x" Oct 07 14:04:34 crc kubenswrapper[4854]: I1007 14:04:34.266056 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4b77d11-39cd-4b6b-bfe9-39c06b5ac986-config-data\") pod \"octavia-worker-9dv8x\" (UID: \"c4b77d11-39cd-4b6b-bfe9-39c06b5ac986\") " pod="openstack/octavia-worker-9dv8x" Oct 07 14:04:34 crc kubenswrapper[4854]: I1007 14:04:34.266801 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4b77d11-39cd-4b6b-bfe9-39c06b5ac986-scripts\") pod \"octavia-worker-9dv8x\" (UID: \"c4b77d11-39cd-4b6b-bfe9-39c06b5ac986\") " pod="openstack/octavia-worker-9dv8x" Oct 07 14:04:34 crc kubenswrapper[4854]: I1007 14:04:34.269997 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4b77d11-39cd-4b6b-bfe9-39c06b5ac986-combined-ca-bundle\") pod \"octavia-worker-9dv8x\" (UID: \"c4b77d11-39cd-4b6b-bfe9-39c06b5ac986\") " pod="openstack/octavia-worker-9dv8x" Oct 07 14:04:34 crc kubenswrapper[4854]: I1007 14:04:34.271928 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: 
\"kubernetes.io/secret/c4b77d11-39cd-4b6b-bfe9-39c06b5ac986-amphora-certs\") pod \"octavia-worker-9dv8x\" (UID: \"c4b77d11-39cd-4b6b-bfe9-39c06b5ac986\") " pod="openstack/octavia-worker-9dv8x" Oct 07 14:04:34 crc kubenswrapper[4854]: I1007 14:04:34.286539 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-worker-9dv8x" Oct 07 14:04:34 crc kubenswrapper[4854]: I1007 14:04:34.836533 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-qgrhl" event={"ID":"d6d5f8c4-8b4c-4691-81c1-22a3e9985ba0","Type":"ContainerStarted","Data":"256e2b9b92416431f9d00530d4a57c5436df75f598bb2367cbbf84a25e1ad043"} Oct 07 14:04:34 crc kubenswrapper[4854]: I1007 14:04:34.906771 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-9dv8x"] Oct 07 14:04:35 crc kubenswrapper[4854]: I1007 14:04:35.852512 4854 generic.go:334] "Generic (PLEG): container finished" podID="d6d5f8c4-8b4c-4691-81c1-22a3e9985ba0" containerID="256e2b9b92416431f9d00530d4a57c5436df75f598bb2367cbbf84a25e1ad043" exitCode=0 Oct 07 14:04:35 crc kubenswrapper[4854]: I1007 14:04:35.854076 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-qgrhl" event={"ID":"d6d5f8c4-8b4c-4691-81c1-22a3e9985ba0","Type":"ContainerDied","Data":"256e2b9b92416431f9d00530d4a57c5436df75f598bb2367cbbf84a25e1ad043"} Oct 07 14:04:35 crc kubenswrapper[4854]: I1007 14:04:35.863391 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-9dv8x" event={"ID":"c4b77d11-39cd-4b6b-bfe9-39c06b5ac986","Type":"ContainerStarted","Data":"e2e3060cba0e689e016eed4b1079e909f9f1e7796b8ef00dfc633366f2ffcb80"} Oct 07 14:04:36 crc kubenswrapper[4854]: I1007 14:04:36.877014 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-qgrhl" event={"ID":"d6d5f8c4-8b4c-4691-81c1-22a3e9985ba0","Type":"ContainerStarted","Data":"c2c7d5e1a547be965548adabd711e2e98b2dbd40298a61bcc3a04caeab1b031d"} Oct 07 14:04:36 crc kubenswrapper[4854]: I1007 14:04:36.878312 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-housekeeping-qgrhl" Oct 07 14:04:36 crc kubenswrapper[4854]: I1007 14:04:36.881587 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-9dv8x" event={"ID":"c4b77d11-39cd-4b6b-bfe9-39c06b5ac986","Type":"ContainerStarted","Data":"eb589eec517b04af46a5fe10334992307c6ee05dcffb97e701f3a843301b0589"} Oct 07 14:04:36 crc kubenswrapper[4854]: I1007 14:04:36.907008 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-housekeeping-qgrhl" podStartSLOduration=4.024424036 podStartE2EDuration="5.906983629s" podCreationTimestamp="2025-10-07 14:04:31 +0000 UTC" firstStartedPulling="2025-10-07 14:04:32.080908662 +0000 UTC m=+5988.068740927" lastFinishedPulling="2025-10-07 14:04:33.963468265 +0000 UTC m=+5989.951300520" observedRunningTime="2025-10-07 14:04:36.902490971 +0000 UTC m=+5992.890323246" watchObservedRunningTime="2025-10-07 14:04:36.906983629 +0000 UTC m=+5992.894815914" Oct 07 14:04:37 crc kubenswrapper[4854]: E1007 14:04:37.382505 4854 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc4b77d11_39cd_4b6b_bfe9_39c06b5ac986.slice/crio-eb589eec517b04af46a5fe10334992307c6ee05dcffb97e701f3a843301b0589.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc4b77d11_39cd_4b6b_bfe9_39c06b5ac986.slice/crio-conmon-eb589eec517b04af46a5fe10334992307c6ee05dcffb97e701f3a843301b0589.scope\": RecentStats: unable to find data in memory cache]" Oct 07 14:04:37 crc kubenswrapper[4854]: I1007 14:04:37.891427 4854 generic.go:334] "Generic (PLEG): container finished" podID="c4b77d11-39cd-4b6b-bfe9-39c06b5ac986" containerID="eb589eec517b04af46a5fe10334992307c6ee05dcffb97e701f3a843301b0589" exitCode=0 Oct 07 14:04:37 crc kubenswrapper[4854]: I1007 14:04:37.891873 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-9dv8x" event={"ID":"c4b77d11-39cd-4b6b-bfe9-39c06b5ac986","Type":"ContainerDied","Data":"eb589eec517b04af46a5fe10334992307c6ee05dcffb97e701f3a843301b0589"} Oct 07 14:04:38 crc kubenswrapper[4854]: I1007 14:04:38.902462 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-9dv8x" event={"ID":"c4b77d11-39cd-4b6b-bfe9-39c06b5ac986","Type":"ContainerStarted","Data":"25afbbc5772de2f118967771d2a0f5311b794c2730a884e40b446005cd3bfa79"} Oct 07 14:04:38 crc kubenswrapper[4854]: I1007 14:04:38.903190 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-worker-9dv8x" Oct 07 14:04:38 crc kubenswrapper[4854]: I1007 14:04:38.929043 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-worker-9dv8x" podStartSLOduration=4.804547113 podStartE2EDuration="5.929019349s" podCreationTimestamp="2025-10-07 14:04:33 +0000 UTC" firstStartedPulling="2025-10-07 14:04:34.91501802 +0000 UTC m=+5990.902850275" lastFinishedPulling="2025-10-07 14:04:36.039490256 +0000 UTC m=+5992.027322511" observedRunningTime="2025-10-07 14:04:38.927274329 +0000 UTC m=+5994.915106584" watchObservedRunningTime="2025-10-07 14:04:38.929019349 +0000 UTC m=+5994.916851624" Oct 07 14:04:40 crc kubenswrapper[4854]: I1007 14:04:40.807941 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:04:40 crc kubenswrapper[4854]: I1007 14:04:40.808343 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:04:40 crc kubenswrapper[4854]: I1007 14:04:40.808406 4854 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" Oct 07 14:04:40 crc kubenswrapper[4854]: I1007 14:04:40.809422 4854 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4fe613980076ca3ed8559fe0e286e39c2d0d1c3badb8354aa3600a7ada37572b"} pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 14:04:40 crc kubenswrapper[4854]: I1007 14:04:40.809540 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" 
containerName="machine-config-daemon" containerID="cri-o://4fe613980076ca3ed8559fe0e286e39c2d0d1c3badb8354aa3600a7ada37572b" gracePeriod=600 Oct 07 14:04:40 crc kubenswrapper[4854]: E1007 14:04:40.951420 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:04:41 crc kubenswrapper[4854]: I1007 14:04:41.037593 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-gbrck"] Oct 07 14:04:41 crc kubenswrapper[4854]: I1007 14:04:41.044250 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-gbrck"] Oct 07 14:04:41 crc kubenswrapper[4854]: I1007 14:04:41.943577 4854 generic.go:334] "Generic (PLEG): container finished" podID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerID="4fe613980076ca3ed8559fe0e286e39c2d0d1c3badb8354aa3600a7ada37572b" exitCode=0 Oct 07 14:04:41 crc kubenswrapper[4854]: I1007 14:04:41.943648 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" event={"ID":"40b8b82d-cfd5-41d7-8673-5774db092c85","Type":"ContainerDied","Data":"4fe613980076ca3ed8559fe0e286e39c2d0d1c3badb8354aa3600a7ada37572b"} Oct 07 14:04:41 crc kubenswrapper[4854]: I1007 14:04:41.943947 4854 scope.go:117] "RemoveContainer" containerID="faf9abd5a5e7753883aa95fb4b691a8be7baef96f7df5d4fa745c25c86f27153" Oct 07 14:04:41 crc kubenswrapper[4854]: I1007 14:04:41.944953 4854 scope.go:117] "RemoveContainer" containerID="4fe613980076ca3ed8559fe0e286e39c2d0d1c3badb8354aa3600a7ada37572b" Oct 07 14:04:41 crc kubenswrapper[4854]: E1007 14:04:41.945638 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:04:42 crc kubenswrapper[4854]: I1007 14:04:42.715310 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48dd4b01-cc3a-47cd-b66f-3eed3eed17ac" path="/var/lib/kubelet/pods/48dd4b01-cc3a-47cd-b66f-3eed3eed17ac/volumes" Oct 07 14:04:43 crc kubenswrapper[4854]: I1007 14:04:43.940500 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-healthmanager-z8lh4" Oct 07 14:04:46 crc kubenswrapper[4854]: I1007 14:04:46.519909 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-housekeeping-qgrhl" Oct 07 14:04:49 crc kubenswrapper[4854]: I1007 14:04:49.328907 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-worker-9dv8x" Oct 07 14:04:52 crc kubenswrapper[4854]: I1007 14:04:52.702652 4854 scope.go:117] "RemoveContainer" containerID="4fe613980076ca3ed8559fe0e286e39c2d0d1c3badb8354aa3600a7ada37572b" Oct 07 14:04:52 crc kubenswrapper[4854]: E1007 14:04:52.703518 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:05:04 crc kubenswrapper[4854]: I1007 14:05:04.708093 4854 scope.go:117] "RemoveContainer" containerID="4fe613980076ca3ed8559fe0e286e39c2d0d1c3badb8354aa3600a7ada37572b" Oct 07 14:05:04 crc kubenswrapper[4854]: E1007 14:05:04.708737 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:05:18 crc kubenswrapper[4854]: I1007 14:05:18.703910 4854 scope.go:117] "RemoveContainer" containerID="4fe613980076ca3ed8559fe0e286e39c2d0d1c3badb8354aa3600a7ada37572b" Oct 07 14:05:18 crc kubenswrapper[4854]: E1007 14:05:18.704767 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:05:23 crc kubenswrapper[4854]: I1007 14:05:23.069394 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-qjpwd"] Oct 07 14:05:23 crc kubenswrapper[4854]: I1007 14:05:23.081957 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-qjpwd"] Oct 07 14:05:23 crc kubenswrapper[4854]: I1007 14:05:23.451662 4854 scope.go:117] "RemoveContainer" containerID="0ae0eb5a3f9e855b10dfbcee498e8ed101beb4cb2ed1ac0c248351f7554976c6" Oct 07 14:05:23 crc kubenswrapper[4854]: I1007 14:05:23.504527 4854 scope.go:117] "RemoveContainer" containerID="17bf3e4416c2cf0f463c74e0acd70e269e1dcdbc5a1c014d14def2c2f65ec64f" Oct 07 14:05:23 crc kubenswrapper[4854]: I1007 14:05:23.605966 4854 scope.go:117] "RemoveContainer" containerID="bef464f12e82227f4575a1e08ee74e815686e974f3903c927ef00f78a0e4fa41" Oct 07 14:05:23 crc kubenswrapper[4854]: I1007 14:05:23.623242 4854 scope.go:117] "RemoveContainer" containerID="c9cc2919e579bda39274620d6c337300a8496e5548752ac02b7c13031530756b" Oct 07 14:05:24 crc kubenswrapper[4854]: I1007 14:05:24.713425 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0dc3885-3e3b-40c9-a58c-d255e2f321f6" path="/var/lib/kubelet/pods/d0dc3885-3e3b-40c9-a58c-d255e2f321f6/volumes" Oct 07 14:05:32 crc kubenswrapper[4854]: I1007 14:05:32.703424 4854 scope.go:117] "RemoveContainer" containerID="4fe613980076ca3ed8559fe0e286e39c2d0d1c3badb8354aa3600a7ada37572b" Oct 07 14:05:32 crc kubenswrapper[4854]: E1007 14:05:32.704470 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:05:33 crc kubenswrapper[4854]: I1007 14:05:33.042835 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-5199-account-create-tqt8m"] Oct 07 14:05:33 crc kubenswrapper[4854]: I1007 14:05:33.060005 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-5199-account-create-tqt8m"] Oct 07 14:05:34 crc kubenswrapper[4854]: I1007 14:05:34.714175 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4699aaa5-6e90-4333-bb68-b9294ff720d0" path="/var/lib/kubelet/pods/4699aaa5-6e90-4333-bb68-b9294ff720d0/volumes" Oct 07 14:05:42 crc kubenswrapper[4854]: I1007 14:05:42.069449 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-gmm8k"] Oct 07 14:05:42 crc kubenswrapper[4854]: I1007 14:05:42.076959 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-gmm8k"] Oct 07 14:05:42 crc kubenswrapper[4854]: I1007 14:05:42.718279 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a12bba49-d1b4-469f-96fa-c74a02c4f509" path="/var/lib/kubelet/pods/a12bba49-d1b4-469f-96fa-c74a02c4f509/volumes" Oct 07 14:05:43 crc kubenswrapper[4854]: I1007 14:05:43.959820 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7cc8fddb6f-hjsgw"] Oct 07 14:05:43 crc kubenswrapper[4854]: I1007 14:05:43.961838 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7cc8fddb6f-hjsgw" Oct 07 14:05:43 crc kubenswrapper[4854]: I1007 14:05:43.966484 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Oct 07 14:05:43 crc kubenswrapper[4854]: I1007 14:05:43.966769 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Oct 07 14:05:43 crc kubenswrapper[4854]: I1007 14:05:43.966947 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-lwg25" Oct 07 14:05:43 crc kubenswrapper[4854]: I1007 14:05:43.966992 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Oct 07 14:05:43 crc kubenswrapper[4854]: I1007 14:05:43.968388 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a9fc8309-5f00-4acb-b0c7-0fb734d99c5e-config-data\") pod \"horizon-7cc8fddb6f-hjsgw\" (UID: \"a9fc8309-5f00-4acb-b0c7-0fb734d99c5e\") " pod="openstack/horizon-7cc8fddb6f-hjsgw" Oct 07 14:05:43 crc kubenswrapper[4854]: I1007 14:05:43.968460 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9fc8309-5f00-4acb-b0c7-0fb734d99c5e-logs\") pod \"horizon-7cc8fddb6f-hjsgw\" (UID: \"a9fc8309-5f00-4acb-b0c7-0fb734d99c5e\") " pod="openstack/horizon-7cc8fddb6f-hjsgw" Oct 07 14:05:43 crc kubenswrapper[4854]: I1007 14:05:43.968496 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a9fc8309-5f00-4acb-b0c7-0fb734d99c5e-horizon-secret-key\") pod \"horizon-7cc8fddb6f-hjsgw\" (UID: \"a9fc8309-5f00-4acb-b0c7-0fb734d99c5e\") " pod="openstack/horizon-7cc8fddb6f-hjsgw" Oct 07 14:05:43 crc kubenswrapper[4854]: I1007 14:05:43.968622 4854 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a9fc8309-5f00-4acb-b0c7-0fb734d99c5e-scripts\") pod \"horizon-7cc8fddb6f-hjsgw\" (UID: \"a9fc8309-5f00-4acb-b0c7-0fb734d99c5e\") " pod="openstack/horizon-7cc8fddb6f-hjsgw" Oct 07 14:05:43 crc kubenswrapper[4854]: I1007 14:05:43.968715 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5f49\" (UniqueName: \"kubernetes.io/projected/a9fc8309-5f00-4acb-b0c7-0fb734d99c5e-kube-api-access-h5f49\") pod \"horizon-7cc8fddb6f-hjsgw\" (UID: \"a9fc8309-5f00-4acb-b0c7-0fb734d99c5e\") " pod="openstack/horizon-7cc8fddb6f-hjsgw" Oct 07 14:05:43 crc kubenswrapper[4854]: I1007 14:05:43.972735 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7cc8fddb6f-hjsgw"] Oct 07 14:05:44 crc kubenswrapper[4854]: I1007 14:05:44.006345 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 14:05:44 crc kubenswrapper[4854]: I1007 14:05:44.006711 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="6e9c69fe-fbae-4bb3-9b45-99adc96401fb" containerName="glance-log" containerID="cri-o://5650f623e058e72a787570039c9691294f481f5d28b04702b86e50542b55c0e7" gracePeriod=30 Oct 07 14:05:44 crc kubenswrapper[4854]: I1007 14:05:44.006779 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="6e9c69fe-fbae-4bb3-9b45-99adc96401fb" containerName="glance-httpd" containerID="cri-o://b86f007cec90b15b2a5e447fdd890d97dcf68481f31dcd42468995c12137b3e6" gracePeriod=30 Oct 07 14:05:44 crc kubenswrapper[4854]: I1007 14:05:44.070991 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a9fc8309-5f00-4acb-b0c7-0fb734d99c5e-config-data\") pod \"horizon-7cc8fddb6f-hjsgw\" (UID: \"a9fc8309-5f00-4acb-b0c7-0fb734d99c5e\") " pod="openstack/horizon-7cc8fddb6f-hjsgw" Oct 07 14:05:44 crc kubenswrapper[4854]: I1007 14:05:44.071069 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9fc8309-5f00-4acb-b0c7-0fb734d99c5e-logs\") pod \"horizon-7cc8fddb6f-hjsgw\" (UID: \"a9fc8309-5f00-4acb-b0c7-0fb734d99c5e\") " pod="openstack/horizon-7cc8fddb6f-hjsgw" Oct 07 14:05:44 crc kubenswrapper[4854]: I1007 14:05:44.071105 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a9fc8309-5f00-4acb-b0c7-0fb734d99c5e-horizon-secret-key\") pod \"horizon-7cc8fddb6f-hjsgw\" (UID: \"a9fc8309-5f00-4acb-b0c7-0fb734d99c5e\") " pod="openstack/horizon-7cc8fddb6f-hjsgw" Oct 07 14:05:44 crc kubenswrapper[4854]: I1007 14:05:44.071165 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a9fc8309-5f00-4acb-b0c7-0fb734d99c5e-scripts\") pod \"horizon-7cc8fddb6f-hjsgw\" (UID: \"a9fc8309-5f00-4acb-b0c7-0fb734d99c5e\") " pod="openstack/horizon-7cc8fddb6f-hjsgw" Oct 07 14:05:44 crc kubenswrapper[4854]: I1007 14:05:44.071211 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5f49\" (UniqueName: \"kubernetes.io/projected/a9fc8309-5f00-4acb-b0c7-0fb734d99c5e-kube-api-access-h5f49\") pod \"horizon-7cc8fddb6f-hjsgw\" (UID: 
\"a9fc8309-5f00-4acb-b0c7-0fb734d99c5e\") " pod="openstack/horizon-7cc8fddb6f-hjsgw" Oct 07 14:05:44 crc kubenswrapper[4854]: I1007 14:05:44.072670 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a9fc8309-5f00-4acb-b0c7-0fb734d99c5e-config-data\") pod \"horizon-7cc8fddb6f-hjsgw\" (UID: \"a9fc8309-5f00-4acb-b0c7-0fb734d99c5e\") " pod="openstack/horizon-7cc8fddb6f-hjsgw" Oct 07 14:05:44 crc kubenswrapper[4854]: I1007 14:05:44.072945 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9fc8309-5f00-4acb-b0c7-0fb734d99c5e-logs\") pod \"horizon-7cc8fddb6f-hjsgw\" (UID: \"a9fc8309-5f00-4acb-b0c7-0fb734d99c5e\") " pod="openstack/horizon-7cc8fddb6f-hjsgw" Oct 07 14:05:44 crc kubenswrapper[4854]: I1007 14:05:44.073348 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a9fc8309-5f00-4acb-b0c7-0fb734d99c5e-scripts\") pod \"horizon-7cc8fddb6f-hjsgw\" (UID: \"a9fc8309-5f00-4acb-b0c7-0fb734d99c5e\") " pod="openstack/horizon-7cc8fddb6f-hjsgw" Oct 07 14:05:44 crc kubenswrapper[4854]: I1007 14:05:44.073387 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 14:05:44 crc kubenswrapper[4854]: I1007 14:05:44.073580 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e598d141-22f5-4f79-8f6f-ca3f5051e894" containerName="glance-log" containerID="cri-o://b87e87cd1bf9464c9fcfc1bb9b5ffa1bc656a45581b40934961e33158b1b568e" gracePeriod=30 Oct 07 14:05:44 crc kubenswrapper[4854]: I1007 14:05:44.074013 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e598d141-22f5-4f79-8f6f-ca3f5051e894" containerName="glance-httpd" containerID="cri-o://169ec0c025bfa7ffec1106bc9a13be9fbd77d7b8286f71799c28c7c03dc9a0eb" gracePeriod=30 Oct 07 14:05:44 crc kubenswrapper[4854]: I1007 14:05:44.079746 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a9fc8309-5f00-4acb-b0c7-0fb734d99c5e-horizon-secret-key\") pod \"horizon-7cc8fddb6f-hjsgw\" (UID: \"a9fc8309-5f00-4acb-b0c7-0fb734d99c5e\") " pod="openstack/horizon-7cc8fddb6f-hjsgw" Oct 07 14:05:44 crc kubenswrapper[4854]: I1007 14:05:44.101383 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5f49\" (UniqueName: \"kubernetes.io/projected/a9fc8309-5f00-4acb-b0c7-0fb734d99c5e-kube-api-access-h5f49\") pod \"horizon-7cc8fddb6f-hjsgw\" (UID: \"a9fc8309-5f00-4acb-b0c7-0fb734d99c5e\") " pod="openstack/horizon-7cc8fddb6f-hjsgw" Oct 07 14:05:44 crc kubenswrapper[4854]: I1007 14:05:44.114640 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6667b4cddf-7z9f7"] Oct 07 14:05:44 crc kubenswrapper[4854]: I1007 14:05:44.116718 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6667b4cddf-7z9f7" Oct 07 14:05:44 crc kubenswrapper[4854]: I1007 14:05:44.131032 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6667b4cddf-7z9f7"] Oct 07 14:05:44 crc kubenswrapper[4854]: I1007 14:05:44.274429 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hm8k9\" (UniqueName: \"kubernetes.io/projected/a79a49ce-66fc-404d-ba1d-373e0daa3f3a-kube-api-access-hm8k9\") pod \"horizon-6667b4cddf-7z9f7\" (UID: \"a79a49ce-66fc-404d-ba1d-373e0daa3f3a\") " pod="openstack/horizon-6667b4cddf-7z9f7" Oct 07 14:05:44 crc kubenswrapper[4854]: I1007 14:05:44.275221 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a79a49ce-66fc-404d-ba1d-373e0daa3f3a-config-data\") pod \"horizon-6667b4cddf-7z9f7\" (UID: \"a79a49ce-66fc-404d-ba1d-373e0daa3f3a\") " pod="openstack/horizon-6667b4cddf-7z9f7" Oct 07 14:05:44 crc kubenswrapper[4854]: I1007 14:05:44.275350 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a79a49ce-66fc-404d-ba1d-373e0daa3f3a-logs\") pod \"horizon-6667b4cddf-7z9f7\" (UID: \"a79a49ce-66fc-404d-ba1d-373e0daa3f3a\") " pod="openstack/horizon-6667b4cddf-7z9f7" Oct 07 14:05:44 crc kubenswrapper[4854]: I1007 14:05:44.275657 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a79a49ce-66fc-404d-ba1d-373e0daa3f3a-horizon-secret-key\") pod \"horizon-6667b4cddf-7z9f7\" (UID: \"a79a49ce-66fc-404d-ba1d-373e0daa3f3a\") " pod="openstack/horizon-6667b4cddf-7z9f7" Oct 07 14:05:44 crc kubenswrapper[4854]: I1007 14:05:44.275745 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a79a49ce-66fc-404d-ba1d-373e0daa3f3a-scripts\") pod \"horizon-6667b4cddf-7z9f7\" (UID: \"a79a49ce-66fc-404d-ba1d-373e0daa3f3a\") " pod="openstack/horizon-6667b4cddf-7z9f7" Oct 07 14:05:44 crc kubenswrapper[4854]: I1007 14:05:44.284289 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7cc8fddb6f-hjsgw" Oct 07 14:05:44 crc kubenswrapper[4854]: I1007 14:05:44.377862 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a79a49ce-66fc-404d-ba1d-373e0daa3f3a-logs\") pod \"horizon-6667b4cddf-7z9f7\" (UID: \"a79a49ce-66fc-404d-ba1d-373e0daa3f3a\") " pod="openstack/horizon-6667b4cddf-7z9f7" Oct 07 14:05:44 crc kubenswrapper[4854]: I1007 14:05:44.377975 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a79a49ce-66fc-404d-ba1d-373e0daa3f3a-horizon-secret-key\") pod \"horizon-6667b4cddf-7z9f7\" (UID: \"a79a49ce-66fc-404d-ba1d-373e0daa3f3a\") " pod="openstack/horizon-6667b4cddf-7z9f7" Oct 07 14:05:44 crc kubenswrapper[4854]: I1007 14:05:44.378004 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a79a49ce-66fc-404d-ba1d-373e0daa3f3a-scripts\") pod \"horizon-6667b4cddf-7z9f7\" (UID: \"a79a49ce-66fc-404d-ba1d-373e0daa3f3a\") " pod="openstack/horizon-6667b4cddf-7z9f7" Oct 07 14:05:44 crc kubenswrapper[4854]: I1007 14:05:44.378054 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hm8k9\" (UniqueName: \"kubernetes.io/projected/a79a49ce-66fc-404d-ba1d-373e0daa3f3a-kube-api-access-hm8k9\") pod \"horizon-6667b4cddf-7z9f7\" (UID: \"a79a49ce-66fc-404d-ba1d-373e0daa3f3a\") " pod="openstack/horizon-6667b4cddf-7z9f7" Oct 07 14:05:44 crc kubenswrapper[4854]: I1007 14:05:44.378089 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a79a49ce-66fc-404d-ba1d-373e0daa3f3a-config-data\") pod \"horizon-6667b4cddf-7z9f7\" (UID: \"a79a49ce-66fc-404d-ba1d-373e0daa3f3a\") " pod="openstack/horizon-6667b4cddf-7z9f7" Oct 07 14:05:44 crc kubenswrapper[4854]: I1007 14:05:44.378287 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a79a49ce-66fc-404d-ba1d-373e0daa3f3a-logs\") pod \"horizon-6667b4cddf-7z9f7\" (UID: \"a79a49ce-66fc-404d-ba1d-373e0daa3f3a\") " pod="openstack/horizon-6667b4cddf-7z9f7" Oct 07 14:05:44 crc kubenswrapper[4854]: I1007 14:05:44.379285 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a79a49ce-66fc-404d-ba1d-373e0daa3f3a-config-data\") pod \"horizon-6667b4cddf-7z9f7\" (UID: \"a79a49ce-66fc-404d-ba1d-373e0daa3f3a\") " pod="openstack/horizon-6667b4cddf-7z9f7" Oct 07 14:05:44 crc kubenswrapper[4854]: I1007 14:05:44.379432 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a79a49ce-66fc-404d-ba1d-373e0daa3f3a-scripts\") pod \"horizon-6667b4cddf-7z9f7\" (UID: \"a79a49ce-66fc-404d-ba1d-373e0daa3f3a\") " pod="openstack/horizon-6667b4cddf-7z9f7" Oct 07 14:05:44 crc kubenswrapper[4854]: I1007 14:05:44.383818 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a79a49ce-66fc-404d-ba1d-373e0daa3f3a-horizon-secret-key\") pod \"horizon-6667b4cddf-7z9f7\" (UID: \"a79a49ce-66fc-404d-ba1d-373e0daa3f3a\") " pod="openstack/horizon-6667b4cddf-7z9f7" Oct 07 14:05:44 crc kubenswrapper[4854]: I1007 14:05:44.403426 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-hm8k9\" (UniqueName: \"kubernetes.io/projected/a79a49ce-66fc-404d-ba1d-373e0daa3f3a-kube-api-access-hm8k9\") pod \"horizon-6667b4cddf-7z9f7\" (UID: \"a79a49ce-66fc-404d-ba1d-373e0daa3f3a\") " pod="openstack/horizon-6667b4cddf-7z9f7" Oct 07 14:05:44 crc kubenswrapper[4854]: I1007 14:05:44.518067 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6667b4cddf-7z9f7" Oct 07 14:05:44 crc kubenswrapper[4854]: I1007 14:05:44.635839 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7cc8fddb6f-hjsgw"] Oct 07 14:05:44 crc kubenswrapper[4854]: I1007 14:05:44.698141 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-78589d7845-lb7r8"] Oct 07 14:05:44 crc kubenswrapper[4854]: I1007 14:05:44.742516 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-78589d7845-lb7r8" Oct 07 14:05:44 crc kubenswrapper[4854]: I1007 14:05:44.746392 4854 generic.go:334] "Generic (PLEG): container finished" podID="6e9c69fe-fbae-4bb3-9b45-99adc96401fb" containerID="5650f623e058e72a787570039c9691294f481f5d28b04702b86e50542b55c0e7" exitCode=143 Oct 07 14:05:44 crc kubenswrapper[4854]: I1007 14:05:44.769109 4854 generic.go:334] "Generic (PLEG): container finished" podID="e598d141-22f5-4f79-8f6f-ca3f5051e894" containerID="b87e87cd1bf9464c9fcfc1bb9b5ffa1bc656a45581b40934961e33158b1b568e" exitCode=143 Oct 07 14:05:44 crc kubenswrapper[4854]: I1007 14:05:44.771379 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-78589d7845-lb7r8"] Oct 07 14:05:44 crc kubenswrapper[4854]: I1007 14:05:44.771721 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6e9c69fe-fbae-4bb3-9b45-99adc96401fb","Type":"ContainerDied","Data":"5650f623e058e72a787570039c9691294f481f5d28b04702b86e50542b55c0e7"} Oct 07 14:05:44 crc kubenswrapper[4854]: I1007 14:05:44.771812 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e598d141-22f5-4f79-8f6f-ca3f5051e894","Type":"ContainerDied","Data":"b87e87cd1bf9464c9fcfc1bb9b5ffa1bc656a45581b40934961e33158b1b568e"} Oct 07 14:05:44 crc kubenswrapper[4854]: I1007 14:05:44.790559 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7cc8fddb6f-hjsgw"] Oct 07 14:05:44 crc kubenswrapper[4854]: I1007 14:05:44.796092 4854 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 14:05:44 crc kubenswrapper[4854]: I1007 14:05:44.807738 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7e937537-3ce6-44a9-80ca-d1378ef14544-horizon-secret-key\") pod \"horizon-78589d7845-lb7r8\" (UID: \"7e937537-3ce6-44a9-80ca-d1378ef14544\") " pod="openstack/horizon-78589d7845-lb7r8" Oct 07 14:05:44 crc kubenswrapper[4854]: I1007 14:05:44.808057 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e937537-3ce6-44a9-80ca-d1378ef14544-logs\") pod \"horizon-78589d7845-lb7r8\" (UID: \"7e937537-3ce6-44a9-80ca-d1378ef14544\") " pod="openstack/horizon-78589d7845-lb7r8" Oct 07 14:05:44 crc kubenswrapper[4854]: I1007 14:05:44.808354 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/7e937537-3ce6-44a9-80ca-d1378ef14544-config-data\") pod \"horizon-78589d7845-lb7r8\" (UID: \"7e937537-3ce6-44a9-80ca-d1378ef14544\") " pod="openstack/horizon-78589d7845-lb7r8" Oct 07 14:05:44 crc kubenswrapper[4854]: I1007 14:05:44.808797 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7e937537-3ce6-44a9-80ca-d1378ef14544-scripts\") pod \"horizon-78589d7845-lb7r8\" (UID: \"7e937537-3ce6-44a9-80ca-d1378ef14544\") " pod="openstack/horizon-78589d7845-lb7r8" Oct 07 14:05:44 crc kubenswrapper[4854]: I1007 14:05:44.808955 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlth4\" (UniqueName: \"kubernetes.io/projected/7e937537-3ce6-44a9-80ca-d1378ef14544-kube-api-access-hlth4\") pod \"horizon-78589d7845-lb7r8\" (UID: \"7e937537-3ce6-44a9-80ca-d1378ef14544\") " pod="openstack/horizon-78589d7845-lb7r8" Oct 07 14:05:44 crc kubenswrapper[4854]: I1007 14:05:44.910402 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlth4\" (UniqueName: \"kubernetes.io/projected/7e937537-3ce6-44a9-80ca-d1378ef14544-kube-api-access-hlth4\") pod \"horizon-78589d7845-lb7r8\" (UID: \"7e937537-3ce6-44a9-80ca-d1378ef14544\") " pod="openstack/horizon-78589d7845-lb7r8" Oct 07 14:05:44 crc kubenswrapper[4854]: I1007 14:05:44.910506 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7e937537-3ce6-44a9-80ca-d1378ef14544-horizon-secret-key\") pod \"horizon-78589d7845-lb7r8\" (UID: \"7e937537-3ce6-44a9-80ca-d1378ef14544\") " pod="openstack/horizon-78589d7845-lb7r8" Oct 07 14:05:44 crc kubenswrapper[4854]: I1007 14:05:44.910568 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e937537-3ce6-44a9-80ca-d1378ef14544-logs\") pod \"horizon-78589d7845-lb7r8\" (UID: \"7e937537-3ce6-44a9-80ca-d1378ef14544\") " pod="openstack/horizon-78589d7845-lb7r8" Oct 07 14:05:44 crc kubenswrapper[4854]: I1007 14:05:44.911409 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e937537-3ce6-44a9-80ca-d1378ef14544-logs\") pod \"horizon-78589d7845-lb7r8\" (UID: \"7e937537-3ce6-44a9-80ca-d1378ef14544\") " pod="openstack/horizon-78589d7845-lb7r8" Oct 07 14:05:44 crc kubenswrapper[4854]: I1007 14:05:44.911940 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7e937537-3ce6-44a9-80ca-d1378ef14544-config-data\") pod \"horizon-78589d7845-lb7r8\" (UID: \"7e937537-3ce6-44a9-80ca-d1378ef14544\") " pod="openstack/horizon-78589d7845-lb7r8" Oct 07 14:05:44 crc kubenswrapper[4854]: I1007 14:05:44.912231 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7e937537-3ce6-44a9-80ca-d1378ef14544-scripts\") pod \"horizon-78589d7845-lb7r8\" (UID: \"7e937537-3ce6-44a9-80ca-d1378ef14544\") " pod="openstack/horizon-78589d7845-lb7r8" Oct 07 14:05:44 crc kubenswrapper[4854]: I1007 14:05:44.913442 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7e937537-3ce6-44a9-80ca-d1378ef14544-config-data\") pod \"horizon-78589d7845-lb7r8\" (UID: 
\"7e937537-3ce6-44a9-80ca-d1378ef14544\") " pod="openstack/horizon-78589d7845-lb7r8" Oct 07 14:05:44 crc kubenswrapper[4854]: I1007 14:05:44.913686 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7e937537-3ce6-44a9-80ca-d1378ef14544-scripts\") pod \"horizon-78589d7845-lb7r8\" (UID: \"7e937537-3ce6-44a9-80ca-d1378ef14544\") " pod="openstack/horizon-78589d7845-lb7r8" Oct 07 14:05:44 crc kubenswrapper[4854]: I1007 14:05:44.915354 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7e937537-3ce6-44a9-80ca-d1378ef14544-horizon-secret-key\") pod \"horizon-78589d7845-lb7r8\" (UID: \"7e937537-3ce6-44a9-80ca-d1378ef14544\") " pod="openstack/horizon-78589d7845-lb7r8" Oct 07 14:05:44 crc kubenswrapper[4854]: I1007 14:05:44.927972 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlth4\" (UniqueName: \"kubernetes.io/projected/7e937537-3ce6-44a9-80ca-d1378ef14544-kube-api-access-hlth4\") pod \"horizon-78589d7845-lb7r8\" (UID: \"7e937537-3ce6-44a9-80ca-d1378ef14544\") " pod="openstack/horizon-78589d7845-lb7r8" Oct 07 14:05:45 crc kubenswrapper[4854]: I1007 14:05:45.094083 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6667b4cddf-7z9f7"] Oct 07 14:05:45 crc kubenswrapper[4854]: I1007 14:05:45.098392 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-78589d7845-lb7r8" Oct 07 14:05:45 crc kubenswrapper[4854]: W1007 14:05:45.100488 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda79a49ce_66fc_404d_ba1d_373e0daa3f3a.slice/crio-9291251a81bb537914b8d633385008f2a0c9f22aa6188c9f0fd7f75f0657d044 WatchSource:0}: Error finding container 9291251a81bb537914b8d633385008f2a0c9f22aa6188c9f0fd7f75f0657d044: Status 404 returned error can't find the container with id 9291251a81bb537914b8d633385008f2a0c9f22aa6188c9f0fd7f75f0657d044 Oct 07 14:05:45 crc kubenswrapper[4854]: I1007 14:05:45.556369 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-78589d7845-lb7r8"] Oct 07 14:05:45 crc kubenswrapper[4854]: I1007 14:05:45.702744 4854 scope.go:117] "RemoveContainer" containerID="4fe613980076ca3ed8559fe0e286e39c2d0d1c3badb8354aa3600a7ada37572b" Oct 07 14:05:45 crc kubenswrapper[4854]: E1007 14:05:45.702948 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:05:45 crc kubenswrapper[4854]: I1007 14:05:45.779952 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-78589d7845-lb7r8" event={"ID":"7e937537-3ce6-44a9-80ca-d1378ef14544","Type":"ContainerStarted","Data":"26aaf87732eb451a590d9d04f7fa38adb2d3e8db1d0199ea57191f1208c44cbd"} Oct 07 14:05:45 crc kubenswrapper[4854]: I1007 14:05:45.781521 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7cc8fddb6f-hjsgw" event={"ID":"a9fc8309-5f00-4acb-b0c7-0fb734d99c5e","Type":"ContainerStarted","Data":"ba8a66d62c6bcd98a204eeaf8893250326469e0ef003afb4db209f36e2f980c0"} Oct 07 
14:05:45 crc kubenswrapper[4854]: I1007 14:05:45.782745 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6667b4cddf-7z9f7" event={"ID":"a79a49ce-66fc-404d-ba1d-373e0daa3f3a","Type":"ContainerStarted","Data":"9291251a81bb537914b8d633385008f2a0c9f22aa6188c9f0fd7f75f0657d044"} Oct 07 14:05:47 crc kubenswrapper[4854]: I1007 14:05:47.277356 4854 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="e598d141-22f5-4f79-8f6f-ca3f5051e894" containerName="glance-httpd" probeResult="failure" output="Get \"http://10.217.1.45:9292/healthcheck\": dial tcp 10.217.1.45:9292: connect: connection refused" Oct 07 14:05:47 crc kubenswrapper[4854]: I1007 14:05:47.284976 4854 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="e598d141-22f5-4f79-8f6f-ca3f5051e894" containerName="glance-log" probeResult="failure" output="Get \"http://10.217.1.45:9292/healthcheck\": dial tcp 10.217.1.45:9292: connect: connection refused" Oct 07 14:05:47 crc kubenswrapper[4854]: I1007 14:05:47.811324 4854 generic.go:334] "Generic (PLEG): container finished" podID="e598d141-22f5-4f79-8f6f-ca3f5051e894" containerID="169ec0c025bfa7ffec1106bc9a13be9fbd77d7b8286f71799c28c7c03dc9a0eb" exitCode=0 Oct 07 14:05:47 crc kubenswrapper[4854]: I1007 14:05:47.811521 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e598d141-22f5-4f79-8f6f-ca3f5051e894","Type":"ContainerDied","Data":"169ec0c025bfa7ffec1106bc9a13be9fbd77d7b8286f71799c28c7c03dc9a0eb"} Oct 07 14:05:47 crc kubenswrapper[4854]: I1007 14:05:47.813603 4854 generic.go:334] "Generic (PLEG): container finished" podID="6e9c69fe-fbae-4bb3-9b45-99adc96401fb" containerID="b86f007cec90b15b2a5e447fdd890d97dcf68481f31dcd42468995c12137b3e6" exitCode=0 Oct 07 14:05:47 crc kubenswrapper[4854]: I1007 14:05:47.813694 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6e9c69fe-fbae-4bb3-9b45-99adc96401fb","Type":"ContainerDied","Data":"b86f007cec90b15b2a5e447fdd890d97dcf68481f31dcd42468995c12137b3e6"} Oct 07 14:05:47 crc kubenswrapper[4854]: I1007 14:05:47.905889 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 14:05:48 crc kubenswrapper[4854]: I1007 14:05:48.081281 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6e9c69fe-fbae-4bb3-9b45-99adc96401fb-httpd-run\") pod \"6e9c69fe-fbae-4bb3-9b45-99adc96401fb\" (UID: \"6e9c69fe-fbae-4bb3-9b45-99adc96401fb\") " Oct 07 14:05:48 crc kubenswrapper[4854]: I1007 14:05:48.081338 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6e9c69fe-fbae-4bb3-9b45-99adc96401fb-ceph\") pod \"6e9c69fe-fbae-4bb3-9b45-99adc96401fb\" (UID: \"6e9c69fe-fbae-4bb3-9b45-99adc96401fb\") " Oct 07 14:05:48 crc kubenswrapper[4854]: I1007 14:05:48.081400 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e9c69fe-fbae-4bb3-9b45-99adc96401fb-scripts\") pod \"6e9c69fe-fbae-4bb3-9b45-99adc96401fb\" (UID: \"6e9c69fe-fbae-4bb3-9b45-99adc96401fb\") " Oct 07 14:05:48 crc kubenswrapper[4854]: I1007 14:05:48.081436 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e9c69fe-fbae-4bb3-9b45-99adc96401fb-config-data\") pod \"6e9c69fe-fbae-4bb3-9b45-99adc96401fb\" (UID: \"6e9c69fe-fbae-4bb3-9b45-99adc96401fb\") " Oct 07 14:05:48 crc kubenswrapper[4854]: I1007 14:05:48.081461 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e9c69fe-fbae-4bb3-9b45-99adc96401fb-combined-ca-bundle\") pod \"6e9c69fe-fbae-4bb3-9b45-99adc96401fb\" (UID: \"6e9c69fe-fbae-4bb3-9b45-99adc96401fb\") " Oct 07 14:05:48 crc kubenswrapper[4854]: I1007 14:05:48.081500 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhtkd\" (UniqueName: \"kubernetes.io/projected/6e9c69fe-fbae-4bb3-9b45-99adc96401fb-kube-api-access-lhtkd\") pod \"6e9c69fe-fbae-4bb3-9b45-99adc96401fb\" (UID: \"6e9c69fe-fbae-4bb3-9b45-99adc96401fb\") " Oct 07 14:05:48 crc kubenswrapper[4854]: I1007 14:05:48.081525 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e9c69fe-fbae-4bb3-9b45-99adc96401fb-logs\") pod \"6e9c69fe-fbae-4bb3-9b45-99adc96401fb\" (UID: \"6e9c69fe-fbae-4bb3-9b45-99adc96401fb\") " Oct 07 14:05:48 crc kubenswrapper[4854]: I1007 14:05:48.082195 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e9c69fe-fbae-4bb3-9b45-99adc96401fb-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6e9c69fe-fbae-4bb3-9b45-99adc96401fb" (UID: "6e9c69fe-fbae-4bb3-9b45-99adc96401fb"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:05:48 crc kubenswrapper[4854]: I1007 14:05:48.082265 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e9c69fe-fbae-4bb3-9b45-99adc96401fb-logs" (OuterVolumeSpecName: "logs") pod "6e9c69fe-fbae-4bb3-9b45-99adc96401fb" (UID: "6e9c69fe-fbae-4bb3-9b45-99adc96401fb"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:05:48 crc kubenswrapper[4854]: I1007 14:05:48.090320 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e9c69fe-fbae-4bb3-9b45-99adc96401fb-ceph" (OuterVolumeSpecName: "ceph") pod "6e9c69fe-fbae-4bb3-9b45-99adc96401fb" (UID: "6e9c69fe-fbae-4bb3-9b45-99adc96401fb"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:05:48 crc kubenswrapper[4854]: I1007 14:05:48.090714 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e9c69fe-fbae-4bb3-9b45-99adc96401fb-kube-api-access-lhtkd" (OuterVolumeSpecName: "kube-api-access-lhtkd") pod "6e9c69fe-fbae-4bb3-9b45-99adc96401fb" (UID: "6e9c69fe-fbae-4bb3-9b45-99adc96401fb"). InnerVolumeSpecName "kube-api-access-lhtkd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:05:48 crc kubenswrapper[4854]: I1007 14:05:48.107868 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e9c69fe-fbae-4bb3-9b45-99adc96401fb-scripts" (OuterVolumeSpecName: "scripts") pod "6e9c69fe-fbae-4bb3-9b45-99adc96401fb" (UID: "6e9c69fe-fbae-4bb3-9b45-99adc96401fb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:05:48 crc kubenswrapper[4854]: I1007 14:05:48.118327 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e9c69fe-fbae-4bb3-9b45-99adc96401fb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6e9c69fe-fbae-4bb3-9b45-99adc96401fb" (UID: "6e9c69fe-fbae-4bb3-9b45-99adc96401fb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:05:48 crc kubenswrapper[4854]: I1007 14:05:48.135636 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e9c69fe-fbae-4bb3-9b45-99adc96401fb-config-data" (OuterVolumeSpecName: "config-data") pod "6e9c69fe-fbae-4bb3-9b45-99adc96401fb" (UID: "6e9c69fe-fbae-4bb3-9b45-99adc96401fb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:05:48 crc kubenswrapper[4854]: I1007 14:05:48.183346 4854 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e9c69fe-fbae-4bb3-9b45-99adc96401fb-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 14:05:48 crc kubenswrapper[4854]: I1007 14:05:48.183384 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e9c69fe-fbae-4bb3-9b45-99adc96401fb-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:05:48 crc kubenswrapper[4854]: I1007 14:05:48.183397 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e9c69fe-fbae-4bb3-9b45-99adc96401fb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:05:48 crc kubenswrapper[4854]: I1007 14:05:48.183409 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhtkd\" (UniqueName: \"kubernetes.io/projected/6e9c69fe-fbae-4bb3-9b45-99adc96401fb-kube-api-access-lhtkd\") on node \"crc\" DevicePath \"\"" Oct 07 14:05:48 crc kubenswrapper[4854]: I1007 14:05:48.183420 4854 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e9c69fe-fbae-4bb3-9b45-99adc96401fb-logs\") on node \"crc\" DevicePath \"\"" Oct 07 14:05:48 crc kubenswrapper[4854]: I1007 14:05:48.183430 4854 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6e9c69fe-fbae-4bb3-9b45-99adc96401fb-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 07 14:05:48 crc kubenswrapper[4854]: I1007 14:05:48.183441 4854 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6e9c69fe-fbae-4bb3-9b45-99adc96401fb-ceph\") on node \"crc\" DevicePath \"\"" Oct 07 14:05:48 crc kubenswrapper[4854]: I1007 14:05:48.827447 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6e9c69fe-fbae-4bb3-9b45-99adc96401fb","Type":"ContainerDied","Data":"8a6c8c31589d0c20774cc13ea1acef279ba367856b90af62b5dc865aff9e128c"} Oct 07 14:05:48 crc kubenswrapper[4854]: I1007 14:05:48.827521 4854 scope.go:117] "RemoveContainer" containerID="b86f007cec90b15b2a5e447fdd890d97dcf68481f31dcd42468995c12137b3e6" Oct 07 14:05:48 crc kubenswrapper[4854]: I1007 14:05:48.827565 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 14:05:48 crc kubenswrapper[4854]: I1007 14:05:48.853430 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 14:05:48 crc kubenswrapper[4854]: I1007 14:05:48.863590 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 14:05:48 crc kubenswrapper[4854]: I1007 14:05:48.883088 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 14:05:48 crc kubenswrapper[4854]: E1007 14:05:48.883623 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e9c69fe-fbae-4bb3-9b45-99adc96401fb" containerName="glance-log" Oct 07 14:05:48 crc kubenswrapper[4854]: I1007 14:05:48.883639 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e9c69fe-fbae-4bb3-9b45-99adc96401fb" containerName="glance-log" Oct 07 14:05:48 crc kubenswrapper[4854]: E1007 14:05:48.883664 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e9c69fe-fbae-4bb3-9b45-99adc96401fb" containerName="glance-httpd" Oct 07 14:05:48 crc kubenswrapper[4854]: I1007 14:05:48.883670 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e9c69fe-fbae-4bb3-9b45-99adc96401fb" containerName="glance-httpd" Oct 07 14:05:48 crc kubenswrapper[4854]: I1007 14:05:48.883857 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e9c69fe-fbae-4bb3-9b45-99adc96401fb" containerName="glance-log" Oct 07 14:05:48 crc kubenswrapper[4854]: I1007 14:05:48.883875 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e9c69fe-fbae-4bb3-9b45-99adc96401fb" containerName="glance-httpd" Oct 07 14:05:48 crc kubenswrapper[4854]: I1007 14:05:48.884895 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 14:05:48 crc kubenswrapper[4854]: I1007 14:05:48.886946 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 07 14:05:48 crc kubenswrapper[4854]: I1007 14:05:48.901607 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c13c39f-8f80-4b98-884e-0bde905ab6f9-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7c13c39f-8f80-4b98-884e-0bde905ab6f9\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:05:48 crc kubenswrapper[4854]: I1007 14:05:48.901654 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7c13c39f-8f80-4b98-884e-0bde905ab6f9-ceph\") pod \"glance-default-internal-api-0\" (UID: \"7c13c39f-8f80-4b98-884e-0bde905ab6f9\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:05:48 crc kubenswrapper[4854]: I1007 14:05:48.901700 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7c13c39f-8f80-4b98-884e-0bde905ab6f9-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7c13c39f-8f80-4b98-884e-0bde905ab6f9\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:05:48 crc kubenswrapper[4854]: I1007 14:05:48.901757 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c13c39f-8f80-4b98-884e-0bde905ab6f9-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7c13c39f-8f80-4b98-884e-0bde905ab6f9\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:05:48 crc kubenswrapper[4854]: I1007 14:05:48.901830 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zr98l\" (UniqueName: \"kubernetes.io/projected/7c13c39f-8f80-4b98-884e-0bde905ab6f9-kube-api-access-zr98l\") pod \"glance-default-internal-api-0\" (UID: \"7c13c39f-8f80-4b98-884e-0bde905ab6f9\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:05:48 crc kubenswrapper[4854]: I1007 14:05:48.901871 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c13c39f-8f80-4b98-884e-0bde905ab6f9-logs\") pod \"glance-default-internal-api-0\" (UID: \"7c13c39f-8f80-4b98-884e-0bde905ab6f9\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:05:48 crc kubenswrapper[4854]: I1007 14:05:48.901897 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c13c39f-8f80-4b98-884e-0bde905ab6f9-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7c13c39f-8f80-4b98-884e-0bde905ab6f9\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:05:48 crc kubenswrapper[4854]: I1007 14:05:48.910854 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 14:05:49 crc kubenswrapper[4854]: I1007 14:05:49.009169 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7c13c39f-8f80-4b98-884e-0bde905ab6f9-httpd-run\") pod \"glance-default-internal-api-0\" (UID: 
\"7c13c39f-8f80-4b98-884e-0bde905ab6f9\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:05:49 crc kubenswrapper[4854]: I1007 14:05:49.009298 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c13c39f-8f80-4b98-884e-0bde905ab6f9-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7c13c39f-8f80-4b98-884e-0bde905ab6f9\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:05:49 crc kubenswrapper[4854]: I1007 14:05:49.009433 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zr98l\" (UniqueName: \"kubernetes.io/projected/7c13c39f-8f80-4b98-884e-0bde905ab6f9-kube-api-access-zr98l\") pod \"glance-default-internal-api-0\" (UID: \"7c13c39f-8f80-4b98-884e-0bde905ab6f9\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:05:49 crc kubenswrapper[4854]: I1007 14:05:49.009484 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c13c39f-8f80-4b98-884e-0bde905ab6f9-logs\") pod \"glance-default-internal-api-0\" (UID: \"7c13c39f-8f80-4b98-884e-0bde905ab6f9\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:05:49 crc kubenswrapper[4854]: I1007 14:05:49.009521 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c13c39f-8f80-4b98-884e-0bde905ab6f9-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7c13c39f-8f80-4b98-884e-0bde905ab6f9\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:05:49 crc kubenswrapper[4854]: I1007 14:05:49.009612 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c13c39f-8f80-4b98-884e-0bde905ab6f9-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7c13c39f-8f80-4b98-884e-0bde905ab6f9\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:05:49 crc kubenswrapper[4854]: I1007 14:05:49.009616 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7c13c39f-8f80-4b98-884e-0bde905ab6f9-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7c13c39f-8f80-4b98-884e-0bde905ab6f9\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:05:49 crc kubenswrapper[4854]: I1007 14:05:49.009638 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7c13c39f-8f80-4b98-884e-0bde905ab6f9-ceph\") pod \"glance-default-internal-api-0\" (UID: \"7c13c39f-8f80-4b98-884e-0bde905ab6f9\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:05:49 crc kubenswrapper[4854]: I1007 14:05:49.009827 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c13c39f-8f80-4b98-884e-0bde905ab6f9-logs\") pod \"glance-default-internal-api-0\" (UID: \"7c13c39f-8f80-4b98-884e-0bde905ab6f9\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:05:49 crc kubenswrapper[4854]: I1007 14:05:49.030502 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c13c39f-8f80-4b98-884e-0bde905ab6f9-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7c13c39f-8f80-4b98-884e-0bde905ab6f9\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:05:49 crc kubenswrapper[4854]: I1007 
14:05:49.037221 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c13c39f-8f80-4b98-884e-0bde905ab6f9-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7c13c39f-8f80-4b98-884e-0bde905ab6f9\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:05:49 crc kubenswrapper[4854]: I1007 14:05:49.038017 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7c13c39f-8f80-4b98-884e-0bde905ab6f9-ceph\") pod \"glance-default-internal-api-0\" (UID: \"7c13c39f-8f80-4b98-884e-0bde905ab6f9\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:05:49 crc kubenswrapper[4854]: I1007 14:05:49.042346 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zr98l\" (UniqueName: \"kubernetes.io/projected/7c13c39f-8f80-4b98-884e-0bde905ab6f9-kube-api-access-zr98l\") pod \"glance-default-internal-api-0\" (UID: \"7c13c39f-8f80-4b98-884e-0bde905ab6f9\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:05:49 crc kubenswrapper[4854]: I1007 14:05:49.062950 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c13c39f-8f80-4b98-884e-0bde905ab6f9-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7c13c39f-8f80-4b98-884e-0bde905ab6f9\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:05:49 crc kubenswrapper[4854]: I1007 14:05:49.216260 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 14:05:50 crc kubenswrapper[4854]: I1007 14:05:50.717653 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e9c69fe-fbae-4bb3-9b45-99adc96401fb" path="/var/lib/kubelet/pods/6e9c69fe-fbae-4bb3-9b45-99adc96401fb/volumes" Oct 07 14:05:53 crc kubenswrapper[4854]: I1007 14:05:53.044742 4854 scope.go:117] "RemoveContainer" containerID="5650f623e058e72a787570039c9691294f481f5d28b04702b86e50542b55c0e7" Oct 07 14:05:53 crc kubenswrapper[4854]: I1007 14:05:53.187284 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 14:05:53 crc kubenswrapper[4854]: I1007 14:05:53.303196 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e598d141-22f5-4f79-8f6f-ca3f5051e894-config-data\") pod \"e598d141-22f5-4f79-8f6f-ca3f5051e894\" (UID: \"e598d141-22f5-4f79-8f6f-ca3f5051e894\") " Oct 07 14:05:53 crc kubenswrapper[4854]: I1007 14:05:53.303631 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e598d141-22f5-4f79-8f6f-ca3f5051e894-combined-ca-bundle\") pod \"e598d141-22f5-4f79-8f6f-ca3f5051e894\" (UID: \"e598d141-22f5-4f79-8f6f-ca3f5051e894\") " Oct 07 14:05:53 crc kubenswrapper[4854]: I1007 14:05:53.303712 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e598d141-22f5-4f79-8f6f-ca3f5051e894-scripts\") pod \"e598d141-22f5-4f79-8f6f-ca3f5051e894\" (UID: \"e598d141-22f5-4f79-8f6f-ca3f5051e894\") " Oct 07 14:05:53 crc kubenswrapper[4854]: I1007 14:05:53.303845 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e598d141-22f5-4f79-8f6f-ca3f5051e894-httpd-run\") pod \"e598d141-22f5-4f79-8f6f-ca3f5051e894\" (UID: \"e598d141-22f5-4f79-8f6f-ca3f5051e894\") " Oct 07 14:05:53 crc kubenswrapper[4854]: I1007 14:05:53.303865 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e598d141-22f5-4f79-8f6f-ca3f5051e894-logs\") pod \"e598d141-22f5-4f79-8f6f-ca3f5051e894\" (UID: \"e598d141-22f5-4f79-8f6f-ca3f5051e894\") " Oct 07 14:05:53 crc kubenswrapper[4854]: I1007 14:05:53.303938 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e598d141-22f5-4f79-8f6f-ca3f5051e894-ceph\") pod \"e598d141-22f5-4f79-8f6f-ca3f5051e894\" (UID: \"e598d141-22f5-4f79-8f6f-ca3f5051e894\") " Oct 07 14:05:53 crc kubenswrapper[4854]: I1007 14:05:53.303965 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9j9z\" (UniqueName: \"kubernetes.io/projected/e598d141-22f5-4f79-8f6f-ca3f5051e894-kube-api-access-j9j9z\") pod \"e598d141-22f5-4f79-8f6f-ca3f5051e894\" (UID: \"e598d141-22f5-4f79-8f6f-ca3f5051e894\") " Oct 07 14:05:53 crc kubenswrapper[4854]: I1007 14:05:53.304636 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e598d141-22f5-4f79-8f6f-ca3f5051e894-logs" (OuterVolumeSpecName: "logs") pod "e598d141-22f5-4f79-8f6f-ca3f5051e894" (UID: "e598d141-22f5-4f79-8f6f-ca3f5051e894"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:05:53 crc kubenswrapper[4854]: I1007 14:05:53.304786 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e598d141-22f5-4f79-8f6f-ca3f5051e894-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e598d141-22f5-4f79-8f6f-ca3f5051e894" (UID: "e598d141-22f5-4f79-8f6f-ca3f5051e894"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:05:53 crc kubenswrapper[4854]: I1007 14:05:53.313634 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e598d141-22f5-4f79-8f6f-ca3f5051e894-ceph" (OuterVolumeSpecName: "ceph") pod "e598d141-22f5-4f79-8f6f-ca3f5051e894" (UID: "e598d141-22f5-4f79-8f6f-ca3f5051e894"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:05:53 crc kubenswrapper[4854]: I1007 14:05:53.322452 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e598d141-22f5-4f79-8f6f-ca3f5051e894-kube-api-access-j9j9z" (OuterVolumeSpecName: "kube-api-access-j9j9z") pod "e598d141-22f5-4f79-8f6f-ca3f5051e894" (UID: "e598d141-22f5-4f79-8f6f-ca3f5051e894"). InnerVolumeSpecName "kube-api-access-j9j9z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:05:53 crc kubenswrapper[4854]: I1007 14:05:53.324098 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e598d141-22f5-4f79-8f6f-ca3f5051e894-scripts" (OuterVolumeSpecName: "scripts") pod "e598d141-22f5-4f79-8f6f-ca3f5051e894" (UID: "e598d141-22f5-4f79-8f6f-ca3f5051e894"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:05:53 crc kubenswrapper[4854]: I1007 14:05:53.398020 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e598d141-22f5-4f79-8f6f-ca3f5051e894-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e598d141-22f5-4f79-8f6f-ca3f5051e894" (UID: "e598d141-22f5-4f79-8f6f-ca3f5051e894"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:05:53 crc kubenswrapper[4854]: I1007 14:05:53.407678 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e598d141-22f5-4f79-8f6f-ca3f5051e894-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:05:53 crc kubenswrapper[4854]: I1007 14:05:53.407727 4854 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e598d141-22f5-4f79-8f6f-ca3f5051e894-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 14:05:53 crc kubenswrapper[4854]: I1007 14:05:53.407739 4854 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e598d141-22f5-4f79-8f6f-ca3f5051e894-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 07 14:05:53 crc kubenswrapper[4854]: I1007 14:05:53.407747 4854 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e598d141-22f5-4f79-8f6f-ca3f5051e894-logs\") on node \"crc\" DevicePath \"\"" Oct 07 14:05:53 crc kubenswrapper[4854]: I1007 14:05:53.407755 4854 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e598d141-22f5-4f79-8f6f-ca3f5051e894-ceph\") on node \"crc\" DevicePath \"\"" Oct 07 14:05:53 crc kubenswrapper[4854]: I1007 14:05:53.407766 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9j9z\" (UniqueName: \"kubernetes.io/projected/e598d141-22f5-4f79-8f6f-ca3f5051e894-kube-api-access-j9j9z\") on node \"crc\" DevicePath \"\"" Oct 07 14:05:53 crc kubenswrapper[4854]: I1007 14:05:53.416861 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/e598d141-22f5-4f79-8f6f-ca3f5051e894-config-data" (OuterVolumeSpecName: "config-data") pod "e598d141-22f5-4f79-8f6f-ca3f5051e894" (UID: "e598d141-22f5-4f79-8f6f-ca3f5051e894"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:05:53 crc kubenswrapper[4854]: I1007 14:05:53.509563 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e598d141-22f5-4f79-8f6f-ca3f5051e894-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:05:53 crc kubenswrapper[4854]: I1007 14:05:53.755140 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 14:05:53 crc kubenswrapper[4854]: I1007 14:05:53.884647 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7cc8fddb6f-hjsgw" event={"ID":"a9fc8309-5f00-4acb-b0c7-0fb734d99c5e","Type":"ContainerStarted","Data":"387346c2ce812d489e65fd622f0f8faa465e508c13fb53dddfff2849c57cb6d7"} Oct 07 14:05:53 crc kubenswrapper[4854]: I1007 14:05:53.884701 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7cc8fddb6f-hjsgw" event={"ID":"a9fc8309-5f00-4acb-b0c7-0fb734d99c5e","Type":"ContainerStarted","Data":"37abe06239a795e4745b619a4e4d84a4a95ee98a58d5f2e89752c7a2c7f8585d"} Oct 07 14:05:53 crc kubenswrapper[4854]: I1007 14:05:53.884809 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7cc8fddb6f-hjsgw" podUID="a9fc8309-5f00-4acb-b0c7-0fb734d99c5e" containerName="horizon-log" containerID="cri-o://37abe06239a795e4745b619a4e4d84a4a95ee98a58d5f2e89752c7a2c7f8585d" gracePeriod=30 Oct 07 14:05:53 crc kubenswrapper[4854]: I1007 14:05:53.884857 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7cc8fddb6f-hjsgw" podUID="a9fc8309-5f00-4acb-b0c7-0fb734d99c5e" containerName="horizon" containerID="cri-o://387346c2ce812d489e65fd622f0f8faa465e508c13fb53dddfff2849c57cb6d7" gracePeriod=30 Oct 07 14:05:53 crc kubenswrapper[4854]: I1007 14:05:53.887166 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-78589d7845-lb7r8" event={"ID":"7e937537-3ce6-44a9-80ca-d1378ef14544","Type":"ContainerStarted","Data":"17f219ef799682b0ffa1592f854acfcfbf6d9d555d1fad79287bf1069b1a7b4e"} Oct 07 14:05:53 crc kubenswrapper[4854]: I1007 14:05:53.887277 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-78589d7845-lb7r8" event={"ID":"7e937537-3ce6-44a9-80ca-d1378ef14544","Type":"ContainerStarted","Data":"5c91101980872222349f6cba0be8173f4ae268164c902d12efd539b19654d539"} Oct 07 14:05:53 crc kubenswrapper[4854]: I1007 14:05:53.890632 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7c13c39f-8f80-4b98-884e-0bde905ab6f9","Type":"ContainerStarted","Data":"888b0f810bbef7da46feec163a7092d544c068ab6feb0c7714f33898289ca9e2"} Oct 07 14:05:53 crc kubenswrapper[4854]: I1007 14:05:53.892911 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6667b4cddf-7z9f7" event={"ID":"a79a49ce-66fc-404d-ba1d-373e0daa3f3a","Type":"ContainerStarted","Data":"1234847130cdb06fc581186759c2f156c559bf8e4cc97a9589035ebe58722d3b"} Oct 07 14:05:53 crc kubenswrapper[4854]: I1007 14:05:53.892938 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6667b4cddf-7z9f7" 
event={"ID":"a79a49ce-66fc-404d-ba1d-373e0daa3f3a","Type":"ContainerStarted","Data":"b08576167ff133bf6d1ed3fc2104769174c41a3c723b104798e9400969c310b1"} Oct 07 14:05:53 crc kubenswrapper[4854]: I1007 14:05:53.896907 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e598d141-22f5-4f79-8f6f-ca3f5051e894","Type":"ContainerDied","Data":"cbb3222cd52901334f071a9da5dee43576b9aba82ab3e08bc733a55f2bd28c6d"} Oct 07 14:05:53 crc kubenswrapper[4854]: I1007 14:05:53.896945 4854 scope.go:117] "RemoveContainer" containerID="169ec0c025bfa7ffec1106bc9a13be9fbd77d7b8286f71799c28c7c03dc9a0eb" Oct 07 14:05:53 crc kubenswrapper[4854]: I1007 14:05:53.897021 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 14:05:53 crc kubenswrapper[4854]: I1007 14:05:53.906963 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7cc8fddb6f-hjsgw" podStartSLOduration=2.547016206 podStartE2EDuration="10.906942745s" podCreationTimestamp="2025-10-07 14:05:43 +0000 UTC" firstStartedPulling="2025-10-07 14:05:44.795788842 +0000 UTC m=+6060.783621097" lastFinishedPulling="2025-10-07 14:05:53.155715341 +0000 UTC m=+6069.143547636" observedRunningTime="2025-10-07 14:05:53.902678313 +0000 UTC m=+6069.890510578" watchObservedRunningTime="2025-10-07 14:05:53.906942745 +0000 UTC m=+6069.894774990" Oct 07 14:05:53 crc kubenswrapper[4854]: I1007 14:05:53.938158 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6667b4cddf-7z9f7" podStartSLOduration=1.8125856360000001 podStartE2EDuration="9.938126561s" podCreationTimestamp="2025-10-07 14:05:44 +0000 UTC" firstStartedPulling="2025-10-07 14:05:45.117493025 +0000 UTC m=+6061.105325280" lastFinishedPulling="2025-10-07 14:05:53.24303393 +0000 UTC m=+6069.230866205" observedRunningTime="2025-10-07 14:05:53.931558833 +0000 UTC m=+6069.919391108" watchObservedRunningTime="2025-10-07 14:05:53.938126561 +0000 UTC m=+6069.925958816" Oct 07 14:05:53 crc kubenswrapper[4854]: I1007 14:05:53.940217 4854 scope.go:117] "RemoveContainer" containerID="b87e87cd1bf9464c9fcfc1bb9b5ffa1bc656a45581b40934961e33158b1b568e" Oct 07 14:05:53 crc kubenswrapper[4854]: I1007 14:05:53.951545 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-78589d7845-lb7r8" podStartSLOduration=2.36468691 podStartE2EDuration="9.951523786s" podCreationTimestamp="2025-10-07 14:05:44 +0000 UTC" firstStartedPulling="2025-10-07 14:05:45.567382022 +0000 UTC m=+6061.555214277" lastFinishedPulling="2025-10-07 14:05:53.154218888 +0000 UTC m=+6069.142051153" observedRunningTime="2025-10-07 14:05:53.946465021 +0000 UTC m=+6069.934297276" watchObservedRunningTime="2025-10-07 14:05:53.951523786 +0000 UTC m=+6069.939356051" Oct 07 14:05:53 crc kubenswrapper[4854]: I1007 14:05:53.974861 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 14:05:53 crc kubenswrapper[4854]: I1007 14:05:53.987099 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 14:05:53 crc kubenswrapper[4854]: I1007 14:05:53.996104 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 14:05:53 crc kubenswrapper[4854]: E1007 14:05:53.996698 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e598d141-22f5-4f79-8f6f-ca3f5051e894" 
containerName="glance-log" Oct 07 14:05:53 crc kubenswrapper[4854]: I1007 14:05:53.996763 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="e598d141-22f5-4f79-8f6f-ca3f5051e894" containerName="glance-log" Oct 07 14:05:53 crc kubenswrapper[4854]: E1007 14:05:53.996867 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e598d141-22f5-4f79-8f6f-ca3f5051e894" containerName="glance-httpd" Oct 07 14:05:53 crc kubenswrapper[4854]: I1007 14:05:53.996929 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="e598d141-22f5-4f79-8f6f-ca3f5051e894" containerName="glance-httpd" Oct 07 14:05:53 crc kubenswrapper[4854]: I1007 14:05:53.997270 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="e598d141-22f5-4f79-8f6f-ca3f5051e894" containerName="glance-httpd" Oct 07 14:05:53 crc kubenswrapper[4854]: I1007 14:05:53.997373 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="e598d141-22f5-4f79-8f6f-ca3f5051e894" containerName="glance-log" Oct 07 14:05:53 crc kubenswrapper[4854]: I1007 14:05:53.998660 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 14:05:54 crc kubenswrapper[4854]: I1007 14:05:54.001764 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 07 14:05:54 crc kubenswrapper[4854]: I1007 14:05:54.008940 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 14:05:54 crc kubenswrapper[4854]: I1007 14:05:54.120239 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf9aa2b1-2b05-44ab-acfc-09927fda7603-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cf9aa2b1-2b05-44ab-acfc-09927fda7603\") " pod="openstack/glance-default-external-api-0" Oct 07 14:05:54 crc kubenswrapper[4854]: I1007 14:05:54.121022 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf9aa2b1-2b05-44ab-acfc-09927fda7603-logs\") pod \"glance-default-external-api-0\" (UID: \"cf9aa2b1-2b05-44ab-acfc-09927fda7603\") " pod="openstack/glance-default-external-api-0" Oct 07 14:05:54 crc kubenswrapper[4854]: I1007 14:05:54.121065 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf9aa2b1-2b05-44ab-acfc-09927fda7603-scripts\") pod \"glance-default-external-api-0\" (UID: \"cf9aa2b1-2b05-44ab-acfc-09927fda7603\") " pod="openstack/glance-default-external-api-0" Oct 07 14:05:54 crc kubenswrapper[4854]: I1007 14:05:54.121094 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cf9aa2b1-2b05-44ab-acfc-09927fda7603-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cf9aa2b1-2b05-44ab-acfc-09927fda7603\") " pod="openstack/glance-default-external-api-0" Oct 07 14:05:54 crc kubenswrapper[4854]: I1007 14:05:54.121201 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf9aa2b1-2b05-44ab-acfc-09927fda7603-config-data\") pod \"glance-default-external-api-0\" (UID: \"cf9aa2b1-2b05-44ab-acfc-09927fda7603\") " pod="openstack/glance-default-external-api-0" Oct 07 14:05:54 crc 
kubenswrapper[4854]: I1007 14:05:54.121245 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhnd5\" (UniqueName: \"kubernetes.io/projected/cf9aa2b1-2b05-44ab-acfc-09927fda7603-kube-api-access-xhnd5\") pod \"glance-default-external-api-0\" (UID: \"cf9aa2b1-2b05-44ab-acfc-09927fda7603\") " pod="openstack/glance-default-external-api-0" Oct 07 14:05:54 crc kubenswrapper[4854]: I1007 14:05:54.121280 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/cf9aa2b1-2b05-44ab-acfc-09927fda7603-ceph\") pod \"glance-default-external-api-0\" (UID: \"cf9aa2b1-2b05-44ab-acfc-09927fda7603\") " pod="openstack/glance-default-external-api-0" Oct 07 14:05:54 crc kubenswrapper[4854]: I1007 14:05:54.223341 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf9aa2b1-2b05-44ab-acfc-09927fda7603-logs\") pod \"glance-default-external-api-0\" (UID: \"cf9aa2b1-2b05-44ab-acfc-09927fda7603\") " pod="openstack/glance-default-external-api-0" Oct 07 14:05:54 crc kubenswrapper[4854]: I1007 14:05:54.223417 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf9aa2b1-2b05-44ab-acfc-09927fda7603-scripts\") pod \"glance-default-external-api-0\" (UID: \"cf9aa2b1-2b05-44ab-acfc-09927fda7603\") " pod="openstack/glance-default-external-api-0" Oct 07 14:05:54 crc kubenswrapper[4854]: I1007 14:05:54.223445 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cf9aa2b1-2b05-44ab-acfc-09927fda7603-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cf9aa2b1-2b05-44ab-acfc-09927fda7603\") " pod="openstack/glance-default-external-api-0" Oct 07 14:05:54 crc kubenswrapper[4854]: I1007 14:05:54.223565 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf9aa2b1-2b05-44ab-acfc-09927fda7603-config-data\") pod \"glance-default-external-api-0\" (UID: \"cf9aa2b1-2b05-44ab-acfc-09927fda7603\") " pod="openstack/glance-default-external-api-0" Oct 07 14:05:54 crc kubenswrapper[4854]: I1007 14:05:54.223606 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhnd5\" (UniqueName: \"kubernetes.io/projected/cf9aa2b1-2b05-44ab-acfc-09927fda7603-kube-api-access-xhnd5\") pod \"glance-default-external-api-0\" (UID: \"cf9aa2b1-2b05-44ab-acfc-09927fda7603\") " pod="openstack/glance-default-external-api-0" Oct 07 14:05:54 crc kubenswrapper[4854]: I1007 14:05:54.223646 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/cf9aa2b1-2b05-44ab-acfc-09927fda7603-ceph\") pod \"glance-default-external-api-0\" (UID: \"cf9aa2b1-2b05-44ab-acfc-09927fda7603\") " pod="openstack/glance-default-external-api-0" Oct 07 14:05:54 crc kubenswrapper[4854]: I1007 14:05:54.223728 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf9aa2b1-2b05-44ab-acfc-09927fda7603-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cf9aa2b1-2b05-44ab-acfc-09927fda7603\") " pod="openstack/glance-default-external-api-0" Oct 07 14:05:54 crc kubenswrapper[4854]: I1007 14:05:54.223978 4854 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cf9aa2b1-2b05-44ab-acfc-09927fda7603-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cf9aa2b1-2b05-44ab-acfc-09927fda7603\") " pod="openstack/glance-default-external-api-0" Oct 07 14:05:54 crc kubenswrapper[4854]: I1007 14:05:54.224049 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf9aa2b1-2b05-44ab-acfc-09927fda7603-logs\") pod \"glance-default-external-api-0\" (UID: \"cf9aa2b1-2b05-44ab-acfc-09927fda7603\") " pod="openstack/glance-default-external-api-0" Oct 07 14:05:54 crc kubenswrapper[4854]: I1007 14:05:54.229786 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/cf9aa2b1-2b05-44ab-acfc-09927fda7603-ceph\") pod \"glance-default-external-api-0\" (UID: \"cf9aa2b1-2b05-44ab-acfc-09927fda7603\") " pod="openstack/glance-default-external-api-0" Oct 07 14:05:54 crc kubenswrapper[4854]: I1007 14:05:54.230103 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf9aa2b1-2b05-44ab-acfc-09927fda7603-scripts\") pod \"glance-default-external-api-0\" (UID: \"cf9aa2b1-2b05-44ab-acfc-09927fda7603\") " pod="openstack/glance-default-external-api-0" Oct 07 14:05:54 crc kubenswrapper[4854]: I1007 14:05:54.231920 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf9aa2b1-2b05-44ab-acfc-09927fda7603-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cf9aa2b1-2b05-44ab-acfc-09927fda7603\") " pod="openstack/glance-default-external-api-0" Oct 07 14:05:54 crc kubenswrapper[4854]: I1007 14:05:54.239849 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhnd5\" (UniqueName: \"kubernetes.io/projected/cf9aa2b1-2b05-44ab-acfc-09927fda7603-kube-api-access-xhnd5\") pod \"glance-default-external-api-0\" (UID: \"cf9aa2b1-2b05-44ab-acfc-09927fda7603\") " pod="openstack/glance-default-external-api-0" Oct 07 14:05:54 crc kubenswrapper[4854]: I1007 14:05:54.245948 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf9aa2b1-2b05-44ab-acfc-09927fda7603-config-data\") pod \"glance-default-external-api-0\" (UID: \"cf9aa2b1-2b05-44ab-acfc-09927fda7603\") " pod="openstack/glance-default-external-api-0" Oct 07 14:05:54 crc kubenswrapper[4854]: I1007 14:05:54.285204 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7cc8fddb6f-hjsgw" Oct 07 14:05:54 crc kubenswrapper[4854]: I1007 14:05:54.406480 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 14:05:54 crc kubenswrapper[4854]: I1007 14:05:54.521895 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6667b4cddf-7z9f7" Oct 07 14:05:54 crc kubenswrapper[4854]: I1007 14:05:54.522232 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6667b4cddf-7z9f7" Oct 07 14:05:54 crc kubenswrapper[4854]: I1007 14:05:54.731645 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e598d141-22f5-4f79-8f6f-ca3f5051e894" path="/var/lib/kubelet/pods/e598d141-22f5-4f79-8f6f-ca3f5051e894/volumes" Oct 07 14:05:54 crc kubenswrapper[4854]: I1007 14:05:54.909865 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7c13c39f-8f80-4b98-884e-0bde905ab6f9","Type":"ContainerStarted","Data":"b195cdb8ec3300923211fe9f7fb8cf8d1a2024eef6f01224eb6a41e57c3a2f89"} Oct 07 14:05:54 crc kubenswrapper[4854]: I1007 14:05:54.975424 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 14:05:55 crc kubenswrapper[4854]: I1007 14:05:55.098599 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-78589d7845-lb7r8" Oct 07 14:05:55 crc kubenswrapper[4854]: I1007 14:05:55.098653 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-78589d7845-lb7r8" Oct 07 14:05:55 crc kubenswrapper[4854]: I1007 14:05:55.929909 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7c13c39f-8f80-4b98-884e-0bde905ab6f9","Type":"ContainerStarted","Data":"fa77876a0bc5ad402c23305b65c6f431dee5be78e4ea9254c106ca7a26f80345"} Oct 07 14:05:55 crc kubenswrapper[4854]: I1007 14:05:55.933611 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cf9aa2b1-2b05-44ab-acfc-09927fda7603","Type":"ContainerStarted","Data":"af5ea9c6c7457c8a5d02d56e5a371c95e23ca831786bf4a2227f181ff74d5038"} Oct 07 14:05:55 crc kubenswrapper[4854]: I1007 14:05:55.933687 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cf9aa2b1-2b05-44ab-acfc-09927fda7603","Type":"ContainerStarted","Data":"937b5eb520b6edaa342fd7c71497e349c8eee2a7620c697f7d2ea63deab49f86"} Oct 07 14:05:55 crc kubenswrapper[4854]: I1007 14:05:55.968852 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=7.968832047 podStartE2EDuration="7.968832047s" podCreationTimestamp="2025-10-07 14:05:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:05:55.951951782 +0000 UTC m=+6071.939784037" watchObservedRunningTime="2025-10-07 14:05:55.968832047 +0000 UTC m=+6071.956664302" Oct 07 14:05:56 crc kubenswrapper[4854]: I1007 14:05:56.943694 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cf9aa2b1-2b05-44ab-acfc-09927fda7603","Type":"ContainerStarted","Data":"2c520eaa1f073eb5e101ed5d189c47c37f7c487287e11517952f0d5572bc5b09"} Oct 07 14:05:56 crc kubenswrapper[4854]: I1007 14:05:56.972864 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" 
podStartSLOduration=3.9728412349999997 podStartE2EDuration="3.972841235s" podCreationTimestamp="2025-10-07 14:05:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:05:56.9702241 +0000 UTC m=+6072.958056375" watchObservedRunningTime="2025-10-07 14:05:56.972841235 +0000 UTC m=+6072.960673480" Oct 07 14:05:59 crc kubenswrapper[4854]: I1007 14:05:59.217298 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 07 14:05:59 crc kubenswrapper[4854]: I1007 14:05:59.218353 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 07 14:05:59 crc kubenswrapper[4854]: I1007 14:05:59.254482 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 07 14:05:59 crc kubenswrapper[4854]: I1007 14:05:59.292863 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 07 14:05:59 crc kubenswrapper[4854]: I1007 14:05:59.988986 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 07 14:05:59 crc kubenswrapper[4854]: I1007 14:05:59.989534 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 07 14:06:00 crc kubenswrapper[4854]: I1007 14:06:00.703983 4854 scope.go:117] "RemoveContainer" containerID="4fe613980076ca3ed8559fe0e286e39c2d0d1c3badb8354aa3600a7ada37572b" Oct 07 14:06:00 crc kubenswrapper[4854]: E1007 14:06:00.704324 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:06:02 crc kubenswrapper[4854]: I1007 14:06:02.026960 4854 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 07 14:06:02 crc kubenswrapper[4854]: I1007 14:06:02.262859 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 07 14:06:02 crc kubenswrapper[4854]: I1007 14:06:02.265831 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 07 14:06:04 crc kubenswrapper[4854]: I1007 14:06:04.406738 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 07 14:06:04 crc kubenswrapper[4854]: I1007 14:06:04.407484 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 07 14:06:04 crc kubenswrapper[4854]: I1007 14:06:04.446959 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 07 14:06:04 crc kubenswrapper[4854]: I1007 14:06:04.461641 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 07 14:06:04 crc kubenswrapper[4854]: I1007 14:06:04.520931 4854 prober.go:107] "Probe failed" probeType="Startup" 
pod="openstack/horizon-6667b4cddf-7z9f7" podUID="a79a49ce-66fc-404d-ba1d-373e0daa3f3a" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.114:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.114:8080: connect: connection refused" Oct 07 14:06:05 crc kubenswrapper[4854]: I1007 14:06:05.067512 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 07 14:06:05 crc kubenswrapper[4854]: I1007 14:06:05.067844 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 07 14:06:05 crc kubenswrapper[4854]: I1007 14:06:05.100309 4854 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-78589d7845-lb7r8" podUID="7e937537-3ce6-44a9-80ca-d1378ef14544" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.115:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.115:8080: connect: connection refused" Oct 07 14:06:06 crc kubenswrapper[4854]: I1007 14:06:06.999157 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 07 14:06:07 crc kubenswrapper[4854]: I1007 14:06:07.000117 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 07 14:06:14 crc kubenswrapper[4854]: I1007 14:06:14.710386 4854 scope.go:117] "RemoveContainer" containerID="4fe613980076ca3ed8559fe0e286e39c2d0d1c3badb8354aa3600a7ada37572b" Oct 07 14:06:14 crc kubenswrapper[4854]: E1007 14:06:14.711121 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:06:15 crc kubenswrapper[4854]: I1007 14:06:15.062038 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-58fb9"] Oct 07 14:06:15 crc kubenswrapper[4854]: I1007 14:06:15.074198 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-58fb9"] Oct 07 14:06:16 crc kubenswrapper[4854]: I1007 14:06:16.382736 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-6667b4cddf-7z9f7" Oct 07 14:06:16 crc kubenswrapper[4854]: I1007 14:06:16.739966 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32f4a848-c854-4263-98be-8d6ac718eff6" path="/var/lib/kubelet/pods/32f4a848-c854-4263-98be-8d6ac718eff6/volumes" Oct 07 14:06:16 crc kubenswrapper[4854]: I1007 14:06:16.993029 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-78589d7845-lb7r8" Oct 07 14:06:18 crc kubenswrapper[4854]: I1007 14:06:18.157394 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-6667b4cddf-7z9f7" Oct 07 14:06:18 crc kubenswrapper[4854]: I1007 14:06:18.624720 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-78589d7845-lb7r8" Oct 07 14:06:18 crc kubenswrapper[4854]: I1007 14:06:18.686918 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6667b4cddf-7z9f7"] Oct 07 14:06:18 crc kubenswrapper[4854]: I1007 
14:06:18.687592 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6667b4cddf-7z9f7" podUID="a79a49ce-66fc-404d-ba1d-373e0daa3f3a" containerName="horizon-log" containerID="cri-o://b08576167ff133bf6d1ed3fc2104769174c41a3c723b104798e9400969c310b1" gracePeriod=30 Oct 07 14:06:18 crc kubenswrapper[4854]: I1007 14:06:18.687675 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6667b4cddf-7z9f7" podUID="a79a49ce-66fc-404d-ba1d-373e0daa3f3a" containerName="horizon" containerID="cri-o://1234847130cdb06fc581186759c2f156c559bf8e4cc97a9589035ebe58722d3b" gracePeriod=30 Oct 07 14:06:22 crc kubenswrapper[4854]: I1007 14:06:22.308686 4854 generic.go:334] "Generic (PLEG): container finished" podID="a79a49ce-66fc-404d-ba1d-373e0daa3f3a" containerID="1234847130cdb06fc581186759c2f156c559bf8e4cc97a9589035ebe58722d3b" exitCode=0 Oct 07 14:06:22 crc kubenswrapper[4854]: I1007 14:06:22.308743 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6667b4cddf-7z9f7" event={"ID":"a79a49ce-66fc-404d-ba1d-373e0daa3f3a","Type":"ContainerDied","Data":"1234847130cdb06fc581186759c2f156c559bf8e4cc97a9589035ebe58722d3b"} Oct 07 14:06:23 crc kubenswrapper[4854]: I1007 14:06:23.736653 4854 scope.go:117] "RemoveContainer" containerID="3aacd9218c357a9a595eb9b8f50e2ccdf5c28c8a92851849b885bdc36a1d7cf7" Oct 07 14:06:23 crc kubenswrapper[4854]: I1007 14:06:23.772855 4854 scope.go:117] "RemoveContainer" containerID="9417fa9bccf9488ffc05b9934733eeb027211979eb3310c30edda1b45ad53f81" Oct 07 14:06:23 crc kubenswrapper[4854]: I1007 14:06:23.844474 4854 scope.go:117] "RemoveContainer" containerID="39540665e851bdcde17936534d7635fc62c065a102d446e32338dfbc58e5ced0" Oct 07 14:06:23 crc kubenswrapper[4854]: I1007 14:06:23.896335 4854 scope.go:117] "RemoveContainer" containerID="4e44f53ef93e55b13229f45f8c6a7879f7ee6658377eb002d71600ec998e067c" Oct 07 14:06:23 crc kubenswrapper[4854]: I1007 14:06:23.944523 4854 scope.go:117] "RemoveContainer" containerID="b895906b4b9e14129973bb658195cdd113392aed549dac8ab95bdd59464938db" Oct 07 14:06:24 crc kubenswrapper[4854]: I1007 14:06:24.073741 4854 scope.go:117] "RemoveContainer" containerID="1871b936d0f6ff5f03623762fc356d0693e8dd110d47b103caa243cd30043372" Oct 07 14:06:24 crc kubenswrapper[4854]: I1007 14:06:24.331067 4854 generic.go:334] "Generic (PLEG): container finished" podID="a9fc8309-5f00-4acb-b0c7-0fb734d99c5e" containerID="387346c2ce812d489e65fd622f0f8faa465e508c13fb53dddfff2849c57cb6d7" exitCode=137 Oct 07 14:06:24 crc kubenswrapper[4854]: I1007 14:06:24.331369 4854 generic.go:334] "Generic (PLEG): container finished" podID="a9fc8309-5f00-4acb-b0c7-0fb734d99c5e" containerID="37abe06239a795e4745b619a4e4d84a4a95ee98a58d5f2e89752c7a2c7f8585d" exitCode=137 Oct 07 14:06:24 crc kubenswrapper[4854]: I1007 14:06:24.331139 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7cc8fddb6f-hjsgw" event={"ID":"a9fc8309-5f00-4acb-b0c7-0fb734d99c5e","Type":"ContainerDied","Data":"387346c2ce812d489e65fd622f0f8faa465e508c13fb53dddfff2849c57cb6d7"} Oct 07 14:06:24 crc kubenswrapper[4854]: I1007 14:06:24.331407 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7cc8fddb6f-hjsgw" event={"ID":"a9fc8309-5f00-4acb-b0c7-0fb734d99c5e","Type":"ContainerDied","Data":"37abe06239a795e4745b619a4e4d84a4a95ee98a58d5f2e89752c7a2c7f8585d"} Oct 07 14:06:24 crc kubenswrapper[4854]: I1007 14:06:24.522346 4854 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/horizon-6667b4cddf-7z9f7" podUID="a79a49ce-66fc-404d-ba1d-373e0daa3f3a" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.114:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.114:8080: connect: connection refused" Oct 07 14:06:24 crc kubenswrapper[4854]: I1007 14:06:24.882372 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7cc8fddb6f-hjsgw" Oct 07 14:06:24 crc kubenswrapper[4854]: I1007 14:06:24.958550 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a9fc8309-5f00-4acb-b0c7-0fb734d99c5e-horizon-secret-key\") pod \"a9fc8309-5f00-4acb-b0c7-0fb734d99c5e\" (UID: \"a9fc8309-5f00-4acb-b0c7-0fb734d99c5e\") " Oct 07 14:06:24 crc kubenswrapper[4854]: I1007 14:06:24.958651 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9fc8309-5f00-4acb-b0c7-0fb734d99c5e-logs\") pod \"a9fc8309-5f00-4acb-b0c7-0fb734d99c5e\" (UID: \"a9fc8309-5f00-4acb-b0c7-0fb734d99c5e\") " Oct 07 14:06:24 crc kubenswrapper[4854]: I1007 14:06:24.958733 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a9fc8309-5f00-4acb-b0c7-0fb734d99c5e-scripts\") pod \"a9fc8309-5f00-4acb-b0c7-0fb734d99c5e\" (UID: \"a9fc8309-5f00-4acb-b0c7-0fb734d99c5e\") " Oct 07 14:06:24 crc kubenswrapper[4854]: I1007 14:06:24.958822 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a9fc8309-5f00-4acb-b0c7-0fb734d99c5e-config-data\") pod \"a9fc8309-5f00-4acb-b0c7-0fb734d99c5e\" (UID: \"a9fc8309-5f00-4acb-b0c7-0fb734d99c5e\") " Oct 07 14:06:24 crc kubenswrapper[4854]: I1007 14:06:24.958882 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5f49\" (UniqueName: \"kubernetes.io/projected/a9fc8309-5f00-4acb-b0c7-0fb734d99c5e-kube-api-access-h5f49\") pod \"a9fc8309-5f00-4acb-b0c7-0fb734d99c5e\" (UID: \"a9fc8309-5f00-4acb-b0c7-0fb734d99c5e\") " Oct 07 14:06:24 crc kubenswrapper[4854]: I1007 14:06:24.959480 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9fc8309-5f00-4acb-b0c7-0fb734d99c5e-logs" (OuterVolumeSpecName: "logs") pod "a9fc8309-5f00-4acb-b0c7-0fb734d99c5e" (UID: "a9fc8309-5f00-4acb-b0c7-0fb734d99c5e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:06:24 crc kubenswrapper[4854]: I1007 14:06:24.964212 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9fc8309-5f00-4acb-b0c7-0fb734d99c5e-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "a9fc8309-5f00-4acb-b0c7-0fb734d99c5e" (UID: "a9fc8309-5f00-4acb-b0c7-0fb734d99c5e"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:06:24 crc kubenswrapper[4854]: I1007 14:06:24.964391 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9fc8309-5f00-4acb-b0c7-0fb734d99c5e-kube-api-access-h5f49" (OuterVolumeSpecName: "kube-api-access-h5f49") pod "a9fc8309-5f00-4acb-b0c7-0fb734d99c5e" (UID: "a9fc8309-5f00-4acb-b0c7-0fb734d99c5e"). InnerVolumeSpecName "kube-api-access-h5f49". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:06:24 crc kubenswrapper[4854]: I1007 14:06:24.993502 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9fc8309-5f00-4acb-b0c7-0fb734d99c5e-scripts" (OuterVolumeSpecName: "scripts") pod "a9fc8309-5f00-4acb-b0c7-0fb734d99c5e" (UID: "a9fc8309-5f00-4acb-b0c7-0fb734d99c5e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:06:25 crc kubenswrapper[4854]: I1007 14:06:25.001137 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9fc8309-5f00-4acb-b0c7-0fb734d99c5e-config-data" (OuterVolumeSpecName: "config-data") pod "a9fc8309-5f00-4acb-b0c7-0fb734d99c5e" (UID: "a9fc8309-5f00-4acb-b0c7-0fb734d99c5e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:06:25 crc kubenswrapper[4854]: I1007 14:06:25.030384 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-336c-account-create-679lq"] Oct 07 14:06:25 crc kubenswrapper[4854]: I1007 14:06:25.038177 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-336c-account-create-679lq"] Oct 07 14:06:25 crc kubenswrapper[4854]: I1007 14:06:25.061395 4854 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a9fc8309-5f00-4acb-b0c7-0fb734d99c5e-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 07 14:06:25 crc kubenswrapper[4854]: I1007 14:06:25.061423 4854 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9fc8309-5f00-4acb-b0c7-0fb734d99c5e-logs\") on node \"crc\" DevicePath \"\"" Oct 07 14:06:25 crc kubenswrapper[4854]: I1007 14:06:25.061434 4854 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a9fc8309-5f00-4acb-b0c7-0fb734d99c5e-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 14:06:25 crc kubenswrapper[4854]: I1007 14:06:25.061442 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a9fc8309-5f00-4acb-b0c7-0fb734d99c5e-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:06:25 crc kubenswrapper[4854]: I1007 14:06:25.061456 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5f49\" (UniqueName: \"kubernetes.io/projected/a9fc8309-5f00-4acb-b0c7-0fb734d99c5e-kube-api-access-h5f49\") on node \"crc\" DevicePath \"\"" Oct 07 14:06:25 crc kubenswrapper[4854]: I1007 14:06:25.342966 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7cc8fddb6f-hjsgw" event={"ID":"a9fc8309-5f00-4acb-b0c7-0fb734d99c5e","Type":"ContainerDied","Data":"ba8a66d62c6bcd98a204eeaf8893250326469e0ef003afb4db209f36e2f980c0"} Oct 07 14:06:25 crc kubenswrapper[4854]: I1007 14:06:25.343014 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7cc8fddb6f-hjsgw" Oct 07 14:06:25 crc kubenswrapper[4854]: I1007 14:06:25.343030 4854 scope.go:117] "RemoveContainer" containerID="387346c2ce812d489e65fd622f0f8faa465e508c13fb53dddfff2849c57cb6d7" Oct 07 14:06:25 crc kubenswrapper[4854]: I1007 14:06:25.377538 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7cc8fddb6f-hjsgw"] Oct 07 14:06:25 crc kubenswrapper[4854]: I1007 14:06:25.385375 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7cc8fddb6f-hjsgw"] Oct 07 14:06:25 crc kubenswrapper[4854]: I1007 14:06:25.569314 4854 scope.go:117] "RemoveContainer" containerID="37abe06239a795e4745b619a4e4d84a4a95ee98a58d5f2e89752c7a2c7f8585d" Oct 07 14:06:25 crc kubenswrapper[4854]: I1007 14:06:25.702554 4854 scope.go:117] "RemoveContainer" containerID="4fe613980076ca3ed8559fe0e286e39c2d0d1c3badb8354aa3600a7ada37572b" Oct 07 14:06:25 crc kubenswrapper[4854]: E1007 14:06:25.702983 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:06:26 crc kubenswrapper[4854]: I1007 14:06:26.713349 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8de4cb82-f1ed-484b-ad54-53bab54436c6" path="/var/lib/kubelet/pods/8de4cb82-f1ed-484b-ad54-53bab54436c6/volumes" Oct 07 14:06:26 crc kubenswrapper[4854]: I1007 14:06:26.715345 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9fc8309-5f00-4acb-b0c7-0fb734d99c5e" path="/var/lib/kubelet/pods/a9fc8309-5f00-4acb-b0c7-0fb734d99c5e/volumes" Oct 07 14:06:32 crc kubenswrapper[4854]: I1007 14:06:32.037141 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-7g4fc"] Oct 07 14:06:32 crc kubenswrapper[4854]: I1007 14:06:32.048079 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-7g4fc"] Oct 07 14:06:32 crc kubenswrapper[4854]: I1007 14:06:32.713251 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a15fef45-0955-4710-9c77-a73aea90e94a" path="/var/lib/kubelet/pods/a15fef45-0955-4710-9c77-a73aea90e94a/volumes" Oct 07 14:06:34 crc kubenswrapper[4854]: I1007 14:06:34.519180 4854 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6667b4cddf-7z9f7" podUID="a79a49ce-66fc-404d-ba1d-373e0daa3f3a" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.114:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.114:8080: connect: connection refused" Oct 07 14:06:37 crc kubenswrapper[4854]: I1007 14:06:37.703720 4854 scope.go:117] "RemoveContainer" containerID="4fe613980076ca3ed8559fe0e286e39c2d0d1c3badb8354aa3600a7ada37572b" Oct 07 14:06:37 crc kubenswrapper[4854]: E1007 14:06:37.704391 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 
14:06:44 crc kubenswrapper[4854]: I1007 14:06:44.519837 4854 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6667b4cddf-7z9f7" podUID="a79a49ce-66fc-404d-ba1d-373e0daa3f3a" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.114:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.114:8080: connect: connection refused" Oct 07 14:06:44 crc kubenswrapper[4854]: I1007 14:06:44.520473 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6667b4cddf-7z9f7" Oct 07 14:06:49 crc kubenswrapper[4854]: I1007 14:06:49.116474 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6667b4cddf-7z9f7" Oct 07 14:06:49 crc kubenswrapper[4854]: I1007 14:06:49.216247 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a79a49ce-66fc-404d-ba1d-373e0daa3f3a-config-data\") pod \"a79a49ce-66fc-404d-ba1d-373e0daa3f3a\" (UID: \"a79a49ce-66fc-404d-ba1d-373e0daa3f3a\") " Oct 07 14:06:49 crc kubenswrapper[4854]: I1007 14:06:49.216309 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a79a49ce-66fc-404d-ba1d-373e0daa3f3a-scripts\") pod \"a79a49ce-66fc-404d-ba1d-373e0daa3f3a\" (UID: \"a79a49ce-66fc-404d-ba1d-373e0daa3f3a\") " Oct 07 14:06:49 crc kubenswrapper[4854]: I1007 14:06:49.216392 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a79a49ce-66fc-404d-ba1d-373e0daa3f3a-logs\") pod \"a79a49ce-66fc-404d-ba1d-373e0daa3f3a\" (UID: \"a79a49ce-66fc-404d-ba1d-373e0daa3f3a\") " Oct 07 14:06:49 crc kubenswrapper[4854]: I1007 14:06:49.216457 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a79a49ce-66fc-404d-ba1d-373e0daa3f3a-horizon-secret-key\") pod \"a79a49ce-66fc-404d-ba1d-373e0daa3f3a\" (UID: \"a79a49ce-66fc-404d-ba1d-373e0daa3f3a\") " Oct 07 14:06:49 crc kubenswrapper[4854]: I1007 14:06:49.216723 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hm8k9\" (UniqueName: \"kubernetes.io/projected/a79a49ce-66fc-404d-ba1d-373e0daa3f3a-kube-api-access-hm8k9\") pod \"a79a49ce-66fc-404d-ba1d-373e0daa3f3a\" (UID: \"a79a49ce-66fc-404d-ba1d-373e0daa3f3a\") " Oct 07 14:06:49 crc kubenswrapper[4854]: I1007 14:06:49.216997 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a79a49ce-66fc-404d-ba1d-373e0daa3f3a-logs" (OuterVolumeSpecName: "logs") pod "a79a49ce-66fc-404d-ba1d-373e0daa3f3a" (UID: "a79a49ce-66fc-404d-ba1d-373e0daa3f3a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:06:49 crc kubenswrapper[4854]: I1007 14:06:49.219430 4854 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a79a49ce-66fc-404d-ba1d-373e0daa3f3a-logs\") on node \"crc\" DevicePath \"\"" Oct 07 14:06:49 crc kubenswrapper[4854]: I1007 14:06:49.222342 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a79a49ce-66fc-404d-ba1d-373e0daa3f3a-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "a79a49ce-66fc-404d-ba1d-373e0daa3f3a" (UID: "a79a49ce-66fc-404d-ba1d-373e0daa3f3a"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:06:49 crc kubenswrapper[4854]: I1007 14:06:49.222724 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a79a49ce-66fc-404d-ba1d-373e0daa3f3a-kube-api-access-hm8k9" (OuterVolumeSpecName: "kube-api-access-hm8k9") pod "a79a49ce-66fc-404d-ba1d-373e0daa3f3a" (UID: "a79a49ce-66fc-404d-ba1d-373e0daa3f3a"). InnerVolumeSpecName "kube-api-access-hm8k9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:06:49 crc kubenswrapper[4854]: I1007 14:06:49.243853 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a79a49ce-66fc-404d-ba1d-373e0daa3f3a-scripts" (OuterVolumeSpecName: "scripts") pod "a79a49ce-66fc-404d-ba1d-373e0daa3f3a" (UID: "a79a49ce-66fc-404d-ba1d-373e0daa3f3a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:06:49 crc kubenswrapper[4854]: I1007 14:06:49.260909 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a79a49ce-66fc-404d-ba1d-373e0daa3f3a-config-data" (OuterVolumeSpecName: "config-data") pod "a79a49ce-66fc-404d-ba1d-373e0daa3f3a" (UID: "a79a49ce-66fc-404d-ba1d-373e0daa3f3a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:06:49 crc kubenswrapper[4854]: I1007 14:06:49.321680 4854 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a79a49ce-66fc-404d-ba1d-373e0daa3f3a-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 07 14:06:49 crc kubenswrapper[4854]: I1007 14:06:49.321725 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hm8k9\" (UniqueName: \"kubernetes.io/projected/a79a49ce-66fc-404d-ba1d-373e0daa3f3a-kube-api-access-hm8k9\") on node \"crc\" DevicePath \"\"" Oct 07 14:06:49 crc kubenswrapper[4854]: I1007 14:06:49.321740 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a79a49ce-66fc-404d-ba1d-373e0daa3f3a-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:06:49 crc kubenswrapper[4854]: I1007 14:06:49.321752 4854 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a79a49ce-66fc-404d-ba1d-373e0daa3f3a-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 14:06:49 crc kubenswrapper[4854]: I1007 14:06:49.612043 4854 generic.go:334] "Generic (PLEG): container finished" podID="a79a49ce-66fc-404d-ba1d-373e0daa3f3a" containerID="b08576167ff133bf6d1ed3fc2104769174c41a3c723b104798e9400969c310b1" exitCode=137 Oct 07 14:06:49 crc kubenswrapper[4854]: I1007 14:06:49.612120 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6667b4cddf-7z9f7" Oct 07 14:06:49 crc kubenswrapper[4854]: I1007 14:06:49.612099 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6667b4cddf-7z9f7" event={"ID":"a79a49ce-66fc-404d-ba1d-373e0daa3f3a","Type":"ContainerDied","Data":"b08576167ff133bf6d1ed3fc2104769174c41a3c723b104798e9400969c310b1"} Oct 07 14:06:49 crc kubenswrapper[4854]: I1007 14:06:49.612495 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6667b4cddf-7z9f7" event={"ID":"a79a49ce-66fc-404d-ba1d-373e0daa3f3a","Type":"ContainerDied","Data":"9291251a81bb537914b8d633385008f2a0c9f22aa6188c9f0fd7f75f0657d044"} Oct 07 14:06:49 crc kubenswrapper[4854]: I1007 14:06:49.612523 4854 scope.go:117] "RemoveContainer" containerID="1234847130cdb06fc581186759c2f156c559bf8e4cc97a9589035ebe58722d3b" Oct 07 14:06:49 crc kubenswrapper[4854]: I1007 14:06:49.670908 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6667b4cddf-7z9f7"] Oct 07 14:06:49 crc kubenswrapper[4854]: I1007 14:06:49.681155 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6667b4cddf-7z9f7"] Oct 07 14:06:49 crc kubenswrapper[4854]: I1007 14:06:49.817126 4854 scope.go:117] "RemoveContainer" containerID="b08576167ff133bf6d1ed3fc2104769174c41a3c723b104798e9400969c310b1" Oct 07 14:06:49 crc kubenswrapper[4854]: I1007 14:06:49.840507 4854 scope.go:117] "RemoveContainer" containerID="1234847130cdb06fc581186759c2f156c559bf8e4cc97a9589035ebe58722d3b" Oct 07 14:06:49 crc kubenswrapper[4854]: E1007 14:06:49.841218 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1234847130cdb06fc581186759c2f156c559bf8e4cc97a9589035ebe58722d3b\": container with ID starting with 1234847130cdb06fc581186759c2f156c559bf8e4cc97a9589035ebe58722d3b not found: ID does not exist" containerID="1234847130cdb06fc581186759c2f156c559bf8e4cc97a9589035ebe58722d3b" Oct 07 14:06:49 crc kubenswrapper[4854]: I1007 14:06:49.841294 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1234847130cdb06fc581186759c2f156c559bf8e4cc97a9589035ebe58722d3b"} err="failed to get container status \"1234847130cdb06fc581186759c2f156c559bf8e4cc97a9589035ebe58722d3b\": rpc error: code = NotFound desc = could not find container \"1234847130cdb06fc581186759c2f156c559bf8e4cc97a9589035ebe58722d3b\": container with ID starting with 1234847130cdb06fc581186759c2f156c559bf8e4cc97a9589035ebe58722d3b not found: ID does not exist" Oct 07 14:06:49 crc kubenswrapper[4854]: I1007 14:06:49.841334 4854 scope.go:117] "RemoveContainer" containerID="b08576167ff133bf6d1ed3fc2104769174c41a3c723b104798e9400969c310b1" Oct 07 14:06:49 crc kubenswrapper[4854]: E1007 14:06:49.841700 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b08576167ff133bf6d1ed3fc2104769174c41a3c723b104798e9400969c310b1\": container with ID starting with b08576167ff133bf6d1ed3fc2104769174c41a3c723b104798e9400969c310b1 not found: ID does not exist" containerID="b08576167ff133bf6d1ed3fc2104769174c41a3c723b104798e9400969c310b1" Oct 07 14:06:49 crc kubenswrapper[4854]: I1007 14:06:49.841767 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b08576167ff133bf6d1ed3fc2104769174c41a3c723b104798e9400969c310b1"} err="failed to get container status 
\"b08576167ff133bf6d1ed3fc2104769174c41a3c723b104798e9400969c310b1\": rpc error: code = NotFound desc = could not find container \"b08576167ff133bf6d1ed3fc2104769174c41a3c723b104798e9400969c310b1\": container with ID starting with b08576167ff133bf6d1ed3fc2104769174c41a3c723b104798e9400969c310b1 not found: ID does not exist" Oct 07 14:06:50 crc kubenswrapper[4854]: I1007 14:06:50.719638 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a79a49ce-66fc-404d-ba1d-373e0daa3f3a" path="/var/lib/kubelet/pods/a79a49ce-66fc-404d-ba1d-373e0daa3f3a/volumes" Oct 07 14:06:52 crc kubenswrapper[4854]: I1007 14:06:52.702937 4854 scope.go:117] "RemoveContainer" containerID="4fe613980076ca3ed8559fe0e286e39c2d0d1c3badb8354aa3600a7ada37572b" Oct 07 14:06:52 crc kubenswrapper[4854]: E1007 14:06:52.703939 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:06:52 crc kubenswrapper[4854]: I1007 14:06:52.756720 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6fbd55998f-s8xh4"] Oct 07 14:06:52 crc kubenswrapper[4854]: E1007 14:06:52.757132 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9fc8309-5f00-4acb-b0c7-0fb734d99c5e" containerName="horizon-log" Oct 07 14:06:52 crc kubenswrapper[4854]: I1007 14:06:52.757153 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9fc8309-5f00-4acb-b0c7-0fb734d99c5e" containerName="horizon-log" Oct 07 14:06:52 crc kubenswrapper[4854]: E1007 14:06:52.757308 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9fc8309-5f00-4acb-b0c7-0fb734d99c5e" containerName="horizon" Oct 07 14:06:52 crc kubenswrapper[4854]: I1007 14:06:52.757317 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9fc8309-5f00-4acb-b0c7-0fb734d99c5e" containerName="horizon" Oct 07 14:06:52 crc kubenswrapper[4854]: E1007 14:06:52.757347 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a79a49ce-66fc-404d-ba1d-373e0daa3f3a" containerName="horizon-log" Oct 07 14:06:52 crc kubenswrapper[4854]: I1007 14:06:52.757355 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="a79a49ce-66fc-404d-ba1d-373e0daa3f3a" containerName="horizon-log" Oct 07 14:06:52 crc kubenswrapper[4854]: E1007 14:06:52.757382 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a79a49ce-66fc-404d-ba1d-373e0daa3f3a" containerName="horizon" Oct 07 14:06:52 crc kubenswrapper[4854]: I1007 14:06:52.757390 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="a79a49ce-66fc-404d-ba1d-373e0daa3f3a" containerName="horizon" Oct 07 14:06:52 crc kubenswrapper[4854]: I1007 14:06:52.757642 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9fc8309-5f00-4acb-b0c7-0fb734d99c5e" containerName="horizon-log" Oct 07 14:06:52 crc kubenswrapper[4854]: I1007 14:06:52.757666 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="a79a49ce-66fc-404d-ba1d-373e0daa3f3a" containerName="horizon-log" Oct 07 14:06:52 crc kubenswrapper[4854]: I1007 14:06:52.757694 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="a79a49ce-66fc-404d-ba1d-373e0daa3f3a" containerName="horizon" Oct 07 14:06:52 
crc kubenswrapper[4854]: I1007 14:06:52.757709 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9fc8309-5f00-4acb-b0c7-0fb734d99c5e" containerName="horizon" Oct 07 14:06:52 crc kubenswrapper[4854]: I1007 14:06:52.759031 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6fbd55998f-s8xh4" Oct 07 14:06:52 crc kubenswrapper[4854]: I1007 14:06:52.772907 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6fbd55998f-s8xh4"] Oct 07 14:06:52 crc kubenswrapper[4854]: I1007 14:06:52.804585 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1d5b602a-fb2f-4b4a-8170-d64ee1e29f27-config-data\") pod \"horizon-6fbd55998f-s8xh4\" (UID: \"1d5b602a-fb2f-4b4a-8170-d64ee1e29f27\") " pod="openstack/horizon-6fbd55998f-s8xh4" Oct 07 14:06:52 crc kubenswrapper[4854]: I1007 14:06:52.804667 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1d5b602a-fb2f-4b4a-8170-d64ee1e29f27-scripts\") pod \"horizon-6fbd55998f-s8xh4\" (UID: \"1d5b602a-fb2f-4b4a-8170-d64ee1e29f27\") " pod="openstack/horizon-6fbd55998f-s8xh4" Oct 07 14:06:52 crc kubenswrapper[4854]: I1007 14:06:52.804762 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1d5b602a-fb2f-4b4a-8170-d64ee1e29f27-horizon-secret-key\") pod \"horizon-6fbd55998f-s8xh4\" (UID: \"1d5b602a-fb2f-4b4a-8170-d64ee1e29f27\") " pod="openstack/horizon-6fbd55998f-s8xh4" Oct 07 14:06:52 crc kubenswrapper[4854]: I1007 14:06:52.804794 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64gzk\" (UniqueName: \"kubernetes.io/projected/1d5b602a-fb2f-4b4a-8170-d64ee1e29f27-kube-api-access-64gzk\") pod \"horizon-6fbd55998f-s8xh4\" (UID: \"1d5b602a-fb2f-4b4a-8170-d64ee1e29f27\") " pod="openstack/horizon-6fbd55998f-s8xh4" Oct 07 14:06:52 crc kubenswrapper[4854]: I1007 14:06:52.804900 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d5b602a-fb2f-4b4a-8170-d64ee1e29f27-logs\") pod \"horizon-6fbd55998f-s8xh4\" (UID: \"1d5b602a-fb2f-4b4a-8170-d64ee1e29f27\") " pod="openstack/horizon-6fbd55998f-s8xh4" Oct 07 14:06:52 crc kubenswrapper[4854]: I1007 14:06:52.906536 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1d5b602a-fb2f-4b4a-8170-d64ee1e29f27-config-data\") pod \"horizon-6fbd55998f-s8xh4\" (UID: \"1d5b602a-fb2f-4b4a-8170-d64ee1e29f27\") " pod="openstack/horizon-6fbd55998f-s8xh4" Oct 07 14:06:52 crc kubenswrapper[4854]: I1007 14:06:52.906606 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1d5b602a-fb2f-4b4a-8170-d64ee1e29f27-scripts\") pod \"horizon-6fbd55998f-s8xh4\" (UID: \"1d5b602a-fb2f-4b4a-8170-d64ee1e29f27\") " pod="openstack/horizon-6fbd55998f-s8xh4" Oct 07 14:06:52 crc kubenswrapper[4854]: I1007 14:06:52.906645 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1d5b602a-fb2f-4b4a-8170-d64ee1e29f27-horizon-secret-key\") pod \"horizon-6fbd55998f-s8xh4\" (UID: 
\"1d5b602a-fb2f-4b4a-8170-d64ee1e29f27\") " pod="openstack/horizon-6fbd55998f-s8xh4" Oct 07 14:06:52 crc kubenswrapper[4854]: I1007 14:06:52.909218 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64gzk\" (UniqueName: \"kubernetes.io/projected/1d5b602a-fb2f-4b4a-8170-d64ee1e29f27-kube-api-access-64gzk\") pod \"horizon-6fbd55998f-s8xh4\" (UID: \"1d5b602a-fb2f-4b4a-8170-d64ee1e29f27\") " pod="openstack/horizon-6fbd55998f-s8xh4" Oct 07 14:06:52 crc kubenswrapper[4854]: I1007 14:06:52.909448 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d5b602a-fb2f-4b4a-8170-d64ee1e29f27-logs\") pod \"horizon-6fbd55998f-s8xh4\" (UID: \"1d5b602a-fb2f-4b4a-8170-d64ee1e29f27\") " pod="openstack/horizon-6fbd55998f-s8xh4" Oct 07 14:06:52 crc kubenswrapper[4854]: I1007 14:06:52.910001 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d5b602a-fb2f-4b4a-8170-d64ee1e29f27-logs\") pod \"horizon-6fbd55998f-s8xh4\" (UID: \"1d5b602a-fb2f-4b4a-8170-d64ee1e29f27\") " pod="openstack/horizon-6fbd55998f-s8xh4" Oct 07 14:06:52 crc kubenswrapper[4854]: I1007 14:06:52.907805 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1d5b602a-fb2f-4b4a-8170-d64ee1e29f27-scripts\") pod \"horizon-6fbd55998f-s8xh4\" (UID: \"1d5b602a-fb2f-4b4a-8170-d64ee1e29f27\") " pod="openstack/horizon-6fbd55998f-s8xh4" Oct 07 14:06:52 crc kubenswrapper[4854]: I1007 14:06:52.907980 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1d5b602a-fb2f-4b4a-8170-d64ee1e29f27-config-data\") pod \"horizon-6fbd55998f-s8xh4\" (UID: \"1d5b602a-fb2f-4b4a-8170-d64ee1e29f27\") " pod="openstack/horizon-6fbd55998f-s8xh4" Oct 07 14:06:52 crc kubenswrapper[4854]: I1007 14:06:52.925294 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1d5b602a-fb2f-4b4a-8170-d64ee1e29f27-horizon-secret-key\") pod \"horizon-6fbd55998f-s8xh4\" (UID: \"1d5b602a-fb2f-4b4a-8170-d64ee1e29f27\") " pod="openstack/horizon-6fbd55998f-s8xh4" Oct 07 14:06:52 crc kubenswrapper[4854]: I1007 14:06:52.925419 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64gzk\" (UniqueName: \"kubernetes.io/projected/1d5b602a-fb2f-4b4a-8170-d64ee1e29f27-kube-api-access-64gzk\") pod \"horizon-6fbd55998f-s8xh4\" (UID: \"1d5b602a-fb2f-4b4a-8170-d64ee1e29f27\") " pod="openstack/horizon-6fbd55998f-s8xh4" Oct 07 14:06:53 crc kubenswrapper[4854]: I1007 14:06:53.098799 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6fbd55998f-s8xh4" Oct 07 14:06:53 crc kubenswrapper[4854]: I1007 14:06:53.483632 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6fbd55998f-s8xh4"] Oct 07 14:06:53 crc kubenswrapper[4854]: I1007 14:06:53.655267 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6fbd55998f-s8xh4" event={"ID":"1d5b602a-fb2f-4b4a-8170-d64ee1e29f27","Type":"ContainerStarted","Data":"fbda9fd7e0ce0ee30c03b09a656080da69b6efb4ac45f19cf7402b4d8dc14265"} Oct 07 14:06:54 crc kubenswrapper[4854]: I1007 14:06:54.138961 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-nt4wp"] Oct 07 14:06:54 crc kubenswrapper[4854]: I1007 14:06:54.140764 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-nt4wp" Oct 07 14:06:54 crc kubenswrapper[4854]: I1007 14:06:54.151747 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-nt4wp"] Oct 07 14:06:54 crc kubenswrapper[4854]: I1007 14:06:54.245392 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvw7s\" (UniqueName: \"kubernetes.io/projected/5a7de750-cb21-40dd-a6d9-c845a77cd0e5-kube-api-access-nvw7s\") pod \"heat-db-create-nt4wp\" (UID: \"5a7de750-cb21-40dd-a6d9-c845a77cd0e5\") " pod="openstack/heat-db-create-nt4wp" Oct 07 14:06:54 crc kubenswrapper[4854]: I1007 14:06:54.348843 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvw7s\" (UniqueName: \"kubernetes.io/projected/5a7de750-cb21-40dd-a6d9-c845a77cd0e5-kube-api-access-nvw7s\") pod \"heat-db-create-nt4wp\" (UID: \"5a7de750-cb21-40dd-a6d9-c845a77cd0e5\") " pod="openstack/heat-db-create-nt4wp" Oct 07 14:06:54 crc kubenswrapper[4854]: I1007 14:06:54.370078 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvw7s\" (UniqueName: \"kubernetes.io/projected/5a7de750-cb21-40dd-a6d9-c845a77cd0e5-kube-api-access-nvw7s\") pod \"heat-db-create-nt4wp\" (UID: \"5a7de750-cb21-40dd-a6d9-c845a77cd0e5\") " pod="openstack/heat-db-create-nt4wp" Oct 07 14:06:54 crc kubenswrapper[4854]: I1007 14:06:54.528559 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-nt4wp" Oct 07 14:06:54 crc kubenswrapper[4854]: I1007 14:06:54.675345 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6fbd55998f-s8xh4" event={"ID":"1d5b602a-fb2f-4b4a-8170-d64ee1e29f27","Type":"ContainerStarted","Data":"9e6a9afc3038e903907e7a5af44a17ca781b6ea2a78972d18557f854722805d1"} Oct 07 14:06:54 crc kubenswrapper[4854]: I1007 14:06:54.675653 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6fbd55998f-s8xh4" event={"ID":"1d5b602a-fb2f-4b4a-8170-d64ee1e29f27","Type":"ContainerStarted","Data":"10d1a3ab31a29c5b809cc19382883fe77705f6ebdb3f9807d89740f7b6ee6b6e"} Oct 07 14:06:54 crc kubenswrapper[4854]: I1007 14:06:54.697572 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6fbd55998f-s8xh4" podStartSLOduration=2.697549347 podStartE2EDuration="2.697549347s" podCreationTimestamp="2025-10-07 14:06:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:06:54.69174409 +0000 UTC m=+6130.679576365" watchObservedRunningTime="2025-10-07 14:06:54.697549347 +0000 UTC m=+6130.685381602" Oct 07 14:06:55 crc kubenswrapper[4854]: I1007 14:06:55.021802 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-nt4wp"] Oct 07 14:06:55 crc kubenswrapper[4854]: I1007 14:06:55.693091 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-nt4wp" event={"ID":"5a7de750-cb21-40dd-a6d9-c845a77cd0e5","Type":"ContainerStarted","Data":"3e06c9ff028286cf5698cc142e8dea349fc8ca7ab64e62eef9f6592629d6d9e9"} Oct 07 14:06:55 crc kubenswrapper[4854]: I1007 14:06:55.693515 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-nt4wp" event={"ID":"5a7de750-cb21-40dd-a6d9-c845a77cd0e5","Type":"ContainerStarted","Data":"142bb01fad468915e492530501d91680570894b84681b29d294d380af2583c53"} Oct 07 14:06:56 crc kubenswrapper[4854]: I1007 14:06:56.704447 4854 generic.go:334] "Generic (PLEG): container finished" podID="5a7de750-cb21-40dd-a6d9-c845a77cd0e5" containerID="3e06c9ff028286cf5698cc142e8dea349fc8ca7ab64e62eef9f6592629d6d9e9" exitCode=0 Oct 07 14:06:56 crc kubenswrapper[4854]: I1007 14:06:56.717828 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-nt4wp" event={"ID":"5a7de750-cb21-40dd-a6d9-c845a77cd0e5","Type":"ContainerDied","Data":"3e06c9ff028286cf5698cc142e8dea349fc8ca7ab64e62eef9f6592629d6d9e9"} Oct 07 14:06:57 crc kubenswrapper[4854]: I1007 14:06:57.436521 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-nt4wp" Oct 07 14:06:57 crc kubenswrapper[4854]: I1007 14:06:57.539090 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvw7s\" (UniqueName: \"kubernetes.io/projected/5a7de750-cb21-40dd-a6d9-c845a77cd0e5-kube-api-access-nvw7s\") pod \"5a7de750-cb21-40dd-a6d9-c845a77cd0e5\" (UID: \"5a7de750-cb21-40dd-a6d9-c845a77cd0e5\") " Oct 07 14:06:57 crc kubenswrapper[4854]: I1007 14:06:57.543800 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a7de750-cb21-40dd-a6d9-c845a77cd0e5-kube-api-access-nvw7s" (OuterVolumeSpecName: "kube-api-access-nvw7s") pod "5a7de750-cb21-40dd-a6d9-c845a77cd0e5" (UID: "5a7de750-cb21-40dd-a6d9-c845a77cd0e5"). InnerVolumeSpecName "kube-api-access-nvw7s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:06:57 crc kubenswrapper[4854]: I1007 14:06:57.641928 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvw7s\" (UniqueName: \"kubernetes.io/projected/5a7de750-cb21-40dd-a6d9-c845a77cd0e5-kube-api-access-nvw7s\") on node \"crc\" DevicePath \"\"" Oct 07 14:06:57 crc kubenswrapper[4854]: I1007 14:06:57.717009 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-nt4wp" event={"ID":"5a7de750-cb21-40dd-a6d9-c845a77cd0e5","Type":"ContainerDied","Data":"142bb01fad468915e492530501d91680570894b84681b29d294d380af2583c53"} Oct 07 14:06:57 crc kubenswrapper[4854]: I1007 14:06:57.717054 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="142bb01fad468915e492530501d91680570894b84681b29d294d380af2583c53" Oct 07 14:06:57 crc kubenswrapper[4854]: I1007 14:06:57.717129 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-nt4wp" Oct 07 14:07:03 crc kubenswrapper[4854]: I1007 14:07:03.098948 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6fbd55998f-s8xh4" Oct 07 14:07:03 crc kubenswrapper[4854]: I1007 14:07:03.099547 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6fbd55998f-s8xh4" Oct 07 14:07:04 crc kubenswrapper[4854]: I1007 14:07:04.261557 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-a239-account-create-r8js5"] Oct 07 14:07:04 crc kubenswrapper[4854]: E1007 14:07:04.262604 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a7de750-cb21-40dd-a6d9-c845a77cd0e5" containerName="mariadb-database-create" Oct 07 14:07:04 crc kubenswrapper[4854]: I1007 14:07:04.262631 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a7de750-cb21-40dd-a6d9-c845a77cd0e5" containerName="mariadb-database-create" Oct 07 14:07:04 crc kubenswrapper[4854]: I1007 14:07:04.263144 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a7de750-cb21-40dd-a6d9-c845a77cd0e5" containerName="mariadb-database-create" Oct 07 14:07:04 crc kubenswrapper[4854]: I1007 14:07:04.264423 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-a239-account-create-r8js5" Oct 07 14:07:04 crc kubenswrapper[4854]: I1007 14:07:04.267271 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Oct 07 14:07:04 crc kubenswrapper[4854]: I1007 14:07:04.275985 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-a239-account-create-r8js5"] Oct 07 14:07:04 crc kubenswrapper[4854]: I1007 14:07:04.391854 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxfwm\" (UniqueName: \"kubernetes.io/projected/154324fd-9a6c-421c-aa91-3b3cb4f5d842-kube-api-access-fxfwm\") pod \"heat-a239-account-create-r8js5\" (UID: \"154324fd-9a6c-421c-aa91-3b3cb4f5d842\") " pod="openstack/heat-a239-account-create-r8js5" Oct 07 14:07:04 crc kubenswrapper[4854]: I1007 14:07:04.494551 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxfwm\" (UniqueName: \"kubernetes.io/projected/154324fd-9a6c-421c-aa91-3b3cb4f5d842-kube-api-access-fxfwm\") pod \"heat-a239-account-create-r8js5\" (UID: \"154324fd-9a6c-421c-aa91-3b3cb4f5d842\") " pod="openstack/heat-a239-account-create-r8js5" Oct 07 14:07:04 crc kubenswrapper[4854]: I1007 14:07:04.531816 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxfwm\" (UniqueName: \"kubernetes.io/projected/154324fd-9a6c-421c-aa91-3b3cb4f5d842-kube-api-access-fxfwm\") pod \"heat-a239-account-create-r8js5\" (UID: \"154324fd-9a6c-421c-aa91-3b3cb4f5d842\") " pod="openstack/heat-a239-account-create-r8js5" Oct 07 14:07:04 crc kubenswrapper[4854]: I1007 14:07:04.595053 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-a239-account-create-r8js5" Oct 07 14:07:04 crc kubenswrapper[4854]: I1007 14:07:04.721625 4854 scope.go:117] "RemoveContainer" containerID="4fe613980076ca3ed8559fe0e286e39c2d0d1c3badb8354aa3600a7ada37572b" Oct 07 14:07:04 crc kubenswrapper[4854]: E1007 14:07:04.722051 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:07:04 crc kubenswrapper[4854]: I1007 14:07:04.888236 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-a239-account-create-r8js5"] Oct 07 14:07:05 crc kubenswrapper[4854]: I1007 14:07:05.790095 4854 generic.go:334] "Generic (PLEG): container finished" podID="154324fd-9a6c-421c-aa91-3b3cb4f5d842" containerID="2a86ef79dce0dfb289dc6586d81d23e3efda6f45a6968d50210066e280274609" exitCode=0 Oct 07 14:07:05 crc kubenswrapper[4854]: I1007 14:07:05.790173 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-a239-account-create-r8js5" event={"ID":"154324fd-9a6c-421c-aa91-3b3cb4f5d842","Type":"ContainerDied","Data":"2a86ef79dce0dfb289dc6586d81d23e3efda6f45a6968d50210066e280274609"} Oct 07 14:07:05 crc kubenswrapper[4854]: I1007 14:07:05.790532 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-a239-account-create-r8js5" event={"ID":"154324fd-9a6c-421c-aa91-3b3cb4f5d842","Type":"ContainerStarted","Data":"048ef652695452da2d628558ee91960cb4a19a71659da2f786551a048819b91f"} Oct 07 
14:07:07 crc kubenswrapper[4854]: I1007 14:07:07.197121 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-a239-account-create-r8js5" Oct 07 14:07:07 crc kubenswrapper[4854]: I1007 14:07:07.359947 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxfwm\" (UniqueName: \"kubernetes.io/projected/154324fd-9a6c-421c-aa91-3b3cb4f5d842-kube-api-access-fxfwm\") pod \"154324fd-9a6c-421c-aa91-3b3cb4f5d842\" (UID: \"154324fd-9a6c-421c-aa91-3b3cb4f5d842\") " Oct 07 14:07:07 crc kubenswrapper[4854]: I1007 14:07:07.369413 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/154324fd-9a6c-421c-aa91-3b3cb4f5d842-kube-api-access-fxfwm" (OuterVolumeSpecName: "kube-api-access-fxfwm") pod "154324fd-9a6c-421c-aa91-3b3cb4f5d842" (UID: "154324fd-9a6c-421c-aa91-3b3cb4f5d842"). InnerVolumeSpecName "kube-api-access-fxfwm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:07:07 crc kubenswrapper[4854]: I1007 14:07:07.462312 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxfwm\" (UniqueName: \"kubernetes.io/projected/154324fd-9a6c-421c-aa91-3b3cb4f5d842-kube-api-access-fxfwm\") on node \"crc\" DevicePath \"\"" Oct 07 14:07:07 crc kubenswrapper[4854]: I1007 14:07:07.811610 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-a239-account-create-r8js5" event={"ID":"154324fd-9a6c-421c-aa91-3b3cb4f5d842","Type":"ContainerDied","Data":"048ef652695452da2d628558ee91960cb4a19a71659da2f786551a048819b91f"} Oct 07 14:07:07 crc kubenswrapper[4854]: I1007 14:07:07.811925 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="048ef652695452da2d628558ee91960cb4a19a71659da2f786551a048819b91f" Oct 07 14:07:07 crc kubenswrapper[4854]: I1007 14:07:07.811631 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-a239-account-create-r8js5" Oct 07 14:07:09 crc kubenswrapper[4854]: I1007 14:07:09.341877 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-5sl76"] Oct 07 14:07:09 crc kubenswrapper[4854]: E1007 14:07:09.342469 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="154324fd-9a6c-421c-aa91-3b3cb4f5d842" containerName="mariadb-account-create" Oct 07 14:07:09 crc kubenswrapper[4854]: I1007 14:07:09.342489 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="154324fd-9a6c-421c-aa91-3b3cb4f5d842" containerName="mariadb-account-create" Oct 07 14:07:09 crc kubenswrapper[4854]: I1007 14:07:09.342972 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="154324fd-9a6c-421c-aa91-3b3cb4f5d842" containerName="mariadb-account-create" Oct 07 14:07:09 crc kubenswrapper[4854]: I1007 14:07:09.344023 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-5sl76" Oct 07 14:07:09 crc kubenswrapper[4854]: I1007 14:07:09.346234 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-8tb2s" Oct 07 14:07:09 crc kubenswrapper[4854]: I1007 14:07:09.352751 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Oct 07 14:07:09 crc kubenswrapper[4854]: I1007 14:07:09.355444 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-5sl76"] Oct 07 14:07:09 crc kubenswrapper[4854]: I1007 14:07:09.405766 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10373da9-c986-4dee-853d-9bcc9892b5c1-combined-ca-bundle\") pod \"heat-db-sync-5sl76\" (UID: \"10373da9-c986-4dee-853d-9bcc9892b5c1\") " pod="openstack/heat-db-sync-5sl76" Oct 07 14:07:09 crc kubenswrapper[4854]: I1007 14:07:09.405870 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10373da9-c986-4dee-853d-9bcc9892b5c1-config-data\") pod \"heat-db-sync-5sl76\" (UID: \"10373da9-c986-4dee-853d-9bcc9892b5c1\") " pod="openstack/heat-db-sync-5sl76" Oct 07 14:07:09 crc kubenswrapper[4854]: I1007 14:07:09.405924 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2wkp\" (UniqueName: \"kubernetes.io/projected/10373da9-c986-4dee-853d-9bcc9892b5c1-kube-api-access-h2wkp\") pod \"heat-db-sync-5sl76\" (UID: \"10373da9-c986-4dee-853d-9bcc9892b5c1\") " pod="openstack/heat-db-sync-5sl76" Oct 07 14:07:09 crc kubenswrapper[4854]: I1007 14:07:09.508081 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10373da9-c986-4dee-853d-9bcc9892b5c1-combined-ca-bundle\") pod \"heat-db-sync-5sl76\" (UID: \"10373da9-c986-4dee-853d-9bcc9892b5c1\") " pod="openstack/heat-db-sync-5sl76" Oct 07 14:07:09 crc kubenswrapper[4854]: I1007 14:07:09.508201 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10373da9-c986-4dee-853d-9bcc9892b5c1-config-data\") pod \"heat-db-sync-5sl76\" (UID: \"10373da9-c986-4dee-853d-9bcc9892b5c1\") " pod="openstack/heat-db-sync-5sl76" Oct 07 14:07:09 crc kubenswrapper[4854]: I1007 14:07:09.508250 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2wkp\" (UniqueName: \"kubernetes.io/projected/10373da9-c986-4dee-853d-9bcc9892b5c1-kube-api-access-h2wkp\") pod \"heat-db-sync-5sl76\" (UID: \"10373da9-c986-4dee-853d-9bcc9892b5c1\") " pod="openstack/heat-db-sync-5sl76" Oct 07 14:07:09 crc kubenswrapper[4854]: I1007 14:07:09.515088 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10373da9-c986-4dee-853d-9bcc9892b5c1-combined-ca-bundle\") pod \"heat-db-sync-5sl76\" (UID: \"10373da9-c986-4dee-853d-9bcc9892b5c1\") " pod="openstack/heat-db-sync-5sl76" Oct 07 14:07:09 crc kubenswrapper[4854]: I1007 14:07:09.527346 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10373da9-c986-4dee-853d-9bcc9892b5c1-config-data\") pod \"heat-db-sync-5sl76\" (UID: \"10373da9-c986-4dee-853d-9bcc9892b5c1\") " pod="openstack/heat-db-sync-5sl76" 
Oct 07 14:07:09 crc kubenswrapper[4854]: I1007 14:07:09.531250 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2wkp\" (UniqueName: \"kubernetes.io/projected/10373da9-c986-4dee-853d-9bcc9892b5c1-kube-api-access-h2wkp\") pod \"heat-db-sync-5sl76\" (UID: \"10373da9-c986-4dee-853d-9bcc9892b5c1\") " pod="openstack/heat-db-sync-5sl76" Oct 07 14:07:09 crc kubenswrapper[4854]: I1007 14:07:09.668767 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-5sl76" Oct 07 14:07:10 crc kubenswrapper[4854]: I1007 14:07:10.114888 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-5sl76"] Oct 07 14:07:10 crc kubenswrapper[4854]: W1007 14:07:10.115828 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod10373da9_c986_4dee_853d_9bcc9892b5c1.slice/crio-868e5f1adbc29737530fd452377cddb8f2fe116e3618c77ae3b6a6f0eee329ca WatchSource:0}: Error finding container 868e5f1adbc29737530fd452377cddb8f2fe116e3618c77ae3b6a6f0eee329ca: Status 404 returned error can't find the container with id 868e5f1adbc29737530fd452377cddb8f2fe116e3618c77ae3b6a6f0eee329ca Oct 07 14:07:10 crc kubenswrapper[4854]: I1007 14:07:10.848263 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-5sl76" event={"ID":"10373da9-c986-4dee-853d-9bcc9892b5c1","Type":"ContainerStarted","Data":"868e5f1adbc29737530fd452377cddb8f2fe116e3618c77ae3b6a6f0eee329ca"} Oct 07 14:07:13 crc kubenswrapper[4854]: I1007 14:07:13.102216 4854 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6fbd55998f-s8xh4" podUID="1d5b602a-fb2f-4b4a-8170-d64ee1e29f27" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.118:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.118:8080: connect: connection refused" Oct 07 14:07:16 crc kubenswrapper[4854]: I1007 14:07:16.703366 4854 scope.go:117] "RemoveContainer" containerID="4fe613980076ca3ed8559fe0e286e39c2d0d1c3badb8354aa3600a7ada37572b" Oct 07 14:07:16 crc kubenswrapper[4854]: E1007 14:07:16.704337 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:07:17 crc kubenswrapper[4854]: I1007 14:07:17.936256 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-5sl76" event={"ID":"10373da9-c986-4dee-853d-9bcc9892b5c1","Type":"ContainerStarted","Data":"ac1981ef1b44c97ceaeeb6a070174463d2109817ab081269d11a107a32aa75d9"} Oct 07 14:07:17 crc kubenswrapper[4854]: I1007 14:07:17.962505 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-5sl76" podStartSLOduration=1.437436106 podStartE2EDuration="8.962483167s" podCreationTimestamp="2025-10-07 14:07:09 +0000 UTC" firstStartedPulling="2025-10-07 14:07:10.118687258 +0000 UTC m=+6146.106519503" lastFinishedPulling="2025-10-07 14:07:17.643734309 +0000 UTC m=+6153.631566564" observedRunningTime="2025-10-07 14:07:17.958010009 +0000 UTC m=+6153.945842274" watchObservedRunningTime="2025-10-07 14:07:17.962483167 +0000 UTC m=+6153.950315422" Oct 07 14:07:19 crc 
kubenswrapper[4854]: I1007 14:07:19.955789 4854 generic.go:334] "Generic (PLEG): container finished" podID="10373da9-c986-4dee-853d-9bcc9892b5c1" containerID="ac1981ef1b44c97ceaeeb6a070174463d2109817ab081269d11a107a32aa75d9" exitCode=0 Oct 07 14:07:19 crc kubenswrapper[4854]: I1007 14:07:19.955815 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-5sl76" event={"ID":"10373da9-c986-4dee-853d-9bcc9892b5c1","Type":"ContainerDied","Data":"ac1981ef1b44c97ceaeeb6a070174463d2109817ab081269d11a107a32aa75d9"} Oct 07 14:07:21 crc kubenswrapper[4854]: I1007 14:07:21.374091 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-5sl76" Oct 07 14:07:21 crc kubenswrapper[4854]: I1007 14:07:21.458171 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10373da9-c986-4dee-853d-9bcc9892b5c1-config-data\") pod \"10373da9-c986-4dee-853d-9bcc9892b5c1\" (UID: \"10373da9-c986-4dee-853d-9bcc9892b5c1\") " Oct 07 14:07:21 crc kubenswrapper[4854]: I1007 14:07:21.458228 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10373da9-c986-4dee-853d-9bcc9892b5c1-combined-ca-bundle\") pod \"10373da9-c986-4dee-853d-9bcc9892b5c1\" (UID: \"10373da9-c986-4dee-853d-9bcc9892b5c1\") " Oct 07 14:07:21 crc kubenswrapper[4854]: I1007 14:07:21.458337 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2wkp\" (UniqueName: \"kubernetes.io/projected/10373da9-c986-4dee-853d-9bcc9892b5c1-kube-api-access-h2wkp\") pod \"10373da9-c986-4dee-853d-9bcc9892b5c1\" (UID: \"10373da9-c986-4dee-853d-9bcc9892b5c1\") " Oct 07 14:07:21 crc kubenswrapper[4854]: I1007 14:07:21.463933 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10373da9-c986-4dee-853d-9bcc9892b5c1-kube-api-access-h2wkp" (OuterVolumeSpecName: "kube-api-access-h2wkp") pod "10373da9-c986-4dee-853d-9bcc9892b5c1" (UID: "10373da9-c986-4dee-853d-9bcc9892b5c1"). InnerVolumeSpecName "kube-api-access-h2wkp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:07:21 crc kubenswrapper[4854]: I1007 14:07:21.492134 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10373da9-c986-4dee-853d-9bcc9892b5c1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "10373da9-c986-4dee-853d-9bcc9892b5c1" (UID: "10373da9-c986-4dee-853d-9bcc9892b5c1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:07:21 crc kubenswrapper[4854]: I1007 14:07:21.532697 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10373da9-c986-4dee-853d-9bcc9892b5c1-config-data" (OuterVolumeSpecName: "config-data") pod "10373da9-c986-4dee-853d-9bcc9892b5c1" (UID: "10373da9-c986-4dee-853d-9bcc9892b5c1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:07:21 crc kubenswrapper[4854]: I1007 14:07:21.563452 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10373da9-c986-4dee-853d-9bcc9892b5c1-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:07:21 crc kubenswrapper[4854]: I1007 14:07:21.563481 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10373da9-c986-4dee-853d-9bcc9892b5c1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:07:21 crc kubenswrapper[4854]: I1007 14:07:21.563491 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2wkp\" (UniqueName: \"kubernetes.io/projected/10373da9-c986-4dee-853d-9bcc9892b5c1-kube-api-access-h2wkp\") on node \"crc\" DevicePath \"\"" Oct 07 14:07:21 crc kubenswrapper[4854]: I1007 14:07:21.973995 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-5sl76" event={"ID":"10373da9-c986-4dee-853d-9bcc9892b5c1","Type":"ContainerDied","Data":"868e5f1adbc29737530fd452377cddb8f2fe116e3618c77ae3b6a6f0eee329ca"} Oct 07 14:07:21 crc kubenswrapper[4854]: I1007 14:07:21.974266 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="868e5f1adbc29737530fd452377cddb8f2fe116e3618c77ae3b6a6f0eee329ca" Oct 07 14:07:21 crc kubenswrapper[4854]: I1007 14:07:21.974044 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-5sl76" Oct 07 14:07:23 crc kubenswrapper[4854]: I1007 14:07:23.264775 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-79b95db96d-7jdbk"] Oct 07 14:07:23 crc kubenswrapper[4854]: E1007 14:07:23.265509 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10373da9-c986-4dee-853d-9bcc9892b5c1" containerName="heat-db-sync" Oct 07 14:07:23 crc kubenswrapper[4854]: I1007 14:07:23.265525 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="10373da9-c986-4dee-853d-9bcc9892b5c1" containerName="heat-db-sync" Oct 07 14:07:23 crc kubenswrapper[4854]: I1007 14:07:23.265727 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="10373da9-c986-4dee-853d-9bcc9892b5c1" containerName="heat-db-sync" Oct 07 14:07:23 crc kubenswrapper[4854]: I1007 14:07:23.266449 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-79b95db96d-7jdbk" Oct 07 14:07:23 crc kubenswrapper[4854]: I1007 14:07:23.272947 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Oct 07 14:07:23 crc kubenswrapper[4854]: I1007 14:07:23.276535 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-79b95db96d-7jdbk"] Oct 07 14:07:23 crc kubenswrapper[4854]: I1007 14:07:23.285496 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Oct 07 14:07:23 crc kubenswrapper[4854]: I1007 14:07:23.286615 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-8tb2s" Oct 07 14:07:23 crc kubenswrapper[4854]: I1007 14:07:23.385348 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-6df4b7cd4d-c88k6"] Oct 07 14:07:23 crc kubenswrapper[4854]: I1007 14:07:23.387043 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-6df4b7cd4d-c88k6" Oct 07 14:07:23 crc kubenswrapper[4854]: I1007 14:07:23.392393 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Oct 07 14:07:23 crc kubenswrapper[4854]: I1007 14:07:23.402654 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f632003-d94c-4443-ab13-a0a3f1b50647-combined-ca-bundle\") pod \"heat-engine-79b95db96d-7jdbk\" (UID: \"9f632003-d94c-4443-ab13-a0a3f1b50647\") " pod="openstack/heat-engine-79b95db96d-7jdbk" Oct 07 14:07:23 crc kubenswrapper[4854]: I1007 14:07:23.402863 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9f632003-d94c-4443-ab13-a0a3f1b50647-config-data-custom\") pod \"heat-engine-79b95db96d-7jdbk\" (UID: \"9f632003-d94c-4443-ab13-a0a3f1b50647\") " pod="openstack/heat-engine-79b95db96d-7jdbk" Oct 07 14:07:23 crc kubenswrapper[4854]: I1007 14:07:23.402892 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f632003-d94c-4443-ab13-a0a3f1b50647-config-data\") pod \"heat-engine-79b95db96d-7jdbk\" (UID: \"9f632003-d94c-4443-ab13-a0a3f1b50647\") " pod="openstack/heat-engine-79b95db96d-7jdbk" Oct 07 14:07:23 crc kubenswrapper[4854]: I1007 14:07:23.402952 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwhrw\" (UniqueName: \"kubernetes.io/projected/9f632003-d94c-4443-ab13-a0a3f1b50647-kube-api-access-pwhrw\") pod \"heat-engine-79b95db96d-7jdbk\" (UID: \"9f632003-d94c-4443-ab13-a0a3f1b50647\") " pod="openstack/heat-engine-79b95db96d-7jdbk" Oct 07 14:07:23 crc kubenswrapper[4854]: I1007 14:07:23.422103 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-6df4b7cd4d-c88k6"] Oct 07 14:07:23 crc kubenswrapper[4854]: I1007 14:07:23.460683 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-fcbd8b89f-8c6fd"] Oct 07 14:07:23 crc kubenswrapper[4854]: I1007 14:07:23.462431 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-fcbd8b89f-8c6fd" Oct 07 14:07:23 crc kubenswrapper[4854]: I1007 14:07:23.468921 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Oct 07 14:07:23 crc kubenswrapper[4854]: I1007 14:07:23.473855 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-fcbd8b89f-8c6fd"] Oct 07 14:07:23 crc kubenswrapper[4854]: I1007 14:07:23.504602 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/847a28d3-c688-4de9-8e03-5956cfdc1dd2-config-data\") pod \"heat-api-6df4b7cd4d-c88k6\" (UID: \"847a28d3-c688-4de9-8e03-5956cfdc1dd2\") " pod="openstack/heat-api-6df4b7cd4d-c88k6" Oct 07 14:07:23 crc kubenswrapper[4854]: I1007 14:07:23.504676 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/847a28d3-c688-4de9-8e03-5956cfdc1dd2-config-data-custom\") pod \"heat-api-6df4b7cd4d-c88k6\" (UID: \"847a28d3-c688-4de9-8e03-5956cfdc1dd2\") " pod="openstack/heat-api-6df4b7cd4d-c88k6" Oct 07 14:07:23 crc kubenswrapper[4854]: I1007 14:07:23.504724 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/847a28d3-c688-4de9-8e03-5956cfdc1dd2-combined-ca-bundle\") pod \"heat-api-6df4b7cd4d-c88k6\" (UID: \"847a28d3-c688-4de9-8e03-5956cfdc1dd2\") " pod="openstack/heat-api-6df4b7cd4d-c88k6" Oct 07 14:07:23 crc kubenswrapper[4854]: I1007 14:07:23.504747 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48dwn\" (UniqueName: \"kubernetes.io/projected/847a28d3-c688-4de9-8e03-5956cfdc1dd2-kube-api-access-48dwn\") pod \"heat-api-6df4b7cd4d-c88k6\" (UID: \"847a28d3-c688-4de9-8e03-5956cfdc1dd2\") " pod="openstack/heat-api-6df4b7cd4d-c88k6" Oct 07 14:07:23 crc kubenswrapper[4854]: I1007 14:07:23.504837 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9f632003-d94c-4443-ab13-a0a3f1b50647-config-data-custom\") pod \"heat-engine-79b95db96d-7jdbk\" (UID: \"9f632003-d94c-4443-ab13-a0a3f1b50647\") " pod="openstack/heat-engine-79b95db96d-7jdbk" Oct 07 14:07:23 crc kubenswrapper[4854]: I1007 14:07:23.504859 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f632003-d94c-4443-ab13-a0a3f1b50647-config-data\") pod \"heat-engine-79b95db96d-7jdbk\" (UID: \"9f632003-d94c-4443-ab13-a0a3f1b50647\") " pod="openstack/heat-engine-79b95db96d-7jdbk" Oct 07 14:07:23 crc kubenswrapper[4854]: I1007 14:07:23.504908 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwhrw\" (UniqueName: \"kubernetes.io/projected/9f632003-d94c-4443-ab13-a0a3f1b50647-kube-api-access-pwhrw\") pod \"heat-engine-79b95db96d-7jdbk\" (UID: \"9f632003-d94c-4443-ab13-a0a3f1b50647\") " pod="openstack/heat-engine-79b95db96d-7jdbk" Oct 07 14:07:23 crc kubenswrapper[4854]: I1007 14:07:23.504957 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f632003-d94c-4443-ab13-a0a3f1b50647-combined-ca-bundle\") pod \"heat-engine-79b95db96d-7jdbk\" (UID: \"9f632003-d94c-4443-ab13-a0a3f1b50647\") " 
pod="openstack/heat-engine-79b95db96d-7jdbk" Oct 07 14:07:23 crc kubenswrapper[4854]: I1007 14:07:23.529427 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9f632003-d94c-4443-ab13-a0a3f1b50647-config-data-custom\") pod \"heat-engine-79b95db96d-7jdbk\" (UID: \"9f632003-d94c-4443-ab13-a0a3f1b50647\") " pod="openstack/heat-engine-79b95db96d-7jdbk" Oct 07 14:07:23 crc kubenswrapper[4854]: I1007 14:07:23.529442 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f632003-d94c-4443-ab13-a0a3f1b50647-combined-ca-bundle\") pod \"heat-engine-79b95db96d-7jdbk\" (UID: \"9f632003-d94c-4443-ab13-a0a3f1b50647\") " pod="openstack/heat-engine-79b95db96d-7jdbk" Oct 07 14:07:23 crc kubenswrapper[4854]: I1007 14:07:23.529811 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f632003-d94c-4443-ab13-a0a3f1b50647-config-data\") pod \"heat-engine-79b95db96d-7jdbk\" (UID: \"9f632003-d94c-4443-ab13-a0a3f1b50647\") " pod="openstack/heat-engine-79b95db96d-7jdbk" Oct 07 14:07:23 crc kubenswrapper[4854]: I1007 14:07:23.537940 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwhrw\" (UniqueName: \"kubernetes.io/projected/9f632003-d94c-4443-ab13-a0a3f1b50647-kube-api-access-pwhrw\") pod \"heat-engine-79b95db96d-7jdbk\" (UID: \"9f632003-d94c-4443-ab13-a0a3f1b50647\") " pod="openstack/heat-engine-79b95db96d-7jdbk" Oct 07 14:07:23 crc kubenswrapper[4854]: I1007 14:07:23.591610 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-79b95db96d-7jdbk" Oct 07 14:07:23 crc kubenswrapper[4854]: I1007 14:07:23.606909 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k49jr\" (UniqueName: \"kubernetes.io/projected/4072ba4e-6d99-4149-be5d-fe68ccfd5622-kube-api-access-k49jr\") pod \"heat-cfnapi-fcbd8b89f-8c6fd\" (UID: \"4072ba4e-6d99-4149-be5d-fe68ccfd5622\") " pod="openstack/heat-cfnapi-fcbd8b89f-8c6fd" Oct 07 14:07:23 crc kubenswrapper[4854]: I1007 14:07:23.607490 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4072ba4e-6d99-4149-be5d-fe68ccfd5622-combined-ca-bundle\") pod \"heat-cfnapi-fcbd8b89f-8c6fd\" (UID: \"4072ba4e-6d99-4149-be5d-fe68ccfd5622\") " pod="openstack/heat-cfnapi-fcbd8b89f-8c6fd" Oct 07 14:07:23 crc kubenswrapper[4854]: I1007 14:07:23.607733 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/847a28d3-c688-4de9-8e03-5956cfdc1dd2-config-data\") pod \"heat-api-6df4b7cd4d-c88k6\" (UID: \"847a28d3-c688-4de9-8e03-5956cfdc1dd2\") " pod="openstack/heat-api-6df4b7cd4d-c88k6" Oct 07 14:07:23 crc kubenswrapper[4854]: I1007 14:07:23.607803 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4072ba4e-6d99-4149-be5d-fe68ccfd5622-config-data-custom\") pod \"heat-cfnapi-fcbd8b89f-8c6fd\" (UID: \"4072ba4e-6d99-4149-be5d-fe68ccfd5622\") " pod="openstack/heat-cfnapi-fcbd8b89f-8c6fd" Oct 07 14:07:23 crc kubenswrapper[4854]: I1007 14:07:23.607866 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/847a28d3-c688-4de9-8e03-5956cfdc1dd2-config-data-custom\") pod \"heat-api-6df4b7cd4d-c88k6\" (UID: \"847a28d3-c688-4de9-8e03-5956cfdc1dd2\") " pod="openstack/heat-api-6df4b7cd4d-c88k6" Oct 07 14:07:23 crc kubenswrapper[4854]: I1007 14:07:23.607995 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/847a28d3-c688-4de9-8e03-5956cfdc1dd2-combined-ca-bundle\") pod \"heat-api-6df4b7cd4d-c88k6\" (UID: \"847a28d3-c688-4de9-8e03-5956cfdc1dd2\") " pod="openstack/heat-api-6df4b7cd4d-c88k6" Oct 07 14:07:23 crc kubenswrapper[4854]: I1007 14:07:23.608036 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48dwn\" (UniqueName: \"kubernetes.io/projected/847a28d3-c688-4de9-8e03-5956cfdc1dd2-kube-api-access-48dwn\") pod \"heat-api-6df4b7cd4d-c88k6\" (UID: \"847a28d3-c688-4de9-8e03-5956cfdc1dd2\") " pod="openstack/heat-api-6df4b7cd4d-c88k6" Oct 07 14:07:23 crc kubenswrapper[4854]: I1007 14:07:23.608095 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4072ba4e-6d99-4149-be5d-fe68ccfd5622-config-data\") pod \"heat-cfnapi-fcbd8b89f-8c6fd\" (UID: \"4072ba4e-6d99-4149-be5d-fe68ccfd5622\") " pod="openstack/heat-cfnapi-fcbd8b89f-8c6fd" Oct 07 14:07:23 crc kubenswrapper[4854]: I1007 14:07:23.613367 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/847a28d3-c688-4de9-8e03-5956cfdc1dd2-config-data-custom\") pod \"heat-api-6df4b7cd4d-c88k6\" (UID: \"847a28d3-c688-4de9-8e03-5956cfdc1dd2\") " pod="openstack/heat-api-6df4b7cd4d-c88k6" Oct 07 14:07:23 crc kubenswrapper[4854]: I1007 14:07:23.618428 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/847a28d3-c688-4de9-8e03-5956cfdc1dd2-combined-ca-bundle\") pod \"heat-api-6df4b7cd4d-c88k6\" (UID: \"847a28d3-c688-4de9-8e03-5956cfdc1dd2\") " pod="openstack/heat-api-6df4b7cd4d-c88k6" Oct 07 14:07:23 crc kubenswrapper[4854]: I1007 14:07:23.624723 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/847a28d3-c688-4de9-8e03-5956cfdc1dd2-config-data\") pod \"heat-api-6df4b7cd4d-c88k6\" (UID: \"847a28d3-c688-4de9-8e03-5956cfdc1dd2\") " pod="openstack/heat-api-6df4b7cd4d-c88k6" Oct 07 14:07:23 crc kubenswrapper[4854]: I1007 14:07:23.628286 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48dwn\" (UniqueName: \"kubernetes.io/projected/847a28d3-c688-4de9-8e03-5956cfdc1dd2-kube-api-access-48dwn\") pod \"heat-api-6df4b7cd4d-c88k6\" (UID: \"847a28d3-c688-4de9-8e03-5956cfdc1dd2\") " pod="openstack/heat-api-6df4b7cd4d-c88k6" Oct 07 14:07:23 crc kubenswrapper[4854]: I1007 14:07:23.711253 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4072ba4e-6d99-4149-be5d-fe68ccfd5622-config-data-custom\") pod \"heat-cfnapi-fcbd8b89f-8c6fd\" (UID: \"4072ba4e-6d99-4149-be5d-fe68ccfd5622\") " pod="openstack/heat-cfnapi-fcbd8b89f-8c6fd" Oct 07 14:07:23 crc kubenswrapper[4854]: I1007 14:07:23.711342 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/4072ba4e-6d99-4149-be5d-fe68ccfd5622-config-data\") pod \"heat-cfnapi-fcbd8b89f-8c6fd\" (UID: \"4072ba4e-6d99-4149-be5d-fe68ccfd5622\") " pod="openstack/heat-cfnapi-fcbd8b89f-8c6fd" Oct 07 14:07:23 crc kubenswrapper[4854]: I1007 14:07:23.711384 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k49jr\" (UniqueName: \"kubernetes.io/projected/4072ba4e-6d99-4149-be5d-fe68ccfd5622-kube-api-access-k49jr\") pod \"heat-cfnapi-fcbd8b89f-8c6fd\" (UID: \"4072ba4e-6d99-4149-be5d-fe68ccfd5622\") " pod="openstack/heat-cfnapi-fcbd8b89f-8c6fd" Oct 07 14:07:23 crc kubenswrapper[4854]: I1007 14:07:23.711426 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4072ba4e-6d99-4149-be5d-fe68ccfd5622-combined-ca-bundle\") pod \"heat-cfnapi-fcbd8b89f-8c6fd\" (UID: \"4072ba4e-6d99-4149-be5d-fe68ccfd5622\") " pod="openstack/heat-cfnapi-fcbd8b89f-8c6fd" Oct 07 14:07:23 crc kubenswrapper[4854]: I1007 14:07:23.718214 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4072ba4e-6d99-4149-be5d-fe68ccfd5622-combined-ca-bundle\") pod \"heat-cfnapi-fcbd8b89f-8c6fd\" (UID: \"4072ba4e-6d99-4149-be5d-fe68ccfd5622\") " pod="openstack/heat-cfnapi-fcbd8b89f-8c6fd" Oct 07 14:07:23 crc kubenswrapper[4854]: I1007 14:07:23.718775 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4072ba4e-6d99-4149-be5d-fe68ccfd5622-config-data-custom\") pod \"heat-cfnapi-fcbd8b89f-8c6fd\" (UID: \"4072ba4e-6d99-4149-be5d-fe68ccfd5622\") " pod="openstack/heat-cfnapi-fcbd8b89f-8c6fd" Oct 07 14:07:23 crc kubenswrapper[4854]: I1007 14:07:23.719594 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-6df4b7cd4d-c88k6" Oct 07 14:07:23 crc kubenswrapper[4854]: I1007 14:07:23.735867 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k49jr\" (UniqueName: \"kubernetes.io/projected/4072ba4e-6d99-4149-be5d-fe68ccfd5622-kube-api-access-k49jr\") pod \"heat-cfnapi-fcbd8b89f-8c6fd\" (UID: \"4072ba4e-6d99-4149-be5d-fe68ccfd5622\") " pod="openstack/heat-cfnapi-fcbd8b89f-8c6fd" Oct 07 14:07:23 crc kubenswrapper[4854]: I1007 14:07:23.736914 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4072ba4e-6d99-4149-be5d-fe68ccfd5622-config-data\") pod \"heat-cfnapi-fcbd8b89f-8c6fd\" (UID: \"4072ba4e-6d99-4149-be5d-fe68ccfd5622\") " pod="openstack/heat-cfnapi-fcbd8b89f-8c6fd" Oct 07 14:07:23 crc kubenswrapper[4854]: I1007 14:07:23.784729 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-fcbd8b89f-8c6fd" Oct 07 14:07:24 crc kubenswrapper[4854]: I1007 14:07:24.122818 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-79b95db96d-7jdbk"] Oct 07 14:07:24 crc kubenswrapper[4854]: I1007 14:07:24.285047 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-6df4b7cd4d-c88k6"] Oct 07 14:07:24 crc kubenswrapper[4854]: W1007 14:07:24.305555 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod847a28d3_c688_4de9_8e03_5956cfdc1dd2.slice/crio-d1a5abc2ea7d5084d7bd00954560576d760fe7d6d72b239c545203505d1acd0b WatchSource:0}: Error finding container d1a5abc2ea7d5084d7bd00954560576d760fe7d6d72b239c545203505d1acd0b: Status 404 returned error can't find the container with id d1a5abc2ea7d5084d7bd00954560576d760fe7d6d72b239c545203505d1acd0b Oct 07 14:07:24 crc kubenswrapper[4854]: I1007 14:07:24.310862 4854 scope.go:117] "RemoveContainer" containerID="6f09e8ba6f54f1fe26a8b948a03572e9a4e13a42a054bf343d28f68255c8912b" Oct 07 14:07:24 crc kubenswrapper[4854]: I1007 14:07:24.428424 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-fcbd8b89f-8c6fd"] Oct 07 14:07:24 crc kubenswrapper[4854]: I1007 14:07:24.440439 4854 scope.go:117] "RemoveContainer" containerID="fb33b8b9f7f2fe3595900494c9d909bc664f86175b0a618f1b1e0254a4c8ebda" Oct 07 14:07:25 crc kubenswrapper[4854]: I1007 14:07:25.000282 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-79b95db96d-7jdbk" event={"ID":"9f632003-d94c-4443-ab13-a0a3f1b50647","Type":"ContainerStarted","Data":"8ed55f30b0f1ecece64d43973aa4babbb4f06c9474e56d86fd4b9fd79027869c"} Oct 07 14:07:25 crc kubenswrapper[4854]: I1007 14:07:25.001941 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-79b95db96d-7jdbk" event={"ID":"9f632003-d94c-4443-ab13-a0a3f1b50647","Type":"ContainerStarted","Data":"24d287d703d8636f6840e8eac446e2515f1d94077e543deba3b077ddf87104f8"} Oct 07 14:07:25 crc kubenswrapper[4854]: I1007 14:07:25.002083 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-79b95db96d-7jdbk" Oct 07 14:07:25 crc kubenswrapper[4854]: I1007 14:07:25.003312 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6df4b7cd4d-c88k6" event={"ID":"847a28d3-c688-4de9-8e03-5956cfdc1dd2","Type":"ContainerStarted","Data":"d1a5abc2ea7d5084d7bd00954560576d760fe7d6d72b239c545203505d1acd0b"} Oct 07 14:07:25 crc kubenswrapper[4854]: I1007 14:07:25.004514 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-fcbd8b89f-8c6fd" event={"ID":"4072ba4e-6d99-4149-be5d-fe68ccfd5622","Type":"ContainerStarted","Data":"4506f4cedaf09d3a0990bf13b4522f105ec4e28004583d70f21524fd093f76d5"} Oct 07 14:07:25 crc kubenswrapper[4854]: I1007 14:07:25.022713 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-79b95db96d-7jdbk" podStartSLOduration=2.022690042 podStartE2EDuration="2.022690042s" podCreationTimestamp="2025-10-07 14:07:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:07:25.016798412 +0000 UTC m=+6161.004630667" watchObservedRunningTime="2025-10-07 14:07:25.022690042 +0000 UTC m=+6161.010522297" Oct 07 14:07:25 crc kubenswrapper[4854]: I1007 14:07:25.500167 4854 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openstack/horizon-6fbd55998f-s8xh4" Oct 07 14:07:27 crc kubenswrapper[4854]: I1007 14:07:27.029463 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6df4b7cd4d-c88k6" event={"ID":"847a28d3-c688-4de9-8e03-5956cfdc1dd2","Type":"ContainerStarted","Data":"155eeab15c158c5e0c76349eae95b1a1df930afcacbc0e784938f3e54b696f4d"} Oct 07 14:07:27 crc kubenswrapper[4854]: I1007 14:07:27.029987 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-6df4b7cd4d-c88k6" Oct 07 14:07:27 crc kubenswrapper[4854]: I1007 14:07:27.030802 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-fcbd8b89f-8c6fd" event={"ID":"4072ba4e-6d99-4149-be5d-fe68ccfd5622","Type":"ContainerStarted","Data":"e4c4b8fa42e86eae4f7008a952c2f3b108bc1f1ef973a95199496510f1430bf7"} Oct 07 14:07:27 crc kubenswrapper[4854]: I1007 14:07:27.054511 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-6df4b7cd4d-c88k6" podStartSLOduration=1.968124083 podStartE2EDuration="4.05448948s" podCreationTimestamp="2025-10-07 14:07:23 +0000 UTC" firstStartedPulling="2025-10-07 14:07:24.315322197 +0000 UTC m=+6160.303154452" lastFinishedPulling="2025-10-07 14:07:26.401687594 +0000 UTC m=+6162.389519849" observedRunningTime="2025-10-07 14:07:27.046633784 +0000 UTC m=+6163.034466039" watchObservedRunningTime="2025-10-07 14:07:27.05448948 +0000 UTC m=+6163.042321735" Oct 07 14:07:27 crc kubenswrapper[4854]: I1007 14:07:27.493605 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-6fbd55998f-s8xh4" Oct 07 14:07:27 crc kubenswrapper[4854]: I1007 14:07:27.519572 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-fcbd8b89f-8c6fd" podStartSLOduration=2.569500183 podStartE2EDuration="4.519554613s" podCreationTimestamp="2025-10-07 14:07:23 +0000 UTC" firstStartedPulling="2025-10-07 14:07:24.450035868 +0000 UTC m=+6160.437868123" lastFinishedPulling="2025-10-07 14:07:26.400090298 +0000 UTC m=+6162.387922553" observedRunningTime="2025-10-07 14:07:27.093507921 +0000 UTC m=+6163.081340176" watchObservedRunningTime="2025-10-07 14:07:27.519554613 +0000 UTC m=+6163.507386868" Oct 07 14:07:27 crc kubenswrapper[4854]: I1007 14:07:27.558170 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-78589d7845-lb7r8"] Oct 07 14:07:27 crc kubenswrapper[4854]: I1007 14:07:27.558437 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-78589d7845-lb7r8" podUID="7e937537-3ce6-44a9-80ca-d1378ef14544" containerName="horizon-log" containerID="cri-o://5c91101980872222349f6cba0be8173f4ae268164c902d12efd539b19654d539" gracePeriod=30 Oct 07 14:07:27 crc kubenswrapper[4854]: I1007 14:07:27.559612 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-78589d7845-lb7r8" podUID="7e937537-3ce6-44a9-80ca-d1378ef14544" containerName="horizon" containerID="cri-o://17f219ef799682b0ffa1592f854acfcfbf6d9d555d1fad79287bf1069b1a7b4e" gracePeriod=30 Oct 07 14:07:27 crc kubenswrapper[4854]: I1007 14:07:27.703421 4854 scope.go:117] "RemoveContainer" containerID="4fe613980076ca3ed8559fe0e286e39c2d0d1c3badb8354aa3600a7ada37572b" Oct 07 14:07:27 crc kubenswrapper[4854]: E1007 14:07:27.703691 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:07:28 crc kubenswrapper[4854]: I1007 14:07:28.038384 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-fcbd8b89f-8c6fd" Oct 07 14:07:30 crc kubenswrapper[4854]: I1007 14:07:30.044166 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-chs97"] Oct 07 14:07:30 crc kubenswrapper[4854]: I1007 14:07:30.054975 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-x4q2z"] Oct 07 14:07:30 crc kubenswrapper[4854]: I1007 14:07:30.066910 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-ffkq6"] Oct 07 14:07:30 crc kubenswrapper[4854]: I1007 14:07:30.078853 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-ffkq6"] Oct 07 14:07:30 crc kubenswrapper[4854]: I1007 14:07:30.091469 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-x4q2z"] Oct 07 14:07:30 crc kubenswrapper[4854]: I1007 14:07:30.098842 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-chs97"] Oct 07 14:07:30 crc kubenswrapper[4854]: I1007 14:07:30.724966 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14baabea-8903-4587-bb73-3183643a716f" path="/var/lib/kubelet/pods/14baabea-8903-4587-bb73-3183643a716f/volumes" Oct 07 14:07:30 crc kubenswrapper[4854]: I1007 14:07:30.725890 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5491e135-8a4d-4f3b-b914-d837087b3826" path="/var/lib/kubelet/pods/5491e135-8a4d-4f3b-b914-d837087b3826/volumes" Oct 07 14:07:30 crc kubenswrapper[4854]: I1007 14:07:30.727504 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b39e71d1-d571-4a66-ac83-d4f3a2816fda" path="/var/lib/kubelet/pods/b39e71d1-d571-4a66-ac83-d4f3a2816fda/volumes" Oct 07 14:07:31 crc kubenswrapper[4854]: I1007 14:07:31.064929 4854 generic.go:334] "Generic (PLEG): container finished" podID="7e937537-3ce6-44a9-80ca-d1378ef14544" containerID="17f219ef799682b0ffa1592f854acfcfbf6d9d555d1fad79287bf1069b1a7b4e" exitCode=0 Oct 07 14:07:31 crc kubenswrapper[4854]: I1007 14:07:31.064972 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-78589d7845-lb7r8" event={"ID":"7e937537-3ce6-44a9-80ca-d1378ef14544","Type":"ContainerDied","Data":"17f219ef799682b0ffa1592f854acfcfbf6d9d555d1fad79287bf1069b1a7b4e"} Oct 07 14:07:35 crc kubenswrapper[4854]: I1007 14:07:35.035453 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-6df4b7cd4d-c88k6" Oct 07 14:07:35 crc kubenswrapper[4854]: I1007 14:07:35.047143 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-fcbd8b89f-8c6fd" Oct 07 14:07:35 crc kubenswrapper[4854]: I1007 14:07:35.098974 4854 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-78589d7845-lb7r8" podUID="7e937537-3ce6-44a9-80ca-d1378ef14544" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.115:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.115:8080: connect: connection refused" Oct 07 14:07:40 crc kubenswrapper[4854]: 
I1007 14:07:40.053399 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-02b4-account-create-ws7zm"] Oct 07 14:07:40 crc kubenswrapper[4854]: I1007 14:07:40.067220 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-c7d6-account-create-w5tkr"] Oct 07 14:07:40 crc kubenswrapper[4854]: I1007 14:07:40.086468 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-65a7-account-create-88qrr"] Oct 07 14:07:40 crc kubenswrapper[4854]: I1007 14:07:40.096236 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-c7d6-account-create-w5tkr"] Oct 07 14:07:40 crc kubenswrapper[4854]: I1007 14:07:40.103835 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-02b4-account-create-ws7zm"] Oct 07 14:07:40 crc kubenswrapper[4854]: I1007 14:07:40.109854 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-65a7-account-create-88qrr"] Oct 07 14:07:40 crc kubenswrapper[4854]: I1007 14:07:40.734271 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29949f4f-fe34-418c-9dc7-4468eb0749d4" path="/var/lib/kubelet/pods/29949f4f-fe34-418c-9dc7-4468eb0749d4/volumes" Oct 07 14:07:40 crc kubenswrapper[4854]: I1007 14:07:40.735135 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a445c23d-afd6-466d-8bc4-b47492fab261" path="/var/lib/kubelet/pods/a445c23d-afd6-466d-8bc4-b47492fab261/volumes" Oct 07 14:07:40 crc kubenswrapper[4854]: I1007 14:07:40.735704 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fea85a90-a128-4e5a-b27c-b2cb1fb901d3" path="/var/lib/kubelet/pods/fea85a90-a128-4e5a-b27c-b2cb1fb901d3/volumes" Oct 07 14:07:41 crc kubenswrapper[4854]: I1007 14:07:41.245686 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2qgpz"] Oct 07 14:07:41 crc kubenswrapper[4854]: I1007 14:07:41.247646 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2qgpz" Oct 07 14:07:41 crc kubenswrapper[4854]: I1007 14:07:41.266278 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2qgpz"] Oct 07 14:07:41 crc kubenswrapper[4854]: I1007 14:07:41.323933 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acba990b-7d20-4af9-92ca-7b5571da56a8-catalog-content\") pod \"certified-operators-2qgpz\" (UID: \"acba990b-7d20-4af9-92ca-7b5571da56a8\") " pod="openshift-marketplace/certified-operators-2qgpz" Oct 07 14:07:41 crc kubenswrapper[4854]: I1007 14:07:41.324140 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acba990b-7d20-4af9-92ca-7b5571da56a8-utilities\") pod \"certified-operators-2qgpz\" (UID: \"acba990b-7d20-4af9-92ca-7b5571da56a8\") " pod="openshift-marketplace/certified-operators-2qgpz" Oct 07 14:07:41 crc kubenswrapper[4854]: I1007 14:07:41.324427 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qqqg\" (UniqueName: \"kubernetes.io/projected/acba990b-7d20-4af9-92ca-7b5571da56a8-kube-api-access-2qqqg\") pod \"certified-operators-2qgpz\" (UID: \"acba990b-7d20-4af9-92ca-7b5571da56a8\") " pod="openshift-marketplace/certified-operators-2qgpz" Oct 07 14:07:41 crc kubenswrapper[4854]: I1007 14:07:41.426421 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acba990b-7d20-4af9-92ca-7b5571da56a8-catalog-content\") pod \"certified-operators-2qgpz\" (UID: \"acba990b-7d20-4af9-92ca-7b5571da56a8\") " pod="openshift-marketplace/certified-operators-2qgpz" Oct 07 14:07:41 crc kubenswrapper[4854]: I1007 14:07:41.426577 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acba990b-7d20-4af9-92ca-7b5571da56a8-utilities\") pod \"certified-operators-2qgpz\" (UID: \"acba990b-7d20-4af9-92ca-7b5571da56a8\") " pod="openshift-marketplace/certified-operators-2qgpz" Oct 07 14:07:41 crc kubenswrapper[4854]: I1007 14:07:41.426741 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qqqg\" (UniqueName: \"kubernetes.io/projected/acba990b-7d20-4af9-92ca-7b5571da56a8-kube-api-access-2qqqg\") pod \"certified-operators-2qgpz\" (UID: \"acba990b-7d20-4af9-92ca-7b5571da56a8\") " pod="openshift-marketplace/certified-operators-2qgpz" Oct 07 14:07:41 crc kubenswrapper[4854]: I1007 14:07:41.427066 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acba990b-7d20-4af9-92ca-7b5571da56a8-catalog-content\") pod \"certified-operators-2qgpz\" (UID: \"acba990b-7d20-4af9-92ca-7b5571da56a8\") " pod="openshift-marketplace/certified-operators-2qgpz" Oct 07 14:07:41 crc kubenswrapper[4854]: I1007 14:07:41.427093 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acba990b-7d20-4af9-92ca-7b5571da56a8-utilities\") pod \"certified-operators-2qgpz\" (UID: \"acba990b-7d20-4af9-92ca-7b5571da56a8\") " pod="openshift-marketplace/certified-operators-2qgpz" Oct 07 14:07:41 crc kubenswrapper[4854]: I1007 14:07:41.450408 4854 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-2qqqg\" (UniqueName: \"kubernetes.io/projected/acba990b-7d20-4af9-92ca-7b5571da56a8-kube-api-access-2qqqg\") pod \"certified-operators-2qgpz\" (UID: \"acba990b-7d20-4af9-92ca-7b5571da56a8\") " pod="openshift-marketplace/certified-operators-2qgpz" Oct 07 14:07:41 crc kubenswrapper[4854]: I1007 14:07:41.584555 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2qgpz" Oct 07 14:07:41 crc kubenswrapper[4854]: I1007 14:07:41.704049 4854 scope.go:117] "RemoveContainer" containerID="4fe613980076ca3ed8559fe0e286e39c2d0d1c3badb8354aa3600a7ada37572b" Oct 07 14:07:41 crc kubenswrapper[4854]: E1007 14:07:41.704528 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:07:42 crc kubenswrapper[4854]: I1007 14:07:42.116230 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2qgpz"] Oct 07 14:07:42 crc kubenswrapper[4854]: W1007 14:07:42.124676 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podacba990b_7d20_4af9_92ca_7b5571da56a8.slice/crio-1d7b98e3034365de66a807483f25d1c95aba6f38c4fd636a5b8482dbf9ed3a5b WatchSource:0}: Error finding container 1d7b98e3034365de66a807483f25d1c95aba6f38c4fd636a5b8482dbf9ed3a5b: Status 404 returned error can't find the container with id 1d7b98e3034365de66a807483f25d1c95aba6f38c4fd636a5b8482dbf9ed3a5b Oct 07 14:07:42 crc kubenswrapper[4854]: I1007 14:07:42.202034 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2qgpz" event={"ID":"acba990b-7d20-4af9-92ca-7b5571da56a8","Type":"ContainerStarted","Data":"1d7b98e3034365de66a807483f25d1c95aba6f38c4fd636a5b8482dbf9ed3a5b"} Oct 07 14:07:43 crc kubenswrapper[4854]: I1007 14:07:43.222096 4854 generic.go:334] "Generic (PLEG): container finished" podID="acba990b-7d20-4af9-92ca-7b5571da56a8" containerID="70756dcf60d695dcd995639ac217f6c057966b556acfcd4d1a7fe3878c8d008a" exitCode=0 Oct 07 14:07:43 crc kubenswrapper[4854]: I1007 14:07:43.222207 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2qgpz" event={"ID":"acba990b-7d20-4af9-92ca-7b5571da56a8","Type":"ContainerDied","Data":"70756dcf60d695dcd995639ac217f6c057966b556acfcd4d1a7fe3878c8d008a"} Oct 07 14:07:43 crc kubenswrapper[4854]: I1007 14:07:43.639756 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-79b95db96d-7jdbk" Oct 07 14:07:45 crc kubenswrapper[4854]: I1007 14:07:45.102304 4854 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-78589d7845-lb7r8" podUID="7e937537-3ce6-44a9-80ca-d1378ef14544" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.115:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.115:8080: connect: connection refused" Oct 07 14:07:45 crc kubenswrapper[4854]: I1007 14:07:45.246909 4854 generic.go:334] "Generic (PLEG): container finished" podID="acba990b-7d20-4af9-92ca-7b5571da56a8" 
containerID="f55670ece152e6faa9e997d33799c8b35381d3a8fd038f4cbed1a9ac1dc83ca6" exitCode=0 Oct 07 14:07:45 crc kubenswrapper[4854]: I1007 14:07:45.246966 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2qgpz" event={"ID":"acba990b-7d20-4af9-92ca-7b5571da56a8","Type":"ContainerDied","Data":"f55670ece152e6faa9e997d33799c8b35381d3a8fd038f4cbed1a9ac1dc83ca6"} Oct 07 14:07:47 crc kubenswrapper[4854]: I1007 14:07:47.268919 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2qgpz" event={"ID":"acba990b-7d20-4af9-92ca-7b5571da56a8","Type":"ContainerStarted","Data":"22385486bb8643a26982e4e2371b184a18a4b20535fab62fb9d2b2c80e4ff9e2"} Oct 07 14:07:47 crc kubenswrapper[4854]: I1007 14:07:47.289002 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2qgpz" podStartSLOduration=2.726967616 podStartE2EDuration="6.28898225s" podCreationTimestamp="2025-10-07 14:07:41 +0000 UTC" firstStartedPulling="2025-10-07 14:07:43.227381011 +0000 UTC m=+6179.215213276" lastFinishedPulling="2025-10-07 14:07:46.789395645 +0000 UTC m=+6182.777227910" observedRunningTime="2025-10-07 14:07:47.28584864 +0000 UTC m=+6183.273680905" watchObservedRunningTime="2025-10-07 14:07:47.28898225 +0000 UTC m=+6183.276814505" Oct 07 14:07:50 crc kubenswrapper[4854]: I1007 14:07:50.032429 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-tf2vc"] Oct 07 14:07:50 crc kubenswrapper[4854]: I1007 14:07:50.042483 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-tf2vc"] Oct 07 14:07:50 crc kubenswrapper[4854]: I1007 14:07:50.720298 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91276b96-40b5-468a-a8e4-11446b2f0cfe" path="/var/lib/kubelet/pods/91276b96-40b5-468a-a8e4-11446b2f0cfe/volumes" Oct 07 14:07:51 crc kubenswrapper[4854]: I1007 14:07:51.585675 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2qgpz" Oct 07 14:07:51 crc kubenswrapper[4854]: I1007 14:07:51.585726 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2qgpz" Oct 07 14:07:51 crc kubenswrapper[4854]: I1007 14:07:51.653382 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2qgpz" Oct 07 14:07:52 crc kubenswrapper[4854]: I1007 14:07:52.394799 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2qgpz" Oct 07 14:07:53 crc kubenswrapper[4854]: I1007 14:07:53.702639 4854 scope.go:117] "RemoveContainer" containerID="4fe613980076ca3ed8559fe0e286e39c2d0d1c3badb8354aa3600a7ada37572b" Oct 07 14:07:53 crc kubenswrapper[4854]: E1007 14:07:53.702985 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:07:54 crc kubenswrapper[4854]: I1007 14:07:54.696050 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-2qgpz"] Oct 07 14:07:54 crc kubenswrapper[4854]: I1007 14:07:54.696939 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2qgpz" podUID="acba990b-7d20-4af9-92ca-7b5571da56a8" containerName="registry-server" containerID="cri-o://22385486bb8643a26982e4e2371b184a18a4b20535fab62fb9d2b2c80e4ff9e2" gracePeriod=2 Oct 07 14:07:55 crc kubenswrapper[4854]: I1007 14:07:55.103256 4854 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-78589d7845-lb7r8" podUID="7e937537-3ce6-44a9-80ca-d1378ef14544" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.115:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.115:8080: connect: connection refused" Oct 07 14:07:55 crc kubenswrapper[4854]: I1007 14:07:55.103360 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-78589d7845-lb7r8" Oct 07 14:07:55 crc kubenswrapper[4854]: I1007 14:07:55.190171 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2qgpz" Oct 07 14:07:55 crc kubenswrapper[4854]: I1007 14:07:55.228971 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acba990b-7d20-4af9-92ca-7b5571da56a8-catalog-content\") pod \"acba990b-7d20-4af9-92ca-7b5571da56a8\" (UID: \"acba990b-7d20-4af9-92ca-7b5571da56a8\") " Oct 07 14:07:55 crc kubenswrapper[4854]: I1007 14:07:55.229075 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qqqg\" (UniqueName: \"kubernetes.io/projected/acba990b-7d20-4af9-92ca-7b5571da56a8-kube-api-access-2qqqg\") pod \"acba990b-7d20-4af9-92ca-7b5571da56a8\" (UID: \"acba990b-7d20-4af9-92ca-7b5571da56a8\") " Oct 07 14:07:55 crc kubenswrapper[4854]: I1007 14:07:55.229108 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acba990b-7d20-4af9-92ca-7b5571da56a8-utilities\") pod \"acba990b-7d20-4af9-92ca-7b5571da56a8\" (UID: \"acba990b-7d20-4af9-92ca-7b5571da56a8\") " Oct 07 14:07:55 crc kubenswrapper[4854]: I1007 14:07:55.230114 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acba990b-7d20-4af9-92ca-7b5571da56a8-utilities" (OuterVolumeSpecName: "utilities") pod "acba990b-7d20-4af9-92ca-7b5571da56a8" (UID: "acba990b-7d20-4af9-92ca-7b5571da56a8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:07:55 crc kubenswrapper[4854]: I1007 14:07:55.236602 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acba990b-7d20-4af9-92ca-7b5571da56a8-kube-api-access-2qqqg" (OuterVolumeSpecName: "kube-api-access-2qqqg") pod "acba990b-7d20-4af9-92ca-7b5571da56a8" (UID: "acba990b-7d20-4af9-92ca-7b5571da56a8"). InnerVolumeSpecName "kube-api-access-2qqqg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:07:55 crc kubenswrapper[4854]: I1007 14:07:55.273103 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acba990b-7d20-4af9-92ca-7b5571da56a8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "acba990b-7d20-4af9-92ca-7b5571da56a8" (UID: "acba990b-7d20-4af9-92ca-7b5571da56a8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:07:55 crc kubenswrapper[4854]: I1007 14:07:55.330923 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acba990b-7d20-4af9-92ca-7b5571da56a8-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 14:07:55 crc kubenswrapper[4854]: I1007 14:07:55.330958 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qqqg\" (UniqueName: \"kubernetes.io/projected/acba990b-7d20-4af9-92ca-7b5571da56a8-kube-api-access-2qqqg\") on node \"crc\" DevicePath \"\"" Oct 07 14:07:55 crc kubenswrapper[4854]: I1007 14:07:55.330969 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acba990b-7d20-4af9-92ca-7b5571da56a8-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 14:07:55 crc kubenswrapper[4854]: I1007 14:07:55.372577 4854 generic.go:334] "Generic (PLEG): container finished" podID="acba990b-7d20-4af9-92ca-7b5571da56a8" containerID="22385486bb8643a26982e4e2371b184a18a4b20535fab62fb9d2b2c80e4ff9e2" exitCode=0 Oct 07 14:07:55 crc kubenswrapper[4854]: I1007 14:07:55.372627 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2qgpz" event={"ID":"acba990b-7d20-4af9-92ca-7b5571da56a8","Type":"ContainerDied","Data":"22385486bb8643a26982e4e2371b184a18a4b20535fab62fb9d2b2c80e4ff9e2"} Oct 07 14:07:55 crc kubenswrapper[4854]: I1007 14:07:55.372658 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2qgpz" event={"ID":"acba990b-7d20-4af9-92ca-7b5571da56a8","Type":"ContainerDied","Data":"1d7b98e3034365de66a807483f25d1c95aba6f38c4fd636a5b8482dbf9ed3a5b"} Oct 07 14:07:55 crc kubenswrapper[4854]: I1007 14:07:55.372659 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2qgpz" Oct 07 14:07:55 crc kubenswrapper[4854]: I1007 14:07:55.372674 4854 scope.go:117] "RemoveContainer" containerID="22385486bb8643a26982e4e2371b184a18a4b20535fab62fb9d2b2c80e4ff9e2" Oct 07 14:07:55 crc kubenswrapper[4854]: I1007 14:07:55.407429 4854 scope.go:117] "RemoveContainer" containerID="f55670ece152e6faa9e997d33799c8b35381d3a8fd038f4cbed1a9ac1dc83ca6" Oct 07 14:07:55 crc kubenswrapper[4854]: I1007 14:07:55.412275 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2qgpz"] Oct 07 14:07:55 crc kubenswrapper[4854]: I1007 14:07:55.431617 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2qgpz"] Oct 07 14:07:55 crc kubenswrapper[4854]: I1007 14:07:55.450016 4854 scope.go:117] "RemoveContainer" containerID="70756dcf60d695dcd995639ac217f6c057966b556acfcd4d1a7fe3878c8d008a" Oct 07 14:07:55 crc kubenswrapper[4854]: I1007 14:07:55.496621 4854 scope.go:117] "RemoveContainer" containerID="22385486bb8643a26982e4e2371b184a18a4b20535fab62fb9d2b2c80e4ff9e2" Oct 07 14:07:55 crc kubenswrapper[4854]: E1007 14:07:55.497112 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22385486bb8643a26982e4e2371b184a18a4b20535fab62fb9d2b2c80e4ff9e2\": container with ID starting with 22385486bb8643a26982e4e2371b184a18a4b20535fab62fb9d2b2c80e4ff9e2 not found: ID does not exist" containerID="22385486bb8643a26982e4e2371b184a18a4b20535fab62fb9d2b2c80e4ff9e2" Oct 07 14:07:55 crc kubenswrapper[4854]: I1007 14:07:55.497163 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22385486bb8643a26982e4e2371b184a18a4b20535fab62fb9d2b2c80e4ff9e2"} err="failed to get container status \"22385486bb8643a26982e4e2371b184a18a4b20535fab62fb9d2b2c80e4ff9e2\": rpc error: code = NotFound desc = could not find container \"22385486bb8643a26982e4e2371b184a18a4b20535fab62fb9d2b2c80e4ff9e2\": container with ID starting with 22385486bb8643a26982e4e2371b184a18a4b20535fab62fb9d2b2c80e4ff9e2 not found: ID does not exist" Oct 07 14:07:55 crc kubenswrapper[4854]: I1007 14:07:55.497187 4854 scope.go:117] "RemoveContainer" containerID="f55670ece152e6faa9e997d33799c8b35381d3a8fd038f4cbed1a9ac1dc83ca6" Oct 07 14:07:55 crc kubenswrapper[4854]: E1007 14:07:55.497421 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f55670ece152e6faa9e997d33799c8b35381d3a8fd038f4cbed1a9ac1dc83ca6\": container with ID starting with f55670ece152e6faa9e997d33799c8b35381d3a8fd038f4cbed1a9ac1dc83ca6 not found: ID does not exist" containerID="f55670ece152e6faa9e997d33799c8b35381d3a8fd038f4cbed1a9ac1dc83ca6" Oct 07 14:07:55 crc kubenswrapper[4854]: I1007 14:07:55.497445 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f55670ece152e6faa9e997d33799c8b35381d3a8fd038f4cbed1a9ac1dc83ca6"} err="failed to get container status \"f55670ece152e6faa9e997d33799c8b35381d3a8fd038f4cbed1a9ac1dc83ca6\": rpc error: code = NotFound desc = could not find container \"f55670ece152e6faa9e997d33799c8b35381d3a8fd038f4cbed1a9ac1dc83ca6\": container with ID starting with f55670ece152e6faa9e997d33799c8b35381d3a8fd038f4cbed1a9ac1dc83ca6 not found: ID does not exist" Oct 07 14:07:55 crc kubenswrapper[4854]: I1007 14:07:55.497463 4854 scope.go:117] "RemoveContainer" 
containerID="70756dcf60d695dcd995639ac217f6c057966b556acfcd4d1a7fe3878c8d008a" Oct 07 14:07:55 crc kubenswrapper[4854]: E1007 14:07:55.497875 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70756dcf60d695dcd995639ac217f6c057966b556acfcd4d1a7fe3878c8d008a\": container with ID starting with 70756dcf60d695dcd995639ac217f6c057966b556acfcd4d1a7fe3878c8d008a not found: ID does not exist" containerID="70756dcf60d695dcd995639ac217f6c057966b556acfcd4d1a7fe3878c8d008a" Oct 07 14:07:55 crc kubenswrapper[4854]: I1007 14:07:55.497898 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70756dcf60d695dcd995639ac217f6c057966b556acfcd4d1a7fe3878c8d008a"} err="failed to get container status \"70756dcf60d695dcd995639ac217f6c057966b556acfcd4d1a7fe3878c8d008a\": rpc error: code = NotFound desc = could not find container \"70756dcf60d695dcd995639ac217f6c057966b556acfcd4d1a7fe3878c8d008a\": container with ID starting with 70756dcf60d695dcd995639ac217f6c057966b556acfcd4d1a7fe3878c8d008a not found: ID does not exist" Oct 07 14:07:55 crc kubenswrapper[4854]: I1007 14:07:55.944990 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcwl64"] Oct 07 14:07:55 crc kubenswrapper[4854]: E1007 14:07:55.945670 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acba990b-7d20-4af9-92ca-7b5571da56a8" containerName="registry-server" Oct 07 14:07:55 crc kubenswrapper[4854]: I1007 14:07:55.945766 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="acba990b-7d20-4af9-92ca-7b5571da56a8" containerName="registry-server" Oct 07 14:07:55 crc kubenswrapper[4854]: E1007 14:07:55.945869 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acba990b-7d20-4af9-92ca-7b5571da56a8" containerName="extract-content" Oct 07 14:07:55 crc kubenswrapper[4854]: I1007 14:07:55.945952 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="acba990b-7d20-4af9-92ca-7b5571da56a8" containerName="extract-content" Oct 07 14:07:55 crc kubenswrapper[4854]: E1007 14:07:55.946024 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acba990b-7d20-4af9-92ca-7b5571da56a8" containerName="extract-utilities" Oct 07 14:07:55 crc kubenswrapper[4854]: I1007 14:07:55.946092 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="acba990b-7d20-4af9-92ca-7b5571da56a8" containerName="extract-utilities" Oct 07 14:07:55 crc kubenswrapper[4854]: I1007 14:07:55.946426 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="acba990b-7d20-4af9-92ca-7b5571da56a8" containerName="registry-server" Oct 07 14:07:55 crc kubenswrapper[4854]: I1007 14:07:55.948325 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcwl64" Oct 07 14:07:55 crc kubenswrapper[4854]: I1007 14:07:55.950365 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 07 14:07:55 crc kubenswrapper[4854]: I1007 14:07:55.954719 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcwl64"] Oct 07 14:07:56 crc kubenswrapper[4854]: I1007 14:07:56.047037 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2kmg\" (UniqueName: \"kubernetes.io/projected/8b37d062-3006-456c-8cea-ca674cc3ce32-kube-api-access-g2kmg\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcwl64\" (UID: \"8b37d062-3006-456c-8cea-ca674cc3ce32\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcwl64" Oct 07 14:07:56 crc kubenswrapper[4854]: I1007 14:07:56.047138 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8b37d062-3006-456c-8cea-ca674cc3ce32-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcwl64\" (UID: \"8b37d062-3006-456c-8cea-ca674cc3ce32\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcwl64" Oct 07 14:07:56 crc kubenswrapper[4854]: I1007 14:07:56.047270 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8b37d062-3006-456c-8cea-ca674cc3ce32-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcwl64\" (UID: \"8b37d062-3006-456c-8cea-ca674cc3ce32\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcwl64" Oct 07 14:07:56 crc kubenswrapper[4854]: I1007 14:07:56.149026 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2kmg\" (UniqueName: \"kubernetes.io/projected/8b37d062-3006-456c-8cea-ca674cc3ce32-kube-api-access-g2kmg\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcwl64\" (UID: \"8b37d062-3006-456c-8cea-ca674cc3ce32\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcwl64" Oct 07 14:07:56 crc kubenswrapper[4854]: I1007 14:07:56.149111 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8b37d062-3006-456c-8cea-ca674cc3ce32-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcwl64\" (UID: \"8b37d062-3006-456c-8cea-ca674cc3ce32\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcwl64" Oct 07 14:07:56 crc kubenswrapper[4854]: I1007 14:07:56.149266 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8b37d062-3006-456c-8cea-ca674cc3ce32-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcwl64\" (UID: \"8b37d062-3006-456c-8cea-ca674cc3ce32\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcwl64" Oct 07 14:07:56 crc kubenswrapper[4854]: I1007 14:07:56.149696 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/8b37d062-3006-456c-8cea-ca674cc3ce32-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcwl64\" (UID: \"8b37d062-3006-456c-8cea-ca674cc3ce32\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcwl64" Oct 07 14:07:56 crc kubenswrapper[4854]: I1007 14:07:56.149739 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8b37d062-3006-456c-8cea-ca674cc3ce32-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcwl64\" (UID: \"8b37d062-3006-456c-8cea-ca674cc3ce32\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcwl64" Oct 07 14:07:56 crc kubenswrapper[4854]: I1007 14:07:56.168766 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2kmg\" (UniqueName: \"kubernetes.io/projected/8b37d062-3006-456c-8cea-ca674cc3ce32-kube-api-access-g2kmg\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcwl64\" (UID: \"8b37d062-3006-456c-8cea-ca674cc3ce32\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcwl64" Oct 07 14:07:56 crc kubenswrapper[4854]: I1007 14:07:56.304617 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcwl64" Oct 07 14:07:56 crc kubenswrapper[4854]: I1007 14:07:56.713109 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acba990b-7d20-4af9-92ca-7b5571da56a8" path="/var/lib/kubelet/pods/acba990b-7d20-4af9-92ca-7b5571da56a8/volumes" Oct 07 14:07:56 crc kubenswrapper[4854]: I1007 14:07:56.780667 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcwl64"] Oct 07 14:07:57 crc kubenswrapper[4854]: I1007 14:07:57.397326 4854 generic.go:334] "Generic (PLEG): container finished" podID="8b37d062-3006-456c-8cea-ca674cc3ce32" containerID="371c784448f3293070c5ee1278760ae8d966d978b9359be9248b27550aa2f866" exitCode=0 Oct 07 14:07:57 crc kubenswrapper[4854]: I1007 14:07:57.397373 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcwl64" event={"ID":"8b37d062-3006-456c-8cea-ca674cc3ce32","Type":"ContainerDied","Data":"371c784448f3293070c5ee1278760ae8d966d978b9359be9248b27550aa2f866"} Oct 07 14:07:57 crc kubenswrapper[4854]: I1007 14:07:57.397401 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcwl64" event={"ID":"8b37d062-3006-456c-8cea-ca674cc3ce32","Type":"ContainerStarted","Data":"f578e576edb2c4abaa4be1ee288e1a0a8e562924e21e848aff27c30d1dfdd88d"} Oct 07 14:07:57 crc kubenswrapper[4854]: I1007 14:07:57.994390 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-78589d7845-lb7r8" Oct 07 14:07:58 crc kubenswrapper[4854]: I1007 14:07:58.097716 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7e937537-3ce6-44a9-80ca-d1378ef14544-config-data\") pod \"7e937537-3ce6-44a9-80ca-d1378ef14544\" (UID: \"7e937537-3ce6-44a9-80ca-d1378ef14544\") " Oct 07 14:07:58 crc kubenswrapper[4854]: I1007 14:07:58.097791 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7e937537-3ce6-44a9-80ca-d1378ef14544-horizon-secret-key\") pod \"7e937537-3ce6-44a9-80ca-d1378ef14544\" (UID: \"7e937537-3ce6-44a9-80ca-d1378ef14544\") " Oct 07 14:07:58 crc kubenswrapper[4854]: I1007 14:07:58.097833 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e937537-3ce6-44a9-80ca-d1378ef14544-logs\") pod \"7e937537-3ce6-44a9-80ca-d1378ef14544\" (UID: \"7e937537-3ce6-44a9-80ca-d1378ef14544\") " Oct 07 14:07:58 crc kubenswrapper[4854]: I1007 14:07:58.097857 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlth4\" (UniqueName: \"kubernetes.io/projected/7e937537-3ce6-44a9-80ca-d1378ef14544-kube-api-access-hlth4\") pod \"7e937537-3ce6-44a9-80ca-d1378ef14544\" (UID: \"7e937537-3ce6-44a9-80ca-d1378ef14544\") " Oct 07 14:07:58 crc kubenswrapper[4854]: I1007 14:07:58.097934 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7e937537-3ce6-44a9-80ca-d1378ef14544-scripts\") pod \"7e937537-3ce6-44a9-80ca-d1378ef14544\" (UID: \"7e937537-3ce6-44a9-80ca-d1378ef14544\") " Oct 07 14:07:58 crc kubenswrapper[4854]: I1007 14:07:58.098403 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e937537-3ce6-44a9-80ca-d1378ef14544-logs" (OuterVolumeSpecName: "logs") pod "7e937537-3ce6-44a9-80ca-d1378ef14544" (UID: "7e937537-3ce6-44a9-80ca-d1378ef14544"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:07:58 crc kubenswrapper[4854]: I1007 14:07:58.098874 4854 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e937537-3ce6-44a9-80ca-d1378ef14544-logs\") on node \"crc\" DevicePath \"\"" Oct 07 14:07:58 crc kubenswrapper[4854]: I1007 14:07:58.103250 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e937537-3ce6-44a9-80ca-d1378ef14544-kube-api-access-hlth4" (OuterVolumeSpecName: "kube-api-access-hlth4") pod "7e937537-3ce6-44a9-80ca-d1378ef14544" (UID: "7e937537-3ce6-44a9-80ca-d1378ef14544"). InnerVolumeSpecName "kube-api-access-hlth4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:07:58 crc kubenswrapper[4854]: I1007 14:07:58.103707 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e937537-3ce6-44a9-80ca-d1378ef14544-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "7e937537-3ce6-44a9-80ca-d1378ef14544" (UID: "7e937537-3ce6-44a9-80ca-d1378ef14544"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:07:58 crc kubenswrapper[4854]: I1007 14:07:58.123971 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e937537-3ce6-44a9-80ca-d1378ef14544-scripts" (OuterVolumeSpecName: "scripts") pod "7e937537-3ce6-44a9-80ca-d1378ef14544" (UID: "7e937537-3ce6-44a9-80ca-d1378ef14544"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:07:58 crc kubenswrapper[4854]: I1007 14:07:58.129420 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e937537-3ce6-44a9-80ca-d1378ef14544-config-data" (OuterVolumeSpecName: "config-data") pod "7e937537-3ce6-44a9-80ca-d1378ef14544" (UID: "7e937537-3ce6-44a9-80ca-d1378ef14544"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:07:58 crc kubenswrapper[4854]: I1007 14:07:58.201140 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7e937537-3ce6-44a9-80ca-d1378ef14544-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:07:58 crc kubenswrapper[4854]: I1007 14:07:58.201195 4854 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7e937537-3ce6-44a9-80ca-d1378ef14544-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 07 14:07:58 crc kubenswrapper[4854]: I1007 14:07:58.201208 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hlth4\" (UniqueName: \"kubernetes.io/projected/7e937537-3ce6-44a9-80ca-d1378ef14544-kube-api-access-hlth4\") on node \"crc\" DevicePath \"\"" Oct 07 14:07:58 crc kubenswrapper[4854]: I1007 14:07:58.201219 4854 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7e937537-3ce6-44a9-80ca-d1378ef14544-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 14:07:58 crc kubenswrapper[4854]: I1007 14:07:58.410064 4854 generic.go:334] "Generic (PLEG): container finished" podID="7e937537-3ce6-44a9-80ca-d1378ef14544" containerID="5c91101980872222349f6cba0be8173f4ae268164c902d12efd539b19654d539" exitCode=137 Oct 07 14:07:58 crc kubenswrapper[4854]: I1007 14:07:58.410106 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-78589d7845-lb7r8" event={"ID":"7e937537-3ce6-44a9-80ca-d1378ef14544","Type":"ContainerDied","Data":"5c91101980872222349f6cba0be8173f4ae268164c902d12efd539b19654d539"} Oct 07 14:07:58 crc kubenswrapper[4854]: I1007 14:07:58.410131 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-78589d7845-lb7r8" event={"ID":"7e937537-3ce6-44a9-80ca-d1378ef14544","Type":"ContainerDied","Data":"26aaf87732eb451a590d9d04f7fa38adb2d3e8db1d0199ea57191f1208c44cbd"} Oct 07 14:07:58 crc kubenswrapper[4854]: I1007 14:07:58.410150 4854 scope.go:117] "RemoveContainer" containerID="17f219ef799682b0ffa1592f854acfcfbf6d9d555d1fad79287bf1069b1a7b4e" Oct 07 14:07:58 crc kubenswrapper[4854]: I1007 14:07:58.410274 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-78589d7845-lb7r8" Oct 07 14:07:58 crc kubenswrapper[4854]: I1007 14:07:58.467028 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-78589d7845-lb7r8"] Oct 07 14:07:58 crc kubenswrapper[4854]: I1007 14:07:58.473904 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-78589d7845-lb7r8"] Oct 07 14:07:58 crc kubenswrapper[4854]: I1007 14:07:58.589930 4854 scope.go:117] "RemoveContainer" containerID="5c91101980872222349f6cba0be8173f4ae268164c902d12efd539b19654d539" Oct 07 14:07:58 crc kubenswrapper[4854]: I1007 14:07:58.720954 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e937537-3ce6-44a9-80ca-d1378ef14544" path="/var/lib/kubelet/pods/7e937537-3ce6-44a9-80ca-d1378ef14544/volumes" Oct 07 14:07:58 crc kubenswrapper[4854]: I1007 14:07:58.744043 4854 scope.go:117] "RemoveContainer" containerID="17f219ef799682b0ffa1592f854acfcfbf6d9d555d1fad79287bf1069b1a7b4e" Oct 07 14:07:58 crc kubenswrapper[4854]: E1007 14:07:58.744498 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17f219ef799682b0ffa1592f854acfcfbf6d9d555d1fad79287bf1069b1a7b4e\": container with ID starting with 17f219ef799682b0ffa1592f854acfcfbf6d9d555d1fad79287bf1069b1a7b4e not found: ID does not exist" containerID="17f219ef799682b0ffa1592f854acfcfbf6d9d555d1fad79287bf1069b1a7b4e" Oct 07 14:07:58 crc kubenswrapper[4854]: I1007 14:07:58.744558 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17f219ef799682b0ffa1592f854acfcfbf6d9d555d1fad79287bf1069b1a7b4e"} err="failed to get container status \"17f219ef799682b0ffa1592f854acfcfbf6d9d555d1fad79287bf1069b1a7b4e\": rpc error: code = NotFound desc = could not find container \"17f219ef799682b0ffa1592f854acfcfbf6d9d555d1fad79287bf1069b1a7b4e\": container with ID starting with 17f219ef799682b0ffa1592f854acfcfbf6d9d555d1fad79287bf1069b1a7b4e not found: ID does not exist" Oct 07 14:07:58 crc kubenswrapper[4854]: I1007 14:07:58.744585 4854 scope.go:117] "RemoveContainer" containerID="5c91101980872222349f6cba0be8173f4ae268164c902d12efd539b19654d539" Oct 07 14:07:58 crc kubenswrapper[4854]: E1007 14:07:58.744871 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c91101980872222349f6cba0be8173f4ae268164c902d12efd539b19654d539\": container with ID starting with 5c91101980872222349f6cba0be8173f4ae268164c902d12efd539b19654d539 not found: ID does not exist" containerID="5c91101980872222349f6cba0be8173f4ae268164c902d12efd539b19654d539" Oct 07 14:07:58 crc kubenswrapper[4854]: I1007 14:07:58.744907 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c91101980872222349f6cba0be8173f4ae268164c902d12efd539b19654d539"} err="failed to get container status \"5c91101980872222349f6cba0be8173f4ae268164c902d12efd539b19654d539\": rpc error: code = NotFound desc = could not find container \"5c91101980872222349f6cba0be8173f4ae268164c902d12efd539b19654d539\": container with ID starting with 5c91101980872222349f6cba0be8173f4ae268164c902d12efd539b19654d539 not found: ID does not exist" Oct 07 14:07:59 crc kubenswrapper[4854]: I1007 14:07:59.424953 4854 generic.go:334] "Generic (PLEG): container finished" podID="8b37d062-3006-456c-8cea-ca674cc3ce32" containerID="3505d1cc9752a0e26401842a4efcd928e4dc2e89637fdda7fb1769eee0dd5c2a" exitCode=0 Oct 07 
14:07:59 crc kubenswrapper[4854]: I1007 14:07:59.425034 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcwl64" event={"ID":"8b37d062-3006-456c-8cea-ca674cc3ce32","Type":"ContainerDied","Data":"3505d1cc9752a0e26401842a4efcd928e4dc2e89637fdda7fb1769eee0dd5c2a"} Oct 07 14:08:00 crc kubenswrapper[4854]: I1007 14:08:00.442080 4854 generic.go:334] "Generic (PLEG): container finished" podID="8b37d062-3006-456c-8cea-ca674cc3ce32" containerID="4cf1ae4969ca04749d7831991480e615250f05bb8abcd88dae6c182a3b910bcc" exitCode=0 Oct 07 14:08:00 crc kubenswrapper[4854]: I1007 14:08:00.442198 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcwl64" event={"ID":"8b37d062-3006-456c-8cea-ca674cc3ce32","Type":"ContainerDied","Data":"4cf1ae4969ca04749d7831991480e615250f05bb8abcd88dae6c182a3b910bcc"} Oct 07 14:08:01 crc kubenswrapper[4854]: I1007 14:08:01.836929 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcwl64" Oct 07 14:08:01 crc kubenswrapper[4854]: I1007 14:08:01.997035 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2kmg\" (UniqueName: \"kubernetes.io/projected/8b37d062-3006-456c-8cea-ca674cc3ce32-kube-api-access-g2kmg\") pod \"8b37d062-3006-456c-8cea-ca674cc3ce32\" (UID: \"8b37d062-3006-456c-8cea-ca674cc3ce32\") " Oct 07 14:08:01 crc kubenswrapper[4854]: I1007 14:08:01.997434 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8b37d062-3006-456c-8cea-ca674cc3ce32-util\") pod \"8b37d062-3006-456c-8cea-ca674cc3ce32\" (UID: \"8b37d062-3006-456c-8cea-ca674cc3ce32\") " Oct 07 14:08:01 crc kubenswrapper[4854]: I1007 14:08:01.997541 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8b37d062-3006-456c-8cea-ca674cc3ce32-bundle\") pod \"8b37d062-3006-456c-8cea-ca674cc3ce32\" (UID: \"8b37d062-3006-456c-8cea-ca674cc3ce32\") " Oct 07 14:08:01 crc kubenswrapper[4854]: I1007 14:08:01.999817 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b37d062-3006-456c-8cea-ca674cc3ce32-bundle" (OuterVolumeSpecName: "bundle") pod "8b37d062-3006-456c-8cea-ca674cc3ce32" (UID: "8b37d062-3006-456c-8cea-ca674cc3ce32"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:08:02 crc kubenswrapper[4854]: I1007 14:08:02.005280 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b37d062-3006-456c-8cea-ca674cc3ce32-kube-api-access-g2kmg" (OuterVolumeSpecName: "kube-api-access-g2kmg") pod "8b37d062-3006-456c-8cea-ca674cc3ce32" (UID: "8b37d062-3006-456c-8cea-ca674cc3ce32"). InnerVolumeSpecName "kube-api-access-g2kmg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:08:02 crc kubenswrapper[4854]: I1007 14:08:02.013832 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b37d062-3006-456c-8cea-ca674cc3ce32-util" (OuterVolumeSpecName: "util") pod "8b37d062-3006-456c-8cea-ca674cc3ce32" (UID: "8b37d062-3006-456c-8cea-ca674cc3ce32"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:08:02 crc kubenswrapper[4854]: I1007 14:08:02.100189 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2kmg\" (UniqueName: \"kubernetes.io/projected/8b37d062-3006-456c-8cea-ca674cc3ce32-kube-api-access-g2kmg\") on node \"crc\" DevicePath \"\"" Oct 07 14:08:02 crc kubenswrapper[4854]: I1007 14:08:02.100554 4854 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8b37d062-3006-456c-8cea-ca674cc3ce32-util\") on node \"crc\" DevicePath \"\"" Oct 07 14:08:02 crc kubenswrapper[4854]: I1007 14:08:02.100875 4854 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8b37d062-3006-456c-8cea-ca674cc3ce32-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:08:02 crc kubenswrapper[4854]: I1007 14:08:02.467879 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcwl64" event={"ID":"8b37d062-3006-456c-8cea-ca674cc3ce32","Type":"ContainerDied","Data":"f578e576edb2c4abaa4be1ee288e1a0a8e562924e21e848aff27c30d1dfdd88d"} Oct 07 14:08:02 crc kubenswrapper[4854]: I1007 14:08:02.467959 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcwl64" Oct 07 14:08:02 crc kubenswrapper[4854]: I1007 14:08:02.467964 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f578e576edb2c4abaa4be1ee288e1a0a8e562924e21e848aff27c30d1dfdd88d" Oct 07 14:08:05 crc kubenswrapper[4854]: I1007 14:08:05.703081 4854 scope.go:117] "RemoveContainer" containerID="4fe613980076ca3ed8559fe0e286e39c2d0d1c3badb8354aa3600a7ada37572b" Oct 07 14:08:05 crc kubenswrapper[4854]: E1007 14:08:05.703901 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:08:09 crc kubenswrapper[4854]: I1007 14:08:09.089437 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-866fh"] Oct 07 14:08:09 crc kubenswrapper[4854]: I1007 14:08:09.096891 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-866fh"] Oct 07 14:08:09 crc kubenswrapper[4854]: I1007 14:08:09.120364 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-mk6pz"] Oct 07 14:08:09 crc kubenswrapper[4854]: I1007 14:08:09.128116 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-mk6pz"] Oct 07 14:08:10 crc kubenswrapper[4854]: I1007 14:08:10.716597 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a7655f0-c07c-4ca3-9074-449f647da536" path="/var/lib/kubelet/pods/4a7655f0-c07c-4ca3-9074-449f647da536/volumes" Oct 07 14:08:10 crc kubenswrapper[4854]: I1007 14:08:10.717751 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="931987ae-eef0-4720-99ba-2b538e98726b" path="/var/lib/kubelet/pods/931987ae-eef0-4720-99ba-2b538e98726b/volumes" Oct 07 14:08:12 crc kubenswrapper[4854]: I1007 
14:08:12.139603 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-s6sn8"] Oct 07 14:08:12 crc kubenswrapper[4854]: E1007 14:08:12.140365 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b37d062-3006-456c-8cea-ca674cc3ce32" containerName="pull" Oct 07 14:08:12 crc kubenswrapper[4854]: I1007 14:08:12.140381 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b37d062-3006-456c-8cea-ca674cc3ce32" containerName="pull" Oct 07 14:08:12 crc kubenswrapper[4854]: E1007 14:08:12.140396 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b37d062-3006-456c-8cea-ca674cc3ce32" containerName="extract" Oct 07 14:08:12 crc kubenswrapper[4854]: I1007 14:08:12.140404 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b37d062-3006-456c-8cea-ca674cc3ce32" containerName="extract" Oct 07 14:08:12 crc kubenswrapper[4854]: E1007 14:08:12.140425 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e937537-3ce6-44a9-80ca-d1378ef14544" containerName="horizon-log" Oct 07 14:08:12 crc kubenswrapper[4854]: I1007 14:08:12.140434 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e937537-3ce6-44a9-80ca-d1378ef14544" containerName="horizon-log" Oct 07 14:08:12 crc kubenswrapper[4854]: E1007 14:08:12.140458 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e937537-3ce6-44a9-80ca-d1378ef14544" containerName="horizon" Oct 07 14:08:12 crc kubenswrapper[4854]: I1007 14:08:12.140467 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e937537-3ce6-44a9-80ca-d1378ef14544" containerName="horizon" Oct 07 14:08:12 crc kubenswrapper[4854]: E1007 14:08:12.140483 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b37d062-3006-456c-8cea-ca674cc3ce32" containerName="util" Oct 07 14:08:12 crc kubenswrapper[4854]: I1007 14:08:12.140492 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b37d062-3006-456c-8cea-ca674cc3ce32" containerName="util" Oct 07 14:08:12 crc kubenswrapper[4854]: I1007 14:08:12.140737 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e937537-3ce6-44a9-80ca-d1378ef14544" containerName="horizon-log" Oct 07 14:08:12 crc kubenswrapper[4854]: I1007 14:08:12.140758 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e937537-3ce6-44a9-80ca-d1378ef14544" containerName="horizon" Oct 07 14:08:12 crc kubenswrapper[4854]: I1007 14:08:12.140776 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b37d062-3006-456c-8cea-ca674cc3ce32" containerName="extract" Oct 07 14:08:12 crc kubenswrapper[4854]: I1007 14:08:12.141579 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-s6sn8" Oct 07 14:08:12 crc kubenswrapper[4854]: I1007 14:08:12.148447 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-lmg4p" Oct 07 14:08:12 crc kubenswrapper[4854]: I1007 14:08:12.149471 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Oct 07 14:08:12 crc kubenswrapper[4854]: I1007 14:08:12.152372 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Oct 07 14:08:12 crc kubenswrapper[4854]: I1007 14:08:12.160735 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-s6sn8"] Oct 07 14:08:12 crc kubenswrapper[4854]: I1007 14:08:12.212357 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px72n\" (UniqueName: \"kubernetes.io/projected/1dd6e29f-49fc-4584-9861-48432332df45-kube-api-access-px72n\") pod \"obo-prometheus-operator-7c8cf85677-s6sn8\" (UID: \"1dd6e29f-49fc-4584-9861-48432332df45\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-s6sn8" Oct 07 14:08:12 crc kubenswrapper[4854]: I1007 14:08:12.264090 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-895d8d84b-ln6ld"] Oct 07 14:08:12 crc kubenswrapper[4854]: I1007 14:08:12.265523 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-895d8d84b-ln6ld" Oct 07 14:08:12 crc kubenswrapper[4854]: I1007 14:08:12.268798 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-b2sng" Oct 07 14:08:12 crc kubenswrapper[4854]: I1007 14:08:12.269082 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Oct 07 14:08:12 crc kubenswrapper[4854]: I1007 14:08:12.290252 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-895d8d84b-ln6ld"] Oct 07 14:08:12 crc kubenswrapper[4854]: I1007 14:08:12.305078 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-895d8d84b-kv5q9"] Oct 07 14:08:12 crc kubenswrapper[4854]: I1007 14:08:12.307063 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-895d8d84b-kv5q9" Oct 07 14:08:12 crc kubenswrapper[4854]: I1007 14:08:12.314816 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-px72n\" (UniqueName: \"kubernetes.io/projected/1dd6e29f-49fc-4584-9861-48432332df45-kube-api-access-px72n\") pod \"obo-prometheus-operator-7c8cf85677-s6sn8\" (UID: \"1dd6e29f-49fc-4584-9861-48432332df45\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-s6sn8" Oct 07 14:08:12 crc kubenswrapper[4854]: I1007 14:08:12.362291 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-895d8d84b-kv5q9"] Oct 07 14:08:12 crc kubenswrapper[4854]: I1007 14:08:12.381992 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-px72n\" (UniqueName: \"kubernetes.io/projected/1dd6e29f-49fc-4584-9861-48432332df45-kube-api-access-px72n\") pod \"obo-prometheus-operator-7c8cf85677-s6sn8\" (UID: \"1dd6e29f-49fc-4584-9861-48432332df45\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-s6sn8" Oct 07 14:08:12 crc kubenswrapper[4854]: I1007 14:08:12.417397 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3675a20e-0013-48cc-9adb-7baf69a36ac4-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-895d8d84b-ln6ld\" (UID: \"3675a20e-0013-48cc-9adb-7baf69a36ac4\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-895d8d84b-ln6ld" Oct 07 14:08:12 crc kubenswrapper[4854]: I1007 14:08:12.417493 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3675a20e-0013-48cc-9adb-7baf69a36ac4-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-895d8d84b-ln6ld\" (UID: \"3675a20e-0013-48cc-9adb-7baf69a36ac4\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-895d8d84b-ln6ld" Oct 07 14:08:12 crc kubenswrapper[4854]: I1007 14:08:12.417520 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/561c80f6-2f43-4922-8413-9f7ae61b0814-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-895d8d84b-kv5q9\" (UID: \"561c80f6-2f43-4922-8413-9f7ae61b0814\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-895d8d84b-kv5q9" Oct 07 14:08:12 crc kubenswrapper[4854]: I1007 14:08:12.417588 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/561c80f6-2f43-4922-8413-9f7ae61b0814-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-895d8d84b-kv5q9\" (UID: \"561c80f6-2f43-4922-8413-9f7ae61b0814\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-895d8d84b-kv5q9" Oct 07 14:08:12 crc kubenswrapper[4854]: I1007 14:08:12.470987 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-s6sn8" Oct 07 14:08:12 crc kubenswrapper[4854]: I1007 14:08:12.471949 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-2mdwc"] Oct 07 14:08:12 crc kubenswrapper[4854]: I1007 14:08:12.473303 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-2mdwc" Oct 07 14:08:12 crc kubenswrapper[4854]: I1007 14:08:12.481275 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Oct 07 14:08:12 crc kubenswrapper[4854]: I1007 14:08:12.481429 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-w5xdw" Oct 07 14:08:12 crc kubenswrapper[4854]: I1007 14:08:12.516674 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-2mdwc"] Oct 07 14:08:12 crc kubenswrapper[4854]: I1007 14:08:12.519102 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/561c80f6-2f43-4922-8413-9f7ae61b0814-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-895d8d84b-kv5q9\" (UID: \"561c80f6-2f43-4922-8413-9f7ae61b0814\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-895d8d84b-kv5q9" Oct 07 14:08:12 crc kubenswrapper[4854]: I1007 14:08:12.519249 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcf7v\" (UniqueName: \"kubernetes.io/projected/1e1a5094-5947-48c7-ad37-e5e8de5c8459-kube-api-access-kcf7v\") pod \"observability-operator-cc5f78dfc-2mdwc\" (UID: \"1e1a5094-5947-48c7-ad37-e5e8de5c8459\") " pod="openshift-operators/observability-operator-cc5f78dfc-2mdwc" Oct 07 14:08:12 crc kubenswrapper[4854]: I1007 14:08:12.519300 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3675a20e-0013-48cc-9adb-7baf69a36ac4-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-895d8d84b-ln6ld\" (UID: \"3675a20e-0013-48cc-9adb-7baf69a36ac4\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-895d8d84b-ln6ld" Oct 07 14:08:12 crc kubenswrapper[4854]: I1007 14:08:12.519370 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3675a20e-0013-48cc-9adb-7baf69a36ac4-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-895d8d84b-ln6ld\" (UID: \"3675a20e-0013-48cc-9adb-7baf69a36ac4\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-895d8d84b-ln6ld" Oct 07 14:08:12 crc kubenswrapper[4854]: I1007 14:08:12.519397 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/1e1a5094-5947-48c7-ad37-e5e8de5c8459-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-2mdwc\" (UID: \"1e1a5094-5947-48c7-ad37-e5e8de5c8459\") " pod="openshift-operators/observability-operator-cc5f78dfc-2mdwc" Oct 07 14:08:12 crc kubenswrapper[4854]: I1007 14:08:12.519415 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/561c80f6-2f43-4922-8413-9f7ae61b0814-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-895d8d84b-kv5q9\" (UID: \"561c80f6-2f43-4922-8413-9f7ae61b0814\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-895d8d84b-kv5q9" Oct 07 14:08:12 crc kubenswrapper[4854]: I1007 14:08:12.532390 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/561c80f6-2f43-4922-8413-9f7ae61b0814-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-895d8d84b-kv5q9\" (UID: \"561c80f6-2f43-4922-8413-9f7ae61b0814\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-895d8d84b-kv5q9" Oct 07 14:08:12 crc kubenswrapper[4854]: I1007 14:08:12.537088 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3675a20e-0013-48cc-9adb-7baf69a36ac4-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-895d8d84b-ln6ld\" (UID: \"3675a20e-0013-48cc-9adb-7baf69a36ac4\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-895d8d84b-ln6ld" Oct 07 14:08:12 crc kubenswrapper[4854]: I1007 14:08:12.542058 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/561c80f6-2f43-4922-8413-9f7ae61b0814-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-895d8d84b-kv5q9\" (UID: \"561c80f6-2f43-4922-8413-9f7ae61b0814\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-895d8d84b-kv5q9" Oct 07 14:08:12 crc kubenswrapper[4854]: I1007 14:08:12.544165 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3675a20e-0013-48cc-9adb-7baf69a36ac4-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-895d8d84b-ln6ld\" (UID: \"3675a20e-0013-48cc-9adb-7baf69a36ac4\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-895d8d84b-ln6ld" Oct 07 14:08:12 crc kubenswrapper[4854]: I1007 14:08:12.584100 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-895d8d84b-ln6ld" Oct 07 14:08:12 crc kubenswrapper[4854]: I1007 14:08:12.623476 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-895d8d84b-kv5q9" Oct 07 14:08:12 crc kubenswrapper[4854]: I1007 14:08:12.623950 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/1e1a5094-5947-48c7-ad37-e5e8de5c8459-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-2mdwc\" (UID: \"1e1a5094-5947-48c7-ad37-e5e8de5c8459\") " pod="openshift-operators/observability-operator-cc5f78dfc-2mdwc" Oct 07 14:08:12 crc kubenswrapper[4854]: I1007 14:08:12.624087 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcf7v\" (UniqueName: \"kubernetes.io/projected/1e1a5094-5947-48c7-ad37-e5e8de5c8459-kube-api-access-kcf7v\") pod \"observability-operator-cc5f78dfc-2mdwc\" (UID: \"1e1a5094-5947-48c7-ad37-e5e8de5c8459\") " pod="openshift-operators/observability-operator-cc5f78dfc-2mdwc" Oct 07 14:08:12 crc kubenswrapper[4854]: I1007 14:08:12.633667 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/1e1a5094-5947-48c7-ad37-e5e8de5c8459-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-2mdwc\" (UID: \"1e1a5094-5947-48c7-ad37-e5e8de5c8459\") " pod="openshift-operators/observability-operator-cc5f78dfc-2mdwc" Oct 07 14:08:12 crc kubenswrapper[4854]: I1007 14:08:12.681806 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcf7v\" (UniqueName: \"kubernetes.io/projected/1e1a5094-5947-48c7-ad37-e5e8de5c8459-kube-api-access-kcf7v\") pod \"observability-operator-cc5f78dfc-2mdwc\" (UID: \"1e1a5094-5947-48c7-ad37-e5e8de5c8459\") " pod="openshift-operators/observability-operator-cc5f78dfc-2mdwc" Oct 07 14:08:12 crc kubenswrapper[4854]: I1007 14:08:12.743376 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-9h565"] Oct 07 14:08:12 crc kubenswrapper[4854]: I1007 14:08:12.744575 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-9h565" Oct 07 14:08:12 crc kubenswrapper[4854]: I1007 14:08:12.753063 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-kv786" Oct 07 14:08:12 crc kubenswrapper[4854]: I1007 14:08:12.762995 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-9h565"] Oct 07 14:08:12 crc kubenswrapper[4854]: I1007 14:08:12.821577 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-2mdwc" Oct 07 14:08:12 crc kubenswrapper[4854]: I1007 14:08:12.844996 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9bsr\" (UniqueName: \"kubernetes.io/projected/5a39c021-c124-4cc5-a90b-1a86531f6143-kube-api-access-p9bsr\") pod \"perses-operator-54bc95c9fb-9h565\" (UID: \"5a39c021-c124-4cc5-a90b-1a86531f6143\") " pod="openshift-operators/perses-operator-54bc95c9fb-9h565" Oct 07 14:08:12 crc kubenswrapper[4854]: I1007 14:08:12.845203 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/5a39c021-c124-4cc5-a90b-1a86531f6143-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-9h565\" (UID: \"5a39c021-c124-4cc5-a90b-1a86531f6143\") " pod="openshift-operators/perses-operator-54bc95c9fb-9h565" Oct 07 14:08:12 crc kubenswrapper[4854]: I1007 14:08:12.948329 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9bsr\" (UniqueName: \"kubernetes.io/projected/5a39c021-c124-4cc5-a90b-1a86531f6143-kube-api-access-p9bsr\") pod \"perses-operator-54bc95c9fb-9h565\" (UID: \"5a39c021-c124-4cc5-a90b-1a86531f6143\") " pod="openshift-operators/perses-operator-54bc95c9fb-9h565" Oct 07 14:08:12 crc kubenswrapper[4854]: I1007 14:08:12.949589 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/5a39c021-c124-4cc5-a90b-1a86531f6143-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-9h565\" (UID: \"5a39c021-c124-4cc5-a90b-1a86531f6143\") " pod="openshift-operators/perses-operator-54bc95c9fb-9h565" Oct 07 14:08:12 crc kubenswrapper[4854]: I1007 14:08:12.950490 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/5a39c021-c124-4cc5-a90b-1a86531f6143-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-9h565\" (UID: \"5a39c021-c124-4cc5-a90b-1a86531f6143\") " pod="openshift-operators/perses-operator-54bc95c9fb-9h565" Oct 07 14:08:12 crc kubenswrapper[4854]: I1007 14:08:12.972374 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9bsr\" (UniqueName: \"kubernetes.io/projected/5a39c021-c124-4cc5-a90b-1a86531f6143-kube-api-access-p9bsr\") pod \"perses-operator-54bc95c9fb-9h565\" (UID: \"5a39c021-c124-4cc5-a90b-1a86531f6143\") " pod="openshift-operators/perses-operator-54bc95c9fb-9h565" Oct 07 14:08:13 crc kubenswrapper[4854]: I1007 14:08:13.087099 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-9h565" Oct 07 14:08:13 crc kubenswrapper[4854]: I1007 14:08:13.191090 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-s6sn8"] Oct 07 14:08:13 crc kubenswrapper[4854]: I1007 14:08:13.364616 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-895d8d84b-kv5q9"] Oct 07 14:08:13 crc kubenswrapper[4854]: I1007 14:08:13.382437 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-895d8d84b-ln6ld"] Oct 07 14:08:13 crc kubenswrapper[4854]: I1007 14:08:13.593358 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-895d8d84b-kv5q9" event={"ID":"561c80f6-2f43-4922-8413-9f7ae61b0814","Type":"ContainerStarted","Data":"ee2927ceb3941a0fd89e0f1530f285b69580abd4ed9524e13b174d26a29f0d20"} Oct 07 14:08:13 crc kubenswrapper[4854]: I1007 14:08:13.600800 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-895d8d84b-ln6ld" event={"ID":"3675a20e-0013-48cc-9adb-7baf69a36ac4","Type":"ContainerStarted","Data":"9bd4c78fe7ff8a9aa28fddcba19b482d627c6b1cee49603cbe2345347077af84"} Oct 07 14:08:13 crc kubenswrapper[4854]: I1007 14:08:13.605910 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-s6sn8" event={"ID":"1dd6e29f-49fc-4584-9861-48432332df45","Type":"ContainerStarted","Data":"1a5a81c1463adfb8646edee5c17ba3d562ebb8a9e00cd6c9455787081f036d36"} Oct 07 14:08:13 crc kubenswrapper[4854]: I1007 14:08:13.678854 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-2mdwc"] Oct 07 14:08:13 crc kubenswrapper[4854]: I1007 14:08:13.857989 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-9h565"] Oct 07 14:08:14 crc kubenswrapper[4854]: I1007 14:08:14.617787 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-2mdwc" event={"ID":"1e1a5094-5947-48c7-ad37-e5e8de5c8459","Type":"ContainerStarted","Data":"e3829bc0a764deea669338adb286f8102a87b8e8f01eb2508e52879ea9e23573"} Oct 07 14:08:14 crc kubenswrapper[4854]: I1007 14:08:14.620075 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-54bc95c9fb-9h565" event={"ID":"5a39c021-c124-4cc5-a90b-1a86531f6143","Type":"ContainerStarted","Data":"24ce98b9d69d098dfc94e5a6ee1c9ffedd4c1f8effc2a00961c086a51ae966bf"} Oct 07 14:08:20 crc kubenswrapper[4854]: I1007 14:08:20.706670 4854 scope.go:117] "RemoveContainer" containerID="4fe613980076ca3ed8559fe0e286e39c2d0d1c3badb8354aa3600a7ada37572b" Oct 07 14:08:20 crc kubenswrapper[4854]: E1007 14:08:20.707589 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:08:22 crc kubenswrapper[4854]: I1007 14:08:22.719505 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operators/observability-operator-cc5f78dfc-2mdwc" event={"ID":"1e1a5094-5947-48c7-ad37-e5e8de5c8459","Type":"ContainerStarted","Data":"3ae1f1f9ba93e42cc93b0d8941ef79402876244c43786b5c83416189eced9c37"} Oct 07 14:08:22 crc kubenswrapper[4854]: I1007 14:08:22.719924 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-cc5f78dfc-2mdwc" Oct 07 14:08:22 crc kubenswrapper[4854]: I1007 14:08:22.721947 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-s6sn8" event={"ID":"1dd6e29f-49fc-4584-9861-48432332df45","Type":"ContainerStarted","Data":"78b47502c1d3f13fdd2f5364ceeacf757c31d7383677b018863cfd64384ca569"} Oct 07 14:08:22 crc kubenswrapper[4854]: I1007 14:08:22.723653 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-895d8d84b-kv5q9" event={"ID":"561c80f6-2f43-4922-8413-9f7ae61b0814","Type":"ContainerStarted","Data":"86afb23a41ecdca9c7c21d00e173cd0e5fea878f58898cfe21414a6ddbd5bcfe"} Oct 07 14:08:22 crc kubenswrapper[4854]: I1007 14:08:22.725060 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-54bc95c9fb-9h565" event={"ID":"5a39c021-c124-4cc5-a90b-1a86531f6143","Type":"ContainerStarted","Data":"4b926a2ee361aec1f154b54e8c48cbfbfd8d86834d34b1179c6d9d40f8c1d678"} Oct 07 14:08:22 crc kubenswrapper[4854]: I1007 14:08:22.725170 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-54bc95c9fb-9h565" Oct 07 14:08:22 crc kubenswrapper[4854]: I1007 14:08:22.726453 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-895d8d84b-ln6ld" event={"ID":"3675a20e-0013-48cc-9adb-7baf69a36ac4","Type":"ContainerStarted","Data":"84dfab3c636332f4d1aa027df6b9a7d40e6d79e5ca39f75533858f64fd4105b2"} Oct 07 14:08:22 crc kubenswrapper[4854]: I1007 14:08:22.740993 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-cc5f78dfc-2mdwc" Oct 07 14:08:22 crc kubenswrapper[4854]: I1007 14:08:22.747641 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-cc5f78dfc-2mdwc" podStartSLOduration=2.65173383 podStartE2EDuration="10.747621781s" podCreationTimestamp="2025-10-07 14:08:12 +0000 UTC" firstStartedPulling="2025-10-07 14:08:13.677850178 +0000 UTC m=+6209.665682423" lastFinishedPulling="2025-10-07 14:08:21.773738119 +0000 UTC m=+6217.761570374" observedRunningTime="2025-10-07 14:08:22.737130539 +0000 UTC m=+6218.724962804" watchObservedRunningTime="2025-10-07 14:08:22.747621781 +0000 UTC m=+6218.735454036" Oct 07 14:08:22 crc kubenswrapper[4854]: I1007 14:08:22.765399 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-895d8d84b-ln6ld" podStartSLOduration=2.441957783 podStartE2EDuration="10.765373181s" podCreationTimestamp="2025-10-07 14:08:12 +0000 UTC" firstStartedPulling="2025-10-07 14:08:13.41516356 +0000 UTC m=+6209.402995805" lastFinishedPulling="2025-10-07 14:08:21.738578948 +0000 UTC m=+6217.726411203" observedRunningTime="2025-10-07 14:08:22.757595647 +0000 UTC m=+6218.745427902" watchObservedRunningTime="2025-10-07 14:08:22.765373181 +0000 UTC m=+6218.753205436" Oct 07 14:08:22 crc kubenswrapper[4854]: I1007 14:08:22.794701 
4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-s6sn8" podStartSLOduration=2.283581942 podStartE2EDuration="10.794685223s" podCreationTimestamp="2025-10-07 14:08:12 +0000 UTC" firstStartedPulling="2025-10-07 14:08:13.230881065 +0000 UTC m=+6209.218713320" lastFinishedPulling="2025-10-07 14:08:21.741984346 +0000 UTC m=+6217.729816601" observedRunningTime="2025-10-07 14:08:22.792549961 +0000 UTC m=+6218.780382216" watchObservedRunningTime="2025-10-07 14:08:22.794685223 +0000 UTC m=+6218.782517478" Oct 07 14:08:22 crc kubenswrapper[4854]: I1007 14:08:22.819852 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-895d8d84b-kv5q9" podStartSLOduration=2.427557868 podStartE2EDuration="10.819836085s" podCreationTimestamp="2025-10-07 14:08:12 +0000 UTC" firstStartedPulling="2025-10-07 14:08:13.381494603 +0000 UTC m=+6209.369326858" lastFinishedPulling="2025-10-07 14:08:21.77377282 +0000 UTC m=+6217.761605075" observedRunningTime="2025-10-07 14:08:22.815729198 +0000 UTC m=+6218.803561453" watchObservedRunningTime="2025-10-07 14:08:22.819836085 +0000 UTC m=+6218.807668340" Oct 07 14:08:22 crc kubenswrapper[4854]: I1007 14:08:22.846288 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-54bc95c9fb-9h565" podStartSLOduration=2.970103827 podStartE2EDuration="10.846268635s" podCreationTimestamp="2025-10-07 14:08:12 +0000 UTC" firstStartedPulling="2025-10-07 14:08:13.864245133 +0000 UTC m=+6209.852077388" lastFinishedPulling="2025-10-07 14:08:21.740409941 +0000 UTC m=+6217.728242196" observedRunningTime="2025-10-07 14:08:22.837536404 +0000 UTC m=+6218.825368659" watchObservedRunningTime="2025-10-07 14:08:22.846268635 +0000 UTC m=+6218.834100910" Oct 07 14:08:23 crc kubenswrapper[4854]: I1007 14:08:23.036073 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-7jltj"] Oct 07 14:08:23 crc kubenswrapper[4854]: I1007 14:08:23.044691 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-7jltj"] Oct 07 14:08:24 crc kubenswrapper[4854]: I1007 14:08:24.600504 4854 scope.go:117] "RemoveContainer" containerID="7cae3b88b9c7a9cc363911f17a67d7217527ff9f01af5c23ac170030384adae5" Oct 07 14:08:24 crc kubenswrapper[4854]: I1007 14:08:24.627102 4854 scope.go:117] "RemoveContainer" containerID="47a262d42eeb5b9f85dd35bcceb4918ee64bdf7a0d248afef269dc9c7999272b" Oct 07 14:08:24 crc kubenswrapper[4854]: I1007 14:08:24.685746 4854 scope.go:117] "RemoveContainer" containerID="2e783d4b8d5383605800e1c8e6ea4a86935e417c49a7dc3493ad122f4a0c171e" Oct 07 14:08:24 crc kubenswrapper[4854]: I1007 14:08:24.727344 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24483653-65d3-4679-b34c-c25b9a30a6cc" path="/var/lib/kubelet/pods/24483653-65d3-4679-b34c-c25b9a30a6cc/volumes" Oct 07 14:08:24 crc kubenswrapper[4854]: I1007 14:08:24.757450 4854 scope.go:117] "RemoveContainer" containerID="b4ed2e1c5d94007508f04b23ef0de26b29d3ba95195164bbf76397bb026525a0" Oct 07 14:08:24 crc kubenswrapper[4854]: I1007 14:08:24.798802 4854 scope.go:117] "RemoveContainer" containerID="bcf8dc0d93e3aeda3f9ef4dc9c53fb2af324c36e6e24b46342f879499bf4b7ca" Oct 07 14:08:24 crc kubenswrapper[4854]: I1007 14:08:24.924803 4854 scope.go:117] "RemoveContainer" containerID="64dc2353957cd2136fe7e081d1d2716580400493d4919029ba9f566824dc836e" Oct 07 
14:08:25 crc kubenswrapper[4854]: I1007 14:08:25.002568 4854 scope.go:117] "RemoveContainer" containerID="0cde039c2e3cb974a2412ea453914ae9d2562a76caa6294b3ca89ee1cdb69039" Oct 07 14:08:25 crc kubenswrapper[4854]: I1007 14:08:25.072317 4854 scope.go:117] "RemoveContainer" containerID="cca28208affe024a95ef54b2966477e7abcd6ed9f4ab6feea62378dcae7c014f" Oct 07 14:08:25 crc kubenswrapper[4854]: I1007 14:08:25.112785 4854 scope.go:117] "RemoveContainer" containerID="7e9fae7d0ed33c260a17f3617aaaf076c4863385974f314669ae18e9c647b54c" Oct 07 14:08:25 crc kubenswrapper[4854]: I1007 14:08:25.135896 4854 scope.go:117] "RemoveContainer" containerID="97695d80808032506527452ea15f07d07327d3c990087dc5ec39d10f030b9c42" Oct 07 14:08:31 crc kubenswrapper[4854]: I1007 14:08:31.702658 4854 scope.go:117] "RemoveContainer" containerID="4fe613980076ca3ed8559fe0e286e39c2d0d1c3badb8354aa3600a7ada37572b" Oct 07 14:08:31 crc kubenswrapper[4854]: E1007 14:08:31.703451 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:08:33 crc kubenswrapper[4854]: I1007 14:08:33.090053 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-54bc95c9fb-9h565" Oct 07 14:08:35 crc kubenswrapper[4854]: I1007 14:08:35.576628 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Oct 07 14:08:35 crc kubenswrapper[4854]: I1007 14:08:35.577356 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="66c67541-4e54-48ab-96dd-7d904d0ff7d2" containerName="openstackclient" containerID="cri-o://7a04548110984e439eb546bdcce8e1598dbda5abb0f426cacc29230e34833f37" gracePeriod=2 Oct 07 14:08:35 crc kubenswrapper[4854]: I1007 14:08:35.593443 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Oct 07 14:08:35 crc kubenswrapper[4854]: I1007 14:08:35.686525 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 07 14:08:35 crc kubenswrapper[4854]: E1007 14:08:35.687073 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66c67541-4e54-48ab-96dd-7d904d0ff7d2" containerName="openstackclient" Oct 07 14:08:35 crc kubenswrapper[4854]: I1007 14:08:35.687095 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="66c67541-4e54-48ab-96dd-7d904d0ff7d2" containerName="openstackclient" Oct 07 14:08:35 crc kubenswrapper[4854]: I1007 14:08:35.687348 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="66c67541-4e54-48ab-96dd-7d904d0ff7d2" containerName="openstackclient" Oct 07 14:08:35 crc kubenswrapper[4854]: I1007 14:08:35.688217 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 07 14:08:35 crc kubenswrapper[4854]: I1007 14:08:35.694913 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 07 14:08:35 crc kubenswrapper[4854]: I1007 14:08:35.747414 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/00a568f6-04ab-4d3c-a92f-6e4d35532950-openstack-config\") pod \"openstackclient\" (UID: \"00a568f6-04ab-4d3c-a92f-6e4d35532950\") " pod="openstack/openstackclient" Oct 07 14:08:35 crc kubenswrapper[4854]: I1007 14:08:35.747488 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sr9k4\" (UniqueName: \"kubernetes.io/projected/00a568f6-04ab-4d3c-a92f-6e4d35532950-kube-api-access-sr9k4\") pod \"openstackclient\" (UID: \"00a568f6-04ab-4d3c-a92f-6e4d35532950\") " pod="openstack/openstackclient" Oct 07 14:08:35 crc kubenswrapper[4854]: I1007 14:08:35.747560 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/00a568f6-04ab-4d3c-a92f-6e4d35532950-openstack-config-secret\") pod \"openstackclient\" (UID: \"00a568f6-04ab-4d3c-a92f-6e4d35532950\") " pod="openstack/openstackclient" Oct 07 14:08:35 crc kubenswrapper[4854]: I1007 14:08:35.851516 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/00a568f6-04ab-4d3c-a92f-6e4d35532950-openstack-config\") pod \"openstackclient\" (UID: \"00a568f6-04ab-4d3c-a92f-6e4d35532950\") " pod="openstack/openstackclient" Oct 07 14:08:35 crc kubenswrapper[4854]: I1007 14:08:35.851589 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sr9k4\" (UniqueName: \"kubernetes.io/projected/00a568f6-04ab-4d3c-a92f-6e4d35532950-kube-api-access-sr9k4\") pod \"openstackclient\" (UID: \"00a568f6-04ab-4d3c-a92f-6e4d35532950\") " pod="openstack/openstackclient" Oct 07 14:08:35 crc kubenswrapper[4854]: I1007 14:08:35.851672 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/00a568f6-04ab-4d3c-a92f-6e4d35532950-openstack-config-secret\") pod \"openstackclient\" (UID: \"00a568f6-04ab-4d3c-a92f-6e4d35532950\") " pod="openstack/openstackclient" Oct 07 14:08:35 crc kubenswrapper[4854]: I1007 14:08:35.852496 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/00a568f6-04ab-4d3c-a92f-6e4d35532950-openstack-config\") pod \"openstackclient\" (UID: \"00a568f6-04ab-4d3c-a92f-6e4d35532950\") " pod="openstack/openstackclient" Oct 07 14:08:35 crc kubenswrapper[4854]: I1007 14:08:35.861816 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/00a568f6-04ab-4d3c-a92f-6e4d35532950-openstack-config-secret\") pod \"openstackclient\" (UID: \"00a568f6-04ab-4d3c-a92f-6e4d35532950\") " pod="openstack/openstackclient" Oct 07 14:08:35 crc kubenswrapper[4854]: I1007 14:08:35.875832 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sr9k4\" (UniqueName: \"kubernetes.io/projected/00a568f6-04ab-4d3c-a92f-6e4d35532950-kube-api-access-sr9k4\") pod \"openstackclient\" (UID: 
\"00a568f6-04ab-4d3c-a92f-6e4d35532950\") " pod="openstack/openstackclient" Oct 07 14:08:35 crc kubenswrapper[4854]: I1007 14:08:35.968455 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 07 14:08:35 crc kubenswrapper[4854]: I1007 14:08:35.970292 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 07 14:08:35 crc kubenswrapper[4854]: I1007 14:08:35.973459 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-64prw" Oct 07 14:08:35 crc kubenswrapper[4854]: I1007 14:08:35.984751 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 07 14:08:36 crc kubenswrapper[4854]: I1007 14:08:36.009753 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 07 14:08:36 crc kubenswrapper[4854]: I1007 14:08:36.067770 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wp5db\" (UniqueName: \"kubernetes.io/projected/c33e1e79-872b-4328-a544-f34779689934-kube-api-access-wp5db\") pod \"kube-state-metrics-0\" (UID: \"c33e1e79-872b-4328-a544-f34779689934\") " pod="openstack/kube-state-metrics-0" Oct 07 14:08:36 crc kubenswrapper[4854]: I1007 14:08:36.172185 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wp5db\" (UniqueName: \"kubernetes.io/projected/c33e1e79-872b-4328-a544-f34779689934-kube-api-access-wp5db\") pod \"kube-state-metrics-0\" (UID: \"c33e1e79-872b-4328-a544-f34779689934\") " pod="openstack/kube-state-metrics-0" Oct 07 14:08:36 crc kubenswrapper[4854]: I1007 14:08:36.206571 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wp5db\" (UniqueName: \"kubernetes.io/projected/c33e1e79-872b-4328-a544-f34779689934-kube-api-access-wp5db\") pod \"kube-state-metrics-0\" (UID: \"c33e1e79-872b-4328-a544-f34779689934\") " pod="openstack/kube-state-metrics-0" Oct 07 14:08:36 crc kubenswrapper[4854]: I1007 14:08:36.296396 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 07 14:08:36 crc kubenswrapper[4854]: I1007 14:08:36.855666 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Oct 07 14:08:36 crc kubenswrapper[4854]: I1007 14:08:36.868326 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Oct 07 14:08:36 crc kubenswrapper[4854]: I1007 14:08:36.880466 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Oct 07 14:08:36 crc kubenswrapper[4854]: I1007 14:08:36.880716 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-k6hdv" Oct 07 14:08:36 crc kubenswrapper[4854]: I1007 14:08:36.880843 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Oct 07 14:08:36 crc kubenswrapper[4854]: I1007 14:08:36.880949 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config" Oct 07 14:08:36 crc kubenswrapper[4854]: I1007 14:08:36.888595 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Oct 07 14:08:36 crc kubenswrapper[4854]: I1007 14:08:36.965811 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cc17a7f7-bac6-4b57-bf18-1f3110b14f29-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"cc17a7f7-bac6-4b57-bf18-1f3110b14f29\") " pod="openstack/alertmanager-metric-storage-0" Oct 07 14:08:36 crc kubenswrapper[4854]: I1007 14:08:36.965871 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6w7s\" (UniqueName: \"kubernetes.io/projected/cc17a7f7-bac6-4b57-bf18-1f3110b14f29-kube-api-access-j6w7s\") pod \"alertmanager-metric-storage-0\" (UID: \"cc17a7f7-bac6-4b57-bf18-1f3110b14f29\") " pod="openstack/alertmanager-metric-storage-0" Oct 07 14:08:36 crc kubenswrapper[4854]: I1007 14:08:36.965965 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cc17a7f7-bac6-4b57-bf18-1f3110b14f29-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"cc17a7f7-bac6-4b57-bf18-1f3110b14f29\") " pod="openstack/alertmanager-metric-storage-0" Oct 07 14:08:36 crc kubenswrapper[4854]: I1007 14:08:36.966010 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cc17a7f7-bac6-4b57-bf18-1f3110b14f29-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"cc17a7f7-bac6-4b57-bf18-1f3110b14f29\") " pod="openstack/alertmanager-metric-storage-0" Oct 07 14:08:36 crc kubenswrapper[4854]: I1007 14:08:36.966053 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/cc17a7f7-bac6-4b57-bf18-1f3110b14f29-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"cc17a7f7-bac6-4b57-bf18-1f3110b14f29\") " pod="openstack/alertmanager-metric-storage-0" Oct 07 14:08:36 crc kubenswrapper[4854]: I1007 14:08:36.966102 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/cc17a7f7-bac6-4b57-bf18-1f3110b14f29-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"cc17a7f7-bac6-4b57-bf18-1f3110b14f29\") " pod="openstack/alertmanager-metric-storage-0" Oct 07 14:08:37 crc kubenswrapper[4854]: I1007 14:08:37.069090 4854 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cc17a7f7-bac6-4b57-bf18-1f3110b14f29-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"cc17a7f7-bac6-4b57-bf18-1f3110b14f29\") " pod="openstack/alertmanager-metric-storage-0" Oct 07 14:08:37 crc kubenswrapper[4854]: I1007 14:08:37.069441 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cc17a7f7-bac6-4b57-bf18-1f3110b14f29-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"cc17a7f7-bac6-4b57-bf18-1f3110b14f29\") " pod="openstack/alertmanager-metric-storage-0" Oct 07 14:08:37 crc kubenswrapper[4854]: I1007 14:08:37.069489 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/cc17a7f7-bac6-4b57-bf18-1f3110b14f29-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"cc17a7f7-bac6-4b57-bf18-1f3110b14f29\") " pod="openstack/alertmanager-metric-storage-0" Oct 07 14:08:37 crc kubenswrapper[4854]: I1007 14:08:37.069539 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/cc17a7f7-bac6-4b57-bf18-1f3110b14f29-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"cc17a7f7-bac6-4b57-bf18-1f3110b14f29\") " pod="openstack/alertmanager-metric-storage-0" Oct 07 14:08:37 crc kubenswrapper[4854]: I1007 14:08:37.069599 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cc17a7f7-bac6-4b57-bf18-1f3110b14f29-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"cc17a7f7-bac6-4b57-bf18-1f3110b14f29\") " pod="openstack/alertmanager-metric-storage-0" Oct 07 14:08:37 crc kubenswrapper[4854]: I1007 14:08:37.069619 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6w7s\" (UniqueName: \"kubernetes.io/projected/cc17a7f7-bac6-4b57-bf18-1f3110b14f29-kube-api-access-j6w7s\") pod \"alertmanager-metric-storage-0\" (UID: \"cc17a7f7-bac6-4b57-bf18-1f3110b14f29\") " pod="openstack/alertmanager-metric-storage-0" Oct 07 14:08:37 crc kubenswrapper[4854]: I1007 14:08:37.070211 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/cc17a7f7-bac6-4b57-bf18-1f3110b14f29-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"cc17a7f7-bac6-4b57-bf18-1f3110b14f29\") " pod="openstack/alertmanager-metric-storage-0" Oct 07 14:08:37 crc kubenswrapper[4854]: I1007 14:08:37.099512 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6w7s\" (UniqueName: \"kubernetes.io/projected/cc17a7f7-bac6-4b57-bf18-1f3110b14f29-kube-api-access-j6w7s\") pod \"alertmanager-metric-storage-0\" (UID: \"cc17a7f7-bac6-4b57-bf18-1f3110b14f29\") " pod="openstack/alertmanager-metric-storage-0" Oct 07 14:08:37 crc kubenswrapper[4854]: I1007 14:08:37.117178 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cc17a7f7-bac6-4b57-bf18-1f3110b14f29-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"cc17a7f7-bac6-4b57-bf18-1f3110b14f29\") " pod="openstack/alertmanager-metric-storage-0" Oct 07 14:08:37 crc kubenswrapper[4854]: I1007 14:08:37.117703 4854 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cc17a7f7-bac6-4b57-bf18-1f3110b14f29-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"cc17a7f7-bac6-4b57-bf18-1f3110b14f29\") " pod="openstack/alertmanager-metric-storage-0" Oct 07 14:08:37 crc kubenswrapper[4854]: I1007 14:08:37.126692 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/cc17a7f7-bac6-4b57-bf18-1f3110b14f29-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"cc17a7f7-bac6-4b57-bf18-1f3110b14f29\") " pod="openstack/alertmanager-metric-storage-0" Oct 07 14:08:37 crc kubenswrapper[4854]: I1007 14:08:37.130831 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cc17a7f7-bac6-4b57-bf18-1f3110b14f29-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"cc17a7f7-bac6-4b57-bf18-1f3110b14f29\") " pod="openstack/alertmanager-metric-storage-0" Oct 07 14:08:37 crc kubenswrapper[4854]: I1007 14:08:37.157782 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 07 14:08:37 crc kubenswrapper[4854]: I1007 14:08:37.275262 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Oct 07 14:08:37 crc kubenswrapper[4854]: I1007 14:08:37.472437 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 07 14:08:37 crc kubenswrapper[4854]: I1007 14:08:37.485241 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 07 14:08:37 crc kubenswrapper[4854]: I1007 14:08:37.487848 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 07 14:08:37 crc kubenswrapper[4854]: I1007 14:08:37.493035 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 07 14:08:37 crc kubenswrapper[4854]: I1007 14:08:37.494818 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Oct 07 14:08:37 crc kubenswrapper[4854]: I1007 14:08:37.494874 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-kljtf" Oct 07 14:08:37 crc kubenswrapper[4854]: I1007 14:08:37.495044 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Oct 07 14:08:37 crc kubenswrapper[4854]: I1007 14:08:37.495184 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Oct 07 14:08:37 crc kubenswrapper[4854]: I1007 14:08:37.495309 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Oct 07 14:08:37 crc kubenswrapper[4854]: I1007 14:08:37.495047 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Oct 07 14:08:37 crc kubenswrapper[4854]: I1007 14:08:37.581498 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3edb391a-9ddf-4fed-bc20-51d79f783380-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"3edb391a-9ddf-4fed-bc20-51d79f783380\") " pod="openstack/prometheus-metric-storage-0" Oct 07 14:08:37 crc kubenswrapper[4854]: I1007 14:08:37.581613 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/3edb391a-9ddf-4fed-bc20-51d79f783380-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"3edb391a-9ddf-4fed-bc20-51d79f783380\") " pod="openstack/prometheus-metric-storage-0" Oct 07 14:08:37 crc kubenswrapper[4854]: I1007 14:08:37.581677 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-f5eaf04b-f2d5-4a38-8c49-e5f14f77b52c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f5eaf04b-f2d5-4a38-8c49-e5f14f77b52c\") pod \"prometheus-metric-storage-0\" (UID: \"3edb391a-9ddf-4fed-bc20-51d79f783380\") " pod="openstack/prometheus-metric-storage-0" Oct 07 14:08:37 crc kubenswrapper[4854]: I1007 14:08:37.581702 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3edb391a-9ddf-4fed-bc20-51d79f783380-config\") pod \"prometheus-metric-storage-0\" (UID: \"3edb391a-9ddf-4fed-bc20-51d79f783380\") " pod="openstack/prometheus-metric-storage-0" Oct 07 14:08:37 crc kubenswrapper[4854]: I1007 14:08:37.581784 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/3edb391a-9ddf-4fed-bc20-51d79f783380-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"3edb391a-9ddf-4fed-bc20-51d79f783380\") " pod="openstack/prometheus-metric-storage-0" Oct 07 14:08:37 crc kubenswrapper[4854]: I1007 14:08:37.581827 4854 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3edb391a-9ddf-4fed-bc20-51d79f783380-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"3edb391a-9ddf-4fed-bc20-51d79f783380\") " pod="openstack/prometheus-metric-storage-0" Oct 07 14:08:37 crc kubenswrapper[4854]: I1007 14:08:37.581907 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3edb391a-9ddf-4fed-bc20-51d79f783380-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"3edb391a-9ddf-4fed-bc20-51d79f783380\") " pod="openstack/prometheus-metric-storage-0" Oct 07 14:08:37 crc kubenswrapper[4854]: I1007 14:08:37.581953 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stk8h\" (UniqueName: \"kubernetes.io/projected/3edb391a-9ddf-4fed-bc20-51d79f783380-kube-api-access-stk8h\") pod \"prometheus-metric-storage-0\" (UID: \"3edb391a-9ddf-4fed-bc20-51d79f783380\") " pod="openstack/prometheus-metric-storage-0" Oct 07 14:08:37 crc kubenswrapper[4854]: I1007 14:08:37.686393 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/3edb391a-9ddf-4fed-bc20-51d79f783380-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"3edb391a-9ddf-4fed-bc20-51d79f783380\") " pod="openstack/prometheus-metric-storage-0" Oct 07 14:08:37 crc kubenswrapper[4854]: I1007 14:08:37.686795 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3edb391a-9ddf-4fed-bc20-51d79f783380-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"3edb391a-9ddf-4fed-bc20-51d79f783380\") " pod="openstack/prometheus-metric-storage-0" Oct 07 14:08:37 crc kubenswrapper[4854]: I1007 14:08:37.686949 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3edb391a-9ddf-4fed-bc20-51d79f783380-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"3edb391a-9ddf-4fed-bc20-51d79f783380\") " pod="openstack/prometheus-metric-storage-0" Oct 07 14:08:37 crc kubenswrapper[4854]: I1007 14:08:37.687013 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stk8h\" (UniqueName: \"kubernetes.io/projected/3edb391a-9ddf-4fed-bc20-51d79f783380-kube-api-access-stk8h\") pod \"prometheus-metric-storage-0\" (UID: \"3edb391a-9ddf-4fed-bc20-51d79f783380\") " pod="openstack/prometheus-metric-storage-0" Oct 07 14:08:37 crc kubenswrapper[4854]: I1007 14:08:37.687074 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3edb391a-9ddf-4fed-bc20-51d79f783380-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"3edb391a-9ddf-4fed-bc20-51d79f783380\") " pod="openstack/prometheus-metric-storage-0" Oct 07 14:08:37 crc kubenswrapper[4854]: I1007 14:08:37.687169 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/3edb391a-9ddf-4fed-bc20-51d79f783380-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"3edb391a-9ddf-4fed-bc20-51d79f783380\") " pod="openstack/prometheus-metric-storage-0" Oct 07 14:08:37 
crc kubenswrapper[4854]: I1007 14:08:37.687563 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-f5eaf04b-f2d5-4a38-8c49-e5f14f77b52c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f5eaf04b-f2d5-4a38-8c49-e5f14f77b52c\") pod \"prometheus-metric-storage-0\" (UID: \"3edb391a-9ddf-4fed-bc20-51d79f783380\") " pod="openstack/prometheus-metric-storage-0" Oct 07 14:08:37 crc kubenswrapper[4854]: I1007 14:08:37.687609 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3edb391a-9ddf-4fed-bc20-51d79f783380-config\") pod \"prometheus-metric-storage-0\" (UID: \"3edb391a-9ddf-4fed-bc20-51d79f783380\") " pod="openstack/prometheus-metric-storage-0" Oct 07 14:08:37 crc kubenswrapper[4854]: I1007 14:08:37.688865 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/3edb391a-9ddf-4fed-bc20-51d79f783380-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"3edb391a-9ddf-4fed-bc20-51d79f783380\") " pod="openstack/prometheus-metric-storage-0" Oct 07 14:08:37 crc kubenswrapper[4854]: I1007 14:08:37.697644 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3edb391a-9ddf-4fed-bc20-51d79f783380-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"3edb391a-9ddf-4fed-bc20-51d79f783380\") " pod="openstack/prometheus-metric-storage-0" Oct 07 14:08:37 crc kubenswrapper[4854]: I1007 14:08:37.698326 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/3edb391a-9ddf-4fed-bc20-51d79f783380-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"3edb391a-9ddf-4fed-bc20-51d79f783380\") " pod="openstack/prometheus-metric-storage-0" Oct 07 14:08:37 crc kubenswrapper[4854]: I1007 14:08:37.699643 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/3edb391a-9ddf-4fed-bc20-51d79f783380-config\") pod \"prometheus-metric-storage-0\" (UID: \"3edb391a-9ddf-4fed-bc20-51d79f783380\") " pod="openstack/prometheus-metric-storage-0" Oct 07 14:08:37 crc kubenswrapper[4854]: I1007 14:08:37.705104 4854 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
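[editor's note] The csi_attacher entry above skips the MountDevice (NodeStageVolume) step because the kubevirt.io.hostpath-provisioner node plugin does not advertise the STAGE_UNSTAGE_VOLUME capability; kubelet then records MountDevice as succeeded and goes straight to MountVolume.SetUp (NodePublishVolume), as the next entries show. A minimal, hypothetical Go sketch of that decision using the public CSI spec bindings — the nodeServer type and hasStageUnstage helper are illustrative, not the provisioner's or kubelet's actual code:

package main

import (
	"context"
	"fmt"

	"github.com/container-storage-interface/spec/lib/go/csi"
)

// nodeServer stands in for a CSI node plugin that performs no staging step.
type nodeServer struct{}

// NodeGetCapabilities deliberately returns no STAGE_UNSTAGE_VOLUME capability,
// which is the condition the kubelet log line above reports.
func (n *nodeServer) NodeGetCapabilities(ctx context.Context, req *csi.NodeGetCapabilitiesRequest) (*csi.NodeGetCapabilitiesResponse, error) {
	return &csi.NodeGetCapabilitiesResponse{Capabilities: []*csi.NodeServiceCapability{}}, nil
}

// hasStageUnstage mirrors the capability check kubelet performs before MountDevice.
func hasStageUnstage(resp *csi.NodeGetCapabilitiesResponse) bool {
	for _, c := range resp.GetCapabilities() {
		if c.GetRpc().GetType() == csi.NodeServiceCapability_RPC_STAGE_UNSTAGE_VOLUME {
			return true
		}
	}
	return false
}

func main() {
	ns := &nodeServer{}
	resp, _ := ns.NodeGetCapabilities(context.Background(), &csi.NodeGetCapabilitiesRequest{})
	if !hasStageUnstage(resp) {
		// Matches the kubelet message logged above before it proceeds to NodePublishVolume.
		fmt.Println("STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...")
	}
}

[end editor's note]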
Oct 07 14:08:37 crc kubenswrapper[4854]: I1007 14:08:37.705163 4854 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-f5eaf04b-f2d5-4a38-8c49-e5f14f77b52c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f5eaf04b-f2d5-4a38-8c49-e5f14f77b52c\") pod \"prometheus-metric-storage-0\" (UID: \"3edb391a-9ddf-4fed-bc20-51d79f783380\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/04b134550aa1f985ddce7c5b07caeeda315cad504a5041362fc3c5d9ff3b829f/globalmount\"" pod="openstack/prometheus-metric-storage-0" Oct 07 14:08:37 crc kubenswrapper[4854]: I1007 14:08:37.718622 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3edb391a-9ddf-4fed-bc20-51d79f783380-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"3edb391a-9ddf-4fed-bc20-51d79f783380\") " pod="openstack/prometheus-metric-storage-0" Oct 07 14:08:37 crc kubenswrapper[4854]: I1007 14:08:37.735035 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3edb391a-9ddf-4fed-bc20-51d79f783380-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"3edb391a-9ddf-4fed-bc20-51d79f783380\") " pod="openstack/prometheus-metric-storage-0" Oct 07 14:08:37 crc kubenswrapper[4854]: I1007 14:08:37.735257 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stk8h\" (UniqueName: \"kubernetes.io/projected/3edb391a-9ddf-4fed-bc20-51d79f783380-kube-api-access-stk8h\") pod \"prometheus-metric-storage-0\" (UID: \"3edb391a-9ddf-4fed-bc20-51d79f783380\") " pod="openstack/prometheus-metric-storage-0" Oct 07 14:08:37 crc kubenswrapper[4854]: I1007 14:08:37.798124 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-f5eaf04b-f2d5-4a38-8c49-e5f14f77b52c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f5eaf04b-f2d5-4a38-8c49-e5f14f77b52c\") pod \"prometheus-metric-storage-0\" (UID: \"3edb391a-9ddf-4fed-bc20-51d79f783380\") " pod="openstack/prometheus-metric-storage-0" Oct 07 14:08:37 crc kubenswrapper[4854]: I1007 14:08:37.854391 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 07 14:08:37 crc kubenswrapper[4854]: I1007 14:08:37.885469 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"00a568f6-04ab-4d3c-a92f-6e4d35532950","Type":"ContainerStarted","Data":"fbd39b7d9e0a697aba99917ced3a56d0a00db396f5ce883cade99e0a6e1cbacf"} Oct 07 14:08:37 crc kubenswrapper[4854]: I1007 14:08:37.885521 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"00a568f6-04ab-4d3c-a92f-6e4d35532950","Type":"ContainerStarted","Data":"97c63f81174994a6717801bb8bf78df0cc524b776940532a2f3e3f51e2218287"} Oct 07 14:08:37 crc kubenswrapper[4854]: I1007 14:08:37.888127 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c33e1e79-872b-4328-a544-f34779689934","Type":"ContainerStarted","Data":"e0d5883aeae7a1e7fa4297c0f8b7543be6869d71b14181b1c469d58d2e8326ca"} Oct 07 14:08:37 crc kubenswrapper[4854]: I1007 14:08:37.890239 4854 generic.go:334] "Generic (PLEG): container finished" podID="66c67541-4e54-48ab-96dd-7d904d0ff7d2" containerID="7a04548110984e439eb546bdcce8e1598dbda5abb0f426cacc29230e34833f37" exitCode=137 Oct 07 14:08:37 crc kubenswrapper[4854]: I1007 14:08:37.935271 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.9352510929999998 podStartE2EDuration="2.935251093s" podCreationTimestamp="2025-10-07 14:08:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:08:37.900798033 +0000 UTC m=+6233.888630278" watchObservedRunningTime="2025-10-07 14:08:37.935251093 +0000 UTC m=+6233.923083348" Oct 07 14:08:38 crc kubenswrapper[4854]: I1007 14:08:38.128073 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Oct 07 14:08:38 crc kubenswrapper[4854]: I1007 14:08:38.180412 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 07 14:08:38 crc kubenswrapper[4854]: I1007 14:08:38.327897 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/66c67541-4e54-48ab-96dd-7d904d0ff7d2-openstack-config-secret\") pod \"66c67541-4e54-48ab-96dd-7d904d0ff7d2\" (UID: \"66c67541-4e54-48ab-96dd-7d904d0ff7d2\") " Oct 07 14:08:38 crc kubenswrapper[4854]: I1007 14:08:38.327994 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/66c67541-4e54-48ab-96dd-7d904d0ff7d2-openstack-config\") pod \"66c67541-4e54-48ab-96dd-7d904d0ff7d2\" (UID: \"66c67541-4e54-48ab-96dd-7d904d0ff7d2\") " Oct 07 14:08:38 crc kubenswrapper[4854]: I1007 14:08:38.328197 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgfkf\" (UniqueName: \"kubernetes.io/projected/66c67541-4e54-48ab-96dd-7d904d0ff7d2-kube-api-access-kgfkf\") pod \"66c67541-4e54-48ab-96dd-7d904d0ff7d2\" (UID: \"66c67541-4e54-48ab-96dd-7d904d0ff7d2\") " Oct 07 14:08:38 crc kubenswrapper[4854]: I1007 14:08:38.349740 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66c67541-4e54-48ab-96dd-7d904d0ff7d2-kube-api-access-kgfkf" (OuterVolumeSpecName: "kube-api-access-kgfkf") pod "66c67541-4e54-48ab-96dd-7d904d0ff7d2" (UID: "66c67541-4e54-48ab-96dd-7d904d0ff7d2"). InnerVolumeSpecName "kube-api-access-kgfkf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:08:38 crc kubenswrapper[4854]: I1007 14:08:38.391389 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66c67541-4e54-48ab-96dd-7d904d0ff7d2-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "66c67541-4e54-48ab-96dd-7d904d0ff7d2" (UID: "66c67541-4e54-48ab-96dd-7d904d0ff7d2"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:08:38 crc kubenswrapper[4854]: I1007 14:08:38.431873 4854 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/66c67541-4e54-48ab-96dd-7d904d0ff7d2-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 07 14:08:38 crc kubenswrapper[4854]: I1007 14:08:38.431924 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kgfkf\" (UniqueName: \"kubernetes.io/projected/66c67541-4e54-48ab-96dd-7d904d0ff7d2-kube-api-access-kgfkf\") on node \"crc\" DevicePath \"\"" Oct 07 14:08:38 crc kubenswrapper[4854]: I1007 14:08:38.449855 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66c67541-4e54-48ab-96dd-7d904d0ff7d2-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "66c67541-4e54-48ab-96dd-7d904d0ff7d2" (UID: "66c67541-4e54-48ab-96dd-7d904d0ff7d2"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:08:38 crc kubenswrapper[4854]: I1007 14:08:38.457137 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 07 14:08:38 crc kubenswrapper[4854]: I1007 14:08:38.534273 4854 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/66c67541-4e54-48ab-96dd-7d904d0ff7d2-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 07 14:08:38 crc kubenswrapper[4854]: I1007 14:08:38.717348 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66c67541-4e54-48ab-96dd-7d904d0ff7d2" path="/var/lib/kubelet/pods/66c67541-4e54-48ab-96dd-7d904d0ff7d2/volumes" Oct 07 14:08:38 crc kubenswrapper[4854]: I1007 14:08:38.907266 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"cc17a7f7-bac6-4b57-bf18-1f3110b14f29","Type":"ContainerStarted","Data":"7def990270b06a8e2de3a520e227c1f11f564e9933374ce2eff70203f24a19ed"} Oct 07 14:08:38 crc kubenswrapper[4854]: I1007 14:08:38.909397 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"3edb391a-9ddf-4fed-bc20-51d79f783380","Type":"ContainerStarted","Data":"80eeb613489bbd170fe9bb72992e6285c16d1e882ab72946406a4b18a1c951ab"} Oct 07 14:08:38 crc kubenswrapper[4854]: I1007 14:08:38.912559 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c33e1e79-872b-4328-a544-f34779689934","Type":"ContainerStarted","Data":"ea7d3a223a7dd94e70993e0408da487a8cb494bd0b424dadf6c058d7512d8c6b"} Oct 07 14:08:38 crc kubenswrapper[4854]: I1007 14:08:38.913061 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 07 14:08:38 crc kubenswrapper[4854]: I1007 14:08:38.915314 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 07 14:08:38 crc kubenswrapper[4854]: I1007 14:08:38.915746 4854 scope.go:117] "RemoveContainer" containerID="7a04548110984e439eb546bdcce8e1598dbda5abb0f426cacc29230e34833f37" Oct 07 14:08:38 crc kubenswrapper[4854]: I1007 14:08:38.944862 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=3.490466245 podStartE2EDuration="3.944840591s" podCreationTimestamp="2025-10-07 14:08:35 +0000 UTC" firstStartedPulling="2025-10-07 14:08:37.511453516 +0000 UTC m=+6233.499285771" lastFinishedPulling="2025-10-07 14:08:37.965827862 +0000 UTC m=+6233.953660117" observedRunningTime="2025-10-07 14:08:38.935320878 +0000 UTC m=+6234.923153143" watchObservedRunningTime="2025-10-07 14:08:38.944840591 +0000 UTC m=+6234.932672846" Oct 07 14:08:43 crc kubenswrapper[4854]: I1007 14:08:43.734759 4854 scope.go:117] "RemoveContainer" containerID="4fe613980076ca3ed8559fe0e286e39c2d0d1c3badb8354aa3600a7ada37572b" Oct 07 14:08:43 crc kubenswrapper[4854]: E1007 14:08:43.735542 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:08:44 crc kubenswrapper[4854]: I1007 14:08:44.982608 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"3edb391a-9ddf-4fed-bc20-51d79f783380","Type":"ContainerStarted","Data":"cb6982def81fa516a6e88ea72eb3fdc1724e65a1d89fdea6d2d8a9382ba65f9e"} Oct 07 14:08:44 crc kubenswrapper[4854]: I1007 14:08:44.984194 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"cc17a7f7-bac6-4b57-bf18-1f3110b14f29","Type":"ContainerStarted","Data":"987842c8f264f93a9d99f5e86794cf87f75246e8efed00151649331b0eb3bafd"} Oct 07 14:08:46 crc kubenswrapper[4854]: I1007 14:08:46.301561 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 07 14:08:54 crc kubenswrapper[4854]: I1007 14:08:54.093452 4854 generic.go:334] "Generic (PLEG): container finished" podID="3edb391a-9ddf-4fed-bc20-51d79f783380" containerID="cb6982def81fa516a6e88ea72eb3fdc1724e65a1d89fdea6d2d8a9382ba65f9e" exitCode=0 Oct 07 14:08:54 crc kubenswrapper[4854]: I1007 14:08:54.093557 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"3edb391a-9ddf-4fed-bc20-51d79f783380","Type":"ContainerDied","Data":"cb6982def81fa516a6e88ea72eb3fdc1724e65a1d89fdea6d2d8a9382ba65f9e"} Oct 07 14:08:56 crc kubenswrapper[4854]: I1007 14:08:56.111613 4854 generic.go:334] "Generic (PLEG): container finished" podID="cc17a7f7-bac6-4b57-bf18-1f3110b14f29" containerID="987842c8f264f93a9d99f5e86794cf87f75246e8efed00151649331b0eb3bafd" exitCode=0 Oct 07 14:08:56 crc kubenswrapper[4854]: I1007 14:08:56.111796 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"cc17a7f7-bac6-4b57-bf18-1f3110b14f29","Type":"ContainerDied","Data":"987842c8f264f93a9d99f5e86794cf87f75246e8efed00151649331b0eb3bafd"} Oct 07 14:08:58 crc kubenswrapper[4854]: I1007 14:08:58.702997 4854 scope.go:117] 
"RemoveContainer" containerID="4fe613980076ca3ed8559fe0e286e39c2d0d1c3badb8354aa3600a7ada37572b" Oct 07 14:08:58 crc kubenswrapper[4854]: E1007 14:08:58.703571 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:09:01 crc kubenswrapper[4854]: I1007 14:09:01.191842 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"cc17a7f7-bac6-4b57-bf18-1f3110b14f29","Type":"ContainerStarted","Data":"f7edb3054e02f92a402551618baf570ae77f786b5c0ffc25839cf506482076f6"} Oct 07 14:09:01 crc kubenswrapper[4854]: I1007 14:09:01.195399 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"3edb391a-9ddf-4fed-bc20-51d79f783380","Type":"ContainerStarted","Data":"82a54499552354960262c89ae9f4f24490884524e5e581ce932b9ec0919647cd"} Oct 07 14:09:05 crc kubenswrapper[4854]: I1007 14:09:05.264827 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"3edb391a-9ddf-4fed-bc20-51d79f783380","Type":"ContainerStarted","Data":"0338f4a1da82d01d757ca298e228ee726621233561fe737559d16e9fd10e4d53"} Oct 07 14:09:05 crc kubenswrapper[4854]: I1007 14:09:05.267815 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"cc17a7f7-bac6-4b57-bf18-1f3110b14f29","Type":"ContainerStarted","Data":"63222890e2aacd77916a2f0b2a570277250a5f7de0d22e63a3ec1f36847a24f5"} Oct 07 14:09:05 crc kubenswrapper[4854]: I1007 14:09:05.268363 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0" Oct 07 14:09:05 crc kubenswrapper[4854]: I1007 14:09:05.273549 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0" Oct 07 14:09:05 crc kubenswrapper[4854]: I1007 14:09:05.299625 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=6.653388626 podStartE2EDuration="29.299605228s" podCreationTimestamp="2025-10-07 14:08:36 +0000 UTC" firstStartedPulling="2025-10-07 14:08:38.160093444 +0000 UTC m=+6234.147925699" lastFinishedPulling="2025-10-07 14:09:00.806310046 +0000 UTC m=+6256.794142301" observedRunningTime="2025-10-07 14:09:05.29410406 +0000 UTC m=+6261.281936315" watchObservedRunningTime="2025-10-07 14:09:05.299605228 +0000 UTC m=+6261.287437483" Oct 07 14:09:07 crc kubenswrapper[4854]: I1007 14:09:07.047250 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-6p2jw"] Oct 07 14:09:07 crc kubenswrapper[4854]: I1007 14:09:07.054759 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-6p2jw"] Oct 07 14:09:08 crc kubenswrapper[4854]: I1007 14:09:08.304730 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"3edb391a-9ddf-4fed-bc20-51d79f783380","Type":"ContainerStarted","Data":"faed4aca6f551f677b4955812301ab92d982951fec30acd8bec3514aab1c74a3"} Oct 07 14:09:08 crc kubenswrapper[4854]: I1007 14:09:08.352875 4854 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=2.878370031 podStartE2EDuration="32.352778222s" podCreationTimestamp="2025-10-07 14:08:36 +0000 UTC" firstStartedPulling="2025-10-07 14:08:38.471419029 +0000 UTC m=+6234.459251294" lastFinishedPulling="2025-10-07 14:09:07.94582723 +0000 UTC m=+6263.933659485" observedRunningTime="2025-10-07 14:09:08.340220361 +0000 UTC m=+6264.328052616" watchObservedRunningTime="2025-10-07 14:09:08.352778222 +0000 UTC m=+6264.340610467" Oct 07 14:09:08 crc kubenswrapper[4854]: I1007 14:09:08.726212 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c2b6c86-281c-4a3d-9998-4bb308bea16c" path="/var/lib/kubelet/pods/6c2b6c86-281c-4a3d-9998-4bb308bea16c/volumes" Oct 07 14:09:12 crc kubenswrapper[4854]: I1007 14:09:12.702681 4854 scope.go:117] "RemoveContainer" containerID="4fe613980076ca3ed8559fe0e286e39c2d0d1c3badb8354aa3600a7ada37572b" Oct 07 14:09:12 crc kubenswrapper[4854]: E1007 14:09:12.703415 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:09:12 crc kubenswrapper[4854]: I1007 14:09:12.854900 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Oct 07 14:09:13 crc kubenswrapper[4854]: I1007 14:09:13.552831 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 07 14:09:13 crc kubenswrapper[4854]: I1007 14:09:13.558913 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 14:09:13 crc kubenswrapper[4854]: I1007 14:09:13.566737 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 07 14:09:13 crc kubenswrapper[4854]: I1007 14:09:13.573349 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 07 14:09:13 crc kubenswrapper[4854]: I1007 14:09:13.586958 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 14:09:13 crc kubenswrapper[4854]: I1007 14:09:13.669432 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/618b91b9-b089-4347-978b-2d4de31c17e8-run-httpd\") pod \"ceilometer-0\" (UID: \"618b91b9-b089-4347-978b-2d4de31c17e8\") " pod="openstack/ceilometer-0" Oct 07 14:09:13 crc kubenswrapper[4854]: I1007 14:09:13.670759 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/618b91b9-b089-4347-978b-2d4de31c17e8-log-httpd\") pod \"ceilometer-0\" (UID: \"618b91b9-b089-4347-978b-2d4de31c17e8\") " pod="openstack/ceilometer-0" Oct 07 14:09:13 crc kubenswrapper[4854]: I1007 14:09:13.671080 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/618b91b9-b089-4347-978b-2d4de31c17e8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"618b91b9-b089-4347-978b-2d4de31c17e8\") " pod="openstack/ceilometer-0" Oct 07 14:09:13 crc kubenswrapper[4854]: I1007 14:09:13.671375 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/618b91b9-b089-4347-978b-2d4de31c17e8-scripts\") pod \"ceilometer-0\" (UID: \"618b91b9-b089-4347-978b-2d4de31c17e8\") " pod="openstack/ceilometer-0" Oct 07 14:09:13 crc kubenswrapper[4854]: I1007 14:09:13.671772 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/618b91b9-b089-4347-978b-2d4de31c17e8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"618b91b9-b089-4347-978b-2d4de31c17e8\") " pod="openstack/ceilometer-0" Oct 07 14:09:13 crc kubenswrapper[4854]: I1007 14:09:13.671958 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/618b91b9-b089-4347-978b-2d4de31c17e8-config-data\") pod \"ceilometer-0\" (UID: \"618b91b9-b089-4347-978b-2d4de31c17e8\") " pod="openstack/ceilometer-0" Oct 07 14:09:13 crc kubenswrapper[4854]: I1007 14:09:13.672326 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9nnm\" (UniqueName: \"kubernetes.io/projected/618b91b9-b089-4347-978b-2d4de31c17e8-kube-api-access-b9nnm\") pod \"ceilometer-0\" (UID: \"618b91b9-b089-4347-978b-2d4de31c17e8\") " pod="openstack/ceilometer-0" Oct 07 14:09:13 crc kubenswrapper[4854]: I1007 14:09:13.773988 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/618b91b9-b089-4347-978b-2d4de31c17e8-log-httpd\") pod \"ceilometer-0\" (UID: \"618b91b9-b089-4347-978b-2d4de31c17e8\") " pod="openstack/ceilometer-0" Oct 07 14:09:13 crc kubenswrapper[4854]: I1007 14:09:13.774037 4854 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/618b91b9-b089-4347-978b-2d4de31c17e8-run-httpd\") pod \"ceilometer-0\" (UID: \"618b91b9-b089-4347-978b-2d4de31c17e8\") " pod="openstack/ceilometer-0" Oct 07 14:09:13 crc kubenswrapper[4854]: I1007 14:09:13.774075 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/618b91b9-b089-4347-978b-2d4de31c17e8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"618b91b9-b089-4347-978b-2d4de31c17e8\") " pod="openstack/ceilometer-0" Oct 07 14:09:13 crc kubenswrapper[4854]: I1007 14:09:13.774158 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/618b91b9-b089-4347-978b-2d4de31c17e8-scripts\") pod \"ceilometer-0\" (UID: \"618b91b9-b089-4347-978b-2d4de31c17e8\") " pod="openstack/ceilometer-0" Oct 07 14:09:13 crc kubenswrapper[4854]: I1007 14:09:13.774226 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/618b91b9-b089-4347-978b-2d4de31c17e8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"618b91b9-b089-4347-978b-2d4de31c17e8\") " pod="openstack/ceilometer-0" Oct 07 14:09:13 crc kubenswrapper[4854]: I1007 14:09:13.774252 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/618b91b9-b089-4347-978b-2d4de31c17e8-config-data\") pod \"ceilometer-0\" (UID: \"618b91b9-b089-4347-978b-2d4de31c17e8\") " pod="openstack/ceilometer-0" Oct 07 14:09:13 crc kubenswrapper[4854]: I1007 14:09:13.774374 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9nnm\" (UniqueName: \"kubernetes.io/projected/618b91b9-b089-4347-978b-2d4de31c17e8-kube-api-access-b9nnm\") pod \"ceilometer-0\" (UID: \"618b91b9-b089-4347-978b-2d4de31c17e8\") " pod="openstack/ceilometer-0" Oct 07 14:09:13 crc kubenswrapper[4854]: I1007 14:09:13.775220 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/618b91b9-b089-4347-978b-2d4de31c17e8-log-httpd\") pod \"ceilometer-0\" (UID: \"618b91b9-b089-4347-978b-2d4de31c17e8\") " pod="openstack/ceilometer-0" Oct 07 14:09:13 crc kubenswrapper[4854]: I1007 14:09:13.775550 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/618b91b9-b089-4347-978b-2d4de31c17e8-run-httpd\") pod \"ceilometer-0\" (UID: \"618b91b9-b089-4347-978b-2d4de31c17e8\") " pod="openstack/ceilometer-0" Oct 07 14:09:13 crc kubenswrapper[4854]: I1007 14:09:13.780860 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/618b91b9-b089-4347-978b-2d4de31c17e8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"618b91b9-b089-4347-978b-2d4de31c17e8\") " pod="openstack/ceilometer-0" Oct 07 14:09:13 crc kubenswrapper[4854]: I1007 14:09:13.781489 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/618b91b9-b089-4347-978b-2d4de31c17e8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"618b91b9-b089-4347-978b-2d4de31c17e8\") " pod="openstack/ceilometer-0" Oct 07 14:09:13 crc kubenswrapper[4854]: I1007 14:09:13.782245 4854 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/618b91b9-b089-4347-978b-2d4de31c17e8-config-data\") pod \"ceilometer-0\" (UID: \"618b91b9-b089-4347-978b-2d4de31c17e8\") " pod="openstack/ceilometer-0" Oct 07 14:09:13 crc kubenswrapper[4854]: I1007 14:09:13.782809 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/618b91b9-b089-4347-978b-2d4de31c17e8-scripts\") pod \"ceilometer-0\" (UID: \"618b91b9-b089-4347-978b-2d4de31c17e8\") " pod="openstack/ceilometer-0" Oct 07 14:09:13 crc kubenswrapper[4854]: I1007 14:09:13.807057 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9nnm\" (UniqueName: \"kubernetes.io/projected/618b91b9-b089-4347-978b-2d4de31c17e8-kube-api-access-b9nnm\") pod \"ceilometer-0\" (UID: \"618b91b9-b089-4347-978b-2d4de31c17e8\") " pod="openstack/ceilometer-0" Oct 07 14:09:13 crc kubenswrapper[4854]: I1007 14:09:13.906788 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 14:09:14 crc kubenswrapper[4854]: I1007 14:09:14.413895 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 14:09:14 crc kubenswrapper[4854]: W1007 14:09:14.417426 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod618b91b9_b089_4347_978b_2d4de31c17e8.slice/crio-02959f06c6c102a1da010f4aa9026cdc3f4ed6deeab254c05d17dc905b20dbe3 WatchSource:0}: Error finding container 02959f06c6c102a1da010f4aa9026cdc3f4ed6deeab254c05d17dc905b20dbe3: Status 404 returned error can't find the container with id 02959f06c6c102a1da010f4aa9026cdc3f4ed6deeab254c05d17dc905b20dbe3 Oct 07 14:09:15 crc kubenswrapper[4854]: I1007 14:09:15.379933 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"618b91b9-b089-4347-978b-2d4de31c17e8","Type":"ContainerStarted","Data":"02959f06c6c102a1da010f4aa9026cdc3f4ed6deeab254c05d17dc905b20dbe3"} Oct 07 14:09:16 crc kubenswrapper[4854]: I1007 14:09:16.390607 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"618b91b9-b089-4347-978b-2d4de31c17e8","Type":"ContainerStarted","Data":"f0ae54b9263d72dbbc45096a6da10da0ed2a6289efdea821063b6c790e37f972"} Oct 07 14:09:16 crc kubenswrapper[4854]: I1007 14:09:16.390881 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"618b91b9-b089-4347-978b-2d4de31c17e8","Type":"ContainerStarted","Data":"0d6a11b2c54d5e0c3d4795eab2434541737ba64e65747fc72f2ac52dd2f9ad73"} Oct 07 14:09:17 crc kubenswrapper[4854]: I1007 14:09:17.408724 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"618b91b9-b089-4347-978b-2d4de31c17e8","Type":"ContainerStarted","Data":"3aefacd66ba95319410e40794141b5f0749a775f85038b2431056dc7ed8dcb68"} Oct 07 14:09:18 crc kubenswrapper[4854]: I1007 14:09:18.042539 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-6f06-account-create-52qjt"] Oct 07 14:09:18 crc kubenswrapper[4854]: I1007 14:09:18.059061 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-6f06-account-create-52qjt"] Oct 07 14:09:18 crc kubenswrapper[4854]: I1007 14:09:18.716331 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="965e4c63-cd95-4f53-93df-3ad027fd5758" 
path="/var/lib/kubelet/pods/965e4c63-cd95-4f53-93df-3ad027fd5758/volumes" Oct 07 14:09:19 crc kubenswrapper[4854]: I1007 14:09:19.442357 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"618b91b9-b089-4347-978b-2d4de31c17e8","Type":"ContainerStarted","Data":"17377fcd8911716a361c9ef687bc3ea7d691ebb06ac3f176f992a95146931227"} Oct 07 14:09:19 crc kubenswrapper[4854]: I1007 14:09:19.443249 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 07 14:09:19 crc kubenswrapper[4854]: I1007 14:09:19.477044 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.2439469 podStartE2EDuration="6.477023725s" podCreationTimestamp="2025-10-07 14:09:13 +0000 UTC" firstStartedPulling="2025-10-07 14:09:14.420525222 +0000 UTC m=+6270.408357477" lastFinishedPulling="2025-10-07 14:09:18.653602037 +0000 UTC m=+6274.641434302" observedRunningTime="2025-10-07 14:09:19.468599763 +0000 UTC m=+6275.456432018" watchObservedRunningTime="2025-10-07 14:09:19.477023725 +0000 UTC m=+6275.464855970" Oct 07 14:09:22 crc kubenswrapper[4854]: I1007 14:09:22.854691 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Oct 07 14:09:22 crc kubenswrapper[4854]: I1007 14:09:22.858571 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Oct 07 14:09:23 crc kubenswrapper[4854]: I1007 14:09:23.514580 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Oct 07 14:09:23 crc kubenswrapper[4854]: I1007 14:09:23.589208 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-lzlxs"] Oct 07 14:09:23 crc kubenswrapper[4854]: I1007 14:09:23.590631 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-lzlxs" Oct 07 14:09:23 crc kubenswrapper[4854]: I1007 14:09:23.627608 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-lzlxs"] Oct 07 14:09:23 crc kubenswrapper[4854]: I1007 14:09:23.697454 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mp7tj\" (UniqueName: \"kubernetes.io/projected/9083bc7a-21f0-4620-bb10-15aa3011034d-kube-api-access-mp7tj\") pod \"aodh-db-create-lzlxs\" (UID: \"9083bc7a-21f0-4620-bb10-15aa3011034d\") " pod="openstack/aodh-db-create-lzlxs" Oct 07 14:09:23 crc kubenswrapper[4854]: I1007 14:09:23.703015 4854 scope.go:117] "RemoveContainer" containerID="4fe613980076ca3ed8559fe0e286e39c2d0d1c3badb8354aa3600a7ada37572b" Oct 07 14:09:23 crc kubenswrapper[4854]: E1007 14:09:23.703419 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:09:23 crc kubenswrapper[4854]: I1007 14:09:23.799852 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mp7tj\" (UniqueName: \"kubernetes.io/projected/9083bc7a-21f0-4620-bb10-15aa3011034d-kube-api-access-mp7tj\") pod \"aodh-db-create-lzlxs\" (UID: \"9083bc7a-21f0-4620-bb10-15aa3011034d\") " pod="openstack/aodh-db-create-lzlxs" Oct 07 14:09:23 crc kubenswrapper[4854]: I1007 14:09:23.817391 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mp7tj\" (UniqueName: \"kubernetes.io/projected/9083bc7a-21f0-4620-bb10-15aa3011034d-kube-api-access-mp7tj\") pod \"aodh-db-create-lzlxs\" (UID: \"9083bc7a-21f0-4620-bb10-15aa3011034d\") " pod="openstack/aodh-db-create-lzlxs" Oct 07 14:09:23 crc kubenswrapper[4854]: I1007 14:09:23.924104 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-lzlxs" Oct 07 14:09:24 crc kubenswrapper[4854]: I1007 14:09:24.513935 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-lzlxs"] Oct 07 14:09:24 crc kubenswrapper[4854]: I1007 14:09:24.541384 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-lzlxs" event={"ID":"9083bc7a-21f0-4620-bb10-15aa3011034d","Type":"ContainerStarted","Data":"979d73fb9c3e51984d238a751e4a176b651e7565af3882e709f19f639d8e0fff"} Oct 07 14:09:25 crc kubenswrapper[4854]: I1007 14:09:25.027834 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-fccv9"] Oct 07 14:09:25 crc kubenswrapper[4854]: I1007 14:09:25.035851 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-fccv9"] Oct 07 14:09:25 crc kubenswrapper[4854]: I1007 14:09:25.432008 4854 scope.go:117] "RemoveContainer" containerID="388423d7188e6e69d16a1682193b76bf45f0a8ead74b37a6deb545b4d26b8afc" Oct 07 14:09:25 crc kubenswrapper[4854]: I1007 14:09:25.459503 4854 scope.go:117] "RemoveContainer" containerID="bfb141acdac6641fdd80c3f01c194ca941338295668bc6f98d8ea4a469471a17" Oct 07 14:09:25 crc kubenswrapper[4854]: I1007 14:09:25.507127 4854 scope.go:117] "RemoveContainer" containerID="09136106b68329237faf5b4cd31b40ce6c296e85cfae8247c08017c039efa6b0" Oct 07 14:09:25 crc kubenswrapper[4854]: I1007 14:09:25.555611 4854 generic.go:334] "Generic (PLEG): container finished" podID="9083bc7a-21f0-4620-bb10-15aa3011034d" containerID="76987baad39c168f1d37164d5c27fc07893b828c214f9819ddc6a4e29cf15ca2" exitCode=0 Oct 07 14:09:25 crc kubenswrapper[4854]: I1007 14:09:25.555733 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-lzlxs" event={"ID":"9083bc7a-21f0-4620-bb10-15aa3011034d","Type":"ContainerDied","Data":"76987baad39c168f1d37164d5c27fc07893b828c214f9819ddc6a4e29cf15ca2"} Oct 07 14:09:26 crc kubenswrapper[4854]: I1007 14:09:26.716575 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f36c7ef9-d715-4471-966d-e086733bc6a5" path="/var/lib/kubelet/pods/f36c7ef9-d715-4471-966d-e086733bc6a5/volumes" Oct 07 14:09:27 crc kubenswrapper[4854]: I1007 14:09:27.009833 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-lzlxs" Oct 07 14:09:27 crc kubenswrapper[4854]: I1007 14:09:27.077054 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mp7tj\" (UniqueName: \"kubernetes.io/projected/9083bc7a-21f0-4620-bb10-15aa3011034d-kube-api-access-mp7tj\") pod \"9083bc7a-21f0-4620-bb10-15aa3011034d\" (UID: \"9083bc7a-21f0-4620-bb10-15aa3011034d\") " Oct 07 14:09:27 crc kubenswrapper[4854]: I1007 14:09:27.082584 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9083bc7a-21f0-4620-bb10-15aa3011034d-kube-api-access-mp7tj" (OuterVolumeSpecName: "kube-api-access-mp7tj") pod "9083bc7a-21f0-4620-bb10-15aa3011034d" (UID: "9083bc7a-21f0-4620-bb10-15aa3011034d"). InnerVolumeSpecName "kube-api-access-mp7tj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:09:27 crc kubenswrapper[4854]: I1007 14:09:27.179230 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mp7tj\" (UniqueName: \"kubernetes.io/projected/9083bc7a-21f0-4620-bb10-15aa3011034d-kube-api-access-mp7tj\") on node \"crc\" DevicePath \"\"" Oct 07 14:09:27 crc kubenswrapper[4854]: I1007 14:09:27.583237 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-lzlxs" event={"ID":"9083bc7a-21f0-4620-bb10-15aa3011034d","Type":"ContainerDied","Data":"979d73fb9c3e51984d238a751e4a176b651e7565af3882e709f19f639d8e0fff"} Oct 07 14:09:27 crc kubenswrapper[4854]: I1007 14:09:27.583300 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="979d73fb9c3e51984d238a751e4a176b651e7565af3882e709f19f639d8e0fff" Oct 07 14:09:27 crc kubenswrapper[4854]: I1007 14:09:27.583310 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-lzlxs" Oct 07 14:09:33 crc kubenswrapper[4854]: I1007 14:09:33.643240 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-7966-account-create-9t5bt"] Oct 07 14:09:33 crc kubenswrapper[4854]: E1007 14:09:33.644045 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9083bc7a-21f0-4620-bb10-15aa3011034d" containerName="mariadb-database-create" Oct 07 14:09:33 crc kubenswrapper[4854]: I1007 14:09:33.644059 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="9083bc7a-21f0-4620-bb10-15aa3011034d" containerName="mariadb-database-create" Oct 07 14:09:33 crc kubenswrapper[4854]: I1007 14:09:33.644266 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="9083bc7a-21f0-4620-bb10-15aa3011034d" containerName="mariadb-database-create" Oct 07 14:09:33 crc kubenswrapper[4854]: I1007 14:09:33.644960 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-7966-account-create-9t5bt" Oct 07 14:09:33 crc kubenswrapper[4854]: I1007 14:09:33.649263 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Oct 07 14:09:33 crc kubenswrapper[4854]: I1007 14:09:33.675447 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-7966-account-create-9t5bt"] Oct 07 14:09:33 crc kubenswrapper[4854]: I1007 14:09:33.733403 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqtwf\" (UniqueName: \"kubernetes.io/projected/ada900e0-3cee-4921-bf22-24d4b90a3297-kube-api-access-tqtwf\") pod \"aodh-7966-account-create-9t5bt\" (UID: \"ada900e0-3cee-4921-bf22-24d4b90a3297\") " pod="openstack/aodh-7966-account-create-9t5bt" Oct 07 14:09:33 crc kubenswrapper[4854]: I1007 14:09:33.834952 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqtwf\" (UniqueName: \"kubernetes.io/projected/ada900e0-3cee-4921-bf22-24d4b90a3297-kube-api-access-tqtwf\") pod \"aodh-7966-account-create-9t5bt\" (UID: \"ada900e0-3cee-4921-bf22-24d4b90a3297\") " pod="openstack/aodh-7966-account-create-9t5bt" Oct 07 14:09:33 crc kubenswrapper[4854]: I1007 14:09:33.865482 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqtwf\" (UniqueName: \"kubernetes.io/projected/ada900e0-3cee-4921-bf22-24d4b90a3297-kube-api-access-tqtwf\") pod \"aodh-7966-account-create-9t5bt\" (UID: \"ada900e0-3cee-4921-bf22-24d4b90a3297\") " pod="openstack/aodh-7966-account-create-9t5bt" Oct 07 14:09:33 crc kubenswrapper[4854]: I1007 14:09:33.970311 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-7966-account-create-9t5bt" Oct 07 14:09:34 crc kubenswrapper[4854]: I1007 14:09:34.611598 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-7966-account-create-9t5bt"] Oct 07 14:09:34 crc kubenswrapper[4854]: I1007 14:09:34.670522 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-7966-account-create-9t5bt" event={"ID":"ada900e0-3cee-4921-bf22-24d4b90a3297","Type":"ContainerStarted","Data":"2d1652e65ce780508e8a6ff66b4c5fa6ae85b38fc3315c63c0ee10eaeb1a02a6"} Oct 07 14:09:35 crc kubenswrapper[4854]: I1007 14:09:35.686185 4854 generic.go:334] "Generic (PLEG): container finished" podID="ada900e0-3cee-4921-bf22-24d4b90a3297" containerID="fa81e2550057dc26f74a34254ef8654d301b65a5499a926440b3801b9d78016e" exitCode=0 Oct 07 14:09:35 crc kubenswrapper[4854]: I1007 14:09:35.686283 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-7966-account-create-9t5bt" event={"ID":"ada900e0-3cee-4921-bf22-24d4b90a3297","Type":"ContainerDied","Data":"fa81e2550057dc26f74a34254ef8654d301b65a5499a926440b3801b9d78016e"} Oct 07 14:09:36 crc kubenswrapper[4854]: I1007 14:09:36.703707 4854 scope.go:117] "RemoveContainer" containerID="4fe613980076ca3ed8559fe0e286e39c2d0d1c3badb8354aa3600a7ada37572b" Oct 07 14:09:36 crc kubenswrapper[4854]: E1007 14:09:36.704759 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 
14:09:37 crc kubenswrapper[4854]: I1007 14:09:37.128438 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-7966-account-create-9t5bt" Oct 07 14:09:37 crc kubenswrapper[4854]: I1007 14:09:37.210583 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqtwf\" (UniqueName: \"kubernetes.io/projected/ada900e0-3cee-4921-bf22-24d4b90a3297-kube-api-access-tqtwf\") pod \"ada900e0-3cee-4921-bf22-24d4b90a3297\" (UID: \"ada900e0-3cee-4921-bf22-24d4b90a3297\") " Oct 07 14:09:37 crc kubenswrapper[4854]: I1007 14:09:37.216278 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ada900e0-3cee-4921-bf22-24d4b90a3297-kube-api-access-tqtwf" (OuterVolumeSpecName: "kube-api-access-tqtwf") pod "ada900e0-3cee-4921-bf22-24d4b90a3297" (UID: "ada900e0-3cee-4921-bf22-24d4b90a3297"). InnerVolumeSpecName "kube-api-access-tqtwf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:09:37 crc kubenswrapper[4854]: I1007 14:09:37.313804 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqtwf\" (UniqueName: \"kubernetes.io/projected/ada900e0-3cee-4921-bf22-24d4b90a3297-kube-api-access-tqtwf\") on node \"crc\" DevicePath \"\"" Oct 07 14:09:37 crc kubenswrapper[4854]: I1007 14:09:37.711194 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-7966-account-create-9t5bt" event={"ID":"ada900e0-3cee-4921-bf22-24d4b90a3297","Type":"ContainerDied","Data":"2d1652e65ce780508e8a6ff66b4c5fa6ae85b38fc3315c63c0ee10eaeb1a02a6"} Oct 07 14:09:37 crc kubenswrapper[4854]: I1007 14:09:37.711240 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d1652e65ce780508e8a6ff66b4c5fa6ae85b38fc3315c63c0ee10eaeb1a02a6" Oct 07 14:09:37 crc kubenswrapper[4854]: I1007 14:09:37.711254 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-7966-account-create-9t5bt" Oct 07 14:09:39 crc kubenswrapper[4854]: I1007 14:09:39.075991 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-4gtk9"] Oct 07 14:09:39 crc kubenswrapper[4854]: E1007 14:09:39.077711 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ada900e0-3cee-4921-bf22-24d4b90a3297" containerName="mariadb-account-create" Oct 07 14:09:39 crc kubenswrapper[4854]: I1007 14:09:39.077837 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="ada900e0-3cee-4921-bf22-24d4b90a3297" containerName="mariadb-account-create" Oct 07 14:09:39 crc kubenswrapper[4854]: I1007 14:09:39.078224 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="ada900e0-3cee-4921-bf22-24d4b90a3297" containerName="mariadb-account-create" Oct 07 14:09:39 crc kubenswrapper[4854]: I1007 14:09:39.079357 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-4gtk9" Oct 07 14:09:39 crc kubenswrapper[4854]: I1007 14:09:39.081938 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Oct 07 14:09:39 crc kubenswrapper[4854]: I1007 14:09:39.082075 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Oct 07 14:09:39 crc kubenswrapper[4854]: I1007 14:09:39.082135 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-4sgkd" Oct 07 14:09:39 crc kubenswrapper[4854]: I1007 14:09:39.097044 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-4gtk9"] Oct 07 14:09:39 crc kubenswrapper[4854]: I1007 14:09:39.155639 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2290c3b-653d-4e9f-96be-1ce645310b22-config-data\") pod \"aodh-db-sync-4gtk9\" (UID: \"e2290c3b-653d-4e9f-96be-1ce645310b22\") " pod="openstack/aodh-db-sync-4gtk9" Oct 07 14:09:39 crc kubenswrapper[4854]: I1007 14:09:39.155754 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2290c3b-653d-4e9f-96be-1ce645310b22-scripts\") pod \"aodh-db-sync-4gtk9\" (UID: \"e2290c3b-653d-4e9f-96be-1ce645310b22\") " pod="openstack/aodh-db-sync-4gtk9" Oct 07 14:09:39 crc kubenswrapper[4854]: I1007 14:09:39.155796 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcvg4\" (UniqueName: \"kubernetes.io/projected/e2290c3b-653d-4e9f-96be-1ce645310b22-kube-api-access-vcvg4\") pod \"aodh-db-sync-4gtk9\" (UID: \"e2290c3b-653d-4e9f-96be-1ce645310b22\") " pod="openstack/aodh-db-sync-4gtk9" Oct 07 14:09:39 crc kubenswrapper[4854]: I1007 14:09:39.155861 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2290c3b-653d-4e9f-96be-1ce645310b22-combined-ca-bundle\") pod \"aodh-db-sync-4gtk9\" (UID: \"e2290c3b-653d-4e9f-96be-1ce645310b22\") " pod="openstack/aodh-db-sync-4gtk9" Oct 07 14:09:39 crc kubenswrapper[4854]: I1007 14:09:39.257350 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcvg4\" (UniqueName: \"kubernetes.io/projected/e2290c3b-653d-4e9f-96be-1ce645310b22-kube-api-access-vcvg4\") pod \"aodh-db-sync-4gtk9\" (UID: \"e2290c3b-653d-4e9f-96be-1ce645310b22\") " pod="openstack/aodh-db-sync-4gtk9" Oct 07 14:09:39 crc kubenswrapper[4854]: I1007 14:09:39.257458 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2290c3b-653d-4e9f-96be-1ce645310b22-combined-ca-bundle\") pod \"aodh-db-sync-4gtk9\" (UID: \"e2290c3b-653d-4e9f-96be-1ce645310b22\") " pod="openstack/aodh-db-sync-4gtk9" Oct 07 14:09:39 crc kubenswrapper[4854]: I1007 14:09:39.257535 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2290c3b-653d-4e9f-96be-1ce645310b22-config-data\") pod \"aodh-db-sync-4gtk9\" (UID: \"e2290c3b-653d-4e9f-96be-1ce645310b22\") " pod="openstack/aodh-db-sync-4gtk9" Oct 07 14:09:39 crc kubenswrapper[4854]: I1007 14:09:39.257601 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/e2290c3b-653d-4e9f-96be-1ce645310b22-scripts\") pod \"aodh-db-sync-4gtk9\" (UID: \"e2290c3b-653d-4e9f-96be-1ce645310b22\") " pod="openstack/aodh-db-sync-4gtk9" Oct 07 14:09:39 crc kubenswrapper[4854]: I1007 14:09:39.268033 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2290c3b-653d-4e9f-96be-1ce645310b22-scripts\") pod \"aodh-db-sync-4gtk9\" (UID: \"e2290c3b-653d-4e9f-96be-1ce645310b22\") " pod="openstack/aodh-db-sync-4gtk9" Oct 07 14:09:39 crc kubenswrapper[4854]: I1007 14:09:39.268347 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2290c3b-653d-4e9f-96be-1ce645310b22-combined-ca-bundle\") pod \"aodh-db-sync-4gtk9\" (UID: \"e2290c3b-653d-4e9f-96be-1ce645310b22\") " pod="openstack/aodh-db-sync-4gtk9" Oct 07 14:09:39 crc kubenswrapper[4854]: I1007 14:09:39.269751 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2290c3b-653d-4e9f-96be-1ce645310b22-config-data\") pod \"aodh-db-sync-4gtk9\" (UID: \"e2290c3b-653d-4e9f-96be-1ce645310b22\") " pod="openstack/aodh-db-sync-4gtk9" Oct 07 14:09:39 crc kubenswrapper[4854]: I1007 14:09:39.280799 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcvg4\" (UniqueName: \"kubernetes.io/projected/e2290c3b-653d-4e9f-96be-1ce645310b22-kube-api-access-vcvg4\") pod \"aodh-db-sync-4gtk9\" (UID: \"e2290c3b-653d-4e9f-96be-1ce645310b22\") " pod="openstack/aodh-db-sync-4gtk9" Oct 07 14:09:39 crc kubenswrapper[4854]: I1007 14:09:39.401949 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-4gtk9" Oct 07 14:09:39 crc kubenswrapper[4854]: I1007 14:09:39.922870 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-4gtk9"] Oct 07 14:09:40 crc kubenswrapper[4854]: I1007 14:09:40.747093 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-4gtk9" event={"ID":"e2290c3b-653d-4e9f-96be-1ce645310b22","Type":"ContainerStarted","Data":"af9c9377434c29aa6d23b486eda3fbb80eb28baee3df1cef37672c894e2e9165"} Oct 07 14:09:43 crc kubenswrapper[4854]: I1007 14:09:43.936504 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 07 14:09:44 crc kubenswrapper[4854]: I1007 14:09:44.786515 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-4gtk9" event={"ID":"e2290c3b-653d-4e9f-96be-1ce645310b22","Type":"ContainerStarted","Data":"8f3452eac734a7fb33369eb6903d47315593c7f583357b468335361af056926b"} Oct 07 14:09:44 crc kubenswrapper[4854]: I1007 14:09:44.813305 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-4gtk9" podStartSLOduration=1.540944937 podStartE2EDuration="5.813274889s" podCreationTimestamp="2025-10-07 14:09:39 +0000 UTC" firstStartedPulling="2025-10-07 14:09:39.925746221 +0000 UTC m=+6295.913578476" lastFinishedPulling="2025-10-07 14:09:44.198076173 +0000 UTC m=+6300.185908428" observedRunningTime="2025-10-07 14:09:44.804227269 +0000 UTC m=+6300.792059524" watchObservedRunningTime="2025-10-07 14:09:44.813274889 +0000 UTC m=+6300.801107144" Oct 07 14:09:46 crc kubenswrapper[4854]: I1007 14:09:46.806990 4854 generic.go:334] "Generic (PLEG): container finished" podID="e2290c3b-653d-4e9f-96be-1ce645310b22" 
containerID="8f3452eac734a7fb33369eb6903d47315593c7f583357b468335361af056926b" exitCode=0 Oct 07 14:09:46 crc kubenswrapper[4854]: I1007 14:09:46.807077 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-4gtk9" event={"ID":"e2290c3b-653d-4e9f-96be-1ce645310b22","Type":"ContainerDied","Data":"8f3452eac734a7fb33369eb6903d47315593c7f583357b468335361af056926b"} Oct 07 14:09:48 crc kubenswrapper[4854]: I1007 14:09:48.309375 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-4gtk9" Oct 07 14:09:48 crc kubenswrapper[4854]: I1007 14:09:48.463303 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2290c3b-653d-4e9f-96be-1ce645310b22-combined-ca-bundle\") pod \"e2290c3b-653d-4e9f-96be-1ce645310b22\" (UID: \"e2290c3b-653d-4e9f-96be-1ce645310b22\") " Oct 07 14:09:48 crc kubenswrapper[4854]: I1007 14:09:48.463832 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcvg4\" (UniqueName: \"kubernetes.io/projected/e2290c3b-653d-4e9f-96be-1ce645310b22-kube-api-access-vcvg4\") pod \"e2290c3b-653d-4e9f-96be-1ce645310b22\" (UID: \"e2290c3b-653d-4e9f-96be-1ce645310b22\") " Oct 07 14:09:48 crc kubenswrapper[4854]: I1007 14:09:48.463904 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2290c3b-653d-4e9f-96be-1ce645310b22-scripts\") pod \"e2290c3b-653d-4e9f-96be-1ce645310b22\" (UID: \"e2290c3b-653d-4e9f-96be-1ce645310b22\") " Oct 07 14:09:48 crc kubenswrapper[4854]: I1007 14:09:48.464117 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2290c3b-653d-4e9f-96be-1ce645310b22-config-data\") pod \"e2290c3b-653d-4e9f-96be-1ce645310b22\" (UID: \"e2290c3b-653d-4e9f-96be-1ce645310b22\") " Oct 07 14:09:48 crc kubenswrapper[4854]: I1007 14:09:48.469699 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2290c3b-653d-4e9f-96be-1ce645310b22-kube-api-access-vcvg4" (OuterVolumeSpecName: "kube-api-access-vcvg4") pod "e2290c3b-653d-4e9f-96be-1ce645310b22" (UID: "e2290c3b-653d-4e9f-96be-1ce645310b22"). InnerVolumeSpecName "kube-api-access-vcvg4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:09:48 crc kubenswrapper[4854]: I1007 14:09:48.470392 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2290c3b-653d-4e9f-96be-1ce645310b22-scripts" (OuterVolumeSpecName: "scripts") pod "e2290c3b-653d-4e9f-96be-1ce645310b22" (UID: "e2290c3b-653d-4e9f-96be-1ce645310b22"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:09:48 crc kubenswrapper[4854]: I1007 14:09:48.498141 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2290c3b-653d-4e9f-96be-1ce645310b22-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e2290c3b-653d-4e9f-96be-1ce645310b22" (UID: "e2290c3b-653d-4e9f-96be-1ce645310b22"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:09:48 crc kubenswrapper[4854]: I1007 14:09:48.499067 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2290c3b-653d-4e9f-96be-1ce645310b22-config-data" (OuterVolumeSpecName: "config-data") pod "e2290c3b-653d-4e9f-96be-1ce645310b22" (UID: "e2290c3b-653d-4e9f-96be-1ce645310b22"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:09:48 crc kubenswrapper[4854]: I1007 14:09:48.567023 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2290c3b-653d-4e9f-96be-1ce645310b22-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:09:48 crc kubenswrapper[4854]: I1007 14:09:48.567076 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcvg4\" (UniqueName: \"kubernetes.io/projected/e2290c3b-653d-4e9f-96be-1ce645310b22-kube-api-access-vcvg4\") on node \"crc\" DevicePath \"\"" Oct 07 14:09:48 crc kubenswrapper[4854]: I1007 14:09:48.567120 4854 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2290c3b-653d-4e9f-96be-1ce645310b22-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 14:09:48 crc kubenswrapper[4854]: I1007 14:09:48.567130 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2290c3b-653d-4e9f-96be-1ce645310b22-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:09:48 crc kubenswrapper[4854]: I1007 14:09:48.829769 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-4gtk9" event={"ID":"e2290c3b-653d-4e9f-96be-1ce645310b22","Type":"ContainerDied","Data":"af9c9377434c29aa6d23b486eda3fbb80eb28baee3df1cef37672c894e2e9165"} Oct 07 14:09:48 crc kubenswrapper[4854]: I1007 14:09:48.829813 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af9c9377434c29aa6d23b486eda3fbb80eb28baee3df1cef37672c894e2e9165" Oct 07 14:09:48 crc kubenswrapper[4854]: I1007 14:09:48.829905 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-4gtk9" Oct 07 14:09:49 crc kubenswrapper[4854]: I1007 14:09:49.188012 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Oct 07 14:09:49 crc kubenswrapper[4854]: E1007 14:09:49.188620 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2290c3b-653d-4e9f-96be-1ce645310b22" containerName="aodh-db-sync" Oct 07 14:09:49 crc kubenswrapper[4854]: I1007 14:09:49.188644 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2290c3b-653d-4e9f-96be-1ce645310b22" containerName="aodh-db-sync" Oct 07 14:09:49 crc kubenswrapper[4854]: I1007 14:09:49.188914 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2290c3b-653d-4e9f-96be-1ce645310b22" containerName="aodh-db-sync" Oct 07 14:09:49 crc kubenswrapper[4854]: I1007 14:09:49.191924 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Oct 07 14:09:49 crc kubenswrapper[4854]: I1007 14:09:49.196491 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-4sgkd" Oct 07 14:09:49 crc kubenswrapper[4854]: I1007 14:09:49.196540 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Oct 07 14:09:49 crc kubenswrapper[4854]: I1007 14:09:49.196650 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Oct 07 14:09:49 crc kubenswrapper[4854]: I1007 14:09:49.208816 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Oct 07 14:09:49 crc kubenswrapper[4854]: I1007 14:09:49.281395 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c233cebb-5c68-4c2f-b875-5c26e2af4d6b-scripts\") pod \"aodh-0\" (UID: \"c233cebb-5c68-4c2f-b875-5c26e2af4d6b\") " pod="openstack/aodh-0" Oct 07 14:09:49 crc kubenswrapper[4854]: I1007 14:09:49.281481 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccsmk\" (UniqueName: \"kubernetes.io/projected/c233cebb-5c68-4c2f-b875-5c26e2af4d6b-kube-api-access-ccsmk\") pod \"aodh-0\" (UID: \"c233cebb-5c68-4c2f-b875-5c26e2af4d6b\") " pod="openstack/aodh-0" Oct 07 14:09:49 crc kubenswrapper[4854]: I1007 14:09:49.282068 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c233cebb-5c68-4c2f-b875-5c26e2af4d6b-combined-ca-bundle\") pod \"aodh-0\" (UID: \"c233cebb-5c68-4c2f-b875-5c26e2af4d6b\") " pod="openstack/aodh-0" Oct 07 14:09:49 crc kubenswrapper[4854]: I1007 14:09:49.282123 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c233cebb-5c68-4c2f-b875-5c26e2af4d6b-config-data\") pod \"aodh-0\" (UID: \"c233cebb-5c68-4c2f-b875-5c26e2af4d6b\") " pod="openstack/aodh-0" Oct 07 14:09:49 crc kubenswrapper[4854]: I1007 14:09:49.384099 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c233cebb-5c68-4c2f-b875-5c26e2af4d6b-combined-ca-bundle\") pod \"aodh-0\" (UID: \"c233cebb-5c68-4c2f-b875-5c26e2af4d6b\") " pod="openstack/aodh-0" Oct 07 14:09:49 crc kubenswrapper[4854]: I1007 14:09:49.384138 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c233cebb-5c68-4c2f-b875-5c26e2af4d6b-config-data\") pod \"aodh-0\" (UID: \"c233cebb-5c68-4c2f-b875-5c26e2af4d6b\") " pod="openstack/aodh-0" Oct 07 14:09:49 crc kubenswrapper[4854]: I1007 14:09:49.384203 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c233cebb-5c68-4c2f-b875-5c26e2af4d6b-scripts\") pod \"aodh-0\" (UID: \"c233cebb-5c68-4c2f-b875-5c26e2af4d6b\") " pod="openstack/aodh-0" Oct 07 14:09:49 crc kubenswrapper[4854]: I1007 14:09:49.384236 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccsmk\" (UniqueName: \"kubernetes.io/projected/c233cebb-5c68-4c2f-b875-5c26e2af4d6b-kube-api-access-ccsmk\") pod \"aodh-0\" (UID: \"c233cebb-5c68-4c2f-b875-5c26e2af4d6b\") " pod="openstack/aodh-0" Oct 07 14:09:49 crc kubenswrapper[4854]: 
I1007 14:09:49.388724 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c233cebb-5c68-4c2f-b875-5c26e2af4d6b-config-data\") pod \"aodh-0\" (UID: \"c233cebb-5c68-4c2f-b875-5c26e2af4d6b\") " pod="openstack/aodh-0" Oct 07 14:09:49 crc kubenswrapper[4854]: I1007 14:09:49.393364 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c233cebb-5c68-4c2f-b875-5c26e2af4d6b-combined-ca-bundle\") pod \"aodh-0\" (UID: \"c233cebb-5c68-4c2f-b875-5c26e2af4d6b\") " pod="openstack/aodh-0" Oct 07 14:09:49 crc kubenswrapper[4854]: I1007 14:09:49.395864 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c233cebb-5c68-4c2f-b875-5c26e2af4d6b-scripts\") pod \"aodh-0\" (UID: \"c233cebb-5c68-4c2f-b875-5c26e2af4d6b\") " pod="openstack/aodh-0" Oct 07 14:09:49 crc kubenswrapper[4854]: I1007 14:09:49.415776 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccsmk\" (UniqueName: \"kubernetes.io/projected/c233cebb-5c68-4c2f-b875-5c26e2af4d6b-kube-api-access-ccsmk\") pod \"aodh-0\" (UID: \"c233cebb-5c68-4c2f-b875-5c26e2af4d6b\") " pod="openstack/aodh-0" Oct 07 14:09:49 crc kubenswrapper[4854]: I1007 14:09:49.524058 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Oct 07 14:09:50 crc kubenswrapper[4854]: I1007 14:09:50.076946 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Oct 07 14:09:50 crc kubenswrapper[4854]: I1007 14:09:50.719956 4854 scope.go:117] "RemoveContainer" containerID="4fe613980076ca3ed8559fe0e286e39c2d0d1c3badb8354aa3600a7ada37572b" Oct 07 14:09:50 crc kubenswrapper[4854]: I1007 14:09:50.854289 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"c233cebb-5c68-4c2f-b875-5c26e2af4d6b","Type":"ContainerStarted","Data":"9eccf5b09108599a2901b101b0021825cfd356e8a12103ab2d2bd495b0a5ce89"} Oct 07 14:09:51 crc kubenswrapper[4854]: I1007 14:09:51.839377 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 14:09:51 crc kubenswrapper[4854]: I1007 14:09:51.841116 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="618b91b9-b089-4347-978b-2d4de31c17e8" containerName="ceilometer-central-agent" containerID="cri-o://0d6a11b2c54d5e0c3d4795eab2434541737ba64e65747fc72f2ac52dd2f9ad73" gracePeriod=30 Oct 07 14:09:51 crc kubenswrapper[4854]: I1007 14:09:51.841794 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="618b91b9-b089-4347-978b-2d4de31c17e8" containerName="proxy-httpd" containerID="cri-o://17377fcd8911716a361c9ef687bc3ea7d691ebb06ac3f176f992a95146931227" gracePeriod=30 Oct 07 14:09:51 crc kubenswrapper[4854]: I1007 14:09:51.841923 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="618b91b9-b089-4347-978b-2d4de31c17e8" containerName="ceilometer-notification-agent" containerID="cri-o://f0ae54b9263d72dbbc45096a6da10da0ed2a6289efdea821063b6c790e37f972" gracePeriod=30 Oct 07 14:09:51 crc kubenswrapper[4854]: I1007 14:09:51.841987 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="618b91b9-b089-4347-978b-2d4de31c17e8" containerName="sg-core" 
containerID="cri-o://3aefacd66ba95319410e40794141b5f0749a775f85038b2431056dc7ed8dcb68" gracePeriod=30 Oct 07 14:09:51 crc kubenswrapper[4854]: I1007 14:09:51.866781 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"c233cebb-5c68-4c2f-b875-5c26e2af4d6b","Type":"ContainerStarted","Data":"f983b605f9136615db7773b5163d4652fb17895b0ba0dd3ac1f906970ab831cd"} Oct 07 14:09:51 crc kubenswrapper[4854]: I1007 14:09:51.877163 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" event={"ID":"40b8b82d-cfd5-41d7-8673-5774db092c85","Type":"ContainerStarted","Data":"23da9ad8d99b4f3fecff0a63dbca297f16a76579dd1005e295268c0507480214"} Oct 07 14:09:52 crc kubenswrapper[4854]: I1007 14:09:52.887449 4854 generic.go:334] "Generic (PLEG): container finished" podID="618b91b9-b089-4347-978b-2d4de31c17e8" containerID="17377fcd8911716a361c9ef687bc3ea7d691ebb06ac3f176f992a95146931227" exitCode=0 Oct 07 14:09:52 crc kubenswrapper[4854]: I1007 14:09:52.888093 4854 generic.go:334] "Generic (PLEG): container finished" podID="618b91b9-b089-4347-978b-2d4de31c17e8" containerID="3aefacd66ba95319410e40794141b5f0749a775f85038b2431056dc7ed8dcb68" exitCode=2 Oct 07 14:09:52 crc kubenswrapper[4854]: I1007 14:09:52.887534 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"618b91b9-b089-4347-978b-2d4de31c17e8","Type":"ContainerDied","Data":"17377fcd8911716a361c9ef687bc3ea7d691ebb06ac3f176f992a95146931227"} Oct 07 14:09:52 crc kubenswrapper[4854]: I1007 14:09:52.888111 4854 generic.go:334] "Generic (PLEG): container finished" podID="618b91b9-b089-4347-978b-2d4de31c17e8" containerID="0d6a11b2c54d5e0c3d4795eab2434541737ba64e65747fc72f2ac52dd2f9ad73" exitCode=0 Oct 07 14:09:52 crc kubenswrapper[4854]: I1007 14:09:52.888138 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"618b91b9-b089-4347-978b-2d4de31c17e8","Type":"ContainerDied","Data":"3aefacd66ba95319410e40794141b5f0749a775f85038b2431056dc7ed8dcb68"} Oct 07 14:09:52 crc kubenswrapper[4854]: I1007 14:09:52.888168 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"618b91b9-b089-4347-978b-2d4de31c17e8","Type":"ContainerDied","Data":"0d6a11b2c54d5e0c3d4795eab2434541737ba64e65747fc72f2ac52dd2f9ad73"} Oct 07 14:09:52 crc kubenswrapper[4854]: I1007 14:09:52.890227 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"c233cebb-5c68-4c2f-b875-5c26e2af4d6b","Type":"ContainerStarted","Data":"3499cf0f7afed4e0833980e20d96b7eb0dd8b101da1e19b2c891f75474ac5aff"} Oct 07 14:09:53 crc kubenswrapper[4854]: I1007 14:09:53.926942 4854 generic.go:334] "Generic (PLEG): container finished" podID="618b91b9-b089-4347-978b-2d4de31c17e8" containerID="f0ae54b9263d72dbbc45096a6da10da0ed2a6289efdea821063b6c790e37f972" exitCode=0 Oct 07 14:09:53 crc kubenswrapper[4854]: I1007 14:09:53.927327 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"618b91b9-b089-4347-978b-2d4de31c17e8","Type":"ContainerDied","Data":"f0ae54b9263d72dbbc45096a6da10da0ed2a6289efdea821063b6c790e37f972"} Oct 07 14:09:54 crc kubenswrapper[4854]: I1007 14:09:54.126740 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 14:09:54 crc kubenswrapper[4854]: I1007 14:09:54.312589 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/618b91b9-b089-4347-978b-2d4de31c17e8-config-data\") pod \"618b91b9-b089-4347-978b-2d4de31c17e8\" (UID: \"618b91b9-b089-4347-978b-2d4de31c17e8\") " Oct 07 14:09:54 crc kubenswrapper[4854]: I1007 14:09:54.312772 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/618b91b9-b089-4347-978b-2d4de31c17e8-scripts\") pod \"618b91b9-b089-4347-978b-2d4de31c17e8\" (UID: \"618b91b9-b089-4347-978b-2d4de31c17e8\") " Oct 07 14:09:54 crc kubenswrapper[4854]: I1007 14:09:54.312815 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/618b91b9-b089-4347-978b-2d4de31c17e8-combined-ca-bundle\") pod \"618b91b9-b089-4347-978b-2d4de31c17e8\" (UID: \"618b91b9-b089-4347-978b-2d4de31c17e8\") " Oct 07 14:09:54 crc kubenswrapper[4854]: I1007 14:09:54.312902 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/618b91b9-b089-4347-978b-2d4de31c17e8-log-httpd\") pod \"618b91b9-b089-4347-978b-2d4de31c17e8\" (UID: \"618b91b9-b089-4347-978b-2d4de31c17e8\") " Oct 07 14:09:54 crc kubenswrapper[4854]: I1007 14:09:54.312980 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/618b91b9-b089-4347-978b-2d4de31c17e8-run-httpd\") pod \"618b91b9-b089-4347-978b-2d4de31c17e8\" (UID: \"618b91b9-b089-4347-978b-2d4de31c17e8\") " Oct 07 14:09:54 crc kubenswrapper[4854]: I1007 14:09:54.313129 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/618b91b9-b089-4347-978b-2d4de31c17e8-sg-core-conf-yaml\") pod \"618b91b9-b089-4347-978b-2d4de31c17e8\" (UID: \"618b91b9-b089-4347-978b-2d4de31c17e8\") " Oct 07 14:09:54 crc kubenswrapper[4854]: I1007 14:09:54.313194 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9nnm\" (UniqueName: \"kubernetes.io/projected/618b91b9-b089-4347-978b-2d4de31c17e8-kube-api-access-b9nnm\") pod \"618b91b9-b089-4347-978b-2d4de31c17e8\" (UID: \"618b91b9-b089-4347-978b-2d4de31c17e8\") " Oct 07 14:09:54 crc kubenswrapper[4854]: I1007 14:09:54.315326 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/618b91b9-b089-4347-978b-2d4de31c17e8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "618b91b9-b089-4347-978b-2d4de31c17e8" (UID: "618b91b9-b089-4347-978b-2d4de31c17e8"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:09:54 crc kubenswrapper[4854]: I1007 14:09:54.315536 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/618b91b9-b089-4347-978b-2d4de31c17e8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "618b91b9-b089-4347-978b-2d4de31c17e8" (UID: "618b91b9-b089-4347-978b-2d4de31c17e8"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:09:54 crc kubenswrapper[4854]: I1007 14:09:54.319031 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/618b91b9-b089-4347-978b-2d4de31c17e8-kube-api-access-b9nnm" (OuterVolumeSpecName: "kube-api-access-b9nnm") pod "618b91b9-b089-4347-978b-2d4de31c17e8" (UID: "618b91b9-b089-4347-978b-2d4de31c17e8"). InnerVolumeSpecName "kube-api-access-b9nnm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:09:54 crc kubenswrapper[4854]: I1007 14:09:54.335575 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/618b91b9-b089-4347-978b-2d4de31c17e8-scripts" (OuterVolumeSpecName: "scripts") pod "618b91b9-b089-4347-978b-2d4de31c17e8" (UID: "618b91b9-b089-4347-978b-2d4de31c17e8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:09:54 crc kubenswrapper[4854]: I1007 14:09:54.361298 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/618b91b9-b089-4347-978b-2d4de31c17e8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "618b91b9-b089-4347-978b-2d4de31c17e8" (UID: "618b91b9-b089-4347-978b-2d4de31c17e8"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:09:54 crc kubenswrapper[4854]: I1007 14:09:54.404611 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/618b91b9-b089-4347-978b-2d4de31c17e8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "618b91b9-b089-4347-978b-2d4de31c17e8" (UID: "618b91b9-b089-4347-978b-2d4de31c17e8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:09:54 crc kubenswrapper[4854]: I1007 14:09:54.416081 4854 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/618b91b9-b089-4347-978b-2d4de31c17e8-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 14:09:54 crc kubenswrapper[4854]: I1007 14:09:54.416124 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/618b91b9-b089-4347-978b-2d4de31c17e8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:09:54 crc kubenswrapper[4854]: I1007 14:09:54.416139 4854 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/618b91b9-b089-4347-978b-2d4de31c17e8-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 14:09:54 crc kubenswrapper[4854]: I1007 14:09:54.416688 4854 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/618b91b9-b089-4347-978b-2d4de31c17e8-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 14:09:54 crc kubenswrapper[4854]: I1007 14:09:54.416711 4854 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/618b91b9-b089-4347-978b-2d4de31c17e8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 07 14:09:54 crc kubenswrapper[4854]: I1007 14:09:54.416724 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9nnm\" (UniqueName: \"kubernetes.io/projected/618b91b9-b089-4347-978b-2d4de31c17e8-kube-api-access-b9nnm\") on node \"crc\" DevicePath \"\"" Oct 07 14:09:54 crc kubenswrapper[4854]: I1007 14:09:54.427229 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/618b91b9-b089-4347-978b-2d4de31c17e8-config-data" (OuterVolumeSpecName: "config-data") pod "618b91b9-b089-4347-978b-2d4de31c17e8" (UID: "618b91b9-b089-4347-978b-2d4de31c17e8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:09:54 crc kubenswrapper[4854]: I1007 14:09:54.518632 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/618b91b9-b089-4347-978b-2d4de31c17e8-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:09:54 crc kubenswrapper[4854]: I1007 14:09:54.942056 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"c233cebb-5c68-4c2f-b875-5c26e2af4d6b","Type":"ContainerStarted","Data":"fd059eb8e47c66fc6d82b092d09e9455b150d8417b9ce893112e53af37a139b6"} Oct 07 14:09:54 crc kubenswrapper[4854]: I1007 14:09:54.946780 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"618b91b9-b089-4347-978b-2d4de31c17e8","Type":"ContainerDied","Data":"02959f06c6c102a1da010f4aa9026cdc3f4ed6deeab254c05d17dc905b20dbe3"} Oct 07 14:09:54 crc kubenswrapper[4854]: I1007 14:09:54.946902 4854 scope.go:117] "RemoveContainer" containerID="17377fcd8911716a361c9ef687bc3ea7d691ebb06ac3f176f992a95146931227" Oct 07 14:09:54 crc kubenswrapper[4854]: I1007 14:09:54.946931 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 14:09:54 crc kubenswrapper[4854]: I1007 14:09:54.978350 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 14:09:54 crc kubenswrapper[4854]: I1007 14:09:54.994991 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 07 14:09:55 crc kubenswrapper[4854]: I1007 14:09:54.999709 4854 scope.go:117] "RemoveContainer" containerID="3aefacd66ba95319410e40794141b5f0749a775f85038b2431056dc7ed8dcb68" Oct 07 14:09:55 crc kubenswrapper[4854]: I1007 14:09:55.019675 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 07 14:09:55 crc kubenswrapper[4854]: E1007 14:09:55.020123 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="618b91b9-b089-4347-978b-2d4de31c17e8" containerName="sg-core" Oct 07 14:09:55 crc kubenswrapper[4854]: I1007 14:09:55.020134 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="618b91b9-b089-4347-978b-2d4de31c17e8" containerName="sg-core" Oct 07 14:09:55 crc kubenswrapper[4854]: E1007 14:09:55.020163 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="618b91b9-b089-4347-978b-2d4de31c17e8" containerName="ceilometer-notification-agent" Oct 07 14:09:55 crc kubenswrapper[4854]: I1007 14:09:55.020170 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="618b91b9-b089-4347-978b-2d4de31c17e8" containerName="ceilometer-notification-agent" Oct 07 14:09:55 crc kubenswrapper[4854]: E1007 14:09:55.020179 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="618b91b9-b089-4347-978b-2d4de31c17e8" containerName="ceilometer-central-agent" Oct 07 14:09:55 crc kubenswrapper[4854]: I1007 14:09:55.020186 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="618b91b9-b089-4347-978b-2d4de31c17e8" containerName="ceilometer-central-agent" Oct 07 14:09:55 crc kubenswrapper[4854]: E1007 14:09:55.020196 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="618b91b9-b089-4347-978b-2d4de31c17e8" containerName="proxy-httpd" Oct 07 14:09:55 
crc kubenswrapper[4854]: I1007 14:09:55.020201 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="618b91b9-b089-4347-978b-2d4de31c17e8" containerName="proxy-httpd" Oct 07 14:09:55 crc kubenswrapper[4854]: I1007 14:09:55.020392 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="618b91b9-b089-4347-978b-2d4de31c17e8" containerName="ceilometer-notification-agent" Oct 07 14:09:55 crc kubenswrapper[4854]: I1007 14:09:55.020401 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="618b91b9-b089-4347-978b-2d4de31c17e8" containerName="sg-core" Oct 07 14:09:55 crc kubenswrapper[4854]: I1007 14:09:55.020413 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="618b91b9-b089-4347-978b-2d4de31c17e8" containerName="ceilometer-central-agent" Oct 07 14:09:55 crc kubenswrapper[4854]: I1007 14:09:55.020421 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="618b91b9-b089-4347-978b-2d4de31c17e8" containerName="proxy-httpd" Oct 07 14:09:55 crc kubenswrapper[4854]: I1007 14:09:55.022320 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 14:09:55 crc kubenswrapper[4854]: I1007 14:09:55.024790 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 07 14:09:55 crc kubenswrapper[4854]: I1007 14:09:55.024930 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 07 14:09:55 crc kubenswrapper[4854]: I1007 14:09:55.047852 4854 scope.go:117] "RemoveContainer" containerID="f0ae54b9263d72dbbc45096a6da10da0ed2a6289efdea821063b6c790e37f972" Oct 07 14:09:55 crc kubenswrapper[4854]: I1007 14:09:55.055664 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 14:09:55 crc kubenswrapper[4854]: I1007 14:09:55.086292 4854 scope.go:117] "RemoveContainer" containerID="0d6a11b2c54d5e0c3d4795eab2434541737ba64e65747fc72f2ac52dd2f9ad73" Oct 07 14:09:55 crc kubenswrapper[4854]: I1007 14:09:55.133673 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76fcc00d-3457-4ea6-a46e-e6c4ac440cdc-run-httpd\") pod \"ceilometer-0\" (UID: \"76fcc00d-3457-4ea6-a46e-e6c4ac440cdc\") " pod="openstack/ceilometer-0" Oct 07 14:09:55 crc kubenswrapper[4854]: I1007 14:09:55.134395 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76fcc00d-3457-4ea6-a46e-e6c4ac440cdc-config-data\") pod \"ceilometer-0\" (UID: \"76fcc00d-3457-4ea6-a46e-e6c4ac440cdc\") " pod="openstack/ceilometer-0" Oct 07 14:09:55 crc kubenswrapper[4854]: I1007 14:09:55.134535 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76fcc00d-3457-4ea6-a46e-e6c4ac440cdc-scripts\") pod \"ceilometer-0\" (UID: \"76fcc00d-3457-4ea6-a46e-e6c4ac440cdc\") " pod="openstack/ceilometer-0" Oct 07 14:09:55 crc kubenswrapper[4854]: I1007 14:09:55.134668 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/76fcc00d-3457-4ea6-a46e-e6c4ac440cdc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"76fcc00d-3457-4ea6-a46e-e6c4ac440cdc\") " pod="openstack/ceilometer-0" Oct 07 14:09:55 crc kubenswrapper[4854]: I1007 14:09:55.134849 4854 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76fcc00d-3457-4ea6-a46e-e6c4ac440cdc-log-httpd\") pod \"ceilometer-0\" (UID: \"76fcc00d-3457-4ea6-a46e-e6c4ac440cdc\") " pod="openstack/ceilometer-0" Oct 07 14:09:55 crc kubenswrapper[4854]: I1007 14:09:55.134871 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8kj9\" (UniqueName: \"kubernetes.io/projected/76fcc00d-3457-4ea6-a46e-e6c4ac440cdc-kube-api-access-q8kj9\") pod \"ceilometer-0\" (UID: \"76fcc00d-3457-4ea6-a46e-e6c4ac440cdc\") " pod="openstack/ceilometer-0" Oct 07 14:09:55 crc kubenswrapper[4854]: I1007 14:09:55.134929 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76fcc00d-3457-4ea6-a46e-e6c4ac440cdc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"76fcc00d-3457-4ea6-a46e-e6c4ac440cdc\") " pod="openstack/ceilometer-0" Oct 07 14:09:55 crc kubenswrapper[4854]: I1007 14:09:55.237249 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76fcc00d-3457-4ea6-a46e-e6c4ac440cdc-run-httpd\") pod \"ceilometer-0\" (UID: \"76fcc00d-3457-4ea6-a46e-e6c4ac440cdc\") " pod="openstack/ceilometer-0" Oct 07 14:09:55 crc kubenswrapper[4854]: I1007 14:09:55.237407 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76fcc00d-3457-4ea6-a46e-e6c4ac440cdc-config-data\") pod \"ceilometer-0\" (UID: \"76fcc00d-3457-4ea6-a46e-e6c4ac440cdc\") " pod="openstack/ceilometer-0" Oct 07 14:09:55 crc kubenswrapper[4854]: I1007 14:09:55.237479 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76fcc00d-3457-4ea6-a46e-e6c4ac440cdc-scripts\") pod \"ceilometer-0\" (UID: \"76fcc00d-3457-4ea6-a46e-e6c4ac440cdc\") " pod="openstack/ceilometer-0" Oct 07 14:09:55 crc kubenswrapper[4854]: I1007 14:09:55.237576 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/76fcc00d-3457-4ea6-a46e-e6c4ac440cdc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"76fcc00d-3457-4ea6-a46e-e6c4ac440cdc\") " pod="openstack/ceilometer-0" Oct 07 14:09:55 crc kubenswrapper[4854]: I1007 14:09:55.237650 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76fcc00d-3457-4ea6-a46e-e6c4ac440cdc-log-httpd\") pod \"ceilometer-0\" (UID: \"76fcc00d-3457-4ea6-a46e-e6c4ac440cdc\") " pod="openstack/ceilometer-0" Oct 07 14:09:55 crc kubenswrapper[4854]: I1007 14:09:55.237689 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8kj9\" (UniqueName: \"kubernetes.io/projected/76fcc00d-3457-4ea6-a46e-e6c4ac440cdc-kube-api-access-q8kj9\") pod \"ceilometer-0\" (UID: \"76fcc00d-3457-4ea6-a46e-e6c4ac440cdc\") " pod="openstack/ceilometer-0" Oct 07 14:09:55 crc kubenswrapper[4854]: I1007 14:09:55.237747 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76fcc00d-3457-4ea6-a46e-e6c4ac440cdc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"76fcc00d-3457-4ea6-a46e-e6c4ac440cdc\") " pod="openstack/ceilometer-0" Oct 07 
14:09:55 crc kubenswrapper[4854]: I1007 14:09:55.237959 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76fcc00d-3457-4ea6-a46e-e6c4ac440cdc-run-httpd\") pod \"ceilometer-0\" (UID: \"76fcc00d-3457-4ea6-a46e-e6c4ac440cdc\") " pod="openstack/ceilometer-0" Oct 07 14:09:55 crc kubenswrapper[4854]: I1007 14:09:55.238805 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76fcc00d-3457-4ea6-a46e-e6c4ac440cdc-log-httpd\") pod \"ceilometer-0\" (UID: \"76fcc00d-3457-4ea6-a46e-e6c4ac440cdc\") " pod="openstack/ceilometer-0" Oct 07 14:09:55 crc kubenswrapper[4854]: I1007 14:09:55.243077 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/76fcc00d-3457-4ea6-a46e-e6c4ac440cdc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"76fcc00d-3457-4ea6-a46e-e6c4ac440cdc\") " pod="openstack/ceilometer-0" Oct 07 14:09:55 crc kubenswrapper[4854]: I1007 14:09:55.243563 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76fcc00d-3457-4ea6-a46e-e6c4ac440cdc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"76fcc00d-3457-4ea6-a46e-e6c4ac440cdc\") " pod="openstack/ceilometer-0" Oct 07 14:09:55 crc kubenswrapper[4854]: I1007 14:09:55.244991 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76fcc00d-3457-4ea6-a46e-e6c4ac440cdc-scripts\") pod \"ceilometer-0\" (UID: \"76fcc00d-3457-4ea6-a46e-e6c4ac440cdc\") " pod="openstack/ceilometer-0" Oct 07 14:09:55 crc kubenswrapper[4854]: I1007 14:09:55.261438 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76fcc00d-3457-4ea6-a46e-e6c4ac440cdc-config-data\") pod \"ceilometer-0\" (UID: \"76fcc00d-3457-4ea6-a46e-e6c4ac440cdc\") " pod="openstack/ceilometer-0" Oct 07 14:09:55 crc kubenswrapper[4854]: I1007 14:09:55.262468 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8kj9\" (UniqueName: \"kubernetes.io/projected/76fcc00d-3457-4ea6-a46e-e6c4ac440cdc-kube-api-access-q8kj9\") pod \"ceilometer-0\" (UID: \"76fcc00d-3457-4ea6-a46e-e6c4ac440cdc\") " pod="openstack/ceilometer-0" Oct 07 14:09:55 crc kubenswrapper[4854]: I1007 14:09:55.385075 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 14:09:56 crc kubenswrapper[4854]: I1007 14:09:56.220174 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 14:09:56 crc kubenswrapper[4854]: I1007 14:09:56.716089 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="618b91b9-b089-4347-978b-2d4de31c17e8" path="/var/lib/kubelet/pods/618b91b9-b089-4347-978b-2d4de31c17e8/volumes" Oct 07 14:09:56 crc kubenswrapper[4854]: I1007 14:09:56.967271 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"76fcc00d-3457-4ea6-a46e-e6c4ac440cdc","Type":"ContainerStarted","Data":"d726e73d585556292fd686beac96ce0e5eb0ee0fcd0f3a23496f5342ab34c650"} Oct 07 14:09:56 crc kubenswrapper[4854]: I1007 14:09:56.969579 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"c233cebb-5c68-4c2f-b875-5c26e2af4d6b","Type":"ContainerStarted","Data":"184e91bd9e16763b1c144eae41100934624b13d6ed900a83fa7bfd74160b9a1f"} Oct 07 14:09:57 crc kubenswrapper[4854]: I1007 14:09:57.004225 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.348978594 podStartE2EDuration="8.00420029s" podCreationTimestamp="2025-10-07 14:09:49 +0000 UTC" firstStartedPulling="2025-10-07 14:09:50.080393445 +0000 UTC m=+6306.068225720" lastFinishedPulling="2025-10-07 14:09:55.735615151 +0000 UTC m=+6311.723447416" observedRunningTime="2025-10-07 14:09:56.996339865 +0000 UTC m=+6312.984172120" watchObservedRunningTime="2025-10-07 14:09:57.00420029 +0000 UTC m=+6312.992032545" Oct 07 14:09:57 crc kubenswrapper[4854]: I1007 14:09:57.978496 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"76fcc00d-3457-4ea6-a46e-e6c4ac440cdc","Type":"ContainerStarted","Data":"725189faa5beb7fdcd8cb218a915d881898c7570c2ab5a178a822da3b329b1a0"} Oct 07 14:09:57 crc kubenswrapper[4854]: I1007 14:09:57.978957 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"76fcc00d-3457-4ea6-a46e-e6c4ac440cdc","Type":"ContainerStarted","Data":"5091dfd5d4a9c80a5ae7f2a457bd8231e44ce5de14e8faa458a3a329df7e8595"} Oct 07 14:09:58 crc kubenswrapper[4854]: I1007 14:09:58.989189 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"76fcc00d-3457-4ea6-a46e-e6c4ac440cdc","Type":"ContainerStarted","Data":"bdedc38f22da7a012d9ed870ea15d9df91dc860e3025d3dc6887d05b54faf7fa"} Oct 07 14:10:01 crc kubenswrapper[4854]: I1007 14:10:01.020061 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"76fcc00d-3457-4ea6-a46e-e6c4ac440cdc","Type":"ContainerStarted","Data":"167833fabf21dfb298f2fb42a6a4811cd54f54e7fcfcbca96b731d1565db3dd6"} Oct 07 14:10:01 crc kubenswrapper[4854]: I1007 14:10:01.020389 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 07 14:10:01 crc kubenswrapper[4854]: I1007 14:10:01.057213 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.382704746 podStartE2EDuration="7.05713805s" podCreationTimestamp="2025-10-07 14:09:54 +0000 UTC" firstStartedPulling="2025-10-07 14:09:56.22178219 +0000 UTC m=+6312.209614445" lastFinishedPulling="2025-10-07 14:09:59.896215494 +0000 UTC m=+6315.884047749" observedRunningTime="2025-10-07 14:10:01.040678247 +0000 UTC m=+6317.028510542" 
watchObservedRunningTime="2025-10-07 14:10:01.05713805 +0000 UTC m=+6317.044970335" Oct 07 14:10:04 crc kubenswrapper[4854]: I1007 14:10:04.276816 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-tfg25"] Oct 07 14:10:04 crc kubenswrapper[4854]: I1007 14:10:04.279187 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-tfg25" Oct 07 14:10:04 crc kubenswrapper[4854]: I1007 14:10:04.292164 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-tfg25"] Oct 07 14:10:04 crc kubenswrapper[4854]: I1007 14:10:04.451435 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlrhh\" (UniqueName: \"kubernetes.io/projected/0046cb69-239e-4960-a070-dd6d9c3b6b72-kube-api-access-mlrhh\") pod \"manila-db-create-tfg25\" (UID: \"0046cb69-239e-4960-a070-dd6d9c3b6b72\") " pod="openstack/manila-db-create-tfg25" Oct 07 14:10:04 crc kubenswrapper[4854]: I1007 14:10:04.553821 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlrhh\" (UniqueName: \"kubernetes.io/projected/0046cb69-239e-4960-a070-dd6d9c3b6b72-kube-api-access-mlrhh\") pod \"manila-db-create-tfg25\" (UID: \"0046cb69-239e-4960-a070-dd6d9c3b6b72\") " pod="openstack/manila-db-create-tfg25" Oct 07 14:10:04 crc kubenswrapper[4854]: I1007 14:10:04.578042 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlrhh\" (UniqueName: \"kubernetes.io/projected/0046cb69-239e-4960-a070-dd6d9c3b6b72-kube-api-access-mlrhh\") pod \"manila-db-create-tfg25\" (UID: \"0046cb69-239e-4960-a070-dd6d9c3b6b72\") " pod="openstack/manila-db-create-tfg25" Oct 07 14:10:04 crc kubenswrapper[4854]: I1007 14:10:04.613325 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-tfg25" Oct 07 14:10:05 crc kubenswrapper[4854]: I1007 14:10:05.150950 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-tfg25"] Oct 07 14:10:06 crc kubenswrapper[4854]: I1007 14:10:06.077423 4854 generic.go:334] "Generic (PLEG): container finished" podID="0046cb69-239e-4960-a070-dd6d9c3b6b72" containerID="a9ee70c7ef6997db8a30cd9a91cae1bcc82ec5149b32e4c38d268365d65485ac" exitCode=0 Oct 07 14:10:06 crc kubenswrapper[4854]: I1007 14:10:06.077561 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-tfg25" event={"ID":"0046cb69-239e-4960-a070-dd6d9c3b6b72","Type":"ContainerDied","Data":"a9ee70c7ef6997db8a30cd9a91cae1bcc82ec5149b32e4c38d268365d65485ac"} Oct 07 14:10:06 crc kubenswrapper[4854]: I1007 14:10:06.077744 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-tfg25" event={"ID":"0046cb69-239e-4960-a070-dd6d9c3b6b72","Type":"ContainerStarted","Data":"41f45ca2a81084a91dd289a9446180e31de73de5b691c8d69ae5e19a5d0f58cd"} Oct 07 14:10:07 crc kubenswrapper[4854]: I1007 14:10:07.591166 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-tfg25" Oct 07 14:10:07 crc kubenswrapper[4854]: I1007 14:10:07.734796 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlrhh\" (UniqueName: \"kubernetes.io/projected/0046cb69-239e-4960-a070-dd6d9c3b6b72-kube-api-access-mlrhh\") pod \"0046cb69-239e-4960-a070-dd6d9c3b6b72\" (UID: \"0046cb69-239e-4960-a070-dd6d9c3b6b72\") " Oct 07 14:10:07 crc kubenswrapper[4854]: I1007 14:10:07.745847 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0046cb69-239e-4960-a070-dd6d9c3b6b72-kube-api-access-mlrhh" (OuterVolumeSpecName: "kube-api-access-mlrhh") pod "0046cb69-239e-4960-a070-dd6d9c3b6b72" (UID: "0046cb69-239e-4960-a070-dd6d9c3b6b72"). InnerVolumeSpecName "kube-api-access-mlrhh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:10:07 crc kubenswrapper[4854]: I1007 14:10:07.837890 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlrhh\" (UniqueName: \"kubernetes.io/projected/0046cb69-239e-4960-a070-dd6d9c3b6b72-kube-api-access-mlrhh\") on node \"crc\" DevicePath \"\"" Oct 07 14:10:08 crc kubenswrapper[4854]: I1007 14:10:08.100527 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-tfg25" event={"ID":"0046cb69-239e-4960-a070-dd6d9c3b6b72","Type":"ContainerDied","Data":"41f45ca2a81084a91dd289a9446180e31de73de5b691c8d69ae5e19a5d0f58cd"} Oct 07 14:10:08 crc kubenswrapper[4854]: I1007 14:10:08.100568 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41f45ca2a81084a91dd289a9446180e31de73de5b691c8d69ae5e19a5d0f58cd" Oct 07 14:10:08 crc kubenswrapper[4854]: I1007 14:10:08.100589 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-tfg25" Oct 07 14:10:14 crc kubenswrapper[4854]: I1007 14:10:14.357857 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-38f4-account-create-qldhn"] Oct 07 14:10:14 crc kubenswrapper[4854]: E1007 14:10:14.358694 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0046cb69-239e-4960-a070-dd6d9c3b6b72" containerName="mariadb-database-create" Oct 07 14:10:14 crc kubenswrapper[4854]: I1007 14:10:14.358706 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="0046cb69-239e-4960-a070-dd6d9c3b6b72" containerName="mariadb-database-create" Oct 07 14:10:14 crc kubenswrapper[4854]: I1007 14:10:14.358908 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="0046cb69-239e-4960-a070-dd6d9c3b6b72" containerName="mariadb-database-create" Oct 07 14:10:14 crc kubenswrapper[4854]: I1007 14:10:14.359604 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-38f4-account-create-qldhn" Oct 07 14:10:14 crc kubenswrapper[4854]: I1007 14:10:14.362030 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Oct 07 14:10:14 crc kubenswrapper[4854]: I1007 14:10:14.421233 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-38f4-account-create-qldhn"] Oct 07 14:10:14 crc kubenswrapper[4854]: I1007 14:10:14.490338 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnzvn\" (UniqueName: \"kubernetes.io/projected/1e6d4cd5-ca3a-4d65-9bd5-5d063cda169c-kube-api-access-pnzvn\") pod \"manila-38f4-account-create-qldhn\" (UID: \"1e6d4cd5-ca3a-4d65-9bd5-5d063cda169c\") " pod="openstack/manila-38f4-account-create-qldhn" Oct 07 14:10:14 crc kubenswrapper[4854]: I1007 14:10:14.592045 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnzvn\" (UniqueName: \"kubernetes.io/projected/1e6d4cd5-ca3a-4d65-9bd5-5d063cda169c-kube-api-access-pnzvn\") pod \"manila-38f4-account-create-qldhn\" (UID: \"1e6d4cd5-ca3a-4d65-9bd5-5d063cda169c\") " pod="openstack/manila-38f4-account-create-qldhn" Oct 07 14:10:14 crc kubenswrapper[4854]: I1007 14:10:14.614495 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnzvn\" (UniqueName: \"kubernetes.io/projected/1e6d4cd5-ca3a-4d65-9bd5-5d063cda169c-kube-api-access-pnzvn\") pod \"manila-38f4-account-create-qldhn\" (UID: \"1e6d4cd5-ca3a-4d65-9bd5-5d063cda169c\") " pod="openstack/manila-38f4-account-create-qldhn" Oct 07 14:10:14 crc kubenswrapper[4854]: I1007 14:10:14.683855 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-38f4-account-create-qldhn" Oct 07 14:10:15 crc kubenswrapper[4854]: I1007 14:10:15.144628 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-38f4-account-create-qldhn"] Oct 07 14:10:15 crc kubenswrapper[4854]: I1007 14:10:15.210573 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-38f4-account-create-qldhn" event={"ID":"1e6d4cd5-ca3a-4d65-9bd5-5d063cda169c","Type":"ContainerStarted","Data":"101755f426dc74c4b2bec47dee1c94c4be10854a0e88626ec539a48512062121"} Oct 07 14:10:16 crc kubenswrapper[4854]: I1007 14:10:16.224521 4854 generic.go:334] "Generic (PLEG): container finished" podID="1e6d4cd5-ca3a-4d65-9bd5-5d063cda169c" containerID="8846691a293b3044399849c8b9837157b6c67e808ae125e0383424a7431ab52f" exitCode=0 Oct 07 14:10:16 crc kubenswrapper[4854]: I1007 14:10:16.224783 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-38f4-account-create-qldhn" event={"ID":"1e6d4cd5-ca3a-4d65-9bd5-5d063cda169c","Type":"ContainerDied","Data":"8846691a293b3044399849c8b9837157b6c67e808ae125e0383424a7431ab52f"} Oct 07 14:10:17 crc kubenswrapper[4854]: I1007 14:10:17.784559 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-38f4-account-create-qldhn" Oct 07 14:10:17 crc kubenswrapper[4854]: I1007 14:10:17.876319 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnzvn\" (UniqueName: \"kubernetes.io/projected/1e6d4cd5-ca3a-4d65-9bd5-5d063cda169c-kube-api-access-pnzvn\") pod \"1e6d4cd5-ca3a-4d65-9bd5-5d063cda169c\" (UID: \"1e6d4cd5-ca3a-4d65-9bd5-5d063cda169c\") " Oct 07 14:10:17 crc kubenswrapper[4854]: I1007 14:10:17.886665 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e6d4cd5-ca3a-4d65-9bd5-5d063cda169c-kube-api-access-pnzvn" (OuterVolumeSpecName: "kube-api-access-pnzvn") pod "1e6d4cd5-ca3a-4d65-9bd5-5d063cda169c" (UID: "1e6d4cd5-ca3a-4d65-9bd5-5d063cda169c"). InnerVolumeSpecName "kube-api-access-pnzvn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:10:17 crc kubenswrapper[4854]: I1007 14:10:17.978864 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pnzvn\" (UniqueName: \"kubernetes.io/projected/1e6d4cd5-ca3a-4d65-9bd5-5d063cda169c-kube-api-access-pnzvn\") on node \"crc\" DevicePath \"\"" Oct 07 14:10:18 crc kubenswrapper[4854]: I1007 14:10:18.253096 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-38f4-account-create-qldhn" event={"ID":"1e6d4cd5-ca3a-4d65-9bd5-5d063cda169c","Type":"ContainerDied","Data":"101755f426dc74c4b2bec47dee1c94c4be10854a0e88626ec539a48512062121"} Oct 07 14:10:18 crc kubenswrapper[4854]: I1007 14:10:18.253198 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="101755f426dc74c4b2bec47dee1c94c4be10854a0e88626ec539a48512062121" Oct 07 14:10:18 crc kubenswrapper[4854]: I1007 14:10:18.253233 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-38f4-account-create-qldhn" Oct 07 14:10:19 crc kubenswrapper[4854]: I1007 14:10:19.802744 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-lgwsm"] Oct 07 14:10:19 crc kubenswrapper[4854]: E1007 14:10:19.804175 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e6d4cd5-ca3a-4d65-9bd5-5d063cda169c" containerName="mariadb-account-create" Oct 07 14:10:19 crc kubenswrapper[4854]: I1007 14:10:19.804200 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e6d4cd5-ca3a-4d65-9bd5-5d063cda169c" containerName="mariadb-account-create" Oct 07 14:10:19 crc kubenswrapper[4854]: I1007 14:10:19.804594 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e6d4cd5-ca3a-4d65-9bd5-5d063cda169c" containerName="mariadb-account-create" Oct 07 14:10:19 crc kubenswrapper[4854]: I1007 14:10:19.805867 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-lgwsm" Oct 07 14:10:19 crc kubenswrapper[4854]: I1007 14:10:19.808765 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-h57km" Oct 07 14:10:19 crc kubenswrapper[4854]: I1007 14:10:19.809881 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Oct 07 14:10:19 crc kubenswrapper[4854]: I1007 14:10:19.816904 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-lgwsm"] Oct 07 14:10:19 crc kubenswrapper[4854]: I1007 14:10:19.927816 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/959c7e83-9813-44c5-9952-cfccd2c8eaf4-job-config-data\") pod \"manila-db-sync-lgwsm\" (UID: \"959c7e83-9813-44c5-9952-cfccd2c8eaf4\") " pod="openstack/manila-db-sync-lgwsm" Oct 07 14:10:19 crc kubenswrapper[4854]: I1007 14:10:19.927944 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqrdq\" (UniqueName: \"kubernetes.io/projected/959c7e83-9813-44c5-9952-cfccd2c8eaf4-kube-api-access-lqrdq\") pod \"manila-db-sync-lgwsm\" (UID: \"959c7e83-9813-44c5-9952-cfccd2c8eaf4\") " pod="openstack/manila-db-sync-lgwsm" Oct 07 14:10:19 crc kubenswrapper[4854]: I1007 14:10:19.928509 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/959c7e83-9813-44c5-9952-cfccd2c8eaf4-config-data\") pod \"manila-db-sync-lgwsm\" (UID: \"959c7e83-9813-44c5-9952-cfccd2c8eaf4\") " pod="openstack/manila-db-sync-lgwsm" Oct 07 14:10:19 crc kubenswrapper[4854]: I1007 14:10:19.928601 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/959c7e83-9813-44c5-9952-cfccd2c8eaf4-combined-ca-bundle\") pod \"manila-db-sync-lgwsm\" (UID: \"959c7e83-9813-44c5-9952-cfccd2c8eaf4\") " pod="openstack/manila-db-sync-lgwsm" Oct 07 14:10:20 crc kubenswrapper[4854]: I1007 14:10:20.031796 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/959c7e83-9813-44c5-9952-cfccd2c8eaf4-config-data\") pod \"manila-db-sync-lgwsm\" (UID: \"959c7e83-9813-44c5-9952-cfccd2c8eaf4\") " pod="openstack/manila-db-sync-lgwsm" Oct 07 14:10:20 crc kubenswrapper[4854]: I1007 14:10:20.031896 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/959c7e83-9813-44c5-9952-cfccd2c8eaf4-combined-ca-bundle\") pod \"manila-db-sync-lgwsm\" (UID: \"959c7e83-9813-44c5-9952-cfccd2c8eaf4\") " pod="openstack/manila-db-sync-lgwsm" Oct 07 14:10:20 crc kubenswrapper[4854]: I1007 14:10:20.031973 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/959c7e83-9813-44c5-9952-cfccd2c8eaf4-job-config-data\") pod \"manila-db-sync-lgwsm\" (UID: \"959c7e83-9813-44c5-9952-cfccd2c8eaf4\") " pod="openstack/manila-db-sync-lgwsm" Oct 07 14:10:20 crc kubenswrapper[4854]: I1007 14:10:20.032071 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqrdq\" (UniqueName: \"kubernetes.io/projected/959c7e83-9813-44c5-9952-cfccd2c8eaf4-kube-api-access-lqrdq\") pod \"manila-db-sync-lgwsm\" (UID: 
\"959c7e83-9813-44c5-9952-cfccd2c8eaf4\") " pod="openstack/manila-db-sync-lgwsm" Oct 07 14:10:20 crc kubenswrapper[4854]: I1007 14:10:20.040318 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/959c7e83-9813-44c5-9952-cfccd2c8eaf4-combined-ca-bundle\") pod \"manila-db-sync-lgwsm\" (UID: \"959c7e83-9813-44c5-9952-cfccd2c8eaf4\") " pod="openstack/manila-db-sync-lgwsm" Oct 07 14:10:20 crc kubenswrapper[4854]: I1007 14:10:20.040323 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/959c7e83-9813-44c5-9952-cfccd2c8eaf4-config-data\") pod \"manila-db-sync-lgwsm\" (UID: \"959c7e83-9813-44c5-9952-cfccd2c8eaf4\") " pod="openstack/manila-db-sync-lgwsm" Oct 07 14:10:20 crc kubenswrapper[4854]: I1007 14:10:20.043378 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/959c7e83-9813-44c5-9952-cfccd2c8eaf4-job-config-data\") pod \"manila-db-sync-lgwsm\" (UID: \"959c7e83-9813-44c5-9952-cfccd2c8eaf4\") " pod="openstack/manila-db-sync-lgwsm" Oct 07 14:10:20 crc kubenswrapper[4854]: I1007 14:10:20.065386 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqrdq\" (UniqueName: \"kubernetes.io/projected/959c7e83-9813-44c5-9952-cfccd2c8eaf4-kube-api-access-lqrdq\") pod \"manila-db-sync-lgwsm\" (UID: \"959c7e83-9813-44c5-9952-cfccd2c8eaf4\") " pod="openstack/manila-db-sync-lgwsm" Oct 07 14:10:20 crc kubenswrapper[4854]: I1007 14:10:20.130627 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-lgwsm" Oct 07 14:10:21 crc kubenswrapper[4854]: I1007 14:10:21.019877 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-lgwsm"] Oct 07 14:10:21 crc kubenswrapper[4854]: W1007 14:10:21.021707 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod959c7e83_9813_44c5_9952_cfccd2c8eaf4.slice/crio-385d67e3e1d511876844d027817b66f260a635bb4a5bb34ba4051ba0c661f2bb WatchSource:0}: Error finding container 385d67e3e1d511876844d027817b66f260a635bb4a5bb34ba4051ba0c661f2bb: Status 404 returned error can't find the container with id 385d67e3e1d511876844d027817b66f260a635bb4a5bb34ba4051ba0c661f2bb Oct 07 14:10:21 crc kubenswrapper[4854]: I1007 14:10:21.296027 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-lgwsm" event={"ID":"959c7e83-9813-44c5-9952-cfccd2c8eaf4","Type":"ContainerStarted","Data":"385d67e3e1d511876844d027817b66f260a635bb4a5bb34ba4051ba0c661f2bb"} Oct 07 14:10:25 crc kubenswrapper[4854]: I1007 14:10:25.403813 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 07 14:10:25 crc kubenswrapper[4854]: I1007 14:10:25.681908 4854 scope.go:117] "RemoveContainer" containerID="6003185ab2a816d22883b0da860ffb6267b7e077e1428534400a61043b7d80eb" Oct 07 14:10:26 crc kubenswrapper[4854]: I1007 14:10:26.354562 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-lgwsm" event={"ID":"959c7e83-9813-44c5-9952-cfccd2c8eaf4","Type":"ContainerStarted","Data":"24f4e7711094c67fcad0eabe63dc95228142be094d99fb4e8a5c64835f25e355"} Oct 07 14:10:26 crc kubenswrapper[4854]: I1007 14:10:26.383923 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-lgwsm" 
podStartSLOduration=3.476464502 podStartE2EDuration="7.383894881s" podCreationTimestamp="2025-10-07 14:10:19 +0000 UTC" firstStartedPulling="2025-10-07 14:10:21.025331108 +0000 UTC m=+6337.013163353" lastFinishedPulling="2025-10-07 14:10:24.932761457 +0000 UTC m=+6340.920593732" observedRunningTime="2025-10-07 14:10:26.375616433 +0000 UTC m=+6342.363448688" watchObservedRunningTime="2025-10-07 14:10:26.383894881 +0000 UTC m=+6342.371727176" Oct 07 14:10:27 crc kubenswrapper[4854]: I1007 14:10:27.366187 4854 generic.go:334] "Generic (PLEG): container finished" podID="959c7e83-9813-44c5-9952-cfccd2c8eaf4" containerID="24f4e7711094c67fcad0eabe63dc95228142be094d99fb4e8a5c64835f25e355" exitCode=0 Oct 07 14:10:27 crc kubenswrapper[4854]: I1007 14:10:27.366287 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-lgwsm" event={"ID":"959c7e83-9813-44c5-9952-cfccd2c8eaf4","Type":"ContainerDied","Data":"24f4e7711094c67fcad0eabe63dc95228142be094d99fb4e8a5c64835f25e355"} Oct 07 14:10:28 crc kubenswrapper[4854]: I1007 14:10:28.918196 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-lgwsm" Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.039256 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/959c7e83-9813-44c5-9952-cfccd2c8eaf4-config-data\") pod \"959c7e83-9813-44c5-9952-cfccd2c8eaf4\" (UID: \"959c7e83-9813-44c5-9952-cfccd2c8eaf4\") " Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.039361 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/959c7e83-9813-44c5-9952-cfccd2c8eaf4-combined-ca-bundle\") pod \"959c7e83-9813-44c5-9952-cfccd2c8eaf4\" (UID: \"959c7e83-9813-44c5-9952-cfccd2c8eaf4\") " Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.039507 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/959c7e83-9813-44c5-9952-cfccd2c8eaf4-job-config-data\") pod \"959c7e83-9813-44c5-9952-cfccd2c8eaf4\" (UID: \"959c7e83-9813-44c5-9952-cfccd2c8eaf4\") " Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.039637 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqrdq\" (UniqueName: \"kubernetes.io/projected/959c7e83-9813-44c5-9952-cfccd2c8eaf4-kube-api-access-lqrdq\") pod \"959c7e83-9813-44c5-9952-cfccd2c8eaf4\" (UID: \"959c7e83-9813-44c5-9952-cfccd2c8eaf4\") " Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.051575 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/959c7e83-9813-44c5-9952-cfccd2c8eaf4-config-data" (OuterVolumeSpecName: "config-data") pod "959c7e83-9813-44c5-9952-cfccd2c8eaf4" (UID: "959c7e83-9813-44c5-9952-cfccd2c8eaf4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.051673 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/959c7e83-9813-44c5-9952-cfccd2c8eaf4-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "959c7e83-9813-44c5-9952-cfccd2c8eaf4" (UID: "959c7e83-9813-44c5-9952-cfccd2c8eaf4"). InnerVolumeSpecName "job-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.052710 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/959c7e83-9813-44c5-9952-cfccd2c8eaf4-kube-api-access-lqrdq" (OuterVolumeSpecName: "kube-api-access-lqrdq") pod "959c7e83-9813-44c5-9952-cfccd2c8eaf4" (UID: "959c7e83-9813-44c5-9952-cfccd2c8eaf4"). InnerVolumeSpecName "kube-api-access-lqrdq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.096437 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/959c7e83-9813-44c5-9952-cfccd2c8eaf4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "959c7e83-9813-44c5-9952-cfccd2c8eaf4" (UID: "959c7e83-9813-44c5-9952-cfccd2c8eaf4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.143375 4854 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/959c7e83-9813-44c5-9952-cfccd2c8eaf4-job-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.143434 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqrdq\" (UniqueName: \"kubernetes.io/projected/959c7e83-9813-44c5-9952-cfccd2c8eaf4-kube-api-access-lqrdq\") on node \"crc\" DevicePath \"\"" Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.143448 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/959c7e83-9813-44c5-9952-cfccd2c8eaf4-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.143460 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/959c7e83-9813-44c5-9952-cfccd2c8eaf4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.398279 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-lgwsm" event={"ID":"959c7e83-9813-44c5-9952-cfccd2c8eaf4","Type":"ContainerDied","Data":"385d67e3e1d511876844d027817b66f260a635bb4a5bb34ba4051ba0c661f2bb"} Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.398322 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="385d67e3e1d511876844d027817b66f260a635bb4a5bb34ba4051ba0c661f2bb" Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.398338 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-lgwsm" Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.735175 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Oct 07 14:10:29 crc kubenswrapper[4854]: E1007 14:10:29.736126 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="959c7e83-9813-44c5-9952-cfccd2c8eaf4" containerName="manila-db-sync" Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.736143 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="959c7e83-9813-44c5-9952-cfccd2c8eaf4" containerName="manila-db-sync" Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.736379 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="959c7e83-9813-44c5-9952-cfccd2c8eaf4" containerName="manila-db-sync" Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.737590 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.740432 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-h57km" Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.740834 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts" Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.741007 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.741208 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.753120 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.754977 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.768948 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.776805 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.818061 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.833602 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74bbfd59d5-x4jc9"] Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.835252 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74bbfd59d5-x4jc9" Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.858295 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/660e5edd-fac1-47c5-855e-d1f5dc5aa455-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"660e5edd-fac1-47c5-855e-d1f5dc5aa455\") " pod="openstack/manila-scheduler-0" Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.858375 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/660e5edd-fac1-47c5-855e-d1f5dc5aa455-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"660e5edd-fac1-47c5-855e-d1f5dc5aa455\") " pod="openstack/manila-scheduler-0" Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.858408 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d6d3a535-80ee-43a0-8f03-30206d07d28c-ceph\") pod \"manila-share-share1-0\" (UID: \"d6d3a535-80ee-43a0-8f03-30206d07d28c\") " pod="openstack/manila-share-share1-0" Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.858435 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6d3a535-80ee-43a0-8f03-30206d07d28c-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"d6d3a535-80ee-43a0-8f03-30206d07d28c\") " pod="openstack/manila-share-share1-0" Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.858459 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/660e5edd-fac1-47c5-855e-d1f5dc5aa455-scripts\") pod \"manila-scheduler-0\" (UID: \"660e5edd-fac1-47c5-855e-d1f5dc5aa455\") " pod="openstack/manila-scheduler-0" Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.858505 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d6d3a535-80ee-43a0-8f03-30206d07d28c-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"d6d3a535-80ee-43a0-8f03-30206d07d28c\") " pod="openstack/manila-share-share1-0" Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.858536 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjzj8\" (UniqueName: \"kubernetes.io/projected/d6d3a535-80ee-43a0-8f03-30206d07d28c-kube-api-access-mjzj8\") pod \"manila-share-share1-0\" (UID: \"d6d3a535-80ee-43a0-8f03-30206d07d28c\") " pod="openstack/manila-share-share1-0" Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.858614 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6d3a535-80ee-43a0-8f03-30206d07d28c-config-data\") pod \"manila-share-share1-0\" (UID: \"d6d3a535-80ee-43a0-8f03-30206d07d28c\") " pod="openstack/manila-share-share1-0" Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.858646 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d6d3a535-80ee-43a0-8f03-30206d07d28c-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"d6d3a535-80ee-43a0-8f03-30206d07d28c\") " 
pod="openstack/manila-share-share1-0" Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.858677 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/660e5edd-fac1-47c5-855e-d1f5dc5aa455-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"660e5edd-fac1-47c5-855e-d1f5dc5aa455\") " pod="openstack/manila-scheduler-0" Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.858709 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/660e5edd-fac1-47c5-855e-d1f5dc5aa455-config-data\") pod \"manila-scheduler-0\" (UID: \"660e5edd-fac1-47c5-855e-d1f5dc5aa455\") " pod="openstack/manila-scheduler-0" Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.858781 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/d6d3a535-80ee-43a0-8f03-30206d07d28c-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"d6d3a535-80ee-43a0-8f03-30206d07d28c\") " pod="openstack/manila-share-share1-0" Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.858823 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxh4c\" (UniqueName: \"kubernetes.io/projected/660e5edd-fac1-47c5-855e-d1f5dc5aa455-kube-api-access-gxh4c\") pod \"manila-scheduler-0\" (UID: \"660e5edd-fac1-47c5-855e-d1f5dc5aa455\") " pod="openstack/manila-scheduler-0" Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.858878 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6d3a535-80ee-43a0-8f03-30206d07d28c-scripts\") pod \"manila-share-share1-0\" (UID: \"d6d3a535-80ee-43a0-8f03-30206d07d28c\") " pod="openstack/manila-share-share1-0" Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.862718 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74bbfd59d5-x4jc9"] Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.960474 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9889be68-5da1-49d0-ac67-b649d6d8fc9b-dns-svc\") pod \"dnsmasq-dns-74bbfd59d5-x4jc9\" (UID: \"9889be68-5da1-49d0-ac67-b649d6d8fc9b\") " pod="openstack/dnsmasq-dns-74bbfd59d5-x4jc9" Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.961193 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6d3a535-80ee-43a0-8f03-30206d07d28c-config-data\") pod \"manila-share-share1-0\" (UID: \"d6d3a535-80ee-43a0-8f03-30206d07d28c\") " pod="openstack/manila-share-share1-0" Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.961312 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d6d3a535-80ee-43a0-8f03-30206d07d28c-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"d6d3a535-80ee-43a0-8f03-30206d07d28c\") " pod="openstack/manila-share-share1-0" Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.961410 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d6d3a535-80ee-43a0-8f03-30206d07d28c-etc-machine-id\") pod 
\"manila-share-share1-0\" (UID: \"d6d3a535-80ee-43a0-8f03-30206d07d28c\") " pod="openstack/manila-share-share1-0" Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.961432 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/660e5edd-fac1-47c5-855e-d1f5dc5aa455-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"660e5edd-fac1-47c5-855e-d1f5dc5aa455\") " pod="openstack/manila-scheduler-0" Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.961541 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/660e5edd-fac1-47c5-855e-d1f5dc5aa455-config-data\") pod \"manila-scheduler-0\" (UID: \"660e5edd-fac1-47c5-855e-d1f5dc5aa455\") " pod="openstack/manila-scheduler-0" Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.961572 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9889be68-5da1-49d0-ac67-b649d6d8fc9b-ovsdbserver-nb\") pod \"dnsmasq-dns-74bbfd59d5-x4jc9\" (UID: \"9889be68-5da1-49d0-ac67-b649d6d8fc9b\") " pod="openstack/dnsmasq-dns-74bbfd59d5-x4jc9" Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.961633 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rxjw\" (UniqueName: \"kubernetes.io/projected/9889be68-5da1-49d0-ac67-b649d6d8fc9b-kube-api-access-7rxjw\") pod \"dnsmasq-dns-74bbfd59d5-x4jc9\" (UID: \"9889be68-5da1-49d0-ac67-b649d6d8fc9b\") " pod="openstack/dnsmasq-dns-74bbfd59d5-x4jc9" Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.961725 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/660e5edd-fac1-47c5-855e-d1f5dc5aa455-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"660e5edd-fac1-47c5-855e-d1f5dc5aa455\") " pod="openstack/manila-scheduler-0" Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.961754 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/d6d3a535-80ee-43a0-8f03-30206d07d28c-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"d6d3a535-80ee-43a0-8f03-30206d07d28c\") " pod="openstack/manila-share-share1-0" Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.961892 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/d6d3a535-80ee-43a0-8f03-30206d07d28c-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"d6d3a535-80ee-43a0-8f03-30206d07d28c\") " pod="openstack/manila-share-share1-0" Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.962001 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxh4c\" (UniqueName: \"kubernetes.io/projected/660e5edd-fac1-47c5-855e-d1f5dc5aa455-kube-api-access-gxh4c\") pod \"manila-scheduler-0\" (UID: \"660e5edd-fac1-47c5-855e-d1f5dc5aa455\") " pod="openstack/manila-scheduler-0" Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.962128 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9889be68-5da1-49d0-ac67-b649d6d8fc9b-config\") pod \"dnsmasq-dns-74bbfd59d5-x4jc9\" (UID: \"9889be68-5da1-49d0-ac67-b649d6d8fc9b\") " pod="openstack/dnsmasq-dns-74bbfd59d5-x4jc9" Oct 
07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.962266 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6d3a535-80ee-43a0-8f03-30206d07d28c-scripts\") pod \"manila-share-share1-0\" (UID: \"d6d3a535-80ee-43a0-8f03-30206d07d28c\") " pod="openstack/manila-share-share1-0" Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.962423 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/660e5edd-fac1-47c5-855e-d1f5dc5aa455-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"660e5edd-fac1-47c5-855e-d1f5dc5aa455\") " pod="openstack/manila-scheduler-0" Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.962560 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/660e5edd-fac1-47c5-855e-d1f5dc5aa455-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"660e5edd-fac1-47c5-855e-d1f5dc5aa455\") " pod="openstack/manila-scheduler-0" Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.962681 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d6d3a535-80ee-43a0-8f03-30206d07d28c-ceph\") pod \"manila-share-share1-0\" (UID: \"d6d3a535-80ee-43a0-8f03-30206d07d28c\") " pod="openstack/manila-share-share1-0" Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.962785 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6d3a535-80ee-43a0-8f03-30206d07d28c-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"d6d3a535-80ee-43a0-8f03-30206d07d28c\") " pod="openstack/manila-share-share1-0" Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.962899 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/660e5edd-fac1-47c5-855e-d1f5dc5aa455-scripts\") pod \"manila-scheduler-0\" (UID: \"660e5edd-fac1-47c5-855e-d1f5dc5aa455\") " pod="openstack/manila-scheduler-0" Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.963039 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9889be68-5da1-49d0-ac67-b649d6d8fc9b-ovsdbserver-sb\") pod \"dnsmasq-dns-74bbfd59d5-x4jc9\" (UID: \"9889be68-5da1-49d0-ac67-b649d6d8fc9b\") " pod="openstack/dnsmasq-dns-74bbfd59d5-x4jc9" Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.963226 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d6d3a535-80ee-43a0-8f03-30206d07d28c-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"d6d3a535-80ee-43a0-8f03-30206d07d28c\") " pod="openstack/manila-share-share1-0" Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.963369 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjzj8\" (UniqueName: \"kubernetes.io/projected/d6d3a535-80ee-43a0-8f03-30206d07d28c-kube-api-access-mjzj8\") pod \"manila-share-share1-0\" (UID: \"d6d3a535-80ee-43a0-8f03-30206d07d28c\") " pod="openstack/manila-share-share1-0" Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.970333 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/660e5edd-fac1-47c5-855e-d1f5dc5aa455-config-data\") pod \"manila-scheduler-0\" (UID: \"660e5edd-fac1-47c5-855e-d1f5dc5aa455\") " pod="openstack/manila-scheduler-0" Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.970994 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6d3a535-80ee-43a0-8f03-30206d07d28c-scripts\") pod \"manila-share-share1-0\" (UID: \"d6d3a535-80ee-43a0-8f03-30206d07d28c\") " pod="openstack/manila-share-share1-0" Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.971213 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/660e5edd-fac1-47c5-855e-d1f5dc5aa455-scripts\") pod \"manila-scheduler-0\" (UID: \"660e5edd-fac1-47c5-855e-d1f5dc5aa455\") " pod="openstack/manila-scheduler-0" Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.971361 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/660e5edd-fac1-47c5-855e-d1f5dc5aa455-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"660e5edd-fac1-47c5-855e-d1f5dc5aa455\") " pod="openstack/manila-scheduler-0" Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.971747 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6d3a535-80ee-43a0-8f03-30206d07d28c-config-data\") pod \"manila-share-share1-0\" (UID: \"d6d3a535-80ee-43a0-8f03-30206d07d28c\") " pod="openstack/manila-share-share1-0" Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.972929 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d6d3a535-80ee-43a0-8f03-30206d07d28c-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"d6d3a535-80ee-43a0-8f03-30206d07d28c\") " pod="openstack/manila-share-share1-0" Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.974708 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/660e5edd-fac1-47c5-855e-d1f5dc5aa455-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"660e5edd-fac1-47c5-855e-d1f5dc5aa455\") " pod="openstack/manila-scheduler-0" Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.983664 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d6d3a535-80ee-43a0-8f03-30206d07d28c-ceph\") pod \"manila-share-share1-0\" (UID: \"d6d3a535-80ee-43a0-8f03-30206d07d28c\") " pod="openstack/manila-share-share1-0" Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.985654 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxh4c\" (UniqueName: \"kubernetes.io/projected/660e5edd-fac1-47c5-855e-d1f5dc5aa455-kube-api-access-gxh4c\") pod \"manila-scheduler-0\" (UID: \"660e5edd-fac1-47c5-855e-d1f5dc5aa455\") " pod="openstack/manila-scheduler-0" Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.988821 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6d3a535-80ee-43a0-8f03-30206d07d28c-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"d6d3a535-80ee-43a0-8f03-30206d07d28c\") " pod="openstack/manila-share-share1-0" Oct 07 14:10:29 crc kubenswrapper[4854]: I1007 14:10:29.993405 4854 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-mjzj8\" (UniqueName: \"kubernetes.io/projected/d6d3a535-80ee-43a0-8f03-30206d07d28c-kube-api-access-mjzj8\") pod \"manila-share-share1-0\" (UID: \"d6d3a535-80ee-43a0-8f03-30206d07d28c\") " pod="openstack/manila-share-share1-0" Oct 07 14:10:30 crc kubenswrapper[4854]: I1007 14:10:30.065109 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9889be68-5da1-49d0-ac67-b649d6d8fc9b-ovsdbserver-sb\") pod \"dnsmasq-dns-74bbfd59d5-x4jc9\" (UID: \"9889be68-5da1-49d0-ac67-b649d6d8fc9b\") " pod="openstack/dnsmasq-dns-74bbfd59d5-x4jc9" Oct 07 14:10:30 crc kubenswrapper[4854]: I1007 14:10:30.065249 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9889be68-5da1-49d0-ac67-b649d6d8fc9b-dns-svc\") pod \"dnsmasq-dns-74bbfd59d5-x4jc9\" (UID: \"9889be68-5da1-49d0-ac67-b649d6d8fc9b\") " pod="openstack/dnsmasq-dns-74bbfd59d5-x4jc9" Oct 07 14:10:30 crc kubenswrapper[4854]: I1007 14:10:30.065354 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9889be68-5da1-49d0-ac67-b649d6d8fc9b-ovsdbserver-nb\") pod \"dnsmasq-dns-74bbfd59d5-x4jc9\" (UID: \"9889be68-5da1-49d0-ac67-b649d6d8fc9b\") " pod="openstack/dnsmasq-dns-74bbfd59d5-x4jc9" Oct 07 14:10:30 crc kubenswrapper[4854]: I1007 14:10:30.065390 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rxjw\" (UniqueName: \"kubernetes.io/projected/9889be68-5da1-49d0-ac67-b649d6d8fc9b-kube-api-access-7rxjw\") pod \"dnsmasq-dns-74bbfd59d5-x4jc9\" (UID: \"9889be68-5da1-49d0-ac67-b649d6d8fc9b\") " pod="openstack/dnsmasq-dns-74bbfd59d5-x4jc9" Oct 07 14:10:30 crc kubenswrapper[4854]: I1007 14:10:30.065488 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9889be68-5da1-49d0-ac67-b649d6d8fc9b-config\") pod \"dnsmasq-dns-74bbfd59d5-x4jc9\" (UID: \"9889be68-5da1-49d0-ac67-b649d6d8fc9b\") " pod="openstack/dnsmasq-dns-74bbfd59d5-x4jc9" Oct 07 14:10:30 crc kubenswrapper[4854]: I1007 14:10:30.066850 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9889be68-5da1-49d0-ac67-b649d6d8fc9b-config\") pod \"dnsmasq-dns-74bbfd59d5-x4jc9\" (UID: \"9889be68-5da1-49d0-ac67-b649d6d8fc9b\") " pod="openstack/dnsmasq-dns-74bbfd59d5-x4jc9" Oct 07 14:10:30 crc kubenswrapper[4854]: I1007 14:10:30.067075 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9889be68-5da1-49d0-ac67-b649d6d8fc9b-dns-svc\") pod \"dnsmasq-dns-74bbfd59d5-x4jc9\" (UID: \"9889be68-5da1-49d0-ac67-b649d6d8fc9b\") " pod="openstack/dnsmasq-dns-74bbfd59d5-x4jc9" Oct 07 14:10:30 crc kubenswrapper[4854]: I1007 14:10:30.068094 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9889be68-5da1-49d0-ac67-b649d6d8fc9b-ovsdbserver-nb\") pod \"dnsmasq-dns-74bbfd59d5-x4jc9\" (UID: \"9889be68-5da1-49d0-ac67-b649d6d8fc9b\") " pod="openstack/dnsmasq-dns-74bbfd59d5-x4jc9" Oct 07 14:10:30 crc kubenswrapper[4854]: I1007 14:10:30.068244 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/9889be68-5da1-49d0-ac67-b649d6d8fc9b-ovsdbserver-sb\") pod \"dnsmasq-dns-74bbfd59d5-x4jc9\" (UID: \"9889be68-5da1-49d0-ac67-b649d6d8fc9b\") " pod="openstack/dnsmasq-dns-74bbfd59d5-x4jc9" Oct 07 14:10:30 crc kubenswrapper[4854]: I1007 14:10:30.078351 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Oct 07 14:10:30 crc kubenswrapper[4854]: I1007 14:10:30.095237 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rxjw\" (UniqueName: \"kubernetes.io/projected/9889be68-5da1-49d0-ac67-b649d6d8fc9b-kube-api-access-7rxjw\") pod \"dnsmasq-dns-74bbfd59d5-x4jc9\" (UID: \"9889be68-5da1-49d0-ac67-b649d6d8fc9b\") " pod="openstack/dnsmasq-dns-74bbfd59d5-x4jc9" Oct 07 14:10:30 crc kubenswrapper[4854]: I1007 14:10:30.101477 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Oct 07 14:10:30 crc kubenswrapper[4854]: I1007 14:10:30.169807 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Oct 07 14:10:30 crc kubenswrapper[4854]: I1007 14:10:30.171701 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74bbfd59d5-x4jc9" Oct 07 14:10:30 crc kubenswrapper[4854]: I1007 14:10:30.171989 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Oct 07 14:10:30 crc kubenswrapper[4854]: I1007 14:10:30.175409 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Oct 07 14:10:30 crc kubenswrapper[4854]: I1007 14:10:30.190821 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Oct 07 14:10:30 crc kubenswrapper[4854]: I1007 14:10:30.270631 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cq48t\" (UniqueName: \"kubernetes.io/projected/f2308954-dc7b-4806-8cb0-171b0fc0de08-kube-api-access-cq48t\") pod \"manila-api-0\" (UID: \"f2308954-dc7b-4806-8cb0-171b0fc0de08\") " pod="openstack/manila-api-0" Oct 07 14:10:30 crc kubenswrapper[4854]: I1007 14:10:30.271273 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f2308954-dc7b-4806-8cb0-171b0fc0de08-etc-machine-id\") pod \"manila-api-0\" (UID: \"f2308954-dc7b-4806-8cb0-171b0fc0de08\") " pod="openstack/manila-api-0" Oct 07 14:10:30 crc kubenswrapper[4854]: I1007 14:10:30.271410 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f2308954-dc7b-4806-8cb0-171b0fc0de08-config-data-custom\") pod \"manila-api-0\" (UID: \"f2308954-dc7b-4806-8cb0-171b0fc0de08\") " pod="openstack/manila-api-0" Oct 07 14:10:30 crc kubenswrapper[4854]: I1007 14:10:30.288275 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2308954-dc7b-4806-8cb0-171b0fc0de08-scripts\") pod \"manila-api-0\" (UID: \"f2308954-dc7b-4806-8cb0-171b0fc0de08\") " pod="openstack/manila-api-0" Oct 07 14:10:30 crc kubenswrapper[4854]: I1007 14:10:30.288443 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2308954-dc7b-4806-8cb0-171b0fc0de08-config-data\") pod 
\"manila-api-0\" (UID: \"f2308954-dc7b-4806-8cb0-171b0fc0de08\") " pod="openstack/manila-api-0" Oct 07 14:10:30 crc kubenswrapper[4854]: I1007 14:10:30.288510 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2308954-dc7b-4806-8cb0-171b0fc0de08-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"f2308954-dc7b-4806-8cb0-171b0fc0de08\") " pod="openstack/manila-api-0" Oct 07 14:10:30 crc kubenswrapper[4854]: I1007 14:10:30.288673 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2308954-dc7b-4806-8cb0-171b0fc0de08-logs\") pod \"manila-api-0\" (UID: \"f2308954-dc7b-4806-8cb0-171b0fc0de08\") " pod="openstack/manila-api-0" Oct 07 14:10:30 crc kubenswrapper[4854]: I1007 14:10:30.392759 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cq48t\" (UniqueName: \"kubernetes.io/projected/f2308954-dc7b-4806-8cb0-171b0fc0de08-kube-api-access-cq48t\") pod \"manila-api-0\" (UID: \"f2308954-dc7b-4806-8cb0-171b0fc0de08\") " pod="openstack/manila-api-0" Oct 07 14:10:30 crc kubenswrapper[4854]: I1007 14:10:30.392819 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f2308954-dc7b-4806-8cb0-171b0fc0de08-etc-machine-id\") pod \"manila-api-0\" (UID: \"f2308954-dc7b-4806-8cb0-171b0fc0de08\") " pod="openstack/manila-api-0" Oct 07 14:10:30 crc kubenswrapper[4854]: I1007 14:10:30.392840 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f2308954-dc7b-4806-8cb0-171b0fc0de08-config-data-custom\") pod \"manila-api-0\" (UID: \"f2308954-dc7b-4806-8cb0-171b0fc0de08\") " pod="openstack/manila-api-0" Oct 07 14:10:30 crc kubenswrapper[4854]: I1007 14:10:30.392868 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2308954-dc7b-4806-8cb0-171b0fc0de08-scripts\") pod \"manila-api-0\" (UID: \"f2308954-dc7b-4806-8cb0-171b0fc0de08\") " pod="openstack/manila-api-0" Oct 07 14:10:30 crc kubenswrapper[4854]: I1007 14:10:30.392918 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2308954-dc7b-4806-8cb0-171b0fc0de08-config-data\") pod \"manila-api-0\" (UID: \"f2308954-dc7b-4806-8cb0-171b0fc0de08\") " pod="openstack/manila-api-0" Oct 07 14:10:30 crc kubenswrapper[4854]: I1007 14:10:30.392938 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2308954-dc7b-4806-8cb0-171b0fc0de08-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"f2308954-dc7b-4806-8cb0-171b0fc0de08\") " pod="openstack/manila-api-0" Oct 07 14:10:30 crc kubenswrapper[4854]: I1007 14:10:30.392976 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2308954-dc7b-4806-8cb0-171b0fc0de08-logs\") pod \"manila-api-0\" (UID: \"f2308954-dc7b-4806-8cb0-171b0fc0de08\") " pod="openstack/manila-api-0" Oct 07 14:10:30 crc kubenswrapper[4854]: I1007 14:10:30.393734 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/f2308954-dc7b-4806-8cb0-171b0fc0de08-etc-machine-id\") pod \"manila-api-0\" (UID: \"f2308954-dc7b-4806-8cb0-171b0fc0de08\") " pod="openstack/manila-api-0" Oct 07 14:10:30 crc kubenswrapper[4854]: I1007 14:10:30.394024 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2308954-dc7b-4806-8cb0-171b0fc0de08-logs\") pod \"manila-api-0\" (UID: \"f2308954-dc7b-4806-8cb0-171b0fc0de08\") " pod="openstack/manila-api-0" Oct 07 14:10:30 crc kubenswrapper[4854]: I1007 14:10:30.409524 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2308954-dc7b-4806-8cb0-171b0fc0de08-scripts\") pod \"manila-api-0\" (UID: \"f2308954-dc7b-4806-8cb0-171b0fc0de08\") " pod="openstack/manila-api-0" Oct 07 14:10:30 crc kubenswrapper[4854]: I1007 14:10:30.409934 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2308954-dc7b-4806-8cb0-171b0fc0de08-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"f2308954-dc7b-4806-8cb0-171b0fc0de08\") " pod="openstack/manila-api-0" Oct 07 14:10:30 crc kubenswrapper[4854]: I1007 14:10:30.410236 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2308954-dc7b-4806-8cb0-171b0fc0de08-config-data\") pod \"manila-api-0\" (UID: \"f2308954-dc7b-4806-8cb0-171b0fc0de08\") " pod="openstack/manila-api-0" Oct 07 14:10:30 crc kubenswrapper[4854]: I1007 14:10:30.413754 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f2308954-dc7b-4806-8cb0-171b0fc0de08-config-data-custom\") pod \"manila-api-0\" (UID: \"f2308954-dc7b-4806-8cb0-171b0fc0de08\") " pod="openstack/manila-api-0" Oct 07 14:10:30 crc kubenswrapper[4854]: I1007 14:10:30.417250 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cq48t\" (UniqueName: \"kubernetes.io/projected/f2308954-dc7b-4806-8cb0-171b0fc0de08-kube-api-access-cq48t\") pod \"manila-api-0\" (UID: \"f2308954-dc7b-4806-8cb0-171b0fc0de08\") " pod="openstack/manila-api-0" Oct 07 14:10:30 crc kubenswrapper[4854]: I1007 14:10:30.539683 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Oct 07 14:10:30 crc kubenswrapper[4854]: I1007 14:10:30.771194 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Oct 07 14:10:30 crc kubenswrapper[4854]: I1007 14:10:30.888875 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Oct 07 14:10:30 crc kubenswrapper[4854]: I1007 14:10:30.992287 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74bbfd59d5-x4jc9"] Oct 07 14:10:31 crc kubenswrapper[4854]: I1007 14:10:31.245046 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Oct 07 14:10:31 crc kubenswrapper[4854]: I1007 14:10:31.438127 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"f2308954-dc7b-4806-8cb0-171b0fc0de08","Type":"ContainerStarted","Data":"2ede6db7d038e49ceeb9482a8c176728c930f95f8b4e2f2b21f46cdc093adfa6"} Oct 07 14:10:31 crc kubenswrapper[4854]: I1007 14:10:31.439842 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"660e5edd-fac1-47c5-855e-d1f5dc5aa455","Type":"ContainerStarted","Data":"8b9ba1196c4e07f39dd010128d596d2de5f1c8efd56a8233c07f766a9857d9d0"} Oct 07 14:10:31 crc kubenswrapper[4854]: I1007 14:10:31.441847 4854 generic.go:334] "Generic (PLEG): container finished" podID="9889be68-5da1-49d0-ac67-b649d6d8fc9b" containerID="f1dd65f6c8878478b719df919f50ad5c51d680d7c656e5e5bab97970d78aabc7" exitCode=0 Oct 07 14:10:31 crc kubenswrapper[4854]: I1007 14:10:31.441908 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74bbfd59d5-x4jc9" event={"ID":"9889be68-5da1-49d0-ac67-b649d6d8fc9b","Type":"ContainerDied","Data":"f1dd65f6c8878478b719df919f50ad5c51d680d7c656e5e5bab97970d78aabc7"} Oct 07 14:10:31 crc kubenswrapper[4854]: I1007 14:10:31.441928 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74bbfd59d5-x4jc9" event={"ID":"9889be68-5da1-49d0-ac67-b649d6d8fc9b","Type":"ContainerStarted","Data":"f416d1e6a3a1b4db3ebe6ece7e25ba45a8152d5f781b190b25d4dcc19f9e3b31"} Oct 07 14:10:31 crc kubenswrapper[4854]: I1007 14:10:31.463482 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"d6d3a535-80ee-43a0-8f03-30206d07d28c","Type":"ContainerStarted","Data":"be5c376f978dc75612cbe78ee6180ae46680073cddb0689c901828f63c6be547"} Oct 07 14:10:32 crc kubenswrapper[4854]: I1007 14:10:32.476995 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"f2308954-dc7b-4806-8cb0-171b0fc0de08","Type":"ContainerStarted","Data":"18a7a446ff76893abd8a5071df95497aa54651adb23c274cc349a56db53ed362"} Oct 07 14:10:32 crc kubenswrapper[4854]: I1007 14:10:32.477696 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"f2308954-dc7b-4806-8cb0-171b0fc0de08","Type":"ContainerStarted","Data":"3ddef9d1b696198a50a8e9496d23fff8d6d3c9ee10c4868cc7c12c59d0e0a79c"} Oct 07 14:10:32 crc kubenswrapper[4854]: I1007 14:10:32.478734 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Oct 07 14:10:32 crc kubenswrapper[4854]: I1007 14:10:32.487434 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"660e5edd-fac1-47c5-855e-d1f5dc5aa455","Type":"ContainerStarted","Data":"7b4825d0d74985455974dfd1453ce69eacde239181fbe674f9d3b1022881f6a6"} Oct 07 14:10:32 
crc kubenswrapper[4854]: I1007 14:10:32.489786 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74bbfd59d5-x4jc9" event={"ID":"9889be68-5da1-49d0-ac67-b649d6d8fc9b","Type":"ContainerStarted","Data":"8baa1265ac2871963fce0f42f6b8e431cdaa4ece21870b7769548dd68697f9fa"} Oct 07 14:10:32 crc kubenswrapper[4854]: I1007 14:10:32.491013 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74bbfd59d5-x4jc9" Oct 07 14:10:32 crc kubenswrapper[4854]: I1007 14:10:32.505560 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=2.505540518 podStartE2EDuration="2.505540518s" podCreationTimestamp="2025-10-07 14:10:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:10:32.503425667 +0000 UTC m=+6348.491257922" watchObservedRunningTime="2025-10-07 14:10:32.505540518 +0000 UTC m=+6348.493372773" Oct 07 14:10:32 crc kubenswrapper[4854]: I1007 14:10:32.536180 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74bbfd59d5-x4jc9" podStartSLOduration=3.536135727 podStartE2EDuration="3.536135727s" podCreationTimestamp="2025-10-07 14:10:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:10:32.52161088 +0000 UTC m=+6348.509443135" watchObservedRunningTime="2025-10-07 14:10:32.536135727 +0000 UTC m=+6348.523967982" Oct 07 14:10:33 crc kubenswrapper[4854]: I1007 14:10:33.501596 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"660e5edd-fac1-47c5-855e-d1f5dc5aa455","Type":"ContainerStarted","Data":"62d02a70619ad5c95fbf511ce2cf5ecc095bef9829580227703b018b2b53516a"} Oct 07 14:10:34 crc kubenswrapper[4854]: I1007 14:10:34.736708 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=4.805115767 podStartE2EDuration="5.736687023s" podCreationTimestamp="2025-10-07 14:10:29 +0000 UTC" firstStartedPulling="2025-10-07 14:10:30.773677078 +0000 UTC m=+6346.761509333" lastFinishedPulling="2025-10-07 14:10:31.705248334 +0000 UTC m=+6347.693080589" observedRunningTime="2025-10-07 14:10:33.530837757 +0000 UTC m=+6349.518670012" watchObservedRunningTime="2025-10-07 14:10:34.736687023 +0000 UTC m=+6350.724519268" Oct 07 14:10:39 crc kubenswrapper[4854]: I1007 14:10:39.569244 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"d6d3a535-80ee-43a0-8f03-30206d07d28c","Type":"ContainerStarted","Data":"b39711a5de3f75c5614790f41d6bae8fabaff0703f942f1826b9e1697f011e2b"} Oct 07 14:10:39 crc kubenswrapper[4854]: I1007 14:10:39.569645 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"d6d3a535-80ee-43a0-8f03-30206d07d28c","Type":"ContainerStarted","Data":"ac71de2e96c0dd0b078cdc1c2166bb0c08fc3546df504197ee60cc32f40b3ab6"} Oct 07 14:10:39 crc kubenswrapper[4854]: I1007 14:10:39.618723 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=3.438989056 podStartE2EDuration="10.618695884s" podCreationTimestamp="2025-10-07 14:10:29 +0000 UTC" firstStartedPulling="2025-10-07 14:10:30.884634686 +0000 UTC m=+6346.872466941" lastFinishedPulling="2025-10-07 
14:10:38.064341504 +0000 UTC m=+6354.052173769" observedRunningTime="2025-10-07 14:10:39.604306371 +0000 UTC m=+6355.592138636" watchObservedRunningTime="2025-10-07 14:10:39.618695884 +0000 UTC m=+6355.606528149" Oct 07 14:10:40 crc kubenswrapper[4854]: I1007 14:10:40.078936 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Oct 07 14:10:40 crc kubenswrapper[4854]: I1007 14:10:40.102575 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Oct 07 14:10:40 crc kubenswrapper[4854]: I1007 14:10:40.175483 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-74bbfd59d5-x4jc9" Oct 07 14:10:40 crc kubenswrapper[4854]: I1007 14:10:40.305898 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-549dd5b965-mr5x2"] Oct 07 14:10:40 crc kubenswrapper[4854]: I1007 14:10:40.307373 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-549dd5b965-mr5x2" podUID="8823e4f7-cb5e-4f62-9db1-84f8d4f28b93" containerName="dnsmasq-dns" containerID="cri-o://37bf59643938313b465d6a9f5849fa5e4e43ae91174ed85f53db0289760ffb7d" gracePeriod=10 Oct 07 14:10:40 crc kubenswrapper[4854]: I1007 14:10:40.592604 4854 generic.go:334] "Generic (PLEG): container finished" podID="8823e4f7-cb5e-4f62-9db1-84f8d4f28b93" containerID="37bf59643938313b465d6a9f5849fa5e4e43ae91174ed85f53db0289760ffb7d" exitCode=0 Oct 07 14:10:40 crc kubenswrapper[4854]: I1007 14:10:40.594843 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-549dd5b965-mr5x2" event={"ID":"8823e4f7-cb5e-4f62-9db1-84f8d4f28b93","Type":"ContainerDied","Data":"37bf59643938313b465d6a9f5849fa5e4e43ae91174ed85f53db0289760ffb7d"} Oct 07 14:10:40 crc kubenswrapper[4854]: I1007 14:10:40.877670 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-549dd5b965-mr5x2" Oct 07 14:10:40 crc kubenswrapper[4854]: I1007 14:10:40.952644 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8823e4f7-cb5e-4f62-9db1-84f8d4f28b93-config\") pod \"8823e4f7-cb5e-4f62-9db1-84f8d4f28b93\" (UID: \"8823e4f7-cb5e-4f62-9db1-84f8d4f28b93\") " Oct 07 14:10:40 crc kubenswrapper[4854]: I1007 14:10:40.952767 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8823e4f7-cb5e-4f62-9db1-84f8d4f28b93-ovsdbserver-sb\") pod \"8823e4f7-cb5e-4f62-9db1-84f8d4f28b93\" (UID: \"8823e4f7-cb5e-4f62-9db1-84f8d4f28b93\") " Oct 07 14:10:40 crc kubenswrapper[4854]: I1007 14:10:40.952845 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8823e4f7-cb5e-4f62-9db1-84f8d4f28b93-ovsdbserver-nb\") pod \"8823e4f7-cb5e-4f62-9db1-84f8d4f28b93\" (UID: \"8823e4f7-cb5e-4f62-9db1-84f8d4f28b93\") " Oct 07 14:10:40 crc kubenswrapper[4854]: I1007 14:10:40.953355 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mv97\" (UniqueName: \"kubernetes.io/projected/8823e4f7-cb5e-4f62-9db1-84f8d4f28b93-kube-api-access-5mv97\") pod \"8823e4f7-cb5e-4f62-9db1-84f8d4f28b93\" (UID: \"8823e4f7-cb5e-4f62-9db1-84f8d4f28b93\") " Oct 07 14:10:40 crc kubenswrapper[4854]: I1007 14:10:40.953606 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8823e4f7-cb5e-4f62-9db1-84f8d4f28b93-dns-svc\") pod \"8823e4f7-cb5e-4f62-9db1-84f8d4f28b93\" (UID: \"8823e4f7-cb5e-4f62-9db1-84f8d4f28b93\") " Oct 07 14:10:40 crc kubenswrapper[4854]: I1007 14:10:40.960434 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8823e4f7-cb5e-4f62-9db1-84f8d4f28b93-kube-api-access-5mv97" (OuterVolumeSpecName: "kube-api-access-5mv97") pod "8823e4f7-cb5e-4f62-9db1-84f8d4f28b93" (UID: "8823e4f7-cb5e-4f62-9db1-84f8d4f28b93"). InnerVolumeSpecName "kube-api-access-5mv97". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:10:41 crc kubenswrapper[4854]: I1007 14:10:41.018001 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8823e4f7-cb5e-4f62-9db1-84f8d4f28b93-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8823e4f7-cb5e-4f62-9db1-84f8d4f28b93" (UID: "8823e4f7-cb5e-4f62-9db1-84f8d4f28b93"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:10:41 crc kubenswrapper[4854]: I1007 14:10:41.018648 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8823e4f7-cb5e-4f62-9db1-84f8d4f28b93-config" (OuterVolumeSpecName: "config") pod "8823e4f7-cb5e-4f62-9db1-84f8d4f28b93" (UID: "8823e4f7-cb5e-4f62-9db1-84f8d4f28b93"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:10:41 crc kubenswrapper[4854]: I1007 14:10:41.038947 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8823e4f7-cb5e-4f62-9db1-84f8d4f28b93-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8823e4f7-cb5e-4f62-9db1-84f8d4f28b93" (UID: "8823e4f7-cb5e-4f62-9db1-84f8d4f28b93"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:10:41 crc kubenswrapper[4854]: I1007 14:10:41.043551 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8823e4f7-cb5e-4f62-9db1-84f8d4f28b93-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8823e4f7-cb5e-4f62-9db1-84f8d4f28b93" (UID: "8823e4f7-cb5e-4f62-9db1-84f8d4f28b93"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:10:41 crc kubenswrapper[4854]: I1007 14:10:41.056485 4854 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8823e4f7-cb5e-4f62-9db1-84f8d4f28b93-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 14:10:41 crc kubenswrapper[4854]: I1007 14:10:41.056647 4854 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8823e4f7-cb5e-4f62-9db1-84f8d4f28b93-config\") on node \"crc\" DevicePath \"\"" Oct 07 14:10:41 crc kubenswrapper[4854]: I1007 14:10:41.056720 4854 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8823e4f7-cb5e-4f62-9db1-84f8d4f28b93-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 14:10:41 crc kubenswrapper[4854]: I1007 14:10:41.056781 4854 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8823e4f7-cb5e-4f62-9db1-84f8d4f28b93-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 14:10:41 crc kubenswrapper[4854]: I1007 14:10:41.056833 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mv97\" (UniqueName: \"kubernetes.io/projected/8823e4f7-cb5e-4f62-9db1-84f8d4f28b93-kube-api-access-5mv97\") on node \"crc\" DevicePath \"\"" Oct 07 14:10:41 crc kubenswrapper[4854]: I1007 14:10:41.617621 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-549dd5b965-mr5x2" Oct 07 14:10:41 crc kubenswrapper[4854]: I1007 14:10:41.623606 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-549dd5b965-mr5x2" event={"ID":"8823e4f7-cb5e-4f62-9db1-84f8d4f28b93","Type":"ContainerDied","Data":"6fbd18681706096ccff3d1e93c8378fbe72c4ed8fd9d0491d6db0fc7472347ad"} Oct 07 14:10:41 crc kubenswrapper[4854]: I1007 14:10:41.623691 4854 scope.go:117] "RemoveContainer" containerID="37bf59643938313b465d6a9f5849fa5e4e43ae91174ed85f53db0289760ffb7d" Oct 07 14:10:41 crc kubenswrapper[4854]: I1007 14:10:41.666069 4854 scope.go:117] "RemoveContainer" containerID="9178e6a3cb41fd1be6f120fff0b3b3a3b47a61cc47d34bd8923e0f0eae0fa5b7" Oct 07 14:10:41 crc kubenswrapper[4854]: I1007 14:10:41.671057 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-549dd5b965-mr5x2"] Oct 07 14:10:41 crc kubenswrapper[4854]: I1007 14:10:41.679509 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-549dd5b965-mr5x2"] Oct 07 14:10:42 crc kubenswrapper[4854]: I1007 14:10:42.718276 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8823e4f7-cb5e-4f62-9db1-84f8d4f28b93" path="/var/lib/kubelet/pods/8823e4f7-cb5e-4f62-9db1-84f8d4f28b93/volumes" Oct 07 14:10:43 crc kubenswrapper[4854]: I1007 14:10:43.233722 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 14:10:43 crc kubenswrapper[4854]: I1007 14:10:43.234427 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="76fcc00d-3457-4ea6-a46e-e6c4ac440cdc" containerName="ceilometer-central-agent" containerID="cri-o://5091dfd5d4a9c80a5ae7f2a457bd8231e44ce5de14e8faa458a3a329df7e8595" gracePeriod=30 Oct 07 14:10:43 crc kubenswrapper[4854]: I1007 14:10:43.234527 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="76fcc00d-3457-4ea6-a46e-e6c4ac440cdc" containerName="sg-core" containerID="cri-o://bdedc38f22da7a012d9ed870ea15d9df91dc860e3025d3dc6887d05b54faf7fa" gracePeriod=30 Oct 07 14:10:43 crc kubenswrapper[4854]: I1007 14:10:43.234566 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="76fcc00d-3457-4ea6-a46e-e6c4ac440cdc" containerName="ceilometer-notification-agent" containerID="cri-o://725189faa5beb7fdcd8cb218a915d881898c7570c2ab5a178a822da3b329b1a0" gracePeriod=30 Oct 07 14:10:43 crc kubenswrapper[4854]: I1007 14:10:43.234536 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="76fcc00d-3457-4ea6-a46e-e6c4ac440cdc" containerName="proxy-httpd" containerID="cri-o://167833fabf21dfb298f2fb42a6a4811cd54f54e7fcfcbca96b731d1565db3dd6" gracePeriod=30 Oct 07 14:10:43 crc kubenswrapper[4854]: I1007 14:10:43.645173 4854 generic.go:334] "Generic (PLEG): container finished" podID="76fcc00d-3457-4ea6-a46e-e6c4ac440cdc" containerID="167833fabf21dfb298f2fb42a6a4811cd54f54e7fcfcbca96b731d1565db3dd6" exitCode=0 Oct 07 14:10:43 crc kubenswrapper[4854]: I1007 14:10:43.645479 4854 generic.go:334] "Generic (PLEG): container finished" podID="76fcc00d-3457-4ea6-a46e-e6c4ac440cdc" containerID="bdedc38f22da7a012d9ed870ea15d9df91dc860e3025d3dc6887d05b54faf7fa" exitCode=2 Oct 07 14:10:43 crc kubenswrapper[4854]: I1007 14:10:43.645500 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"76fcc00d-3457-4ea6-a46e-e6c4ac440cdc","Type":"ContainerDied","Data":"167833fabf21dfb298f2fb42a6a4811cd54f54e7fcfcbca96b731d1565db3dd6"} Oct 07 14:10:43 crc kubenswrapper[4854]: I1007 14:10:43.645526 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"76fcc00d-3457-4ea6-a46e-e6c4ac440cdc","Type":"ContainerDied","Data":"bdedc38f22da7a012d9ed870ea15d9df91dc860e3025d3dc6887d05b54faf7fa"} Oct 07 14:10:44 crc kubenswrapper[4854]: I1007 14:10:44.658448 4854 generic.go:334] "Generic (PLEG): container finished" podID="76fcc00d-3457-4ea6-a46e-e6c4ac440cdc" containerID="5091dfd5d4a9c80a5ae7f2a457bd8231e44ce5de14e8faa458a3a329df7e8595" exitCode=0 Oct 07 14:10:44 crc kubenswrapper[4854]: I1007 14:10:44.658495 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"76fcc00d-3457-4ea6-a46e-e6c4ac440cdc","Type":"ContainerDied","Data":"5091dfd5d4a9c80a5ae7f2a457bd8231e44ce5de14e8faa458a3a329df7e8595"} Oct 07 14:10:45 crc kubenswrapper[4854]: I1007 14:10:45.700687 4854 generic.go:334] "Generic (PLEG): container finished" podID="76fcc00d-3457-4ea6-a46e-e6c4ac440cdc" containerID="725189faa5beb7fdcd8cb218a915d881898c7570c2ab5a178a822da3b329b1a0" exitCode=0 Oct 07 14:10:45 crc kubenswrapper[4854]: I1007 14:10:45.700731 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"76fcc00d-3457-4ea6-a46e-e6c4ac440cdc","Type":"ContainerDied","Data":"725189faa5beb7fdcd8cb218a915d881898c7570c2ab5a178a822da3b329b1a0"} Oct 07 14:10:46 crc kubenswrapper[4854]: I1007 14:10:46.061057 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 14:10:46 crc kubenswrapper[4854]: I1007 14:10:46.165136 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8kj9\" (UniqueName: \"kubernetes.io/projected/76fcc00d-3457-4ea6-a46e-e6c4ac440cdc-kube-api-access-q8kj9\") pod \"76fcc00d-3457-4ea6-a46e-e6c4ac440cdc\" (UID: \"76fcc00d-3457-4ea6-a46e-e6c4ac440cdc\") " Oct 07 14:10:46 crc kubenswrapper[4854]: I1007 14:10:46.165246 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/76fcc00d-3457-4ea6-a46e-e6c4ac440cdc-sg-core-conf-yaml\") pod \"76fcc00d-3457-4ea6-a46e-e6c4ac440cdc\" (UID: \"76fcc00d-3457-4ea6-a46e-e6c4ac440cdc\") " Oct 07 14:10:46 crc kubenswrapper[4854]: I1007 14:10:46.165313 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76fcc00d-3457-4ea6-a46e-e6c4ac440cdc-combined-ca-bundle\") pod \"76fcc00d-3457-4ea6-a46e-e6c4ac440cdc\" (UID: \"76fcc00d-3457-4ea6-a46e-e6c4ac440cdc\") " Oct 07 14:10:46 crc kubenswrapper[4854]: I1007 14:10:46.165351 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76fcc00d-3457-4ea6-a46e-e6c4ac440cdc-run-httpd\") pod \"76fcc00d-3457-4ea6-a46e-e6c4ac440cdc\" (UID: \"76fcc00d-3457-4ea6-a46e-e6c4ac440cdc\") " Oct 07 14:10:46 crc kubenswrapper[4854]: I1007 14:10:46.165426 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76fcc00d-3457-4ea6-a46e-e6c4ac440cdc-log-httpd\") pod \"76fcc00d-3457-4ea6-a46e-e6c4ac440cdc\" (UID: \"76fcc00d-3457-4ea6-a46e-e6c4ac440cdc\") " Oct 07 14:10:46 crc kubenswrapper[4854]: I1007 
14:10:46.165457 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76fcc00d-3457-4ea6-a46e-e6c4ac440cdc-scripts\") pod \"76fcc00d-3457-4ea6-a46e-e6c4ac440cdc\" (UID: \"76fcc00d-3457-4ea6-a46e-e6c4ac440cdc\") " Oct 07 14:10:46 crc kubenswrapper[4854]: I1007 14:10:46.165525 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76fcc00d-3457-4ea6-a46e-e6c4ac440cdc-config-data\") pod \"76fcc00d-3457-4ea6-a46e-e6c4ac440cdc\" (UID: \"76fcc00d-3457-4ea6-a46e-e6c4ac440cdc\") " Oct 07 14:10:46 crc kubenswrapper[4854]: I1007 14:10:46.167179 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76fcc00d-3457-4ea6-a46e-e6c4ac440cdc-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "76fcc00d-3457-4ea6-a46e-e6c4ac440cdc" (UID: "76fcc00d-3457-4ea6-a46e-e6c4ac440cdc"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:10:46 crc kubenswrapper[4854]: I1007 14:10:46.167417 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76fcc00d-3457-4ea6-a46e-e6c4ac440cdc-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "76fcc00d-3457-4ea6-a46e-e6c4ac440cdc" (UID: "76fcc00d-3457-4ea6-a46e-e6c4ac440cdc"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:10:46 crc kubenswrapper[4854]: I1007 14:10:46.173322 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76fcc00d-3457-4ea6-a46e-e6c4ac440cdc-kube-api-access-q8kj9" (OuterVolumeSpecName: "kube-api-access-q8kj9") pod "76fcc00d-3457-4ea6-a46e-e6c4ac440cdc" (UID: "76fcc00d-3457-4ea6-a46e-e6c4ac440cdc"). InnerVolumeSpecName "kube-api-access-q8kj9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:10:46 crc kubenswrapper[4854]: I1007 14:10:46.193119 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76fcc00d-3457-4ea6-a46e-e6c4ac440cdc-scripts" (OuterVolumeSpecName: "scripts") pod "76fcc00d-3457-4ea6-a46e-e6c4ac440cdc" (UID: "76fcc00d-3457-4ea6-a46e-e6c4ac440cdc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:10:46 crc kubenswrapper[4854]: I1007 14:10:46.227516 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76fcc00d-3457-4ea6-a46e-e6c4ac440cdc-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "76fcc00d-3457-4ea6-a46e-e6c4ac440cdc" (UID: "76fcc00d-3457-4ea6-a46e-e6c4ac440cdc"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:10:46 crc kubenswrapper[4854]: I1007 14:10:46.249453 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76fcc00d-3457-4ea6-a46e-e6c4ac440cdc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "76fcc00d-3457-4ea6-a46e-e6c4ac440cdc" (UID: "76fcc00d-3457-4ea6-a46e-e6c4ac440cdc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:10:46 crc kubenswrapper[4854]: I1007 14:10:46.268539 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8kj9\" (UniqueName: \"kubernetes.io/projected/76fcc00d-3457-4ea6-a46e-e6c4ac440cdc-kube-api-access-q8kj9\") on node \"crc\" DevicePath \"\"" Oct 07 14:10:46 crc kubenswrapper[4854]: I1007 14:10:46.268577 4854 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/76fcc00d-3457-4ea6-a46e-e6c4ac440cdc-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 07 14:10:46 crc kubenswrapper[4854]: I1007 14:10:46.268586 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76fcc00d-3457-4ea6-a46e-e6c4ac440cdc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:10:46 crc kubenswrapper[4854]: I1007 14:10:46.268596 4854 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76fcc00d-3457-4ea6-a46e-e6c4ac440cdc-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 14:10:46 crc kubenswrapper[4854]: I1007 14:10:46.268606 4854 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76fcc00d-3457-4ea6-a46e-e6c4ac440cdc-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 14:10:46 crc kubenswrapper[4854]: I1007 14:10:46.268614 4854 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76fcc00d-3457-4ea6-a46e-e6c4ac440cdc-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 14:10:46 crc kubenswrapper[4854]: I1007 14:10:46.288300 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76fcc00d-3457-4ea6-a46e-e6c4ac440cdc-config-data" (OuterVolumeSpecName: "config-data") pod "76fcc00d-3457-4ea6-a46e-e6c4ac440cdc" (UID: "76fcc00d-3457-4ea6-a46e-e6c4ac440cdc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:10:46 crc kubenswrapper[4854]: I1007 14:10:46.370359 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76fcc00d-3457-4ea6-a46e-e6c4ac440cdc-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:10:46 crc kubenswrapper[4854]: I1007 14:10:46.719372 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 14:10:46 crc kubenswrapper[4854]: I1007 14:10:46.734748 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"76fcc00d-3457-4ea6-a46e-e6c4ac440cdc","Type":"ContainerDied","Data":"d726e73d585556292fd686beac96ce0e5eb0ee0fcd0f3a23496f5342ab34c650"} Oct 07 14:10:46 crc kubenswrapper[4854]: I1007 14:10:46.735248 4854 scope.go:117] "RemoveContainer" containerID="167833fabf21dfb298f2fb42a6a4811cd54f54e7fcfcbca96b731d1565db3dd6" Oct 07 14:10:46 crc kubenswrapper[4854]: I1007 14:10:46.766028 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 14:10:46 crc kubenswrapper[4854]: I1007 14:10:46.767098 4854 scope.go:117] "RemoveContainer" containerID="bdedc38f22da7a012d9ed870ea15d9df91dc860e3025d3dc6887d05b54faf7fa" Oct 07 14:10:46 crc kubenswrapper[4854]: I1007 14:10:46.775610 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 07 14:10:46 crc kubenswrapper[4854]: I1007 14:10:46.801851 4854 scope.go:117] "RemoveContainer" containerID="725189faa5beb7fdcd8cb218a915d881898c7570c2ab5a178a822da3b329b1a0" Oct 07 14:10:46 crc kubenswrapper[4854]: I1007 14:10:46.802013 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 07 14:10:46 crc kubenswrapper[4854]: E1007 14:10:46.802495 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76fcc00d-3457-4ea6-a46e-e6c4ac440cdc" containerName="proxy-httpd" Oct 07 14:10:46 crc kubenswrapper[4854]: I1007 14:10:46.802510 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="76fcc00d-3457-4ea6-a46e-e6c4ac440cdc" containerName="proxy-httpd" Oct 07 14:10:46 crc kubenswrapper[4854]: E1007 14:10:46.802537 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76fcc00d-3457-4ea6-a46e-e6c4ac440cdc" containerName="sg-core" Oct 07 14:10:46 crc kubenswrapper[4854]: I1007 14:10:46.802544 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="76fcc00d-3457-4ea6-a46e-e6c4ac440cdc" containerName="sg-core" Oct 07 14:10:46 crc kubenswrapper[4854]: E1007 14:10:46.802560 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8823e4f7-cb5e-4f62-9db1-84f8d4f28b93" containerName="dnsmasq-dns" Oct 07 14:10:46 crc kubenswrapper[4854]: I1007 14:10:46.802566 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="8823e4f7-cb5e-4f62-9db1-84f8d4f28b93" containerName="dnsmasq-dns" Oct 07 14:10:46 crc kubenswrapper[4854]: E1007 14:10:46.802578 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76fcc00d-3457-4ea6-a46e-e6c4ac440cdc" containerName="ceilometer-notification-agent" Oct 07 14:10:46 crc kubenswrapper[4854]: I1007 14:10:46.802585 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="76fcc00d-3457-4ea6-a46e-e6c4ac440cdc" containerName="ceilometer-notification-agent" Oct 07 14:10:46 crc kubenswrapper[4854]: E1007 14:10:46.802595 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8823e4f7-cb5e-4f62-9db1-84f8d4f28b93" containerName="init" Oct 07 14:10:46 crc kubenswrapper[4854]: I1007 14:10:46.802603 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="8823e4f7-cb5e-4f62-9db1-84f8d4f28b93" containerName="init" Oct 07 14:10:46 crc kubenswrapper[4854]: E1007 14:10:46.802636 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76fcc00d-3457-4ea6-a46e-e6c4ac440cdc" containerName="ceilometer-central-agent" Oct 07 14:10:46 crc kubenswrapper[4854]: I1007 
14:10:46.802646 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="76fcc00d-3457-4ea6-a46e-e6c4ac440cdc" containerName="ceilometer-central-agent" Oct 07 14:10:46 crc kubenswrapper[4854]: I1007 14:10:46.802885 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="76fcc00d-3457-4ea6-a46e-e6c4ac440cdc" containerName="ceilometer-notification-agent" Oct 07 14:10:46 crc kubenswrapper[4854]: I1007 14:10:46.802905 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="76fcc00d-3457-4ea6-a46e-e6c4ac440cdc" containerName="ceilometer-central-agent" Oct 07 14:10:46 crc kubenswrapper[4854]: I1007 14:10:46.802921 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="76fcc00d-3457-4ea6-a46e-e6c4ac440cdc" containerName="sg-core" Oct 07 14:10:46 crc kubenswrapper[4854]: I1007 14:10:46.802929 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="8823e4f7-cb5e-4f62-9db1-84f8d4f28b93" containerName="dnsmasq-dns" Oct 07 14:10:46 crc kubenswrapper[4854]: I1007 14:10:46.802941 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="76fcc00d-3457-4ea6-a46e-e6c4ac440cdc" containerName="proxy-httpd" Oct 07 14:10:46 crc kubenswrapper[4854]: I1007 14:10:46.805779 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 14:10:46 crc kubenswrapper[4854]: I1007 14:10:46.809654 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 07 14:10:46 crc kubenswrapper[4854]: I1007 14:10:46.810369 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 07 14:10:46 crc kubenswrapper[4854]: I1007 14:10:46.815768 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 14:10:46 crc kubenswrapper[4854]: I1007 14:10:46.851632 4854 scope.go:117] "RemoveContainer" containerID="5091dfd5d4a9c80a5ae7f2a457bd8231e44ce5de14e8faa458a3a329df7e8595" Oct 07 14:10:46 crc kubenswrapper[4854]: I1007 14:10:46.885625 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/adaa0e6d-633e-469a-9327-ceb526997466-run-httpd\") pod \"ceilometer-0\" (UID: \"adaa0e6d-633e-469a-9327-ceb526997466\") " pod="openstack/ceilometer-0" Oct 07 14:10:46 crc kubenswrapper[4854]: I1007 14:10:46.885813 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5kcv\" (UniqueName: \"kubernetes.io/projected/adaa0e6d-633e-469a-9327-ceb526997466-kube-api-access-p5kcv\") pod \"ceilometer-0\" (UID: \"adaa0e6d-633e-469a-9327-ceb526997466\") " pod="openstack/ceilometer-0" Oct 07 14:10:46 crc kubenswrapper[4854]: I1007 14:10:46.885944 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/adaa0e6d-633e-469a-9327-ceb526997466-scripts\") pod \"ceilometer-0\" (UID: \"adaa0e6d-633e-469a-9327-ceb526997466\") " pod="openstack/ceilometer-0" Oct 07 14:10:46 crc kubenswrapper[4854]: I1007 14:10:46.886010 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adaa0e6d-633e-469a-9327-ceb526997466-config-data\") pod \"ceilometer-0\" (UID: \"adaa0e6d-633e-469a-9327-ceb526997466\") " pod="openstack/ceilometer-0" Oct 07 14:10:46 crc kubenswrapper[4854]: I1007 14:10:46.886048 4854 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/adaa0e6d-633e-469a-9327-ceb526997466-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"adaa0e6d-633e-469a-9327-ceb526997466\") " pod="openstack/ceilometer-0" Oct 07 14:10:46 crc kubenswrapper[4854]: I1007 14:10:46.886192 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adaa0e6d-633e-469a-9327-ceb526997466-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"adaa0e6d-633e-469a-9327-ceb526997466\") " pod="openstack/ceilometer-0" Oct 07 14:10:46 crc kubenswrapper[4854]: I1007 14:10:46.886348 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/adaa0e6d-633e-469a-9327-ceb526997466-log-httpd\") pod \"ceilometer-0\" (UID: \"adaa0e6d-633e-469a-9327-ceb526997466\") " pod="openstack/ceilometer-0" Oct 07 14:10:46 crc kubenswrapper[4854]: I1007 14:10:46.988115 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/adaa0e6d-633e-469a-9327-ceb526997466-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"adaa0e6d-633e-469a-9327-ceb526997466\") " pod="openstack/ceilometer-0" Oct 07 14:10:46 crc kubenswrapper[4854]: I1007 14:10:46.988271 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adaa0e6d-633e-469a-9327-ceb526997466-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"adaa0e6d-633e-469a-9327-ceb526997466\") " pod="openstack/ceilometer-0" Oct 07 14:10:46 crc kubenswrapper[4854]: I1007 14:10:46.988409 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/adaa0e6d-633e-469a-9327-ceb526997466-log-httpd\") pod \"ceilometer-0\" (UID: \"adaa0e6d-633e-469a-9327-ceb526997466\") " pod="openstack/ceilometer-0" Oct 07 14:10:46 crc kubenswrapper[4854]: I1007 14:10:46.988799 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/adaa0e6d-633e-469a-9327-ceb526997466-log-httpd\") pod \"ceilometer-0\" (UID: \"adaa0e6d-633e-469a-9327-ceb526997466\") " pod="openstack/ceilometer-0" Oct 07 14:10:46 crc kubenswrapper[4854]: I1007 14:10:46.988954 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/adaa0e6d-633e-469a-9327-ceb526997466-run-httpd\") pod \"ceilometer-0\" (UID: \"adaa0e6d-633e-469a-9327-ceb526997466\") " pod="openstack/ceilometer-0" Oct 07 14:10:46 crc kubenswrapper[4854]: I1007 14:10:46.989287 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/adaa0e6d-633e-469a-9327-ceb526997466-run-httpd\") pod \"ceilometer-0\" (UID: \"adaa0e6d-633e-469a-9327-ceb526997466\") " pod="openstack/ceilometer-0" Oct 07 14:10:46 crc kubenswrapper[4854]: I1007 14:10:46.989364 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5kcv\" (UniqueName: \"kubernetes.io/projected/adaa0e6d-633e-469a-9327-ceb526997466-kube-api-access-p5kcv\") pod \"ceilometer-0\" (UID: \"adaa0e6d-633e-469a-9327-ceb526997466\") " pod="openstack/ceilometer-0" Oct 07 14:10:46 crc 
kubenswrapper[4854]: I1007 14:10:46.989801 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/adaa0e6d-633e-469a-9327-ceb526997466-scripts\") pod \"ceilometer-0\" (UID: \"adaa0e6d-633e-469a-9327-ceb526997466\") " pod="openstack/ceilometer-0" Oct 07 14:10:46 crc kubenswrapper[4854]: I1007 14:10:46.990221 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adaa0e6d-633e-469a-9327-ceb526997466-config-data\") pod \"ceilometer-0\" (UID: \"adaa0e6d-633e-469a-9327-ceb526997466\") " pod="openstack/ceilometer-0" Oct 07 14:10:46 crc kubenswrapper[4854]: I1007 14:10:46.991857 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adaa0e6d-633e-469a-9327-ceb526997466-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"adaa0e6d-633e-469a-9327-ceb526997466\") " pod="openstack/ceilometer-0" Oct 07 14:10:46 crc kubenswrapper[4854]: I1007 14:10:46.991939 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/adaa0e6d-633e-469a-9327-ceb526997466-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"adaa0e6d-633e-469a-9327-ceb526997466\") " pod="openstack/ceilometer-0" Oct 07 14:10:46 crc kubenswrapper[4854]: I1007 14:10:46.993089 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/adaa0e6d-633e-469a-9327-ceb526997466-scripts\") pod \"ceilometer-0\" (UID: \"adaa0e6d-633e-469a-9327-ceb526997466\") " pod="openstack/ceilometer-0" Oct 07 14:10:47 crc kubenswrapper[4854]: I1007 14:10:46.998942 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adaa0e6d-633e-469a-9327-ceb526997466-config-data\") pod \"ceilometer-0\" (UID: \"adaa0e6d-633e-469a-9327-ceb526997466\") " pod="openstack/ceilometer-0" Oct 07 14:10:47 crc kubenswrapper[4854]: I1007 14:10:47.013841 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5kcv\" (UniqueName: \"kubernetes.io/projected/adaa0e6d-633e-469a-9327-ceb526997466-kube-api-access-p5kcv\") pod \"ceilometer-0\" (UID: \"adaa0e6d-633e-469a-9327-ceb526997466\") " pod="openstack/ceilometer-0" Oct 07 14:10:47 crc kubenswrapper[4854]: I1007 14:10:47.128640 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 14:10:47 crc kubenswrapper[4854]: I1007 14:10:47.750104 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 14:10:47 crc kubenswrapper[4854]: I1007 14:10:47.777020 4854 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 14:10:48 crc kubenswrapper[4854]: I1007 14:10:48.764648 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76fcc00d-3457-4ea6-a46e-e6c4ac440cdc" path="/var/lib/kubelet/pods/76fcc00d-3457-4ea6-a46e-e6c4ac440cdc/volumes" Oct 07 14:10:48 crc kubenswrapper[4854]: I1007 14:10:48.770973 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"adaa0e6d-633e-469a-9327-ceb526997466","Type":"ContainerStarted","Data":"69ba431be0dd2f1507f30be406445b424a5a664ca6b2e9d5a7ca5301e604f994"} Oct 07 14:10:49 crc kubenswrapper[4854]: I1007 14:10:49.782701 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"adaa0e6d-633e-469a-9327-ceb526997466","Type":"ContainerStarted","Data":"d53707f378664868a4ff0d11dd347ccd14ffff19152b6ac558a12a663cd41d25"} Oct 07 14:10:49 crc kubenswrapper[4854]: I1007 14:10:49.782990 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"adaa0e6d-633e-469a-9327-ceb526997466","Type":"ContainerStarted","Data":"84fc3956affe2ca8c78f064efccd5ec03d8d96cb47de21d9712b4dbbe509c50a"} Oct 07 14:10:50 crc kubenswrapper[4854]: I1007 14:10:50.798940 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"adaa0e6d-633e-469a-9327-ceb526997466","Type":"ContainerStarted","Data":"6117f443c23cc0eac5cc934a89cc4a4cba06f198b1144cbc7fd81559c14c69c5"} Oct 07 14:10:51 crc kubenswrapper[4854]: I1007 14:10:51.639768 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Oct 07 14:10:51 crc kubenswrapper[4854]: I1007 14:10:51.834766 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Oct 07 14:10:52 crc kubenswrapper[4854]: I1007 14:10:52.191449 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0" Oct 07 14:10:52 crc kubenswrapper[4854]: I1007 14:10:52.833638 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"adaa0e6d-633e-469a-9327-ceb526997466","Type":"ContainerStarted","Data":"299d29fd53247b9b9bd341fb0ea19eb333b81fc5838db80b9f62bc894499fdfd"} Oct 07 14:10:52 crc kubenswrapper[4854]: I1007 14:10:52.834027 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 07 14:11:17 crc kubenswrapper[4854]: I1007 14:11:17.135707 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 07 14:11:17 crc kubenswrapper[4854]: I1007 14:11:17.160616 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=27.464443243 podStartE2EDuration="31.160578331s" podCreationTimestamp="2025-10-07 14:10:46 +0000 UTC" firstStartedPulling="2025-10-07 14:10:47.776740962 +0000 UTC m=+6363.764573217" lastFinishedPulling="2025-10-07 14:10:51.47287605 +0000 UTC m=+6367.460708305" observedRunningTime="2025-10-07 14:10:52.860530669 +0000 UTC m=+6368.848362944" watchObservedRunningTime="2025-10-07 14:11:17.160578331 +0000 UTC 
m=+6393.148410636" Oct 07 14:11:40 crc kubenswrapper[4854]: I1007 14:11:40.800546 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5fc954d97f-66mwj"] Oct 07 14:11:40 crc kubenswrapper[4854]: I1007 14:11:40.804281 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fc954d97f-66mwj" Oct 07 14:11:40 crc kubenswrapper[4854]: I1007 14:11:40.809587 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1" Oct 07 14:11:40 crc kubenswrapper[4854]: I1007 14:11:40.810670 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fc954d97f-66mwj"] Oct 07 14:11:40 crc kubenswrapper[4854]: I1007 14:11:40.935863 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sthcs\" (UniqueName: \"kubernetes.io/projected/3135c457-b279-40f6-95ca-aba94d09008e-kube-api-access-sthcs\") pod \"dnsmasq-dns-5fc954d97f-66mwj\" (UID: \"3135c457-b279-40f6-95ca-aba94d09008e\") " pod="openstack/dnsmasq-dns-5fc954d97f-66mwj" Oct 07 14:11:40 crc kubenswrapper[4854]: I1007 14:11:40.935905 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3135c457-b279-40f6-95ca-aba94d09008e-config\") pod \"dnsmasq-dns-5fc954d97f-66mwj\" (UID: \"3135c457-b279-40f6-95ca-aba94d09008e\") " pod="openstack/dnsmasq-dns-5fc954d97f-66mwj" Oct 07 14:11:40 crc kubenswrapper[4854]: I1007 14:11:40.935990 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3135c457-b279-40f6-95ca-aba94d09008e-dns-svc\") pod \"dnsmasq-dns-5fc954d97f-66mwj\" (UID: \"3135c457-b279-40f6-95ca-aba94d09008e\") " pod="openstack/dnsmasq-dns-5fc954d97f-66mwj" Oct 07 14:11:40 crc kubenswrapper[4854]: I1007 14:11:40.936013 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/3135c457-b279-40f6-95ca-aba94d09008e-openstack-cell1\") pod \"dnsmasq-dns-5fc954d97f-66mwj\" (UID: \"3135c457-b279-40f6-95ca-aba94d09008e\") " pod="openstack/dnsmasq-dns-5fc954d97f-66mwj" Oct 07 14:11:40 crc kubenswrapper[4854]: I1007 14:11:40.936033 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3135c457-b279-40f6-95ca-aba94d09008e-ovsdbserver-sb\") pod \"dnsmasq-dns-5fc954d97f-66mwj\" (UID: \"3135c457-b279-40f6-95ca-aba94d09008e\") " pod="openstack/dnsmasq-dns-5fc954d97f-66mwj" Oct 07 14:11:40 crc kubenswrapper[4854]: I1007 14:11:40.936122 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3135c457-b279-40f6-95ca-aba94d09008e-ovsdbserver-nb\") pod \"dnsmasq-dns-5fc954d97f-66mwj\" (UID: \"3135c457-b279-40f6-95ca-aba94d09008e\") " pod="openstack/dnsmasq-dns-5fc954d97f-66mwj" Oct 07 14:11:41 crc kubenswrapper[4854]: I1007 14:11:41.040052 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3135c457-b279-40f6-95ca-aba94d09008e-ovsdbserver-nb\") pod \"dnsmasq-dns-5fc954d97f-66mwj\" (UID: \"3135c457-b279-40f6-95ca-aba94d09008e\") " pod="openstack/dnsmasq-dns-5fc954d97f-66mwj" Oct 07 14:11:41 crc kubenswrapper[4854]: I1007 
14:11:41.038616 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3135c457-b279-40f6-95ca-aba94d09008e-ovsdbserver-nb\") pod \"dnsmasq-dns-5fc954d97f-66mwj\" (UID: \"3135c457-b279-40f6-95ca-aba94d09008e\") " pod="openstack/dnsmasq-dns-5fc954d97f-66mwj" Oct 07 14:11:41 crc kubenswrapper[4854]: I1007 14:11:41.040315 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sthcs\" (UniqueName: \"kubernetes.io/projected/3135c457-b279-40f6-95ca-aba94d09008e-kube-api-access-sthcs\") pod \"dnsmasq-dns-5fc954d97f-66mwj\" (UID: \"3135c457-b279-40f6-95ca-aba94d09008e\") " pod="openstack/dnsmasq-dns-5fc954d97f-66mwj" Oct 07 14:11:41 crc kubenswrapper[4854]: I1007 14:11:41.040763 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3135c457-b279-40f6-95ca-aba94d09008e-config\") pod \"dnsmasq-dns-5fc954d97f-66mwj\" (UID: \"3135c457-b279-40f6-95ca-aba94d09008e\") " pod="openstack/dnsmasq-dns-5fc954d97f-66mwj" Oct 07 14:11:41 crc kubenswrapper[4854]: I1007 14:11:41.041772 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3135c457-b279-40f6-95ca-aba94d09008e-config\") pod \"dnsmasq-dns-5fc954d97f-66mwj\" (UID: \"3135c457-b279-40f6-95ca-aba94d09008e\") " pod="openstack/dnsmasq-dns-5fc954d97f-66mwj" Oct 07 14:11:41 crc kubenswrapper[4854]: I1007 14:11:41.042105 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3135c457-b279-40f6-95ca-aba94d09008e-dns-svc\") pod \"dnsmasq-dns-5fc954d97f-66mwj\" (UID: \"3135c457-b279-40f6-95ca-aba94d09008e\") " pod="openstack/dnsmasq-dns-5fc954d97f-66mwj" Oct 07 14:11:41 crc kubenswrapper[4854]: I1007 14:11:41.043012 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3135c457-b279-40f6-95ca-aba94d09008e-dns-svc\") pod \"dnsmasq-dns-5fc954d97f-66mwj\" (UID: \"3135c457-b279-40f6-95ca-aba94d09008e\") " pod="openstack/dnsmasq-dns-5fc954d97f-66mwj" Oct 07 14:11:41 crc kubenswrapper[4854]: I1007 14:11:41.043193 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/3135c457-b279-40f6-95ca-aba94d09008e-openstack-cell1\") pod \"dnsmasq-dns-5fc954d97f-66mwj\" (UID: \"3135c457-b279-40f6-95ca-aba94d09008e\") " pod="openstack/dnsmasq-dns-5fc954d97f-66mwj" Oct 07 14:11:41 crc kubenswrapper[4854]: I1007 14:11:41.044049 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/3135c457-b279-40f6-95ca-aba94d09008e-openstack-cell1\") pod \"dnsmasq-dns-5fc954d97f-66mwj\" (UID: \"3135c457-b279-40f6-95ca-aba94d09008e\") " pod="openstack/dnsmasq-dns-5fc954d97f-66mwj" Oct 07 14:11:41 crc kubenswrapper[4854]: I1007 14:11:41.044992 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3135c457-b279-40f6-95ca-aba94d09008e-ovsdbserver-sb\") pod \"dnsmasq-dns-5fc954d97f-66mwj\" (UID: \"3135c457-b279-40f6-95ca-aba94d09008e\") " pod="openstack/dnsmasq-dns-5fc954d97f-66mwj" Oct 07 14:11:41 crc kubenswrapper[4854]: I1007 14:11:41.045046 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/3135c457-b279-40f6-95ca-aba94d09008e-ovsdbserver-sb\") pod \"dnsmasq-dns-5fc954d97f-66mwj\" (UID: \"3135c457-b279-40f6-95ca-aba94d09008e\") " pod="openstack/dnsmasq-dns-5fc954d97f-66mwj" Oct 07 14:11:41 crc kubenswrapper[4854]: I1007 14:11:41.066249 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sthcs\" (UniqueName: \"kubernetes.io/projected/3135c457-b279-40f6-95ca-aba94d09008e-kube-api-access-sthcs\") pod \"dnsmasq-dns-5fc954d97f-66mwj\" (UID: \"3135c457-b279-40f6-95ca-aba94d09008e\") " pod="openstack/dnsmasq-dns-5fc954d97f-66mwj" Oct 07 14:11:41 crc kubenswrapper[4854]: I1007 14:11:41.150738 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fc954d97f-66mwj" Oct 07 14:11:41 crc kubenswrapper[4854]: I1007 14:11:41.684946 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fc954d97f-66mwj"] Oct 07 14:11:42 crc kubenswrapper[4854]: I1007 14:11:42.471855 4854 generic.go:334] "Generic (PLEG): container finished" podID="3135c457-b279-40f6-95ca-aba94d09008e" containerID="84d2a53088f658431306bed020ce77c8131e50592c9fc3c341c34793064b3f3d" exitCode=0 Oct 07 14:11:42 crc kubenswrapper[4854]: I1007 14:11:42.472497 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fc954d97f-66mwj" event={"ID":"3135c457-b279-40f6-95ca-aba94d09008e","Type":"ContainerDied","Data":"84d2a53088f658431306bed020ce77c8131e50592c9fc3c341c34793064b3f3d"} Oct 07 14:11:42 crc kubenswrapper[4854]: I1007 14:11:42.472548 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fc954d97f-66mwj" event={"ID":"3135c457-b279-40f6-95ca-aba94d09008e","Type":"ContainerStarted","Data":"f954237c3030dda852558c898609f86b3d79c59c230fbe7de3b13a4a33336e8a"} Oct 07 14:11:43 crc kubenswrapper[4854]: I1007 14:11:43.485869 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fc954d97f-66mwj" event={"ID":"3135c457-b279-40f6-95ca-aba94d09008e","Type":"ContainerStarted","Data":"d9c7fbb0e3c4f94209c16e896276b8424eb09813df2aae1f38e83485f29291aa"} Oct 07 14:11:43 crc kubenswrapper[4854]: I1007 14:11:43.486405 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5fc954d97f-66mwj" Oct 07 14:11:43 crc kubenswrapper[4854]: I1007 14:11:43.503101 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5fc954d97f-66mwj" podStartSLOduration=3.503077527 podStartE2EDuration="3.503077527s" podCreationTimestamp="2025-10-07 14:11:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:11:43.501768769 +0000 UTC m=+6419.489601044" watchObservedRunningTime="2025-10-07 14:11:43.503077527 +0000 UTC m=+6419.490909782" Oct 07 14:11:51 crc kubenswrapper[4854]: I1007 14:11:51.152393 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5fc954d97f-66mwj" Oct 07 14:11:51 crc kubenswrapper[4854]: I1007 14:11:51.218391 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74bbfd59d5-x4jc9"] Oct 07 14:11:51 crc kubenswrapper[4854]: I1007 14:11:51.218664 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74bbfd59d5-x4jc9" podUID="9889be68-5da1-49d0-ac67-b649d6d8fc9b" containerName="dnsmasq-dns" 
containerID="cri-o://8baa1265ac2871963fce0f42f6b8e431cdaa4ece21870b7769548dd68697f9fa" gracePeriod=10 Oct 07 14:11:51 crc kubenswrapper[4854]: I1007 14:11:51.421459 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d9d5d775-vnbhg"] Oct 07 14:11:51 crc kubenswrapper[4854]: I1007 14:11:51.423355 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d9d5d775-vnbhg" Oct 07 14:11:51 crc kubenswrapper[4854]: I1007 14:11:51.438596 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d9d5d775-vnbhg"] Oct 07 14:11:51 crc kubenswrapper[4854]: I1007 14:11:51.575091 4854 generic.go:334] "Generic (PLEG): container finished" podID="9889be68-5da1-49d0-ac67-b649d6d8fc9b" containerID="8baa1265ac2871963fce0f42f6b8e431cdaa4ece21870b7769548dd68697f9fa" exitCode=0 Oct 07 14:11:51 crc kubenswrapper[4854]: I1007 14:11:51.575459 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74bbfd59d5-x4jc9" event={"ID":"9889be68-5da1-49d0-ac67-b649d6d8fc9b","Type":"ContainerDied","Data":"8baa1265ac2871963fce0f42f6b8e431cdaa4ece21870b7769548dd68697f9fa"} Oct 07 14:11:51 crc kubenswrapper[4854]: I1007 14:11:51.595837 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e84d3bb1-eb1a-4ced-9dd8-02d1cdc8a7df-ovsdbserver-sb\") pod \"dnsmasq-dns-57d9d5d775-vnbhg\" (UID: \"e84d3bb1-eb1a-4ced-9dd8-02d1cdc8a7df\") " pod="openstack/dnsmasq-dns-57d9d5d775-vnbhg" Oct 07 14:11:51 crc kubenswrapper[4854]: I1007 14:11:51.595901 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e84d3bb1-eb1a-4ced-9dd8-02d1cdc8a7df-config\") pod \"dnsmasq-dns-57d9d5d775-vnbhg\" (UID: \"e84d3bb1-eb1a-4ced-9dd8-02d1cdc8a7df\") " pod="openstack/dnsmasq-dns-57d9d5d775-vnbhg" Oct 07 14:11:51 crc kubenswrapper[4854]: I1007 14:11:51.595951 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/e84d3bb1-eb1a-4ced-9dd8-02d1cdc8a7df-openstack-cell1\") pod \"dnsmasq-dns-57d9d5d775-vnbhg\" (UID: \"e84d3bb1-eb1a-4ced-9dd8-02d1cdc8a7df\") " pod="openstack/dnsmasq-dns-57d9d5d775-vnbhg" Oct 07 14:11:51 crc kubenswrapper[4854]: I1007 14:11:51.596049 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e84d3bb1-eb1a-4ced-9dd8-02d1cdc8a7df-dns-svc\") pod \"dnsmasq-dns-57d9d5d775-vnbhg\" (UID: \"e84d3bb1-eb1a-4ced-9dd8-02d1cdc8a7df\") " pod="openstack/dnsmasq-dns-57d9d5d775-vnbhg" Oct 07 14:11:51 crc kubenswrapper[4854]: I1007 14:11:51.596303 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e84d3bb1-eb1a-4ced-9dd8-02d1cdc8a7df-ovsdbserver-nb\") pod \"dnsmasq-dns-57d9d5d775-vnbhg\" (UID: \"e84d3bb1-eb1a-4ced-9dd8-02d1cdc8a7df\") " pod="openstack/dnsmasq-dns-57d9d5d775-vnbhg" Oct 07 14:11:51 crc kubenswrapper[4854]: I1007 14:11:51.596504 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4f6d\" (UniqueName: \"kubernetes.io/projected/e84d3bb1-eb1a-4ced-9dd8-02d1cdc8a7df-kube-api-access-k4f6d\") pod \"dnsmasq-dns-57d9d5d775-vnbhg\" (UID: 
\"e84d3bb1-eb1a-4ced-9dd8-02d1cdc8a7df\") " pod="openstack/dnsmasq-dns-57d9d5d775-vnbhg" Oct 07 14:11:51 crc kubenswrapper[4854]: I1007 14:11:51.699198 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e84d3bb1-eb1a-4ced-9dd8-02d1cdc8a7df-ovsdbserver-nb\") pod \"dnsmasq-dns-57d9d5d775-vnbhg\" (UID: \"e84d3bb1-eb1a-4ced-9dd8-02d1cdc8a7df\") " pod="openstack/dnsmasq-dns-57d9d5d775-vnbhg" Oct 07 14:11:51 crc kubenswrapper[4854]: I1007 14:11:51.699302 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4f6d\" (UniqueName: \"kubernetes.io/projected/e84d3bb1-eb1a-4ced-9dd8-02d1cdc8a7df-kube-api-access-k4f6d\") pod \"dnsmasq-dns-57d9d5d775-vnbhg\" (UID: \"e84d3bb1-eb1a-4ced-9dd8-02d1cdc8a7df\") " pod="openstack/dnsmasq-dns-57d9d5d775-vnbhg" Oct 07 14:11:51 crc kubenswrapper[4854]: I1007 14:11:51.699436 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e84d3bb1-eb1a-4ced-9dd8-02d1cdc8a7df-ovsdbserver-sb\") pod \"dnsmasq-dns-57d9d5d775-vnbhg\" (UID: \"e84d3bb1-eb1a-4ced-9dd8-02d1cdc8a7df\") " pod="openstack/dnsmasq-dns-57d9d5d775-vnbhg" Oct 07 14:11:51 crc kubenswrapper[4854]: I1007 14:11:51.699470 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e84d3bb1-eb1a-4ced-9dd8-02d1cdc8a7df-config\") pod \"dnsmasq-dns-57d9d5d775-vnbhg\" (UID: \"e84d3bb1-eb1a-4ced-9dd8-02d1cdc8a7df\") " pod="openstack/dnsmasq-dns-57d9d5d775-vnbhg" Oct 07 14:11:51 crc kubenswrapper[4854]: I1007 14:11:51.699525 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e84d3bb1-eb1a-4ced-9dd8-02d1cdc8a7df-dns-svc\") pod \"dnsmasq-dns-57d9d5d775-vnbhg\" (UID: \"e84d3bb1-eb1a-4ced-9dd8-02d1cdc8a7df\") " pod="openstack/dnsmasq-dns-57d9d5d775-vnbhg" Oct 07 14:11:51 crc kubenswrapper[4854]: I1007 14:11:51.699544 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/e84d3bb1-eb1a-4ced-9dd8-02d1cdc8a7df-openstack-cell1\") pod \"dnsmasq-dns-57d9d5d775-vnbhg\" (UID: \"e84d3bb1-eb1a-4ced-9dd8-02d1cdc8a7df\") " pod="openstack/dnsmasq-dns-57d9d5d775-vnbhg" Oct 07 14:11:51 crc kubenswrapper[4854]: I1007 14:11:51.700713 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/e84d3bb1-eb1a-4ced-9dd8-02d1cdc8a7df-openstack-cell1\") pod \"dnsmasq-dns-57d9d5d775-vnbhg\" (UID: \"e84d3bb1-eb1a-4ced-9dd8-02d1cdc8a7df\") " pod="openstack/dnsmasq-dns-57d9d5d775-vnbhg" Oct 07 14:11:51 crc kubenswrapper[4854]: I1007 14:11:51.700812 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e84d3bb1-eb1a-4ced-9dd8-02d1cdc8a7df-ovsdbserver-sb\") pod \"dnsmasq-dns-57d9d5d775-vnbhg\" (UID: \"e84d3bb1-eb1a-4ced-9dd8-02d1cdc8a7df\") " pod="openstack/dnsmasq-dns-57d9d5d775-vnbhg" Oct 07 14:11:51 crc kubenswrapper[4854]: I1007 14:11:51.701662 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e84d3bb1-eb1a-4ced-9dd8-02d1cdc8a7df-ovsdbserver-nb\") pod \"dnsmasq-dns-57d9d5d775-vnbhg\" (UID: \"e84d3bb1-eb1a-4ced-9dd8-02d1cdc8a7df\") " pod="openstack/dnsmasq-dns-57d9d5d775-vnbhg" 
Oct 07 14:11:51 crc kubenswrapper[4854]: I1007 14:11:51.701699 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e84d3bb1-eb1a-4ced-9dd8-02d1cdc8a7df-config\") pod \"dnsmasq-dns-57d9d5d775-vnbhg\" (UID: \"e84d3bb1-eb1a-4ced-9dd8-02d1cdc8a7df\") " pod="openstack/dnsmasq-dns-57d9d5d775-vnbhg" Oct 07 14:11:51 crc kubenswrapper[4854]: I1007 14:11:51.702591 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e84d3bb1-eb1a-4ced-9dd8-02d1cdc8a7df-dns-svc\") pod \"dnsmasq-dns-57d9d5d775-vnbhg\" (UID: \"e84d3bb1-eb1a-4ced-9dd8-02d1cdc8a7df\") " pod="openstack/dnsmasq-dns-57d9d5d775-vnbhg" Oct 07 14:11:51 crc kubenswrapper[4854]: I1007 14:11:51.734368 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4f6d\" (UniqueName: \"kubernetes.io/projected/e84d3bb1-eb1a-4ced-9dd8-02d1cdc8a7df-kube-api-access-k4f6d\") pod \"dnsmasq-dns-57d9d5d775-vnbhg\" (UID: \"e84d3bb1-eb1a-4ced-9dd8-02d1cdc8a7df\") " pod="openstack/dnsmasq-dns-57d9d5d775-vnbhg" Oct 07 14:11:51 crc kubenswrapper[4854]: I1007 14:11:51.783023 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d9d5d775-vnbhg" Oct 07 14:11:51 crc kubenswrapper[4854]: I1007 14:11:51.936937 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74bbfd59d5-x4jc9" Oct 07 14:11:52 crc kubenswrapper[4854]: I1007 14:11:52.126849 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9889be68-5da1-49d0-ac67-b649d6d8fc9b-config\") pod \"9889be68-5da1-49d0-ac67-b649d6d8fc9b\" (UID: \"9889be68-5da1-49d0-ac67-b649d6d8fc9b\") " Oct 07 14:11:52 crc kubenswrapper[4854]: I1007 14:11:52.127123 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9889be68-5da1-49d0-ac67-b649d6d8fc9b-dns-svc\") pod \"9889be68-5da1-49d0-ac67-b649d6d8fc9b\" (UID: \"9889be68-5da1-49d0-ac67-b649d6d8fc9b\") " Oct 07 14:11:52 crc kubenswrapper[4854]: I1007 14:11:52.127206 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rxjw\" (UniqueName: \"kubernetes.io/projected/9889be68-5da1-49d0-ac67-b649d6d8fc9b-kube-api-access-7rxjw\") pod \"9889be68-5da1-49d0-ac67-b649d6d8fc9b\" (UID: \"9889be68-5da1-49d0-ac67-b649d6d8fc9b\") " Oct 07 14:11:52 crc kubenswrapper[4854]: I1007 14:11:52.127255 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9889be68-5da1-49d0-ac67-b649d6d8fc9b-ovsdbserver-nb\") pod \"9889be68-5da1-49d0-ac67-b649d6d8fc9b\" (UID: \"9889be68-5da1-49d0-ac67-b649d6d8fc9b\") " Oct 07 14:11:52 crc kubenswrapper[4854]: I1007 14:11:52.127284 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9889be68-5da1-49d0-ac67-b649d6d8fc9b-ovsdbserver-sb\") pod \"9889be68-5da1-49d0-ac67-b649d6d8fc9b\" (UID: \"9889be68-5da1-49d0-ac67-b649d6d8fc9b\") " Oct 07 14:11:52 crc kubenswrapper[4854]: I1007 14:11:52.154658 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9889be68-5da1-49d0-ac67-b649d6d8fc9b-kube-api-access-7rxjw" (OuterVolumeSpecName: "kube-api-access-7rxjw") pod 
"9889be68-5da1-49d0-ac67-b649d6d8fc9b" (UID: "9889be68-5da1-49d0-ac67-b649d6d8fc9b"). InnerVolumeSpecName "kube-api-access-7rxjw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:11:52 crc kubenswrapper[4854]: I1007 14:11:52.236871 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rxjw\" (UniqueName: \"kubernetes.io/projected/9889be68-5da1-49d0-ac67-b649d6d8fc9b-kube-api-access-7rxjw\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:52 crc kubenswrapper[4854]: I1007 14:11:52.237593 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9889be68-5da1-49d0-ac67-b649d6d8fc9b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9889be68-5da1-49d0-ac67-b649d6d8fc9b" (UID: "9889be68-5da1-49d0-ac67-b649d6d8fc9b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:11:52 crc kubenswrapper[4854]: I1007 14:11:52.271051 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9889be68-5da1-49d0-ac67-b649d6d8fc9b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9889be68-5da1-49d0-ac67-b649d6d8fc9b" (UID: "9889be68-5da1-49d0-ac67-b649d6d8fc9b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:11:52 crc kubenswrapper[4854]: I1007 14:11:52.283724 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9889be68-5da1-49d0-ac67-b649d6d8fc9b-config" (OuterVolumeSpecName: "config") pod "9889be68-5da1-49d0-ac67-b649d6d8fc9b" (UID: "9889be68-5da1-49d0-ac67-b649d6d8fc9b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:11:52 crc kubenswrapper[4854]: I1007 14:11:52.296876 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9889be68-5da1-49d0-ac67-b649d6d8fc9b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9889be68-5da1-49d0-ac67-b649d6d8fc9b" (UID: "9889be68-5da1-49d0-ac67-b649d6d8fc9b"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:11:52 crc kubenswrapper[4854]: I1007 14:11:52.338609 4854 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9889be68-5da1-49d0-ac67-b649d6d8fc9b-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:52 crc kubenswrapper[4854]: I1007 14:11:52.338637 4854 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9889be68-5da1-49d0-ac67-b649d6d8fc9b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:52 crc kubenswrapper[4854]: I1007 14:11:52.338646 4854 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9889be68-5da1-49d0-ac67-b649d6d8fc9b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:52 crc kubenswrapper[4854]: I1007 14:11:52.338655 4854 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9889be68-5da1-49d0-ac67-b649d6d8fc9b-config\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:52 crc kubenswrapper[4854]: I1007 14:11:52.440770 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d9d5d775-vnbhg"] Oct 07 14:11:52 crc kubenswrapper[4854]: I1007 14:11:52.604686 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74bbfd59d5-x4jc9" event={"ID":"9889be68-5da1-49d0-ac67-b649d6d8fc9b","Type":"ContainerDied","Data":"f416d1e6a3a1b4db3ebe6ece7e25ba45a8152d5f781b190b25d4dcc19f9e3b31"} Oct 07 14:11:52 crc kubenswrapper[4854]: I1007 14:11:52.604999 4854 scope.go:117] "RemoveContainer" containerID="8baa1265ac2871963fce0f42f6b8e431cdaa4ece21870b7769548dd68697f9fa" Oct 07 14:11:52 crc kubenswrapper[4854]: I1007 14:11:52.604722 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74bbfd59d5-x4jc9" Oct 07 14:11:52 crc kubenswrapper[4854]: I1007 14:11:52.615434 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d9d5d775-vnbhg" event={"ID":"e84d3bb1-eb1a-4ced-9dd8-02d1cdc8a7df","Type":"ContainerStarted","Data":"f03d9944692237d1732a78d2860b2e6fbae9a7b474e326da75efbaab50c2335b"} Oct 07 14:11:52 crc kubenswrapper[4854]: I1007 14:11:52.671750 4854 scope.go:117] "RemoveContainer" containerID="f1dd65f6c8878478b719df919f50ad5c51d680d7c656e5e5bab97970d78aabc7" Oct 07 14:11:52 crc kubenswrapper[4854]: I1007 14:11:52.688456 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74bbfd59d5-x4jc9"] Oct 07 14:11:52 crc kubenswrapper[4854]: I1007 14:11:52.699998 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74bbfd59d5-x4jc9"] Oct 07 14:11:52 crc kubenswrapper[4854]: I1007 14:11:52.726848 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9889be68-5da1-49d0-ac67-b649d6d8fc9b" path="/var/lib/kubelet/pods/9889be68-5da1-49d0-ac67-b649d6d8fc9b/volumes" Oct 07 14:11:53 crc kubenswrapper[4854]: I1007 14:11:53.632470 4854 generic.go:334] "Generic (PLEG): container finished" podID="e84d3bb1-eb1a-4ced-9dd8-02d1cdc8a7df" containerID="5cca19b817a46dcac3c996293cd6608ce76a58100e381f9be2225a460419dec9" exitCode=0 Oct 07 14:11:53 crc kubenswrapper[4854]: I1007 14:11:53.635256 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d9d5d775-vnbhg" event={"ID":"e84d3bb1-eb1a-4ced-9dd8-02d1cdc8a7df","Type":"ContainerDied","Data":"5cca19b817a46dcac3c996293cd6608ce76a58100e381f9be2225a460419dec9"} Oct 07 14:11:54 crc kubenswrapper[4854]: I1007 14:11:54.646472 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d9d5d775-vnbhg" event={"ID":"e84d3bb1-eb1a-4ced-9dd8-02d1cdc8a7df","Type":"ContainerStarted","Data":"ade777f3b871a281b357a1576afa52fe93c1e2b5ad064d57f5bdced62d038843"} Oct 07 14:11:54 crc kubenswrapper[4854]: I1007 14:11:54.646786 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d9d5d775-vnbhg" Oct 07 14:11:54 crc kubenswrapper[4854]: I1007 14:11:54.667377 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d9d5d775-vnbhg" podStartSLOduration=3.66735304 podStartE2EDuration="3.66735304s" podCreationTimestamp="2025-10-07 14:11:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:11:54.661953735 +0000 UTC m=+6430.649786000" watchObservedRunningTime="2025-10-07 14:11:54.66735304 +0000 UTC m=+6430.655185315" Oct 07 14:12:01 crc kubenswrapper[4854]: I1007 14:12:01.785491 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d9d5d775-vnbhg" Oct 07 14:12:01 crc kubenswrapper[4854]: I1007 14:12:01.856568 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fc954d97f-66mwj"] Oct 07 14:12:01 crc kubenswrapper[4854]: I1007 14:12:01.857027 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5fc954d97f-66mwj" podUID="3135c457-b279-40f6-95ca-aba94d09008e" containerName="dnsmasq-dns" containerID="cri-o://d9c7fbb0e3c4f94209c16e896276b8424eb09813df2aae1f38e83485f29291aa" gracePeriod=10 Oct 07 14:12:02 crc kubenswrapper[4854]: I1007 14:12:02.458108 4854 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fc954d97f-66mwj" Oct 07 14:12:02 crc kubenswrapper[4854]: I1007 14:12:02.595459 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/3135c457-b279-40f6-95ca-aba94d09008e-openstack-cell1\") pod \"3135c457-b279-40f6-95ca-aba94d09008e\" (UID: \"3135c457-b279-40f6-95ca-aba94d09008e\") " Oct 07 14:12:02 crc kubenswrapper[4854]: I1007 14:12:02.595525 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3135c457-b279-40f6-95ca-aba94d09008e-dns-svc\") pod \"3135c457-b279-40f6-95ca-aba94d09008e\" (UID: \"3135c457-b279-40f6-95ca-aba94d09008e\") " Oct 07 14:12:02 crc kubenswrapper[4854]: I1007 14:12:02.595638 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3135c457-b279-40f6-95ca-aba94d09008e-config\") pod \"3135c457-b279-40f6-95ca-aba94d09008e\" (UID: \"3135c457-b279-40f6-95ca-aba94d09008e\") " Oct 07 14:12:02 crc kubenswrapper[4854]: I1007 14:12:02.595707 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3135c457-b279-40f6-95ca-aba94d09008e-ovsdbserver-nb\") pod \"3135c457-b279-40f6-95ca-aba94d09008e\" (UID: \"3135c457-b279-40f6-95ca-aba94d09008e\") " Oct 07 14:12:02 crc kubenswrapper[4854]: I1007 14:12:02.595743 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sthcs\" (UniqueName: \"kubernetes.io/projected/3135c457-b279-40f6-95ca-aba94d09008e-kube-api-access-sthcs\") pod \"3135c457-b279-40f6-95ca-aba94d09008e\" (UID: \"3135c457-b279-40f6-95ca-aba94d09008e\") " Oct 07 14:12:02 crc kubenswrapper[4854]: I1007 14:12:02.595834 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3135c457-b279-40f6-95ca-aba94d09008e-ovsdbserver-sb\") pod \"3135c457-b279-40f6-95ca-aba94d09008e\" (UID: \"3135c457-b279-40f6-95ca-aba94d09008e\") " Oct 07 14:12:02 crc kubenswrapper[4854]: I1007 14:12:02.618590 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3135c457-b279-40f6-95ca-aba94d09008e-kube-api-access-sthcs" (OuterVolumeSpecName: "kube-api-access-sthcs") pod "3135c457-b279-40f6-95ca-aba94d09008e" (UID: "3135c457-b279-40f6-95ca-aba94d09008e"). InnerVolumeSpecName "kube-api-access-sthcs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:12:02 crc kubenswrapper[4854]: I1007 14:12:02.650270 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3135c457-b279-40f6-95ca-aba94d09008e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3135c457-b279-40f6-95ca-aba94d09008e" (UID: "3135c457-b279-40f6-95ca-aba94d09008e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:12:02 crc kubenswrapper[4854]: I1007 14:12:02.651045 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3135c457-b279-40f6-95ca-aba94d09008e-openstack-cell1" (OuterVolumeSpecName: "openstack-cell1") pod "3135c457-b279-40f6-95ca-aba94d09008e" (UID: "3135c457-b279-40f6-95ca-aba94d09008e"). InnerVolumeSpecName "openstack-cell1". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:12:02 crc kubenswrapper[4854]: I1007 14:12:02.661597 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3135c457-b279-40f6-95ca-aba94d09008e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3135c457-b279-40f6-95ca-aba94d09008e" (UID: "3135c457-b279-40f6-95ca-aba94d09008e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:12:02 crc kubenswrapper[4854]: I1007 14:12:02.673642 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3135c457-b279-40f6-95ca-aba94d09008e-config" (OuterVolumeSpecName: "config") pod "3135c457-b279-40f6-95ca-aba94d09008e" (UID: "3135c457-b279-40f6-95ca-aba94d09008e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:12:02 crc kubenswrapper[4854]: I1007 14:12:02.675421 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3135c457-b279-40f6-95ca-aba94d09008e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3135c457-b279-40f6-95ca-aba94d09008e" (UID: "3135c457-b279-40f6-95ca-aba94d09008e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:12:02 crc kubenswrapper[4854]: I1007 14:12:02.698199 4854 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3135c457-b279-40f6-95ca-aba94d09008e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:02 crc kubenswrapper[4854]: I1007 14:12:02.698233 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sthcs\" (UniqueName: \"kubernetes.io/projected/3135c457-b279-40f6-95ca-aba94d09008e-kube-api-access-sthcs\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:02 crc kubenswrapper[4854]: I1007 14:12:02.698243 4854 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3135c457-b279-40f6-95ca-aba94d09008e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:02 crc kubenswrapper[4854]: I1007 14:12:02.698251 4854 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/3135c457-b279-40f6-95ca-aba94d09008e-openstack-cell1\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:02 crc kubenswrapper[4854]: I1007 14:12:02.698259 4854 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3135c457-b279-40f6-95ca-aba94d09008e-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:02 crc kubenswrapper[4854]: I1007 14:12:02.698268 4854 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3135c457-b279-40f6-95ca-aba94d09008e-config\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:02 crc kubenswrapper[4854]: I1007 14:12:02.749411 4854 generic.go:334] "Generic (PLEG): container finished" podID="3135c457-b279-40f6-95ca-aba94d09008e" containerID="d9c7fbb0e3c4f94209c16e896276b8424eb09813df2aae1f38e83485f29291aa" exitCode=0 Oct 07 14:12:02 crc kubenswrapper[4854]: I1007 14:12:02.749466 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fc954d97f-66mwj" event={"ID":"3135c457-b279-40f6-95ca-aba94d09008e","Type":"ContainerDied","Data":"d9c7fbb0e3c4f94209c16e896276b8424eb09813df2aae1f38e83485f29291aa"} Oct 07 14:12:02 crc 
kubenswrapper[4854]: I1007 14:12:02.749505 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fc954d97f-66mwj" event={"ID":"3135c457-b279-40f6-95ca-aba94d09008e","Type":"ContainerDied","Data":"f954237c3030dda852558c898609f86b3d79c59c230fbe7de3b13a4a33336e8a"} Oct 07 14:12:02 crc kubenswrapper[4854]: I1007 14:12:02.749522 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fc954d97f-66mwj" Oct 07 14:12:02 crc kubenswrapper[4854]: I1007 14:12:02.749529 4854 scope.go:117] "RemoveContainer" containerID="d9c7fbb0e3c4f94209c16e896276b8424eb09813df2aae1f38e83485f29291aa" Oct 07 14:12:02 crc kubenswrapper[4854]: I1007 14:12:02.797292 4854 scope.go:117] "RemoveContainer" containerID="84d2a53088f658431306bed020ce77c8131e50592c9fc3c341c34793064b3f3d" Oct 07 14:12:02 crc kubenswrapper[4854]: I1007 14:12:02.798617 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fc954d97f-66mwj"] Oct 07 14:12:02 crc kubenswrapper[4854]: I1007 14:12:02.808632 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5fc954d97f-66mwj"] Oct 07 14:12:02 crc kubenswrapper[4854]: I1007 14:12:02.819931 4854 scope.go:117] "RemoveContainer" containerID="d9c7fbb0e3c4f94209c16e896276b8424eb09813df2aae1f38e83485f29291aa" Oct 07 14:12:02 crc kubenswrapper[4854]: E1007 14:12:02.820550 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9c7fbb0e3c4f94209c16e896276b8424eb09813df2aae1f38e83485f29291aa\": container with ID starting with d9c7fbb0e3c4f94209c16e896276b8424eb09813df2aae1f38e83485f29291aa not found: ID does not exist" containerID="d9c7fbb0e3c4f94209c16e896276b8424eb09813df2aae1f38e83485f29291aa" Oct 07 14:12:02 crc kubenswrapper[4854]: I1007 14:12:02.820595 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9c7fbb0e3c4f94209c16e896276b8424eb09813df2aae1f38e83485f29291aa"} err="failed to get container status \"d9c7fbb0e3c4f94209c16e896276b8424eb09813df2aae1f38e83485f29291aa\": rpc error: code = NotFound desc = could not find container \"d9c7fbb0e3c4f94209c16e896276b8424eb09813df2aae1f38e83485f29291aa\": container with ID starting with d9c7fbb0e3c4f94209c16e896276b8424eb09813df2aae1f38e83485f29291aa not found: ID does not exist" Oct 07 14:12:02 crc kubenswrapper[4854]: I1007 14:12:02.820621 4854 scope.go:117] "RemoveContainer" containerID="84d2a53088f658431306bed020ce77c8131e50592c9fc3c341c34793064b3f3d" Oct 07 14:12:02 crc kubenswrapper[4854]: E1007 14:12:02.820886 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84d2a53088f658431306bed020ce77c8131e50592c9fc3c341c34793064b3f3d\": container with ID starting with 84d2a53088f658431306bed020ce77c8131e50592c9fc3c341c34793064b3f3d not found: ID does not exist" containerID="84d2a53088f658431306bed020ce77c8131e50592c9fc3c341c34793064b3f3d" Oct 07 14:12:02 crc kubenswrapper[4854]: I1007 14:12:02.820915 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84d2a53088f658431306bed020ce77c8131e50592c9fc3c341c34793064b3f3d"} err="failed to get container status \"84d2a53088f658431306bed020ce77c8131e50592c9fc3c341c34793064b3f3d\": rpc error: code = NotFound desc = could not find container \"84d2a53088f658431306bed020ce77c8131e50592c9fc3c341c34793064b3f3d\": container with ID starting with 
84d2a53088f658431306bed020ce77c8131e50592c9fc3c341c34793064b3f3d not found: ID does not exist" Oct 07 14:12:04 crc kubenswrapper[4854]: I1007 14:12:04.715780 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3135c457-b279-40f6-95ca-aba94d09008e" path="/var/lib/kubelet/pods/3135c457-b279-40f6-95ca-aba94d09008e/volumes" Oct 07 14:12:10 crc kubenswrapper[4854]: I1007 14:12:10.807696 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:12:10 crc kubenswrapper[4854]: I1007 14:12:10.808304 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:12:12 crc kubenswrapper[4854]: I1007 14:12:12.510027 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cmjvnh"] Oct 07 14:12:12 crc kubenswrapper[4854]: E1007 14:12:12.510649 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3135c457-b279-40f6-95ca-aba94d09008e" containerName="init" Oct 07 14:12:12 crc kubenswrapper[4854]: I1007 14:12:12.510660 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="3135c457-b279-40f6-95ca-aba94d09008e" containerName="init" Oct 07 14:12:12 crc kubenswrapper[4854]: E1007 14:12:12.510691 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9889be68-5da1-49d0-ac67-b649d6d8fc9b" containerName="dnsmasq-dns" Oct 07 14:12:12 crc kubenswrapper[4854]: I1007 14:12:12.510696 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="9889be68-5da1-49d0-ac67-b649d6d8fc9b" containerName="dnsmasq-dns" Oct 07 14:12:12 crc kubenswrapper[4854]: E1007 14:12:12.510721 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3135c457-b279-40f6-95ca-aba94d09008e" containerName="dnsmasq-dns" Oct 07 14:12:12 crc kubenswrapper[4854]: I1007 14:12:12.510726 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="3135c457-b279-40f6-95ca-aba94d09008e" containerName="dnsmasq-dns" Oct 07 14:12:12 crc kubenswrapper[4854]: E1007 14:12:12.510737 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9889be68-5da1-49d0-ac67-b649d6d8fc9b" containerName="init" Oct 07 14:12:12 crc kubenswrapper[4854]: I1007 14:12:12.510743 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="9889be68-5da1-49d0-ac67-b649d6d8fc9b" containerName="init" Oct 07 14:12:12 crc kubenswrapper[4854]: I1007 14:12:12.510940 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="9889be68-5da1-49d0-ac67-b649d6d8fc9b" containerName="dnsmasq-dns" Oct 07 14:12:12 crc kubenswrapper[4854]: I1007 14:12:12.510965 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="3135c457-b279-40f6-95ca-aba94d09008e" containerName="dnsmasq-dns" Oct 07 14:12:12 crc kubenswrapper[4854]: I1007 14:12:12.512525 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cmjvnh" Oct 07 14:12:12 crc kubenswrapper[4854]: I1007 14:12:12.517042 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 07 14:12:12 crc kubenswrapper[4854]: I1007 14:12:12.518793 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 07 14:12:12 crc kubenswrapper[4854]: I1007 14:12:12.520230 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 14:12:12 crc kubenswrapper[4854]: I1007 14:12:12.523306 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-n7cf5" Oct 07 14:12:12 crc kubenswrapper[4854]: I1007 14:12:12.527619 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cmjvnh"] Oct 07 14:12:12 crc kubenswrapper[4854]: I1007 14:12:12.654685 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e2cc8cba-1b03-4817-9930-1a31f1971d9a-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cmjvnh\" (UID: \"e2cc8cba-1b03-4817-9930-1a31f1971d9a\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cmjvnh" Oct 07 14:12:12 crc kubenswrapper[4854]: I1007 14:12:12.654804 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e2cc8cba-1b03-4817-9930-1a31f1971d9a-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cmjvnh\" (UID: \"e2cc8cba-1b03-4817-9930-1a31f1971d9a\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cmjvnh" Oct 07 14:12:12 crc kubenswrapper[4854]: I1007 14:12:12.655194 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2cc8cba-1b03-4817-9930-1a31f1971d9a-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cmjvnh\" (UID: \"e2cc8cba-1b03-4817-9930-1a31f1971d9a\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cmjvnh" Oct 07 14:12:12 crc kubenswrapper[4854]: I1007 14:12:12.655259 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2cc8cba-1b03-4817-9930-1a31f1971d9a-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cmjvnh\" (UID: \"e2cc8cba-1b03-4817-9930-1a31f1971d9a\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cmjvnh" Oct 07 14:12:12 crc kubenswrapper[4854]: I1007 14:12:12.655336 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w54z2\" (UniqueName: \"kubernetes.io/projected/e2cc8cba-1b03-4817-9930-1a31f1971d9a-kube-api-access-w54z2\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cmjvnh\" (UID: \"e2cc8cba-1b03-4817-9930-1a31f1971d9a\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cmjvnh" Oct 07 14:12:12 crc kubenswrapper[4854]: I1007 14:12:12.757747 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ceph\" (UniqueName: \"kubernetes.io/secret/e2cc8cba-1b03-4817-9930-1a31f1971d9a-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cmjvnh\" (UID: \"e2cc8cba-1b03-4817-9930-1a31f1971d9a\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cmjvnh" Oct 07 14:12:12 crc kubenswrapper[4854]: I1007 14:12:12.757954 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2cc8cba-1b03-4817-9930-1a31f1971d9a-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cmjvnh\" (UID: \"e2cc8cba-1b03-4817-9930-1a31f1971d9a\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cmjvnh" Oct 07 14:12:12 crc kubenswrapper[4854]: I1007 14:12:12.757989 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2cc8cba-1b03-4817-9930-1a31f1971d9a-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cmjvnh\" (UID: \"e2cc8cba-1b03-4817-9930-1a31f1971d9a\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cmjvnh" Oct 07 14:12:12 crc kubenswrapper[4854]: I1007 14:12:12.758018 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w54z2\" (UniqueName: \"kubernetes.io/projected/e2cc8cba-1b03-4817-9930-1a31f1971d9a-kube-api-access-w54z2\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cmjvnh\" (UID: \"e2cc8cba-1b03-4817-9930-1a31f1971d9a\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cmjvnh" Oct 07 14:12:12 crc kubenswrapper[4854]: I1007 14:12:12.758060 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e2cc8cba-1b03-4817-9930-1a31f1971d9a-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cmjvnh\" (UID: \"e2cc8cba-1b03-4817-9930-1a31f1971d9a\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cmjvnh" Oct 07 14:12:12 crc kubenswrapper[4854]: I1007 14:12:12.764349 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e2cc8cba-1b03-4817-9930-1a31f1971d9a-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cmjvnh\" (UID: \"e2cc8cba-1b03-4817-9930-1a31f1971d9a\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cmjvnh" Oct 07 14:12:12 crc kubenswrapper[4854]: I1007 14:12:12.764829 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e2cc8cba-1b03-4817-9930-1a31f1971d9a-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cmjvnh\" (UID: \"e2cc8cba-1b03-4817-9930-1a31f1971d9a\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cmjvnh" Oct 07 14:12:12 crc kubenswrapper[4854]: I1007 14:12:12.765008 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2cc8cba-1b03-4817-9930-1a31f1971d9a-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cmjvnh\" (UID: \"e2cc8cba-1b03-4817-9930-1a31f1971d9a\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cmjvnh" Oct 07 14:12:12 crc 
kubenswrapper[4854]: I1007 14:12:12.767867 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2cc8cba-1b03-4817-9930-1a31f1971d9a-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cmjvnh\" (UID: \"e2cc8cba-1b03-4817-9930-1a31f1971d9a\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cmjvnh" Oct 07 14:12:12 crc kubenswrapper[4854]: I1007 14:12:12.782770 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w54z2\" (UniqueName: \"kubernetes.io/projected/e2cc8cba-1b03-4817-9930-1a31f1971d9a-kube-api-access-w54z2\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cmjvnh\" (UID: \"e2cc8cba-1b03-4817-9930-1a31f1971d9a\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cmjvnh" Oct 07 14:12:12 crc kubenswrapper[4854]: I1007 14:12:12.883114 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cmjvnh" Oct 07 14:12:13 crc kubenswrapper[4854]: I1007 14:12:13.463942 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cmjvnh"] Oct 07 14:12:13 crc kubenswrapper[4854]: I1007 14:12:13.918922 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cmjvnh" event={"ID":"e2cc8cba-1b03-4817-9930-1a31f1971d9a","Type":"ContainerStarted","Data":"087c609f1f08f25bbe19d85f575e71eeba83ada7cb33a8d2a176857274f75d2c"} Oct 07 14:12:21 crc kubenswrapper[4854]: I1007 14:12:21.043812 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-create-wjrd7"] Oct 07 14:12:21 crc kubenswrapper[4854]: I1007 14:12:21.053912 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-create-wjrd7"] Oct 07 14:12:22 crc kubenswrapper[4854]: I1007 14:12:22.716737 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a661f17-e028-427d-8a00-76cdffdea5ba" path="/var/lib/kubelet/pods/4a661f17-e028-427d-8a00-76cdffdea5ba/volumes" Oct 07 14:12:23 crc kubenswrapper[4854]: I1007 14:12:23.042020 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cmjvnh" event={"ID":"e2cc8cba-1b03-4817-9930-1a31f1971d9a","Type":"ContainerStarted","Data":"220bc5cc9d3856cfdbdbaa79849dd0a3c045ae6bbc22d8a1457db227f3356dc5"} Oct 07 14:12:23 crc kubenswrapper[4854]: I1007 14:12:23.067691 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cmjvnh" podStartSLOduration=2.722313971 podStartE2EDuration="11.06766928s" podCreationTimestamp="2025-10-07 14:12:12 +0000 UTC" firstStartedPulling="2025-10-07 14:12:13.468417614 +0000 UTC m=+6449.456249879" lastFinishedPulling="2025-10-07 14:12:21.813772913 +0000 UTC m=+6457.801605188" observedRunningTime="2025-10-07 14:12:23.064340434 +0000 UTC m=+6459.052172729" watchObservedRunningTime="2025-10-07 14:12:23.06766928 +0000 UTC m=+6459.055501565" Oct 07 14:12:25 crc kubenswrapper[4854]: I1007 14:12:25.855023 4854 scope.go:117] "RemoveContainer" containerID="f93d0b34ee32037545a664909f46902df7c999eb1881c553523483e37e858157" Oct 07 14:12:32 crc kubenswrapper[4854]: I1007 14:12:32.036288 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/octavia-4d76-account-create-825gq"] Oct 07 14:12:32 crc kubenswrapper[4854]: I1007 14:12:32.049520 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-4d76-account-create-825gq"] Oct 07 14:12:32 crc kubenswrapper[4854]: I1007 14:12:32.724380 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3b22209-9bb6-4ae3-bcf8-4530fee452c5" path="/var/lib/kubelet/pods/f3b22209-9bb6-4ae3-bcf8-4530fee452c5/volumes" Oct 07 14:12:36 crc kubenswrapper[4854]: I1007 14:12:36.201032 4854 generic.go:334] "Generic (PLEG): container finished" podID="e2cc8cba-1b03-4817-9930-1a31f1971d9a" containerID="220bc5cc9d3856cfdbdbaa79849dd0a3c045ae6bbc22d8a1457db227f3356dc5" exitCode=0 Oct 07 14:12:36 crc kubenswrapper[4854]: I1007 14:12:36.201220 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cmjvnh" event={"ID":"e2cc8cba-1b03-4817-9930-1a31f1971d9a","Type":"ContainerDied","Data":"220bc5cc9d3856cfdbdbaa79849dd0a3c045ae6bbc22d8a1457db227f3356dc5"} Oct 07 14:12:37 crc kubenswrapper[4854]: I1007 14:12:37.713784 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cmjvnh" Oct 07 14:12:37 crc kubenswrapper[4854]: I1007 14:12:37.765422 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2cc8cba-1b03-4817-9930-1a31f1971d9a-inventory\") pod \"e2cc8cba-1b03-4817-9930-1a31f1971d9a\" (UID: \"e2cc8cba-1b03-4817-9930-1a31f1971d9a\") " Oct 07 14:12:37 crc kubenswrapper[4854]: I1007 14:12:37.765683 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w54z2\" (UniqueName: \"kubernetes.io/projected/e2cc8cba-1b03-4817-9930-1a31f1971d9a-kube-api-access-w54z2\") pod \"e2cc8cba-1b03-4817-9930-1a31f1971d9a\" (UID: \"e2cc8cba-1b03-4817-9930-1a31f1971d9a\") " Oct 07 14:12:37 crc kubenswrapper[4854]: I1007 14:12:37.765763 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e2cc8cba-1b03-4817-9930-1a31f1971d9a-ceph\") pod \"e2cc8cba-1b03-4817-9930-1a31f1971d9a\" (UID: \"e2cc8cba-1b03-4817-9930-1a31f1971d9a\") " Oct 07 14:12:37 crc kubenswrapper[4854]: I1007 14:12:37.765795 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2cc8cba-1b03-4817-9930-1a31f1971d9a-pre-adoption-validation-combined-ca-bundle\") pod \"e2cc8cba-1b03-4817-9930-1a31f1971d9a\" (UID: \"e2cc8cba-1b03-4817-9930-1a31f1971d9a\") " Oct 07 14:12:37 crc kubenswrapper[4854]: I1007 14:12:37.765832 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e2cc8cba-1b03-4817-9930-1a31f1971d9a-ssh-key\") pod \"e2cc8cba-1b03-4817-9930-1a31f1971d9a\" (UID: \"e2cc8cba-1b03-4817-9930-1a31f1971d9a\") " Oct 07 14:12:37 crc kubenswrapper[4854]: I1007 14:12:37.771686 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2cc8cba-1b03-4817-9930-1a31f1971d9a-pre-adoption-validation-combined-ca-bundle" (OuterVolumeSpecName: "pre-adoption-validation-combined-ca-bundle") pod "e2cc8cba-1b03-4817-9930-1a31f1971d9a" (UID: "e2cc8cba-1b03-4817-9930-1a31f1971d9a"). InnerVolumeSpecName "pre-adoption-validation-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:12:37 crc kubenswrapper[4854]: I1007 14:12:37.771716 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2cc8cba-1b03-4817-9930-1a31f1971d9a-kube-api-access-w54z2" (OuterVolumeSpecName: "kube-api-access-w54z2") pod "e2cc8cba-1b03-4817-9930-1a31f1971d9a" (UID: "e2cc8cba-1b03-4817-9930-1a31f1971d9a"). InnerVolumeSpecName "kube-api-access-w54z2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:12:37 crc kubenswrapper[4854]: I1007 14:12:37.773243 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2cc8cba-1b03-4817-9930-1a31f1971d9a-ceph" (OuterVolumeSpecName: "ceph") pod "e2cc8cba-1b03-4817-9930-1a31f1971d9a" (UID: "e2cc8cba-1b03-4817-9930-1a31f1971d9a"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:12:37 crc kubenswrapper[4854]: I1007 14:12:37.797877 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2cc8cba-1b03-4817-9930-1a31f1971d9a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e2cc8cba-1b03-4817-9930-1a31f1971d9a" (UID: "e2cc8cba-1b03-4817-9930-1a31f1971d9a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:12:37 crc kubenswrapper[4854]: I1007 14:12:37.817400 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2cc8cba-1b03-4817-9930-1a31f1971d9a-inventory" (OuterVolumeSpecName: "inventory") pod "e2cc8cba-1b03-4817-9930-1a31f1971d9a" (UID: "e2cc8cba-1b03-4817-9930-1a31f1971d9a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:12:37 crc kubenswrapper[4854]: I1007 14:12:37.869459 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w54z2\" (UniqueName: \"kubernetes.io/projected/e2cc8cba-1b03-4817-9930-1a31f1971d9a-kube-api-access-w54z2\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:37 crc kubenswrapper[4854]: I1007 14:12:37.869509 4854 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e2cc8cba-1b03-4817-9930-1a31f1971d9a-ceph\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:37 crc kubenswrapper[4854]: I1007 14:12:37.869528 4854 reconciler_common.go:293] "Volume detached for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2cc8cba-1b03-4817-9930-1a31f1971d9a-pre-adoption-validation-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:37 crc kubenswrapper[4854]: I1007 14:12:37.869547 4854 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e2cc8cba-1b03-4817-9930-1a31f1971d9a-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:37 crc kubenswrapper[4854]: I1007 14:12:37.869565 4854 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2cc8cba-1b03-4817-9930-1a31f1971d9a-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:38 crc kubenswrapper[4854]: I1007 14:12:38.233413 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cmjvnh" event={"ID":"e2cc8cba-1b03-4817-9930-1a31f1971d9a","Type":"ContainerDied","Data":"087c609f1f08f25bbe19d85f575e71eeba83ada7cb33a8d2a176857274f75d2c"} Oct 07 14:12:38 crc kubenswrapper[4854]: I1007 14:12:38.233464 4854 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="087c609f1f08f25bbe19d85f575e71eeba83ada7cb33a8d2a176857274f75d2c" Oct 07 14:12:38 crc kubenswrapper[4854]: I1007 14:12:38.233486 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cmjvnh" Oct 07 14:12:39 crc kubenswrapper[4854]: I1007 14:12:39.044014 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-persistence-db-create-q6n4n"] Oct 07 14:12:39 crc kubenswrapper[4854]: I1007 14:12:39.055049 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-persistence-db-create-q6n4n"] Oct 07 14:12:39 crc kubenswrapper[4854]: I1007 14:12:39.817652 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qdvhm"] Oct 07 14:12:39 crc kubenswrapper[4854]: E1007 14:12:39.818237 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2cc8cba-1b03-4817-9930-1a31f1971d9a" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Oct 07 14:12:39 crc kubenswrapper[4854]: I1007 14:12:39.818255 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2cc8cba-1b03-4817-9930-1a31f1971d9a" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Oct 07 14:12:39 crc kubenswrapper[4854]: I1007 14:12:39.818561 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2cc8cba-1b03-4817-9930-1a31f1971d9a" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Oct 07 14:12:39 crc kubenswrapper[4854]: I1007 14:12:39.821508 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qdvhm" Oct 07 14:12:39 crc kubenswrapper[4854]: I1007 14:12:39.836424 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qdvhm"] Oct 07 14:12:39 crc kubenswrapper[4854]: I1007 14:12:39.925811 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2r7f\" (UniqueName: \"kubernetes.io/projected/61648f4c-7049-44bb-8541-dff3bfba76b2-kube-api-access-l2r7f\") pod \"redhat-marketplace-qdvhm\" (UID: \"61648f4c-7049-44bb-8541-dff3bfba76b2\") " pod="openshift-marketplace/redhat-marketplace-qdvhm" Oct 07 14:12:39 crc kubenswrapper[4854]: I1007 14:12:39.925905 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61648f4c-7049-44bb-8541-dff3bfba76b2-catalog-content\") pod \"redhat-marketplace-qdvhm\" (UID: \"61648f4c-7049-44bb-8541-dff3bfba76b2\") " pod="openshift-marketplace/redhat-marketplace-qdvhm" Oct 07 14:12:39 crc kubenswrapper[4854]: I1007 14:12:39.926617 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61648f4c-7049-44bb-8541-dff3bfba76b2-utilities\") pod \"redhat-marketplace-qdvhm\" (UID: \"61648f4c-7049-44bb-8541-dff3bfba76b2\") " pod="openshift-marketplace/redhat-marketplace-qdvhm" Oct 07 14:12:40 crc kubenswrapper[4854]: I1007 14:12:40.028533 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61648f4c-7049-44bb-8541-dff3bfba76b2-utilities\") pod \"redhat-marketplace-qdvhm\" (UID: 
\"61648f4c-7049-44bb-8541-dff3bfba76b2\") " pod="openshift-marketplace/redhat-marketplace-qdvhm" Oct 07 14:12:40 crc kubenswrapper[4854]: I1007 14:12:40.028744 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2r7f\" (UniqueName: \"kubernetes.io/projected/61648f4c-7049-44bb-8541-dff3bfba76b2-kube-api-access-l2r7f\") pod \"redhat-marketplace-qdvhm\" (UID: \"61648f4c-7049-44bb-8541-dff3bfba76b2\") " pod="openshift-marketplace/redhat-marketplace-qdvhm" Oct 07 14:12:40 crc kubenswrapper[4854]: I1007 14:12:40.028839 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61648f4c-7049-44bb-8541-dff3bfba76b2-catalog-content\") pod \"redhat-marketplace-qdvhm\" (UID: \"61648f4c-7049-44bb-8541-dff3bfba76b2\") " pod="openshift-marketplace/redhat-marketplace-qdvhm" Oct 07 14:12:40 crc kubenswrapper[4854]: I1007 14:12:40.029017 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61648f4c-7049-44bb-8541-dff3bfba76b2-utilities\") pod \"redhat-marketplace-qdvhm\" (UID: \"61648f4c-7049-44bb-8541-dff3bfba76b2\") " pod="openshift-marketplace/redhat-marketplace-qdvhm" Oct 07 14:12:40 crc kubenswrapper[4854]: I1007 14:12:40.029597 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61648f4c-7049-44bb-8541-dff3bfba76b2-catalog-content\") pod \"redhat-marketplace-qdvhm\" (UID: \"61648f4c-7049-44bb-8541-dff3bfba76b2\") " pod="openshift-marketplace/redhat-marketplace-qdvhm" Oct 07 14:12:40 crc kubenswrapper[4854]: I1007 14:12:40.052365 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2r7f\" (UniqueName: \"kubernetes.io/projected/61648f4c-7049-44bb-8541-dff3bfba76b2-kube-api-access-l2r7f\") pod \"redhat-marketplace-qdvhm\" (UID: \"61648f4c-7049-44bb-8541-dff3bfba76b2\") " pod="openshift-marketplace/redhat-marketplace-qdvhm" Oct 07 14:12:40 crc kubenswrapper[4854]: I1007 14:12:40.164921 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qdvhm" Oct 07 14:12:40 crc kubenswrapper[4854]: I1007 14:12:40.648615 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qdvhm"] Oct 07 14:12:40 crc kubenswrapper[4854]: I1007 14:12:40.716448 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d99fd817-df4d-4f1f-8915-a2e87d2266a9" path="/var/lib/kubelet/pods/d99fd817-df4d-4f1f-8915-a2e87d2266a9/volumes" Oct 07 14:12:40 crc kubenswrapper[4854]: I1007 14:12:40.807874 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:12:40 crc kubenswrapper[4854]: I1007 14:12:40.807939 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:12:41 crc kubenswrapper[4854]: I1007 14:12:41.265507 4854 generic.go:334] "Generic (PLEG): container finished" podID="61648f4c-7049-44bb-8541-dff3bfba76b2" containerID="38981941a9fc912432e3b677986370ea0456d6863883bef80ac5ad291fa74885" exitCode=0 Oct 07 14:12:41 crc kubenswrapper[4854]: I1007 14:12:41.265670 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qdvhm" event={"ID":"61648f4c-7049-44bb-8541-dff3bfba76b2","Type":"ContainerDied","Data":"38981941a9fc912432e3b677986370ea0456d6863883bef80ac5ad291fa74885"} Oct 07 14:12:41 crc kubenswrapper[4854]: I1007 14:12:41.265919 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qdvhm" event={"ID":"61648f4c-7049-44bb-8541-dff3bfba76b2","Type":"ContainerStarted","Data":"26da37f39392e4fe40109835012ac89820f9a9fb146d4e66f080c09a530c5fe0"} Oct 07 14:12:42 crc kubenswrapper[4854]: I1007 14:12:42.005882 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-v57sg"] Oct 07 14:12:42 crc kubenswrapper[4854]: I1007 14:12:42.008689 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v57sg" Oct 07 14:12:42 crc kubenswrapper[4854]: I1007 14:12:42.055182 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v57sg"] Oct 07 14:12:42 crc kubenswrapper[4854]: I1007 14:12:42.082942 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgmqz\" (UniqueName: \"kubernetes.io/projected/b5595990-c273-46d8-b471-0c688d7d9bb3-kube-api-access-sgmqz\") pod \"redhat-operators-v57sg\" (UID: \"b5595990-c273-46d8-b471-0c688d7d9bb3\") " pod="openshift-marketplace/redhat-operators-v57sg" Oct 07 14:12:42 crc kubenswrapper[4854]: I1007 14:12:42.083494 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5595990-c273-46d8-b471-0c688d7d9bb3-utilities\") pod \"redhat-operators-v57sg\" (UID: \"b5595990-c273-46d8-b471-0c688d7d9bb3\") " pod="openshift-marketplace/redhat-operators-v57sg" Oct 07 14:12:42 crc kubenswrapper[4854]: I1007 14:12:42.083536 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5595990-c273-46d8-b471-0c688d7d9bb3-catalog-content\") pod \"redhat-operators-v57sg\" (UID: \"b5595990-c273-46d8-b471-0c688d7d9bb3\") " pod="openshift-marketplace/redhat-operators-v57sg" Oct 07 14:12:42 crc kubenswrapper[4854]: I1007 14:12:42.185689 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5595990-c273-46d8-b471-0c688d7d9bb3-utilities\") pod \"redhat-operators-v57sg\" (UID: \"b5595990-c273-46d8-b471-0c688d7d9bb3\") " pod="openshift-marketplace/redhat-operators-v57sg" Oct 07 14:12:42 crc kubenswrapper[4854]: I1007 14:12:42.185738 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5595990-c273-46d8-b471-0c688d7d9bb3-catalog-content\") pod \"redhat-operators-v57sg\" (UID: \"b5595990-c273-46d8-b471-0c688d7d9bb3\") " pod="openshift-marketplace/redhat-operators-v57sg" Oct 07 14:12:42 crc kubenswrapper[4854]: I1007 14:12:42.185805 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgmqz\" (UniqueName: \"kubernetes.io/projected/b5595990-c273-46d8-b471-0c688d7d9bb3-kube-api-access-sgmqz\") pod \"redhat-operators-v57sg\" (UID: \"b5595990-c273-46d8-b471-0c688d7d9bb3\") " pod="openshift-marketplace/redhat-operators-v57sg" Oct 07 14:12:42 crc kubenswrapper[4854]: I1007 14:12:42.186713 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5595990-c273-46d8-b471-0c688d7d9bb3-utilities\") pod \"redhat-operators-v57sg\" (UID: \"b5595990-c273-46d8-b471-0c688d7d9bb3\") " pod="openshift-marketplace/redhat-operators-v57sg" Oct 07 14:12:42 crc kubenswrapper[4854]: I1007 14:12:42.186801 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5595990-c273-46d8-b471-0c688d7d9bb3-catalog-content\") pod \"redhat-operators-v57sg\" (UID: \"b5595990-c273-46d8-b471-0c688d7d9bb3\") " pod="openshift-marketplace/redhat-operators-v57sg" Oct 07 14:12:42 crc kubenswrapper[4854]: I1007 14:12:42.205484 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-sgmqz\" (UniqueName: \"kubernetes.io/projected/b5595990-c273-46d8-b471-0c688d7d9bb3-kube-api-access-sgmqz\") pod \"redhat-operators-v57sg\" (UID: \"b5595990-c273-46d8-b471-0c688d7d9bb3\") " pod="openshift-marketplace/redhat-operators-v57sg" Oct 07 14:12:42 crc kubenswrapper[4854]: I1007 14:12:42.280200 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qdvhm" event={"ID":"61648f4c-7049-44bb-8541-dff3bfba76b2","Type":"ContainerStarted","Data":"db1f155b7e81ce24ad37f21550c7b892a1385000c597bac30391f09034aa7db9"} Oct 07 14:12:42 crc kubenswrapper[4854]: I1007 14:12:42.365620 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v57sg" Oct 07 14:12:42 crc kubenswrapper[4854]: I1007 14:12:42.888085 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v57sg"] Oct 07 14:12:42 crc kubenswrapper[4854]: W1007 14:12:42.893944 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5595990_c273_46d8_b471_0c688d7d9bb3.slice/crio-05f54e5d5275be3fc866be4eb03195e52330114a7cbb4178b6fa3e875bc2ee7b WatchSource:0}: Error finding container 05f54e5d5275be3fc866be4eb03195e52330114a7cbb4178b6fa3e875bc2ee7b: Status 404 returned error can't find the container with id 05f54e5d5275be3fc866be4eb03195e52330114a7cbb4178b6fa3e875bc2ee7b Oct 07 14:12:43 crc kubenswrapper[4854]: I1007 14:12:43.293141 4854 generic.go:334] "Generic (PLEG): container finished" podID="61648f4c-7049-44bb-8541-dff3bfba76b2" containerID="db1f155b7e81ce24ad37f21550c7b892a1385000c597bac30391f09034aa7db9" exitCode=0 Oct 07 14:12:43 crc kubenswrapper[4854]: I1007 14:12:43.293283 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qdvhm" event={"ID":"61648f4c-7049-44bb-8541-dff3bfba76b2","Type":"ContainerDied","Data":"db1f155b7e81ce24ad37f21550c7b892a1385000c597bac30391f09034aa7db9"} Oct 07 14:12:43 crc kubenswrapper[4854]: I1007 14:12:43.295965 4854 generic.go:334] "Generic (PLEG): container finished" podID="b5595990-c273-46d8-b471-0c688d7d9bb3" containerID="55f4bfbed54586a0a4760ff794b15eaa6682f467a2c4029eb6e08be88d42f6ba" exitCode=0 Oct 07 14:12:43 crc kubenswrapper[4854]: I1007 14:12:43.296000 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v57sg" event={"ID":"b5595990-c273-46d8-b471-0c688d7d9bb3","Type":"ContainerDied","Data":"55f4bfbed54586a0a4760ff794b15eaa6682f467a2c4029eb6e08be88d42f6ba"} Oct 07 14:12:43 crc kubenswrapper[4854]: I1007 14:12:43.296030 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v57sg" event={"ID":"b5595990-c273-46d8-b471-0c688d7d9bb3","Type":"ContainerStarted","Data":"05f54e5d5275be3fc866be4eb03195e52330114a7cbb4178b6fa3e875bc2ee7b"} Oct 07 14:12:44 crc kubenswrapper[4854]: I1007 14:12:44.308506 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v57sg" event={"ID":"b5595990-c273-46d8-b471-0c688d7d9bb3","Type":"ContainerStarted","Data":"375e72af1c61fbf35f5878ee7e3bfa5ca16fbbad38f1f75d63b35b87b738ce55"} Oct 07 14:12:44 crc kubenswrapper[4854]: I1007 14:12:44.311986 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qdvhm" 
event={"ID":"61648f4c-7049-44bb-8541-dff3bfba76b2","Type":"ContainerStarted","Data":"2afc917cc31e701efa78367be583d14e4a2897709ab2f469e3206739b62a0284"} Oct 07 14:12:44 crc kubenswrapper[4854]: I1007 14:12:44.377088 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qdvhm" podStartSLOduration=2.77322807 podStartE2EDuration="5.377071674s" podCreationTimestamp="2025-10-07 14:12:39 +0000 UTC" firstStartedPulling="2025-10-07 14:12:41.267461728 +0000 UTC m=+6477.255293993" lastFinishedPulling="2025-10-07 14:12:43.871305322 +0000 UTC m=+6479.859137597" observedRunningTime="2025-10-07 14:12:44.369797835 +0000 UTC m=+6480.357630090" watchObservedRunningTime="2025-10-07 14:12:44.377071674 +0000 UTC m=+6480.364903929" Oct 07 14:12:45 crc kubenswrapper[4854]: I1007 14:12:45.830601 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-528sm"] Oct 07 14:12:45 crc kubenswrapper[4854]: I1007 14:12:45.833333 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-528sm" Oct 07 14:12:45 crc kubenswrapper[4854]: I1007 14:12:45.836073 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 07 14:12:45 crc kubenswrapper[4854]: I1007 14:12:45.837043 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-n7cf5" Oct 07 14:12:45 crc kubenswrapper[4854]: I1007 14:12:45.837371 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 07 14:12:45 crc kubenswrapper[4854]: I1007 14:12:45.851431 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 14:12:45 crc kubenswrapper[4854]: I1007 14:12:45.855319 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-528sm"] Oct 07 14:12:45 crc kubenswrapper[4854]: I1007 14:12:45.978615 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/17027554-19b0-44c4-8798-f3f0025605ca-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-528sm\" (UID: \"17027554-19b0-44c4-8798-f3f0025605ca\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-528sm" Oct 07 14:12:45 crc kubenswrapper[4854]: I1007 14:12:45.978822 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17027554-19b0-44c4-8798-f3f0025605ca-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-528sm\" (UID: \"17027554-19b0-44c4-8798-f3f0025605ca\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-528sm" Oct 07 14:12:45 crc kubenswrapper[4854]: I1007 14:12:45.979063 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/17027554-19b0-44c4-8798-f3f0025605ca-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-528sm\" (UID: \"17027554-19b0-44c4-8798-f3f0025605ca\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-528sm" Oct 07 14:12:45 crc kubenswrapper[4854]: I1007 14:12:45.979310 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17027554-19b0-44c4-8798-f3f0025605ca-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-528sm\" (UID: \"17027554-19b0-44c4-8798-f3f0025605ca\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-528sm" Oct 07 14:12:45 crc kubenswrapper[4854]: I1007 14:12:45.979570 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmmj5\" (UniqueName: \"kubernetes.io/projected/17027554-19b0-44c4-8798-f3f0025605ca-kube-api-access-wmmj5\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-528sm\" (UID: \"17027554-19b0-44c4-8798-f3f0025605ca\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-528sm" Oct 07 14:12:46 crc kubenswrapper[4854]: I1007 14:12:46.081739 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/17027554-19b0-44c4-8798-f3f0025605ca-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-528sm\" (UID: \"17027554-19b0-44c4-8798-f3f0025605ca\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-528sm" Oct 07 14:12:46 crc kubenswrapper[4854]: I1007 14:12:46.081893 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17027554-19b0-44c4-8798-f3f0025605ca-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-528sm\" (UID: \"17027554-19b0-44c4-8798-f3f0025605ca\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-528sm" Oct 07 14:12:46 crc kubenswrapper[4854]: I1007 14:12:46.082061 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmmj5\" (UniqueName: \"kubernetes.io/projected/17027554-19b0-44c4-8798-f3f0025605ca-kube-api-access-wmmj5\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-528sm\" (UID: \"17027554-19b0-44c4-8798-f3f0025605ca\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-528sm" Oct 07 14:12:46 crc kubenswrapper[4854]: I1007 14:12:46.082129 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/17027554-19b0-44c4-8798-f3f0025605ca-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-528sm\" (UID: \"17027554-19b0-44c4-8798-f3f0025605ca\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-528sm" Oct 07 14:12:46 crc kubenswrapper[4854]: I1007 14:12:46.082251 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17027554-19b0-44c4-8798-f3f0025605ca-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-528sm\" (UID: \"17027554-19b0-44c4-8798-f3f0025605ca\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-528sm" Oct 07 14:12:46 crc kubenswrapper[4854]: I1007 14:12:46.090652 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17027554-19b0-44c4-8798-f3f0025605ca-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-528sm\" (UID: \"17027554-19b0-44c4-8798-f3f0025605ca\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-528sm" Oct 07 14:12:46 crc kubenswrapper[4854]: I1007 14:12:46.091243 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17027554-19b0-44c4-8798-f3f0025605ca-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-528sm\" (UID: \"17027554-19b0-44c4-8798-f3f0025605ca\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-528sm" Oct 07 14:12:46 crc kubenswrapper[4854]: I1007 14:12:46.091852 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/17027554-19b0-44c4-8798-f3f0025605ca-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-528sm\" (UID: \"17027554-19b0-44c4-8798-f3f0025605ca\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-528sm" Oct 07 14:12:46 crc kubenswrapper[4854]: I1007 14:12:46.092794 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/17027554-19b0-44c4-8798-f3f0025605ca-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-528sm\" (UID: \"17027554-19b0-44c4-8798-f3f0025605ca\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-528sm" Oct 07 14:12:46 crc kubenswrapper[4854]: I1007 14:12:46.111589 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmmj5\" (UniqueName: \"kubernetes.io/projected/17027554-19b0-44c4-8798-f3f0025605ca-kube-api-access-wmmj5\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-528sm\" (UID: \"17027554-19b0-44c4-8798-f3f0025605ca\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-528sm" Oct 07 14:12:46 crc kubenswrapper[4854]: I1007 14:12:46.174943 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-528sm" Oct 07 14:12:46 crc kubenswrapper[4854]: I1007 14:12:46.805489 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-528sm"] Oct 07 14:12:47 crc kubenswrapper[4854]: I1007 14:12:47.353387 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-528sm" event={"ID":"17027554-19b0-44c4-8798-f3f0025605ca","Type":"ContainerStarted","Data":"5b7f6c66ba5741bcc30009e0511fe3816038e5ae424da948dd4ad2656b824037"} Oct 07 14:12:48 crc kubenswrapper[4854]: I1007 14:12:48.385382 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-528sm" event={"ID":"17027554-19b0-44c4-8798-f3f0025605ca","Type":"ContainerStarted","Data":"45effe9481935440024e2d6117b877865af7f22211bfc285d9a1b76eda32931a"} Oct 07 14:12:48 crc kubenswrapper[4854]: I1007 14:12:48.421591 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-528sm" podStartSLOduration=2.890016857 podStartE2EDuration="3.42155747s" podCreationTimestamp="2025-10-07 14:12:45 +0000 UTC" firstStartedPulling="2025-10-07 14:12:46.810332896 +0000 UTC m=+6482.798165151" lastFinishedPulling="2025-10-07 14:12:47.341873509 +0000 UTC m=+6483.329705764" observedRunningTime="2025-10-07 14:12:48.405500599 +0000 UTC m=+6484.393332854" watchObservedRunningTime="2025-10-07 14:12:48.42155747 +0000 UTC m=+6484.409389755" Oct 07 14:12:49 crc kubenswrapper[4854]: I1007 14:12:49.403315 4854 generic.go:334] "Generic (PLEG): container finished" podID="b5595990-c273-46d8-b471-0c688d7d9bb3" 
containerID="375e72af1c61fbf35f5878ee7e3bfa5ca16fbbad38f1f75d63b35b87b738ce55" exitCode=0 Oct 07 14:12:49 crc kubenswrapper[4854]: I1007 14:12:49.403534 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v57sg" event={"ID":"b5595990-c273-46d8-b471-0c688d7d9bb3","Type":"ContainerDied","Data":"375e72af1c61fbf35f5878ee7e3bfa5ca16fbbad38f1f75d63b35b87b738ce55"} Oct 07 14:12:50 crc kubenswrapper[4854]: I1007 14:12:50.165283 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qdvhm" Oct 07 14:12:50 crc kubenswrapper[4854]: I1007 14:12:50.165834 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qdvhm" Oct 07 14:12:50 crc kubenswrapper[4854]: I1007 14:12:50.243571 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qdvhm" Oct 07 14:12:50 crc kubenswrapper[4854]: I1007 14:12:50.476196 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qdvhm" Oct 07 14:12:50 crc kubenswrapper[4854]: I1007 14:12:50.650617 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qdvhm"] Oct 07 14:12:51 crc kubenswrapper[4854]: I1007 14:12:51.032518 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-414c-account-create-fwl6z"] Oct 07 14:12:51 crc kubenswrapper[4854]: I1007 14:12:51.041585 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-414c-account-create-fwl6z"] Oct 07 14:12:51 crc kubenswrapper[4854]: I1007 14:12:51.427054 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v57sg" event={"ID":"b5595990-c273-46d8-b471-0c688d7d9bb3","Type":"ContainerStarted","Data":"a688659bff2d27bef67ddad64e6e8648a94118dc655198298b62634ad960b86c"} Oct 07 14:12:52 crc kubenswrapper[4854]: I1007 14:12:52.366467 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-v57sg" Oct 07 14:12:52 crc kubenswrapper[4854]: I1007 14:12:52.368343 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-v57sg" Oct 07 14:12:52 crc kubenswrapper[4854]: I1007 14:12:52.456881 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qdvhm" podUID="61648f4c-7049-44bb-8541-dff3bfba76b2" containerName="registry-server" containerID="cri-o://2afc917cc31e701efa78367be583d14e4a2897709ab2f469e3206739b62a0284" gracePeriod=2 Oct 07 14:12:52 crc kubenswrapper[4854]: I1007 14:12:52.718445 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2c854b9-6e47-40df-ac06-bcce0489547b" path="/var/lib/kubelet/pods/b2c854b9-6e47-40df-ac06-bcce0489547b/volumes" Oct 07 14:12:52 crc kubenswrapper[4854]: I1007 14:12:52.995892 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qdvhm" Oct 07 14:12:53 crc kubenswrapper[4854]: I1007 14:12:53.016808 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-v57sg" podStartSLOduration=5.275516801 podStartE2EDuration="12.016784482s" podCreationTimestamp="2025-10-07 14:12:41 +0000 UTC" firstStartedPulling="2025-10-07 14:12:43.298041441 +0000 UTC m=+6479.285873706" lastFinishedPulling="2025-10-07 14:12:50.039309092 +0000 UTC m=+6486.027141387" observedRunningTime="2025-10-07 14:12:51.450705475 +0000 UTC m=+6487.438537760" watchObservedRunningTime="2025-10-07 14:12:53.016784482 +0000 UTC m=+6489.004616737" Oct 07 14:12:53 crc kubenswrapper[4854]: I1007 14:12:53.162277 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61648f4c-7049-44bb-8541-dff3bfba76b2-catalog-content\") pod \"61648f4c-7049-44bb-8541-dff3bfba76b2\" (UID: \"61648f4c-7049-44bb-8541-dff3bfba76b2\") " Oct 07 14:12:53 crc kubenswrapper[4854]: I1007 14:12:53.162414 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61648f4c-7049-44bb-8541-dff3bfba76b2-utilities\") pod \"61648f4c-7049-44bb-8541-dff3bfba76b2\" (UID: \"61648f4c-7049-44bb-8541-dff3bfba76b2\") " Oct 07 14:12:53 crc kubenswrapper[4854]: I1007 14:12:53.162612 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2r7f\" (UniqueName: \"kubernetes.io/projected/61648f4c-7049-44bb-8541-dff3bfba76b2-kube-api-access-l2r7f\") pod \"61648f4c-7049-44bb-8541-dff3bfba76b2\" (UID: \"61648f4c-7049-44bb-8541-dff3bfba76b2\") " Oct 07 14:12:53 crc kubenswrapper[4854]: I1007 14:12:53.163032 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61648f4c-7049-44bb-8541-dff3bfba76b2-utilities" (OuterVolumeSpecName: "utilities") pod "61648f4c-7049-44bb-8541-dff3bfba76b2" (UID: "61648f4c-7049-44bb-8541-dff3bfba76b2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:12:53 crc kubenswrapper[4854]: I1007 14:12:53.163788 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61648f4c-7049-44bb-8541-dff3bfba76b2-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:53 crc kubenswrapper[4854]: I1007 14:12:53.174320 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61648f4c-7049-44bb-8541-dff3bfba76b2-kube-api-access-l2r7f" (OuterVolumeSpecName: "kube-api-access-l2r7f") pod "61648f4c-7049-44bb-8541-dff3bfba76b2" (UID: "61648f4c-7049-44bb-8541-dff3bfba76b2"). InnerVolumeSpecName "kube-api-access-l2r7f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:12:53 crc kubenswrapper[4854]: I1007 14:12:53.175600 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61648f4c-7049-44bb-8541-dff3bfba76b2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "61648f4c-7049-44bb-8541-dff3bfba76b2" (UID: "61648f4c-7049-44bb-8541-dff3bfba76b2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:12:53 crc kubenswrapper[4854]: I1007 14:12:53.265267 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61648f4c-7049-44bb-8541-dff3bfba76b2-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:53 crc kubenswrapper[4854]: I1007 14:12:53.265310 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2r7f\" (UniqueName: \"kubernetes.io/projected/61648f4c-7049-44bb-8541-dff3bfba76b2-kube-api-access-l2r7f\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:53 crc kubenswrapper[4854]: I1007 14:12:53.431288 4854 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-v57sg" podUID="b5595990-c273-46d8-b471-0c688d7d9bb3" containerName="registry-server" probeResult="failure" output=< Oct 07 14:12:53 crc kubenswrapper[4854]: timeout: failed to connect service ":50051" within 1s Oct 07 14:12:53 crc kubenswrapper[4854]: > Oct 07 14:12:53 crc kubenswrapper[4854]: I1007 14:12:53.474523 4854 generic.go:334] "Generic (PLEG): container finished" podID="61648f4c-7049-44bb-8541-dff3bfba76b2" containerID="2afc917cc31e701efa78367be583d14e4a2897709ab2f469e3206739b62a0284" exitCode=0 Oct 07 14:12:53 crc kubenswrapper[4854]: I1007 14:12:53.474594 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qdvhm" event={"ID":"61648f4c-7049-44bb-8541-dff3bfba76b2","Type":"ContainerDied","Data":"2afc917cc31e701efa78367be583d14e4a2897709ab2f469e3206739b62a0284"} Oct 07 14:12:53 crc kubenswrapper[4854]: I1007 14:12:53.475214 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qdvhm" event={"ID":"61648f4c-7049-44bb-8541-dff3bfba76b2","Type":"ContainerDied","Data":"26da37f39392e4fe40109835012ac89820f9a9fb146d4e66f080c09a530c5fe0"} Oct 07 14:12:53 crc kubenswrapper[4854]: I1007 14:12:53.474676 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qdvhm" Oct 07 14:12:53 crc kubenswrapper[4854]: I1007 14:12:53.475318 4854 scope.go:117] "RemoveContainer" containerID="2afc917cc31e701efa78367be583d14e4a2897709ab2f469e3206739b62a0284" Oct 07 14:12:53 crc kubenswrapper[4854]: I1007 14:12:53.536258 4854 scope.go:117] "RemoveContainer" containerID="db1f155b7e81ce24ad37f21550c7b892a1385000c597bac30391f09034aa7db9" Oct 07 14:12:53 crc kubenswrapper[4854]: I1007 14:12:53.550388 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qdvhm"] Oct 07 14:12:53 crc kubenswrapper[4854]: I1007 14:12:53.562228 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qdvhm"] Oct 07 14:12:53 crc kubenswrapper[4854]: I1007 14:12:53.588300 4854 scope.go:117] "RemoveContainer" containerID="38981941a9fc912432e3b677986370ea0456d6863883bef80ac5ad291fa74885" Oct 07 14:12:53 crc kubenswrapper[4854]: I1007 14:12:53.613916 4854 scope.go:117] "RemoveContainer" containerID="2afc917cc31e701efa78367be583d14e4a2897709ab2f469e3206739b62a0284" Oct 07 14:12:53 crc kubenswrapper[4854]: E1007 14:12:53.615010 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2afc917cc31e701efa78367be583d14e4a2897709ab2f469e3206739b62a0284\": container with ID starting with 2afc917cc31e701efa78367be583d14e4a2897709ab2f469e3206739b62a0284 not found: ID does not exist" containerID="2afc917cc31e701efa78367be583d14e4a2897709ab2f469e3206739b62a0284" Oct 07 14:12:53 crc kubenswrapper[4854]: I1007 14:12:53.615094 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2afc917cc31e701efa78367be583d14e4a2897709ab2f469e3206739b62a0284"} err="failed to get container status \"2afc917cc31e701efa78367be583d14e4a2897709ab2f469e3206739b62a0284\": rpc error: code = NotFound desc = could not find container \"2afc917cc31e701efa78367be583d14e4a2897709ab2f469e3206739b62a0284\": container with ID starting with 2afc917cc31e701efa78367be583d14e4a2897709ab2f469e3206739b62a0284 not found: ID does not exist" Oct 07 14:12:53 crc kubenswrapper[4854]: I1007 14:12:53.615126 4854 scope.go:117] "RemoveContainer" containerID="db1f155b7e81ce24ad37f21550c7b892a1385000c597bac30391f09034aa7db9" Oct 07 14:12:53 crc kubenswrapper[4854]: E1007 14:12:53.615609 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db1f155b7e81ce24ad37f21550c7b892a1385000c597bac30391f09034aa7db9\": container with ID starting with db1f155b7e81ce24ad37f21550c7b892a1385000c597bac30391f09034aa7db9 not found: ID does not exist" containerID="db1f155b7e81ce24ad37f21550c7b892a1385000c597bac30391f09034aa7db9" Oct 07 14:12:53 crc kubenswrapper[4854]: I1007 14:12:53.615680 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db1f155b7e81ce24ad37f21550c7b892a1385000c597bac30391f09034aa7db9"} err="failed to get container status \"db1f155b7e81ce24ad37f21550c7b892a1385000c597bac30391f09034aa7db9\": rpc error: code = NotFound desc = could not find container \"db1f155b7e81ce24ad37f21550c7b892a1385000c597bac30391f09034aa7db9\": container with ID starting with db1f155b7e81ce24ad37f21550c7b892a1385000c597bac30391f09034aa7db9 not found: ID does not exist" Oct 07 14:12:53 crc kubenswrapper[4854]: I1007 14:12:53.615710 4854 scope.go:117] "RemoveContainer" 
containerID="38981941a9fc912432e3b677986370ea0456d6863883bef80ac5ad291fa74885" Oct 07 14:12:53 crc kubenswrapper[4854]: E1007 14:12:53.616124 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38981941a9fc912432e3b677986370ea0456d6863883bef80ac5ad291fa74885\": container with ID starting with 38981941a9fc912432e3b677986370ea0456d6863883bef80ac5ad291fa74885 not found: ID does not exist" containerID="38981941a9fc912432e3b677986370ea0456d6863883bef80ac5ad291fa74885" Oct 07 14:12:53 crc kubenswrapper[4854]: I1007 14:12:53.616200 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38981941a9fc912432e3b677986370ea0456d6863883bef80ac5ad291fa74885"} err="failed to get container status \"38981941a9fc912432e3b677986370ea0456d6863883bef80ac5ad291fa74885\": rpc error: code = NotFound desc = could not find container \"38981941a9fc912432e3b677986370ea0456d6863883bef80ac5ad291fa74885\": container with ID starting with 38981941a9fc912432e3b677986370ea0456d6863883bef80ac5ad291fa74885 not found: ID does not exist" Oct 07 14:12:54 crc kubenswrapper[4854]: I1007 14:12:54.714138 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61648f4c-7049-44bb-8541-dff3bfba76b2" path="/var/lib/kubelet/pods/61648f4c-7049-44bb-8541-dff3bfba76b2/volumes" Oct 07 14:13:03 crc kubenswrapper[4854]: I1007 14:13:03.474203 4854 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-v57sg" podUID="b5595990-c273-46d8-b471-0c688d7d9bb3" containerName="registry-server" probeResult="failure" output=< Oct 07 14:13:03 crc kubenswrapper[4854]: timeout: failed to connect service ":50051" within 1s Oct 07 14:13:03 crc kubenswrapper[4854]: > Oct 07 14:13:10 crc kubenswrapper[4854]: I1007 14:13:10.808307 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:13:10 crc kubenswrapper[4854]: I1007 14:13:10.808954 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:13:10 crc kubenswrapper[4854]: I1007 14:13:10.809008 4854 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" Oct 07 14:13:10 crc kubenswrapper[4854]: I1007 14:13:10.810226 4854 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"23da9ad8d99b4f3fecff0a63dbca297f16a76579dd1005e295268c0507480214"} pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 14:13:10 crc kubenswrapper[4854]: I1007 14:13:10.810285 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" 
containerID="cri-o://23da9ad8d99b4f3fecff0a63dbca297f16a76579dd1005e295268c0507480214" gracePeriod=600 Oct 07 14:13:11 crc kubenswrapper[4854]: I1007 14:13:11.703104 4854 generic.go:334] "Generic (PLEG): container finished" podID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerID="23da9ad8d99b4f3fecff0a63dbca297f16a76579dd1005e295268c0507480214" exitCode=0 Oct 07 14:13:11 crc kubenswrapper[4854]: I1007 14:13:11.703180 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" event={"ID":"40b8b82d-cfd5-41d7-8673-5774db092c85","Type":"ContainerDied","Data":"23da9ad8d99b4f3fecff0a63dbca297f16a76579dd1005e295268c0507480214"} Oct 07 14:13:11 crc kubenswrapper[4854]: I1007 14:13:11.703622 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" event={"ID":"40b8b82d-cfd5-41d7-8673-5774db092c85","Type":"ContainerStarted","Data":"98256c42a54e94b9b889e5d466e2ae1d1852c7195bf7391d9da923423a42d1c7"} Oct 07 14:13:11 crc kubenswrapper[4854]: I1007 14:13:11.703644 4854 scope.go:117] "RemoveContainer" containerID="4fe613980076ca3ed8559fe0e286e39c2d0d1c3badb8354aa3600a7ada37572b" Oct 07 14:13:13 crc kubenswrapper[4854]: I1007 14:13:13.450979 4854 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-v57sg" podUID="b5595990-c273-46d8-b471-0c688d7d9bb3" containerName="registry-server" probeResult="failure" output=< Oct 07 14:13:13 crc kubenswrapper[4854]: timeout: failed to connect service ":50051" within 1s Oct 07 14:13:13 crc kubenswrapper[4854]: > Oct 07 14:13:22 crc kubenswrapper[4854]: I1007 14:13:22.420077 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-v57sg" Oct 07 14:13:22 crc kubenswrapper[4854]: I1007 14:13:22.478865 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-v57sg" Oct 07 14:13:22 crc kubenswrapper[4854]: I1007 14:13:22.655783 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v57sg"] Oct 07 14:13:23 crc kubenswrapper[4854]: I1007 14:13:23.836299 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-v57sg" podUID="b5595990-c273-46d8-b471-0c688d7d9bb3" containerName="registry-server" containerID="cri-o://a688659bff2d27bef67ddad64e6e8648a94118dc655198298b62634ad960b86c" gracePeriod=2 Oct 07 14:13:24 crc kubenswrapper[4854]: I1007 14:13:24.851007 4854 generic.go:334] "Generic (PLEG): container finished" podID="b5595990-c273-46d8-b471-0c688d7d9bb3" containerID="a688659bff2d27bef67ddad64e6e8648a94118dc655198298b62634ad960b86c" exitCode=0 Oct 07 14:13:24 crc kubenswrapper[4854]: I1007 14:13:24.851099 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v57sg" event={"ID":"b5595990-c273-46d8-b471-0c688d7d9bb3","Type":"ContainerDied","Data":"a688659bff2d27bef67ddad64e6e8648a94118dc655198298b62634ad960b86c"} Oct 07 14:13:24 crc kubenswrapper[4854]: I1007 14:13:24.962907 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v57sg" Oct 07 14:13:25 crc kubenswrapper[4854]: I1007 14:13:25.114559 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5595990-c273-46d8-b471-0c688d7d9bb3-utilities\") pod \"b5595990-c273-46d8-b471-0c688d7d9bb3\" (UID: \"b5595990-c273-46d8-b471-0c688d7d9bb3\") " Oct 07 14:13:25 crc kubenswrapper[4854]: I1007 14:13:25.114731 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgmqz\" (UniqueName: \"kubernetes.io/projected/b5595990-c273-46d8-b471-0c688d7d9bb3-kube-api-access-sgmqz\") pod \"b5595990-c273-46d8-b471-0c688d7d9bb3\" (UID: \"b5595990-c273-46d8-b471-0c688d7d9bb3\") " Oct 07 14:13:25 crc kubenswrapper[4854]: I1007 14:13:25.114766 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5595990-c273-46d8-b471-0c688d7d9bb3-catalog-content\") pod \"b5595990-c273-46d8-b471-0c688d7d9bb3\" (UID: \"b5595990-c273-46d8-b471-0c688d7d9bb3\") " Oct 07 14:13:25 crc kubenswrapper[4854]: I1007 14:13:25.116320 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5595990-c273-46d8-b471-0c688d7d9bb3-utilities" (OuterVolumeSpecName: "utilities") pod "b5595990-c273-46d8-b471-0c688d7d9bb3" (UID: "b5595990-c273-46d8-b471-0c688d7d9bb3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:13:25 crc kubenswrapper[4854]: I1007 14:13:25.122054 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5595990-c273-46d8-b471-0c688d7d9bb3-kube-api-access-sgmqz" (OuterVolumeSpecName: "kube-api-access-sgmqz") pod "b5595990-c273-46d8-b471-0c688d7d9bb3" (UID: "b5595990-c273-46d8-b471-0c688d7d9bb3"). InnerVolumeSpecName "kube-api-access-sgmqz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:13:25 crc kubenswrapper[4854]: I1007 14:13:25.217494 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5595990-c273-46d8-b471-0c688d7d9bb3-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:25 crc kubenswrapper[4854]: I1007 14:13:25.217545 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgmqz\" (UniqueName: \"kubernetes.io/projected/b5595990-c273-46d8-b471-0c688d7d9bb3-kube-api-access-sgmqz\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:25 crc kubenswrapper[4854]: I1007 14:13:25.220787 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5595990-c273-46d8-b471-0c688d7d9bb3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b5595990-c273-46d8-b471-0c688d7d9bb3" (UID: "b5595990-c273-46d8-b471-0c688d7d9bb3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:13:25 crc kubenswrapper[4854]: I1007 14:13:25.320331 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5595990-c273-46d8-b471-0c688d7d9bb3-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:25 crc kubenswrapper[4854]: I1007 14:13:25.862784 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v57sg" event={"ID":"b5595990-c273-46d8-b471-0c688d7d9bb3","Type":"ContainerDied","Data":"05f54e5d5275be3fc866be4eb03195e52330114a7cbb4178b6fa3e875bc2ee7b"} Oct 07 14:13:25 crc kubenswrapper[4854]: I1007 14:13:25.863127 4854 scope.go:117] "RemoveContainer" containerID="a688659bff2d27bef67ddad64e6e8648a94118dc655198298b62634ad960b86c" Oct 07 14:13:25 crc kubenswrapper[4854]: I1007 14:13:25.862864 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v57sg" Oct 07 14:13:25 crc kubenswrapper[4854]: I1007 14:13:25.885387 4854 scope.go:117] "RemoveContainer" containerID="375e72af1c61fbf35f5878ee7e3bfa5ca16fbbad38f1f75d63b35b87b738ce55" Oct 07 14:13:25 crc kubenswrapper[4854]: I1007 14:13:25.908220 4854 scope.go:117] "RemoveContainer" containerID="55f4bfbed54586a0a4760ff794b15eaa6682f467a2c4029eb6e08be88d42f6ba" Oct 07 14:13:25 crc kubenswrapper[4854]: I1007 14:13:25.909713 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v57sg"] Oct 07 14:13:25 crc kubenswrapper[4854]: I1007 14:13:25.922538 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-v57sg"] Oct 07 14:13:25 crc kubenswrapper[4854]: I1007 14:13:25.950429 4854 scope.go:117] "RemoveContainer" containerID="85f0ed58c1f0270d454f5b478f6549461d576fed90482ad5e3de7205eeaa3af0" Oct 07 14:13:26 crc kubenswrapper[4854]: I1007 14:13:26.078964 4854 scope.go:117] "RemoveContainer" containerID="8a2d62b0e54763882d622240e5dc7552851155a784e35405c9243ecb3a76f780" Oct 07 14:13:26 crc kubenswrapper[4854]: I1007 14:13:26.113372 4854 scope.go:117] "RemoveContainer" containerID="784c0b89017c39a90ac2e572fe3b83c0969a8eb7e49d6082619c0859201f921a" Oct 07 14:13:26 crc kubenswrapper[4854]: I1007 14:13:26.719772 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5595990-c273-46d8-b471-0c688d7d9bb3" path="/var/lib/kubelet/pods/b5595990-c273-46d8-b471-0c688d7d9bb3/volumes" Oct 07 14:13:43 crc kubenswrapper[4854]: I1007 14:13:43.058243 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-sync-l6pgg"] Oct 07 14:13:43 crc kubenswrapper[4854]: I1007 14:13:43.082042 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-sync-l6pgg"] Oct 07 14:13:44 crc kubenswrapper[4854]: I1007 14:13:44.725184 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44e40bbe-6d98-4e62-8fc1-79dfd4ccff94" path="/var/lib/kubelet/pods/44e40bbe-6d98-4e62-8fc1-79dfd4ccff94/volumes" Oct 07 14:14:13 crc kubenswrapper[4854]: I1007 14:14:13.088961 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mpt9d"] Oct 07 14:14:13 crc kubenswrapper[4854]: E1007 14:14:13.090555 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61648f4c-7049-44bb-8541-dff3bfba76b2" containerName="extract-utilities" Oct 07 14:14:13 crc kubenswrapper[4854]: I1007 14:14:13.090580 4854 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="61648f4c-7049-44bb-8541-dff3bfba76b2" containerName="extract-utilities" Oct 07 14:14:13 crc kubenswrapper[4854]: E1007 14:14:13.090612 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5595990-c273-46d8-b471-0c688d7d9bb3" containerName="registry-server" Oct 07 14:14:13 crc kubenswrapper[4854]: I1007 14:14:13.090628 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5595990-c273-46d8-b471-0c688d7d9bb3" containerName="registry-server" Oct 07 14:14:13 crc kubenswrapper[4854]: E1007 14:14:13.090649 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5595990-c273-46d8-b471-0c688d7d9bb3" containerName="extract-utilities" Oct 07 14:14:13 crc kubenswrapper[4854]: I1007 14:14:13.090661 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5595990-c273-46d8-b471-0c688d7d9bb3" containerName="extract-utilities" Oct 07 14:14:13 crc kubenswrapper[4854]: E1007 14:14:13.090698 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5595990-c273-46d8-b471-0c688d7d9bb3" containerName="extract-content" Oct 07 14:14:13 crc kubenswrapper[4854]: I1007 14:14:13.090710 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5595990-c273-46d8-b471-0c688d7d9bb3" containerName="extract-content" Oct 07 14:14:13 crc kubenswrapper[4854]: E1007 14:14:13.090738 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61648f4c-7049-44bb-8541-dff3bfba76b2" containerName="extract-content" Oct 07 14:14:13 crc kubenswrapper[4854]: I1007 14:14:13.090750 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="61648f4c-7049-44bb-8541-dff3bfba76b2" containerName="extract-content" Oct 07 14:14:13 crc kubenswrapper[4854]: E1007 14:14:13.090791 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61648f4c-7049-44bb-8541-dff3bfba76b2" containerName="registry-server" Oct 07 14:14:13 crc kubenswrapper[4854]: I1007 14:14:13.090803 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="61648f4c-7049-44bb-8541-dff3bfba76b2" containerName="registry-server" Oct 07 14:14:13 crc kubenswrapper[4854]: I1007 14:14:13.091186 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5595990-c273-46d8-b471-0c688d7d9bb3" containerName="registry-server" Oct 07 14:14:13 crc kubenswrapper[4854]: I1007 14:14:13.091250 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="61648f4c-7049-44bb-8541-dff3bfba76b2" containerName="registry-server" Oct 07 14:14:13 crc kubenswrapper[4854]: I1007 14:14:13.093985 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mpt9d" Oct 07 14:14:13 crc kubenswrapper[4854]: I1007 14:14:13.103192 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mpt9d"] Oct 07 14:14:13 crc kubenswrapper[4854]: I1007 14:14:13.181671 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b3e58c7-1c37-493f-823b-38c013b4fd3e-utilities\") pod \"community-operators-mpt9d\" (UID: \"5b3e58c7-1c37-493f-823b-38c013b4fd3e\") " pod="openshift-marketplace/community-operators-mpt9d" Oct 07 14:14:13 crc kubenswrapper[4854]: I1007 14:14:13.181759 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b3e58c7-1c37-493f-823b-38c013b4fd3e-catalog-content\") pod \"community-operators-mpt9d\" (UID: \"5b3e58c7-1c37-493f-823b-38c013b4fd3e\") " pod="openshift-marketplace/community-operators-mpt9d" Oct 07 14:14:13 crc kubenswrapper[4854]: I1007 14:14:13.181842 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v725v\" (UniqueName: \"kubernetes.io/projected/5b3e58c7-1c37-493f-823b-38c013b4fd3e-kube-api-access-v725v\") pod \"community-operators-mpt9d\" (UID: \"5b3e58c7-1c37-493f-823b-38c013b4fd3e\") " pod="openshift-marketplace/community-operators-mpt9d" Oct 07 14:14:13 crc kubenswrapper[4854]: I1007 14:14:13.284641 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v725v\" (UniqueName: \"kubernetes.io/projected/5b3e58c7-1c37-493f-823b-38c013b4fd3e-kube-api-access-v725v\") pod \"community-operators-mpt9d\" (UID: \"5b3e58c7-1c37-493f-823b-38c013b4fd3e\") " pod="openshift-marketplace/community-operators-mpt9d" Oct 07 14:14:13 crc kubenswrapper[4854]: I1007 14:14:13.284842 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b3e58c7-1c37-493f-823b-38c013b4fd3e-utilities\") pod \"community-operators-mpt9d\" (UID: \"5b3e58c7-1c37-493f-823b-38c013b4fd3e\") " pod="openshift-marketplace/community-operators-mpt9d" Oct 07 14:14:13 crc kubenswrapper[4854]: I1007 14:14:13.284906 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b3e58c7-1c37-493f-823b-38c013b4fd3e-catalog-content\") pod \"community-operators-mpt9d\" (UID: \"5b3e58c7-1c37-493f-823b-38c013b4fd3e\") " pod="openshift-marketplace/community-operators-mpt9d" Oct 07 14:14:13 crc kubenswrapper[4854]: I1007 14:14:13.285330 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b3e58c7-1c37-493f-823b-38c013b4fd3e-utilities\") pod \"community-operators-mpt9d\" (UID: \"5b3e58c7-1c37-493f-823b-38c013b4fd3e\") " pod="openshift-marketplace/community-operators-mpt9d" Oct 07 14:14:13 crc kubenswrapper[4854]: I1007 14:14:13.285429 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b3e58c7-1c37-493f-823b-38c013b4fd3e-catalog-content\") pod \"community-operators-mpt9d\" (UID: \"5b3e58c7-1c37-493f-823b-38c013b4fd3e\") " pod="openshift-marketplace/community-operators-mpt9d" Oct 07 14:14:13 crc kubenswrapper[4854]: I1007 14:14:13.303886 4854 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-v725v\" (UniqueName: \"kubernetes.io/projected/5b3e58c7-1c37-493f-823b-38c013b4fd3e-kube-api-access-v725v\") pod \"community-operators-mpt9d\" (UID: \"5b3e58c7-1c37-493f-823b-38c013b4fd3e\") " pod="openshift-marketplace/community-operators-mpt9d" Oct 07 14:14:13 crc kubenswrapper[4854]: I1007 14:14:13.426710 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mpt9d" Oct 07 14:14:13 crc kubenswrapper[4854]: I1007 14:14:13.964742 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mpt9d"] Oct 07 14:14:14 crc kubenswrapper[4854]: I1007 14:14:14.404738 4854 generic.go:334] "Generic (PLEG): container finished" podID="5b3e58c7-1c37-493f-823b-38c013b4fd3e" containerID="33acf081ebe18a6dd90d3fe8192c5122439784f6983da3af7cc1cf1f518fa485" exitCode=0 Oct 07 14:14:14 crc kubenswrapper[4854]: I1007 14:14:14.404911 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mpt9d" event={"ID":"5b3e58c7-1c37-493f-823b-38c013b4fd3e","Type":"ContainerDied","Data":"33acf081ebe18a6dd90d3fe8192c5122439784f6983da3af7cc1cf1f518fa485"} Oct 07 14:14:14 crc kubenswrapper[4854]: I1007 14:14:14.405055 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mpt9d" event={"ID":"5b3e58c7-1c37-493f-823b-38c013b4fd3e","Type":"ContainerStarted","Data":"7f4aff456aab41ba78afdf905043b4b3739bdc1f6f6f8a4299ce009a8d91ee6b"} Oct 07 14:14:16 crc kubenswrapper[4854]: I1007 14:14:16.449475 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mpt9d" event={"ID":"5b3e58c7-1c37-493f-823b-38c013b4fd3e","Type":"ContainerStarted","Data":"dfe290e93bbcc08f0b09d76991318fd9f91a6104b3cefdc7904167610bc7918b"} Oct 07 14:14:18 crc kubenswrapper[4854]: I1007 14:14:18.471334 4854 generic.go:334] "Generic (PLEG): container finished" podID="5b3e58c7-1c37-493f-823b-38c013b4fd3e" containerID="dfe290e93bbcc08f0b09d76991318fd9f91a6104b3cefdc7904167610bc7918b" exitCode=0 Oct 07 14:14:18 crc kubenswrapper[4854]: I1007 14:14:18.471415 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mpt9d" event={"ID":"5b3e58c7-1c37-493f-823b-38c013b4fd3e","Type":"ContainerDied","Data":"dfe290e93bbcc08f0b09d76991318fd9f91a6104b3cefdc7904167610bc7918b"} Oct 07 14:14:20 crc kubenswrapper[4854]: I1007 14:14:20.497173 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mpt9d" event={"ID":"5b3e58c7-1c37-493f-823b-38c013b4fd3e","Type":"ContainerStarted","Data":"80d9194d0ad3a5119946a1d6afc14a44377be4c21f4911e7248f8f001f148df6"} Oct 07 14:14:20 crc kubenswrapper[4854]: I1007 14:14:20.525498 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mpt9d" podStartSLOduration=2.63616154 podStartE2EDuration="7.52546394s" podCreationTimestamp="2025-10-07 14:14:13 +0000 UTC" firstStartedPulling="2025-10-07 14:14:14.4068359 +0000 UTC m=+6570.394668195" lastFinishedPulling="2025-10-07 14:14:19.29613832 +0000 UTC m=+6575.283970595" observedRunningTime="2025-10-07 14:14:20.514940178 +0000 UTC m=+6576.502772473" watchObservedRunningTime="2025-10-07 14:14:20.52546394 +0000 UTC m=+6576.513296235" Oct 07 14:14:23 crc kubenswrapper[4854]: I1007 14:14:23.427829 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/community-operators-mpt9d" Oct 07 14:14:23 crc kubenswrapper[4854]: I1007 14:14:23.428562 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mpt9d" Oct 07 14:14:23 crc kubenswrapper[4854]: I1007 14:14:23.500256 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mpt9d" Oct 07 14:14:26 crc kubenswrapper[4854]: I1007 14:14:26.320298 4854 scope.go:117] "RemoveContainer" containerID="435606fc6df192f50dbc67a2421f1ab2e060d79ad120c9643fe0d6a7abb29001" Oct 07 14:14:26 crc kubenswrapper[4854]: I1007 14:14:26.426789 4854 scope.go:117] "RemoveContainer" containerID="248341ff6a952ec5666566969828ab89551b3ce71b595c4df9821322f0db3732" Oct 07 14:14:33 crc kubenswrapper[4854]: I1007 14:14:33.485419 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mpt9d" Oct 07 14:14:33 crc kubenswrapper[4854]: I1007 14:14:33.551754 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mpt9d"] Oct 07 14:14:33 crc kubenswrapper[4854]: I1007 14:14:33.656625 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mpt9d" podUID="5b3e58c7-1c37-493f-823b-38c013b4fd3e" containerName="registry-server" containerID="cri-o://80d9194d0ad3a5119946a1d6afc14a44377be4c21f4911e7248f8f001f148df6" gracePeriod=2 Oct 07 14:14:34 crc kubenswrapper[4854]: I1007 14:14:34.175163 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mpt9d" Oct 07 14:14:34 crc kubenswrapper[4854]: I1007 14:14:34.317353 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b3e58c7-1c37-493f-823b-38c013b4fd3e-catalog-content\") pod \"5b3e58c7-1c37-493f-823b-38c013b4fd3e\" (UID: \"5b3e58c7-1c37-493f-823b-38c013b4fd3e\") " Oct 07 14:14:34 crc kubenswrapper[4854]: I1007 14:14:34.317608 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v725v\" (UniqueName: \"kubernetes.io/projected/5b3e58c7-1c37-493f-823b-38c013b4fd3e-kube-api-access-v725v\") pod \"5b3e58c7-1c37-493f-823b-38c013b4fd3e\" (UID: \"5b3e58c7-1c37-493f-823b-38c013b4fd3e\") " Oct 07 14:14:34 crc kubenswrapper[4854]: I1007 14:14:34.317670 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b3e58c7-1c37-493f-823b-38c013b4fd3e-utilities\") pod \"5b3e58c7-1c37-493f-823b-38c013b4fd3e\" (UID: \"5b3e58c7-1c37-493f-823b-38c013b4fd3e\") " Oct 07 14:14:34 crc kubenswrapper[4854]: I1007 14:14:34.318761 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b3e58c7-1c37-493f-823b-38c013b4fd3e-utilities" (OuterVolumeSpecName: "utilities") pod "5b3e58c7-1c37-493f-823b-38c013b4fd3e" (UID: "5b3e58c7-1c37-493f-823b-38c013b4fd3e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:14:34 crc kubenswrapper[4854]: I1007 14:14:34.323722 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b3e58c7-1c37-493f-823b-38c013b4fd3e-kube-api-access-v725v" (OuterVolumeSpecName: "kube-api-access-v725v") pod "5b3e58c7-1c37-493f-823b-38c013b4fd3e" (UID: "5b3e58c7-1c37-493f-823b-38c013b4fd3e"). InnerVolumeSpecName "kube-api-access-v725v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:14:34 crc kubenswrapper[4854]: I1007 14:14:34.396634 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b3e58c7-1c37-493f-823b-38c013b4fd3e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5b3e58c7-1c37-493f-823b-38c013b4fd3e" (UID: "5b3e58c7-1c37-493f-823b-38c013b4fd3e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:14:34 crc kubenswrapper[4854]: I1007 14:14:34.419683 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v725v\" (UniqueName: \"kubernetes.io/projected/5b3e58c7-1c37-493f-823b-38c013b4fd3e-kube-api-access-v725v\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:34 crc kubenswrapper[4854]: I1007 14:14:34.419715 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b3e58c7-1c37-493f-823b-38c013b4fd3e-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:34 crc kubenswrapper[4854]: I1007 14:14:34.419727 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b3e58c7-1c37-493f-823b-38c013b4fd3e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:34 crc kubenswrapper[4854]: I1007 14:14:34.667133 4854 generic.go:334] "Generic (PLEG): container finished" podID="5b3e58c7-1c37-493f-823b-38c013b4fd3e" containerID="80d9194d0ad3a5119946a1d6afc14a44377be4c21f4911e7248f8f001f148df6" exitCode=0 Oct 07 14:14:34 crc kubenswrapper[4854]: I1007 14:14:34.667383 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mpt9d" event={"ID":"5b3e58c7-1c37-493f-823b-38c013b4fd3e","Type":"ContainerDied","Data":"80d9194d0ad3a5119946a1d6afc14a44377be4c21f4911e7248f8f001f148df6"} Oct 07 14:14:34 crc kubenswrapper[4854]: I1007 14:14:34.668504 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mpt9d" event={"ID":"5b3e58c7-1c37-493f-823b-38c013b4fd3e","Type":"ContainerDied","Data":"7f4aff456aab41ba78afdf905043b4b3739bdc1f6f6f8a4299ce009a8d91ee6b"} Oct 07 14:14:34 crc kubenswrapper[4854]: I1007 14:14:34.668595 4854 scope.go:117] "RemoveContainer" containerID="80d9194d0ad3a5119946a1d6afc14a44377be4c21f4911e7248f8f001f148df6" Oct 07 14:14:34 crc kubenswrapper[4854]: I1007 14:14:34.667488 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mpt9d" Oct 07 14:14:34 crc kubenswrapper[4854]: I1007 14:14:34.709763 4854 scope.go:117] "RemoveContainer" containerID="dfe290e93bbcc08f0b09d76991318fd9f91a6104b3cefdc7904167610bc7918b" Oct 07 14:14:34 crc kubenswrapper[4854]: I1007 14:14:34.741902 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mpt9d"] Oct 07 14:14:34 crc kubenswrapper[4854]: I1007 14:14:34.741939 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mpt9d"] Oct 07 14:14:34 crc kubenswrapper[4854]: I1007 14:14:34.742082 4854 scope.go:117] "RemoveContainer" containerID="33acf081ebe18a6dd90d3fe8192c5122439784f6983da3af7cc1cf1f518fa485" Oct 07 14:14:34 crc kubenswrapper[4854]: I1007 14:14:34.783126 4854 scope.go:117] "RemoveContainer" containerID="80d9194d0ad3a5119946a1d6afc14a44377be4c21f4911e7248f8f001f148df6" Oct 07 14:14:34 crc kubenswrapper[4854]: E1007 14:14:34.783543 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80d9194d0ad3a5119946a1d6afc14a44377be4c21f4911e7248f8f001f148df6\": container with ID starting with 80d9194d0ad3a5119946a1d6afc14a44377be4c21f4911e7248f8f001f148df6 not found: ID does not exist" containerID="80d9194d0ad3a5119946a1d6afc14a44377be4c21f4911e7248f8f001f148df6" Oct 07 14:14:34 crc kubenswrapper[4854]: I1007 14:14:34.783655 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80d9194d0ad3a5119946a1d6afc14a44377be4c21f4911e7248f8f001f148df6"} err="failed to get container status \"80d9194d0ad3a5119946a1d6afc14a44377be4c21f4911e7248f8f001f148df6\": rpc error: code = NotFound desc = could not find container \"80d9194d0ad3a5119946a1d6afc14a44377be4c21f4911e7248f8f001f148df6\": container with ID starting with 80d9194d0ad3a5119946a1d6afc14a44377be4c21f4911e7248f8f001f148df6 not found: ID does not exist" Oct 07 14:14:34 crc kubenswrapper[4854]: I1007 14:14:34.783683 4854 scope.go:117] "RemoveContainer" containerID="dfe290e93bbcc08f0b09d76991318fd9f91a6104b3cefdc7904167610bc7918b" Oct 07 14:14:34 crc kubenswrapper[4854]: E1007 14:14:34.783953 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfe290e93bbcc08f0b09d76991318fd9f91a6104b3cefdc7904167610bc7918b\": container with ID starting with dfe290e93bbcc08f0b09d76991318fd9f91a6104b3cefdc7904167610bc7918b not found: ID does not exist" containerID="dfe290e93bbcc08f0b09d76991318fd9f91a6104b3cefdc7904167610bc7918b" Oct 07 14:14:34 crc kubenswrapper[4854]: I1007 14:14:34.783973 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfe290e93bbcc08f0b09d76991318fd9f91a6104b3cefdc7904167610bc7918b"} err="failed to get container status \"dfe290e93bbcc08f0b09d76991318fd9f91a6104b3cefdc7904167610bc7918b\": rpc error: code = NotFound desc = could not find container \"dfe290e93bbcc08f0b09d76991318fd9f91a6104b3cefdc7904167610bc7918b\": container with ID starting with dfe290e93bbcc08f0b09d76991318fd9f91a6104b3cefdc7904167610bc7918b not found: ID does not exist" Oct 07 14:14:34 crc kubenswrapper[4854]: I1007 14:14:34.783984 4854 scope.go:117] "RemoveContainer" containerID="33acf081ebe18a6dd90d3fe8192c5122439784f6983da3af7cc1cf1f518fa485" Oct 07 14:14:34 crc kubenswrapper[4854]: E1007 14:14:34.784249 4854 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"33acf081ebe18a6dd90d3fe8192c5122439784f6983da3af7cc1cf1f518fa485\": container with ID starting with 33acf081ebe18a6dd90d3fe8192c5122439784f6983da3af7cc1cf1f518fa485 not found: ID does not exist" containerID="33acf081ebe18a6dd90d3fe8192c5122439784f6983da3af7cc1cf1f518fa485" Oct 07 14:14:34 crc kubenswrapper[4854]: I1007 14:14:34.784277 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33acf081ebe18a6dd90d3fe8192c5122439784f6983da3af7cc1cf1f518fa485"} err="failed to get container status \"33acf081ebe18a6dd90d3fe8192c5122439784f6983da3af7cc1cf1f518fa485\": rpc error: code = NotFound desc = could not find container \"33acf081ebe18a6dd90d3fe8192c5122439784f6983da3af7cc1cf1f518fa485\": container with ID starting with 33acf081ebe18a6dd90d3fe8192c5122439784f6983da3af7cc1cf1f518fa485 not found: ID does not exist" Oct 07 14:14:36 crc kubenswrapper[4854]: I1007 14:14:36.729496 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b3e58c7-1c37-493f-823b-38c013b4fd3e" path="/var/lib/kubelet/pods/5b3e58c7-1c37-493f-823b-38c013b4fd3e/volumes" Oct 07 14:15:00 crc kubenswrapper[4854]: I1007 14:15:00.168971 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330775-wcn9z"] Oct 07 14:15:00 crc kubenswrapper[4854]: E1007 14:15:00.170796 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b3e58c7-1c37-493f-823b-38c013b4fd3e" containerName="extract-content" Oct 07 14:15:00 crc kubenswrapper[4854]: I1007 14:15:00.170876 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b3e58c7-1c37-493f-823b-38c013b4fd3e" containerName="extract-content" Oct 07 14:15:00 crc kubenswrapper[4854]: E1007 14:15:00.170953 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b3e58c7-1c37-493f-823b-38c013b4fd3e" containerName="registry-server" Oct 07 14:15:00 crc kubenswrapper[4854]: I1007 14:15:00.171028 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b3e58c7-1c37-493f-823b-38c013b4fd3e" containerName="registry-server" Oct 07 14:15:00 crc kubenswrapper[4854]: E1007 14:15:00.171131 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b3e58c7-1c37-493f-823b-38c013b4fd3e" containerName="extract-utilities" Oct 07 14:15:00 crc kubenswrapper[4854]: I1007 14:15:00.171240 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b3e58c7-1c37-493f-823b-38c013b4fd3e" containerName="extract-utilities" Oct 07 14:15:00 crc kubenswrapper[4854]: I1007 14:15:00.171558 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b3e58c7-1c37-493f-823b-38c013b4fd3e" containerName="registry-server" Oct 07 14:15:00 crc kubenswrapper[4854]: I1007 14:15:00.172503 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330775-wcn9z" Oct 07 14:15:00 crc kubenswrapper[4854]: I1007 14:15:00.181194 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330775-wcn9z"] Oct 07 14:15:00 crc kubenswrapper[4854]: I1007 14:15:00.212890 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 07 14:15:00 crc kubenswrapper[4854]: I1007 14:15:00.213189 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 07 14:15:00 crc kubenswrapper[4854]: I1007 14:15:00.337726 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cb83bc05-c755-4cac-9e66-cf1f0ecfaa6d-config-volume\") pod \"collect-profiles-29330775-wcn9z\" (UID: \"cb83bc05-c755-4cac-9e66-cf1f0ecfaa6d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330775-wcn9z" Oct 07 14:15:00 crc kubenswrapper[4854]: I1007 14:15:00.338136 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cb83bc05-c755-4cac-9e66-cf1f0ecfaa6d-secret-volume\") pod \"collect-profiles-29330775-wcn9z\" (UID: \"cb83bc05-c755-4cac-9e66-cf1f0ecfaa6d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330775-wcn9z" Oct 07 14:15:00 crc kubenswrapper[4854]: I1007 14:15:00.338499 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ll9c\" (UniqueName: \"kubernetes.io/projected/cb83bc05-c755-4cac-9e66-cf1f0ecfaa6d-kube-api-access-6ll9c\") pod \"collect-profiles-29330775-wcn9z\" (UID: \"cb83bc05-c755-4cac-9e66-cf1f0ecfaa6d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330775-wcn9z" Oct 07 14:15:00 crc kubenswrapper[4854]: I1007 14:15:00.440561 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cb83bc05-c755-4cac-9e66-cf1f0ecfaa6d-secret-volume\") pod \"collect-profiles-29330775-wcn9z\" (UID: \"cb83bc05-c755-4cac-9e66-cf1f0ecfaa6d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330775-wcn9z" Oct 07 14:15:00 crc kubenswrapper[4854]: I1007 14:15:00.440865 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ll9c\" (UniqueName: \"kubernetes.io/projected/cb83bc05-c755-4cac-9e66-cf1f0ecfaa6d-kube-api-access-6ll9c\") pod \"collect-profiles-29330775-wcn9z\" (UID: \"cb83bc05-c755-4cac-9e66-cf1f0ecfaa6d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330775-wcn9z" Oct 07 14:15:00 crc kubenswrapper[4854]: I1007 14:15:00.441022 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cb83bc05-c755-4cac-9e66-cf1f0ecfaa6d-config-volume\") pod \"collect-profiles-29330775-wcn9z\" (UID: \"cb83bc05-c755-4cac-9e66-cf1f0ecfaa6d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330775-wcn9z" Oct 07 14:15:00 crc kubenswrapper[4854]: I1007 14:15:00.442555 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cb83bc05-c755-4cac-9e66-cf1f0ecfaa6d-config-volume\") pod 
\"collect-profiles-29330775-wcn9z\" (UID: \"cb83bc05-c755-4cac-9e66-cf1f0ecfaa6d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330775-wcn9z" Oct 07 14:15:00 crc kubenswrapper[4854]: I1007 14:15:00.453846 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cb83bc05-c755-4cac-9e66-cf1f0ecfaa6d-secret-volume\") pod \"collect-profiles-29330775-wcn9z\" (UID: \"cb83bc05-c755-4cac-9e66-cf1f0ecfaa6d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330775-wcn9z" Oct 07 14:15:00 crc kubenswrapper[4854]: I1007 14:15:00.474494 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ll9c\" (UniqueName: \"kubernetes.io/projected/cb83bc05-c755-4cac-9e66-cf1f0ecfaa6d-kube-api-access-6ll9c\") pod \"collect-profiles-29330775-wcn9z\" (UID: \"cb83bc05-c755-4cac-9e66-cf1f0ecfaa6d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330775-wcn9z" Oct 07 14:15:00 crc kubenswrapper[4854]: I1007 14:15:00.526786 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330775-wcn9z" Oct 07 14:15:01 crc kubenswrapper[4854]: I1007 14:15:01.049568 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330775-wcn9z"] Oct 07 14:15:01 crc kubenswrapper[4854]: W1007 14:15:01.061989 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb83bc05_c755_4cac_9e66_cf1f0ecfaa6d.slice/crio-49831c30b318ce5d1b27fdd944b8187ab7e8fb97b3522d183c8e42884d9b1ef0 WatchSource:0}: Error finding container 49831c30b318ce5d1b27fdd944b8187ab7e8fb97b3522d183c8e42884d9b1ef0: Status 404 returned error can't find the container with id 49831c30b318ce5d1b27fdd944b8187ab7e8fb97b3522d183c8e42884d9b1ef0 Oct 07 14:15:02 crc kubenswrapper[4854]: I1007 14:15:02.019984 4854 generic.go:334] "Generic (PLEG): container finished" podID="cb83bc05-c755-4cac-9e66-cf1f0ecfaa6d" containerID="b75a32137dae60a5b40000a02c61eb34fa49e9c81e786da904141ab3b376d898" exitCode=0 Oct 07 14:15:02 crc kubenswrapper[4854]: I1007 14:15:02.020070 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330775-wcn9z" event={"ID":"cb83bc05-c755-4cac-9e66-cf1f0ecfaa6d","Type":"ContainerDied","Data":"b75a32137dae60a5b40000a02c61eb34fa49e9c81e786da904141ab3b376d898"} Oct 07 14:15:02 crc kubenswrapper[4854]: I1007 14:15:02.020495 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330775-wcn9z" event={"ID":"cb83bc05-c755-4cac-9e66-cf1f0ecfaa6d","Type":"ContainerStarted","Data":"49831c30b318ce5d1b27fdd944b8187ab7e8fb97b3522d183c8e42884d9b1ef0"} Oct 07 14:15:03 crc kubenswrapper[4854]: I1007 14:15:03.512291 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330775-wcn9z" Oct 07 14:15:03 crc kubenswrapper[4854]: I1007 14:15:03.619631 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cb83bc05-c755-4cac-9e66-cf1f0ecfaa6d-config-volume\") pod \"cb83bc05-c755-4cac-9e66-cf1f0ecfaa6d\" (UID: \"cb83bc05-c755-4cac-9e66-cf1f0ecfaa6d\") " Oct 07 14:15:03 crc kubenswrapper[4854]: I1007 14:15:03.619949 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ll9c\" (UniqueName: \"kubernetes.io/projected/cb83bc05-c755-4cac-9e66-cf1f0ecfaa6d-kube-api-access-6ll9c\") pod \"cb83bc05-c755-4cac-9e66-cf1f0ecfaa6d\" (UID: \"cb83bc05-c755-4cac-9e66-cf1f0ecfaa6d\") " Oct 07 14:15:03 crc kubenswrapper[4854]: I1007 14:15:03.620087 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cb83bc05-c755-4cac-9e66-cf1f0ecfaa6d-secret-volume\") pod \"cb83bc05-c755-4cac-9e66-cf1f0ecfaa6d\" (UID: \"cb83bc05-c755-4cac-9e66-cf1f0ecfaa6d\") " Oct 07 14:15:03 crc kubenswrapper[4854]: I1007 14:15:03.620483 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb83bc05-c755-4cac-9e66-cf1f0ecfaa6d-config-volume" (OuterVolumeSpecName: "config-volume") pod "cb83bc05-c755-4cac-9e66-cf1f0ecfaa6d" (UID: "cb83bc05-c755-4cac-9e66-cf1f0ecfaa6d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:15:03 crc kubenswrapper[4854]: I1007 14:15:03.620859 4854 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cb83bc05-c755-4cac-9e66-cf1f0ecfaa6d-config-volume\") on node \"crc\" DevicePath \"\"" Oct 07 14:15:03 crc kubenswrapper[4854]: I1007 14:15:03.626863 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb83bc05-c755-4cac-9e66-cf1f0ecfaa6d-kube-api-access-6ll9c" (OuterVolumeSpecName: "kube-api-access-6ll9c") pod "cb83bc05-c755-4cac-9e66-cf1f0ecfaa6d" (UID: "cb83bc05-c755-4cac-9e66-cf1f0ecfaa6d"). InnerVolumeSpecName "kube-api-access-6ll9c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:15:03 crc kubenswrapper[4854]: I1007 14:15:03.628002 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb83bc05-c755-4cac-9e66-cf1f0ecfaa6d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "cb83bc05-c755-4cac-9e66-cf1f0ecfaa6d" (UID: "cb83bc05-c755-4cac-9e66-cf1f0ecfaa6d"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:15:03 crc kubenswrapper[4854]: I1007 14:15:03.722186 4854 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cb83bc05-c755-4cac-9e66-cf1f0ecfaa6d-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 07 14:15:03 crc kubenswrapper[4854]: I1007 14:15:03.722215 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ll9c\" (UniqueName: \"kubernetes.io/projected/cb83bc05-c755-4cac-9e66-cf1f0ecfaa6d-kube-api-access-6ll9c\") on node \"crc\" DevicePath \"\"" Oct 07 14:15:04 crc kubenswrapper[4854]: I1007 14:15:04.047948 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330775-wcn9z" event={"ID":"cb83bc05-c755-4cac-9e66-cf1f0ecfaa6d","Type":"ContainerDied","Data":"49831c30b318ce5d1b27fdd944b8187ab7e8fb97b3522d183c8e42884d9b1ef0"} Oct 07 14:15:04 crc kubenswrapper[4854]: I1007 14:15:04.047997 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49831c30b318ce5d1b27fdd944b8187ab7e8fb97b3522d183c8e42884d9b1ef0" Oct 07 14:15:04 crc kubenswrapper[4854]: I1007 14:15:04.048053 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330775-wcn9z" Oct 07 14:15:04 crc kubenswrapper[4854]: I1007 14:15:04.633944 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330730-hcsm4"] Oct 07 14:15:04 crc kubenswrapper[4854]: I1007 14:15:04.641302 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330730-hcsm4"] Oct 07 14:15:04 crc kubenswrapper[4854]: I1007 14:15:04.713350 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd6d7fce-8018-4050-94e3-efbff9794019" path="/var/lib/kubelet/pods/dd6d7fce-8018-4050-94e3-efbff9794019/volumes" Oct 07 14:15:26 crc kubenswrapper[4854]: I1007 14:15:26.939287 4854 scope.go:117] "RemoveContainer" containerID="d5f12edd93de68ea6e78de6731e614869c4c7f57e32f0fbe91101f79340683ec" Oct 07 14:15:40 crc kubenswrapper[4854]: I1007 14:15:40.807991 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:15:40 crc kubenswrapper[4854]: I1007 14:15:40.808720 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:16:10 crc kubenswrapper[4854]: I1007 14:16:10.814127 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:16:10 crc kubenswrapper[4854]: I1007 14:16:10.814688 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:16:40 crc kubenswrapper[4854]: I1007 14:16:40.807819 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:16:40 crc kubenswrapper[4854]: I1007 14:16:40.808458 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:16:40 crc kubenswrapper[4854]: I1007 14:16:40.808666 4854 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" Oct 07 14:16:40 crc kubenswrapper[4854]: I1007 14:16:40.809887 4854 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"98256c42a54e94b9b889e5d466e2ae1d1852c7195bf7391d9da923423a42d1c7"} pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 14:16:40 crc kubenswrapper[4854]: I1007 14:16:40.809975 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" containerID="cri-o://98256c42a54e94b9b889e5d466e2ae1d1852c7195bf7391d9da923423a42d1c7" gracePeriod=600 Oct 07 14:16:40 crc kubenswrapper[4854]: E1007 14:16:40.946497 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:16:41 crc kubenswrapper[4854]: I1007 14:16:41.249690 4854 generic.go:334] "Generic (PLEG): container finished" podID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerID="98256c42a54e94b9b889e5d466e2ae1d1852c7195bf7391d9da923423a42d1c7" exitCode=0 Oct 07 14:16:41 crc kubenswrapper[4854]: I1007 14:16:41.249753 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" event={"ID":"40b8b82d-cfd5-41d7-8673-5774db092c85","Type":"ContainerDied","Data":"98256c42a54e94b9b889e5d466e2ae1d1852c7195bf7391d9da923423a42d1c7"} Oct 07 14:16:41 crc kubenswrapper[4854]: I1007 14:16:41.249792 4854 scope.go:117] "RemoveContainer" containerID="23da9ad8d99b4f3fecff0a63dbca297f16a76579dd1005e295268c0507480214" Oct 07 14:16:41 crc kubenswrapper[4854]: I1007 14:16:41.250502 4854 scope.go:117] "RemoveContainer" containerID="98256c42a54e94b9b889e5d466e2ae1d1852c7195bf7391d9da923423a42d1c7" Oct 07 14:16:41 crc kubenswrapper[4854]: E1007 14:16:41.250841 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:16:55 crc kubenswrapper[4854]: I1007 14:16:55.703631 4854 scope.go:117] "RemoveContainer" containerID="98256c42a54e94b9b889e5d466e2ae1d1852c7195bf7391d9da923423a42d1c7" Oct 07 14:16:55 crc kubenswrapper[4854]: E1007 14:16:55.704475 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:16:58 crc kubenswrapper[4854]: I1007 14:16:58.065522 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-nt4wp"] Oct 07 14:16:58 crc kubenswrapper[4854]: I1007 14:16:58.078005 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-nt4wp"] Oct 07 14:16:58 crc kubenswrapper[4854]: I1007 14:16:58.716583 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a7de750-cb21-40dd-a6d9-c845a77cd0e5" path="/var/lib/kubelet/pods/5a7de750-cb21-40dd-a6d9-c845a77cd0e5/volumes" Oct 07 14:17:08 crc kubenswrapper[4854]: I1007 14:17:08.046277 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-a239-account-create-r8js5"] Oct 07 14:17:08 crc kubenswrapper[4854]: I1007 14:17:08.060017 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-a239-account-create-r8js5"] Oct 07 14:17:08 crc kubenswrapper[4854]: I1007 14:17:08.715355 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="154324fd-9a6c-421c-aa91-3b3cb4f5d842" path="/var/lib/kubelet/pods/154324fd-9a6c-421c-aa91-3b3cb4f5d842/volumes" Oct 07 14:17:10 crc kubenswrapper[4854]: I1007 14:17:10.703171 4854 scope.go:117] "RemoveContainer" containerID="98256c42a54e94b9b889e5d466e2ae1d1852c7195bf7391d9da923423a42d1c7" Oct 07 14:17:10 crc kubenswrapper[4854]: E1007 14:17:10.703609 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:17:21 crc kubenswrapper[4854]: I1007 14:17:21.042842 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-5sl76"] Oct 07 14:17:21 crc kubenswrapper[4854]: I1007 14:17:21.059884 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-5sl76"] Oct 07 14:17:22 crc kubenswrapper[4854]: I1007 14:17:22.703477 4854 scope.go:117] "RemoveContainer" containerID="98256c42a54e94b9b889e5d466e2ae1d1852c7195bf7391d9da923423a42d1c7" Oct 07 14:17:22 crc kubenswrapper[4854]: E1007 14:17:22.704507 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:17:22 crc kubenswrapper[4854]: I1007 14:17:22.720404 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10373da9-c986-4dee-853d-9bcc9892b5c1" path="/var/lib/kubelet/pods/10373da9-c986-4dee-853d-9bcc9892b5c1/volumes" Oct 07 14:17:27 crc kubenswrapper[4854]: I1007 14:17:27.070664 4854 scope.go:117] "RemoveContainer" containerID="3e06c9ff028286cf5698cc142e8dea349fc8ca7ab64e62eef9f6592629d6d9e9" Oct 07 14:17:27 crc kubenswrapper[4854]: I1007 14:17:27.108571 4854 scope.go:117] "RemoveContainer" containerID="2a86ef79dce0dfb289dc6586d81d23e3efda6f45a6968d50210066e280274609" Oct 07 14:17:27 crc kubenswrapper[4854]: I1007 14:17:27.174694 4854 scope.go:117] "RemoveContainer" containerID="ac1981ef1b44c97ceaeeb6a070174463d2109817ab081269d11a107a32aa75d9" Oct 07 14:17:36 crc kubenswrapper[4854]: I1007 14:17:36.703116 4854 scope.go:117] "RemoveContainer" containerID="98256c42a54e94b9b889e5d466e2ae1d1852c7195bf7391d9da923423a42d1c7" Oct 07 14:17:36 crc kubenswrapper[4854]: E1007 14:17:36.703876 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:17:51 crc kubenswrapper[4854]: I1007 14:17:51.704694 4854 scope.go:117] "RemoveContainer" containerID="98256c42a54e94b9b889e5d466e2ae1d1852c7195bf7391d9da923423a42d1c7" Oct 07 14:17:51 crc kubenswrapper[4854]: E1007 14:17:51.705938 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:18:04 crc kubenswrapper[4854]: I1007 14:18:04.025766 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jxt5b"] Oct 07 14:18:04 crc kubenswrapper[4854]: E1007 14:18:04.027430 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb83bc05-c755-4cac-9e66-cf1f0ecfaa6d" containerName="collect-profiles" Oct 07 14:18:04 crc kubenswrapper[4854]: I1007 14:18:04.027467 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb83bc05-c755-4cac-9e66-cf1f0ecfaa6d" containerName="collect-profiles" Oct 07 14:18:04 crc kubenswrapper[4854]: I1007 14:18:04.028088 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb83bc05-c755-4cac-9e66-cf1f0ecfaa6d" containerName="collect-profiles" Oct 07 14:18:04 crc kubenswrapper[4854]: I1007 14:18:04.033273 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jxt5b" Oct 07 14:18:04 crc kubenswrapper[4854]: I1007 14:18:04.035166 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jxt5b"] Oct 07 14:18:04 crc kubenswrapper[4854]: I1007 14:18:04.227223 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qplv4\" (UniqueName: \"kubernetes.io/projected/c1a19938-4ae0-4e68-9786-2d68a837f668-kube-api-access-qplv4\") pod \"certified-operators-jxt5b\" (UID: \"c1a19938-4ae0-4e68-9786-2d68a837f668\") " pod="openshift-marketplace/certified-operators-jxt5b" Oct 07 14:18:04 crc kubenswrapper[4854]: I1007 14:18:04.228103 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1a19938-4ae0-4e68-9786-2d68a837f668-catalog-content\") pod \"certified-operators-jxt5b\" (UID: \"c1a19938-4ae0-4e68-9786-2d68a837f668\") " pod="openshift-marketplace/certified-operators-jxt5b" Oct 07 14:18:04 crc kubenswrapper[4854]: I1007 14:18:04.228205 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1a19938-4ae0-4e68-9786-2d68a837f668-utilities\") pod \"certified-operators-jxt5b\" (UID: \"c1a19938-4ae0-4e68-9786-2d68a837f668\") " pod="openshift-marketplace/certified-operators-jxt5b" Oct 07 14:18:04 crc kubenswrapper[4854]: I1007 14:18:04.329927 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1a19938-4ae0-4e68-9786-2d68a837f668-catalog-content\") pod \"certified-operators-jxt5b\" (UID: \"c1a19938-4ae0-4e68-9786-2d68a837f668\") " pod="openshift-marketplace/certified-operators-jxt5b" Oct 07 14:18:04 crc kubenswrapper[4854]: I1007 14:18:04.330401 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1a19938-4ae0-4e68-9786-2d68a837f668-catalog-content\") pod \"certified-operators-jxt5b\" (UID: \"c1a19938-4ae0-4e68-9786-2d68a837f668\") " pod="openshift-marketplace/certified-operators-jxt5b" Oct 07 14:18:04 crc kubenswrapper[4854]: I1007 14:18:04.331505 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1a19938-4ae0-4e68-9786-2d68a837f668-utilities\") pod \"certified-operators-jxt5b\" (UID: \"c1a19938-4ae0-4e68-9786-2d68a837f668\") " pod="openshift-marketplace/certified-operators-jxt5b" Oct 07 14:18:04 crc kubenswrapper[4854]: I1007 14:18:04.331800 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1a19938-4ae0-4e68-9786-2d68a837f668-utilities\") pod \"certified-operators-jxt5b\" (UID: \"c1a19938-4ae0-4e68-9786-2d68a837f668\") " pod="openshift-marketplace/certified-operators-jxt5b" Oct 07 14:18:04 crc kubenswrapper[4854]: I1007 14:18:04.331943 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qplv4\" (UniqueName: \"kubernetes.io/projected/c1a19938-4ae0-4e68-9786-2d68a837f668-kube-api-access-qplv4\") pod \"certified-operators-jxt5b\" (UID: \"c1a19938-4ae0-4e68-9786-2d68a837f668\") " pod="openshift-marketplace/certified-operators-jxt5b" Oct 07 14:18:04 crc kubenswrapper[4854]: I1007 14:18:04.351798 4854 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-qplv4\" (UniqueName: \"kubernetes.io/projected/c1a19938-4ae0-4e68-9786-2d68a837f668-kube-api-access-qplv4\") pod \"certified-operators-jxt5b\" (UID: \"c1a19938-4ae0-4e68-9786-2d68a837f668\") " pod="openshift-marketplace/certified-operators-jxt5b" Oct 07 14:18:04 crc kubenswrapper[4854]: I1007 14:18:04.362661 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jxt5b" Oct 07 14:18:04 crc kubenswrapper[4854]: I1007 14:18:04.714310 4854 scope.go:117] "RemoveContainer" containerID="98256c42a54e94b9b889e5d466e2ae1d1852c7195bf7391d9da923423a42d1c7" Oct 07 14:18:04 crc kubenswrapper[4854]: E1007 14:18:04.719589 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:18:04 crc kubenswrapper[4854]: W1007 14:18:04.972088 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1a19938_4ae0_4e68_9786_2d68a837f668.slice/crio-3acafe8096bc6b339e574cd8c4e4ea822c312c44d4688789d05e71992712971f WatchSource:0}: Error finding container 3acafe8096bc6b339e574cd8c4e4ea822c312c44d4688789d05e71992712971f: Status 404 returned error can't find the container with id 3acafe8096bc6b339e574cd8c4e4ea822c312c44d4688789d05e71992712971f Oct 07 14:18:04 crc kubenswrapper[4854]: I1007 14:18:04.986411 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jxt5b"] Oct 07 14:18:05 crc kubenswrapper[4854]: I1007 14:18:05.212361 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jxt5b" event={"ID":"c1a19938-4ae0-4e68-9786-2d68a837f668","Type":"ContainerStarted","Data":"3acafe8096bc6b339e574cd8c4e4ea822c312c44d4688789d05e71992712971f"} Oct 07 14:18:06 crc kubenswrapper[4854]: I1007 14:18:06.234391 4854 generic.go:334] "Generic (PLEG): container finished" podID="c1a19938-4ae0-4e68-9786-2d68a837f668" containerID="e639aa655e9e0d79f04d5ffd60e8455b35d472792a010b9ac139e869d9795d0b" exitCode=0 Oct 07 14:18:06 crc kubenswrapper[4854]: I1007 14:18:06.234454 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jxt5b" event={"ID":"c1a19938-4ae0-4e68-9786-2d68a837f668","Type":"ContainerDied","Data":"e639aa655e9e0d79f04d5ffd60e8455b35d472792a010b9ac139e869d9795d0b"} Oct 07 14:18:06 crc kubenswrapper[4854]: I1007 14:18:06.237088 4854 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 14:18:08 crc kubenswrapper[4854]: I1007 14:18:08.265887 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jxt5b" event={"ID":"c1a19938-4ae0-4e68-9786-2d68a837f668","Type":"ContainerStarted","Data":"41edde8b3d7331b2ad07b4f2fa4cf5025429996a46752cfd812b568806de80c8"} Oct 07 14:18:09 crc kubenswrapper[4854]: I1007 14:18:09.281416 4854 generic.go:334] "Generic (PLEG): container finished" podID="c1a19938-4ae0-4e68-9786-2d68a837f668" containerID="41edde8b3d7331b2ad07b4f2fa4cf5025429996a46752cfd812b568806de80c8" exitCode=0 Oct 07 14:18:09 crc 
kubenswrapper[4854]: I1007 14:18:09.281524 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jxt5b" event={"ID":"c1a19938-4ae0-4e68-9786-2d68a837f668","Type":"ContainerDied","Data":"41edde8b3d7331b2ad07b4f2fa4cf5025429996a46752cfd812b568806de80c8"} Oct 07 14:18:10 crc kubenswrapper[4854]: I1007 14:18:10.295975 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jxt5b" event={"ID":"c1a19938-4ae0-4e68-9786-2d68a837f668","Type":"ContainerStarted","Data":"206230f9201bc383547798793610bb9da2a119de1ec17d09ae6e8b0fd20f493c"} Oct 07 14:18:10 crc kubenswrapper[4854]: I1007 14:18:10.326754 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jxt5b" podStartSLOduration=3.840634612 podStartE2EDuration="7.326708554s" podCreationTimestamp="2025-10-07 14:18:03 +0000 UTC" firstStartedPulling="2025-10-07 14:18:06.236841293 +0000 UTC m=+6802.224673558" lastFinishedPulling="2025-10-07 14:18:09.722915225 +0000 UTC m=+6805.710747500" observedRunningTime="2025-10-07 14:18:10.31962036 +0000 UTC m=+6806.307452625" watchObservedRunningTime="2025-10-07 14:18:10.326708554 +0000 UTC m=+6806.314540829" Oct 07 14:18:14 crc kubenswrapper[4854]: I1007 14:18:14.367328 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jxt5b" Oct 07 14:18:14 crc kubenswrapper[4854]: I1007 14:18:14.368223 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jxt5b" Oct 07 14:18:14 crc kubenswrapper[4854]: I1007 14:18:14.444969 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jxt5b" Oct 07 14:18:15 crc kubenswrapper[4854]: I1007 14:18:15.435770 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jxt5b" Oct 07 14:18:15 crc kubenswrapper[4854]: I1007 14:18:15.509006 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jxt5b"] Oct 07 14:18:16 crc kubenswrapper[4854]: I1007 14:18:16.703381 4854 scope.go:117] "RemoveContainer" containerID="98256c42a54e94b9b889e5d466e2ae1d1852c7195bf7391d9da923423a42d1c7" Oct 07 14:18:16 crc kubenswrapper[4854]: E1007 14:18:16.704572 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:18:17 crc kubenswrapper[4854]: I1007 14:18:17.374547 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jxt5b" podUID="c1a19938-4ae0-4e68-9786-2d68a837f668" containerName="registry-server" containerID="cri-o://206230f9201bc383547798793610bb9da2a119de1ec17d09ae6e8b0fd20f493c" gracePeriod=2 Oct 07 14:18:17 crc kubenswrapper[4854]: I1007 14:18:17.885382 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jxt5b" Oct 07 14:18:17 crc kubenswrapper[4854]: I1007 14:18:17.939879 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1a19938-4ae0-4e68-9786-2d68a837f668-catalog-content\") pod \"c1a19938-4ae0-4e68-9786-2d68a837f668\" (UID: \"c1a19938-4ae0-4e68-9786-2d68a837f668\") " Oct 07 14:18:17 crc kubenswrapper[4854]: I1007 14:18:17.940098 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qplv4\" (UniqueName: \"kubernetes.io/projected/c1a19938-4ae0-4e68-9786-2d68a837f668-kube-api-access-qplv4\") pod \"c1a19938-4ae0-4e68-9786-2d68a837f668\" (UID: \"c1a19938-4ae0-4e68-9786-2d68a837f668\") " Oct 07 14:18:17 crc kubenswrapper[4854]: I1007 14:18:17.940170 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1a19938-4ae0-4e68-9786-2d68a837f668-utilities\") pod \"c1a19938-4ae0-4e68-9786-2d68a837f668\" (UID: \"c1a19938-4ae0-4e68-9786-2d68a837f668\") " Oct 07 14:18:17 crc kubenswrapper[4854]: I1007 14:18:17.941067 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1a19938-4ae0-4e68-9786-2d68a837f668-utilities" (OuterVolumeSpecName: "utilities") pod "c1a19938-4ae0-4e68-9786-2d68a837f668" (UID: "c1a19938-4ae0-4e68-9786-2d68a837f668"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:18:17 crc kubenswrapper[4854]: I1007 14:18:17.946343 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1a19938-4ae0-4e68-9786-2d68a837f668-kube-api-access-qplv4" (OuterVolumeSpecName: "kube-api-access-qplv4") pod "c1a19938-4ae0-4e68-9786-2d68a837f668" (UID: "c1a19938-4ae0-4e68-9786-2d68a837f668"). InnerVolumeSpecName "kube-api-access-qplv4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:18:17 crc kubenswrapper[4854]: I1007 14:18:17.986291 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1a19938-4ae0-4e68-9786-2d68a837f668-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c1a19938-4ae0-4e68-9786-2d68a837f668" (UID: "c1a19938-4ae0-4e68-9786-2d68a837f668"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:18:18 crc kubenswrapper[4854]: I1007 14:18:18.042927 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qplv4\" (UniqueName: \"kubernetes.io/projected/c1a19938-4ae0-4e68-9786-2d68a837f668-kube-api-access-qplv4\") on node \"crc\" DevicePath \"\"" Oct 07 14:18:18 crc kubenswrapper[4854]: I1007 14:18:18.042967 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1a19938-4ae0-4e68-9786-2d68a837f668-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 14:18:18 crc kubenswrapper[4854]: I1007 14:18:18.042986 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1a19938-4ae0-4e68-9786-2d68a837f668-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 14:18:18 crc kubenswrapper[4854]: I1007 14:18:18.392396 4854 generic.go:334] "Generic (PLEG): container finished" podID="c1a19938-4ae0-4e68-9786-2d68a837f668" containerID="206230f9201bc383547798793610bb9da2a119de1ec17d09ae6e8b0fd20f493c" exitCode=0 Oct 07 14:18:18 crc kubenswrapper[4854]: I1007 14:18:18.392479 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jxt5b" event={"ID":"c1a19938-4ae0-4e68-9786-2d68a837f668","Type":"ContainerDied","Data":"206230f9201bc383547798793610bb9da2a119de1ec17d09ae6e8b0fd20f493c"} Oct 07 14:18:18 crc kubenswrapper[4854]: I1007 14:18:18.392878 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jxt5b" event={"ID":"c1a19938-4ae0-4e68-9786-2d68a837f668","Type":"ContainerDied","Data":"3acafe8096bc6b339e574cd8c4e4ea822c312c44d4688789d05e71992712971f"} Oct 07 14:18:18 crc kubenswrapper[4854]: I1007 14:18:18.392912 4854 scope.go:117] "RemoveContainer" containerID="206230f9201bc383547798793610bb9da2a119de1ec17d09ae6e8b0fd20f493c" Oct 07 14:18:18 crc kubenswrapper[4854]: I1007 14:18:18.392559 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jxt5b" Oct 07 14:18:18 crc kubenswrapper[4854]: I1007 14:18:18.442311 4854 scope.go:117] "RemoveContainer" containerID="41edde8b3d7331b2ad07b4f2fa4cf5025429996a46752cfd812b568806de80c8" Oct 07 14:18:18 crc kubenswrapper[4854]: I1007 14:18:18.460133 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jxt5b"] Oct 07 14:18:18 crc kubenswrapper[4854]: I1007 14:18:18.495040 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jxt5b"] Oct 07 14:18:18 crc kubenswrapper[4854]: I1007 14:18:18.525912 4854 scope.go:117] "RemoveContainer" containerID="e639aa655e9e0d79f04d5ffd60e8455b35d472792a010b9ac139e869d9795d0b" Oct 07 14:18:18 crc kubenswrapper[4854]: I1007 14:18:18.560189 4854 scope.go:117] "RemoveContainer" containerID="206230f9201bc383547798793610bb9da2a119de1ec17d09ae6e8b0fd20f493c" Oct 07 14:18:18 crc kubenswrapper[4854]: E1007 14:18:18.560704 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"206230f9201bc383547798793610bb9da2a119de1ec17d09ae6e8b0fd20f493c\": container with ID starting with 206230f9201bc383547798793610bb9da2a119de1ec17d09ae6e8b0fd20f493c not found: ID does not exist" containerID="206230f9201bc383547798793610bb9da2a119de1ec17d09ae6e8b0fd20f493c" Oct 07 14:18:18 crc kubenswrapper[4854]: I1007 14:18:18.560756 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"206230f9201bc383547798793610bb9da2a119de1ec17d09ae6e8b0fd20f493c"} err="failed to get container status \"206230f9201bc383547798793610bb9da2a119de1ec17d09ae6e8b0fd20f493c\": rpc error: code = NotFound desc = could not find container \"206230f9201bc383547798793610bb9da2a119de1ec17d09ae6e8b0fd20f493c\": container with ID starting with 206230f9201bc383547798793610bb9da2a119de1ec17d09ae6e8b0fd20f493c not found: ID does not exist" Oct 07 14:18:18 crc kubenswrapper[4854]: I1007 14:18:18.560789 4854 scope.go:117] "RemoveContainer" containerID="41edde8b3d7331b2ad07b4f2fa4cf5025429996a46752cfd812b568806de80c8" Oct 07 14:18:18 crc kubenswrapper[4854]: E1007 14:18:18.561255 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41edde8b3d7331b2ad07b4f2fa4cf5025429996a46752cfd812b568806de80c8\": container with ID starting with 41edde8b3d7331b2ad07b4f2fa4cf5025429996a46752cfd812b568806de80c8 not found: ID does not exist" containerID="41edde8b3d7331b2ad07b4f2fa4cf5025429996a46752cfd812b568806de80c8" Oct 07 14:18:18 crc kubenswrapper[4854]: I1007 14:18:18.561452 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41edde8b3d7331b2ad07b4f2fa4cf5025429996a46752cfd812b568806de80c8"} err="failed to get container status \"41edde8b3d7331b2ad07b4f2fa4cf5025429996a46752cfd812b568806de80c8\": rpc error: code = NotFound desc = could not find container \"41edde8b3d7331b2ad07b4f2fa4cf5025429996a46752cfd812b568806de80c8\": container with ID starting with 41edde8b3d7331b2ad07b4f2fa4cf5025429996a46752cfd812b568806de80c8 not found: ID does not exist" Oct 07 14:18:18 crc kubenswrapper[4854]: I1007 14:18:18.561614 4854 scope.go:117] "RemoveContainer" containerID="e639aa655e9e0d79f04d5ffd60e8455b35d472792a010b9ac139e869d9795d0b" Oct 07 14:18:18 crc kubenswrapper[4854]: E1007 14:18:18.562325 4854 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"e639aa655e9e0d79f04d5ffd60e8455b35d472792a010b9ac139e869d9795d0b\": container with ID starting with e639aa655e9e0d79f04d5ffd60e8455b35d472792a010b9ac139e869d9795d0b not found: ID does not exist" containerID="e639aa655e9e0d79f04d5ffd60e8455b35d472792a010b9ac139e869d9795d0b" Oct 07 14:18:18 crc kubenswrapper[4854]: I1007 14:18:18.562358 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e639aa655e9e0d79f04d5ffd60e8455b35d472792a010b9ac139e869d9795d0b"} err="failed to get container status \"e639aa655e9e0d79f04d5ffd60e8455b35d472792a010b9ac139e869d9795d0b\": rpc error: code = NotFound desc = could not find container \"e639aa655e9e0d79f04d5ffd60e8455b35d472792a010b9ac139e869d9795d0b\": container with ID starting with e639aa655e9e0d79f04d5ffd60e8455b35d472792a010b9ac139e869d9795d0b not found: ID does not exist" Oct 07 14:18:18 crc kubenswrapper[4854]: I1007 14:18:18.719094 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1a19938-4ae0-4e68-9786-2d68a837f668" path="/var/lib/kubelet/pods/c1a19938-4ae0-4e68-9786-2d68a837f668/volumes" Oct 07 14:18:28 crc kubenswrapper[4854]: I1007 14:18:28.703323 4854 scope.go:117] "RemoveContainer" containerID="98256c42a54e94b9b889e5d466e2ae1d1852c7195bf7391d9da923423a42d1c7" Oct 07 14:18:28 crc kubenswrapper[4854]: E1007 14:18:28.704416 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:18:40 crc kubenswrapper[4854]: I1007 14:18:40.703759 4854 scope.go:117] "RemoveContainer" containerID="98256c42a54e94b9b889e5d466e2ae1d1852c7195bf7391d9da923423a42d1c7" Oct 07 14:18:40 crc kubenswrapper[4854]: E1007 14:18:40.705465 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:18:55 crc kubenswrapper[4854]: I1007 14:18:55.704308 4854 scope.go:117] "RemoveContainer" containerID="98256c42a54e94b9b889e5d466e2ae1d1852c7195bf7391d9da923423a42d1c7" Oct 07 14:18:55 crc kubenswrapper[4854]: E1007 14:18:55.705432 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:19:06 crc kubenswrapper[4854]: I1007 14:19:06.703348 4854 scope.go:117] "RemoveContainer" containerID="98256c42a54e94b9b889e5d466e2ae1d1852c7195bf7391d9da923423a42d1c7" Oct 07 14:19:06 crc kubenswrapper[4854]: E1007 14:19:06.704652 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:19:20 crc kubenswrapper[4854]: I1007 14:19:20.703733 4854 scope.go:117] "RemoveContainer" containerID="98256c42a54e94b9b889e5d466e2ae1d1852c7195bf7391d9da923423a42d1c7" Oct 07 14:19:20 crc kubenswrapper[4854]: E1007 14:19:20.705177 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:19:28 crc kubenswrapper[4854]: I1007 14:19:28.062625 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-lzlxs"] Oct 07 14:19:28 crc kubenswrapper[4854]: I1007 14:19:28.073930 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-lzlxs"] Oct 07 14:19:28 crc kubenswrapper[4854]: I1007 14:19:28.716418 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9083bc7a-21f0-4620-bb10-15aa3011034d" path="/var/lib/kubelet/pods/9083bc7a-21f0-4620-bb10-15aa3011034d/volumes" Oct 07 14:19:34 crc kubenswrapper[4854]: I1007 14:19:34.711730 4854 scope.go:117] "RemoveContainer" containerID="98256c42a54e94b9b889e5d466e2ae1d1852c7195bf7391d9da923423a42d1c7" Oct 07 14:19:34 crc kubenswrapper[4854]: E1007 14:19:34.712576 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:19:38 crc kubenswrapper[4854]: I1007 14:19:38.058644 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-7966-account-create-9t5bt"] Oct 07 14:19:38 crc kubenswrapper[4854]: I1007 14:19:38.074529 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-7966-account-create-9t5bt"] Oct 07 14:19:38 crc kubenswrapper[4854]: I1007 14:19:38.723673 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ada900e0-3cee-4921-bf22-24d4b90a3297" path="/var/lib/kubelet/pods/ada900e0-3cee-4921-bf22-24d4b90a3297/volumes" Oct 07 14:19:46 crc kubenswrapper[4854]: I1007 14:19:46.703772 4854 scope.go:117] "RemoveContainer" containerID="98256c42a54e94b9b889e5d466e2ae1d1852c7195bf7391d9da923423a42d1c7" Oct 07 14:19:46 crc kubenswrapper[4854]: E1007 14:19:46.704968 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:19:48 crc kubenswrapper[4854]: I1007 
14:19:48.050851 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-4gtk9"] Oct 07 14:19:48 crc kubenswrapper[4854]: I1007 14:19:48.067796 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-4gtk9"] Oct 07 14:19:48 crc kubenswrapper[4854]: I1007 14:19:48.723603 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2290c3b-653d-4e9f-96be-1ce645310b22" path="/var/lib/kubelet/pods/e2290c3b-653d-4e9f-96be-1ce645310b22/volumes" Oct 07 14:20:00 crc kubenswrapper[4854]: I1007 14:20:00.703735 4854 scope.go:117] "RemoveContainer" containerID="98256c42a54e94b9b889e5d466e2ae1d1852c7195bf7391d9da923423a42d1c7" Oct 07 14:20:00 crc kubenswrapper[4854]: E1007 14:20:00.705497 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:20:08 crc kubenswrapper[4854]: I1007 14:20:08.062432 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-tfg25"] Oct 07 14:20:08 crc kubenswrapper[4854]: I1007 14:20:08.077424 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-tfg25"] Oct 07 14:20:08 crc kubenswrapper[4854]: I1007 14:20:08.720724 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0046cb69-239e-4960-a070-dd6d9c3b6b72" path="/var/lib/kubelet/pods/0046cb69-239e-4960-a070-dd6d9c3b6b72/volumes" Oct 07 14:20:14 crc kubenswrapper[4854]: I1007 14:20:14.711506 4854 scope.go:117] "RemoveContainer" containerID="98256c42a54e94b9b889e5d466e2ae1d1852c7195bf7391d9da923423a42d1c7" Oct 07 14:20:14 crc kubenswrapper[4854]: E1007 14:20:14.714570 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:20:18 crc kubenswrapper[4854]: I1007 14:20:18.031921 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-38f4-account-create-qldhn"] Oct 07 14:20:18 crc kubenswrapper[4854]: I1007 14:20:18.043481 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-38f4-account-create-qldhn"] Oct 07 14:20:18 crc kubenswrapper[4854]: I1007 14:20:18.729343 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e6d4cd5-ca3a-4d65-9bd5-5d063cda169c" path="/var/lib/kubelet/pods/1e6d4cd5-ca3a-4d65-9bd5-5d063cda169c/volumes" Oct 07 14:20:27 crc kubenswrapper[4854]: I1007 14:20:27.416231 4854 scope.go:117] "RemoveContainer" containerID="76987baad39c168f1d37164d5c27fc07893b828c214f9819ddc6a4e29cf15ca2" Oct 07 14:20:27 crc kubenswrapper[4854]: I1007 14:20:27.457354 4854 scope.go:117] "RemoveContainer" containerID="8f3452eac734a7fb33369eb6903d47315593c7f583357b468335361af056926b" Oct 07 14:20:27 crc kubenswrapper[4854]: I1007 14:20:27.534281 4854 scope.go:117] "RemoveContainer" containerID="fa81e2550057dc26f74a34254ef8654d301b65a5499a926440b3801b9d78016e" Oct 07 
14:20:27 crc kubenswrapper[4854]: I1007 14:20:27.585298 4854 scope.go:117] "RemoveContainer" containerID="8846691a293b3044399849c8b9837157b6c67e808ae125e0383424a7431ab52f" Oct 07 14:20:27 crc kubenswrapper[4854]: I1007 14:20:27.625833 4854 scope.go:117] "RemoveContainer" containerID="a9ee70c7ef6997db8a30cd9a91cae1bcc82ec5149b32e4c38d268365d65485ac" Oct 07 14:20:29 crc kubenswrapper[4854]: I1007 14:20:29.035646 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-lgwsm"] Oct 07 14:20:29 crc kubenswrapper[4854]: I1007 14:20:29.043486 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-lgwsm"] Oct 07 14:20:29 crc kubenswrapper[4854]: I1007 14:20:29.702496 4854 scope.go:117] "RemoveContainer" containerID="98256c42a54e94b9b889e5d466e2ae1d1852c7195bf7391d9da923423a42d1c7" Oct 07 14:20:29 crc kubenswrapper[4854]: E1007 14:20:29.703056 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:20:30 crc kubenswrapper[4854]: I1007 14:20:30.725593 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="959c7e83-9813-44c5-9952-cfccd2c8eaf4" path="/var/lib/kubelet/pods/959c7e83-9813-44c5-9952-cfccd2c8eaf4/volumes" Oct 07 14:20:43 crc kubenswrapper[4854]: I1007 14:20:43.703061 4854 scope.go:117] "RemoveContainer" containerID="98256c42a54e94b9b889e5d466e2ae1d1852c7195bf7391d9da923423a42d1c7" Oct 07 14:20:43 crc kubenswrapper[4854]: E1007 14:20:43.705258 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:20:58 crc kubenswrapper[4854]: I1007 14:20:58.703069 4854 scope.go:117] "RemoveContainer" containerID="98256c42a54e94b9b889e5d466e2ae1d1852c7195bf7391d9da923423a42d1c7" Oct 07 14:20:58 crc kubenswrapper[4854]: E1007 14:20:58.703840 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:21:11 crc kubenswrapper[4854]: I1007 14:21:11.703278 4854 scope.go:117] "RemoveContainer" containerID="98256c42a54e94b9b889e5d466e2ae1d1852c7195bf7391d9da923423a42d1c7" Oct 07 14:21:11 crc kubenswrapper[4854]: E1007 14:21:11.704163 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" 
podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:21:22 crc kubenswrapper[4854]: I1007 14:21:22.702627 4854 scope.go:117] "RemoveContainer" containerID="98256c42a54e94b9b889e5d466e2ae1d1852c7195bf7391d9da923423a42d1c7" Oct 07 14:21:22 crc kubenswrapper[4854]: E1007 14:21:22.703437 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:21:27 crc kubenswrapper[4854]: I1007 14:21:27.809588 4854 scope.go:117] "RemoveContainer" containerID="24f4e7711094c67fcad0eabe63dc95228142be094d99fb4e8a5c64835f25e355" Oct 07 14:21:37 crc kubenswrapper[4854]: I1007 14:21:37.703977 4854 scope.go:117] "RemoveContainer" containerID="98256c42a54e94b9b889e5d466e2ae1d1852c7195bf7391d9da923423a42d1c7" Oct 07 14:21:37 crc kubenswrapper[4854]: E1007 14:21:37.705108 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:21:48 crc kubenswrapper[4854]: I1007 14:21:48.703628 4854 scope.go:117] "RemoveContainer" containerID="98256c42a54e94b9b889e5d466e2ae1d1852c7195bf7391d9da923423a42d1c7" Oct 07 14:21:49 crc kubenswrapper[4854]: I1007 14:21:49.948686 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" event={"ID":"40b8b82d-cfd5-41d7-8673-5774db092c85","Type":"ContainerStarted","Data":"fb5754d52b825a7168e1a2b62822d22b024d8163c0efe9cbc9c03c5805be855f"} Oct 07 14:23:02 crc kubenswrapper[4854]: I1007 14:23:02.793925 4854 generic.go:334] "Generic (PLEG): container finished" podID="17027554-19b0-44c4-8798-f3f0025605ca" containerID="45effe9481935440024e2d6117b877865af7f22211bfc285d9a1b76eda32931a" exitCode=0 Oct 07 14:23:02 crc kubenswrapper[4854]: I1007 14:23:02.793977 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-528sm" event={"ID":"17027554-19b0-44c4-8798-f3f0025605ca","Type":"ContainerDied","Data":"45effe9481935440024e2d6117b877865af7f22211bfc285d9a1b76eda32931a"} Oct 07 14:23:04 crc kubenswrapper[4854]: I1007 14:23:04.244276 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-528sm" Oct 07 14:23:04 crc kubenswrapper[4854]: I1007 14:23:04.343316 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/17027554-19b0-44c4-8798-f3f0025605ca-ssh-key\") pod \"17027554-19b0-44c4-8798-f3f0025605ca\" (UID: \"17027554-19b0-44c4-8798-f3f0025605ca\") " Oct 07 14:23:04 crc kubenswrapper[4854]: I1007 14:23:04.343431 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmmj5\" (UniqueName: \"kubernetes.io/projected/17027554-19b0-44c4-8798-f3f0025605ca-kube-api-access-wmmj5\") pod \"17027554-19b0-44c4-8798-f3f0025605ca\" (UID: \"17027554-19b0-44c4-8798-f3f0025605ca\") " Oct 07 14:23:04 crc kubenswrapper[4854]: I1007 14:23:04.343458 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17027554-19b0-44c4-8798-f3f0025605ca-inventory\") pod \"17027554-19b0-44c4-8798-f3f0025605ca\" (UID: \"17027554-19b0-44c4-8798-f3f0025605ca\") " Oct 07 14:23:04 crc kubenswrapper[4854]: I1007 14:23:04.343544 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/17027554-19b0-44c4-8798-f3f0025605ca-ceph\") pod \"17027554-19b0-44c4-8798-f3f0025605ca\" (UID: \"17027554-19b0-44c4-8798-f3f0025605ca\") " Oct 07 14:23:04 crc kubenswrapper[4854]: I1007 14:23:04.343606 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17027554-19b0-44c4-8798-f3f0025605ca-tripleo-cleanup-combined-ca-bundle\") pod \"17027554-19b0-44c4-8798-f3f0025605ca\" (UID: \"17027554-19b0-44c4-8798-f3f0025605ca\") " Oct 07 14:23:04 crc kubenswrapper[4854]: I1007 14:23:04.350701 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17027554-19b0-44c4-8798-f3f0025605ca-tripleo-cleanup-combined-ca-bundle" (OuterVolumeSpecName: "tripleo-cleanup-combined-ca-bundle") pod "17027554-19b0-44c4-8798-f3f0025605ca" (UID: "17027554-19b0-44c4-8798-f3f0025605ca"). InnerVolumeSpecName "tripleo-cleanup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:23:04 crc kubenswrapper[4854]: I1007 14:23:04.350856 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17027554-19b0-44c4-8798-f3f0025605ca-ceph" (OuterVolumeSpecName: "ceph") pod "17027554-19b0-44c4-8798-f3f0025605ca" (UID: "17027554-19b0-44c4-8798-f3f0025605ca"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:23:04 crc kubenswrapper[4854]: I1007 14:23:04.352843 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17027554-19b0-44c4-8798-f3f0025605ca-kube-api-access-wmmj5" (OuterVolumeSpecName: "kube-api-access-wmmj5") pod "17027554-19b0-44c4-8798-f3f0025605ca" (UID: "17027554-19b0-44c4-8798-f3f0025605ca"). InnerVolumeSpecName "kube-api-access-wmmj5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:23:04 crc kubenswrapper[4854]: I1007 14:23:04.377726 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17027554-19b0-44c4-8798-f3f0025605ca-inventory" (OuterVolumeSpecName: "inventory") pod "17027554-19b0-44c4-8798-f3f0025605ca" (UID: "17027554-19b0-44c4-8798-f3f0025605ca"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:23:04 crc kubenswrapper[4854]: I1007 14:23:04.402785 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17027554-19b0-44c4-8798-f3f0025605ca-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "17027554-19b0-44c4-8798-f3f0025605ca" (UID: "17027554-19b0-44c4-8798-f3f0025605ca"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:23:04 crc kubenswrapper[4854]: I1007 14:23:04.445962 4854 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/17027554-19b0-44c4-8798-f3f0025605ca-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 14:23:04 crc kubenswrapper[4854]: I1007 14:23:04.446001 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmmj5\" (UniqueName: \"kubernetes.io/projected/17027554-19b0-44c4-8798-f3f0025605ca-kube-api-access-wmmj5\") on node \"crc\" DevicePath \"\"" Oct 07 14:23:04 crc kubenswrapper[4854]: I1007 14:23:04.446015 4854 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17027554-19b0-44c4-8798-f3f0025605ca-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 14:23:04 crc kubenswrapper[4854]: I1007 14:23:04.446024 4854 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/17027554-19b0-44c4-8798-f3f0025605ca-ceph\") on node \"crc\" DevicePath \"\"" Oct 07 14:23:04 crc kubenswrapper[4854]: I1007 14:23:04.446033 4854 reconciler_common.go:293] "Volume detached for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17027554-19b0-44c4-8798-f3f0025605ca-tripleo-cleanup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:23:04 crc kubenswrapper[4854]: I1007 14:23:04.816631 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-528sm" event={"ID":"17027554-19b0-44c4-8798-f3f0025605ca","Type":"ContainerDied","Data":"5b7f6c66ba5741bcc30009e0511fe3816038e5ae424da948dd4ad2656b824037"} Oct 07 14:23:04 crc kubenswrapper[4854]: I1007 14:23:04.816672 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b7f6c66ba5741bcc30009e0511fe3816038e5ae424da948dd4ad2656b824037" Oct 07 14:23:04 crc kubenswrapper[4854]: I1007 14:23:04.816675 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-528sm" Oct 07 14:23:13 crc kubenswrapper[4854]: I1007 14:23:13.768391 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-wsj8v"] Oct 07 14:23:13 crc kubenswrapper[4854]: E1007 14:23:13.769467 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1a19938-4ae0-4e68-9786-2d68a837f668" containerName="extract-content" Oct 07 14:23:13 crc kubenswrapper[4854]: I1007 14:23:13.769487 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1a19938-4ae0-4e68-9786-2d68a837f668" containerName="extract-content" Oct 07 14:23:13 crc kubenswrapper[4854]: E1007 14:23:13.769510 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1a19938-4ae0-4e68-9786-2d68a837f668" containerName="extract-utilities" Oct 07 14:23:13 crc kubenswrapper[4854]: I1007 14:23:13.769518 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1a19938-4ae0-4e68-9786-2d68a837f668" containerName="extract-utilities" Oct 07 14:23:13 crc kubenswrapper[4854]: E1007 14:23:13.769536 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17027554-19b0-44c4-8798-f3f0025605ca" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Oct 07 14:23:13 crc kubenswrapper[4854]: I1007 14:23:13.769545 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="17027554-19b0-44c4-8798-f3f0025605ca" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Oct 07 14:23:13 crc kubenswrapper[4854]: E1007 14:23:13.769617 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1a19938-4ae0-4e68-9786-2d68a837f668" containerName="registry-server" Oct 07 14:23:13 crc kubenswrapper[4854]: I1007 14:23:13.769641 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1a19938-4ae0-4e68-9786-2d68a837f668" containerName="registry-server" Oct 07 14:23:13 crc kubenswrapper[4854]: I1007 14:23:13.769926 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="17027554-19b0-44c4-8798-f3f0025605ca" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Oct 07 14:23:13 crc kubenswrapper[4854]: I1007 14:23:13.769942 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1a19938-4ae0-4e68-9786-2d68a837f668" containerName="registry-server" Oct 07 14:23:13 crc kubenswrapper[4854]: I1007 14:23:13.771094 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-wsj8v" Oct 07 14:23:13 crc kubenswrapper[4854]: I1007 14:23:13.778523 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 07 14:23:13 crc kubenswrapper[4854]: I1007 14:23:13.778822 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 14:23:13 crc kubenswrapper[4854]: I1007 14:23:13.778987 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-n7cf5" Oct 07 14:23:13 crc kubenswrapper[4854]: I1007 14:23:13.779141 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 07 14:23:13 crc kubenswrapper[4854]: I1007 14:23:13.786283 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-wsj8v"] Oct 07 14:23:13 crc kubenswrapper[4854]: I1007 14:23:13.832166 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-wsj8v\" (UID: \"8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa\") " pod="openstack/bootstrap-openstack-openstack-cell1-wsj8v" Oct 07 14:23:13 crc kubenswrapper[4854]: I1007 14:23:13.832252 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa-ceph\") pod \"bootstrap-openstack-openstack-cell1-wsj8v\" (UID: \"8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa\") " pod="openstack/bootstrap-openstack-openstack-cell1-wsj8v" Oct 07 14:23:13 crc kubenswrapper[4854]: I1007 14:23:13.832341 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-wsj8v\" (UID: \"8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa\") " pod="openstack/bootstrap-openstack-openstack-cell1-wsj8v" Oct 07 14:23:13 crc kubenswrapper[4854]: I1007 14:23:13.832380 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa-inventory\") pod \"bootstrap-openstack-openstack-cell1-wsj8v\" (UID: \"8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa\") " pod="openstack/bootstrap-openstack-openstack-cell1-wsj8v" Oct 07 14:23:13 crc kubenswrapper[4854]: I1007 14:23:13.832424 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlxmt\" (UniqueName: \"kubernetes.io/projected/8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa-kube-api-access-wlxmt\") pod \"bootstrap-openstack-openstack-cell1-wsj8v\" (UID: \"8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa\") " pod="openstack/bootstrap-openstack-openstack-cell1-wsj8v" Oct 07 14:23:13 crc kubenswrapper[4854]: I1007 14:23:13.933244 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-wsj8v\" (UID: \"8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa\") " pod="openstack/bootstrap-openstack-openstack-cell1-wsj8v" Oct 07 14:23:13 crc kubenswrapper[4854]: 
I1007 14:23:13.933325 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa-ceph\") pod \"bootstrap-openstack-openstack-cell1-wsj8v\" (UID: \"8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa\") " pod="openstack/bootstrap-openstack-openstack-cell1-wsj8v" Oct 07 14:23:13 crc kubenswrapper[4854]: I1007 14:23:13.933408 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-wsj8v\" (UID: \"8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa\") " pod="openstack/bootstrap-openstack-openstack-cell1-wsj8v" Oct 07 14:23:13 crc kubenswrapper[4854]: I1007 14:23:13.933439 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa-inventory\") pod \"bootstrap-openstack-openstack-cell1-wsj8v\" (UID: \"8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa\") " pod="openstack/bootstrap-openstack-openstack-cell1-wsj8v" Oct 07 14:23:13 crc kubenswrapper[4854]: I1007 14:23:13.933476 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlxmt\" (UniqueName: \"kubernetes.io/projected/8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa-kube-api-access-wlxmt\") pod \"bootstrap-openstack-openstack-cell1-wsj8v\" (UID: \"8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa\") " pod="openstack/bootstrap-openstack-openstack-cell1-wsj8v" Oct 07 14:23:13 crc kubenswrapper[4854]: I1007 14:23:13.940309 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-wsj8v\" (UID: \"8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa\") " pod="openstack/bootstrap-openstack-openstack-cell1-wsj8v" Oct 07 14:23:13 crc kubenswrapper[4854]: I1007 14:23:13.940651 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa-ceph\") pod \"bootstrap-openstack-openstack-cell1-wsj8v\" (UID: \"8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa\") " pod="openstack/bootstrap-openstack-openstack-cell1-wsj8v" Oct 07 14:23:13 crc kubenswrapper[4854]: I1007 14:23:13.940422 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-wsj8v\" (UID: \"8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa\") " pod="openstack/bootstrap-openstack-openstack-cell1-wsj8v" Oct 07 14:23:13 crc kubenswrapper[4854]: I1007 14:23:13.943137 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa-inventory\") pod \"bootstrap-openstack-openstack-cell1-wsj8v\" (UID: \"8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa\") " pod="openstack/bootstrap-openstack-openstack-cell1-wsj8v" Oct 07 14:23:13 crc kubenswrapper[4854]: I1007 14:23:13.958604 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlxmt\" (UniqueName: \"kubernetes.io/projected/8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa-kube-api-access-wlxmt\") pod \"bootstrap-openstack-openstack-cell1-wsj8v\" (UID: 
\"8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa\") " pod="openstack/bootstrap-openstack-openstack-cell1-wsj8v" Oct 07 14:23:14 crc kubenswrapper[4854]: I1007 14:23:14.101905 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-wsj8v" Oct 07 14:23:14 crc kubenswrapper[4854]: I1007 14:23:14.689621 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-wsj8v"] Oct 07 14:23:14 crc kubenswrapper[4854]: I1007 14:23:14.697234 4854 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 14:23:14 crc kubenswrapper[4854]: I1007 14:23:14.946393 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-wsj8v" event={"ID":"8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa","Type":"ContainerStarted","Data":"c6eaffd3a715381af4deb309e76a12b9b328fdf46ca554e8c5ed0a1b9870bda2"} Oct 07 14:23:16 crc kubenswrapper[4854]: I1007 14:23:16.970236 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-wsj8v" event={"ID":"8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa","Type":"ContainerStarted","Data":"dccf982ef92f02da96ce2220fb94ce5ca2eec30e78092f614dd7dd34f0d0a943"} Oct 07 14:23:16 crc kubenswrapper[4854]: I1007 14:23:16.992270 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-openstack-openstack-cell1-wsj8v" podStartSLOduration=2.630272123 podStartE2EDuration="3.992247025s" podCreationTimestamp="2025-10-07 14:23:13 +0000 UTC" firstStartedPulling="2025-10-07 14:23:14.696955738 +0000 UTC m=+7110.684788003" lastFinishedPulling="2025-10-07 14:23:16.05893063 +0000 UTC m=+7112.046762905" observedRunningTime="2025-10-07 14:23:16.983272647 +0000 UTC m=+7112.971104902" watchObservedRunningTime="2025-10-07 14:23:16.992247025 +0000 UTC m=+7112.980079320" Oct 07 14:23:54 crc kubenswrapper[4854]: I1007 14:23:54.464941 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qtms9"] Oct 07 14:23:54 crc kubenswrapper[4854]: I1007 14:23:54.469354 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qtms9" Oct 07 14:23:54 crc kubenswrapper[4854]: I1007 14:23:54.486801 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qtms9"] Oct 07 14:23:54 crc kubenswrapper[4854]: I1007 14:23:54.505653 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9qqw\" (UniqueName: \"kubernetes.io/projected/9299714f-83d3-487f-9173-f750d4b5f185-kube-api-access-j9qqw\") pod \"redhat-operators-qtms9\" (UID: \"9299714f-83d3-487f-9173-f750d4b5f185\") " pod="openshift-marketplace/redhat-operators-qtms9" Oct 07 14:23:54 crc kubenswrapper[4854]: I1007 14:23:54.505715 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9299714f-83d3-487f-9173-f750d4b5f185-utilities\") pod \"redhat-operators-qtms9\" (UID: \"9299714f-83d3-487f-9173-f750d4b5f185\") " pod="openshift-marketplace/redhat-operators-qtms9" Oct 07 14:23:54 crc kubenswrapper[4854]: I1007 14:23:54.505874 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9299714f-83d3-487f-9173-f750d4b5f185-catalog-content\") pod \"redhat-operators-qtms9\" (UID: \"9299714f-83d3-487f-9173-f750d4b5f185\") " pod="openshift-marketplace/redhat-operators-qtms9" Oct 07 14:23:54 crc kubenswrapper[4854]: I1007 14:23:54.608842 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9299714f-83d3-487f-9173-f750d4b5f185-catalog-content\") pod \"redhat-operators-qtms9\" (UID: \"9299714f-83d3-487f-9173-f750d4b5f185\") " pod="openshift-marketplace/redhat-operators-qtms9" Oct 07 14:23:54 crc kubenswrapper[4854]: I1007 14:23:54.609340 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9299714f-83d3-487f-9173-f750d4b5f185-catalog-content\") pod \"redhat-operators-qtms9\" (UID: \"9299714f-83d3-487f-9173-f750d4b5f185\") " pod="openshift-marketplace/redhat-operators-qtms9" Oct 07 14:23:54 crc kubenswrapper[4854]: I1007 14:23:54.609506 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9qqw\" (UniqueName: \"kubernetes.io/projected/9299714f-83d3-487f-9173-f750d4b5f185-kube-api-access-j9qqw\") pod \"redhat-operators-qtms9\" (UID: \"9299714f-83d3-487f-9173-f750d4b5f185\") " pod="openshift-marketplace/redhat-operators-qtms9" Oct 07 14:23:54 crc kubenswrapper[4854]: I1007 14:23:54.609560 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9299714f-83d3-487f-9173-f750d4b5f185-utilities\") pod \"redhat-operators-qtms9\" (UID: \"9299714f-83d3-487f-9173-f750d4b5f185\") " pod="openshift-marketplace/redhat-operators-qtms9" Oct 07 14:23:54 crc kubenswrapper[4854]: I1007 14:23:54.609991 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9299714f-83d3-487f-9173-f750d4b5f185-utilities\") pod \"redhat-operators-qtms9\" (UID: \"9299714f-83d3-487f-9173-f750d4b5f185\") " pod="openshift-marketplace/redhat-operators-qtms9" Oct 07 14:23:54 crc kubenswrapper[4854]: I1007 14:23:54.629679 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-j9qqw\" (UniqueName: \"kubernetes.io/projected/9299714f-83d3-487f-9173-f750d4b5f185-kube-api-access-j9qqw\") pod \"redhat-operators-qtms9\" (UID: \"9299714f-83d3-487f-9173-f750d4b5f185\") " pod="openshift-marketplace/redhat-operators-qtms9" Oct 07 14:23:54 crc kubenswrapper[4854]: I1007 14:23:54.804071 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qtms9" Oct 07 14:23:55 crc kubenswrapper[4854]: I1007 14:23:55.301939 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qtms9"] Oct 07 14:23:55 crc kubenswrapper[4854]: I1007 14:23:55.434558 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qtms9" event={"ID":"9299714f-83d3-487f-9173-f750d4b5f185","Type":"ContainerStarted","Data":"54c4ddde8e9ed1ec69ce167f36b26f7fc781b93e24e1ce12cd9203031aa11109"} Oct 07 14:23:56 crc kubenswrapper[4854]: I1007 14:23:56.448801 4854 generic.go:334] "Generic (PLEG): container finished" podID="9299714f-83d3-487f-9173-f750d4b5f185" containerID="9d4447ecef567bf5ca8b756e7a3b81f2585793c80c386cc9ee52b9febdf60bd7" exitCode=0 Oct 07 14:23:56 crc kubenswrapper[4854]: I1007 14:23:56.448855 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qtms9" event={"ID":"9299714f-83d3-487f-9173-f750d4b5f185","Type":"ContainerDied","Data":"9d4447ecef567bf5ca8b756e7a3b81f2585793c80c386cc9ee52b9febdf60bd7"} Oct 07 14:23:58 crc kubenswrapper[4854]: I1007 14:23:58.849401 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-t5xxl"] Oct 07 14:23:58 crc kubenswrapper[4854]: I1007 14:23:58.852959 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t5xxl" Oct 07 14:23:58 crc kubenswrapper[4854]: I1007 14:23:58.871832 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t5xxl"] Oct 07 14:23:58 crc kubenswrapper[4854]: I1007 14:23:58.914433 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwkcp\" (UniqueName: \"kubernetes.io/projected/86873e1b-d883-492f-bd0a-a64a664c21f6-kube-api-access-qwkcp\") pod \"redhat-marketplace-t5xxl\" (UID: \"86873e1b-d883-492f-bd0a-a64a664c21f6\") " pod="openshift-marketplace/redhat-marketplace-t5xxl" Oct 07 14:23:58 crc kubenswrapper[4854]: I1007 14:23:58.914558 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86873e1b-d883-492f-bd0a-a64a664c21f6-utilities\") pod \"redhat-marketplace-t5xxl\" (UID: \"86873e1b-d883-492f-bd0a-a64a664c21f6\") " pod="openshift-marketplace/redhat-marketplace-t5xxl" Oct 07 14:23:58 crc kubenswrapper[4854]: I1007 14:23:58.914649 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86873e1b-d883-492f-bd0a-a64a664c21f6-catalog-content\") pod \"redhat-marketplace-t5xxl\" (UID: \"86873e1b-d883-492f-bd0a-a64a664c21f6\") " pod="openshift-marketplace/redhat-marketplace-t5xxl" Oct 07 14:23:59 crc kubenswrapper[4854]: I1007 14:23:59.017118 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86873e1b-d883-492f-bd0a-a64a664c21f6-catalog-content\") pod \"redhat-marketplace-t5xxl\" (UID: \"86873e1b-d883-492f-bd0a-a64a664c21f6\") " pod="openshift-marketplace/redhat-marketplace-t5xxl" Oct 07 14:23:59 crc kubenswrapper[4854]: I1007 14:23:59.017324 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwkcp\" (UniqueName: \"kubernetes.io/projected/86873e1b-d883-492f-bd0a-a64a664c21f6-kube-api-access-qwkcp\") pod \"redhat-marketplace-t5xxl\" (UID: \"86873e1b-d883-492f-bd0a-a64a664c21f6\") " pod="openshift-marketplace/redhat-marketplace-t5xxl" Oct 07 14:23:59 crc kubenswrapper[4854]: I1007 14:23:59.017372 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86873e1b-d883-492f-bd0a-a64a664c21f6-utilities\") pod \"redhat-marketplace-t5xxl\" (UID: \"86873e1b-d883-492f-bd0a-a64a664c21f6\") " pod="openshift-marketplace/redhat-marketplace-t5xxl" Oct 07 14:23:59 crc kubenswrapper[4854]: I1007 14:23:59.017675 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86873e1b-d883-492f-bd0a-a64a664c21f6-catalog-content\") pod \"redhat-marketplace-t5xxl\" (UID: \"86873e1b-d883-492f-bd0a-a64a664c21f6\") " pod="openshift-marketplace/redhat-marketplace-t5xxl" Oct 07 14:23:59 crc kubenswrapper[4854]: I1007 14:23:59.017836 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86873e1b-d883-492f-bd0a-a64a664c21f6-utilities\") pod \"redhat-marketplace-t5xxl\" (UID: \"86873e1b-d883-492f-bd0a-a64a664c21f6\") " pod="openshift-marketplace/redhat-marketplace-t5xxl" Oct 07 14:23:59 crc kubenswrapper[4854]: I1007 14:23:59.051055 4854 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-qwkcp\" (UniqueName: \"kubernetes.io/projected/86873e1b-d883-492f-bd0a-a64a664c21f6-kube-api-access-qwkcp\") pod \"redhat-marketplace-t5xxl\" (UID: \"86873e1b-d883-492f-bd0a-a64a664c21f6\") " pod="openshift-marketplace/redhat-marketplace-t5xxl" Oct 07 14:23:59 crc kubenswrapper[4854]: I1007 14:23:59.180802 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t5xxl" Oct 07 14:23:59 crc kubenswrapper[4854]: I1007 14:23:59.695860 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t5xxl"] Oct 07 14:23:59 crc kubenswrapper[4854]: W1007 14:23:59.706361 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86873e1b_d883_492f_bd0a_a64a664c21f6.slice/crio-52b5e79dd1ccf5e77858f679a2b7791755cf9f201a059ffffd16c6dda8c88b52 WatchSource:0}: Error finding container 52b5e79dd1ccf5e77858f679a2b7791755cf9f201a059ffffd16c6dda8c88b52: Status 404 returned error can't find the container with id 52b5e79dd1ccf5e77858f679a2b7791755cf9f201a059ffffd16c6dda8c88b52 Oct 07 14:24:00 crc kubenswrapper[4854]: I1007 14:24:00.531289 4854 generic.go:334] "Generic (PLEG): container finished" podID="86873e1b-d883-492f-bd0a-a64a664c21f6" containerID="9350dda5d0b17fcfe4f375c8ac77ab7912bbab577f517fc11d629cfd48c3df88" exitCode=0 Oct 07 14:24:00 crc kubenswrapper[4854]: I1007 14:24:00.531394 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t5xxl" event={"ID":"86873e1b-d883-492f-bd0a-a64a664c21f6","Type":"ContainerDied","Data":"9350dda5d0b17fcfe4f375c8ac77ab7912bbab577f517fc11d629cfd48c3df88"} Oct 07 14:24:00 crc kubenswrapper[4854]: I1007 14:24:00.531549 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t5xxl" event={"ID":"86873e1b-d883-492f-bd0a-a64a664c21f6","Type":"ContainerStarted","Data":"52b5e79dd1ccf5e77858f679a2b7791755cf9f201a059ffffd16c6dda8c88b52"} Oct 07 14:24:09 crc kubenswrapper[4854]: I1007 14:24:09.618629 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qtms9" event={"ID":"9299714f-83d3-487f-9173-f750d4b5f185","Type":"ContainerStarted","Data":"17773809967fb66ea840bcd2b05ddfcc4ae5d64761b4884f27c914983060d8c6"} Oct 07 14:24:09 crc kubenswrapper[4854]: I1007 14:24:09.622967 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t5xxl" event={"ID":"86873e1b-d883-492f-bd0a-a64a664c21f6","Type":"ContainerStarted","Data":"7741193f9128fde30336c5c470af98ef78bd117e161e75cc280fb028632bcbca"} Oct 07 14:24:10 crc kubenswrapper[4854]: I1007 14:24:10.807926 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:24:10 crc kubenswrapper[4854]: I1007 14:24:10.809878 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:24:11 crc kubenswrapper[4854]: I1007 14:24:11.646441 4854 
generic.go:334] "Generic (PLEG): container finished" podID="9299714f-83d3-487f-9173-f750d4b5f185" containerID="17773809967fb66ea840bcd2b05ddfcc4ae5d64761b4884f27c914983060d8c6" exitCode=0 Oct 07 14:24:11 crc kubenswrapper[4854]: I1007 14:24:11.646511 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qtms9" event={"ID":"9299714f-83d3-487f-9173-f750d4b5f185","Type":"ContainerDied","Data":"17773809967fb66ea840bcd2b05ddfcc4ae5d64761b4884f27c914983060d8c6"} Oct 07 14:24:12 crc kubenswrapper[4854]: I1007 14:24:12.677597 4854 generic.go:334] "Generic (PLEG): container finished" podID="86873e1b-d883-492f-bd0a-a64a664c21f6" containerID="7741193f9128fde30336c5c470af98ef78bd117e161e75cc280fb028632bcbca" exitCode=0 Oct 07 14:24:12 crc kubenswrapper[4854]: I1007 14:24:12.677661 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t5xxl" event={"ID":"86873e1b-d883-492f-bd0a-a64a664c21f6","Type":"ContainerDied","Data":"7741193f9128fde30336c5c470af98ef78bd117e161e75cc280fb028632bcbca"} Oct 07 14:24:12 crc kubenswrapper[4854]: I1007 14:24:12.681831 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qtms9" event={"ID":"9299714f-83d3-487f-9173-f750d4b5f185","Type":"ContainerStarted","Data":"174d06c53c389644452ffad1ca260f6453062e3ce3ed4bc98f7b07c6aea1cc13"} Oct 07 14:24:12 crc kubenswrapper[4854]: I1007 14:24:12.741580 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qtms9" podStartSLOduration=2.959557989 podStartE2EDuration="18.741562419s" podCreationTimestamp="2025-10-07 14:23:54 +0000 UTC" firstStartedPulling="2025-10-07 14:23:56.451583254 +0000 UTC m=+7152.439415539" lastFinishedPulling="2025-10-07 14:24:12.233587714 +0000 UTC m=+7168.221419969" observedRunningTime="2025-10-07 14:24:12.730231862 +0000 UTC m=+7168.718064157" watchObservedRunningTime="2025-10-07 14:24:12.741562419 +0000 UTC m=+7168.729394674" Oct 07 14:24:13 crc kubenswrapper[4854]: I1007 14:24:13.696331 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t5xxl" event={"ID":"86873e1b-d883-492f-bd0a-a64a664c21f6","Type":"ContainerStarted","Data":"7078d08927a571e51da94eee52d21451430c41bc1e3cad14a6620e42fd406e39"} Oct 07 14:24:13 crc kubenswrapper[4854]: I1007 14:24:13.723189 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-t5xxl" podStartSLOduration=3.041634686 podStartE2EDuration="15.723142435s" podCreationTimestamp="2025-10-07 14:23:58 +0000 UTC" firstStartedPulling="2025-10-07 14:24:00.534196069 +0000 UTC m=+7156.522028324" lastFinishedPulling="2025-10-07 14:24:13.215703818 +0000 UTC m=+7169.203536073" observedRunningTime="2025-10-07 14:24:13.719781208 +0000 UTC m=+7169.707613473" watchObservedRunningTime="2025-10-07 14:24:13.723142435 +0000 UTC m=+7169.710974700" Oct 07 14:24:14 crc kubenswrapper[4854]: I1007 14:24:14.804584 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qtms9" Oct 07 14:24:14 crc kubenswrapper[4854]: I1007 14:24:14.804931 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qtms9" Oct 07 14:24:15 crc kubenswrapper[4854]: I1007 14:24:15.869445 4854 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qtms9" 
podUID="9299714f-83d3-487f-9173-f750d4b5f185" containerName="registry-server" probeResult="failure" output=< Oct 07 14:24:15 crc kubenswrapper[4854]: timeout: failed to connect service ":50051" within 1s Oct 07 14:24:15 crc kubenswrapper[4854]: > Oct 07 14:24:19 crc kubenswrapper[4854]: I1007 14:24:19.181515 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-t5xxl" Oct 07 14:24:19 crc kubenswrapper[4854]: I1007 14:24:19.181851 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-t5xxl" Oct 07 14:24:19 crc kubenswrapper[4854]: I1007 14:24:19.262343 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-t5xxl" Oct 07 14:24:19 crc kubenswrapper[4854]: I1007 14:24:19.823591 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-t5xxl" Oct 07 14:24:19 crc kubenswrapper[4854]: I1007 14:24:19.902443 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-t5xxl"] Oct 07 14:24:21 crc kubenswrapper[4854]: I1007 14:24:21.794134 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-t5xxl" podUID="86873e1b-d883-492f-bd0a-a64a664c21f6" containerName="registry-server" containerID="cri-o://7078d08927a571e51da94eee52d21451430c41bc1e3cad14a6620e42fd406e39" gracePeriod=2 Oct 07 14:24:22 crc kubenswrapper[4854]: E1007 14:24:22.057122 4854 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86873e1b_d883_492f_bd0a_a64a664c21f6.slice/crio-7078d08927a571e51da94eee52d21451430c41bc1e3cad14a6620e42fd406e39.scope\": RecentStats: unable to find data in memory cache]" Oct 07 14:24:22 crc kubenswrapper[4854]: I1007 14:24:22.320274 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t5xxl" Oct 07 14:24:22 crc kubenswrapper[4854]: I1007 14:24:22.403345 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86873e1b-d883-492f-bd0a-a64a664c21f6-catalog-content\") pod \"86873e1b-d883-492f-bd0a-a64a664c21f6\" (UID: \"86873e1b-d883-492f-bd0a-a64a664c21f6\") " Oct 07 14:24:22 crc kubenswrapper[4854]: I1007 14:24:22.403467 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86873e1b-d883-492f-bd0a-a64a664c21f6-utilities\") pod \"86873e1b-d883-492f-bd0a-a64a664c21f6\" (UID: \"86873e1b-d883-492f-bd0a-a64a664c21f6\") " Oct 07 14:24:22 crc kubenswrapper[4854]: I1007 14:24:22.403655 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwkcp\" (UniqueName: \"kubernetes.io/projected/86873e1b-d883-492f-bd0a-a64a664c21f6-kube-api-access-qwkcp\") pod \"86873e1b-d883-492f-bd0a-a64a664c21f6\" (UID: \"86873e1b-d883-492f-bd0a-a64a664c21f6\") " Oct 07 14:24:22 crc kubenswrapper[4854]: I1007 14:24:22.404308 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86873e1b-d883-492f-bd0a-a64a664c21f6-utilities" (OuterVolumeSpecName: "utilities") pod "86873e1b-d883-492f-bd0a-a64a664c21f6" (UID: "86873e1b-d883-492f-bd0a-a64a664c21f6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:24:22 crc kubenswrapper[4854]: I1007 14:24:22.415291 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86873e1b-d883-492f-bd0a-a64a664c21f6-kube-api-access-qwkcp" (OuterVolumeSpecName: "kube-api-access-qwkcp") pod "86873e1b-d883-492f-bd0a-a64a664c21f6" (UID: "86873e1b-d883-492f-bd0a-a64a664c21f6"). InnerVolumeSpecName "kube-api-access-qwkcp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:24:22 crc kubenswrapper[4854]: I1007 14:24:22.427785 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86873e1b-d883-492f-bd0a-a64a664c21f6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "86873e1b-d883-492f-bd0a-a64a664c21f6" (UID: "86873e1b-d883-492f-bd0a-a64a664c21f6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:24:22 crc kubenswrapper[4854]: I1007 14:24:22.506463 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86873e1b-d883-492f-bd0a-a64a664c21f6-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 14:24:22 crc kubenswrapper[4854]: I1007 14:24:22.506489 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86873e1b-d883-492f-bd0a-a64a664c21f6-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 14:24:22 crc kubenswrapper[4854]: I1007 14:24:22.506500 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwkcp\" (UniqueName: \"kubernetes.io/projected/86873e1b-d883-492f-bd0a-a64a664c21f6-kube-api-access-qwkcp\") on node \"crc\" DevicePath \"\"" Oct 07 14:24:22 crc kubenswrapper[4854]: I1007 14:24:22.806738 4854 generic.go:334] "Generic (PLEG): container finished" podID="86873e1b-d883-492f-bd0a-a64a664c21f6" containerID="7078d08927a571e51da94eee52d21451430c41bc1e3cad14a6620e42fd406e39" exitCode=0 Oct 07 14:24:22 crc kubenswrapper[4854]: I1007 14:24:22.806808 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t5xxl" Oct 07 14:24:22 crc kubenswrapper[4854]: I1007 14:24:22.806824 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t5xxl" event={"ID":"86873e1b-d883-492f-bd0a-a64a664c21f6","Type":"ContainerDied","Data":"7078d08927a571e51da94eee52d21451430c41bc1e3cad14a6620e42fd406e39"} Oct 07 14:24:22 crc kubenswrapper[4854]: I1007 14:24:22.806878 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t5xxl" event={"ID":"86873e1b-d883-492f-bd0a-a64a664c21f6","Type":"ContainerDied","Data":"52b5e79dd1ccf5e77858f679a2b7791755cf9f201a059ffffd16c6dda8c88b52"} Oct 07 14:24:22 crc kubenswrapper[4854]: I1007 14:24:22.806897 4854 scope.go:117] "RemoveContainer" containerID="7078d08927a571e51da94eee52d21451430c41bc1e3cad14a6620e42fd406e39" Oct 07 14:24:22 crc kubenswrapper[4854]: I1007 14:24:22.842215 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-t5xxl"] Oct 07 14:24:22 crc kubenswrapper[4854]: I1007 14:24:22.848055 4854 scope.go:117] "RemoveContainer" containerID="7741193f9128fde30336c5c470af98ef78bd117e161e75cc280fb028632bcbca" Oct 07 14:24:22 crc kubenswrapper[4854]: I1007 14:24:22.850200 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-t5xxl"] Oct 07 14:24:22 crc kubenswrapper[4854]: I1007 14:24:22.884783 4854 scope.go:117] "RemoveContainer" containerID="9350dda5d0b17fcfe4f375c8ac77ab7912bbab577f517fc11d629cfd48c3df88" Oct 07 14:24:22 crc kubenswrapper[4854]: I1007 14:24:22.933787 4854 scope.go:117] "RemoveContainer" containerID="7078d08927a571e51da94eee52d21451430c41bc1e3cad14a6620e42fd406e39" Oct 07 14:24:22 crc kubenswrapper[4854]: E1007 14:24:22.934448 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7078d08927a571e51da94eee52d21451430c41bc1e3cad14a6620e42fd406e39\": container with ID starting with 7078d08927a571e51da94eee52d21451430c41bc1e3cad14a6620e42fd406e39 not found: ID does not exist" containerID="7078d08927a571e51da94eee52d21451430c41bc1e3cad14a6620e42fd406e39" Oct 07 14:24:22 crc kubenswrapper[4854]: I1007 14:24:22.934502 4854 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7078d08927a571e51da94eee52d21451430c41bc1e3cad14a6620e42fd406e39"} err="failed to get container status \"7078d08927a571e51da94eee52d21451430c41bc1e3cad14a6620e42fd406e39\": rpc error: code = NotFound desc = could not find container \"7078d08927a571e51da94eee52d21451430c41bc1e3cad14a6620e42fd406e39\": container with ID starting with 7078d08927a571e51da94eee52d21451430c41bc1e3cad14a6620e42fd406e39 not found: ID does not exist" Oct 07 14:24:22 crc kubenswrapper[4854]: I1007 14:24:22.934535 4854 scope.go:117] "RemoveContainer" containerID="7741193f9128fde30336c5c470af98ef78bd117e161e75cc280fb028632bcbca" Oct 07 14:24:22 crc kubenswrapper[4854]: E1007 14:24:22.935024 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7741193f9128fde30336c5c470af98ef78bd117e161e75cc280fb028632bcbca\": container with ID starting with 7741193f9128fde30336c5c470af98ef78bd117e161e75cc280fb028632bcbca not found: ID does not exist" containerID="7741193f9128fde30336c5c470af98ef78bd117e161e75cc280fb028632bcbca" Oct 07 14:24:22 crc kubenswrapper[4854]: I1007 14:24:22.935095 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7741193f9128fde30336c5c470af98ef78bd117e161e75cc280fb028632bcbca"} err="failed to get container status \"7741193f9128fde30336c5c470af98ef78bd117e161e75cc280fb028632bcbca\": rpc error: code = NotFound desc = could not find container \"7741193f9128fde30336c5c470af98ef78bd117e161e75cc280fb028632bcbca\": container with ID starting with 7741193f9128fde30336c5c470af98ef78bd117e161e75cc280fb028632bcbca not found: ID does not exist" Oct 07 14:24:22 crc kubenswrapper[4854]: I1007 14:24:22.935131 4854 scope.go:117] "RemoveContainer" containerID="9350dda5d0b17fcfe4f375c8ac77ab7912bbab577f517fc11d629cfd48c3df88" Oct 07 14:24:22 crc kubenswrapper[4854]: E1007 14:24:22.935520 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9350dda5d0b17fcfe4f375c8ac77ab7912bbab577f517fc11d629cfd48c3df88\": container with ID starting with 9350dda5d0b17fcfe4f375c8ac77ab7912bbab577f517fc11d629cfd48c3df88 not found: ID does not exist" containerID="9350dda5d0b17fcfe4f375c8ac77ab7912bbab577f517fc11d629cfd48c3df88" Oct 07 14:24:22 crc kubenswrapper[4854]: I1007 14:24:22.935550 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9350dda5d0b17fcfe4f375c8ac77ab7912bbab577f517fc11d629cfd48c3df88"} err="failed to get container status \"9350dda5d0b17fcfe4f375c8ac77ab7912bbab577f517fc11d629cfd48c3df88\": rpc error: code = NotFound desc = could not find container \"9350dda5d0b17fcfe4f375c8ac77ab7912bbab577f517fc11d629cfd48c3df88\": container with ID starting with 9350dda5d0b17fcfe4f375c8ac77ab7912bbab577f517fc11d629cfd48c3df88 not found: ID does not exist" Oct 07 14:24:24 crc kubenswrapper[4854]: I1007 14:24:24.719244 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86873e1b-d883-492f-bd0a-a64a664c21f6" path="/var/lib/kubelet/pods/86873e1b-d883-492f-bd0a-a64a664c21f6/volumes" Oct 07 14:24:24 crc kubenswrapper[4854]: I1007 14:24:24.864449 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qtms9" Oct 07 14:24:24 crc kubenswrapper[4854]: I1007 14:24:24.919823 4854 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qtms9" Oct 07 14:24:25 crc kubenswrapper[4854]: I1007 14:24:25.723913 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qtms9"] Oct 07 14:24:25 crc kubenswrapper[4854]: I1007 14:24:25.907863 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tvnbf"] Oct 07 14:24:25 crc kubenswrapper[4854]: I1007 14:24:25.908344 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tvnbf" podUID="8cc9e0a4-e77e-4160-bf58-09897c8543a3" containerName="registry-server" containerID="cri-o://9497ec7f562591fc4c3b7572eaed05970c4c715c66f6f8f04f040253aff254d1" gracePeriod=2 Oct 07 14:24:26 crc kubenswrapper[4854]: I1007 14:24:26.369026 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tvnbf" Oct 07 14:24:26 crc kubenswrapper[4854]: I1007 14:24:26.497173 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8cc9e0a4-e77e-4160-bf58-09897c8543a3-utilities\") pod \"8cc9e0a4-e77e-4160-bf58-09897c8543a3\" (UID: \"8cc9e0a4-e77e-4160-bf58-09897c8543a3\") " Oct 07 14:24:26 crc kubenswrapper[4854]: I1007 14:24:26.497482 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ht8ss\" (UniqueName: \"kubernetes.io/projected/8cc9e0a4-e77e-4160-bf58-09897c8543a3-kube-api-access-ht8ss\") pod \"8cc9e0a4-e77e-4160-bf58-09897c8543a3\" (UID: \"8cc9e0a4-e77e-4160-bf58-09897c8543a3\") " Oct 07 14:24:26 crc kubenswrapper[4854]: I1007 14:24:26.497550 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8cc9e0a4-e77e-4160-bf58-09897c8543a3-catalog-content\") pod \"8cc9e0a4-e77e-4160-bf58-09897c8543a3\" (UID: \"8cc9e0a4-e77e-4160-bf58-09897c8543a3\") " Oct 07 14:24:26 crc kubenswrapper[4854]: I1007 14:24:26.498948 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8cc9e0a4-e77e-4160-bf58-09897c8543a3-utilities" (OuterVolumeSpecName: "utilities") pod "8cc9e0a4-e77e-4160-bf58-09897c8543a3" (UID: "8cc9e0a4-e77e-4160-bf58-09897c8543a3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:24:26 crc kubenswrapper[4854]: I1007 14:24:26.504079 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cc9e0a4-e77e-4160-bf58-09897c8543a3-kube-api-access-ht8ss" (OuterVolumeSpecName: "kube-api-access-ht8ss") pod "8cc9e0a4-e77e-4160-bf58-09897c8543a3" (UID: "8cc9e0a4-e77e-4160-bf58-09897c8543a3"). InnerVolumeSpecName "kube-api-access-ht8ss". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:24:26 crc kubenswrapper[4854]: I1007 14:24:26.574796 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8cc9e0a4-e77e-4160-bf58-09897c8543a3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8cc9e0a4-e77e-4160-bf58-09897c8543a3" (UID: "8cc9e0a4-e77e-4160-bf58-09897c8543a3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:24:26 crc kubenswrapper[4854]: I1007 14:24:26.600177 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ht8ss\" (UniqueName: \"kubernetes.io/projected/8cc9e0a4-e77e-4160-bf58-09897c8543a3-kube-api-access-ht8ss\") on node \"crc\" DevicePath \"\"" Oct 07 14:24:26 crc kubenswrapper[4854]: I1007 14:24:26.600211 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8cc9e0a4-e77e-4160-bf58-09897c8543a3-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 14:24:26 crc kubenswrapper[4854]: I1007 14:24:26.600223 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8cc9e0a4-e77e-4160-bf58-09897c8543a3-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 14:24:26 crc kubenswrapper[4854]: I1007 14:24:26.853662 4854 generic.go:334] "Generic (PLEG): container finished" podID="8cc9e0a4-e77e-4160-bf58-09897c8543a3" containerID="9497ec7f562591fc4c3b7572eaed05970c4c715c66f6f8f04f040253aff254d1" exitCode=0 Oct 07 14:24:26 crc kubenswrapper[4854]: I1007 14:24:26.853719 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tvnbf" Oct 07 14:24:26 crc kubenswrapper[4854]: I1007 14:24:26.853769 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tvnbf" event={"ID":"8cc9e0a4-e77e-4160-bf58-09897c8543a3","Type":"ContainerDied","Data":"9497ec7f562591fc4c3b7572eaed05970c4c715c66f6f8f04f040253aff254d1"} Oct 07 14:24:26 crc kubenswrapper[4854]: I1007 14:24:26.853837 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tvnbf" event={"ID":"8cc9e0a4-e77e-4160-bf58-09897c8543a3","Type":"ContainerDied","Data":"8828d7e454bcbc845e107b85c9ef1a1fff58a19dc16deecdaa8b72cb11f42c06"} Oct 07 14:24:26 crc kubenswrapper[4854]: I1007 14:24:26.853863 4854 scope.go:117] "RemoveContainer" containerID="9497ec7f562591fc4c3b7572eaed05970c4c715c66f6f8f04f040253aff254d1" Oct 07 14:24:26 crc kubenswrapper[4854]: I1007 14:24:26.882323 4854 scope.go:117] "RemoveContainer" containerID="8386927863c8347ac6695157b52edeea41d72620c07012996e1151886e34a96f" Oct 07 14:24:26 crc kubenswrapper[4854]: I1007 14:24:26.882463 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tvnbf"] Oct 07 14:24:26 crc kubenswrapper[4854]: I1007 14:24:26.895043 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tvnbf"] Oct 07 14:24:26 crc kubenswrapper[4854]: I1007 14:24:26.912682 4854 scope.go:117] "RemoveContainer" containerID="dc6384b296849bac6716aace0ccdb0f4ef7d1b9e5f1fb2d6b95e30b849e0158c" Oct 07 14:24:26 crc kubenswrapper[4854]: I1007 14:24:26.957048 4854 scope.go:117] "RemoveContainer" containerID="9497ec7f562591fc4c3b7572eaed05970c4c715c66f6f8f04f040253aff254d1" Oct 07 14:24:26 crc kubenswrapper[4854]: E1007 14:24:26.957675 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9497ec7f562591fc4c3b7572eaed05970c4c715c66f6f8f04f040253aff254d1\": container with ID starting with 9497ec7f562591fc4c3b7572eaed05970c4c715c66f6f8f04f040253aff254d1 not found: ID does not exist" containerID="9497ec7f562591fc4c3b7572eaed05970c4c715c66f6f8f04f040253aff254d1" Oct 07 14:24:26 crc kubenswrapper[4854]: I1007 14:24:26.957705 4854 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9497ec7f562591fc4c3b7572eaed05970c4c715c66f6f8f04f040253aff254d1"} err="failed to get container status \"9497ec7f562591fc4c3b7572eaed05970c4c715c66f6f8f04f040253aff254d1\": rpc error: code = NotFound desc = could not find container \"9497ec7f562591fc4c3b7572eaed05970c4c715c66f6f8f04f040253aff254d1\": container with ID starting with 9497ec7f562591fc4c3b7572eaed05970c4c715c66f6f8f04f040253aff254d1 not found: ID does not exist" Oct 07 14:24:26 crc kubenswrapper[4854]: I1007 14:24:26.957724 4854 scope.go:117] "RemoveContainer" containerID="8386927863c8347ac6695157b52edeea41d72620c07012996e1151886e34a96f" Oct 07 14:24:26 crc kubenswrapper[4854]: E1007 14:24:26.958185 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8386927863c8347ac6695157b52edeea41d72620c07012996e1151886e34a96f\": container with ID starting with 8386927863c8347ac6695157b52edeea41d72620c07012996e1151886e34a96f not found: ID does not exist" containerID="8386927863c8347ac6695157b52edeea41d72620c07012996e1151886e34a96f" Oct 07 14:24:26 crc kubenswrapper[4854]: I1007 14:24:26.958214 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8386927863c8347ac6695157b52edeea41d72620c07012996e1151886e34a96f"} err="failed to get container status \"8386927863c8347ac6695157b52edeea41d72620c07012996e1151886e34a96f\": rpc error: code = NotFound desc = could not find container \"8386927863c8347ac6695157b52edeea41d72620c07012996e1151886e34a96f\": container with ID starting with 8386927863c8347ac6695157b52edeea41d72620c07012996e1151886e34a96f not found: ID does not exist" Oct 07 14:24:26 crc kubenswrapper[4854]: I1007 14:24:26.958231 4854 scope.go:117] "RemoveContainer" containerID="dc6384b296849bac6716aace0ccdb0f4ef7d1b9e5f1fb2d6b95e30b849e0158c" Oct 07 14:24:26 crc kubenswrapper[4854]: E1007 14:24:26.958643 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc6384b296849bac6716aace0ccdb0f4ef7d1b9e5f1fb2d6b95e30b849e0158c\": container with ID starting with dc6384b296849bac6716aace0ccdb0f4ef7d1b9e5f1fb2d6b95e30b849e0158c not found: ID does not exist" containerID="dc6384b296849bac6716aace0ccdb0f4ef7d1b9e5f1fb2d6b95e30b849e0158c" Oct 07 14:24:26 crc kubenswrapper[4854]: I1007 14:24:26.958670 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc6384b296849bac6716aace0ccdb0f4ef7d1b9e5f1fb2d6b95e30b849e0158c"} err="failed to get container status \"dc6384b296849bac6716aace0ccdb0f4ef7d1b9e5f1fb2d6b95e30b849e0158c\": rpc error: code = NotFound desc = could not find container \"dc6384b296849bac6716aace0ccdb0f4ef7d1b9e5f1fb2d6b95e30b849e0158c\": container with ID starting with dc6384b296849bac6716aace0ccdb0f4ef7d1b9e5f1fb2d6b95e30b849e0158c not found: ID does not exist" Oct 07 14:24:28 crc kubenswrapper[4854]: I1007 14:24:28.719019 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cc9e0a4-e77e-4160-bf58-09897c8543a3" path="/var/lib/kubelet/pods/8cc9e0a4-e77e-4160-bf58-09897c8543a3/volumes" Oct 07 14:24:32 crc kubenswrapper[4854]: I1007 14:24:32.732697 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mhvjl"] Oct 07 14:24:32 crc kubenswrapper[4854]: E1007 14:24:32.735630 4854 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="86873e1b-d883-492f-bd0a-a64a664c21f6" containerName="extract-content" Oct 07 14:24:32 crc kubenswrapper[4854]: I1007 14:24:32.735812 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="86873e1b-d883-492f-bd0a-a64a664c21f6" containerName="extract-content" Oct 07 14:24:32 crc kubenswrapper[4854]: E1007 14:24:32.735954 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cc9e0a4-e77e-4160-bf58-09897c8543a3" containerName="extract-content" Oct 07 14:24:32 crc kubenswrapper[4854]: I1007 14:24:32.736073 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cc9e0a4-e77e-4160-bf58-09897c8543a3" containerName="extract-content" Oct 07 14:24:32 crc kubenswrapper[4854]: E1007 14:24:32.736239 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86873e1b-d883-492f-bd0a-a64a664c21f6" containerName="extract-utilities" Oct 07 14:24:32 crc kubenswrapper[4854]: I1007 14:24:32.736371 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="86873e1b-d883-492f-bd0a-a64a664c21f6" containerName="extract-utilities" Oct 07 14:24:32 crc kubenswrapper[4854]: E1007 14:24:32.736524 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cc9e0a4-e77e-4160-bf58-09897c8543a3" containerName="registry-server" Oct 07 14:24:32 crc kubenswrapper[4854]: I1007 14:24:32.736651 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cc9e0a4-e77e-4160-bf58-09897c8543a3" containerName="registry-server" Oct 07 14:24:32 crc kubenswrapper[4854]: E1007 14:24:32.736843 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86873e1b-d883-492f-bd0a-a64a664c21f6" containerName="registry-server" Oct 07 14:24:32 crc kubenswrapper[4854]: I1007 14:24:32.736967 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="86873e1b-d883-492f-bd0a-a64a664c21f6" containerName="registry-server" Oct 07 14:24:32 crc kubenswrapper[4854]: E1007 14:24:32.737179 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cc9e0a4-e77e-4160-bf58-09897c8543a3" containerName="extract-utilities" Oct 07 14:24:32 crc kubenswrapper[4854]: I1007 14:24:32.737321 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cc9e0a4-e77e-4160-bf58-09897c8543a3" containerName="extract-utilities" Oct 07 14:24:32 crc kubenswrapper[4854]: I1007 14:24:32.737863 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="86873e1b-d883-492f-bd0a-a64a664c21f6" containerName="registry-server" Oct 07 14:24:32 crc kubenswrapper[4854]: I1007 14:24:32.738029 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cc9e0a4-e77e-4160-bf58-09897c8543a3" containerName="registry-server" Oct 07 14:24:32 crc kubenswrapper[4854]: I1007 14:24:32.742938 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mhvjl" Oct 07 14:24:32 crc kubenswrapper[4854]: I1007 14:24:32.749126 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mhvjl"] Oct 07 14:24:32 crc kubenswrapper[4854]: I1007 14:24:32.828569 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwxlf\" (UniqueName: \"kubernetes.io/projected/56dd4456-34c6-4f52-b565-04ae25674f99-kube-api-access-pwxlf\") pod \"community-operators-mhvjl\" (UID: \"56dd4456-34c6-4f52-b565-04ae25674f99\") " pod="openshift-marketplace/community-operators-mhvjl" Oct 07 14:24:32 crc kubenswrapper[4854]: I1007 14:24:32.828740 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56dd4456-34c6-4f52-b565-04ae25674f99-utilities\") pod \"community-operators-mhvjl\" (UID: \"56dd4456-34c6-4f52-b565-04ae25674f99\") " pod="openshift-marketplace/community-operators-mhvjl" Oct 07 14:24:32 crc kubenswrapper[4854]: I1007 14:24:32.828916 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56dd4456-34c6-4f52-b565-04ae25674f99-catalog-content\") pod \"community-operators-mhvjl\" (UID: \"56dd4456-34c6-4f52-b565-04ae25674f99\") " pod="openshift-marketplace/community-operators-mhvjl" Oct 07 14:24:32 crc kubenswrapper[4854]: I1007 14:24:32.931745 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56dd4456-34c6-4f52-b565-04ae25674f99-utilities\") pod \"community-operators-mhvjl\" (UID: \"56dd4456-34c6-4f52-b565-04ae25674f99\") " pod="openshift-marketplace/community-operators-mhvjl" Oct 07 14:24:32 crc kubenswrapper[4854]: I1007 14:24:32.931940 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56dd4456-34c6-4f52-b565-04ae25674f99-catalog-content\") pod \"community-operators-mhvjl\" (UID: \"56dd4456-34c6-4f52-b565-04ae25674f99\") " pod="openshift-marketplace/community-operators-mhvjl" Oct 07 14:24:32 crc kubenswrapper[4854]: I1007 14:24:32.932210 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwxlf\" (UniqueName: \"kubernetes.io/projected/56dd4456-34c6-4f52-b565-04ae25674f99-kube-api-access-pwxlf\") pod \"community-operators-mhvjl\" (UID: \"56dd4456-34c6-4f52-b565-04ae25674f99\") " pod="openshift-marketplace/community-operators-mhvjl" Oct 07 14:24:32 crc kubenswrapper[4854]: I1007 14:24:32.932627 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56dd4456-34c6-4f52-b565-04ae25674f99-catalog-content\") pod \"community-operators-mhvjl\" (UID: \"56dd4456-34c6-4f52-b565-04ae25674f99\") " pod="openshift-marketplace/community-operators-mhvjl" Oct 07 14:24:32 crc kubenswrapper[4854]: I1007 14:24:32.933014 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56dd4456-34c6-4f52-b565-04ae25674f99-utilities\") pod \"community-operators-mhvjl\" (UID: \"56dd4456-34c6-4f52-b565-04ae25674f99\") " pod="openshift-marketplace/community-operators-mhvjl" Oct 07 14:24:32 crc kubenswrapper[4854]: I1007 14:24:32.969949 4854 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-pwxlf\" (UniqueName: \"kubernetes.io/projected/56dd4456-34c6-4f52-b565-04ae25674f99-kube-api-access-pwxlf\") pod \"community-operators-mhvjl\" (UID: \"56dd4456-34c6-4f52-b565-04ae25674f99\") " pod="openshift-marketplace/community-operators-mhvjl" Oct 07 14:24:33 crc kubenswrapper[4854]: I1007 14:24:33.066936 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mhvjl" Oct 07 14:24:33 crc kubenswrapper[4854]: W1007 14:24:33.627449 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56dd4456_34c6_4f52_b565_04ae25674f99.slice/crio-80aec723e3b1c6afd044b358350cd1e5b6ffb7837037f8b8495d37496e582710 WatchSource:0}: Error finding container 80aec723e3b1c6afd044b358350cd1e5b6ffb7837037f8b8495d37496e582710: Status 404 returned error can't find the container with id 80aec723e3b1c6afd044b358350cd1e5b6ffb7837037f8b8495d37496e582710 Oct 07 14:24:33 crc kubenswrapper[4854]: I1007 14:24:33.635666 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mhvjl"] Oct 07 14:24:33 crc kubenswrapper[4854]: I1007 14:24:33.929353 4854 generic.go:334] "Generic (PLEG): container finished" podID="56dd4456-34c6-4f52-b565-04ae25674f99" containerID="32e9135aa8292d1ad0e27d7deb1f4d58e3d93de36931e119a66c045e44397605" exitCode=0 Oct 07 14:24:33 crc kubenswrapper[4854]: I1007 14:24:33.929435 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mhvjl" event={"ID":"56dd4456-34c6-4f52-b565-04ae25674f99","Type":"ContainerDied","Data":"32e9135aa8292d1ad0e27d7deb1f4d58e3d93de36931e119a66c045e44397605"} Oct 07 14:24:33 crc kubenswrapper[4854]: I1007 14:24:33.930907 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mhvjl" event={"ID":"56dd4456-34c6-4f52-b565-04ae25674f99","Type":"ContainerStarted","Data":"80aec723e3b1c6afd044b358350cd1e5b6ffb7837037f8b8495d37496e582710"} Oct 07 14:24:35 crc kubenswrapper[4854]: I1007 14:24:35.950934 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mhvjl" event={"ID":"56dd4456-34c6-4f52-b565-04ae25674f99","Type":"ContainerStarted","Data":"33701bb770e1968d566d31d372f2295d60e0636b194d99923642d46a5f03fc88"} Oct 07 14:24:36 crc kubenswrapper[4854]: I1007 14:24:36.977062 4854 generic.go:334] "Generic (PLEG): container finished" podID="56dd4456-34c6-4f52-b565-04ae25674f99" containerID="33701bb770e1968d566d31d372f2295d60e0636b194d99923642d46a5f03fc88" exitCode=0 Oct 07 14:24:36 crc kubenswrapper[4854]: I1007 14:24:36.977109 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mhvjl" event={"ID":"56dd4456-34c6-4f52-b565-04ae25674f99","Type":"ContainerDied","Data":"33701bb770e1968d566d31d372f2295d60e0636b194d99923642d46a5f03fc88"} Oct 07 14:24:37 crc kubenswrapper[4854]: I1007 14:24:37.992499 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mhvjl" event={"ID":"56dd4456-34c6-4f52-b565-04ae25674f99","Type":"ContainerStarted","Data":"2cd74ceabdbdbc8e3f288f3977ad3dc4fdada79d5d0a132133d11cef7f44f15b"} Oct 07 14:24:38 crc kubenswrapper[4854]: I1007 14:24:38.024053 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mhvjl" 
podStartSLOduration=2.555665865 podStartE2EDuration="6.024028982s" podCreationTimestamp="2025-10-07 14:24:32 +0000 UTC" firstStartedPulling="2025-10-07 14:24:33.932926432 +0000 UTC m=+7189.920758727" lastFinishedPulling="2025-10-07 14:24:37.401289579 +0000 UTC m=+7193.389121844" observedRunningTime="2025-10-07 14:24:38.01457742 +0000 UTC m=+7194.002409685" watchObservedRunningTime="2025-10-07 14:24:38.024028982 +0000 UTC m=+7194.011861247" Oct 07 14:24:40 crc kubenswrapper[4854]: I1007 14:24:40.808486 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:24:40 crc kubenswrapper[4854]: I1007 14:24:40.808866 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:24:43 crc kubenswrapper[4854]: I1007 14:24:43.068008 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mhvjl" Oct 07 14:24:43 crc kubenswrapper[4854]: I1007 14:24:43.068809 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mhvjl" Oct 07 14:24:43 crc kubenswrapper[4854]: I1007 14:24:43.172577 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mhvjl" Oct 07 14:24:44 crc kubenswrapper[4854]: I1007 14:24:44.112508 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mhvjl" Oct 07 14:24:44 crc kubenswrapper[4854]: I1007 14:24:44.174287 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mhvjl"] Oct 07 14:24:46 crc kubenswrapper[4854]: I1007 14:24:46.090880 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mhvjl" podUID="56dd4456-34c6-4f52-b565-04ae25674f99" containerName="registry-server" containerID="cri-o://2cd74ceabdbdbc8e3f288f3977ad3dc4fdada79d5d0a132133d11cef7f44f15b" gracePeriod=2 Oct 07 14:24:47 crc kubenswrapper[4854]: I1007 14:24:47.104171 4854 generic.go:334] "Generic (PLEG): container finished" podID="56dd4456-34c6-4f52-b565-04ae25674f99" containerID="2cd74ceabdbdbc8e3f288f3977ad3dc4fdada79d5d0a132133d11cef7f44f15b" exitCode=0 Oct 07 14:24:47 crc kubenswrapper[4854]: I1007 14:24:47.104282 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mhvjl" event={"ID":"56dd4456-34c6-4f52-b565-04ae25674f99","Type":"ContainerDied","Data":"2cd74ceabdbdbc8e3f288f3977ad3dc4fdada79d5d0a132133d11cef7f44f15b"} Oct 07 14:24:47 crc kubenswrapper[4854]: I1007 14:24:47.104512 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mhvjl" event={"ID":"56dd4456-34c6-4f52-b565-04ae25674f99","Type":"ContainerDied","Data":"80aec723e3b1c6afd044b358350cd1e5b6ffb7837037f8b8495d37496e582710"} Oct 07 14:24:47 crc kubenswrapper[4854]: I1007 14:24:47.104532 4854 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="80aec723e3b1c6afd044b358350cd1e5b6ffb7837037f8b8495d37496e582710" Oct 07 14:24:47 crc kubenswrapper[4854]: I1007 14:24:47.149437 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mhvjl" Oct 07 14:24:47 crc kubenswrapper[4854]: I1007 14:24:47.275851 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwxlf\" (UniqueName: \"kubernetes.io/projected/56dd4456-34c6-4f52-b565-04ae25674f99-kube-api-access-pwxlf\") pod \"56dd4456-34c6-4f52-b565-04ae25674f99\" (UID: \"56dd4456-34c6-4f52-b565-04ae25674f99\") " Oct 07 14:24:47 crc kubenswrapper[4854]: I1007 14:24:47.275998 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56dd4456-34c6-4f52-b565-04ae25674f99-utilities\") pod \"56dd4456-34c6-4f52-b565-04ae25674f99\" (UID: \"56dd4456-34c6-4f52-b565-04ae25674f99\") " Oct 07 14:24:47 crc kubenswrapper[4854]: I1007 14:24:47.276126 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56dd4456-34c6-4f52-b565-04ae25674f99-catalog-content\") pod \"56dd4456-34c6-4f52-b565-04ae25674f99\" (UID: \"56dd4456-34c6-4f52-b565-04ae25674f99\") " Oct 07 14:24:47 crc kubenswrapper[4854]: I1007 14:24:47.277170 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56dd4456-34c6-4f52-b565-04ae25674f99-utilities" (OuterVolumeSpecName: "utilities") pod "56dd4456-34c6-4f52-b565-04ae25674f99" (UID: "56dd4456-34c6-4f52-b565-04ae25674f99"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:24:47 crc kubenswrapper[4854]: I1007 14:24:47.284370 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56dd4456-34c6-4f52-b565-04ae25674f99-kube-api-access-pwxlf" (OuterVolumeSpecName: "kube-api-access-pwxlf") pod "56dd4456-34c6-4f52-b565-04ae25674f99" (UID: "56dd4456-34c6-4f52-b565-04ae25674f99"). InnerVolumeSpecName "kube-api-access-pwxlf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:24:47 crc kubenswrapper[4854]: I1007 14:24:47.326121 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56dd4456-34c6-4f52-b565-04ae25674f99-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "56dd4456-34c6-4f52-b565-04ae25674f99" (UID: "56dd4456-34c6-4f52-b565-04ae25674f99"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:24:47 crc kubenswrapper[4854]: I1007 14:24:47.378837 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56dd4456-34c6-4f52-b565-04ae25674f99-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 14:24:47 crc kubenswrapper[4854]: I1007 14:24:47.378885 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56dd4456-34c6-4f52-b565-04ae25674f99-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 14:24:47 crc kubenswrapper[4854]: I1007 14:24:47.378899 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwxlf\" (UniqueName: \"kubernetes.io/projected/56dd4456-34c6-4f52-b565-04ae25674f99-kube-api-access-pwxlf\") on node \"crc\" DevicePath \"\"" Oct 07 14:24:48 crc kubenswrapper[4854]: I1007 14:24:48.119506 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mhvjl" Oct 07 14:24:48 crc kubenswrapper[4854]: I1007 14:24:48.204946 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mhvjl"] Oct 07 14:24:48 crc kubenswrapper[4854]: I1007 14:24:48.217460 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mhvjl"] Oct 07 14:24:48 crc kubenswrapper[4854]: I1007 14:24:48.726536 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56dd4456-34c6-4f52-b565-04ae25674f99" path="/var/lib/kubelet/pods/56dd4456-34c6-4f52-b565-04ae25674f99/volumes" Oct 07 14:25:10 crc kubenswrapper[4854]: I1007 14:25:10.807578 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:25:10 crc kubenswrapper[4854]: I1007 14:25:10.808224 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:25:10 crc kubenswrapper[4854]: I1007 14:25:10.808284 4854 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" Oct 07 14:25:10 crc kubenswrapper[4854]: I1007 14:25:10.809253 4854 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fb5754d52b825a7168e1a2b62822d22b024d8163c0efe9cbc9c03c5805be855f"} pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 14:25:10 crc kubenswrapper[4854]: I1007 14:25:10.809329 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" containerID="cri-o://fb5754d52b825a7168e1a2b62822d22b024d8163c0efe9cbc9c03c5805be855f" gracePeriod=600 Oct 07 14:25:11 crc kubenswrapper[4854]: I1007 14:25:11.389815 4854 generic.go:334] "Generic (PLEG): container 
finished" podID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerID="fb5754d52b825a7168e1a2b62822d22b024d8163c0efe9cbc9c03c5805be855f" exitCode=0 Oct 07 14:25:11 crc kubenswrapper[4854]: I1007 14:25:11.390036 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" event={"ID":"40b8b82d-cfd5-41d7-8673-5774db092c85","Type":"ContainerDied","Data":"fb5754d52b825a7168e1a2b62822d22b024d8163c0efe9cbc9c03c5805be855f"} Oct 07 14:25:11 crc kubenswrapper[4854]: I1007 14:25:11.390231 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" event={"ID":"40b8b82d-cfd5-41d7-8673-5774db092c85","Type":"ContainerStarted","Data":"3678c6fd88eab7cb7d631e93558a19c5557cf9245aa72ce39e1a1bf4e1c6125a"} Oct 07 14:25:11 crc kubenswrapper[4854]: I1007 14:25:11.390252 4854 scope.go:117] "RemoveContainer" containerID="98256c42a54e94b9b889e5d466e2ae1d1852c7195bf7391d9da923423a42d1c7" Oct 07 14:26:27 crc kubenswrapper[4854]: I1007 14:26:27.222984 4854 generic.go:334] "Generic (PLEG): container finished" podID="8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa" containerID="dccf982ef92f02da96ce2220fb94ce5ca2eec30e78092f614dd7dd34f0d0a943" exitCode=0 Oct 07 14:26:27 crc kubenswrapper[4854]: I1007 14:26:27.223211 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-wsj8v" event={"ID":"8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa","Type":"ContainerDied","Data":"dccf982ef92f02da96ce2220fb94ce5ca2eec30e78092f614dd7dd34f0d0a943"} Oct 07 14:26:28 crc kubenswrapper[4854]: I1007 14:26:28.818551 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-wsj8v" Oct 07 14:26:28 crc kubenswrapper[4854]: I1007 14:26:28.931289 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa-bootstrap-combined-ca-bundle\") pod \"8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa\" (UID: \"8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa\") " Oct 07 14:26:28 crc kubenswrapper[4854]: I1007 14:26:28.931469 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa-ceph\") pod \"8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa\" (UID: \"8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa\") " Oct 07 14:26:28 crc kubenswrapper[4854]: I1007 14:26:28.931535 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa-ssh-key\") pod \"8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa\" (UID: \"8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa\") " Oct 07 14:26:28 crc kubenswrapper[4854]: I1007 14:26:28.931653 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa-inventory\") pod \"8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa\" (UID: \"8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa\") " Oct 07 14:26:28 crc kubenswrapper[4854]: I1007 14:26:28.931735 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wlxmt\" (UniqueName: \"kubernetes.io/projected/8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa-kube-api-access-wlxmt\") pod \"8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa\" (UID: \"8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa\") " Oct 07 14:26:28 
crc kubenswrapper[4854]: I1007 14:26:28.937992 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa-kube-api-access-wlxmt" (OuterVolumeSpecName: "kube-api-access-wlxmt") pod "8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa" (UID: "8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa"). InnerVolumeSpecName "kube-api-access-wlxmt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:26:28 crc kubenswrapper[4854]: I1007 14:26:28.939647 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa-ceph" (OuterVolumeSpecName: "ceph") pod "8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa" (UID: "8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:26:28 crc kubenswrapper[4854]: I1007 14:26:28.940476 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa" (UID: "8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:26:28 crc kubenswrapper[4854]: I1007 14:26:28.961331 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa" (UID: "8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:26:28 crc kubenswrapper[4854]: I1007 14:26:28.964683 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa-inventory" (OuterVolumeSpecName: "inventory") pod "8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa" (UID: "8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:26:29 crc kubenswrapper[4854]: I1007 14:26:29.035573 4854 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:26:29 crc kubenswrapper[4854]: I1007 14:26:29.035607 4854 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa-ceph\") on node \"crc\" DevicePath \"\"" Oct 07 14:26:29 crc kubenswrapper[4854]: I1007 14:26:29.035620 4854 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 14:26:29 crc kubenswrapper[4854]: I1007 14:26:29.035632 4854 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 14:26:29 crc kubenswrapper[4854]: I1007 14:26:29.035644 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wlxmt\" (UniqueName: \"kubernetes.io/projected/8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa-kube-api-access-wlxmt\") on node \"crc\" DevicePath \"\"" Oct 07 14:26:29 crc kubenswrapper[4854]: I1007 14:26:29.246300 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-wsj8v" event={"ID":"8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa","Type":"ContainerDied","Data":"c6eaffd3a715381af4deb309e76a12b9b328fdf46ca554e8c5ed0a1b9870bda2"} Oct 07 14:26:29 crc kubenswrapper[4854]: I1007 14:26:29.246338 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6eaffd3a715381af4deb309e76a12b9b328fdf46ca554e8c5ed0a1b9870bda2" Oct 07 14:26:29 crc kubenswrapper[4854]: I1007 14:26:29.246381 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-wsj8v" Oct 07 14:26:29 crc kubenswrapper[4854]: I1007 14:26:29.337732 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-7kss6"] Oct 07 14:26:29 crc kubenswrapper[4854]: E1007 14:26:29.338447 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56dd4456-34c6-4f52-b565-04ae25674f99" containerName="extract-utilities" Oct 07 14:26:29 crc kubenswrapper[4854]: I1007 14:26:29.338465 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="56dd4456-34c6-4f52-b565-04ae25674f99" containerName="extract-utilities" Oct 07 14:26:29 crc kubenswrapper[4854]: E1007 14:26:29.338487 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56dd4456-34c6-4f52-b565-04ae25674f99" containerName="registry-server" Oct 07 14:26:29 crc kubenswrapper[4854]: I1007 14:26:29.338494 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="56dd4456-34c6-4f52-b565-04ae25674f99" containerName="registry-server" Oct 07 14:26:29 crc kubenswrapper[4854]: E1007 14:26:29.338507 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa" containerName="bootstrap-openstack-openstack-cell1" Oct 07 14:26:29 crc kubenswrapper[4854]: I1007 14:26:29.338514 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa" containerName="bootstrap-openstack-openstack-cell1" Oct 07 14:26:29 crc kubenswrapper[4854]: E1007 14:26:29.338568 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56dd4456-34c6-4f52-b565-04ae25674f99" containerName="extract-content" Oct 07 14:26:29 crc kubenswrapper[4854]: I1007 14:26:29.338577 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="56dd4456-34c6-4f52-b565-04ae25674f99" containerName="extract-content" Oct 07 14:26:29 crc kubenswrapper[4854]: I1007 14:26:29.338803 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa" containerName="bootstrap-openstack-openstack-cell1" Oct 07 14:26:29 crc kubenswrapper[4854]: I1007 14:26:29.338819 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="56dd4456-34c6-4f52-b565-04ae25674f99" containerName="registry-server" Oct 07 14:26:29 crc kubenswrapper[4854]: I1007 14:26:29.339583 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-7kss6" Oct 07 14:26:29 crc kubenswrapper[4854]: I1007 14:26:29.346594 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 07 14:26:29 crc kubenswrapper[4854]: I1007 14:26:29.346975 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 07 14:26:29 crc kubenswrapper[4854]: I1007 14:26:29.347494 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-n7cf5" Oct 07 14:26:29 crc kubenswrapper[4854]: I1007 14:26:29.348409 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 14:26:29 crc kubenswrapper[4854]: I1007 14:26:29.356161 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-7kss6"] Oct 07 14:26:29 crc kubenswrapper[4854]: I1007 14:26:29.442601 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7ab6fdb3-f2ee-4c72-bc19-2a62eb51b14a-ceph\") pod \"download-cache-openstack-openstack-cell1-7kss6\" (UID: \"7ab6fdb3-f2ee-4c72-bc19-2a62eb51b14a\") " pod="openstack/download-cache-openstack-openstack-cell1-7kss6" Oct 07 14:26:29 crc kubenswrapper[4854]: I1007 14:26:29.442718 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7ab6fdb3-f2ee-4c72-bc19-2a62eb51b14a-ssh-key\") pod \"download-cache-openstack-openstack-cell1-7kss6\" (UID: \"7ab6fdb3-f2ee-4c72-bc19-2a62eb51b14a\") " pod="openstack/download-cache-openstack-openstack-cell1-7kss6" Oct 07 14:26:29 crc kubenswrapper[4854]: I1007 14:26:29.442757 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ab6fdb3-f2ee-4c72-bc19-2a62eb51b14a-inventory\") pod \"download-cache-openstack-openstack-cell1-7kss6\" (UID: \"7ab6fdb3-f2ee-4c72-bc19-2a62eb51b14a\") " pod="openstack/download-cache-openstack-openstack-cell1-7kss6" Oct 07 14:26:29 crc kubenswrapper[4854]: I1007 14:26:29.442984 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94zj2\" (UniqueName: \"kubernetes.io/projected/7ab6fdb3-f2ee-4c72-bc19-2a62eb51b14a-kube-api-access-94zj2\") pod \"download-cache-openstack-openstack-cell1-7kss6\" (UID: \"7ab6fdb3-f2ee-4c72-bc19-2a62eb51b14a\") " pod="openstack/download-cache-openstack-openstack-cell1-7kss6" Oct 07 14:26:29 crc kubenswrapper[4854]: I1007 14:26:29.545008 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7ab6fdb3-f2ee-4c72-bc19-2a62eb51b14a-ssh-key\") pod \"download-cache-openstack-openstack-cell1-7kss6\" (UID: \"7ab6fdb3-f2ee-4c72-bc19-2a62eb51b14a\") " pod="openstack/download-cache-openstack-openstack-cell1-7kss6" Oct 07 14:26:29 crc kubenswrapper[4854]: I1007 14:26:29.545070 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ab6fdb3-f2ee-4c72-bc19-2a62eb51b14a-inventory\") pod \"download-cache-openstack-openstack-cell1-7kss6\" (UID: \"7ab6fdb3-f2ee-4c72-bc19-2a62eb51b14a\") " pod="openstack/download-cache-openstack-openstack-cell1-7kss6" Oct 07 14:26:29 crc 
kubenswrapper[4854]: I1007 14:26:29.545126 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94zj2\" (UniqueName: \"kubernetes.io/projected/7ab6fdb3-f2ee-4c72-bc19-2a62eb51b14a-kube-api-access-94zj2\") pod \"download-cache-openstack-openstack-cell1-7kss6\" (UID: \"7ab6fdb3-f2ee-4c72-bc19-2a62eb51b14a\") " pod="openstack/download-cache-openstack-openstack-cell1-7kss6" Oct 07 14:26:29 crc kubenswrapper[4854]: I1007 14:26:29.545303 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7ab6fdb3-f2ee-4c72-bc19-2a62eb51b14a-ceph\") pod \"download-cache-openstack-openstack-cell1-7kss6\" (UID: \"7ab6fdb3-f2ee-4c72-bc19-2a62eb51b14a\") " pod="openstack/download-cache-openstack-openstack-cell1-7kss6" Oct 07 14:26:29 crc kubenswrapper[4854]: I1007 14:26:29.551024 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7ab6fdb3-f2ee-4c72-bc19-2a62eb51b14a-ssh-key\") pod \"download-cache-openstack-openstack-cell1-7kss6\" (UID: \"7ab6fdb3-f2ee-4c72-bc19-2a62eb51b14a\") " pod="openstack/download-cache-openstack-openstack-cell1-7kss6" Oct 07 14:26:29 crc kubenswrapper[4854]: I1007 14:26:29.553867 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ab6fdb3-f2ee-4c72-bc19-2a62eb51b14a-inventory\") pod \"download-cache-openstack-openstack-cell1-7kss6\" (UID: \"7ab6fdb3-f2ee-4c72-bc19-2a62eb51b14a\") " pod="openstack/download-cache-openstack-openstack-cell1-7kss6" Oct 07 14:26:29 crc kubenswrapper[4854]: I1007 14:26:29.554713 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7ab6fdb3-f2ee-4c72-bc19-2a62eb51b14a-ceph\") pod \"download-cache-openstack-openstack-cell1-7kss6\" (UID: \"7ab6fdb3-f2ee-4c72-bc19-2a62eb51b14a\") " pod="openstack/download-cache-openstack-openstack-cell1-7kss6" Oct 07 14:26:29 crc kubenswrapper[4854]: I1007 14:26:29.575861 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94zj2\" (UniqueName: \"kubernetes.io/projected/7ab6fdb3-f2ee-4c72-bc19-2a62eb51b14a-kube-api-access-94zj2\") pod \"download-cache-openstack-openstack-cell1-7kss6\" (UID: \"7ab6fdb3-f2ee-4c72-bc19-2a62eb51b14a\") " pod="openstack/download-cache-openstack-openstack-cell1-7kss6" Oct 07 14:26:29 crc kubenswrapper[4854]: I1007 14:26:29.663663 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-7kss6" Oct 07 14:26:30 crc kubenswrapper[4854]: I1007 14:26:30.240543 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-7kss6"] Oct 07 14:26:30 crc kubenswrapper[4854]: I1007 14:26:30.257016 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-7kss6" event={"ID":"7ab6fdb3-f2ee-4c72-bc19-2a62eb51b14a","Type":"ContainerStarted","Data":"647c8c36e82c78d237866b69c4d70913eab7febf6e92e54c4b0ad3dc5e65823d"} Oct 07 14:26:31 crc kubenswrapper[4854]: I1007 14:26:31.267230 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-7kss6" event={"ID":"7ab6fdb3-f2ee-4c72-bc19-2a62eb51b14a","Type":"ContainerStarted","Data":"f89554e98252c66fd68a318bb452d3681b9a61fc2aae79284ac79505433c3c0e"} Oct 07 14:26:31 crc kubenswrapper[4854]: I1007 14:26:31.295508 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-openstack-openstack-cell1-7kss6" podStartSLOduration=1.722336344 podStartE2EDuration="2.295492378s" podCreationTimestamp="2025-10-07 14:26:29 +0000 UTC" firstStartedPulling="2025-10-07 14:26:30.236771676 +0000 UTC m=+7306.224603941" lastFinishedPulling="2025-10-07 14:26:30.80992772 +0000 UTC m=+7306.797759975" observedRunningTime="2025-10-07 14:26:31.293289184 +0000 UTC m=+7307.281121439" watchObservedRunningTime="2025-10-07 14:26:31.295492378 +0000 UTC m=+7307.283324633" Oct 07 14:27:40 crc kubenswrapper[4854]: I1007 14:27:40.808007 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:27:40 crc kubenswrapper[4854]: I1007 14:27:40.808552 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:28:06 crc kubenswrapper[4854]: I1007 14:28:06.753763 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nt5nq"] Oct 07 14:28:06 crc kubenswrapper[4854]: I1007 14:28:06.757455 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nt5nq"] Oct 07 14:28:06 crc kubenswrapper[4854]: I1007 14:28:06.757583 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nt5nq" Oct 07 14:28:06 crc kubenswrapper[4854]: I1007 14:28:06.855553 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7bc30509-faa0-449b-94c4-12ce64a27c1f-catalog-content\") pod \"certified-operators-nt5nq\" (UID: \"7bc30509-faa0-449b-94c4-12ce64a27c1f\") " pod="openshift-marketplace/certified-operators-nt5nq" Oct 07 14:28:06 crc kubenswrapper[4854]: I1007 14:28:06.855640 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zw6h\" (UniqueName: \"kubernetes.io/projected/7bc30509-faa0-449b-94c4-12ce64a27c1f-kube-api-access-2zw6h\") pod \"certified-operators-nt5nq\" (UID: \"7bc30509-faa0-449b-94c4-12ce64a27c1f\") " pod="openshift-marketplace/certified-operators-nt5nq" Oct 07 14:28:06 crc kubenswrapper[4854]: I1007 14:28:06.855680 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7bc30509-faa0-449b-94c4-12ce64a27c1f-utilities\") pod \"certified-operators-nt5nq\" (UID: \"7bc30509-faa0-449b-94c4-12ce64a27c1f\") " pod="openshift-marketplace/certified-operators-nt5nq" Oct 07 14:28:06 crc kubenswrapper[4854]: I1007 14:28:06.956877 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7bc30509-faa0-449b-94c4-12ce64a27c1f-catalog-content\") pod \"certified-operators-nt5nq\" (UID: \"7bc30509-faa0-449b-94c4-12ce64a27c1f\") " pod="openshift-marketplace/certified-operators-nt5nq" Oct 07 14:28:06 crc kubenswrapper[4854]: I1007 14:28:06.956955 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zw6h\" (UniqueName: \"kubernetes.io/projected/7bc30509-faa0-449b-94c4-12ce64a27c1f-kube-api-access-2zw6h\") pod \"certified-operators-nt5nq\" (UID: \"7bc30509-faa0-449b-94c4-12ce64a27c1f\") " pod="openshift-marketplace/certified-operators-nt5nq" Oct 07 14:28:06 crc kubenswrapper[4854]: I1007 14:28:06.956992 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7bc30509-faa0-449b-94c4-12ce64a27c1f-utilities\") pod \"certified-operators-nt5nq\" (UID: \"7bc30509-faa0-449b-94c4-12ce64a27c1f\") " pod="openshift-marketplace/certified-operators-nt5nq" Oct 07 14:28:06 crc kubenswrapper[4854]: I1007 14:28:06.957494 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7bc30509-faa0-449b-94c4-12ce64a27c1f-utilities\") pod \"certified-operators-nt5nq\" (UID: \"7bc30509-faa0-449b-94c4-12ce64a27c1f\") " pod="openshift-marketplace/certified-operators-nt5nq" Oct 07 14:28:06 crc kubenswrapper[4854]: I1007 14:28:06.957495 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7bc30509-faa0-449b-94c4-12ce64a27c1f-catalog-content\") pod \"certified-operators-nt5nq\" (UID: \"7bc30509-faa0-449b-94c4-12ce64a27c1f\") " pod="openshift-marketplace/certified-operators-nt5nq" Oct 07 14:28:06 crc kubenswrapper[4854]: I1007 14:28:06.989195 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zw6h\" (UniqueName: \"kubernetes.io/projected/7bc30509-faa0-449b-94c4-12ce64a27c1f-kube-api-access-2zw6h\") pod 
\"certified-operators-nt5nq\" (UID: \"7bc30509-faa0-449b-94c4-12ce64a27c1f\") " pod="openshift-marketplace/certified-operators-nt5nq" Oct 07 14:28:07 crc kubenswrapper[4854]: I1007 14:28:07.088459 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nt5nq" Oct 07 14:28:07 crc kubenswrapper[4854]: I1007 14:28:07.608608 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nt5nq"] Oct 07 14:28:08 crc kubenswrapper[4854]: I1007 14:28:08.435796 4854 generic.go:334] "Generic (PLEG): container finished" podID="7ab6fdb3-f2ee-4c72-bc19-2a62eb51b14a" containerID="f89554e98252c66fd68a318bb452d3681b9a61fc2aae79284ac79505433c3c0e" exitCode=0 Oct 07 14:28:08 crc kubenswrapper[4854]: I1007 14:28:08.435899 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-7kss6" event={"ID":"7ab6fdb3-f2ee-4c72-bc19-2a62eb51b14a","Type":"ContainerDied","Data":"f89554e98252c66fd68a318bb452d3681b9a61fc2aae79284ac79505433c3c0e"} Oct 07 14:28:08 crc kubenswrapper[4854]: I1007 14:28:08.439759 4854 generic.go:334] "Generic (PLEG): container finished" podID="7bc30509-faa0-449b-94c4-12ce64a27c1f" containerID="248be0444375f135d382bc907f42fecbab0a4c0de7dce557f8193320367f6c30" exitCode=0 Oct 07 14:28:08 crc kubenswrapper[4854]: I1007 14:28:08.439842 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nt5nq" event={"ID":"7bc30509-faa0-449b-94c4-12ce64a27c1f","Type":"ContainerDied","Data":"248be0444375f135d382bc907f42fecbab0a4c0de7dce557f8193320367f6c30"} Oct 07 14:28:08 crc kubenswrapper[4854]: I1007 14:28:08.439923 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nt5nq" event={"ID":"7bc30509-faa0-449b-94c4-12ce64a27c1f","Type":"ContainerStarted","Data":"16f343d31c7713033635c791ca4b64a8f039f1984b67773ea2222022f9c8f37d"} Oct 07 14:28:09 crc kubenswrapper[4854]: I1007 14:28:09.453966 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nt5nq" event={"ID":"7bc30509-faa0-449b-94c4-12ce64a27c1f","Type":"ContainerStarted","Data":"4287bb427ec79d1c6f6922936b50919f35fb159cdad424b2895792b39e566fde"} Oct 07 14:28:09 crc kubenswrapper[4854]: I1007 14:28:09.972355 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-7kss6" Oct 07 14:28:10 crc kubenswrapper[4854]: I1007 14:28:10.022329 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ab6fdb3-f2ee-4c72-bc19-2a62eb51b14a-inventory\") pod \"7ab6fdb3-f2ee-4c72-bc19-2a62eb51b14a\" (UID: \"7ab6fdb3-f2ee-4c72-bc19-2a62eb51b14a\") " Oct 07 14:28:10 crc kubenswrapper[4854]: I1007 14:28:10.022425 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7ab6fdb3-f2ee-4c72-bc19-2a62eb51b14a-ceph\") pod \"7ab6fdb3-f2ee-4c72-bc19-2a62eb51b14a\" (UID: \"7ab6fdb3-f2ee-4c72-bc19-2a62eb51b14a\") " Oct 07 14:28:10 crc kubenswrapper[4854]: I1007 14:28:10.022489 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7ab6fdb3-f2ee-4c72-bc19-2a62eb51b14a-ssh-key\") pod \"7ab6fdb3-f2ee-4c72-bc19-2a62eb51b14a\" (UID: \"7ab6fdb3-f2ee-4c72-bc19-2a62eb51b14a\") " Oct 07 14:28:10 crc kubenswrapper[4854]: I1007 14:28:10.022602 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94zj2\" (UniqueName: \"kubernetes.io/projected/7ab6fdb3-f2ee-4c72-bc19-2a62eb51b14a-kube-api-access-94zj2\") pod \"7ab6fdb3-f2ee-4c72-bc19-2a62eb51b14a\" (UID: \"7ab6fdb3-f2ee-4c72-bc19-2a62eb51b14a\") " Oct 07 14:28:10 crc kubenswrapper[4854]: I1007 14:28:10.027604 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ab6fdb3-f2ee-4c72-bc19-2a62eb51b14a-ceph" (OuterVolumeSpecName: "ceph") pod "7ab6fdb3-f2ee-4c72-bc19-2a62eb51b14a" (UID: "7ab6fdb3-f2ee-4c72-bc19-2a62eb51b14a"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:28:10 crc kubenswrapper[4854]: I1007 14:28:10.028933 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ab6fdb3-f2ee-4c72-bc19-2a62eb51b14a-kube-api-access-94zj2" (OuterVolumeSpecName: "kube-api-access-94zj2") pod "7ab6fdb3-f2ee-4c72-bc19-2a62eb51b14a" (UID: "7ab6fdb3-f2ee-4c72-bc19-2a62eb51b14a"). InnerVolumeSpecName "kube-api-access-94zj2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:28:10 crc kubenswrapper[4854]: I1007 14:28:10.055339 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ab6fdb3-f2ee-4c72-bc19-2a62eb51b14a-inventory" (OuterVolumeSpecName: "inventory") pod "7ab6fdb3-f2ee-4c72-bc19-2a62eb51b14a" (UID: "7ab6fdb3-f2ee-4c72-bc19-2a62eb51b14a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:28:10 crc kubenswrapper[4854]: I1007 14:28:10.079428 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ab6fdb3-f2ee-4c72-bc19-2a62eb51b14a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7ab6fdb3-f2ee-4c72-bc19-2a62eb51b14a" (UID: "7ab6fdb3-f2ee-4c72-bc19-2a62eb51b14a"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:28:10 crc kubenswrapper[4854]: I1007 14:28:10.125249 4854 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ab6fdb3-f2ee-4c72-bc19-2a62eb51b14a-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 14:28:10 crc kubenswrapper[4854]: I1007 14:28:10.125293 4854 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7ab6fdb3-f2ee-4c72-bc19-2a62eb51b14a-ceph\") on node \"crc\" DevicePath \"\"" Oct 07 14:28:10 crc kubenswrapper[4854]: I1007 14:28:10.125306 4854 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7ab6fdb3-f2ee-4c72-bc19-2a62eb51b14a-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 14:28:10 crc kubenswrapper[4854]: I1007 14:28:10.125320 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94zj2\" (UniqueName: \"kubernetes.io/projected/7ab6fdb3-f2ee-4c72-bc19-2a62eb51b14a-kube-api-access-94zj2\") on node \"crc\" DevicePath \"\"" Oct 07 14:28:10 crc kubenswrapper[4854]: I1007 14:28:10.473890 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-7kss6" Oct 07 14:28:10 crc kubenswrapper[4854]: I1007 14:28:10.474663 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-7kss6" event={"ID":"7ab6fdb3-f2ee-4c72-bc19-2a62eb51b14a","Type":"ContainerDied","Data":"647c8c36e82c78d237866b69c4d70913eab7febf6e92e54c4b0ad3dc5e65823d"} Oct 07 14:28:10 crc kubenswrapper[4854]: I1007 14:28:10.474705 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="647c8c36e82c78d237866b69c4d70913eab7febf6e92e54c4b0ad3dc5e65823d" Oct 07 14:28:10 crc kubenswrapper[4854]: I1007 14:28:10.576962 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-2tv87"] Oct 07 14:28:10 crc kubenswrapper[4854]: E1007 14:28:10.577491 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ab6fdb3-f2ee-4c72-bc19-2a62eb51b14a" containerName="download-cache-openstack-openstack-cell1" Oct 07 14:28:10 crc kubenswrapper[4854]: I1007 14:28:10.577506 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ab6fdb3-f2ee-4c72-bc19-2a62eb51b14a" containerName="download-cache-openstack-openstack-cell1" Oct 07 14:28:10 crc kubenswrapper[4854]: I1007 14:28:10.577711 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ab6fdb3-f2ee-4c72-bc19-2a62eb51b14a" containerName="download-cache-openstack-openstack-cell1" Oct 07 14:28:10 crc kubenswrapper[4854]: I1007 14:28:10.578507 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-2tv87" Oct 07 14:28:10 crc kubenswrapper[4854]: I1007 14:28:10.586427 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-2tv87"] Oct 07 14:28:10 crc kubenswrapper[4854]: I1007 14:28:10.619799 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 07 14:28:10 crc kubenswrapper[4854]: I1007 14:28:10.620451 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-n7cf5" Oct 07 14:28:10 crc kubenswrapper[4854]: I1007 14:28:10.621399 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 14:28:10 crc kubenswrapper[4854]: I1007 14:28:10.621543 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 07 14:28:10 crc kubenswrapper[4854]: I1007 14:28:10.745262 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ead9f298-61ce-4835-b285-6df1dd26b9e5-ssh-key\") pod \"configure-network-openstack-openstack-cell1-2tv87\" (UID: \"ead9f298-61ce-4835-b285-6df1dd26b9e5\") " pod="openstack/configure-network-openstack-openstack-cell1-2tv87" Oct 07 14:28:10 crc kubenswrapper[4854]: I1007 14:28:10.745472 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jblz\" (UniqueName: \"kubernetes.io/projected/ead9f298-61ce-4835-b285-6df1dd26b9e5-kube-api-access-6jblz\") pod \"configure-network-openstack-openstack-cell1-2tv87\" (UID: \"ead9f298-61ce-4835-b285-6df1dd26b9e5\") " pod="openstack/configure-network-openstack-openstack-cell1-2tv87" Oct 07 14:28:10 crc kubenswrapper[4854]: I1007 14:28:10.745557 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ead9f298-61ce-4835-b285-6df1dd26b9e5-inventory\") pod \"configure-network-openstack-openstack-cell1-2tv87\" (UID: \"ead9f298-61ce-4835-b285-6df1dd26b9e5\") " pod="openstack/configure-network-openstack-openstack-cell1-2tv87" Oct 07 14:28:10 crc kubenswrapper[4854]: I1007 14:28:10.745657 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ead9f298-61ce-4835-b285-6df1dd26b9e5-ceph\") pod \"configure-network-openstack-openstack-cell1-2tv87\" (UID: \"ead9f298-61ce-4835-b285-6df1dd26b9e5\") " pod="openstack/configure-network-openstack-openstack-cell1-2tv87" Oct 07 14:28:10 crc kubenswrapper[4854]: I1007 14:28:10.807648 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:28:10 crc kubenswrapper[4854]: I1007 14:28:10.807946 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:28:10 crc kubenswrapper[4854]: I1007 14:28:10.848010 4854 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jblz\" (UniqueName: \"kubernetes.io/projected/ead9f298-61ce-4835-b285-6df1dd26b9e5-kube-api-access-6jblz\") pod \"configure-network-openstack-openstack-cell1-2tv87\" (UID: \"ead9f298-61ce-4835-b285-6df1dd26b9e5\") " pod="openstack/configure-network-openstack-openstack-cell1-2tv87" Oct 07 14:28:10 crc kubenswrapper[4854]: I1007 14:28:10.848327 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ead9f298-61ce-4835-b285-6df1dd26b9e5-inventory\") pod \"configure-network-openstack-openstack-cell1-2tv87\" (UID: \"ead9f298-61ce-4835-b285-6df1dd26b9e5\") " pod="openstack/configure-network-openstack-openstack-cell1-2tv87" Oct 07 14:28:10 crc kubenswrapper[4854]: I1007 14:28:10.848448 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ead9f298-61ce-4835-b285-6df1dd26b9e5-ceph\") pod \"configure-network-openstack-openstack-cell1-2tv87\" (UID: \"ead9f298-61ce-4835-b285-6df1dd26b9e5\") " pod="openstack/configure-network-openstack-openstack-cell1-2tv87" Oct 07 14:28:10 crc kubenswrapper[4854]: I1007 14:28:10.848641 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ead9f298-61ce-4835-b285-6df1dd26b9e5-ssh-key\") pod \"configure-network-openstack-openstack-cell1-2tv87\" (UID: \"ead9f298-61ce-4835-b285-6df1dd26b9e5\") " pod="openstack/configure-network-openstack-openstack-cell1-2tv87" Oct 07 14:28:10 crc kubenswrapper[4854]: I1007 14:28:10.855723 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ead9f298-61ce-4835-b285-6df1dd26b9e5-inventory\") pod \"configure-network-openstack-openstack-cell1-2tv87\" (UID: \"ead9f298-61ce-4835-b285-6df1dd26b9e5\") " pod="openstack/configure-network-openstack-openstack-cell1-2tv87" Oct 07 14:28:10 crc kubenswrapper[4854]: I1007 14:28:10.856342 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ead9f298-61ce-4835-b285-6df1dd26b9e5-ssh-key\") pod \"configure-network-openstack-openstack-cell1-2tv87\" (UID: \"ead9f298-61ce-4835-b285-6df1dd26b9e5\") " pod="openstack/configure-network-openstack-openstack-cell1-2tv87" Oct 07 14:28:10 crc kubenswrapper[4854]: I1007 14:28:10.856864 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ead9f298-61ce-4835-b285-6df1dd26b9e5-ceph\") pod \"configure-network-openstack-openstack-cell1-2tv87\" (UID: \"ead9f298-61ce-4835-b285-6df1dd26b9e5\") " pod="openstack/configure-network-openstack-openstack-cell1-2tv87" Oct 07 14:28:10 crc kubenswrapper[4854]: I1007 14:28:10.888049 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jblz\" (UniqueName: \"kubernetes.io/projected/ead9f298-61ce-4835-b285-6df1dd26b9e5-kube-api-access-6jblz\") pod \"configure-network-openstack-openstack-cell1-2tv87\" (UID: \"ead9f298-61ce-4835-b285-6df1dd26b9e5\") " pod="openstack/configure-network-openstack-openstack-cell1-2tv87" Oct 07 14:28:10 crc kubenswrapper[4854]: I1007 14:28:10.950790 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-2tv87" Oct 07 14:28:11 crc kubenswrapper[4854]: I1007 14:28:11.491529 4854 generic.go:334] "Generic (PLEG): container finished" podID="7bc30509-faa0-449b-94c4-12ce64a27c1f" containerID="4287bb427ec79d1c6f6922936b50919f35fb159cdad424b2895792b39e566fde" exitCode=0 Oct 07 14:28:11 crc kubenswrapper[4854]: I1007 14:28:11.491630 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nt5nq" event={"ID":"7bc30509-faa0-449b-94c4-12ce64a27c1f","Type":"ContainerDied","Data":"4287bb427ec79d1c6f6922936b50919f35fb159cdad424b2895792b39e566fde"} Oct 07 14:28:11 crc kubenswrapper[4854]: W1007 14:28:11.563166 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podead9f298_61ce_4835_b285_6df1dd26b9e5.slice/crio-0a334a53fe2d02c62bbb93058629f79dd4cb0ccdab5ba3ca2f043652c6a7479d WatchSource:0}: Error finding container 0a334a53fe2d02c62bbb93058629f79dd4cb0ccdab5ba3ca2f043652c6a7479d: Status 404 returned error can't find the container with id 0a334a53fe2d02c62bbb93058629f79dd4cb0ccdab5ba3ca2f043652c6a7479d Oct 07 14:28:11 crc kubenswrapper[4854]: I1007 14:28:11.576709 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-2tv87"] Oct 07 14:28:12 crc kubenswrapper[4854]: I1007 14:28:12.503775 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nt5nq" event={"ID":"7bc30509-faa0-449b-94c4-12ce64a27c1f","Type":"ContainerStarted","Data":"5475f94c46d9828a0ba5fa111c98c0a28b9bbba7126cdfd42047fa7567b60b0f"} Oct 07 14:28:12 crc kubenswrapper[4854]: I1007 14:28:12.506420 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-2tv87" event={"ID":"ead9f298-61ce-4835-b285-6df1dd26b9e5","Type":"ContainerStarted","Data":"6e4740f9a181b19182550413a825f7d364a1afd73bb3e9671af91762d0e4c0b7"} Oct 07 14:28:12 crc kubenswrapper[4854]: I1007 14:28:12.506454 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-2tv87" event={"ID":"ead9f298-61ce-4835-b285-6df1dd26b9e5","Type":"ContainerStarted","Data":"0a334a53fe2d02c62bbb93058629f79dd4cb0ccdab5ba3ca2f043652c6a7479d"} Oct 07 14:28:12 crc kubenswrapper[4854]: I1007 14:28:12.524084 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nt5nq" podStartSLOduration=2.838032043 podStartE2EDuration="6.524063656s" podCreationTimestamp="2025-10-07 14:28:06 +0000 UTC" firstStartedPulling="2025-10-07 14:28:08.44454109 +0000 UTC m=+7404.432373375" lastFinishedPulling="2025-10-07 14:28:12.130572703 +0000 UTC m=+7408.118404988" observedRunningTime="2025-10-07 14:28:12.521002958 +0000 UTC m=+7408.508835223" watchObservedRunningTime="2025-10-07 14:28:12.524063656 +0000 UTC m=+7408.511895921" Oct 07 14:28:12 crc kubenswrapper[4854]: I1007 14:28:12.567905 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-openstack-openstack-cell1-2tv87" podStartSLOduration=2.047252591 podStartE2EDuration="2.56788672s" podCreationTimestamp="2025-10-07 14:28:10 +0000 UTC" firstStartedPulling="2025-10-07 14:28:11.56610939 +0000 UTC m=+7407.553941635" lastFinishedPulling="2025-10-07 14:28:12.086743469 +0000 UTC m=+7408.074575764" observedRunningTime="2025-10-07 14:28:12.566233512 
+0000 UTC m=+7408.554065787" watchObservedRunningTime="2025-10-07 14:28:12.56788672 +0000 UTC m=+7408.555718975" Oct 07 14:28:17 crc kubenswrapper[4854]: I1007 14:28:17.088786 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nt5nq" Oct 07 14:28:17 crc kubenswrapper[4854]: I1007 14:28:17.090645 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nt5nq" Oct 07 14:28:17 crc kubenswrapper[4854]: I1007 14:28:17.168404 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nt5nq" Oct 07 14:28:17 crc kubenswrapper[4854]: I1007 14:28:17.638526 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nt5nq" Oct 07 14:28:17 crc kubenswrapper[4854]: I1007 14:28:17.694721 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nt5nq"] Oct 07 14:28:19 crc kubenswrapper[4854]: I1007 14:28:19.593103 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nt5nq" podUID="7bc30509-faa0-449b-94c4-12ce64a27c1f" containerName="registry-server" containerID="cri-o://5475f94c46d9828a0ba5fa111c98c0a28b9bbba7126cdfd42047fa7567b60b0f" gracePeriod=2 Oct 07 14:28:20 crc kubenswrapper[4854]: I1007 14:28:20.121627 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nt5nq" Oct 07 14:28:20 crc kubenswrapper[4854]: I1007 14:28:20.279327 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7bc30509-faa0-449b-94c4-12ce64a27c1f-utilities\") pod \"7bc30509-faa0-449b-94c4-12ce64a27c1f\" (UID: \"7bc30509-faa0-449b-94c4-12ce64a27c1f\") " Oct 07 14:28:20 crc kubenswrapper[4854]: I1007 14:28:20.279450 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zw6h\" (UniqueName: \"kubernetes.io/projected/7bc30509-faa0-449b-94c4-12ce64a27c1f-kube-api-access-2zw6h\") pod \"7bc30509-faa0-449b-94c4-12ce64a27c1f\" (UID: \"7bc30509-faa0-449b-94c4-12ce64a27c1f\") " Oct 07 14:28:20 crc kubenswrapper[4854]: I1007 14:28:20.279500 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7bc30509-faa0-449b-94c4-12ce64a27c1f-catalog-content\") pod \"7bc30509-faa0-449b-94c4-12ce64a27c1f\" (UID: \"7bc30509-faa0-449b-94c4-12ce64a27c1f\") " Oct 07 14:28:20 crc kubenswrapper[4854]: I1007 14:28:20.281343 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7bc30509-faa0-449b-94c4-12ce64a27c1f-utilities" (OuterVolumeSpecName: "utilities") pod "7bc30509-faa0-449b-94c4-12ce64a27c1f" (UID: "7bc30509-faa0-449b-94c4-12ce64a27c1f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:28:20 crc kubenswrapper[4854]: I1007 14:28:20.290373 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bc30509-faa0-449b-94c4-12ce64a27c1f-kube-api-access-2zw6h" (OuterVolumeSpecName: "kube-api-access-2zw6h") pod "7bc30509-faa0-449b-94c4-12ce64a27c1f" (UID: "7bc30509-faa0-449b-94c4-12ce64a27c1f"). InnerVolumeSpecName "kube-api-access-2zw6h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:28:20 crc kubenswrapper[4854]: I1007 14:28:20.337261 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7bc30509-faa0-449b-94c4-12ce64a27c1f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7bc30509-faa0-449b-94c4-12ce64a27c1f" (UID: "7bc30509-faa0-449b-94c4-12ce64a27c1f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:28:20 crc kubenswrapper[4854]: I1007 14:28:20.382503 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7bc30509-faa0-449b-94c4-12ce64a27c1f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 14:28:20 crc kubenswrapper[4854]: I1007 14:28:20.382533 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7bc30509-faa0-449b-94c4-12ce64a27c1f-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 14:28:20 crc kubenswrapper[4854]: I1007 14:28:20.382545 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2zw6h\" (UniqueName: \"kubernetes.io/projected/7bc30509-faa0-449b-94c4-12ce64a27c1f-kube-api-access-2zw6h\") on node \"crc\" DevicePath \"\"" Oct 07 14:28:20 crc kubenswrapper[4854]: I1007 14:28:20.609761 4854 generic.go:334] "Generic (PLEG): container finished" podID="7bc30509-faa0-449b-94c4-12ce64a27c1f" containerID="5475f94c46d9828a0ba5fa111c98c0a28b9bbba7126cdfd42047fa7567b60b0f" exitCode=0 Oct 07 14:28:20 crc kubenswrapper[4854]: I1007 14:28:20.609864 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nt5nq" event={"ID":"7bc30509-faa0-449b-94c4-12ce64a27c1f","Type":"ContainerDied","Data":"5475f94c46d9828a0ba5fa111c98c0a28b9bbba7126cdfd42047fa7567b60b0f"} Oct 07 14:28:20 crc kubenswrapper[4854]: I1007 14:28:20.609902 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nt5nq" Oct 07 14:28:20 crc kubenswrapper[4854]: I1007 14:28:20.610089 4854 scope.go:117] "RemoveContainer" containerID="5475f94c46d9828a0ba5fa111c98c0a28b9bbba7126cdfd42047fa7567b60b0f" Oct 07 14:28:20 crc kubenswrapper[4854]: I1007 14:28:20.610069 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nt5nq" event={"ID":"7bc30509-faa0-449b-94c4-12ce64a27c1f","Type":"ContainerDied","Data":"16f343d31c7713033635c791ca4b64a8f039f1984b67773ea2222022f9c8f37d"} Oct 07 14:28:20 crc kubenswrapper[4854]: I1007 14:28:20.664709 4854 scope.go:117] "RemoveContainer" containerID="4287bb427ec79d1c6f6922936b50919f35fb159cdad424b2895792b39e566fde" Oct 07 14:28:20 crc kubenswrapper[4854]: I1007 14:28:20.673993 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nt5nq"] Oct 07 14:28:20 crc kubenswrapper[4854]: I1007 14:28:20.685783 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nt5nq"] Oct 07 14:28:20 crc kubenswrapper[4854]: I1007 14:28:20.701493 4854 scope.go:117] "RemoveContainer" containerID="248be0444375f135d382bc907f42fecbab0a4c0de7dce557f8193320367f6c30" Oct 07 14:28:20 crc kubenswrapper[4854]: I1007 14:28:20.720430 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bc30509-faa0-449b-94c4-12ce64a27c1f" path="/var/lib/kubelet/pods/7bc30509-faa0-449b-94c4-12ce64a27c1f/volumes" Oct 07 14:28:20 crc kubenswrapper[4854]: I1007 14:28:20.759565 4854 scope.go:117] "RemoveContainer" containerID="5475f94c46d9828a0ba5fa111c98c0a28b9bbba7126cdfd42047fa7567b60b0f" Oct 07 14:28:20 crc kubenswrapper[4854]: E1007 14:28:20.760255 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5475f94c46d9828a0ba5fa111c98c0a28b9bbba7126cdfd42047fa7567b60b0f\": container with ID starting with 5475f94c46d9828a0ba5fa111c98c0a28b9bbba7126cdfd42047fa7567b60b0f not found: ID does not exist" containerID="5475f94c46d9828a0ba5fa111c98c0a28b9bbba7126cdfd42047fa7567b60b0f" Oct 07 14:28:20 crc kubenswrapper[4854]: I1007 14:28:20.760393 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5475f94c46d9828a0ba5fa111c98c0a28b9bbba7126cdfd42047fa7567b60b0f"} err="failed to get container status \"5475f94c46d9828a0ba5fa111c98c0a28b9bbba7126cdfd42047fa7567b60b0f\": rpc error: code = NotFound desc = could not find container \"5475f94c46d9828a0ba5fa111c98c0a28b9bbba7126cdfd42047fa7567b60b0f\": container with ID starting with 5475f94c46d9828a0ba5fa111c98c0a28b9bbba7126cdfd42047fa7567b60b0f not found: ID does not exist" Oct 07 14:28:20 crc kubenswrapper[4854]: I1007 14:28:20.760428 4854 scope.go:117] "RemoveContainer" containerID="4287bb427ec79d1c6f6922936b50919f35fb159cdad424b2895792b39e566fde" Oct 07 14:28:20 crc kubenswrapper[4854]: E1007 14:28:20.761137 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4287bb427ec79d1c6f6922936b50919f35fb159cdad424b2895792b39e566fde\": container with ID starting with 4287bb427ec79d1c6f6922936b50919f35fb159cdad424b2895792b39e566fde not found: ID does not exist" containerID="4287bb427ec79d1c6f6922936b50919f35fb159cdad424b2895792b39e566fde" Oct 07 14:28:20 crc kubenswrapper[4854]: I1007 14:28:20.761322 4854 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4287bb427ec79d1c6f6922936b50919f35fb159cdad424b2895792b39e566fde"} err="failed to get container status \"4287bb427ec79d1c6f6922936b50919f35fb159cdad424b2895792b39e566fde\": rpc error: code = NotFound desc = could not find container \"4287bb427ec79d1c6f6922936b50919f35fb159cdad424b2895792b39e566fde\": container with ID starting with 4287bb427ec79d1c6f6922936b50919f35fb159cdad424b2895792b39e566fde not found: ID does not exist" Oct 07 14:28:20 crc kubenswrapper[4854]: I1007 14:28:20.761359 4854 scope.go:117] "RemoveContainer" containerID="248be0444375f135d382bc907f42fecbab0a4c0de7dce557f8193320367f6c30" Oct 07 14:28:20 crc kubenswrapper[4854]: E1007 14:28:20.761743 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"248be0444375f135d382bc907f42fecbab0a4c0de7dce557f8193320367f6c30\": container with ID starting with 248be0444375f135d382bc907f42fecbab0a4c0de7dce557f8193320367f6c30 not found: ID does not exist" containerID="248be0444375f135d382bc907f42fecbab0a4c0de7dce557f8193320367f6c30" Oct 07 14:28:20 crc kubenswrapper[4854]: I1007 14:28:20.761781 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"248be0444375f135d382bc907f42fecbab0a4c0de7dce557f8193320367f6c30"} err="failed to get container status \"248be0444375f135d382bc907f42fecbab0a4c0de7dce557f8193320367f6c30\": rpc error: code = NotFound desc = could not find container \"248be0444375f135d382bc907f42fecbab0a4c0de7dce557f8193320367f6c30\": container with ID starting with 248be0444375f135d382bc907f42fecbab0a4c0de7dce557f8193320367f6c30 not found: ID does not exist" Oct 07 14:28:40 crc kubenswrapper[4854]: I1007 14:28:40.811668 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:28:40 crc kubenswrapper[4854]: I1007 14:28:40.812325 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:28:40 crc kubenswrapper[4854]: I1007 14:28:40.812386 4854 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" Oct 07 14:28:40 crc kubenswrapper[4854]: I1007 14:28:40.813357 4854 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3678c6fd88eab7cb7d631e93558a19c5557cf9245aa72ce39e1a1bf4e1c6125a"} pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 14:28:40 crc kubenswrapper[4854]: I1007 14:28:40.813460 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" containerID="cri-o://3678c6fd88eab7cb7d631e93558a19c5557cf9245aa72ce39e1a1bf4e1c6125a" gracePeriod=600 Oct 07 14:28:40 crc kubenswrapper[4854]: E1007 14:28:40.937360 4854 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:28:41 crc kubenswrapper[4854]: I1007 14:28:41.852845 4854 generic.go:334] "Generic (PLEG): container finished" podID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerID="3678c6fd88eab7cb7d631e93558a19c5557cf9245aa72ce39e1a1bf4e1c6125a" exitCode=0 Oct 07 14:28:41 crc kubenswrapper[4854]: I1007 14:28:41.852895 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" event={"ID":"40b8b82d-cfd5-41d7-8673-5774db092c85","Type":"ContainerDied","Data":"3678c6fd88eab7cb7d631e93558a19c5557cf9245aa72ce39e1a1bf4e1c6125a"} Oct 07 14:28:41 crc kubenswrapper[4854]: I1007 14:28:41.852978 4854 scope.go:117] "RemoveContainer" containerID="fb5754d52b825a7168e1a2b62822d22b024d8163c0efe9cbc9c03c5805be855f" Oct 07 14:28:41 crc kubenswrapper[4854]: I1007 14:28:41.854404 4854 scope.go:117] "RemoveContainer" containerID="3678c6fd88eab7cb7d631e93558a19c5557cf9245aa72ce39e1a1bf4e1c6125a" Oct 07 14:28:41 crc kubenswrapper[4854]: E1007 14:28:41.854831 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:28:53 crc kubenswrapper[4854]: I1007 14:28:53.702658 4854 scope.go:117] "RemoveContainer" containerID="3678c6fd88eab7cb7d631e93558a19c5557cf9245aa72ce39e1a1bf4e1c6125a" Oct 07 14:28:53 crc kubenswrapper[4854]: E1007 14:28:53.703376 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:29:06 crc kubenswrapper[4854]: I1007 14:29:06.704529 4854 scope.go:117] "RemoveContainer" containerID="3678c6fd88eab7cb7d631e93558a19c5557cf9245aa72ce39e1a1bf4e1c6125a" Oct 07 14:29:06 crc kubenswrapper[4854]: E1007 14:29:06.705834 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:29:21 crc kubenswrapper[4854]: I1007 14:29:21.703507 4854 scope.go:117] "RemoveContainer" containerID="3678c6fd88eab7cb7d631e93558a19c5557cf9245aa72ce39e1a1bf4e1c6125a" Oct 07 14:29:21 crc kubenswrapper[4854]: E1007 14:29:21.704657 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:29:31 crc kubenswrapper[4854]: I1007 14:29:31.551128 4854 generic.go:334] "Generic (PLEG): container finished" podID="ead9f298-61ce-4835-b285-6df1dd26b9e5" containerID="6e4740f9a181b19182550413a825f7d364a1afd73bb3e9671af91762d0e4c0b7" exitCode=0 Oct 07 14:29:31 crc kubenswrapper[4854]: I1007 14:29:31.551230 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-2tv87" event={"ID":"ead9f298-61ce-4835-b285-6df1dd26b9e5","Type":"ContainerDied","Data":"6e4740f9a181b19182550413a825f7d364a1afd73bb3e9671af91762d0e4c0b7"} Oct 07 14:29:32 crc kubenswrapper[4854]: I1007 14:29:32.705278 4854 scope.go:117] "RemoveContainer" containerID="3678c6fd88eab7cb7d631e93558a19c5557cf9245aa72ce39e1a1bf4e1c6125a" Oct 07 14:29:32 crc kubenswrapper[4854]: E1007 14:29:32.705864 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:29:33 crc kubenswrapper[4854]: I1007 14:29:33.131826 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-2tv87" Oct 07 14:29:33 crc kubenswrapper[4854]: I1007 14:29:33.204649 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ead9f298-61ce-4835-b285-6df1dd26b9e5-ceph\") pod \"ead9f298-61ce-4835-b285-6df1dd26b9e5\" (UID: \"ead9f298-61ce-4835-b285-6df1dd26b9e5\") " Oct 07 14:29:33 crc kubenswrapper[4854]: I1007 14:29:33.204842 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ead9f298-61ce-4835-b285-6df1dd26b9e5-inventory\") pod \"ead9f298-61ce-4835-b285-6df1dd26b9e5\" (UID: \"ead9f298-61ce-4835-b285-6df1dd26b9e5\") " Oct 07 14:29:33 crc kubenswrapper[4854]: I1007 14:29:33.204995 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ead9f298-61ce-4835-b285-6df1dd26b9e5-ssh-key\") pod \"ead9f298-61ce-4835-b285-6df1dd26b9e5\" (UID: \"ead9f298-61ce-4835-b285-6df1dd26b9e5\") " Oct 07 14:29:33 crc kubenswrapper[4854]: I1007 14:29:33.205173 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jblz\" (UniqueName: \"kubernetes.io/projected/ead9f298-61ce-4835-b285-6df1dd26b9e5-kube-api-access-6jblz\") pod \"ead9f298-61ce-4835-b285-6df1dd26b9e5\" (UID: \"ead9f298-61ce-4835-b285-6df1dd26b9e5\") " Oct 07 14:29:33 crc kubenswrapper[4854]: I1007 14:29:33.210433 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ead9f298-61ce-4835-b285-6df1dd26b9e5-kube-api-access-6jblz" (OuterVolumeSpecName: "kube-api-access-6jblz") pod "ead9f298-61ce-4835-b285-6df1dd26b9e5" (UID: "ead9f298-61ce-4835-b285-6df1dd26b9e5"). 
InnerVolumeSpecName "kube-api-access-6jblz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:29:33 crc kubenswrapper[4854]: I1007 14:29:33.218603 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ead9f298-61ce-4835-b285-6df1dd26b9e5-ceph" (OuterVolumeSpecName: "ceph") pod "ead9f298-61ce-4835-b285-6df1dd26b9e5" (UID: "ead9f298-61ce-4835-b285-6df1dd26b9e5"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:29:33 crc kubenswrapper[4854]: I1007 14:29:33.239289 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ead9f298-61ce-4835-b285-6df1dd26b9e5-inventory" (OuterVolumeSpecName: "inventory") pod "ead9f298-61ce-4835-b285-6df1dd26b9e5" (UID: "ead9f298-61ce-4835-b285-6df1dd26b9e5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:29:33 crc kubenswrapper[4854]: I1007 14:29:33.244815 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ead9f298-61ce-4835-b285-6df1dd26b9e5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ead9f298-61ce-4835-b285-6df1dd26b9e5" (UID: "ead9f298-61ce-4835-b285-6df1dd26b9e5"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:29:33 crc kubenswrapper[4854]: I1007 14:29:33.306673 4854 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ead9f298-61ce-4835-b285-6df1dd26b9e5-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 14:29:33 crc kubenswrapper[4854]: I1007 14:29:33.306701 4854 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ead9f298-61ce-4835-b285-6df1dd26b9e5-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 14:29:33 crc kubenswrapper[4854]: I1007 14:29:33.306711 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jblz\" (UniqueName: \"kubernetes.io/projected/ead9f298-61ce-4835-b285-6df1dd26b9e5-kube-api-access-6jblz\") on node \"crc\" DevicePath \"\"" Oct 07 14:29:33 crc kubenswrapper[4854]: I1007 14:29:33.306721 4854 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ead9f298-61ce-4835-b285-6df1dd26b9e5-ceph\") on node \"crc\" DevicePath \"\"" Oct 07 14:29:33 crc kubenswrapper[4854]: I1007 14:29:33.582584 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-2tv87" event={"ID":"ead9f298-61ce-4835-b285-6df1dd26b9e5","Type":"ContainerDied","Data":"0a334a53fe2d02c62bbb93058629f79dd4cb0ccdab5ba3ca2f043652c6a7479d"} Oct 07 14:29:33 crc kubenswrapper[4854]: I1007 14:29:33.582637 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a334a53fe2d02c62bbb93058629f79dd4cb0ccdab5ba3ca2f043652c6a7479d" Oct 07 14:29:33 crc kubenswrapper[4854]: I1007 14:29:33.582715 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-2tv87" Oct 07 14:29:33 crc kubenswrapper[4854]: I1007 14:29:33.682130 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-zxx2n"] Oct 07 14:29:33 crc kubenswrapper[4854]: E1007 14:29:33.682575 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bc30509-faa0-449b-94c4-12ce64a27c1f" containerName="extract-content" Oct 07 14:29:33 crc kubenswrapper[4854]: I1007 14:29:33.682591 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bc30509-faa0-449b-94c4-12ce64a27c1f" containerName="extract-content" Oct 07 14:29:33 crc kubenswrapper[4854]: E1007 14:29:33.682627 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ead9f298-61ce-4835-b285-6df1dd26b9e5" containerName="configure-network-openstack-openstack-cell1" Oct 07 14:29:33 crc kubenswrapper[4854]: I1007 14:29:33.682634 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="ead9f298-61ce-4835-b285-6df1dd26b9e5" containerName="configure-network-openstack-openstack-cell1" Oct 07 14:29:33 crc kubenswrapper[4854]: E1007 14:29:33.682649 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bc30509-faa0-449b-94c4-12ce64a27c1f" containerName="registry-server" Oct 07 14:29:33 crc kubenswrapper[4854]: I1007 14:29:33.682655 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bc30509-faa0-449b-94c4-12ce64a27c1f" containerName="registry-server" Oct 07 14:29:33 crc kubenswrapper[4854]: E1007 14:29:33.682672 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bc30509-faa0-449b-94c4-12ce64a27c1f" containerName="extract-utilities" Oct 07 14:29:33 crc kubenswrapper[4854]: I1007 14:29:33.682679 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bc30509-faa0-449b-94c4-12ce64a27c1f" containerName="extract-utilities" Oct 07 14:29:33 crc kubenswrapper[4854]: I1007 14:29:33.682862 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bc30509-faa0-449b-94c4-12ce64a27c1f" containerName="registry-server" Oct 07 14:29:33 crc kubenswrapper[4854]: I1007 14:29:33.682877 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="ead9f298-61ce-4835-b285-6df1dd26b9e5" containerName="configure-network-openstack-openstack-cell1" Oct 07 14:29:33 crc kubenswrapper[4854]: I1007 14:29:33.683559 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-zxx2n" Oct 07 14:29:33 crc kubenswrapper[4854]: I1007 14:29:33.689987 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 07 14:29:33 crc kubenswrapper[4854]: I1007 14:29:33.690082 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 07 14:29:33 crc kubenswrapper[4854]: I1007 14:29:33.690938 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 14:29:33 crc kubenswrapper[4854]: I1007 14:29:33.691036 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-n7cf5" Oct 07 14:29:33 crc kubenswrapper[4854]: I1007 14:29:33.719799 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-zxx2n"] Oct 07 14:29:33 crc kubenswrapper[4854]: I1007 14:29:33.738641 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b6281fe5-e710-49a7-88e0-07c925cffc8e-inventory\") pod \"validate-network-openstack-openstack-cell1-zxx2n\" (UID: \"b6281fe5-e710-49a7-88e0-07c925cffc8e\") " pod="openstack/validate-network-openstack-openstack-cell1-zxx2n" Oct 07 14:29:33 crc kubenswrapper[4854]: I1007 14:29:33.738780 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kch2h\" (UniqueName: \"kubernetes.io/projected/b6281fe5-e710-49a7-88e0-07c925cffc8e-kube-api-access-kch2h\") pod \"validate-network-openstack-openstack-cell1-zxx2n\" (UID: \"b6281fe5-e710-49a7-88e0-07c925cffc8e\") " pod="openstack/validate-network-openstack-openstack-cell1-zxx2n" Oct 07 14:29:33 crc kubenswrapper[4854]: I1007 14:29:33.739225 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b6281fe5-e710-49a7-88e0-07c925cffc8e-ssh-key\") pod \"validate-network-openstack-openstack-cell1-zxx2n\" (UID: \"b6281fe5-e710-49a7-88e0-07c925cffc8e\") " pod="openstack/validate-network-openstack-openstack-cell1-zxx2n" Oct 07 14:29:33 crc kubenswrapper[4854]: I1007 14:29:33.739619 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b6281fe5-e710-49a7-88e0-07c925cffc8e-ceph\") pod \"validate-network-openstack-openstack-cell1-zxx2n\" (UID: \"b6281fe5-e710-49a7-88e0-07c925cffc8e\") " pod="openstack/validate-network-openstack-openstack-cell1-zxx2n" Oct 07 14:29:33 crc kubenswrapper[4854]: I1007 14:29:33.840332 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b6281fe5-e710-49a7-88e0-07c925cffc8e-inventory\") pod \"validate-network-openstack-openstack-cell1-zxx2n\" (UID: \"b6281fe5-e710-49a7-88e0-07c925cffc8e\") " pod="openstack/validate-network-openstack-openstack-cell1-zxx2n" Oct 07 14:29:33 crc kubenswrapper[4854]: I1007 14:29:33.840390 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kch2h\" (UniqueName: \"kubernetes.io/projected/b6281fe5-e710-49a7-88e0-07c925cffc8e-kube-api-access-kch2h\") pod \"validate-network-openstack-openstack-cell1-zxx2n\" (UID: \"b6281fe5-e710-49a7-88e0-07c925cffc8e\") " 
pod="openstack/validate-network-openstack-openstack-cell1-zxx2n" Oct 07 14:29:33 crc kubenswrapper[4854]: I1007 14:29:33.840434 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b6281fe5-e710-49a7-88e0-07c925cffc8e-ssh-key\") pod \"validate-network-openstack-openstack-cell1-zxx2n\" (UID: \"b6281fe5-e710-49a7-88e0-07c925cffc8e\") " pod="openstack/validate-network-openstack-openstack-cell1-zxx2n" Oct 07 14:29:33 crc kubenswrapper[4854]: I1007 14:29:33.840500 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b6281fe5-e710-49a7-88e0-07c925cffc8e-ceph\") pod \"validate-network-openstack-openstack-cell1-zxx2n\" (UID: \"b6281fe5-e710-49a7-88e0-07c925cffc8e\") " pod="openstack/validate-network-openstack-openstack-cell1-zxx2n" Oct 07 14:29:33 crc kubenswrapper[4854]: I1007 14:29:33.845601 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b6281fe5-e710-49a7-88e0-07c925cffc8e-inventory\") pod \"validate-network-openstack-openstack-cell1-zxx2n\" (UID: \"b6281fe5-e710-49a7-88e0-07c925cffc8e\") " pod="openstack/validate-network-openstack-openstack-cell1-zxx2n" Oct 07 14:29:33 crc kubenswrapper[4854]: I1007 14:29:33.845774 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b6281fe5-e710-49a7-88e0-07c925cffc8e-ssh-key\") pod \"validate-network-openstack-openstack-cell1-zxx2n\" (UID: \"b6281fe5-e710-49a7-88e0-07c925cffc8e\") " pod="openstack/validate-network-openstack-openstack-cell1-zxx2n" Oct 07 14:29:33 crc kubenswrapper[4854]: I1007 14:29:33.847679 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b6281fe5-e710-49a7-88e0-07c925cffc8e-ceph\") pod \"validate-network-openstack-openstack-cell1-zxx2n\" (UID: \"b6281fe5-e710-49a7-88e0-07c925cffc8e\") " pod="openstack/validate-network-openstack-openstack-cell1-zxx2n" Oct 07 14:29:33 crc kubenswrapper[4854]: I1007 14:29:33.857985 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kch2h\" (UniqueName: \"kubernetes.io/projected/b6281fe5-e710-49a7-88e0-07c925cffc8e-kube-api-access-kch2h\") pod \"validate-network-openstack-openstack-cell1-zxx2n\" (UID: \"b6281fe5-e710-49a7-88e0-07c925cffc8e\") " pod="openstack/validate-network-openstack-openstack-cell1-zxx2n" Oct 07 14:29:34 crc kubenswrapper[4854]: I1007 14:29:34.020832 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-zxx2n" Oct 07 14:29:34 crc kubenswrapper[4854]: I1007 14:29:34.648869 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-zxx2n"] Oct 07 14:29:34 crc kubenswrapper[4854]: I1007 14:29:34.664894 4854 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 14:29:35 crc kubenswrapper[4854]: I1007 14:29:35.604631 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-zxx2n" event={"ID":"b6281fe5-e710-49a7-88e0-07c925cffc8e","Type":"ContainerStarted","Data":"15833655a6ea7db9a60a48b755c07e772b734622ce93fd98014872e2d1abf22f"} Oct 07 14:29:36 crc kubenswrapper[4854]: I1007 14:29:36.616195 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-zxx2n" event={"ID":"b6281fe5-e710-49a7-88e0-07c925cffc8e","Type":"ContainerStarted","Data":"2e0db2a3c2aac00e6c101d38a2af6464f6196cc2ab30ef9b1707eb7aea7af2b2"} Oct 07 14:29:36 crc kubenswrapper[4854]: I1007 14:29:36.646358 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-openstack-openstack-cell1-zxx2n" podStartSLOduration=2.941917047 podStartE2EDuration="3.646288633s" podCreationTimestamp="2025-10-07 14:29:33 +0000 UTC" firstStartedPulling="2025-10-07 14:29:34.664673197 +0000 UTC m=+7490.652505452" lastFinishedPulling="2025-10-07 14:29:35.369044763 +0000 UTC m=+7491.356877038" observedRunningTime="2025-10-07 14:29:36.637918782 +0000 UTC m=+7492.625751047" watchObservedRunningTime="2025-10-07 14:29:36.646288633 +0000 UTC m=+7492.634120928" Oct 07 14:29:40 crc kubenswrapper[4854]: I1007 14:29:40.665735 4854 generic.go:334] "Generic (PLEG): container finished" podID="b6281fe5-e710-49a7-88e0-07c925cffc8e" containerID="2e0db2a3c2aac00e6c101d38a2af6464f6196cc2ab30ef9b1707eb7aea7af2b2" exitCode=0 Oct 07 14:29:40 crc kubenswrapper[4854]: I1007 14:29:40.665851 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-zxx2n" event={"ID":"b6281fe5-e710-49a7-88e0-07c925cffc8e","Type":"ContainerDied","Data":"2e0db2a3c2aac00e6c101d38a2af6464f6196cc2ab30ef9b1707eb7aea7af2b2"} Oct 07 14:29:42 crc kubenswrapper[4854]: I1007 14:29:42.141274 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-zxx2n" Oct 07 14:29:42 crc kubenswrapper[4854]: I1007 14:29:42.245550 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b6281fe5-e710-49a7-88e0-07c925cffc8e-ceph\") pod \"b6281fe5-e710-49a7-88e0-07c925cffc8e\" (UID: \"b6281fe5-e710-49a7-88e0-07c925cffc8e\") " Oct 07 14:29:42 crc kubenswrapper[4854]: I1007 14:29:42.245714 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b6281fe5-e710-49a7-88e0-07c925cffc8e-ssh-key\") pod \"b6281fe5-e710-49a7-88e0-07c925cffc8e\" (UID: \"b6281fe5-e710-49a7-88e0-07c925cffc8e\") " Oct 07 14:29:42 crc kubenswrapper[4854]: I1007 14:29:42.246239 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kch2h\" (UniqueName: \"kubernetes.io/projected/b6281fe5-e710-49a7-88e0-07c925cffc8e-kube-api-access-kch2h\") pod \"b6281fe5-e710-49a7-88e0-07c925cffc8e\" (UID: \"b6281fe5-e710-49a7-88e0-07c925cffc8e\") " Oct 07 14:29:42 crc kubenswrapper[4854]: I1007 14:29:42.247281 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b6281fe5-e710-49a7-88e0-07c925cffc8e-inventory\") pod \"b6281fe5-e710-49a7-88e0-07c925cffc8e\" (UID: \"b6281fe5-e710-49a7-88e0-07c925cffc8e\") " Oct 07 14:29:42 crc kubenswrapper[4854]: I1007 14:29:42.254367 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6281fe5-e710-49a7-88e0-07c925cffc8e-ceph" (OuterVolumeSpecName: "ceph") pod "b6281fe5-e710-49a7-88e0-07c925cffc8e" (UID: "b6281fe5-e710-49a7-88e0-07c925cffc8e"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:29:42 crc kubenswrapper[4854]: I1007 14:29:42.258841 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6281fe5-e710-49a7-88e0-07c925cffc8e-kube-api-access-kch2h" (OuterVolumeSpecName: "kube-api-access-kch2h") pod "b6281fe5-e710-49a7-88e0-07c925cffc8e" (UID: "b6281fe5-e710-49a7-88e0-07c925cffc8e"). InnerVolumeSpecName "kube-api-access-kch2h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:29:42 crc kubenswrapper[4854]: I1007 14:29:42.284360 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6281fe5-e710-49a7-88e0-07c925cffc8e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b6281fe5-e710-49a7-88e0-07c925cffc8e" (UID: "b6281fe5-e710-49a7-88e0-07c925cffc8e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:29:42 crc kubenswrapper[4854]: I1007 14:29:42.286428 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6281fe5-e710-49a7-88e0-07c925cffc8e-inventory" (OuterVolumeSpecName: "inventory") pod "b6281fe5-e710-49a7-88e0-07c925cffc8e" (UID: "b6281fe5-e710-49a7-88e0-07c925cffc8e"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:29:42 crc kubenswrapper[4854]: I1007 14:29:42.351933 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kch2h\" (UniqueName: \"kubernetes.io/projected/b6281fe5-e710-49a7-88e0-07c925cffc8e-kube-api-access-kch2h\") on node \"crc\" DevicePath \"\"" Oct 07 14:29:42 crc kubenswrapper[4854]: I1007 14:29:42.351973 4854 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b6281fe5-e710-49a7-88e0-07c925cffc8e-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 14:29:42 crc kubenswrapper[4854]: I1007 14:29:42.351986 4854 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b6281fe5-e710-49a7-88e0-07c925cffc8e-ceph\") on node \"crc\" DevicePath \"\"" Oct 07 14:29:42 crc kubenswrapper[4854]: I1007 14:29:42.351997 4854 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b6281fe5-e710-49a7-88e0-07c925cffc8e-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 14:29:42 crc kubenswrapper[4854]: I1007 14:29:42.693281 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-zxx2n" event={"ID":"b6281fe5-e710-49a7-88e0-07c925cffc8e","Type":"ContainerDied","Data":"15833655a6ea7db9a60a48b755c07e772b734622ce93fd98014872e2d1abf22f"} Oct 07 14:29:42 crc kubenswrapper[4854]: I1007 14:29:42.693357 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15833655a6ea7db9a60a48b755c07e772b734622ce93fd98014872e2d1abf22f" Oct 07 14:29:42 crc kubenswrapper[4854]: I1007 14:29:42.693362 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-zxx2n" Oct 07 14:29:42 crc kubenswrapper[4854]: I1007 14:29:42.789097 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-openstack-openstack-cell1-hldrq"] Oct 07 14:29:42 crc kubenswrapper[4854]: E1007 14:29:42.789805 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6281fe5-e710-49a7-88e0-07c925cffc8e" containerName="validate-network-openstack-openstack-cell1" Oct 07 14:29:42 crc kubenswrapper[4854]: I1007 14:29:42.789833 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6281fe5-e710-49a7-88e0-07c925cffc8e" containerName="validate-network-openstack-openstack-cell1" Oct 07 14:29:42 crc kubenswrapper[4854]: I1007 14:29:42.790123 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6281fe5-e710-49a7-88e0-07c925cffc8e" containerName="validate-network-openstack-openstack-cell1" Oct 07 14:29:42 crc kubenswrapper[4854]: I1007 14:29:42.801246 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-hldrq" Oct 07 14:29:42 crc kubenswrapper[4854]: I1007 14:29:42.805680 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 07 14:29:42 crc kubenswrapper[4854]: I1007 14:29:42.805900 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 07 14:29:42 crc kubenswrapper[4854]: I1007 14:29:42.806037 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-n7cf5" Oct 07 14:29:42 crc kubenswrapper[4854]: I1007 14:29:42.806252 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 14:29:42 crc kubenswrapper[4854]: I1007 14:29:42.813918 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-hldrq"] Oct 07 14:29:42 crc kubenswrapper[4854]: I1007 14:29:42.978654 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khrvp\" (UniqueName: \"kubernetes.io/projected/67034203-e483-4499-a262-82cd715d1459-kube-api-access-khrvp\") pod \"install-os-openstack-openstack-cell1-hldrq\" (UID: \"67034203-e483-4499-a262-82cd715d1459\") " pod="openstack/install-os-openstack-openstack-cell1-hldrq" Oct 07 14:29:42 crc kubenswrapper[4854]: I1007 14:29:42.978916 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/67034203-e483-4499-a262-82cd715d1459-ceph\") pod \"install-os-openstack-openstack-cell1-hldrq\" (UID: \"67034203-e483-4499-a262-82cd715d1459\") " pod="openstack/install-os-openstack-openstack-cell1-hldrq" Oct 07 14:29:42 crc kubenswrapper[4854]: I1007 14:29:42.979001 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/67034203-e483-4499-a262-82cd715d1459-inventory\") pod \"install-os-openstack-openstack-cell1-hldrq\" (UID: \"67034203-e483-4499-a262-82cd715d1459\") " pod="openstack/install-os-openstack-openstack-cell1-hldrq" Oct 07 14:29:42 crc kubenswrapper[4854]: I1007 14:29:42.979069 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/67034203-e483-4499-a262-82cd715d1459-ssh-key\") pod \"install-os-openstack-openstack-cell1-hldrq\" (UID: \"67034203-e483-4499-a262-82cd715d1459\") " pod="openstack/install-os-openstack-openstack-cell1-hldrq" Oct 07 14:29:43 crc kubenswrapper[4854]: I1007 14:29:43.081000 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khrvp\" (UniqueName: \"kubernetes.io/projected/67034203-e483-4499-a262-82cd715d1459-kube-api-access-khrvp\") pod \"install-os-openstack-openstack-cell1-hldrq\" (UID: \"67034203-e483-4499-a262-82cd715d1459\") " pod="openstack/install-os-openstack-openstack-cell1-hldrq" Oct 07 14:29:43 crc kubenswrapper[4854]: I1007 14:29:43.082072 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/67034203-e483-4499-a262-82cd715d1459-ceph\") pod \"install-os-openstack-openstack-cell1-hldrq\" (UID: \"67034203-e483-4499-a262-82cd715d1459\") " pod="openstack/install-os-openstack-openstack-cell1-hldrq" Oct 07 14:29:43 crc kubenswrapper[4854]: I1007 14:29:43.082946 
4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/67034203-e483-4499-a262-82cd715d1459-inventory\") pod \"install-os-openstack-openstack-cell1-hldrq\" (UID: \"67034203-e483-4499-a262-82cd715d1459\") " pod="openstack/install-os-openstack-openstack-cell1-hldrq" Oct 07 14:29:43 crc kubenswrapper[4854]: I1007 14:29:43.083061 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/67034203-e483-4499-a262-82cd715d1459-ssh-key\") pod \"install-os-openstack-openstack-cell1-hldrq\" (UID: \"67034203-e483-4499-a262-82cd715d1459\") " pod="openstack/install-os-openstack-openstack-cell1-hldrq" Oct 07 14:29:43 crc kubenswrapper[4854]: I1007 14:29:43.088197 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/67034203-e483-4499-a262-82cd715d1459-ceph\") pod \"install-os-openstack-openstack-cell1-hldrq\" (UID: \"67034203-e483-4499-a262-82cd715d1459\") " pod="openstack/install-os-openstack-openstack-cell1-hldrq" Oct 07 14:29:43 crc kubenswrapper[4854]: I1007 14:29:43.090508 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/67034203-e483-4499-a262-82cd715d1459-ssh-key\") pod \"install-os-openstack-openstack-cell1-hldrq\" (UID: \"67034203-e483-4499-a262-82cd715d1459\") " pod="openstack/install-os-openstack-openstack-cell1-hldrq" Oct 07 14:29:43 crc kubenswrapper[4854]: I1007 14:29:43.095925 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/67034203-e483-4499-a262-82cd715d1459-inventory\") pod \"install-os-openstack-openstack-cell1-hldrq\" (UID: \"67034203-e483-4499-a262-82cd715d1459\") " pod="openstack/install-os-openstack-openstack-cell1-hldrq" Oct 07 14:29:43 crc kubenswrapper[4854]: I1007 14:29:43.115929 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khrvp\" (UniqueName: \"kubernetes.io/projected/67034203-e483-4499-a262-82cd715d1459-kube-api-access-khrvp\") pod \"install-os-openstack-openstack-cell1-hldrq\" (UID: \"67034203-e483-4499-a262-82cd715d1459\") " pod="openstack/install-os-openstack-openstack-cell1-hldrq" Oct 07 14:29:43 crc kubenswrapper[4854]: I1007 14:29:43.124292 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-hldrq" Oct 07 14:29:43 crc kubenswrapper[4854]: I1007 14:29:43.764263 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-hldrq"] Oct 07 14:29:44 crc kubenswrapper[4854]: I1007 14:29:44.726814 4854 scope.go:117] "RemoveContainer" containerID="3678c6fd88eab7cb7d631e93558a19c5557cf9245aa72ce39e1a1bf4e1c6125a" Oct 07 14:29:44 crc kubenswrapper[4854]: E1007 14:29:44.727542 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:29:44 crc kubenswrapper[4854]: I1007 14:29:44.738289 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-hldrq" event={"ID":"67034203-e483-4499-a262-82cd715d1459","Type":"ContainerStarted","Data":"5552216c45180a877d979f2cf176879c0a46e08b0c75715c8d142ff4c6017951"} Oct 07 14:29:45 crc kubenswrapper[4854]: I1007 14:29:45.744777 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-hldrq" event={"ID":"67034203-e483-4499-a262-82cd715d1459","Type":"ContainerStarted","Data":"d8cc175f1162950d14904e9b76a57657514ea8103bd9d9e0d8b641ce8e80773a"} Oct 07 14:29:45 crc kubenswrapper[4854]: I1007 14:29:45.774298 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-openstack-openstack-cell1-hldrq" podStartSLOduration=2.87419214 podStartE2EDuration="3.774262668s" podCreationTimestamp="2025-10-07 14:29:42 +0000 UTC" firstStartedPulling="2025-10-07 14:29:43.771943784 +0000 UTC m=+7499.759776039" lastFinishedPulling="2025-10-07 14:29:44.672014272 +0000 UTC m=+7500.659846567" observedRunningTime="2025-10-07 14:29:45.767662758 +0000 UTC m=+7501.755495053" watchObservedRunningTime="2025-10-07 14:29:45.774262668 +0000 UTC m=+7501.762094963" Oct 07 14:29:59 crc kubenswrapper[4854]: I1007 14:29:59.704015 4854 scope.go:117] "RemoveContainer" containerID="3678c6fd88eab7cb7d631e93558a19c5557cf9245aa72ce39e1a1bf4e1c6125a" Oct 07 14:29:59 crc kubenswrapper[4854]: E1007 14:29:59.705031 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:30:00 crc kubenswrapper[4854]: I1007 14:30:00.215469 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330790-cnh9t"] Oct 07 14:30:00 crc kubenswrapper[4854]: I1007 14:30:00.217984 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330790-cnh9t" Oct 07 14:30:00 crc kubenswrapper[4854]: I1007 14:30:00.220013 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 07 14:30:00 crc kubenswrapper[4854]: I1007 14:30:00.220497 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 07 14:30:00 crc kubenswrapper[4854]: I1007 14:30:00.229801 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330790-cnh9t"] Oct 07 14:30:00 crc kubenswrapper[4854]: I1007 14:30:00.396918 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9895z\" (UniqueName: \"kubernetes.io/projected/daac9c87-5b60-4b84-9328-86291ae2ce3c-kube-api-access-9895z\") pod \"collect-profiles-29330790-cnh9t\" (UID: \"daac9c87-5b60-4b84-9328-86291ae2ce3c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330790-cnh9t" Oct 07 14:30:00 crc kubenswrapper[4854]: I1007 14:30:00.397187 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/daac9c87-5b60-4b84-9328-86291ae2ce3c-config-volume\") pod \"collect-profiles-29330790-cnh9t\" (UID: \"daac9c87-5b60-4b84-9328-86291ae2ce3c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330790-cnh9t" Oct 07 14:30:00 crc kubenswrapper[4854]: I1007 14:30:00.397250 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/daac9c87-5b60-4b84-9328-86291ae2ce3c-secret-volume\") pod \"collect-profiles-29330790-cnh9t\" (UID: \"daac9c87-5b60-4b84-9328-86291ae2ce3c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330790-cnh9t" Oct 07 14:30:00 crc kubenswrapper[4854]: I1007 14:30:00.499634 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/daac9c87-5b60-4b84-9328-86291ae2ce3c-config-volume\") pod \"collect-profiles-29330790-cnh9t\" (UID: \"daac9c87-5b60-4b84-9328-86291ae2ce3c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330790-cnh9t" Oct 07 14:30:00 crc kubenswrapper[4854]: I1007 14:30:00.499745 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/daac9c87-5b60-4b84-9328-86291ae2ce3c-secret-volume\") pod \"collect-profiles-29330790-cnh9t\" (UID: \"daac9c87-5b60-4b84-9328-86291ae2ce3c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330790-cnh9t" Oct 07 14:30:00 crc kubenswrapper[4854]: I1007 14:30:00.499819 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9895z\" (UniqueName: \"kubernetes.io/projected/daac9c87-5b60-4b84-9328-86291ae2ce3c-kube-api-access-9895z\") pod \"collect-profiles-29330790-cnh9t\" (UID: \"daac9c87-5b60-4b84-9328-86291ae2ce3c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330790-cnh9t" Oct 07 14:30:00 crc kubenswrapper[4854]: I1007 14:30:00.500501 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/daac9c87-5b60-4b84-9328-86291ae2ce3c-config-volume\") pod 
\"collect-profiles-29330790-cnh9t\" (UID: \"daac9c87-5b60-4b84-9328-86291ae2ce3c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330790-cnh9t" Oct 07 14:30:00 crc kubenswrapper[4854]: I1007 14:30:00.509080 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/daac9c87-5b60-4b84-9328-86291ae2ce3c-secret-volume\") pod \"collect-profiles-29330790-cnh9t\" (UID: \"daac9c87-5b60-4b84-9328-86291ae2ce3c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330790-cnh9t" Oct 07 14:30:00 crc kubenswrapper[4854]: I1007 14:30:00.527011 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9895z\" (UniqueName: \"kubernetes.io/projected/daac9c87-5b60-4b84-9328-86291ae2ce3c-kube-api-access-9895z\") pod \"collect-profiles-29330790-cnh9t\" (UID: \"daac9c87-5b60-4b84-9328-86291ae2ce3c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330790-cnh9t" Oct 07 14:30:00 crc kubenswrapper[4854]: I1007 14:30:00.544431 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330790-cnh9t" Oct 07 14:30:00 crc kubenswrapper[4854]: I1007 14:30:00.974646 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330790-cnh9t"] Oct 07 14:30:01 crc kubenswrapper[4854]: I1007 14:30:01.931174 4854 generic.go:334] "Generic (PLEG): container finished" podID="daac9c87-5b60-4b84-9328-86291ae2ce3c" containerID="52cc3c9d1bcc4b7422e291044ef27c5f029123fabb04f2236ecc6b70f5dbc74d" exitCode=0 Oct 07 14:30:01 crc kubenswrapper[4854]: I1007 14:30:01.931264 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330790-cnh9t" event={"ID":"daac9c87-5b60-4b84-9328-86291ae2ce3c","Type":"ContainerDied","Data":"52cc3c9d1bcc4b7422e291044ef27c5f029123fabb04f2236ecc6b70f5dbc74d"} Oct 07 14:30:01 crc kubenswrapper[4854]: I1007 14:30:01.931810 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330790-cnh9t" event={"ID":"daac9c87-5b60-4b84-9328-86291ae2ce3c","Type":"ContainerStarted","Data":"deeb8ae471c80aeb79c5b23c1bc20c7177b3bd99121f5fb525709f6cb87cb33e"} Oct 07 14:30:03 crc kubenswrapper[4854]: I1007 14:30:03.364706 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330790-cnh9t" Oct 07 14:30:03 crc kubenswrapper[4854]: I1007 14:30:03.470264 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/daac9c87-5b60-4b84-9328-86291ae2ce3c-secret-volume\") pod \"daac9c87-5b60-4b84-9328-86291ae2ce3c\" (UID: \"daac9c87-5b60-4b84-9328-86291ae2ce3c\") " Oct 07 14:30:03 crc kubenswrapper[4854]: I1007 14:30:03.470626 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9895z\" (UniqueName: \"kubernetes.io/projected/daac9c87-5b60-4b84-9328-86291ae2ce3c-kube-api-access-9895z\") pod \"daac9c87-5b60-4b84-9328-86291ae2ce3c\" (UID: \"daac9c87-5b60-4b84-9328-86291ae2ce3c\") " Oct 07 14:30:03 crc kubenswrapper[4854]: I1007 14:30:03.470727 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/daac9c87-5b60-4b84-9328-86291ae2ce3c-config-volume\") pod \"daac9c87-5b60-4b84-9328-86291ae2ce3c\" (UID: \"daac9c87-5b60-4b84-9328-86291ae2ce3c\") " Oct 07 14:30:03 crc kubenswrapper[4854]: I1007 14:30:03.472038 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/daac9c87-5b60-4b84-9328-86291ae2ce3c-config-volume" (OuterVolumeSpecName: "config-volume") pod "daac9c87-5b60-4b84-9328-86291ae2ce3c" (UID: "daac9c87-5b60-4b84-9328-86291ae2ce3c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:30:03 crc kubenswrapper[4854]: I1007 14:30:03.475670 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/daac9c87-5b60-4b84-9328-86291ae2ce3c-kube-api-access-9895z" (OuterVolumeSpecName: "kube-api-access-9895z") pod "daac9c87-5b60-4b84-9328-86291ae2ce3c" (UID: "daac9c87-5b60-4b84-9328-86291ae2ce3c"). InnerVolumeSpecName "kube-api-access-9895z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:30:03 crc kubenswrapper[4854]: I1007 14:30:03.475811 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/daac9c87-5b60-4b84-9328-86291ae2ce3c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "daac9c87-5b60-4b84-9328-86291ae2ce3c" (UID: "daac9c87-5b60-4b84-9328-86291ae2ce3c"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:30:03 crc kubenswrapper[4854]: I1007 14:30:03.573520 4854 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/daac9c87-5b60-4b84-9328-86291ae2ce3c-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 07 14:30:03 crc kubenswrapper[4854]: I1007 14:30:03.573556 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9895z\" (UniqueName: \"kubernetes.io/projected/daac9c87-5b60-4b84-9328-86291ae2ce3c-kube-api-access-9895z\") on node \"crc\" DevicePath \"\"" Oct 07 14:30:03 crc kubenswrapper[4854]: I1007 14:30:03.573567 4854 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/daac9c87-5b60-4b84-9328-86291ae2ce3c-config-volume\") on node \"crc\" DevicePath \"\"" Oct 07 14:30:03 crc kubenswrapper[4854]: I1007 14:30:03.959345 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330790-cnh9t" event={"ID":"daac9c87-5b60-4b84-9328-86291ae2ce3c","Type":"ContainerDied","Data":"deeb8ae471c80aeb79c5b23c1bc20c7177b3bd99121f5fb525709f6cb87cb33e"} Oct 07 14:30:03 crc kubenswrapper[4854]: I1007 14:30:03.959412 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="deeb8ae471c80aeb79c5b23c1bc20c7177b3bd99121f5fb525709f6cb87cb33e" Oct 07 14:30:03 crc kubenswrapper[4854]: I1007 14:30:03.959411 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330790-cnh9t" Oct 07 14:30:04 crc kubenswrapper[4854]: I1007 14:30:04.469652 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330745-prhbf"] Oct 07 14:30:04 crc kubenswrapper[4854]: I1007 14:30:04.483928 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330745-prhbf"] Oct 07 14:30:04 crc kubenswrapper[4854]: I1007 14:30:04.720351 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="512d8fce-7696-41dc-9600-f1b94e06db5c" path="/var/lib/kubelet/pods/512d8fce-7696-41dc-9600-f1b94e06db5c/volumes" Oct 07 14:30:12 crc kubenswrapper[4854]: I1007 14:30:12.702417 4854 scope.go:117] "RemoveContainer" containerID="3678c6fd88eab7cb7d631e93558a19c5557cf9245aa72ce39e1a1bf4e1c6125a" Oct 07 14:30:12 crc kubenswrapper[4854]: E1007 14:30:12.703206 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:30:24 crc kubenswrapper[4854]: I1007 14:30:24.708996 4854 scope.go:117] "RemoveContainer" containerID="3678c6fd88eab7cb7d631e93558a19c5557cf9245aa72ce39e1a1bf4e1c6125a" Oct 07 14:30:24 crc kubenswrapper[4854]: E1007 14:30:24.709928 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:30:28 crc kubenswrapper[4854]: I1007 14:30:28.132068 4854 scope.go:117] "RemoveContainer" containerID="65cc0d6dce84e751b028673ddfc3e069ced4950816cf0ca0cb5b49a18da32ef7" Oct 07 14:30:31 crc kubenswrapper[4854]: I1007 14:30:31.325885 4854 generic.go:334] "Generic (PLEG): container finished" podID="67034203-e483-4499-a262-82cd715d1459" containerID="d8cc175f1162950d14904e9b76a57657514ea8103bd9d9e0d8b641ce8e80773a" exitCode=0 Oct 07 14:30:31 crc kubenswrapper[4854]: I1007 14:30:31.325992 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-hldrq" event={"ID":"67034203-e483-4499-a262-82cd715d1459","Type":"ContainerDied","Data":"d8cc175f1162950d14904e9b76a57657514ea8103bd9d9e0d8b641ce8e80773a"} Oct 07 14:30:33 crc kubenswrapper[4854]: I1007 14:30:32.848354 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-hldrq" Oct 07 14:30:33 crc kubenswrapper[4854]: I1007 14:30:32.926031 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/67034203-e483-4499-a262-82cd715d1459-ceph\") pod \"67034203-e483-4499-a262-82cd715d1459\" (UID: \"67034203-e483-4499-a262-82cd715d1459\") " Oct 07 14:30:33 crc kubenswrapper[4854]: I1007 14:30:32.926205 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/67034203-e483-4499-a262-82cd715d1459-ssh-key\") pod \"67034203-e483-4499-a262-82cd715d1459\" (UID: \"67034203-e483-4499-a262-82cd715d1459\") " Oct 07 14:30:33 crc kubenswrapper[4854]: I1007 14:30:32.926250 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khrvp\" (UniqueName: \"kubernetes.io/projected/67034203-e483-4499-a262-82cd715d1459-kube-api-access-khrvp\") pod \"67034203-e483-4499-a262-82cd715d1459\" (UID: \"67034203-e483-4499-a262-82cd715d1459\") " Oct 07 14:30:33 crc kubenswrapper[4854]: I1007 14:30:32.926302 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/67034203-e483-4499-a262-82cd715d1459-inventory\") pod \"67034203-e483-4499-a262-82cd715d1459\" (UID: \"67034203-e483-4499-a262-82cd715d1459\") " Oct 07 14:30:33 crc kubenswrapper[4854]: I1007 14:30:32.931442 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67034203-e483-4499-a262-82cd715d1459-ceph" (OuterVolumeSpecName: "ceph") pod "67034203-e483-4499-a262-82cd715d1459" (UID: "67034203-e483-4499-a262-82cd715d1459"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:30:33 crc kubenswrapper[4854]: I1007 14:30:32.933881 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67034203-e483-4499-a262-82cd715d1459-kube-api-access-khrvp" (OuterVolumeSpecName: "kube-api-access-khrvp") pod "67034203-e483-4499-a262-82cd715d1459" (UID: "67034203-e483-4499-a262-82cd715d1459"). InnerVolumeSpecName "kube-api-access-khrvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:30:33 crc kubenswrapper[4854]: I1007 14:30:32.958475 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67034203-e483-4499-a262-82cd715d1459-inventory" (OuterVolumeSpecName: "inventory") pod "67034203-e483-4499-a262-82cd715d1459" (UID: "67034203-e483-4499-a262-82cd715d1459"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:30:33 crc kubenswrapper[4854]: I1007 14:30:32.967275 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67034203-e483-4499-a262-82cd715d1459-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "67034203-e483-4499-a262-82cd715d1459" (UID: "67034203-e483-4499-a262-82cd715d1459"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:30:33 crc kubenswrapper[4854]: I1007 14:30:33.028019 4854 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/67034203-e483-4499-a262-82cd715d1459-ceph\") on node \"crc\" DevicePath \"\"" Oct 07 14:30:33 crc kubenswrapper[4854]: I1007 14:30:33.028045 4854 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/67034203-e483-4499-a262-82cd715d1459-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 14:30:33 crc kubenswrapper[4854]: I1007 14:30:33.028056 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khrvp\" (UniqueName: \"kubernetes.io/projected/67034203-e483-4499-a262-82cd715d1459-kube-api-access-khrvp\") on node \"crc\" DevicePath \"\"" Oct 07 14:30:33 crc kubenswrapper[4854]: I1007 14:30:33.028066 4854 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/67034203-e483-4499-a262-82cd715d1459-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 14:30:33 crc kubenswrapper[4854]: I1007 14:30:33.356644 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-hldrq" event={"ID":"67034203-e483-4499-a262-82cd715d1459","Type":"ContainerDied","Data":"5552216c45180a877d979f2cf176879c0a46e08b0c75715c8d142ff4c6017951"} Oct 07 14:30:33 crc kubenswrapper[4854]: I1007 14:30:33.357393 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5552216c45180a877d979f2cf176879c0a46e08b0c75715c8d142ff4c6017951" Oct 07 14:30:33 crc kubenswrapper[4854]: I1007 14:30:33.356960 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-hldrq" Oct 07 14:30:33 crc kubenswrapper[4854]: I1007 14:30:33.453108 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-w4qv8"] Oct 07 14:30:33 crc kubenswrapper[4854]: E1007 14:30:33.453711 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67034203-e483-4499-a262-82cd715d1459" containerName="install-os-openstack-openstack-cell1" Oct 07 14:30:33 crc kubenswrapper[4854]: I1007 14:30:33.453740 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="67034203-e483-4499-a262-82cd715d1459" containerName="install-os-openstack-openstack-cell1" Oct 07 14:30:33 crc kubenswrapper[4854]: E1007 14:30:33.453774 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="daac9c87-5b60-4b84-9328-86291ae2ce3c" containerName="collect-profiles" Oct 07 14:30:33 crc kubenswrapper[4854]: I1007 14:30:33.453783 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="daac9c87-5b60-4b84-9328-86291ae2ce3c" containerName="collect-profiles" Oct 07 14:30:33 crc kubenswrapper[4854]: I1007 14:30:33.454055 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="daac9c87-5b60-4b84-9328-86291ae2ce3c" containerName="collect-profiles" Oct 07 14:30:33 crc kubenswrapper[4854]: I1007 14:30:33.454094 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="67034203-e483-4499-a262-82cd715d1459" containerName="install-os-openstack-openstack-cell1" Oct 07 14:30:33 crc kubenswrapper[4854]: I1007 14:30:33.455007 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-w4qv8" Oct 07 14:30:33 crc kubenswrapper[4854]: I1007 14:30:33.457169 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 07 14:30:33 crc kubenswrapper[4854]: I1007 14:30:33.457297 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 07 14:30:33 crc kubenswrapper[4854]: I1007 14:30:33.457560 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-n7cf5" Oct 07 14:30:33 crc kubenswrapper[4854]: I1007 14:30:33.457942 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 14:30:33 crc kubenswrapper[4854]: I1007 14:30:33.465220 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-w4qv8"] Oct 07 14:30:33 crc kubenswrapper[4854]: I1007 14:30:33.537051 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/702bf0e1-c6e1-4718-8520-bf22b1aa913f-inventory\") pod \"configure-os-openstack-openstack-cell1-w4qv8\" (UID: \"702bf0e1-c6e1-4718-8520-bf22b1aa913f\") " pod="openstack/configure-os-openstack-openstack-cell1-w4qv8" Oct 07 14:30:33 crc kubenswrapper[4854]: I1007 14:30:33.537199 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/702bf0e1-c6e1-4718-8520-bf22b1aa913f-ssh-key\") pod \"configure-os-openstack-openstack-cell1-w4qv8\" (UID: \"702bf0e1-c6e1-4718-8520-bf22b1aa913f\") " pod="openstack/configure-os-openstack-openstack-cell1-w4qv8" Oct 07 14:30:33 crc kubenswrapper[4854]: I1007 14:30:33.537371 4854 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qj95\" (UniqueName: \"kubernetes.io/projected/702bf0e1-c6e1-4718-8520-bf22b1aa913f-kube-api-access-7qj95\") pod \"configure-os-openstack-openstack-cell1-w4qv8\" (UID: \"702bf0e1-c6e1-4718-8520-bf22b1aa913f\") " pod="openstack/configure-os-openstack-openstack-cell1-w4qv8" Oct 07 14:30:33 crc kubenswrapper[4854]: I1007 14:30:33.537670 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/702bf0e1-c6e1-4718-8520-bf22b1aa913f-ceph\") pod \"configure-os-openstack-openstack-cell1-w4qv8\" (UID: \"702bf0e1-c6e1-4718-8520-bf22b1aa913f\") " pod="openstack/configure-os-openstack-openstack-cell1-w4qv8" Oct 07 14:30:33 crc kubenswrapper[4854]: I1007 14:30:33.640466 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/702bf0e1-c6e1-4718-8520-bf22b1aa913f-inventory\") pod \"configure-os-openstack-openstack-cell1-w4qv8\" (UID: \"702bf0e1-c6e1-4718-8520-bf22b1aa913f\") " pod="openstack/configure-os-openstack-openstack-cell1-w4qv8" Oct 07 14:30:33 crc kubenswrapper[4854]: I1007 14:30:33.640724 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/702bf0e1-c6e1-4718-8520-bf22b1aa913f-ssh-key\") pod \"configure-os-openstack-openstack-cell1-w4qv8\" (UID: \"702bf0e1-c6e1-4718-8520-bf22b1aa913f\") " pod="openstack/configure-os-openstack-openstack-cell1-w4qv8" Oct 07 14:30:33 crc kubenswrapper[4854]: I1007 14:30:33.640833 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qj95\" (UniqueName: \"kubernetes.io/projected/702bf0e1-c6e1-4718-8520-bf22b1aa913f-kube-api-access-7qj95\") pod \"configure-os-openstack-openstack-cell1-w4qv8\" (UID: \"702bf0e1-c6e1-4718-8520-bf22b1aa913f\") " pod="openstack/configure-os-openstack-openstack-cell1-w4qv8" Oct 07 14:30:33 crc kubenswrapper[4854]: I1007 14:30:33.640946 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/702bf0e1-c6e1-4718-8520-bf22b1aa913f-ceph\") pod \"configure-os-openstack-openstack-cell1-w4qv8\" (UID: \"702bf0e1-c6e1-4718-8520-bf22b1aa913f\") " pod="openstack/configure-os-openstack-openstack-cell1-w4qv8" Oct 07 14:30:33 crc kubenswrapper[4854]: I1007 14:30:33.646191 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/702bf0e1-c6e1-4718-8520-bf22b1aa913f-ssh-key\") pod \"configure-os-openstack-openstack-cell1-w4qv8\" (UID: \"702bf0e1-c6e1-4718-8520-bf22b1aa913f\") " pod="openstack/configure-os-openstack-openstack-cell1-w4qv8" Oct 07 14:30:33 crc kubenswrapper[4854]: I1007 14:30:33.648537 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/702bf0e1-c6e1-4718-8520-bf22b1aa913f-ceph\") pod \"configure-os-openstack-openstack-cell1-w4qv8\" (UID: \"702bf0e1-c6e1-4718-8520-bf22b1aa913f\") " pod="openstack/configure-os-openstack-openstack-cell1-w4qv8" Oct 07 14:30:33 crc kubenswrapper[4854]: I1007 14:30:33.648917 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/702bf0e1-c6e1-4718-8520-bf22b1aa913f-inventory\") pod \"configure-os-openstack-openstack-cell1-w4qv8\" (UID: \"702bf0e1-c6e1-4718-8520-bf22b1aa913f\") " 
pod="openstack/configure-os-openstack-openstack-cell1-w4qv8" Oct 07 14:30:33 crc kubenswrapper[4854]: I1007 14:30:33.658932 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qj95\" (UniqueName: \"kubernetes.io/projected/702bf0e1-c6e1-4718-8520-bf22b1aa913f-kube-api-access-7qj95\") pod \"configure-os-openstack-openstack-cell1-w4qv8\" (UID: \"702bf0e1-c6e1-4718-8520-bf22b1aa913f\") " pod="openstack/configure-os-openstack-openstack-cell1-w4qv8" Oct 07 14:30:33 crc kubenswrapper[4854]: I1007 14:30:33.781094 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-w4qv8" Oct 07 14:30:34 crc kubenswrapper[4854]: I1007 14:30:34.417059 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-w4qv8"] Oct 07 14:30:35 crc kubenswrapper[4854]: I1007 14:30:35.395121 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-w4qv8" event={"ID":"702bf0e1-c6e1-4718-8520-bf22b1aa913f","Type":"ContainerStarted","Data":"92778925fd128fff180c0b1212796d6e45983e23f1ef1103c789e69b2e8505d1"} Oct 07 14:30:36 crc kubenswrapper[4854]: I1007 14:30:36.414902 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-w4qv8" event={"ID":"702bf0e1-c6e1-4718-8520-bf22b1aa913f","Type":"ContainerStarted","Data":"995603dc9401cc77a34acdb7b258815f32dc560ae6ddc50e9a8843d181357997"} Oct 07 14:30:36 crc kubenswrapper[4854]: I1007 14:30:36.440408 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-openstack-openstack-cell1-w4qv8" podStartSLOduration=2.7111920400000002 podStartE2EDuration="3.440381342s" podCreationTimestamp="2025-10-07 14:30:33 +0000 UTC" firstStartedPulling="2025-10-07 14:30:34.434805584 +0000 UTC m=+7550.422637839" lastFinishedPulling="2025-10-07 14:30:35.163994846 +0000 UTC m=+7551.151827141" observedRunningTime="2025-10-07 14:30:36.437584072 +0000 UTC m=+7552.425416327" watchObservedRunningTime="2025-10-07 14:30:36.440381342 +0000 UTC m=+7552.428213637" Oct 07 14:30:38 crc kubenswrapper[4854]: I1007 14:30:38.702779 4854 scope.go:117] "RemoveContainer" containerID="3678c6fd88eab7cb7d631e93558a19c5557cf9245aa72ce39e1a1bf4e1c6125a" Oct 07 14:30:38 crc kubenswrapper[4854]: E1007 14:30:38.703554 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:30:50 crc kubenswrapper[4854]: I1007 14:30:50.707231 4854 scope.go:117] "RemoveContainer" containerID="3678c6fd88eab7cb7d631e93558a19c5557cf9245aa72ce39e1a1bf4e1c6125a" Oct 07 14:30:50 crc kubenswrapper[4854]: E1007 14:30:50.708101 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:31:01 crc 
kubenswrapper[4854]: I1007 14:31:01.703034 4854 scope.go:117] "RemoveContainer" containerID="3678c6fd88eab7cb7d631e93558a19c5557cf9245aa72ce39e1a1bf4e1c6125a" Oct 07 14:31:01 crc kubenswrapper[4854]: E1007 14:31:01.703927 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:31:15 crc kubenswrapper[4854]: I1007 14:31:15.703676 4854 scope.go:117] "RemoveContainer" containerID="3678c6fd88eab7cb7d631e93558a19c5557cf9245aa72ce39e1a1bf4e1c6125a" Oct 07 14:31:15 crc kubenswrapper[4854]: E1007 14:31:15.704913 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:31:21 crc kubenswrapper[4854]: I1007 14:31:21.961444 4854 generic.go:334] "Generic (PLEG): container finished" podID="702bf0e1-c6e1-4718-8520-bf22b1aa913f" containerID="995603dc9401cc77a34acdb7b258815f32dc560ae6ddc50e9a8843d181357997" exitCode=0 Oct 07 14:31:21 crc kubenswrapper[4854]: I1007 14:31:21.961534 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-w4qv8" event={"ID":"702bf0e1-c6e1-4718-8520-bf22b1aa913f","Type":"ContainerDied","Data":"995603dc9401cc77a34acdb7b258815f32dc560ae6ddc50e9a8843d181357997"} Oct 07 14:31:23 crc kubenswrapper[4854]: I1007 14:31:23.492347 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-w4qv8" Oct 07 14:31:23 crc kubenswrapper[4854]: I1007 14:31:23.615911 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/702bf0e1-c6e1-4718-8520-bf22b1aa913f-ssh-key\") pod \"702bf0e1-c6e1-4718-8520-bf22b1aa913f\" (UID: \"702bf0e1-c6e1-4718-8520-bf22b1aa913f\") " Oct 07 14:31:23 crc kubenswrapper[4854]: I1007 14:31:23.615950 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/702bf0e1-c6e1-4718-8520-bf22b1aa913f-inventory\") pod \"702bf0e1-c6e1-4718-8520-bf22b1aa913f\" (UID: \"702bf0e1-c6e1-4718-8520-bf22b1aa913f\") " Oct 07 14:31:23 crc kubenswrapper[4854]: I1007 14:31:23.615981 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/702bf0e1-c6e1-4718-8520-bf22b1aa913f-ceph\") pod \"702bf0e1-c6e1-4718-8520-bf22b1aa913f\" (UID: \"702bf0e1-c6e1-4718-8520-bf22b1aa913f\") " Oct 07 14:31:23 crc kubenswrapper[4854]: I1007 14:31:23.616058 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qj95\" (UniqueName: \"kubernetes.io/projected/702bf0e1-c6e1-4718-8520-bf22b1aa913f-kube-api-access-7qj95\") pod \"702bf0e1-c6e1-4718-8520-bf22b1aa913f\" (UID: \"702bf0e1-c6e1-4718-8520-bf22b1aa913f\") " Oct 07 14:31:23 crc kubenswrapper[4854]: I1007 14:31:23.622401 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/702bf0e1-c6e1-4718-8520-bf22b1aa913f-kube-api-access-7qj95" (OuterVolumeSpecName: "kube-api-access-7qj95") pod "702bf0e1-c6e1-4718-8520-bf22b1aa913f" (UID: "702bf0e1-c6e1-4718-8520-bf22b1aa913f"). InnerVolumeSpecName "kube-api-access-7qj95". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:31:23 crc kubenswrapper[4854]: I1007 14:31:23.622631 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/702bf0e1-c6e1-4718-8520-bf22b1aa913f-ceph" (OuterVolumeSpecName: "ceph") pod "702bf0e1-c6e1-4718-8520-bf22b1aa913f" (UID: "702bf0e1-c6e1-4718-8520-bf22b1aa913f"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:31:23 crc kubenswrapper[4854]: I1007 14:31:23.649339 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/702bf0e1-c6e1-4718-8520-bf22b1aa913f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "702bf0e1-c6e1-4718-8520-bf22b1aa913f" (UID: "702bf0e1-c6e1-4718-8520-bf22b1aa913f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:31:23 crc kubenswrapper[4854]: I1007 14:31:23.672766 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/702bf0e1-c6e1-4718-8520-bf22b1aa913f-inventory" (OuterVolumeSpecName: "inventory") pod "702bf0e1-c6e1-4718-8520-bf22b1aa913f" (UID: "702bf0e1-c6e1-4718-8520-bf22b1aa913f"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:31:23 crc kubenswrapper[4854]: I1007 14:31:23.717975 4854 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/702bf0e1-c6e1-4718-8520-bf22b1aa913f-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 14:31:23 crc kubenswrapper[4854]: I1007 14:31:23.718011 4854 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/702bf0e1-c6e1-4718-8520-bf22b1aa913f-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 14:31:23 crc kubenswrapper[4854]: I1007 14:31:23.718025 4854 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/702bf0e1-c6e1-4718-8520-bf22b1aa913f-ceph\") on node \"crc\" DevicePath \"\"" Oct 07 14:31:23 crc kubenswrapper[4854]: I1007 14:31:23.718038 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qj95\" (UniqueName: \"kubernetes.io/projected/702bf0e1-c6e1-4718-8520-bf22b1aa913f-kube-api-access-7qj95\") on node \"crc\" DevicePath \"\"" Oct 07 14:31:23 crc kubenswrapper[4854]: I1007 14:31:23.983790 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-w4qv8" event={"ID":"702bf0e1-c6e1-4718-8520-bf22b1aa913f","Type":"ContainerDied","Data":"92778925fd128fff180c0b1212796d6e45983e23f1ef1103c789e69b2e8505d1"} Oct 07 14:31:23 crc kubenswrapper[4854]: I1007 14:31:23.983837 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92778925fd128fff180c0b1212796d6e45983e23f1ef1103c789e69b2e8505d1" Oct 07 14:31:23 crc kubenswrapper[4854]: I1007 14:31:23.983897 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-w4qv8" Oct 07 14:31:24 crc kubenswrapper[4854]: I1007 14:31:24.111610 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-openstack-d6j9r"] Oct 07 14:31:24 crc kubenswrapper[4854]: E1007 14:31:24.112202 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="702bf0e1-c6e1-4718-8520-bf22b1aa913f" containerName="configure-os-openstack-openstack-cell1" Oct 07 14:31:24 crc kubenswrapper[4854]: I1007 14:31:24.112287 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="702bf0e1-c6e1-4718-8520-bf22b1aa913f" containerName="configure-os-openstack-openstack-cell1" Oct 07 14:31:24 crc kubenswrapper[4854]: I1007 14:31:24.112576 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="702bf0e1-c6e1-4718-8520-bf22b1aa913f" containerName="configure-os-openstack-openstack-cell1" Oct 07 14:31:24 crc kubenswrapper[4854]: I1007 14:31:24.113412 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-d6j9r" Oct 07 14:31:24 crc kubenswrapper[4854]: I1007 14:31:24.115686 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 14:31:24 crc kubenswrapper[4854]: I1007 14:31:24.115940 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 07 14:31:24 crc kubenswrapper[4854]: I1007 14:31:24.116223 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-n7cf5" Oct 07 14:31:24 crc kubenswrapper[4854]: I1007 14:31:24.116797 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 07 14:31:24 crc kubenswrapper[4854]: I1007 14:31:24.128054 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-d6j9r"] Oct 07 14:31:24 crc kubenswrapper[4854]: I1007 14:31:24.227314 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/9955b3eb-74c8-43b6-bc6f-8b6a8021a3b4-inventory-0\") pod \"ssh-known-hosts-openstack-d6j9r\" (UID: \"9955b3eb-74c8-43b6-bc6f-8b6a8021a3b4\") " pod="openstack/ssh-known-hosts-openstack-d6j9r" Oct 07 14:31:24 crc kubenswrapper[4854]: I1007 14:31:24.227383 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/9955b3eb-74c8-43b6-bc6f-8b6a8021a3b4-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-d6j9r\" (UID: \"9955b3eb-74c8-43b6-bc6f-8b6a8021a3b4\") " pod="openstack/ssh-known-hosts-openstack-d6j9r" Oct 07 14:31:24 crc kubenswrapper[4854]: I1007 14:31:24.227780 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9955b3eb-74c8-43b6-bc6f-8b6a8021a3b4-ceph\") pod \"ssh-known-hosts-openstack-d6j9r\" (UID: \"9955b3eb-74c8-43b6-bc6f-8b6a8021a3b4\") " pod="openstack/ssh-known-hosts-openstack-d6j9r" Oct 07 14:31:24 crc kubenswrapper[4854]: I1007 14:31:24.227956 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j42j8\" (UniqueName: \"kubernetes.io/projected/9955b3eb-74c8-43b6-bc6f-8b6a8021a3b4-kube-api-access-j42j8\") pod \"ssh-known-hosts-openstack-d6j9r\" (UID: \"9955b3eb-74c8-43b6-bc6f-8b6a8021a3b4\") " pod="openstack/ssh-known-hosts-openstack-d6j9r" Oct 07 14:31:24 crc kubenswrapper[4854]: I1007 14:31:24.329762 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9955b3eb-74c8-43b6-bc6f-8b6a8021a3b4-ceph\") pod \"ssh-known-hosts-openstack-d6j9r\" (UID: \"9955b3eb-74c8-43b6-bc6f-8b6a8021a3b4\") " pod="openstack/ssh-known-hosts-openstack-d6j9r" Oct 07 14:31:24 crc kubenswrapper[4854]: I1007 14:31:24.330306 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j42j8\" (UniqueName: \"kubernetes.io/projected/9955b3eb-74c8-43b6-bc6f-8b6a8021a3b4-kube-api-access-j42j8\") pod \"ssh-known-hosts-openstack-d6j9r\" (UID: \"9955b3eb-74c8-43b6-bc6f-8b6a8021a3b4\") " pod="openstack/ssh-known-hosts-openstack-d6j9r" Oct 07 14:31:24 crc kubenswrapper[4854]: I1007 14:31:24.330437 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: 
\"kubernetes.io/secret/9955b3eb-74c8-43b6-bc6f-8b6a8021a3b4-inventory-0\") pod \"ssh-known-hosts-openstack-d6j9r\" (UID: \"9955b3eb-74c8-43b6-bc6f-8b6a8021a3b4\") " pod="openstack/ssh-known-hosts-openstack-d6j9r" Oct 07 14:31:24 crc kubenswrapper[4854]: I1007 14:31:24.330549 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/9955b3eb-74c8-43b6-bc6f-8b6a8021a3b4-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-d6j9r\" (UID: \"9955b3eb-74c8-43b6-bc6f-8b6a8021a3b4\") " pod="openstack/ssh-known-hosts-openstack-d6j9r" Oct 07 14:31:24 crc kubenswrapper[4854]: I1007 14:31:24.334472 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9955b3eb-74c8-43b6-bc6f-8b6a8021a3b4-ceph\") pod \"ssh-known-hosts-openstack-d6j9r\" (UID: \"9955b3eb-74c8-43b6-bc6f-8b6a8021a3b4\") " pod="openstack/ssh-known-hosts-openstack-d6j9r" Oct 07 14:31:24 crc kubenswrapper[4854]: I1007 14:31:24.335496 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/9955b3eb-74c8-43b6-bc6f-8b6a8021a3b4-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-d6j9r\" (UID: \"9955b3eb-74c8-43b6-bc6f-8b6a8021a3b4\") " pod="openstack/ssh-known-hosts-openstack-d6j9r" Oct 07 14:31:24 crc kubenswrapper[4854]: I1007 14:31:24.342991 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/9955b3eb-74c8-43b6-bc6f-8b6a8021a3b4-inventory-0\") pod \"ssh-known-hosts-openstack-d6j9r\" (UID: \"9955b3eb-74c8-43b6-bc6f-8b6a8021a3b4\") " pod="openstack/ssh-known-hosts-openstack-d6j9r" Oct 07 14:31:24 crc kubenswrapper[4854]: I1007 14:31:24.345615 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j42j8\" (UniqueName: \"kubernetes.io/projected/9955b3eb-74c8-43b6-bc6f-8b6a8021a3b4-kube-api-access-j42j8\") pod \"ssh-known-hosts-openstack-d6j9r\" (UID: \"9955b3eb-74c8-43b6-bc6f-8b6a8021a3b4\") " pod="openstack/ssh-known-hosts-openstack-d6j9r" Oct 07 14:31:24 crc kubenswrapper[4854]: I1007 14:31:24.431938 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-d6j9r" Oct 07 14:31:24 crc kubenswrapper[4854]: I1007 14:31:24.974342 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-d6j9r"] Oct 07 14:31:24 crc kubenswrapper[4854]: I1007 14:31:24.996466 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-d6j9r" event={"ID":"9955b3eb-74c8-43b6-bc6f-8b6a8021a3b4","Type":"ContainerStarted","Data":"2b56664ef24ee5867ade42b5c77732d69be8831bd89b0807db752dfd3d27dd18"} Oct 07 14:31:26 crc kubenswrapper[4854]: I1007 14:31:26.702913 4854 scope.go:117] "RemoveContainer" containerID="3678c6fd88eab7cb7d631e93558a19c5557cf9245aa72ce39e1a1bf4e1c6125a" Oct 07 14:31:26 crc kubenswrapper[4854]: E1007 14:31:26.703770 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:31:27 crc kubenswrapper[4854]: I1007 14:31:27.026277 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-d6j9r" event={"ID":"9955b3eb-74c8-43b6-bc6f-8b6a8021a3b4","Type":"ContainerStarted","Data":"cd6861be64f29408e8c66119e15b5ae7c52339ea4aa3879acbf89d112f1801bd"} Oct 07 14:31:27 crc kubenswrapper[4854]: I1007 14:31:27.061403 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-openstack-d6j9r" podStartSLOduration=2.300887669 podStartE2EDuration="3.061372233s" podCreationTimestamp="2025-10-07 14:31:24 +0000 UTC" firstStartedPulling="2025-10-07 14:31:24.986635442 +0000 UTC m=+7600.974467717" lastFinishedPulling="2025-10-07 14:31:25.747120026 +0000 UTC m=+7601.734952281" observedRunningTime="2025-10-07 14:31:27.048718258 +0000 UTC m=+7603.036550553" watchObservedRunningTime="2025-10-07 14:31:27.061372233 +0000 UTC m=+7603.049204518" Oct 07 14:31:28 crc kubenswrapper[4854]: I1007 14:31:28.251584 4854 scope.go:117] "RemoveContainer" containerID="2cd74ceabdbdbc8e3f288f3977ad3dc4fdada79d5d0a132133d11cef7f44f15b" Oct 07 14:31:28 crc kubenswrapper[4854]: I1007 14:31:28.288846 4854 scope.go:117] "RemoveContainer" containerID="32e9135aa8292d1ad0e27d7deb1f4d58e3d93de36931e119a66c045e44397605" Oct 07 14:31:28 crc kubenswrapper[4854]: I1007 14:31:28.317956 4854 scope.go:117] "RemoveContainer" containerID="33701bb770e1968d566d31d372f2295d60e0636b194d99923642d46a5f03fc88" Oct 07 14:31:35 crc kubenswrapper[4854]: I1007 14:31:35.112517 4854 generic.go:334] "Generic (PLEG): container finished" podID="9955b3eb-74c8-43b6-bc6f-8b6a8021a3b4" containerID="cd6861be64f29408e8c66119e15b5ae7c52339ea4aa3879acbf89d112f1801bd" exitCode=0 Oct 07 14:31:35 crc kubenswrapper[4854]: I1007 14:31:35.112641 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-d6j9r" event={"ID":"9955b3eb-74c8-43b6-bc6f-8b6a8021a3b4","Type":"ContainerDied","Data":"cd6861be64f29408e8c66119e15b5ae7c52339ea4aa3879acbf89d112f1801bd"} Oct 07 14:31:36 crc kubenswrapper[4854]: I1007 14:31:36.760257 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-d6j9r" Oct 07 14:31:36 crc kubenswrapper[4854]: I1007 14:31:36.855205 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/9955b3eb-74c8-43b6-bc6f-8b6a8021a3b4-ssh-key-openstack-cell1\") pod \"9955b3eb-74c8-43b6-bc6f-8b6a8021a3b4\" (UID: \"9955b3eb-74c8-43b6-bc6f-8b6a8021a3b4\") " Oct 07 14:31:36 crc kubenswrapper[4854]: I1007 14:31:36.855340 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/9955b3eb-74c8-43b6-bc6f-8b6a8021a3b4-inventory-0\") pod \"9955b3eb-74c8-43b6-bc6f-8b6a8021a3b4\" (UID: \"9955b3eb-74c8-43b6-bc6f-8b6a8021a3b4\") " Oct 07 14:31:36 crc kubenswrapper[4854]: I1007 14:31:36.855402 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j42j8\" (UniqueName: \"kubernetes.io/projected/9955b3eb-74c8-43b6-bc6f-8b6a8021a3b4-kube-api-access-j42j8\") pod \"9955b3eb-74c8-43b6-bc6f-8b6a8021a3b4\" (UID: \"9955b3eb-74c8-43b6-bc6f-8b6a8021a3b4\") " Oct 07 14:31:36 crc kubenswrapper[4854]: I1007 14:31:36.855594 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9955b3eb-74c8-43b6-bc6f-8b6a8021a3b4-ceph\") pod \"9955b3eb-74c8-43b6-bc6f-8b6a8021a3b4\" (UID: \"9955b3eb-74c8-43b6-bc6f-8b6a8021a3b4\") " Oct 07 14:31:36 crc kubenswrapper[4854]: I1007 14:31:36.861170 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9955b3eb-74c8-43b6-bc6f-8b6a8021a3b4-kube-api-access-j42j8" (OuterVolumeSpecName: "kube-api-access-j42j8") pod "9955b3eb-74c8-43b6-bc6f-8b6a8021a3b4" (UID: "9955b3eb-74c8-43b6-bc6f-8b6a8021a3b4"). InnerVolumeSpecName "kube-api-access-j42j8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:31:36 crc kubenswrapper[4854]: I1007 14:31:36.871380 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9955b3eb-74c8-43b6-bc6f-8b6a8021a3b4-ceph" (OuterVolumeSpecName: "ceph") pod "9955b3eb-74c8-43b6-bc6f-8b6a8021a3b4" (UID: "9955b3eb-74c8-43b6-bc6f-8b6a8021a3b4"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:31:36 crc kubenswrapper[4854]: I1007 14:31:36.887473 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9955b3eb-74c8-43b6-bc6f-8b6a8021a3b4-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "9955b3eb-74c8-43b6-bc6f-8b6a8021a3b4" (UID: "9955b3eb-74c8-43b6-bc6f-8b6a8021a3b4"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:31:36 crc kubenswrapper[4854]: I1007 14:31:36.891915 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9955b3eb-74c8-43b6-bc6f-8b6a8021a3b4-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "9955b3eb-74c8-43b6-bc6f-8b6a8021a3b4" (UID: "9955b3eb-74c8-43b6-bc6f-8b6a8021a3b4"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:31:36 crc kubenswrapper[4854]: I1007 14:31:36.958000 4854 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/9955b3eb-74c8-43b6-bc6f-8b6a8021a3b4-inventory-0\") on node \"crc\" DevicePath \"\"" Oct 07 14:31:36 crc kubenswrapper[4854]: I1007 14:31:36.958032 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j42j8\" (UniqueName: \"kubernetes.io/projected/9955b3eb-74c8-43b6-bc6f-8b6a8021a3b4-kube-api-access-j42j8\") on node \"crc\" DevicePath \"\"" Oct 07 14:31:36 crc kubenswrapper[4854]: I1007 14:31:36.958041 4854 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9955b3eb-74c8-43b6-bc6f-8b6a8021a3b4-ceph\") on node \"crc\" DevicePath \"\"" Oct 07 14:31:36 crc kubenswrapper[4854]: I1007 14:31:36.958049 4854 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/9955b3eb-74c8-43b6-bc6f-8b6a8021a3b4-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Oct 07 14:31:37 crc kubenswrapper[4854]: I1007 14:31:37.145043 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-d6j9r" event={"ID":"9955b3eb-74c8-43b6-bc6f-8b6a8021a3b4","Type":"ContainerDied","Data":"2b56664ef24ee5867ade42b5c77732d69be8831bd89b0807db752dfd3d27dd18"} Oct 07 14:31:37 crc kubenswrapper[4854]: I1007 14:31:37.145094 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b56664ef24ee5867ade42b5c77732d69be8831bd89b0807db752dfd3d27dd18" Oct 07 14:31:37 crc kubenswrapper[4854]: I1007 14:31:37.145225 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-d6j9r" Oct 07 14:31:37 crc kubenswrapper[4854]: I1007 14:31:37.243392 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-openstack-openstack-cell1-vxvr4"] Oct 07 14:31:37 crc kubenswrapper[4854]: E1007 14:31:37.244268 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9955b3eb-74c8-43b6-bc6f-8b6a8021a3b4" containerName="ssh-known-hosts-openstack" Oct 07 14:31:37 crc kubenswrapper[4854]: I1007 14:31:37.244405 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="9955b3eb-74c8-43b6-bc6f-8b6a8021a3b4" containerName="ssh-known-hosts-openstack" Oct 07 14:31:37 crc kubenswrapper[4854]: I1007 14:31:37.244923 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="9955b3eb-74c8-43b6-bc6f-8b6a8021a3b4" containerName="ssh-known-hosts-openstack" Oct 07 14:31:37 crc kubenswrapper[4854]: I1007 14:31:37.246280 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-vxvr4" Oct 07 14:31:37 crc kubenswrapper[4854]: I1007 14:31:37.251098 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 14:31:37 crc kubenswrapper[4854]: I1007 14:31:37.251412 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 07 14:31:37 crc kubenswrapper[4854]: I1007 14:31:37.251954 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-n7cf5" Oct 07 14:31:37 crc kubenswrapper[4854]: I1007 14:31:37.253225 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 07 14:31:37 crc kubenswrapper[4854]: I1007 14:31:37.263642 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-vxvr4"] Oct 07 14:31:37 crc kubenswrapper[4854]: I1007 14:31:37.368345 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lnjh\" (UniqueName: \"kubernetes.io/projected/7acd6b6d-b64d-48a9-b71e-1ff81fffa78a-kube-api-access-6lnjh\") pod \"run-os-openstack-openstack-cell1-vxvr4\" (UID: \"7acd6b6d-b64d-48a9-b71e-1ff81fffa78a\") " pod="openstack/run-os-openstack-openstack-cell1-vxvr4" Oct 07 14:31:37 crc kubenswrapper[4854]: I1007 14:31:37.368729 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7acd6b6d-b64d-48a9-b71e-1ff81fffa78a-inventory\") pod \"run-os-openstack-openstack-cell1-vxvr4\" (UID: \"7acd6b6d-b64d-48a9-b71e-1ff81fffa78a\") " pod="openstack/run-os-openstack-openstack-cell1-vxvr4" Oct 07 14:31:37 crc kubenswrapper[4854]: I1007 14:31:37.368759 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7acd6b6d-b64d-48a9-b71e-1ff81fffa78a-ceph\") pod \"run-os-openstack-openstack-cell1-vxvr4\" (UID: \"7acd6b6d-b64d-48a9-b71e-1ff81fffa78a\") " pod="openstack/run-os-openstack-openstack-cell1-vxvr4" Oct 07 14:31:37 crc kubenswrapper[4854]: I1007 14:31:37.368928 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7acd6b6d-b64d-48a9-b71e-1ff81fffa78a-ssh-key\") pod \"run-os-openstack-openstack-cell1-vxvr4\" (UID: \"7acd6b6d-b64d-48a9-b71e-1ff81fffa78a\") " pod="openstack/run-os-openstack-openstack-cell1-vxvr4" Oct 07 14:31:37 crc kubenswrapper[4854]: I1007 14:31:37.471627 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lnjh\" (UniqueName: \"kubernetes.io/projected/7acd6b6d-b64d-48a9-b71e-1ff81fffa78a-kube-api-access-6lnjh\") pod \"run-os-openstack-openstack-cell1-vxvr4\" (UID: \"7acd6b6d-b64d-48a9-b71e-1ff81fffa78a\") " pod="openstack/run-os-openstack-openstack-cell1-vxvr4" Oct 07 14:31:37 crc kubenswrapper[4854]: I1007 14:31:37.471706 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7acd6b6d-b64d-48a9-b71e-1ff81fffa78a-inventory\") pod \"run-os-openstack-openstack-cell1-vxvr4\" (UID: \"7acd6b6d-b64d-48a9-b71e-1ff81fffa78a\") " pod="openstack/run-os-openstack-openstack-cell1-vxvr4" Oct 07 14:31:37 crc kubenswrapper[4854]: I1007 14:31:37.471749 4854 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7acd6b6d-b64d-48a9-b71e-1ff81fffa78a-ceph\") pod \"run-os-openstack-openstack-cell1-vxvr4\" (UID: \"7acd6b6d-b64d-48a9-b71e-1ff81fffa78a\") " pod="openstack/run-os-openstack-openstack-cell1-vxvr4" Oct 07 14:31:37 crc kubenswrapper[4854]: I1007 14:31:37.471867 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7acd6b6d-b64d-48a9-b71e-1ff81fffa78a-ssh-key\") pod \"run-os-openstack-openstack-cell1-vxvr4\" (UID: \"7acd6b6d-b64d-48a9-b71e-1ff81fffa78a\") " pod="openstack/run-os-openstack-openstack-cell1-vxvr4" Oct 07 14:31:37 crc kubenswrapper[4854]: I1007 14:31:37.482706 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7acd6b6d-b64d-48a9-b71e-1ff81fffa78a-inventory\") pod \"run-os-openstack-openstack-cell1-vxvr4\" (UID: \"7acd6b6d-b64d-48a9-b71e-1ff81fffa78a\") " pod="openstack/run-os-openstack-openstack-cell1-vxvr4" Oct 07 14:31:37 crc kubenswrapper[4854]: I1007 14:31:37.484163 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7acd6b6d-b64d-48a9-b71e-1ff81fffa78a-ssh-key\") pod \"run-os-openstack-openstack-cell1-vxvr4\" (UID: \"7acd6b6d-b64d-48a9-b71e-1ff81fffa78a\") " pod="openstack/run-os-openstack-openstack-cell1-vxvr4" Oct 07 14:31:37 crc kubenswrapper[4854]: I1007 14:31:37.484690 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7acd6b6d-b64d-48a9-b71e-1ff81fffa78a-ceph\") pod \"run-os-openstack-openstack-cell1-vxvr4\" (UID: \"7acd6b6d-b64d-48a9-b71e-1ff81fffa78a\") " pod="openstack/run-os-openstack-openstack-cell1-vxvr4" Oct 07 14:31:37 crc kubenswrapper[4854]: I1007 14:31:37.503291 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lnjh\" (UniqueName: \"kubernetes.io/projected/7acd6b6d-b64d-48a9-b71e-1ff81fffa78a-kube-api-access-6lnjh\") pod \"run-os-openstack-openstack-cell1-vxvr4\" (UID: \"7acd6b6d-b64d-48a9-b71e-1ff81fffa78a\") " pod="openstack/run-os-openstack-openstack-cell1-vxvr4" Oct 07 14:31:37 crc kubenswrapper[4854]: I1007 14:31:37.572291 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-vxvr4" Oct 07 14:31:37 crc kubenswrapper[4854]: I1007 14:31:37.706902 4854 scope.go:117] "RemoveContainer" containerID="3678c6fd88eab7cb7d631e93558a19c5557cf9245aa72ce39e1a1bf4e1c6125a" Oct 07 14:31:37 crc kubenswrapper[4854]: E1007 14:31:37.707425 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:31:38 crc kubenswrapper[4854]: W1007 14:31:38.189084 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7acd6b6d_b64d_48a9_b71e_1ff81fffa78a.slice/crio-540ea0c07de01d28ae8672f3897932eb31235b5b548ce535aaf9c74c01421154 WatchSource:0}: Error finding container 540ea0c07de01d28ae8672f3897932eb31235b5b548ce535aaf9c74c01421154: Status 404 returned error can't find the container with id 540ea0c07de01d28ae8672f3897932eb31235b5b548ce535aaf9c74c01421154 Oct 07 14:31:38 crc kubenswrapper[4854]: I1007 14:31:38.191884 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-vxvr4"] Oct 07 14:31:39 crc kubenswrapper[4854]: I1007 14:31:39.178859 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-vxvr4" event={"ID":"7acd6b6d-b64d-48a9-b71e-1ff81fffa78a","Type":"ContainerStarted","Data":"540ea0c07de01d28ae8672f3897932eb31235b5b548ce535aaf9c74c01421154"} Oct 07 14:31:40 crc kubenswrapper[4854]: I1007 14:31:40.194108 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-vxvr4" event={"ID":"7acd6b6d-b64d-48a9-b71e-1ff81fffa78a","Type":"ContainerStarted","Data":"7da2c7e0e55c6beed8c689b269c313e087e8846047ae1fc411ad39bc00ea7841"} Oct 07 14:31:40 crc kubenswrapper[4854]: I1007 14:31:40.221023 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-openstack-openstack-cell1-vxvr4" podStartSLOduration=2.338874315 podStartE2EDuration="3.221002556s" podCreationTimestamp="2025-10-07 14:31:37 +0000 UTC" firstStartedPulling="2025-10-07 14:31:38.192357782 +0000 UTC m=+7614.180190067" lastFinishedPulling="2025-10-07 14:31:39.074486033 +0000 UTC m=+7615.062318308" observedRunningTime="2025-10-07 14:31:40.217231747 +0000 UTC m=+7616.205064002" watchObservedRunningTime="2025-10-07 14:31:40.221002556 +0000 UTC m=+7616.208834821" Oct 07 14:31:48 crc kubenswrapper[4854]: I1007 14:31:48.311471 4854 generic.go:334] "Generic (PLEG): container finished" podID="7acd6b6d-b64d-48a9-b71e-1ff81fffa78a" containerID="7da2c7e0e55c6beed8c689b269c313e087e8846047ae1fc411ad39bc00ea7841" exitCode=0 Oct 07 14:31:48 crc kubenswrapper[4854]: I1007 14:31:48.311579 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-vxvr4" event={"ID":"7acd6b6d-b64d-48a9-b71e-1ff81fffa78a","Type":"ContainerDied","Data":"7da2c7e0e55c6beed8c689b269c313e087e8846047ae1fc411ad39bc00ea7841"} Oct 07 14:31:48 crc kubenswrapper[4854]: I1007 14:31:48.703595 4854 scope.go:117] "RemoveContainer" containerID="3678c6fd88eab7cb7d631e93558a19c5557cf9245aa72ce39e1a1bf4e1c6125a" Oct 07 14:31:48 crc kubenswrapper[4854]: 
E1007 14:31:48.703970 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:31:49 crc kubenswrapper[4854]: I1007 14:31:49.820453 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-vxvr4" Oct 07 14:31:49 crc kubenswrapper[4854]: I1007 14:31:49.980340 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7acd6b6d-b64d-48a9-b71e-1ff81fffa78a-inventory\") pod \"7acd6b6d-b64d-48a9-b71e-1ff81fffa78a\" (UID: \"7acd6b6d-b64d-48a9-b71e-1ff81fffa78a\") " Oct 07 14:31:49 crc kubenswrapper[4854]: I1007 14:31:49.980418 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7acd6b6d-b64d-48a9-b71e-1ff81fffa78a-ceph\") pod \"7acd6b6d-b64d-48a9-b71e-1ff81fffa78a\" (UID: \"7acd6b6d-b64d-48a9-b71e-1ff81fffa78a\") " Oct 07 14:31:49 crc kubenswrapper[4854]: I1007 14:31:49.980533 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lnjh\" (UniqueName: \"kubernetes.io/projected/7acd6b6d-b64d-48a9-b71e-1ff81fffa78a-kube-api-access-6lnjh\") pod \"7acd6b6d-b64d-48a9-b71e-1ff81fffa78a\" (UID: \"7acd6b6d-b64d-48a9-b71e-1ff81fffa78a\") " Oct 07 14:31:49 crc kubenswrapper[4854]: I1007 14:31:49.980600 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7acd6b6d-b64d-48a9-b71e-1ff81fffa78a-ssh-key\") pod \"7acd6b6d-b64d-48a9-b71e-1ff81fffa78a\" (UID: \"7acd6b6d-b64d-48a9-b71e-1ff81fffa78a\") " Oct 07 14:31:49 crc kubenswrapper[4854]: I1007 14:31:49.989911 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7acd6b6d-b64d-48a9-b71e-1ff81fffa78a-ceph" (OuterVolumeSpecName: "ceph") pod "7acd6b6d-b64d-48a9-b71e-1ff81fffa78a" (UID: "7acd6b6d-b64d-48a9-b71e-1ff81fffa78a"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:31:49 crc kubenswrapper[4854]: I1007 14:31:49.990222 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7acd6b6d-b64d-48a9-b71e-1ff81fffa78a-kube-api-access-6lnjh" (OuterVolumeSpecName: "kube-api-access-6lnjh") pod "7acd6b6d-b64d-48a9-b71e-1ff81fffa78a" (UID: "7acd6b6d-b64d-48a9-b71e-1ff81fffa78a"). InnerVolumeSpecName "kube-api-access-6lnjh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:31:50 crc kubenswrapper[4854]: I1007 14:31:50.009789 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7acd6b6d-b64d-48a9-b71e-1ff81fffa78a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7acd6b6d-b64d-48a9-b71e-1ff81fffa78a" (UID: "7acd6b6d-b64d-48a9-b71e-1ff81fffa78a"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:31:50 crc kubenswrapper[4854]: I1007 14:31:50.010581 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7acd6b6d-b64d-48a9-b71e-1ff81fffa78a-inventory" (OuterVolumeSpecName: "inventory") pod "7acd6b6d-b64d-48a9-b71e-1ff81fffa78a" (UID: "7acd6b6d-b64d-48a9-b71e-1ff81fffa78a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:31:50 crc kubenswrapper[4854]: I1007 14:31:50.085134 4854 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7acd6b6d-b64d-48a9-b71e-1ff81fffa78a-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 14:31:50 crc kubenswrapper[4854]: I1007 14:31:50.085178 4854 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7acd6b6d-b64d-48a9-b71e-1ff81fffa78a-ceph\") on node \"crc\" DevicePath \"\"" Oct 07 14:31:50 crc kubenswrapper[4854]: I1007 14:31:50.085188 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lnjh\" (UniqueName: \"kubernetes.io/projected/7acd6b6d-b64d-48a9-b71e-1ff81fffa78a-kube-api-access-6lnjh\") on node \"crc\" DevicePath \"\"" Oct 07 14:31:50 crc kubenswrapper[4854]: I1007 14:31:50.085197 4854 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7acd6b6d-b64d-48a9-b71e-1ff81fffa78a-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 14:31:50 crc kubenswrapper[4854]: I1007 14:31:50.335536 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-vxvr4" event={"ID":"7acd6b6d-b64d-48a9-b71e-1ff81fffa78a","Type":"ContainerDied","Data":"540ea0c07de01d28ae8672f3897932eb31235b5b548ce535aaf9c74c01421154"} Oct 07 14:31:50 crc kubenswrapper[4854]: I1007 14:31:50.335878 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="540ea0c07de01d28ae8672f3897932eb31235b5b548ce535aaf9c74c01421154" Oct 07 14:31:50 crc kubenswrapper[4854]: I1007 14:31:50.335579 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-vxvr4" Oct 07 14:31:50 crc kubenswrapper[4854]: I1007 14:31:50.436962 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-2mkt5"] Oct 07 14:31:50 crc kubenswrapper[4854]: E1007 14:31:50.437530 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7acd6b6d-b64d-48a9-b71e-1ff81fffa78a" containerName="run-os-openstack-openstack-cell1" Oct 07 14:31:50 crc kubenswrapper[4854]: I1007 14:31:50.437554 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="7acd6b6d-b64d-48a9-b71e-1ff81fffa78a" containerName="run-os-openstack-openstack-cell1" Oct 07 14:31:50 crc kubenswrapper[4854]: I1007 14:31:50.437871 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="7acd6b6d-b64d-48a9-b71e-1ff81fffa78a" containerName="run-os-openstack-openstack-cell1" Oct 07 14:31:50 crc kubenswrapper[4854]: I1007 14:31:50.438835 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-2mkt5" Oct 07 14:31:50 crc kubenswrapper[4854]: I1007 14:31:50.441859 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 14:31:50 crc kubenswrapper[4854]: I1007 14:31:50.443399 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 07 14:31:50 crc kubenswrapper[4854]: I1007 14:31:50.446367 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 07 14:31:50 crc kubenswrapper[4854]: I1007 14:31:50.447619 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-2mkt5"] Oct 07 14:31:50 crc kubenswrapper[4854]: I1007 14:31:50.448266 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-n7cf5" Oct 07 14:31:50 crc kubenswrapper[4854]: I1007 14:31:50.596587 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c94ab764-14e7-4100-8179-c00c088a611d-ceph\") pod \"reboot-os-openstack-openstack-cell1-2mkt5\" (UID: \"c94ab764-14e7-4100-8179-c00c088a611d\") " pod="openstack/reboot-os-openstack-openstack-cell1-2mkt5" Oct 07 14:31:50 crc kubenswrapper[4854]: I1007 14:31:50.596636 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6b95w\" (UniqueName: \"kubernetes.io/projected/c94ab764-14e7-4100-8179-c00c088a611d-kube-api-access-6b95w\") pod \"reboot-os-openstack-openstack-cell1-2mkt5\" (UID: \"c94ab764-14e7-4100-8179-c00c088a611d\") " pod="openstack/reboot-os-openstack-openstack-cell1-2mkt5" Oct 07 14:31:50 crc kubenswrapper[4854]: I1007 14:31:50.596814 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c94ab764-14e7-4100-8179-c00c088a611d-ssh-key\") pod \"reboot-os-openstack-openstack-cell1-2mkt5\" (UID: \"c94ab764-14e7-4100-8179-c00c088a611d\") " pod="openstack/reboot-os-openstack-openstack-cell1-2mkt5" Oct 07 14:31:50 crc kubenswrapper[4854]: I1007 14:31:50.596900 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c94ab764-14e7-4100-8179-c00c088a611d-inventory\") pod \"reboot-os-openstack-openstack-cell1-2mkt5\" (UID: \"c94ab764-14e7-4100-8179-c00c088a611d\") " pod="openstack/reboot-os-openstack-openstack-cell1-2mkt5" Oct 07 14:31:50 crc kubenswrapper[4854]: I1007 14:31:50.700983 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c94ab764-14e7-4100-8179-c00c088a611d-ceph\") pod \"reboot-os-openstack-openstack-cell1-2mkt5\" (UID: \"c94ab764-14e7-4100-8179-c00c088a611d\") " pod="openstack/reboot-os-openstack-openstack-cell1-2mkt5" Oct 07 14:31:50 crc kubenswrapper[4854]: I1007 14:31:50.701075 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6b95w\" (UniqueName: \"kubernetes.io/projected/c94ab764-14e7-4100-8179-c00c088a611d-kube-api-access-6b95w\") pod \"reboot-os-openstack-openstack-cell1-2mkt5\" (UID: \"c94ab764-14e7-4100-8179-c00c088a611d\") " pod="openstack/reboot-os-openstack-openstack-cell1-2mkt5" Oct 07 14:31:50 crc kubenswrapper[4854]: I1007 14:31:50.701454 4854 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c94ab764-14e7-4100-8179-c00c088a611d-ssh-key\") pod \"reboot-os-openstack-openstack-cell1-2mkt5\" (UID: \"c94ab764-14e7-4100-8179-c00c088a611d\") " pod="openstack/reboot-os-openstack-openstack-cell1-2mkt5" Oct 07 14:31:50 crc kubenswrapper[4854]: I1007 14:31:50.701649 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c94ab764-14e7-4100-8179-c00c088a611d-inventory\") pod \"reboot-os-openstack-openstack-cell1-2mkt5\" (UID: \"c94ab764-14e7-4100-8179-c00c088a611d\") " pod="openstack/reboot-os-openstack-openstack-cell1-2mkt5" Oct 07 14:31:50 crc kubenswrapper[4854]: I1007 14:31:50.714993 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c94ab764-14e7-4100-8179-c00c088a611d-ssh-key\") pod \"reboot-os-openstack-openstack-cell1-2mkt5\" (UID: \"c94ab764-14e7-4100-8179-c00c088a611d\") " pod="openstack/reboot-os-openstack-openstack-cell1-2mkt5" Oct 07 14:31:50 crc kubenswrapper[4854]: I1007 14:31:50.716272 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c94ab764-14e7-4100-8179-c00c088a611d-inventory\") pod \"reboot-os-openstack-openstack-cell1-2mkt5\" (UID: \"c94ab764-14e7-4100-8179-c00c088a611d\") " pod="openstack/reboot-os-openstack-openstack-cell1-2mkt5" Oct 07 14:31:50 crc kubenswrapper[4854]: I1007 14:31:50.735480 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c94ab764-14e7-4100-8179-c00c088a611d-ceph\") pod \"reboot-os-openstack-openstack-cell1-2mkt5\" (UID: \"c94ab764-14e7-4100-8179-c00c088a611d\") " pod="openstack/reboot-os-openstack-openstack-cell1-2mkt5" Oct 07 14:31:50 crc kubenswrapper[4854]: I1007 14:31:50.747668 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6b95w\" (UniqueName: \"kubernetes.io/projected/c94ab764-14e7-4100-8179-c00c088a611d-kube-api-access-6b95w\") pod \"reboot-os-openstack-openstack-cell1-2mkt5\" (UID: \"c94ab764-14e7-4100-8179-c00c088a611d\") " pod="openstack/reboot-os-openstack-openstack-cell1-2mkt5" Oct 07 14:31:50 crc kubenswrapper[4854]: I1007 14:31:50.768422 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-2mkt5" Oct 07 14:31:51 crc kubenswrapper[4854]: I1007 14:31:51.332349 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-2mkt5"] Oct 07 14:31:51 crc kubenswrapper[4854]: I1007 14:31:51.345075 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-2mkt5" event={"ID":"c94ab764-14e7-4100-8179-c00c088a611d","Type":"ContainerStarted","Data":"df958299b35ca7223087e7bf82aeba93bc78d672f3a8a81b96201d8ea688c4b2"} Oct 07 14:31:53 crc kubenswrapper[4854]: I1007 14:31:53.372785 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-2mkt5" event={"ID":"c94ab764-14e7-4100-8179-c00c088a611d","Type":"ContainerStarted","Data":"e0cad760c2fa464fa0b3967531b308283abd2bf3a93d9c180dbf112d231fa956"} Oct 07 14:31:53 crc kubenswrapper[4854]: I1007 14:31:53.410274 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-openstack-openstack-cell1-2mkt5" podStartSLOduration=2.0800765549999998 podStartE2EDuration="3.410253171s" podCreationTimestamp="2025-10-07 14:31:50 +0000 UTC" firstStartedPulling="2025-10-07 14:31:51.334103169 +0000 UTC m=+7627.321935444" lastFinishedPulling="2025-10-07 14:31:52.664279785 +0000 UTC m=+7628.652112060" observedRunningTime="2025-10-07 14:31:53.38906414 +0000 UTC m=+7629.376896405" watchObservedRunningTime="2025-10-07 14:31:53.410253171 +0000 UTC m=+7629.398085426" Oct 07 14:32:03 crc kubenswrapper[4854]: I1007 14:32:03.703129 4854 scope.go:117] "RemoveContainer" containerID="3678c6fd88eab7cb7d631e93558a19c5557cf9245aa72ce39e1a1bf4e1c6125a" Oct 07 14:32:03 crc kubenswrapper[4854]: E1007 14:32:03.704301 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:32:09 crc kubenswrapper[4854]: I1007 14:32:09.562296 4854 generic.go:334] "Generic (PLEG): container finished" podID="c94ab764-14e7-4100-8179-c00c088a611d" containerID="e0cad760c2fa464fa0b3967531b308283abd2bf3a93d9c180dbf112d231fa956" exitCode=0 Oct 07 14:32:09 crc kubenswrapper[4854]: I1007 14:32:09.562424 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-2mkt5" event={"ID":"c94ab764-14e7-4100-8179-c00c088a611d","Type":"ContainerDied","Data":"e0cad760c2fa464fa0b3967531b308283abd2bf3a93d9c180dbf112d231fa956"} Oct 07 14:32:11 crc kubenswrapper[4854]: I1007 14:32:11.072713 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-2mkt5" Oct 07 14:32:11 crc kubenswrapper[4854]: I1007 14:32:11.238486 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c94ab764-14e7-4100-8179-c00c088a611d-ssh-key\") pod \"c94ab764-14e7-4100-8179-c00c088a611d\" (UID: \"c94ab764-14e7-4100-8179-c00c088a611d\") " Oct 07 14:32:11 crc kubenswrapper[4854]: I1007 14:32:11.238587 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c94ab764-14e7-4100-8179-c00c088a611d-inventory\") pod \"c94ab764-14e7-4100-8179-c00c088a611d\" (UID: \"c94ab764-14e7-4100-8179-c00c088a611d\") " Oct 07 14:32:11 crc kubenswrapper[4854]: I1007 14:32:11.238628 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c94ab764-14e7-4100-8179-c00c088a611d-ceph\") pod \"c94ab764-14e7-4100-8179-c00c088a611d\" (UID: \"c94ab764-14e7-4100-8179-c00c088a611d\") " Oct 07 14:32:11 crc kubenswrapper[4854]: I1007 14:32:11.238759 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6b95w\" (UniqueName: \"kubernetes.io/projected/c94ab764-14e7-4100-8179-c00c088a611d-kube-api-access-6b95w\") pod \"c94ab764-14e7-4100-8179-c00c088a611d\" (UID: \"c94ab764-14e7-4100-8179-c00c088a611d\") " Oct 07 14:32:11 crc kubenswrapper[4854]: I1007 14:32:11.244845 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c94ab764-14e7-4100-8179-c00c088a611d-ceph" (OuterVolumeSpecName: "ceph") pod "c94ab764-14e7-4100-8179-c00c088a611d" (UID: "c94ab764-14e7-4100-8179-c00c088a611d"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:32:11 crc kubenswrapper[4854]: I1007 14:32:11.245510 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c94ab764-14e7-4100-8179-c00c088a611d-kube-api-access-6b95w" (OuterVolumeSpecName: "kube-api-access-6b95w") pod "c94ab764-14e7-4100-8179-c00c088a611d" (UID: "c94ab764-14e7-4100-8179-c00c088a611d"). InnerVolumeSpecName "kube-api-access-6b95w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:32:11 crc kubenswrapper[4854]: I1007 14:32:11.279344 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c94ab764-14e7-4100-8179-c00c088a611d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c94ab764-14e7-4100-8179-c00c088a611d" (UID: "c94ab764-14e7-4100-8179-c00c088a611d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:32:11 crc kubenswrapper[4854]: I1007 14:32:11.287916 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c94ab764-14e7-4100-8179-c00c088a611d-inventory" (OuterVolumeSpecName: "inventory") pod "c94ab764-14e7-4100-8179-c00c088a611d" (UID: "c94ab764-14e7-4100-8179-c00c088a611d"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:32:11 crc kubenswrapper[4854]: I1007 14:32:11.343302 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6b95w\" (UniqueName: \"kubernetes.io/projected/c94ab764-14e7-4100-8179-c00c088a611d-kube-api-access-6b95w\") on node \"crc\" DevicePath \"\"" Oct 07 14:32:11 crc kubenswrapper[4854]: I1007 14:32:11.343385 4854 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c94ab764-14e7-4100-8179-c00c088a611d-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 14:32:11 crc kubenswrapper[4854]: I1007 14:32:11.343410 4854 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c94ab764-14e7-4100-8179-c00c088a611d-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 14:32:11 crc kubenswrapper[4854]: I1007 14:32:11.343440 4854 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c94ab764-14e7-4100-8179-c00c088a611d-ceph\") on node \"crc\" DevicePath \"\"" Oct 07 14:32:11 crc kubenswrapper[4854]: I1007 14:32:11.593875 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-2mkt5" event={"ID":"c94ab764-14e7-4100-8179-c00c088a611d","Type":"ContainerDied","Data":"df958299b35ca7223087e7bf82aeba93bc78d672f3a8a81b96201d8ea688c4b2"} Oct 07 14:32:11 crc kubenswrapper[4854]: I1007 14:32:11.594355 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df958299b35ca7223087e7bf82aeba93bc78d672f3a8a81b96201d8ea688c4b2" Oct 07 14:32:11 crc kubenswrapper[4854]: I1007 14:32:11.596112 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-2mkt5" Oct 07 14:32:11 crc kubenswrapper[4854]: I1007 14:32:11.704499 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-kspw4"] Oct 07 14:32:11 crc kubenswrapper[4854]: E1007 14:32:11.705074 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c94ab764-14e7-4100-8179-c00c088a611d" containerName="reboot-os-openstack-openstack-cell1" Oct 07 14:32:11 crc kubenswrapper[4854]: I1007 14:32:11.705095 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="c94ab764-14e7-4100-8179-c00c088a611d" containerName="reboot-os-openstack-openstack-cell1" Oct 07 14:32:11 crc kubenswrapper[4854]: I1007 14:32:11.705442 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="c94ab764-14e7-4100-8179-c00c088a611d" containerName="reboot-os-openstack-openstack-cell1" Oct 07 14:32:11 crc kubenswrapper[4854]: I1007 14:32:11.706440 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-kspw4" Oct 07 14:32:11 crc kubenswrapper[4854]: I1007 14:32:11.710236 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 14:32:11 crc kubenswrapper[4854]: I1007 14:32:11.710665 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-n7cf5" Oct 07 14:32:11 crc kubenswrapper[4854]: I1007 14:32:11.711298 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 07 14:32:11 crc kubenswrapper[4854]: I1007 14:32:11.711529 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 07 14:32:11 crc kubenswrapper[4854]: I1007 14:32:11.716111 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-kspw4"] Oct 07 14:32:11 crc kubenswrapper[4854]: I1007 14:32:11.854995 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-kspw4\" (UID: \"0549aa20-8f5f-4a3a-8c24-767fb2c69c65\") " pod="openstack/install-certs-openstack-openstack-cell1-kspw4" Oct 07 14:32:11 crc kubenswrapper[4854]: I1007 14:32:11.855108 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-inventory\") pod \"install-certs-openstack-openstack-cell1-kspw4\" (UID: \"0549aa20-8f5f-4a3a-8c24-767fb2c69c65\") " pod="openstack/install-certs-openstack-openstack-cell1-kspw4" Oct 07 14:32:11 crc kubenswrapper[4854]: I1007 14:32:11.855173 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-ssh-key\") pod \"install-certs-openstack-openstack-cell1-kspw4\" (UID: \"0549aa20-8f5f-4a3a-8c24-767fb2c69c65\") " pod="openstack/install-certs-openstack-openstack-cell1-kspw4" Oct 07 14:32:11 crc kubenswrapper[4854]: I1007 14:32:11.855239 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-kspw4\" (UID: \"0549aa20-8f5f-4a3a-8c24-767fb2c69c65\") " pod="openstack/install-certs-openstack-openstack-cell1-kspw4" Oct 07 14:32:11 crc kubenswrapper[4854]: I1007 14:32:11.855283 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-kspw4\" (UID: \"0549aa20-8f5f-4a3a-8c24-767fb2c69c65\") " pod="openstack/install-certs-openstack-openstack-cell1-kspw4" Oct 07 14:32:11 crc kubenswrapper[4854]: I1007 14:32:11.855329 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-bootstrap-combined-ca-bundle\") pod 
\"install-certs-openstack-openstack-cell1-kspw4\" (UID: \"0549aa20-8f5f-4a3a-8c24-767fb2c69c65\") " pod="openstack/install-certs-openstack-openstack-cell1-kspw4" Oct 07 14:32:11 crc kubenswrapper[4854]: I1007 14:32:11.855576 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-ceph\") pod \"install-certs-openstack-openstack-cell1-kspw4\" (UID: \"0549aa20-8f5f-4a3a-8c24-767fb2c69c65\") " pod="openstack/install-certs-openstack-openstack-cell1-kspw4" Oct 07 14:32:11 crc kubenswrapper[4854]: I1007 14:32:11.855801 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-kspw4\" (UID: \"0549aa20-8f5f-4a3a-8c24-767fb2c69c65\") " pod="openstack/install-certs-openstack-openstack-cell1-kspw4" Oct 07 14:32:11 crc kubenswrapper[4854]: I1007 14:32:11.855955 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-kspw4\" (UID: \"0549aa20-8f5f-4a3a-8c24-767fb2c69c65\") " pod="openstack/install-certs-openstack-openstack-cell1-kspw4" Oct 07 14:32:11 crc kubenswrapper[4854]: I1007 14:32:11.856224 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-kspw4\" (UID: \"0549aa20-8f5f-4a3a-8c24-767fb2c69c65\") " pod="openstack/install-certs-openstack-openstack-cell1-kspw4" Oct 07 14:32:11 crc kubenswrapper[4854]: I1007 14:32:11.856428 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vn746\" (UniqueName: \"kubernetes.io/projected/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-kube-api-access-vn746\") pod \"install-certs-openstack-openstack-cell1-kspw4\" (UID: \"0549aa20-8f5f-4a3a-8c24-767fb2c69c65\") " pod="openstack/install-certs-openstack-openstack-cell1-kspw4" Oct 07 14:32:11 crc kubenswrapper[4854]: I1007 14:32:11.856610 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-kspw4\" (UID: \"0549aa20-8f5f-4a3a-8c24-767fb2c69c65\") " pod="openstack/install-certs-openstack-openstack-cell1-kspw4" Oct 07 14:32:11 crc kubenswrapper[4854]: I1007 14:32:11.959238 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-kspw4\" (UID: \"0549aa20-8f5f-4a3a-8c24-767fb2c69c65\") " pod="openstack/install-certs-openstack-openstack-cell1-kspw4" Oct 07 14:32:11 crc kubenswrapper[4854]: I1007 14:32:11.959319 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-kspw4\" (UID: \"0549aa20-8f5f-4a3a-8c24-767fb2c69c65\") " pod="openstack/install-certs-openstack-openstack-cell1-kspw4" Oct 07 14:32:11 crc kubenswrapper[4854]: I1007 14:32:11.959391 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-kspw4\" (UID: \"0549aa20-8f5f-4a3a-8c24-767fb2c69c65\") " pod="openstack/install-certs-openstack-openstack-cell1-kspw4" Oct 07 14:32:11 crc kubenswrapper[4854]: I1007 14:32:11.959454 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vn746\" (UniqueName: \"kubernetes.io/projected/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-kube-api-access-vn746\") pod \"install-certs-openstack-openstack-cell1-kspw4\" (UID: \"0549aa20-8f5f-4a3a-8c24-767fb2c69c65\") " pod="openstack/install-certs-openstack-openstack-cell1-kspw4" Oct 07 14:32:11 crc kubenswrapper[4854]: I1007 14:32:11.964283 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-kspw4\" (UID: \"0549aa20-8f5f-4a3a-8c24-767fb2c69c65\") " pod="openstack/install-certs-openstack-openstack-cell1-kspw4" Oct 07 14:32:11 crc kubenswrapper[4854]: I1007 14:32:11.964511 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-kspw4\" (UID: \"0549aa20-8f5f-4a3a-8c24-767fb2c69c65\") " pod="openstack/install-certs-openstack-openstack-cell1-kspw4" Oct 07 14:32:11 crc kubenswrapper[4854]: I1007 14:32:11.964833 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-inventory\") pod \"install-certs-openstack-openstack-cell1-kspw4\" (UID: \"0549aa20-8f5f-4a3a-8c24-767fb2c69c65\") " pod="openstack/install-certs-openstack-openstack-cell1-kspw4" Oct 07 14:32:11 crc kubenswrapper[4854]: I1007 14:32:11.964939 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-kspw4\" (UID: \"0549aa20-8f5f-4a3a-8c24-767fb2c69c65\") " pod="openstack/install-certs-openstack-openstack-cell1-kspw4" Oct 07 14:32:11 crc kubenswrapper[4854]: I1007 14:32:11.964943 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-ssh-key\") pod \"install-certs-openstack-openstack-cell1-kspw4\" (UID: \"0549aa20-8f5f-4a3a-8c24-767fb2c69c65\") " pod="openstack/install-certs-openstack-openstack-cell1-kspw4" Oct 07 14:32:11 crc kubenswrapper[4854]: I1007 14:32:11.965638 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-libvirt-combined-ca-bundle\") pod 
\"install-certs-openstack-openstack-cell1-kspw4\" (UID: \"0549aa20-8f5f-4a3a-8c24-767fb2c69c65\") " pod="openstack/install-certs-openstack-openstack-cell1-kspw4" Oct 07 14:32:11 crc kubenswrapper[4854]: I1007 14:32:11.965765 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-kspw4\" (UID: \"0549aa20-8f5f-4a3a-8c24-767fb2c69c65\") " pod="openstack/install-certs-openstack-openstack-cell1-kspw4" Oct 07 14:32:11 crc kubenswrapper[4854]: I1007 14:32:11.965768 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-kspw4\" (UID: \"0549aa20-8f5f-4a3a-8c24-767fb2c69c65\") " pod="openstack/install-certs-openstack-openstack-cell1-kspw4" Oct 07 14:32:11 crc kubenswrapper[4854]: I1007 14:32:11.965872 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-kspw4\" (UID: \"0549aa20-8f5f-4a3a-8c24-767fb2c69c65\") " pod="openstack/install-certs-openstack-openstack-cell1-kspw4" Oct 07 14:32:11 crc kubenswrapper[4854]: I1007 14:32:11.965952 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-kspw4\" (UID: \"0549aa20-8f5f-4a3a-8c24-767fb2c69c65\") " pod="openstack/install-certs-openstack-openstack-cell1-kspw4" Oct 07 14:32:11 crc kubenswrapper[4854]: I1007 14:32:11.966106 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-ceph\") pod \"install-certs-openstack-openstack-cell1-kspw4\" (UID: \"0549aa20-8f5f-4a3a-8c24-767fb2c69c65\") " pod="openstack/install-certs-openstack-openstack-cell1-kspw4" Oct 07 14:32:11 crc kubenswrapper[4854]: I1007 14:32:11.968142 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-kspw4\" (UID: \"0549aa20-8f5f-4a3a-8c24-767fb2c69c65\") " pod="openstack/install-certs-openstack-openstack-cell1-kspw4" Oct 07 14:32:11 crc kubenswrapper[4854]: I1007 14:32:11.969044 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-inventory\") pod \"install-certs-openstack-openstack-cell1-kspw4\" (UID: \"0549aa20-8f5f-4a3a-8c24-767fb2c69c65\") " pod="openstack/install-certs-openstack-openstack-cell1-kspw4" Oct 07 14:32:11 crc kubenswrapper[4854]: I1007 14:32:11.969125 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-kspw4\" (UID: 
\"0549aa20-8f5f-4a3a-8c24-767fb2c69c65\") " pod="openstack/install-certs-openstack-openstack-cell1-kspw4" Oct 07 14:32:11 crc kubenswrapper[4854]: I1007 14:32:11.970523 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-ssh-key\") pod \"install-certs-openstack-openstack-cell1-kspw4\" (UID: \"0549aa20-8f5f-4a3a-8c24-767fb2c69c65\") " pod="openstack/install-certs-openstack-openstack-cell1-kspw4" Oct 07 14:32:11 crc kubenswrapper[4854]: I1007 14:32:11.970857 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-ceph\") pod \"install-certs-openstack-openstack-cell1-kspw4\" (UID: \"0549aa20-8f5f-4a3a-8c24-767fb2c69c65\") " pod="openstack/install-certs-openstack-openstack-cell1-kspw4" Oct 07 14:32:11 crc kubenswrapper[4854]: I1007 14:32:11.971751 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-kspw4\" (UID: \"0549aa20-8f5f-4a3a-8c24-767fb2c69c65\") " pod="openstack/install-certs-openstack-openstack-cell1-kspw4" Oct 07 14:32:11 crc kubenswrapper[4854]: I1007 14:32:11.972874 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-kspw4\" (UID: \"0549aa20-8f5f-4a3a-8c24-767fb2c69c65\") " pod="openstack/install-certs-openstack-openstack-cell1-kspw4" Oct 07 14:32:11 crc kubenswrapper[4854]: I1007 14:32:11.979518 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-kspw4\" (UID: \"0549aa20-8f5f-4a3a-8c24-767fb2c69c65\") " pod="openstack/install-certs-openstack-openstack-cell1-kspw4" Oct 07 14:32:11 crc kubenswrapper[4854]: I1007 14:32:11.985186 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vn746\" (UniqueName: \"kubernetes.io/projected/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-kube-api-access-vn746\") pod \"install-certs-openstack-openstack-cell1-kspw4\" (UID: \"0549aa20-8f5f-4a3a-8c24-767fb2c69c65\") " pod="openstack/install-certs-openstack-openstack-cell1-kspw4" Oct 07 14:32:12 crc kubenswrapper[4854]: I1007 14:32:12.045717 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-kspw4" Oct 07 14:32:12 crc kubenswrapper[4854]: I1007 14:32:12.632200 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-kspw4"] Oct 07 14:32:12 crc kubenswrapper[4854]: W1007 14:32:12.633113 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0549aa20_8f5f_4a3a_8c24_767fb2c69c65.slice/crio-5b0ec92810565d658844a7c4f763b4d826d0c484a7245588123d1dc925dbad5e WatchSource:0}: Error finding container 5b0ec92810565d658844a7c4f763b4d826d0c484a7245588123d1dc925dbad5e: Status 404 returned error can't find the container with id 5b0ec92810565d658844a7c4f763b4d826d0c484a7245588123d1dc925dbad5e Oct 07 14:32:13 crc kubenswrapper[4854]: I1007 14:32:13.618919 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-kspw4" event={"ID":"0549aa20-8f5f-4a3a-8c24-767fb2c69c65","Type":"ContainerStarted","Data":"40ec9f37c4c6fe23c9ff6f84f14aba09f6afd93f7bdfe03cde321a65cce53173"} Oct 07 14:32:13 crc kubenswrapper[4854]: I1007 14:32:13.619277 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-kspw4" event={"ID":"0549aa20-8f5f-4a3a-8c24-767fb2c69c65","Type":"ContainerStarted","Data":"5b0ec92810565d658844a7c4f763b4d826d0c484a7245588123d1dc925dbad5e"} Oct 07 14:32:13 crc kubenswrapper[4854]: I1007 14:32:13.661557 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-openstack-openstack-cell1-kspw4" podStartSLOduration=2.160634032 podStartE2EDuration="2.661525491s" podCreationTimestamp="2025-10-07 14:32:11 +0000 UTC" firstStartedPulling="2025-10-07 14:32:12.636039769 +0000 UTC m=+7648.623872014" lastFinishedPulling="2025-10-07 14:32:13.136931188 +0000 UTC m=+7649.124763473" observedRunningTime="2025-10-07 14:32:13.641456332 +0000 UTC m=+7649.629288607" watchObservedRunningTime="2025-10-07 14:32:13.661525491 +0000 UTC m=+7649.649357786" Oct 07 14:32:17 crc kubenswrapper[4854]: I1007 14:32:17.703530 4854 scope.go:117] "RemoveContainer" containerID="3678c6fd88eab7cb7d631e93558a19c5557cf9245aa72ce39e1a1bf4e1c6125a" Oct 07 14:32:17 crc kubenswrapper[4854]: E1007 14:32:17.704587 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:32:30 crc kubenswrapper[4854]: I1007 14:32:30.702993 4854 scope.go:117] "RemoveContainer" containerID="3678c6fd88eab7cb7d631e93558a19c5557cf9245aa72ce39e1a1bf4e1c6125a" Oct 07 14:32:30 crc kubenswrapper[4854]: E1007 14:32:30.703619 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:32:32 crc kubenswrapper[4854]: I1007 14:32:32.834676 4854 generic.go:334] "Generic 
(PLEG): container finished" podID="0549aa20-8f5f-4a3a-8c24-767fb2c69c65" containerID="40ec9f37c4c6fe23c9ff6f84f14aba09f6afd93f7bdfe03cde321a65cce53173" exitCode=0 Oct 07 14:32:32 crc kubenswrapper[4854]: I1007 14:32:32.835754 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-kspw4" event={"ID":"0549aa20-8f5f-4a3a-8c24-767fb2c69c65","Type":"ContainerDied","Data":"40ec9f37c4c6fe23c9ff6f84f14aba09f6afd93f7bdfe03cde321a65cce53173"} Oct 07 14:32:34 crc kubenswrapper[4854]: I1007 14:32:34.436013 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-kspw4" Oct 07 14:32:34 crc kubenswrapper[4854]: I1007 14:32:34.487596 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-ceph\") pod \"0549aa20-8f5f-4a3a-8c24-767fb2c69c65\" (UID: \"0549aa20-8f5f-4a3a-8c24-767fb2c69c65\") " Oct 07 14:32:34 crc kubenswrapper[4854]: I1007 14:32:34.487667 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-libvirt-combined-ca-bundle\") pod \"0549aa20-8f5f-4a3a-8c24-767fb2c69c65\" (UID: \"0549aa20-8f5f-4a3a-8c24-767fb2c69c65\") " Oct 07 14:32:34 crc kubenswrapper[4854]: I1007 14:32:34.487704 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vn746\" (UniqueName: \"kubernetes.io/projected/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-kube-api-access-vn746\") pod \"0549aa20-8f5f-4a3a-8c24-767fb2c69c65\" (UID: \"0549aa20-8f5f-4a3a-8c24-767fb2c69c65\") " Oct 07 14:32:34 crc kubenswrapper[4854]: I1007 14:32:34.487733 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-ssh-key\") pod \"0549aa20-8f5f-4a3a-8c24-767fb2c69c65\" (UID: \"0549aa20-8f5f-4a3a-8c24-767fb2c69c65\") " Oct 07 14:32:34 crc kubenswrapper[4854]: I1007 14:32:34.487769 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-telemetry-combined-ca-bundle\") pod \"0549aa20-8f5f-4a3a-8c24-767fb2c69c65\" (UID: \"0549aa20-8f5f-4a3a-8c24-767fb2c69c65\") " Oct 07 14:32:34 crc kubenswrapper[4854]: I1007 14:32:34.487795 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-bootstrap-combined-ca-bundle\") pod \"0549aa20-8f5f-4a3a-8c24-767fb2c69c65\" (UID: \"0549aa20-8f5f-4a3a-8c24-767fb2c69c65\") " Oct 07 14:32:34 crc kubenswrapper[4854]: I1007 14:32:34.487822 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-neutron-metadata-combined-ca-bundle\") pod \"0549aa20-8f5f-4a3a-8c24-767fb2c69c65\" (UID: \"0549aa20-8f5f-4a3a-8c24-767fb2c69c65\") " Oct 07 14:32:34 crc kubenswrapper[4854]: I1007 14:32:34.487873 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-neutron-sriov-combined-ca-bundle\") 
pod \"0549aa20-8f5f-4a3a-8c24-767fb2c69c65\" (UID: \"0549aa20-8f5f-4a3a-8c24-767fb2c69c65\") " Oct 07 14:32:34 crc kubenswrapper[4854]: I1007 14:32:34.487904 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-ovn-combined-ca-bundle\") pod \"0549aa20-8f5f-4a3a-8c24-767fb2c69c65\" (UID: \"0549aa20-8f5f-4a3a-8c24-767fb2c69c65\") " Oct 07 14:32:34 crc kubenswrapper[4854]: I1007 14:32:34.487985 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-nova-combined-ca-bundle\") pod \"0549aa20-8f5f-4a3a-8c24-767fb2c69c65\" (UID: \"0549aa20-8f5f-4a3a-8c24-767fb2c69c65\") " Oct 07 14:32:34 crc kubenswrapper[4854]: I1007 14:32:34.488075 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-neutron-dhcp-combined-ca-bundle\") pod \"0549aa20-8f5f-4a3a-8c24-767fb2c69c65\" (UID: \"0549aa20-8f5f-4a3a-8c24-767fb2c69c65\") " Oct 07 14:32:34 crc kubenswrapper[4854]: I1007 14:32:34.488107 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-inventory\") pod \"0549aa20-8f5f-4a3a-8c24-767fb2c69c65\" (UID: \"0549aa20-8f5f-4a3a-8c24-767fb2c69c65\") " Oct 07 14:32:34 crc kubenswrapper[4854]: I1007 14:32:34.497268 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "0549aa20-8f5f-4a3a-8c24-767fb2c69c65" (UID: "0549aa20-8f5f-4a3a-8c24-767fb2c69c65"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:32:34 crc kubenswrapper[4854]: I1007 14:32:34.497414 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-kube-api-access-vn746" (OuterVolumeSpecName: "kube-api-access-vn746") pod "0549aa20-8f5f-4a3a-8c24-767fb2c69c65" (UID: "0549aa20-8f5f-4a3a-8c24-767fb2c69c65"). InnerVolumeSpecName "kube-api-access-vn746". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:32:34 crc kubenswrapper[4854]: I1007 14:32:34.497581 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "0549aa20-8f5f-4a3a-8c24-767fb2c69c65" (UID: "0549aa20-8f5f-4a3a-8c24-767fb2c69c65"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:32:34 crc kubenswrapper[4854]: I1007 14:32:34.497715 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "0549aa20-8f5f-4a3a-8c24-767fb2c69c65" (UID: "0549aa20-8f5f-4a3a-8c24-767fb2c69c65"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:32:34 crc kubenswrapper[4854]: I1007 14:32:34.497953 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "0549aa20-8f5f-4a3a-8c24-767fb2c69c65" (UID: "0549aa20-8f5f-4a3a-8c24-767fb2c69c65"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:32:34 crc kubenswrapper[4854]: I1007 14:32:34.498182 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "0549aa20-8f5f-4a3a-8c24-767fb2c69c65" (UID: "0549aa20-8f5f-4a3a-8c24-767fb2c69c65"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:32:34 crc kubenswrapper[4854]: I1007 14:32:34.499581 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "0549aa20-8f5f-4a3a-8c24-767fb2c69c65" (UID: "0549aa20-8f5f-4a3a-8c24-767fb2c69c65"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:32:34 crc kubenswrapper[4854]: I1007 14:32:34.500432 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "0549aa20-8f5f-4a3a-8c24-767fb2c69c65" (UID: "0549aa20-8f5f-4a3a-8c24-767fb2c69c65"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:32:34 crc kubenswrapper[4854]: I1007 14:32:34.513298 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "0549aa20-8f5f-4a3a-8c24-767fb2c69c65" (UID: "0549aa20-8f5f-4a3a-8c24-767fb2c69c65"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:32:34 crc kubenswrapper[4854]: I1007 14:32:34.516792 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-ceph" (OuterVolumeSpecName: "ceph") pod "0549aa20-8f5f-4a3a-8c24-767fb2c69c65" (UID: "0549aa20-8f5f-4a3a-8c24-767fb2c69c65"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:32:34 crc kubenswrapper[4854]: I1007 14:32:34.535727 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-inventory" (OuterVolumeSpecName: "inventory") pod "0549aa20-8f5f-4a3a-8c24-767fb2c69c65" (UID: "0549aa20-8f5f-4a3a-8c24-767fb2c69c65"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:32:34 crc kubenswrapper[4854]: I1007 14:32:34.543162 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0549aa20-8f5f-4a3a-8c24-767fb2c69c65" (UID: "0549aa20-8f5f-4a3a-8c24-767fb2c69c65"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:32:34 crc kubenswrapper[4854]: I1007 14:32:34.590234 4854 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:32:34 crc kubenswrapper[4854]: I1007 14:32:34.590270 4854 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:32:34 crc kubenswrapper[4854]: I1007 14:32:34.590284 4854 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 14:32:34 crc kubenswrapper[4854]: I1007 14:32:34.590299 4854 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-ceph\") on node \"crc\" DevicePath \"\"" Oct 07 14:32:34 crc kubenswrapper[4854]: I1007 14:32:34.590311 4854 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:32:34 crc kubenswrapper[4854]: I1007 14:32:34.590325 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vn746\" (UniqueName: \"kubernetes.io/projected/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-kube-api-access-vn746\") on node \"crc\" DevicePath \"\"" Oct 07 14:32:34 crc kubenswrapper[4854]: I1007 14:32:34.590336 4854 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 14:32:34 crc kubenswrapper[4854]: I1007 14:32:34.590348 4854 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:32:34 crc kubenswrapper[4854]: I1007 14:32:34.590363 4854 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:32:34 crc kubenswrapper[4854]: I1007 14:32:34.590374 4854 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:32:34 crc kubenswrapper[4854]: I1007 14:32:34.590387 4854 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-neutron-sriov-combined-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Oct 07 14:32:34 crc kubenswrapper[4854]: I1007 14:32:34.590402 4854 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0549aa20-8f5f-4a3a-8c24-767fb2c69c65-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:32:34 crc kubenswrapper[4854]: I1007 14:32:34.859293 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-kspw4" event={"ID":"0549aa20-8f5f-4a3a-8c24-767fb2c69c65","Type":"ContainerDied","Data":"5b0ec92810565d658844a7c4f763b4d826d0c484a7245588123d1dc925dbad5e"} Oct 07 14:32:34 crc kubenswrapper[4854]: I1007 14:32:34.859501 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-kspw4" Oct 07 14:32:34 crc kubenswrapper[4854]: I1007 14:32:34.859510 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b0ec92810565d658844a7c4f763b4d826d0c484a7245588123d1dc925dbad5e" Oct 07 14:32:34 crc kubenswrapper[4854]: I1007 14:32:34.965553 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-ql58q"] Oct 07 14:32:34 crc kubenswrapper[4854]: E1007 14:32:34.966073 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0549aa20-8f5f-4a3a-8c24-767fb2c69c65" containerName="install-certs-openstack-openstack-cell1" Oct 07 14:32:34 crc kubenswrapper[4854]: I1007 14:32:34.966096 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="0549aa20-8f5f-4a3a-8c24-767fb2c69c65" containerName="install-certs-openstack-openstack-cell1" Oct 07 14:32:34 crc kubenswrapper[4854]: I1007 14:32:34.966408 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="0549aa20-8f5f-4a3a-8c24-767fb2c69c65" containerName="install-certs-openstack-openstack-cell1" Oct 07 14:32:34 crc kubenswrapper[4854]: I1007 14:32:34.967388 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-ql58q" Oct 07 14:32:34 crc kubenswrapper[4854]: I1007 14:32:34.970212 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 14:32:34 crc kubenswrapper[4854]: I1007 14:32:34.970352 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 07 14:32:34 crc kubenswrapper[4854]: I1007 14:32:34.970851 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-n7cf5" Oct 07 14:32:34 crc kubenswrapper[4854]: I1007 14:32:34.971103 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 07 14:32:34 crc kubenswrapper[4854]: I1007 14:32:34.998638 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-ql58q"] Oct 07 14:32:34 crc kubenswrapper[4854]: I1007 14:32:34.998657 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/85d924b2-2173-4e12-b922-28d8b0a2ef2e-inventory\") pod \"ceph-client-openstack-openstack-cell1-ql58q\" (UID: \"85d924b2-2173-4e12-b922-28d8b0a2ef2e\") " pod="openstack/ceph-client-openstack-openstack-cell1-ql58q" Oct 07 14:32:34 crc kubenswrapper[4854]: I1007 14:32:34.998789 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lccfr\" (UniqueName: \"kubernetes.io/projected/85d924b2-2173-4e12-b922-28d8b0a2ef2e-kube-api-access-lccfr\") pod \"ceph-client-openstack-openstack-cell1-ql58q\" (UID: \"85d924b2-2173-4e12-b922-28d8b0a2ef2e\") " pod="openstack/ceph-client-openstack-openstack-cell1-ql58q" Oct 07 14:32:34 crc kubenswrapper[4854]: I1007 14:32:34.998857 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/85d924b2-2173-4e12-b922-28d8b0a2ef2e-ssh-key\") pod \"ceph-client-openstack-openstack-cell1-ql58q\" (UID: \"85d924b2-2173-4e12-b922-28d8b0a2ef2e\") " pod="openstack/ceph-client-openstack-openstack-cell1-ql58q" Oct 07 14:32:34 crc kubenswrapper[4854]: I1007 14:32:34.998958 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/85d924b2-2173-4e12-b922-28d8b0a2ef2e-ceph\") pod \"ceph-client-openstack-openstack-cell1-ql58q\" (UID: \"85d924b2-2173-4e12-b922-28d8b0a2ef2e\") " pod="openstack/ceph-client-openstack-openstack-cell1-ql58q" Oct 07 14:32:35 crc kubenswrapper[4854]: I1007 14:32:35.101702 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lccfr\" (UniqueName: \"kubernetes.io/projected/85d924b2-2173-4e12-b922-28d8b0a2ef2e-kube-api-access-lccfr\") pod \"ceph-client-openstack-openstack-cell1-ql58q\" (UID: \"85d924b2-2173-4e12-b922-28d8b0a2ef2e\") " pod="openstack/ceph-client-openstack-openstack-cell1-ql58q" Oct 07 14:32:35 crc kubenswrapper[4854]: I1007 14:32:35.101780 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/85d924b2-2173-4e12-b922-28d8b0a2ef2e-ssh-key\") pod \"ceph-client-openstack-openstack-cell1-ql58q\" (UID: \"85d924b2-2173-4e12-b922-28d8b0a2ef2e\") " pod="openstack/ceph-client-openstack-openstack-cell1-ql58q" Oct 07 14:32:35 crc kubenswrapper[4854]: I1007 
14:32:35.101861 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/85d924b2-2173-4e12-b922-28d8b0a2ef2e-ceph\") pod \"ceph-client-openstack-openstack-cell1-ql58q\" (UID: \"85d924b2-2173-4e12-b922-28d8b0a2ef2e\") " pod="openstack/ceph-client-openstack-openstack-cell1-ql58q" Oct 07 14:32:35 crc kubenswrapper[4854]: I1007 14:32:35.101987 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/85d924b2-2173-4e12-b922-28d8b0a2ef2e-inventory\") pod \"ceph-client-openstack-openstack-cell1-ql58q\" (UID: \"85d924b2-2173-4e12-b922-28d8b0a2ef2e\") " pod="openstack/ceph-client-openstack-openstack-cell1-ql58q" Oct 07 14:32:35 crc kubenswrapper[4854]: I1007 14:32:35.107297 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/85d924b2-2173-4e12-b922-28d8b0a2ef2e-ceph\") pod \"ceph-client-openstack-openstack-cell1-ql58q\" (UID: \"85d924b2-2173-4e12-b922-28d8b0a2ef2e\") " pod="openstack/ceph-client-openstack-openstack-cell1-ql58q" Oct 07 14:32:35 crc kubenswrapper[4854]: I1007 14:32:35.107999 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/85d924b2-2173-4e12-b922-28d8b0a2ef2e-ssh-key\") pod \"ceph-client-openstack-openstack-cell1-ql58q\" (UID: \"85d924b2-2173-4e12-b922-28d8b0a2ef2e\") " pod="openstack/ceph-client-openstack-openstack-cell1-ql58q" Oct 07 14:32:35 crc kubenswrapper[4854]: I1007 14:32:35.120282 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/85d924b2-2173-4e12-b922-28d8b0a2ef2e-inventory\") pod \"ceph-client-openstack-openstack-cell1-ql58q\" (UID: \"85d924b2-2173-4e12-b922-28d8b0a2ef2e\") " pod="openstack/ceph-client-openstack-openstack-cell1-ql58q" Oct 07 14:32:35 crc kubenswrapper[4854]: I1007 14:32:35.122553 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lccfr\" (UniqueName: \"kubernetes.io/projected/85d924b2-2173-4e12-b922-28d8b0a2ef2e-kube-api-access-lccfr\") pod \"ceph-client-openstack-openstack-cell1-ql58q\" (UID: \"85d924b2-2173-4e12-b922-28d8b0a2ef2e\") " pod="openstack/ceph-client-openstack-openstack-cell1-ql58q" Oct 07 14:32:35 crc kubenswrapper[4854]: I1007 14:32:35.286132 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-ql58q" Oct 07 14:32:35 crc kubenswrapper[4854]: I1007 14:32:35.916619 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-ql58q"] Oct 07 14:32:35 crc kubenswrapper[4854]: W1007 14:32:35.924339 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85d924b2_2173_4e12_b922_28d8b0a2ef2e.slice/crio-38d01145c7f818e41c391c866a61a8e9103a2b1f670bccecf2af57f561d5fae0 WatchSource:0}: Error finding container 38d01145c7f818e41c391c866a61a8e9103a2b1f670bccecf2af57f561d5fae0: Status 404 returned error can't find the container with id 38d01145c7f818e41c391c866a61a8e9103a2b1f670bccecf2af57f561d5fae0 Oct 07 14:32:36 crc kubenswrapper[4854]: I1007 14:32:36.877582 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-ql58q" event={"ID":"85d924b2-2173-4e12-b922-28d8b0a2ef2e","Type":"ContainerStarted","Data":"778ed3e7a6f2f103e2d0291911d8a9fb5c31d92cbc083f073e606d2f06513409"} Oct 07 14:32:36 crc kubenswrapper[4854]: I1007 14:32:36.878246 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-ql58q" event={"ID":"85d924b2-2173-4e12-b922-28d8b0a2ef2e","Type":"ContainerStarted","Data":"38d01145c7f818e41c391c866a61a8e9103a2b1f670bccecf2af57f561d5fae0"} Oct 07 14:32:36 crc kubenswrapper[4854]: I1007 14:32:36.896186 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-client-openstack-openstack-cell1-ql58q" podStartSLOduration=2.294752891 podStartE2EDuration="2.896115977s" podCreationTimestamp="2025-10-07 14:32:34 +0000 UTC" firstStartedPulling="2025-10-07 14:32:35.927336599 +0000 UTC m=+7671.915168854" lastFinishedPulling="2025-10-07 14:32:36.528699655 +0000 UTC m=+7672.516531940" observedRunningTime="2025-10-07 14:32:36.892570235 +0000 UTC m=+7672.880402550" watchObservedRunningTime="2025-10-07 14:32:36.896115977 +0000 UTC m=+7672.883948232" Oct 07 14:32:41 crc kubenswrapper[4854]: I1007 14:32:41.932319 4854 generic.go:334] "Generic (PLEG): container finished" podID="85d924b2-2173-4e12-b922-28d8b0a2ef2e" containerID="778ed3e7a6f2f103e2d0291911d8a9fb5c31d92cbc083f073e606d2f06513409" exitCode=0 Oct 07 14:32:41 crc kubenswrapper[4854]: I1007 14:32:41.932405 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-ql58q" event={"ID":"85d924b2-2173-4e12-b922-28d8b0a2ef2e","Type":"ContainerDied","Data":"778ed3e7a6f2f103e2d0291911d8a9fb5c31d92cbc083f073e606d2f06513409"} Oct 07 14:32:43 crc kubenswrapper[4854]: I1007 14:32:43.422808 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-ql58q" Oct 07 14:32:43 crc kubenswrapper[4854]: I1007 14:32:43.521001 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lccfr\" (UniqueName: \"kubernetes.io/projected/85d924b2-2173-4e12-b922-28d8b0a2ef2e-kube-api-access-lccfr\") pod \"85d924b2-2173-4e12-b922-28d8b0a2ef2e\" (UID: \"85d924b2-2173-4e12-b922-28d8b0a2ef2e\") " Oct 07 14:32:43 crc kubenswrapper[4854]: I1007 14:32:43.521132 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/85d924b2-2173-4e12-b922-28d8b0a2ef2e-ceph\") pod \"85d924b2-2173-4e12-b922-28d8b0a2ef2e\" (UID: \"85d924b2-2173-4e12-b922-28d8b0a2ef2e\") " Oct 07 14:32:43 crc kubenswrapper[4854]: I1007 14:32:43.521192 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/85d924b2-2173-4e12-b922-28d8b0a2ef2e-inventory\") pod \"85d924b2-2173-4e12-b922-28d8b0a2ef2e\" (UID: \"85d924b2-2173-4e12-b922-28d8b0a2ef2e\") " Oct 07 14:32:43 crc kubenswrapper[4854]: I1007 14:32:43.521227 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/85d924b2-2173-4e12-b922-28d8b0a2ef2e-ssh-key\") pod \"85d924b2-2173-4e12-b922-28d8b0a2ef2e\" (UID: \"85d924b2-2173-4e12-b922-28d8b0a2ef2e\") " Oct 07 14:32:43 crc kubenswrapper[4854]: I1007 14:32:43.527820 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85d924b2-2173-4e12-b922-28d8b0a2ef2e-kube-api-access-lccfr" (OuterVolumeSpecName: "kube-api-access-lccfr") pod "85d924b2-2173-4e12-b922-28d8b0a2ef2e" (UID: "85d924b2-2173-4e12-b922-28d8b0a2ef2e"). InnerVolumeSpecName "kube-api-access-lccfr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:32:43 crc kubenswrapper[4854]: I1007 14:32:43.528705 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85d924b2-2173-4e12-b922-28d8b0a2ef2e-ceph" (OuterVolumeSpecName: "ceph") pod "85d924b2-2173-4e12-b922-28d8b0a2ef2e" (UID: "85d924b2-2173-4e12-b922-28d8b0a2ef2e"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:32:43 crc kubenswrapper[4854]: I1007 14:32:43.561607 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85d924b2-2173-4e12-b922-28d8b0a2ef2e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "85d924b2-2173-4e12-b922-28d8b0a2ef2e" (UID: "85d924b2-2173-4e12-b922-28d8b0a2ef2e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:32:43 crc kubenswrapper[4854]: I1007 14:32:43.572079 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85d924b2-2173-4e12-b922-28d8b0a2ef2e-inventory" (OuterVolumeSpecName: "inventory") pod "85d924b2-2173-4e12-b922-28d8b0a2ef2e" (UID: "85d924b2-2173-4e12-b922-28d8b0a2ef2e"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:32:43 crc kubenswrapper[4854]: I1007 14:32:43.623666 4854 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/85d924b2-2173-4e12-b922-28d8b0a2ef2e-ceph\") on node \"crc\" DevicePath \"\"" Oct 07 14:32:43 crc kubenswrapper[4854]: I1007 14:32:43.623786 4854 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/85d924b2-2173-4e12-b922-28d8b0a2ef2e-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 14:32:43 crc kubenswrapper[4854]: I1007 14:32:43.623843 4854 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/85d924b2-2173-4e12-b922-28d8b0a2ef2e-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 14:32:43 crc kubenswrapper[4854]: I1007 14:32:43.623916 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lccfr\" (UniqueName: \"kubernetes.io/projected/85d924b2-2173-4e12-b922-28d8b0a2ef2e-kube-api-access-lccfr\") on node \"crc\" DevicePath \"\"" Oct 07 14:32:43 crc kubenswrapper[4854]: I1007 14:32:43.959757 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-ql58q" event={"ID":"85d924b2-2173-4e12-b922-28d8b0a2ef2e","Type":"ContainerDied","Data":"38d01145c7f818e41c391c866a61a8e9103a2b1f670bccecf2af57f561d5fae0"} Oct 07 14:32:43 crc kubenswrapper[4854]: I1007 14:32:43.960095 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38d01145c7f818e41c391c866a61a8e9103a2b1f670bccecf2af57f561d5fae0" Oct 07 14:32:43 crc kubenswrapper[4854]: I1007 14:32:43.959826 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-ql58q" Oct 07 14:32:44 crc kubenswrapper[4854]: I1007 14:32:44.072531 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-openstack-openstack-cell1-255n2"] Oct 07 14:32:44 crc kubenswrapper[4854]: E1007 14:32:44.073173 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85d924b2-2173-4e12-b922-28d8b0a2ef2e" containerName="ceph-client-openstack-openstack-cell1" Oct 07 14:32:44 crc kubenswrapper[4854]: I1007 14:32:44.073203 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="85d924b2-2173-4e12-b922-28d8b0a2ef2e" containerName="ceph-client-openstack-openstack-cell1" Oct 07 14:32:44 crc kubenswrapper[4854]: I1007 14:32:44.073593 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="85d924b2-2173-4e12-b922-28d8b0a2ef2e" containerName="ceph-client-openstack-openstack-cell1" Oct 07 14:32:44 crc kubenswrapper[4854]: I1007 14:32:44.074823 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-255n2" Oct 07 14:32:44 crc kubenswrapper[4854]: I1007 14:32:44.077165 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 07 14:32:44 crc kubenswrapper[4854]: I1007 14:32:44.077249 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-n7cf5" Oct 07 14:32:44 crc kubenswrapper[4854]: I1007 14:32:44.078922 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Oct 07 14:32:44 crc kubenswrapper[4854]: I1007 14:32:44.078918 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 14:32:44 crc kubenswrapper[4854]: I1007 14:32:44.079100 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 07 14:32:44 crc kubenswrapper[4854]: I1007 14:32:44.089183 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-255n2"] Oct 07 14:32:44 crc kubenswrapper[4854]: I1007 14:32:44.237704 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/be6c7b9e-28b6-490c-8e3d-c919b557df3c-ceph\") pod \"ovn-openstack-openstack-cell1-255n2\" (UID: \"be6c7b9e-28b6-490c-8e3d-c919b557df3c\") " pod="openstack/ovn-openstack-openstack-cell1-255n2" Oct 07 14:32:44 crc kubenswrapper[4854]: I1007 14:32:44.237766 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/be6c7b9e-28b6-490c-8e3d-c919b557df3c-inventory\") pod \"ovn-openstack-openstack-cell1-255n2\" (UID: \"be6c7b9e-28b6-490c-8e3d-c919b557df3c\") " pod="openstack/ovn-openstack-openstack-cell1-255n2" Oct 07 14:32:44 crc kubenswrapper[4854]: I1007 14:32:44.238406 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/be6c7b9e-28b6-490c-8e3d-c919b557df3c-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-255n2\" (UID: \"be6c7b9e-28b6-490c-8e3d-c919b557df3c\") " pod="openstack/ovn-openstack-openstack-cell1-255n2" Oct 07 14:32:44 crc kubenswrapper[4854]: I1007 14:32:44.238454 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-br9pv\" (UniqueName: \"kubernetes.io/projected/be6c7b9e-28b6-490c-8e3d-c919b557df3c-kube-api-access-br9pv\") pod \"ovn-openstack-openstack-cell1-255n2\" (UID: \"be6c7b9e-28b6-490c-8e3d-c919b557df3c\") " pod="openstack/ovn-openstack-openstack-cell1-255n2" Oct 07 14:32:44 crc kubenswrapper[4854]: I1007 14:32:44.238498 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be6c7b9e-28b6-490c-8e3d-c919b557df3c-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-255n2\" (UID: \"be6c7b9e-28b6-490c-8e3d-c919b557df3c\") " pod="openstack/ovn-openstack-openstack-cell1-255n2" Oct 07 14:32:44 crc kubenswrapper[4854]: I1007 14:32:44.238614 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/be6c7b9e-28b6-490c-8e3d-c919b557df3c-ssh-key\") pod \"ovn-openstack-openstack-cell1-255n2\" (UID: 
\"be6c7b9e-28b6-490c-8e3d-c919b557df3c\") " pod="openstack/ovn-openstack-openstack-cell1-255n2" Oct 07 14:32:44 crc kubenswrapper[4854]: I1007 14:32:44.340432 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/be6c7b9e-28b6-490c-8e3d-c919b557df3c-ceph\") pod \"ovn-openstack-openstack-cell1-255n2\" (UID: \"be6c7b9e-28b6-490c-8e3d-c919b557df3c\") " pod="openstack/ovn-openstack-openstack-cell1-255n2" Oct 07 14:32:44 crc kubenswrapper[4854]: I1007 14:32:44.340494 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/be6c7b9e-28b6-490c-8e3d-c919b557df3c-inventory\") pod \"ovn-openstack-openstack-cell1-255n2\" (UID: \"be6c7b9e-28b6-490c-8e3d-c919b557df3c\") " pod="openstack/ovn-openstack-openstack-cell1-255n2" Oct 07 14:32:44 crc kubenswrapper[4854]: I1007 14:32:44.340558 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/be6c7b9e-28b6-490c-8e3d-c919b557df3c-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-255n2\" (UID: \"be6c7b9e-28b6-490c-8e3d-c919b557df3c\") " pod="openstack/ovn-openstack-openstack-cell1-255n2" Oct 07 14:32:44 crc kubenswrapper[4854]: I1007 14:32:44.340595 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-br9pv\" (UniqueName: \"kubernetes.io/projected/be6c7b9e-28b6-490c-8e3d-c919b557df3c-kube-api-access-br9pv\") pod \"ovn-openstack-openstack-cell1-255n2\" (UID: \"be6c7b9e-28b6-490c-8e3d-c919b557df3c\") " pod="openstack/ovn-openstack-openstack-cell1-255n2" Oct 07 14:32:44 crc kubenswrapper[4854]: I1007 14:32:44.340642 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be6c7b9e-28b6-490c-8e3d-c919b557df3c-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-255n2\" (UID: \"be6c7b9e-28b6-490c-8e3d-c919b557df3c\") " pod="openstack/ovn-openstack-openstack-cell1-255n2" Oct 07 14:32:44 crc kubenswrapper[4854]: I1007 14:32:44.340741 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/be6c7b9e-28b6-490c-8e3d-c919b557df3c-ssh-key\") pod \"ovn-openstack-openstack-cell1-255n2\" (UID: \"be6c7b9e-28b6-490c-8e3d-c919b557df3c\") " pod="openstack/ovn-openstack-openstack-cell1-255n2" Oct 07 14:32:44 crc kubenswrapper[4854]: I1007 14:32:44.341869 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/be6c7b9e-28b6-490c-8e3d-c919b557df3c-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-255n2\" (UID: \"be6c7b9e-28b6-490c-8e3d-c919b557df3c\") " pod="openstack/ovn-openstack-openstack-cell1-255n2" Oct 07 14:32:44 crc kubenswrapper[4854]: I1007 14:32:44.344864 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/be6c7b9e-28b6-490c-8e3d-c919b557df3c-ssh-key\") pod \"ovn-openstack-openstack-cell1-255n2\" (UID: \"be6c7b9e-28b6-490c-8e3d-c919b557df3c\") " pod="openstack/ovn-openstack-openstack-cell1-255n2" Oct 07 14:32:44 crc kubenswrapper[4854]: I1007 14:32:44.346105 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be6c7b9e-28b6-490c-8e3d-c919b557df3c-ovn-combined-ca-bundle\") 
pod \"ovn-openstack-openstack-cell1-255n2\" (UID: \"be6c7b9e-28b6-490c-8e3d-c919b557df3c\") " pod="openstack/ovn-openstack-openstack-cell1-255n2" Oct 07 14:32:44 crc kubenswrapper[4854]: I1007 14:32:44.346503 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/be6c7b9e-28b6-490c-8e3d-c919b557df3c-inventory\") pod \"ovn-openstack-openstack-cell1-255n2\" (UID: \"be6c7b9e-28b6-490c-8e3d-c919b557df3c\") " pod="openstack/ovn-openstack-openstack-cell1-255n2" Oct 07 14:32:44 crc kubenswrapper[4854]: I1007 14:32:44.347135 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/be6c7b9e-28b6-490c-8e3d-c919b557df3c-ceph\") pod \"ovn-openstack-openstack-cell1-255n2\" (UID: \"be6c7b9e-28b6-490c-8e3d-c919b557df3c\") " pod="openstack/ovn-openstack-openstack-cell1-255n2" Oct 07 14:32:44 crc kubenswrapper[4854]: I1007 14:32:44.366303 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-br9pv\" (UniqueName: \"kubernetes.io/projected/be6c7b9e-28b6-490c-8e3d-c919b557df3c-kube-api-access-br9pv\") pod \"ovn-openstack-openstack-cell1-255n2\" (UID: \"be6c7b9e-28b6-490c-8e3d-c919b557df3c\") " pod="openstack/ovn-openstack-openstack-cell1-255n2" Oct 07 14:32:44 crc kubenswrapper[4854]: I1007 14:32:44.433109 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-255n2" Oct 07 14:32:45 crc kubenswrapper[4854]: I1007 14:32:45.069808 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-255n2"] Oct 07 14:32:45 crc kubenswrapper[4854]: I1007 14:32:45.702535 4854 scope.go:117] "RemoveContainer" containerID="3678c6fd88eab7cb7d631e93558a19c5557cf9245aa72ce39e1a1bf4e1c6125a" Oct 07 14:32:45 crc kubenswrapper[4854]: E1007 14:32:45.703010 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:32:45 crc kubenswrapper[4854]: I1007 14:32:45.993821 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-255n2" event={"ID":"be6c7b9e-28b6-490c-8e3d-c919b557df3c","Type":"ContainerStarted","Data":"87deb71a2d857281666e1cf6810f7788fd366f02ca3817eb8f4847b2d8619aca"} Oct 07 14:32:46 crc kubenswrapper[4854]: I1007 14:32:46.677217 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 14:32:48 crc kubenswrapper[4854]: I1007 14:32:48.020308 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-255n2" event={"ID":"be6c7b9e-28b6-490c-8e3d-c919b557df3c","Type":"ContainerStarted","Data":"ecfc9d1ac729e9847e65afa5cefeb3c46594fba8e764de4b7f72253110ded8ab"} Oct 07 14:32:48 crc kubenswrapper[4854]: I1007 14:32:48.049125 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-openstack-openstack-cell1-255n2" podStartSLOduration=2.443171705 podStartE2EDuration="4.049101821s" podCreationTimestamp="2025-10-07 14:32:44 +0000 UTC" firstStartedPulling="2025-10-07 14:32:45.069402311 +0000 UTC m=+7681.057234566" 
lastFinishedPulling="2025-10-07 14:32:46.675332427 +0000 UTC m=+7682.663164682" observedRunningTime="2025-10-07 14:32:48.036719844 +0000 UTC m=+7684.024552119" watchObservedRunningTime="2025-10-07 14:32:48.049101821 +0000 UTC m=+7684.036934086" Oct 07 14:32:58 crc kubenswrapper[4854]: I1007 14:32:58.704208 4854 scope.go:117] "RemoveContainer" containerID="3678c6fd88eab7cb7d631e93558a19c5557cf9245aa72ce39e1a1bf4e1c6125a" Oct 07 14:32:58 crc kubenswrapper[4854]: E1007 14:32:58.706035 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:33:11 crc kubenswrapper[4854]: I1007 14:33:11.702902 4854 scope.go:117] "RemoveContainer" containerID="3678c6fd88eab7cb7d631e93558a19c5557cf9245aa72ce39e1a1bf4e1c6125a" Oct 07 14:33:11 crc kubenswrapper[4854]: E1007 14:33:11.703717 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:33:25 crc kubenswrapper[4854]: I1007 14:33:25.703954 4854 scope.go:117] "RemoveContainer" containerID="3678c6fd88eab7cb7d631e93558a19c5557cf9245aa72ce39e1a1bf4e1c6125a" Oct 07 14:33:25 crc kubenswrapper[4854]: E1007 14:33:25.705212 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:33:39 crc kubenswrapper[4854]: I1007 14:33:39.702790 4854 scope.go:117] "RemoveContainer" containerID="3678c6fd88eab7cb7d631e93558a19c5557cf9245aa72ce39e1a1bf4e1c6125a" Oct 07 14:33:39 crc kubenswrapper[4854]: E1007 14:33:39.703534 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:33:50 crc kubenswrapper[4854]: I1007 14:33:50.703269 4854 scope.go:117] "RemoveContainer" containerID="3678c6fd88eab7cb7d631e93558a19c5557cf9245aa72ce39e1a1bf4e1c6125a" Oct 07 14:33:51 crc kubenswrapper[4854]: I1007 14:33:51.783452 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" event={"ID":"40b8b82d-cfd5-41d7-8673-5774db092c85","Type":"ContainerStarted","Data":"dc81969983bbce8baa8681e21516942005959dc51371ccf54e48acb511d0a582"} Oct 07 14:33:54 crc kubenswrapper[4854]: I1007 14:33:54.813215 4854 generic.go:334] "Generic (PLEG): 
container finished" podID="be6c7b9e-28b6-490c-8e3d-c919b557df3c" containerID="ecfc9d1ac729e9847e65afa5cefeb3c46594fba8e764de4b7f72253110ded8ab" exitCode=0 Oct 07 14:33:54 crc kubenswrapper[4854]: I1007 14:33:54.813323 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-255n2" event={"ID":"be6c7b9e-28b6-490c-8e3d-c919b557df3c","Type":"ContainerDied","Data":"ecfc9d1ac729e9847e65afa5cefeb3c46594fba8e764de4b7f72253110ded8ab"} Oct 07 14:33:56 crc kubenswrapper[4854]: I1007 14:33:56.311052 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-255n2" Oct 07 14:33:56 crc kubenswrapper[4854]: I1007 14:33:56.411363 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/be6c7b9e-28b6-490c-8e3d-c919b557df3c-ovncontroller-config-0\") pod \"be6c7b9e-28b6-490c-8e3d-c919b557df3c\" (UID: \"be6c7b9e-28b6-490c-8e3d-c919b557df3c\") " Oct 07 14:33:56 crc kubenswrapper[4854]: I1007 14:33:56.411708 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/be6c7b9e-28b6-490c-8e3d-c919b557df3c-inventory\") pod \"be6c7b9e-28b6-490c-8e3d-c919b557df3c\" (UID: \"be6c7b9e-28b6-490c-8e3d-c919b557df3c\") " Oct 07 14:33:56 crc kubenswrapper[4854]: I1007 14:33:56.411760 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/be6c7b9e-28b6-490c-8e3d-c919b557df3c-ceph\") pod \"be6c7b9e-28b6-490c-8e3d-c919b557df3c\" (UID: \"be6c7b9e-28b6-490c-8e3d-c919b557df3c\") " Oct 07 14:33:56 crc kubenswrapper[4854]: I1007 14:33:56.411845 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be6c7b9e-28b6-490c-8e3d-c919b557df3c-ovn-combined-ca-bundle\") pod \"be6c7b9e-28b6-490c-8e3d-c919b557df3c\" (UID: \"be6c7b9e-28b6-490c-8e3d-c919b557df3c\") " Oct 07 14:33:56 crc kubenswrapper[4854]: I1007 14:33:56.411917 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-br9pv\" (UniqueName: \"kubernetes.io/projected/be6c7b9e-28b6-490c-8e3d-c919b557df3c-kube-api-access-br9pv\") pod \"be6c7b9e-28b6-490c-8e3d-c919b557df3c\" (UID: \"be6c7b9e-28b6-490c-8e3d-c919b557df3c\") " Oct 07 14:33:56 crc kubenswrapper[4854]: I1007 14:33:56.412059 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/be6c7b9e-28b6-490c-8e3d-c919b557df3c-ssh-key\") pod \"be6c7b9e-28b6-490c-8e3d-c919b557df3c\" (UID: \"be6c7b9e-28b6-490c-8e3d-c919b557df3c\") " Oct 07 14:33:56 crc kubenswrapper[4854]: I1007 14:33:56.417553 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be6c7b9e-28b6-490c-8e3d-c919b557df3c-kube-api-access-br9pv" (OuterVolumeSpecName: "kube-api-access-br9pv") pod "be6c7b9e-28b6-490c-8e3d-c919b557df3c" (UID: "be6c7b9e-28b6-490c-8e3d-c919b557df3c"). InnerVolumeSpecName "kube-api-access-br9pv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:33:56 crc kubenswrapper[4854]: I1007 14:33:56.417747 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be6c7b9e-28b6-490c-8e3d-c919b557df3c-ceph" (OuterVolumeSpecName: "ceph") pod "be6c7b9e-28b6-490c-8e3d-c919b557df3c" (UID: "be6c7b9e-28b6-490c-8e3d-c919b557df3c"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:33:56 crc kubenswrapper[4854]: I1007 14:33:56.449024 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be6c7b9e-28b6-490c-8e3d-c919b557df3c-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "be6c7b9e-28b6-490c-8e3d-c919b557df3c" (UID: "be6c7b9e-28b6-490c-8e3d-c919b557df3c"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:33:56 crc kubenswrapper[4854]: I1007 14:33:56.452363 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be6c7b9e-28b6-490c-8e3d-c919b557df3c-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "be6c7b9e-28b6-490c-8e3d-c919b557df3c" (UID: "be6c7b9e-28b6-490c-8e3d-c919b557df3c"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:33:56 crc kubenswrapper[4854]: I1007 14:33:56.457645 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be6c7b9e-28b6-490c-8e3d-c919b557df3c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "be6c7b9e-28b6-490c-8e3d-c919b557df3c" (UID: "be6c7b9e-28b6-490c-8e3d-c919b557df3c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:33:56 crc kubenswrapper[4854]: I1007 14:33:56.466869 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be6c7b9e-28b6-490c-8e3d-c919b557df3c-inventory" (OuterVolumeSpecName: "inventory") pod "be6c7b9e-28b6-490c-8e3d-c919b557df3c" (UID: "be6c7b9e-28b6-490c-8e3d-c919b557df3c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:33:56 crc kubenswrapper[4854]: I1007 14:33:56.514476 4854 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be6c7b9e-28b6-490c-8e3d-c919b557df3c-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:33:56 crc kubenswrapper[4854]: I1007 14:33:56.514505 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-br9pv\" (UniqueName: \"kubernetes.io/projected/be6c7b9e-28b6-490c-8e3d-c919b557df3c-kube-api-access-br9pv\") on node \"crc\" DevicePath \"\"" Oct 07 14:33:56 crc kubenswrapper[4854]: I1007 14:33:56.514514 4854 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/be6c7b9e-28b6-490c-8e3d-c919b557df3c-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 14:33:56 crc kubenswrapper[4854]: I1007 14:33:56.514522 4854 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/be6c7b9e-28b6-490c-8e3d-c919b557df3c-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Oct 07 14:33:56 crc kubenswrapper[4854]: I1007 14:33:56.514532 4854 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/be6c7b9e-28b6-490c-8e3d-c919b557df3c-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 14:33:56 crc kubenswrapper[4854]: I1007 14:33:56.514540 4854 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/be6c7b9e-28b6-490c-8e3d-c919b557df3c-ceph\") on node \"crc\" DevicePath \"\"" Oct 07 14:33:56 crc kubenswrapper[4854]: I1007 14:33:56.836361 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-255n2" event={"ID":"be6c7b9e-28b6-490c-8e3d-c919b557df3c","Type":"ContainerDied","Data":"87deb71a2d857281666e1cf6810f7788fd366f02ca3817eb8f4847b2d8619aca"} Oct 07 14:33:56 crc kubenswrapper[4854]: I1007 14:33:56.836786 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87deb71a2d857281666e1cf6810f7788fd366f02ca3817eb8f4847b2d8619aca" Oct 07 14:33:56 crc kubenswrapper[4854]: I1007 14:33:56.836425 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-255n2" Oct 07 14:33:56 crc kubenswrapper[4854]: I1007 14:33:56.932927 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-vvd45"] Oct 07 14:33:56 crc kubenswrapper[4854]: E1007 14:33:56.933428 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be6c7b9e-28b6-490c-8e3d-c919b557df3c" containerName="ovn-openstack-openstack-cell1" Oct 07 14:33:56 crc kubenswrapper[4854]: I1007 14:33:56.933443 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="be6c7b9e-28b6-490c-8e3d-c919b557df3c" containerName="ovn-openstack-openstack-cell1" Oct 07 14:33:56 crc kubenswrapper[4854]: I1007 14:33:56.933643 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="be6c7b9e-28b6-490c-8e3d-c919b557df3c" containerName="ovn-openstack-openstack-cell1" Oct 07 14:33:56 crc kubenswrapper[4854]: I1007 14:33:56.934450 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-vvd45" Oct 07 14:33:56 crc kubenswrapper[4854]: I1007 14:33:56.936281 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 07 14:33:56 crc kubenswrapper[4854]: I1007 14:33:56.937864 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Oct 07 14:33:56 crc kubenswrapper[4854]: I1007 14:33:56.938003 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 14:33:56 crc kubenswrapper[4854]: I1007 14:33:56.938001 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 07 14:33:56 crc kubenswrapper[4854]: I1007 14:33:56.938572 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Oct 07 14:33:56 crc kubenswrapper[4854]: I1007 14:33:56.938750 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-n7cf5" Oct 07 14:33:56 crc kubenswrapper[4854]: I1007 14:33:56.967061 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-vvd45"] Oct 07 14:33:57 crc kubenswrapper[4854]: I1007 14:33:57.028528 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/fddef710-1c6d-44cc-8184-613bf1aff29e-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-vvd45\" (UID: \"fddef710-1c6d-44cc-8184-613bf1aff29e\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-vvd45" Oct 07 14:33:57 crc kubenswrapper[4854]: I1007 14:33:57.028576 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4hkt\" (UniqueName: \"kubernetes.io/projected/fddef710-1c6d-44cc-8184-613bf1aff29e-kube-api-access-z4hkt\") pod \"neutron-metadata-openstack-openstack-cell1-vvd45\" (UID: \"fddef710-1c6d-44cc-8184-613bf1aff29e\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-vvd45" Oct 07 14:33:57 crc kubenswrapper[4854]: I1007 14:33:57.028607 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fddef710-1c6d-44cc-8184-613bf1aff29e-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-vvd45\" (UID: \"fddef710-1c6d-44cc-8184-613bf1aff29e\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-vvd45" Oct 07 14:33:57 crc kubenswrapper[4854]: I1007 14:33:57.028703 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fddef710-1c6d-44cc-8184-613bf1aff29e-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-vvd45\" (UID: \"fddef710-1c6d-44cc-8184-613bf1aff29e\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-vvd45" Oct 07 14:33:57 crc kubenswrapper[4854]: I1007 14:33:57.028731 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fddef710-1c6d-44cc-8184-613bf1aff29e-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-vvd45\" (UID: 
\"fddef710-1c6d-44cc-8184-613bf1aff29e\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-vvd45" Oct 07 14:33:57 crc kubenswrapper[4854]: I1007 14:33:57.028763 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/fddef710-1c6d-44cc-8184-613bf1aff29e-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-vvd45\" (UID: \"fddef710-1c6d-44cc-8184-613bf1aff29e\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-vvd45" Oct 07 14:33:57 crc kubenswrapper[4854]: I1007 14:33:57.028816 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fddef710-1c6d-44cc-8184-613bf1aff29e-ssh-key\") pod \"neutron-metadata-openstack-openstack-cell1-vvd45\" (UID: \"fddef710-1c6d-44cc-8184-613bf1aff29e\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-vvd45" Oct 07 14:33:57 crc kubenswrapper[4854]: I1007 14:33:57.130367 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fddef710-1c6d-44cc-8184-613bf1aff29e-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-vvd45\" (UID: \"fddef710-1c6d-44cc-8184-613bf1aff29e\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-vvd45" Oct 07 14:33:57 crc kubenswrapper[4854]: I1007 14:33:57.130432 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fddef710-1c6d-44cc-8184-613bf1aff29e-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-vvd45\" (UID: \"fddef710-1c6d-44cc-8184-613bf1aff29e\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-vvd45" Oct 07 14:33:57 crc kubenswrapper[4854]: I1007 14:33:57.130474 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/fddef710-1c6d-44cc-8184-613bf1aff29e-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-vvd45\" (UID: \"fddef710-1c6d-44cc-8184-613bf1aff29e\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-vvd45" Oct 07 14:33:57 crc kubenswrapper[4854]: I1007 14:33:57.130536 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fddef710-1c6d-44cc-8184-613bf1aff29e-ssh-key\") pod \"neutron-metadata-openstack-openstack-cell1-vvd45\" (UID: \"fddef710-1c6d-44cc-8184-613bf1aff29e\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-vvd45" Oct 07 14:33:57 crc kubenswrapper[4854]: I1007 14:33:57.130627 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/fddef710-1c6d-44cc-8184-613bf1aff29e-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-vvd45\" (UID: \"fddef710-1c6d-44cc-8184-613bf1aff29e\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-vvd45" Oct 07 14:33:57 crc kubenswrapper[4854]: I1007 14:33:57.130648 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4hkt\" (UniqueName: \"kubernetes.io/projected/fddef710-1c6d-44cc-8184-613bf1aff29e-kube-api-access-z4hkt\") pod \"neutron-metadata-openstack-openstack-cell1-vvd45\" (UID: 
\"fddef710-1c6d-44cc-8184-613bf1aff29e\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-vvd45" Oct 07 14:33:57 crc kubenswrapper[4854]: I1007 14:33:57.130679 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fddef710-1c6d-44cc-8184-613bf1aff29e-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-vvd45\" (UID: \"fddef710-1c6d-44cc-8184-613bf1aff29e\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-vvd45" Oct 07 14:33:57 crc kubenswrapper[4854]: I1007 14:33:57.139524 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/fddef710-1c6d-44cc-8184-613bf1aff29e-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-vvd45\" (UID: \"fddef710-1c6d-44cc-8184-613bf1aff29e\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-vvd45" Oct 07 14:33:57 crc kubenswrapper[4854]: I1007 14:33:57.141090 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fddef710-1c6d-44cc-8184-613bf1aff29e-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-vvd45\" (UID: \"fddef710-1c6d-44cc-8184-613bf1aff29e\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-vvd45" Oct 07 14:33:57 crc kubenswrapper[4854]: I1007 14:33:57.141620 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fddef710-1c6d-44cc-8184-613bf1aff29e-ssh-key\") pod \"neutron-metadata-openstack-openstack-cell1-vvd45\" (UID: \"fddef710-1c6d-44cc-8184-613bf1aff29e\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-vvd45" Oct 07 14:33:57 crc kubenswrapper[4854]: I1007 14:33:57.143109 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fddef710-1c6d-44cc-8184-613bf1aff29e-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-vvd45\" (UID: \"fddef710-1c6d-44cc-8184-613bf1aff29e\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-vvd45" Oct 07 14:33:57 crc kubenswrapper[4854]: I1007 14:33:57.146043 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/fddef710-1c6d-44cc-8184-613bf1aff29e-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-vvd45\" (UID: \"fddef710-1c6d-44cc-8184-613bf1aff29e\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-vvd45" Oct 07 14:33:57 crc kubenswrapper[4854]: I1007 14:33:57.151566 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fddef710-1c6d-44cc-8184-613bf1aff29e-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-vvd45\" (UID: \"fddef710-1c6d-44cc-8184-613bf1aff29e\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-vvd45" Oct 07 14:33:57 crc kubenswrapper[4854]: I1007 14:33:57.157736 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4hkt\" (UniqueName: \"kubernetes.io/projected/fddef710-1c6d-44cc-8184-613bf1aff29e-kube-api-access-z4hkt\") pod \"neutron-metadata-openstack-openstack-cell1-vvd45\" (UID: \"fddef710-1c6d-44cc-8184-613bf1aff29e\") " 
pod="openstack/neutron-metadata-openstack-openstack-cell1-vvd45" Oct 07 14:33:57 crc kubenswrapper[4854]: I1007 14:33:57.258173 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-vvd45" Oct 07 14:33:57 crc kubenswrapper[4854]: I1007 14:33:57.973116 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-vvd45"] Oct 07 14:33:58 crc kubenswrapper[4854]: I1007 14:33:58.449413 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fldws"] Oct 07 14:33:58 crc kubenswrapper[4854]: I1007 14:33:58.452103 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fldws" Oct 07 14:33:58 crc kubenswrapper[4854]: I1007 14:33:58.467285 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fldws"] Oct 07 14:33:58 crc kubenswrapper[4854]: I1007 14:33:58.559818 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83914987-4ba3-4266-b960-8d3d17caa7a8-utilities\") pod \"redhat-marketplace-fldws\" (UID: \"83914987-4ba3-4266-b960-8d3d17caa7a8\") " pod="openshift-marketplace/redhat-marketplace-fldws" Oct 07 14:33:58 crc kubenswrapper[4854]: I1007 14:33:58.560303 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjzq2\" (UniqueName: \"kubernetes.io/projected/83914987-4ba3-4266-b960-8d3d17caa7a8-kube-api-access-pjzq2\") pod \"redhat-marketplace-fldws\" (UID: \"83914987-4ba3-4266-b960-8d3d17caa7a8\") " pod="openshift-marketplace/redhat-marketplace-fldws" Oct 07 14:33:58 crc kubenswrapper[4854]: I1007 14:33:58.560466 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83914987-4ba3-4266-b960-8d3d17caa7a8-catalog-content\") pod \"redhat-marketplace-fldws\" (UID: \"83914987-4ba3-4266-b960-8d3d17caa7a8\") " pod="openshift-marketplace/redhat-marketplace-fldws" Oct 07 14:33:58 crc kubenswrapper[4854]: I1007 14:33:58.662787 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjzq2\" (UniqueName: \"kubernetes.io/projected/83914987-4ba3-4266-b960-8d3d17caa7a8-kube-api-access-pjzq2\") pod \"redhat-marketplace-fldws\" (UID: \"83914987-4ba3-4266-b960-8d3d17caa7a8\") " pod="openshift-marketplace/redhat-marketplace-fldws" Oct 07 14:33:58 crc kubenswrapper[4854]: I1007 14:33:58.662856 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83914987-4ba3-4266-b960-8d3d17caa7a8-catalog-content\") pod \"redhat-marketplace-fldws\" (UID: \"83914987-4ba3-4266-b960-8d3d17caa7a8\") " pod="openshift-marketplace/redhat-marketplace-fldws" Oct 07 14:33:58 crc kubenswrapper[4854]: I1007 14:33:58.663000 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83914987-4ba3-4266-b960-8d3d17caa7a8-utilities\") pod \"redhat-marketplace-fldws\" (UID: \"83914987-4ba3-4266-b960-8d3d17caa7a8\") " pod="openshift-marketplace/redhat-marketplace-fldws" Oct 07 14:33:58 crc kubenswrapper[4854]: I1007 14:33:58.663646 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83914987-4ba3-4266-b960-8d3d17caa7a8-catalog-content\") pod \"redhat-marketplace-fldws\" (UID: \"83914987-4ba3-4266-b960-8d3d17caa7a8\") " pod="openshift-marketplace/redhat-marketplace-fldws" Oct 07 14:33:58 crc kubenswrapper[4854]: I1007 14:33:58.663743 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83914987-4ba3-4266-b960-8d3d17caa7a8-utilities\") pod \"redhat-marketplace-fldws\" (UID: \"83914987-4ba3-4266-b960-8d3d17caa7a8\") " pod="openshift-marketplace/redhat-marketplace-fldws" Oct 07 14:33:58 crc kubenswrapper[4854]: I1007 14:33:58.681910 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjzq2\" (UniqueName: \"kubernetes.io/projected/83914987-4ba3-4266-b960-8d3d17caa7a8-kube-api-access-pjzq2\") pod \"redhat-marketplace-fldws\" (UID: \"83914987-4ba3-4266-b960-8d3d17caa7a8\") " pod="openshift-marketplace/redhat-marketplace-fldws" Oct 07 14:33:58 crc kubenswrapper[4854]: I1007 14:33:58.784402 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fldws" Oct 07 14:33:58 crc kubenswrapper[4854]: I1007 14:33:58.860965 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-vvd45" event={"ID":"fddef710-1c6d-44cc-8184-613bf1aff29e","Type":"ContainerStarted","Data":"ef2b78fe7ddaaee604678d082392425416c4e51be1606fd3caafa62c2ac2fb75"} Oct 07 14:33:59 crc kubenswrapper[4854]: I1007 14:33:59.294459 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fldws"] Oct 07 14:33:59 crc kubenswrapper[4854]: I1007 14:33:59.875890 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-vvd45" event={"ID":"fddef710-1c6d-44cc-8184-613bf1aff29e","Type":"ContainerStarted","Data":"66009f47d612bd39145cf7d14c07fbd2164725415ce12bd861f8b426c01cc4ed"} Oct 07 14:33:59 crc kubenswrapper[4854]: I1007 14:33:59.882770 4854 generic.go:334] "Generic (PLEG): container finished" podID="83914987-4ba3-4266-b960-8d3d17caa7a8" containerID="1ca93866356834e9935e309f4fad8760858b174d4e9e58c6f3509831905181da" exitCode=0 Oct 07 14:33:59 crc kubenswrapper[4854]: I1007 14:33:59.882861 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fldws" event={"ID":"83914987-4ba3-4266-b960-8d3d17caa7a8","Type":"ContainerDied","Data":"1ca93866356834e9935e309f4fad8760858b174d4e9e58c6f3509831905181da"} Oct 07 14:33:59 crc kubenswrapper[4854]: I1007 14:33:59.882920 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fldws" event={"ID":"83914987-4ba3-4266-b960-8d3d17caa7a8","Type":"ContainerStarted","Data":"773351c05bb4e04764d72562c685a495699ee8fb3b6f1810ae96fd1bbe240542"} Oct 07 14:33:59 crc kubenswrapper[4854]: I1007 14:33:59.907324 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-openstack-openstack-cell1-vvd45" podStartSLOduration=3.000041032 podStartE2EDuration="3.907296297s" podCreationTimestamp="2025-10-07 14:33:56 +0000 UTC" firstStartedPulling="2025-10-07 14:33:57.988257674 +0000 UTC m=+7753.976089969" lastFinishedPulling="2025-10-07 14:33:58.895512979 +0000 UTC m=+7754.883345234" observedRunningTime="2025-10-07 14:33:59.897198376 +0000 UTC m=+7755.885030711" 
watchObservedRunningTime="2025-10-07 14:33:59.907296297 +0000 UTC m=+7755.895128592" Oct 07 14:34:02 crc kubenswrapper[4854]: I1007 14:34:02.919021 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fldws" event={"ID":"83914987-4ba3-4266-b960-8d3d17caa7a8","Type":"ContainerStarted","Data":"d9532d1c7b399754c59d9807047f2cf7f809316e6d4f7b6f4a66ec39cca80a15"} Oct 07 14:34:03 crc kubenswrapper[4854]: I1007 14:34:03.931663 4854 generic.go:334] "Generic (PLEG): container finished" podID="83914987-4ba3-4266-b960-8d3d17caa7a8" containerID="d9532d1c7b399754c59d9807047f2cf7f809316e6d4f7b6f4a66ec39cca80a15" exitCode=0 Oct 07 14:34:03 crc kubenswrapper[4854]: I1007 14:34:03.931720 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fldws" event={"ID":"83914987-4ba3-4266-b960-8d3d17caa7a8","Type":"ContainerDied","Data":"d9532d1c7b399754c59d9807047f2cf7f809316e6d4f7b6f4a66ec39cca80a15"} Oct 07 14:34:05 crc kubenswrapper[4854]: I1007 14:34:05.961615 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fldws" event={"ID":"83914987-4ba3-4266-b960-8d3d17caa7a8","Type":"ContainerStarted","Data":"38af60f63f65fd8312611efed109a34342d5316d6a23a32c71519cf80bdfb815"} Oct 07 14:34:05 crc kubenswrapper[4854]: I1007 14:34:05.978130 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fldws" podStartSLOduration=3.053915883 podStartE2EDuration="7.978111569s" podCreationTimestamp="2025-10-07 14:33:58 +0000 UTC" firstStartedPulling="2025-10-07 14:33:59.88625415 +0000 UTC m=+7755.874086445" lastFinishedPulling="2025-10-07 14:34:04.810449876 +0000 UTC m=+7760.798282131" observedRunningTime="2025-10-07 14:34:05.976899864 +0000 UTC m=+7761.964732129" watchObservedRunningTime="2025-10-07 14:34:05.978111569 +0000 UTC m=+7761.965943824" Oct 07 14:34:08 crc kubenswrapper[4854]: I1007 14:34:08.785095 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fldws" Oct 07 14:34:08 crc kubenswrapper[4854]: I1007 14:34:08.785746 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fldws" Oct 07 14:34:08 crc kubenswrapper[4854]: I1007 14:34:08.858751 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fldws" Oct 07 14:34:18 crc kubenswrapper[4854]: I1007 14:34:18.851498 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fldws" Oct 07 14:34:18 crc kubenswrapper[4854]: I1007 14:34:18.912065 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fldws"] Oct 07 14:34:19 crc kubenswrapper[4854]: I1007 14:34:19.116458 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fldws" podUID="83914987-4ba3-4266-b960-8d3d17caa7a8" containerName="registry-server" containerID="cri-o://38af60f63f65fd8312611efed109a34342d5316d6a23a32c71519cf80bdfb815" gracePeriod=2 Oct 07 14:34:19 crc kubenswrapper[4854]: I1007 14:34:19.646909 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fldws" Oct 07 14:34:19 crc kubenswrapper[4854]: I1007 14:34:19.792541 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjzq2\" (UniqueName: \"kubernetes.io/projected/83914987-4ba3-4266-b960-8d3d17caa7a8-kube-api-access-pjzq2\") pod \"83914987-4ba3-4266-b960-8d3d17caa7a8\" (UID: \"83914987-4ba3-4266-b960-8d3d17caa7a8\") " Oct 07 14:34:19 crc kubenswrapper[4854]: I1007 14:34:19.792725 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83914987-4ba3-4266-b960-8d3d17caa7a8-catalog-content\") pod \"83914987-4ba3-4266-b960-8d3d17caa7a8\" (UID: \"83914987-4ba3-4266-b960-8d3d17caa7a8\") " Oct 07 14:34:19 crc kubenswrapper[4854]: I1007 14:34:19.792845 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83914987-4ba3-4266-b960-8d3d17caa7a8-utilities\") pod \"83914987-4ba3-4266-b960-8d3d17caa7a8\" (UID: \"83914987-4ba3-4266-b960-8d3d17caa7a8\") " Oct 07 14:34:19 crc kubenswrapper[4854]: I1007 14:34:19.794097 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83914987-4ba3-4266-b960-8d3d17caa7a8-utilities" (OuterVolumeSpecName: "utilities") pod "83914987-4ba3-4266-b960-8d3d17caa7a8" (UID: "83914987-4ba3-4266-b960-8d3d17caa7a8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:34:19 crc kubenswrapper[4854]: I1007 14:34:19.805546 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83914987-4ba3-4266-b960-8d3d17caa7a8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "83914987-4ba3-4266-b960-8d3d17caa7a8" (UID: "83914987-4ba3-4266-b960-8d3d17caa7a8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:34:19 crc kubenswrapper[4854]: I1007 14:34:19.806778 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83914987-4ba3-4266-b960-8d3d17caa7a8-kube-api-access-pjzq2" (OuterVolumeSpecName: "kube-api-access-pjzq2") pod "83914987-4ba3-4266-b960-8d3d17caa7a8" (UID: "83914987-4ba3-4266-b960-8d3d17caa7a8"). InnerVolumeSpecName "kube-api-access-pjzq2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:34:19 crc kubenswrapper[4854]: I1007 14:34:19.895245 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83914987-4ba3-4266-b960-8d3d17caa7a8-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 14:34:19 crc kubenswrapper[4854]: I1007 14:34:19.895288 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83914987-4ba3-4266-b960-8d3d17caa7a8-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 14:34:19 crc kubenswrapper[4854]: I1007 14:34:19.895309 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjzq2\" (UniqueName: \"kubernetes.io/projected/83914987-4ba3-4266-b960-8d3d17caa7a8-kube-api-access-pjzq2\") on node \"crc\" DevicePath \"\"" Oct 07 14:34:20 crc kubenswrapper[4854]: I1007 14:34:20.130936 4854 generic.go:334] "Generic (PLEG): container finished" podID="83914987-4ba3-4266-b960-8d3d17caa7a8" containerID="38af60f63f65fd8312611efed109a34342d5316d6a23a32c71519cf80bdfb815" exitCode=0 Oct 07 14:34:20 crc kubenswrapper[4854]: I1007 14:34:20.130998 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fldws" event={"ID":"83914987-4ba3-4266-b960-8d3d17caa7a8","Type":"ContainerDied","Data":"38af60f63f65fd8312611efed109a34342d5316d6a23a32c71519cf80bdfb815"} Oct 07 14:34:20 crc kubenswrapper[4854]: I1007 14:34:20.131069 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fldws" event={"ID":"83914987-4ba3-4266-b960-8d3d17caa7a8","Type":"ContainerDied","Data":"773351c05bb4e04764d72562c685a495699ee8fb3b6f1810ae96fd1bbe240542"} Oct 07 14:34:20 crc kubenswrapper[4854]: I1007 14:34:20.131094 4854 scope.go:117] "RemoveContainer" containerID="38af60f63f65fd8312611efed109a34342d5316d6a23a32c71519cf80bdfb815" Oct 07 14:34:20 crc kubenswrapper[4854]: I1007 14:34:20.131014 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fldws" Oct 07 14:34:20 crc kubenswrapper[4854]: I1007 14:34:20.160657 4854 scope.go:117] "RemoveContainer" containerID="d9532d1c7b399754c59d9807047f2cf7f809316e6d4f7b6f4a66ec39cca80a15" Oct 07 14:34:20 crc kubenswrapper[4854]: I1007 14:34:20.182011 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fldws"] Oct 07 14:34:20 crc kubenswrapper[4854]: I1007 14:34:20.190381 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fldws"] Oct 07 14:34:20 crc kubenswrapper[4854]: I1007 14:34:20.206867 4854 scope.go:117] "RemoveContainer" containerID="1ca93866356834e9935e309f4fad8760858b174d4e9e58c6f3509831905181da" Oct 07 14:34:20 crc kubenswrapper[4854]: I1007 14:34:20.271127 4854 scope.go:117] "RemoveContainer" containerID="38af60f63f65fd8312611efed109a34342d5316d6a23a32c71519cf80bdfb815" Oct 07 14:34:20 crc kubenswrapper[4854]: E1007 14:34:20.271632 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38af60f63f65fd8312611efed109a34342d5316d6a23a32c71519cf80bdfb815\": container with ID starting with 38af60f63f65fd8312611efed109a34342d5316d6a23a32c71519cf80bdfb815 not found: ID does not exist" containerID="38af60f63f65fd8312611efed109a34342d5316d6a23a32c71519cf80bdfb815" Oct 07 14:34:20 crc kubenswrapper[4854]: I1007 14:34:20.271698 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38af60f63f65fd8312611efed109a34342d5316d6a23a32c71519cf80bdfb815"} err="failed to get container status \"38af60f63f65fd8312611efed109a34342d5316d6a23a32c71519cf80bdfb815\": rpc error: code = NotFound desc = could not find container \"38af60f63f65fd8312611efed109a34342d5316d6a23a32c71519cf80bdfb815\": container with ID starting with 38af60f63f65fd8312611efed109a34342d5316d6a23a32c71519cf80bdfb815 not found: ID does not exist" Oct 07 14:34:20 crc kubenswrapper[4854]: I1007 14:34:20.271750 4854 scope.go:117] "RemoveContainer" containerID="d9532d1c7b399754c59d9807047f2cf7f809316e6d4f7b6f4a66ec39cca80a15" Oct 07 14:34:20 crc kubenswrapper[4854]: E1007 14:34:20.272227 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9532d1c7b399754c59d9807047f2cf7f809316e6d4f7b6f4a66ec39cca80a15\": container with ID starting with d9532d1c7b399754c59d9807047f2cf7f809316e6d4f7b6f4a66ec39cca80a15 not found: ID does not exist" containerID="d9532d1c7b399754c59d9807047f2cf7f809316e6d4f7b6f4a66ec39cca80a15" Oct 07 14:34:20 crc kubenswrapper[4854]: I1007 14:34:20.272291 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9532d1c7b399754c59d9807047f2cf7f809316e6d4f7b6f4a66ec39cca80a15"} err="failed to get container status \"d9532d1c7b399754c59d9807047f2cf7f809316e6d4f7b6f4a66ec39cca80a15\": rpc error: code = NotFound desc = could not find container \"d9532d1c7b399754c59d9807047f2cf7f809316e6d4f7b6f4a66ec39cca80a15\": container with ID starting with d9532d1c7b399754c59d9807047f2cf7f809316e6d4f7b6f4a66ec39cca80a15 not found: ID does not exist" Oct 07 14:34:20 crc kubenswrapper[4854]: I1007 14:34:20.272321 4854 scope.go:117] "RemoveContainer" containerID="1ca93866356834e9935e309f4fad8760858b174d4e9e58c6f3509831905181da" Oct 07 14:34:20 crc kubenswrapper[4854]: E1007 14:34:20.272628 4854 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"1ca93866356834e9935e309f4fad8760858b174d4e9e58c6f3509831905181da\": container with ID starting with 1ca93866356834e9935e309f4fad8760858b174d4e9e58c6f3509831905181da not found: ID does not exist" containerID="1ca93866356834e9935e309f4fad8760858b174d4e9e58c6f3509831905181da" Oct 07 14:34:20 crc kubenswrapper[4854]: I1007 14:34:20.272660 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ca93866356834e9935e309f4fad8760858b174d4e9e58c6f3509831905181da"} err="failed to get container status \"1ca93866356834e9935e309f4fad8760858b174d4e9e58c6f3509831905181da\": rpc error: code = NotFound desc = could not find container \"1ca93866356834e9935e309f4fad8760858b174d4e9e58c6f3509831905181da\": container with ID starting with 1ca93866356834e9935e309f4fad8760858b174d4e9e58c6f3509831905181da not found: ID does not exist" Oct 07 14:34:20 crc kubenswrapper[4854]: I1007 14:34:20.719983 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83914987-4ba3-4266-b960-8d3d17caa7a8" path="/var/lib/kubelet/pods/83914987-4ba3-4266-b960-8d3d17caa7a8/volumes" Oct 07 14:34:53 crc kubenswrapper[4854]: I1007 14:34:53.484520 4854 generic.go:334] "Generic (PLEG): container finished" podID="fddef710-1c6d-44cc-8184-613bf1aff29e" containerID="66009f47d612bd39145cf7d14c07fbd2164725415ce12bd861f8b426c01cc4ed" exitCode=0 Oct 07 14:34:53 crc kubenswrapper[4854]: I1007 14:34:53.484762 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-vvd45" event={"ID":"fddef710-1c6d-44cc-8184-613bf1aff29e","Type":"ContainerDied","Data":"66009f47d612bd39145cf7d14c07fbd2164725415ce12bd861f8b426c01cc4ed"} Oct 07 14:34:55 crc kubenswrapper[4854]: I1007 14:34:55.007809 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-vvd45" Oct 07 14:34:55 crc kubenswrapper[4854]: I1007 14:34:55.153162 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/fddef710-1c6d-44cc-8184-613bf1aff29e-nova-metadata-neutron-config-0\") pod \"fddef710-1c6d-44cc-8184-613bf1aff29e\" (UID: \"fddef710-1c6d-44cc-8184-613bf1aff29e\") " Oct 07 14:34:55 crc kubenswrapper[4854]: I1007 14:34:55.153251 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fddef710-1c6d-44cc-8184-613bf1aff29e-ceph\") pod \"fddef710-1c6d-44cc-8184-613bf1aff29e\" (UID: \"fddef710-1c6d-44cc-8184-613bf1aff29e\") " Oct 07 14:34:55 crc kubenswrapper[4854]: I1007 14:34:55.153326 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4hkt\" (UniqueName: \"kubernetes.io/projected/fddef710-1c6d-44cc-8184-613bf1aff29e-kube-api-access-z4hkt\") pod \"fddef710-1c6d-44cc-8184-613bf1aff29e\" (UID: \"fddef710-1c6d-44cc-8184-613bf1aff29e\") " Oct 07 14:34:55 crc kubenswrapper[4854]: I1007 14:34:55.153487 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fddef710-1c6d-44cc-8184-613bf1aff29e-ssh-key\") pod \"fddef710-1c6d-44cc-8184-613bf1aff29e\" (UID: \"fddef710-1c6d-44cc-8184-613bf1aff29e\") " Oct 07 14:34:55 crc kubenswrapper[4854]: I1007 14:34:55.153573 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/fddef710-1c6d-44cc-8184-613bf1aff29e-neutron-ovn-metadata-agent-neutron-config-0\") pod \"fddef710-1c6d-44cc-8184-613bf1aff29e\" (UID: \"fddef710-1c6d-44cc-8184-613bf1aff29e\") " Oct 07 14:34:55 crc kubenswrapper[4854]: I1007 14:34:55.153724 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fddef710-1c6d-44cc-8184-613bf1aff29e-neutron-metadata-combined-ca-bundle\") pod \"fddef710-1c6d-44cc-8184-613bf1aff29e\" (UID: \"fddef710-1c6d-44cc-8184-613bf1aff29e\") " Oct 07 14:34:55 crc kubenswrapper[4854]: I1007 14:34:55.153757 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fddef710-1c6d-44cc-8184-613bf1aff29e-inventory\") pod \"fddef710-1c6d-44cc-8184-613bf1aff29e\" (UID: \"fddef710-1c6d-44cc-8184-613bf1aff29e\") " Oct 07 14:34:55 crc kubenswrapper[4854]: I1007 14:34:55.159619 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fddef710-1c6d-44cc-8184-613bf1aff29e-ceph" (OuterVolumeSpecName: "ceph") pod "fddef710-1c6d-44cc-8184-613bf1aff29e" (UID: "fddef710-1c6d-44cc-8184-613bf1aff29e"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:34:55 crc kubenswrapper[4854]: I1007 14:34:55.159737 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fddef710-1c6d-44cc-8184-613bf1aff29e-kube-api-access-z4hkt" (OuterVolumeSpecName: "kube-api-access-z4hkt") pod "fddef710-1c6d-44cc-8184-613bf1aff29e" (UID: "fddef710-1c6d-44cc-8184-613bf1aff29e"). InnerVolumeSpecName "kube-api-access-z4hkt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:34:55 crc kubenswrapper[4854]: I1007 14:34:55.159941 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fddef710-1c6d-44cc-8184-613bf1aff29e-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "fddef710-1c6d-44cc-8184-613bf1aff29e" (UID: "fddef710-1c6d-44cc-8184-613bf1aff29e"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:34:55 crc kubenswrapper[4854]: I1007 14:34:55.184784 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fddef710-1c6d-44cc-8184-613bf1aff29e-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "fddef710-1c6d-44cc-8184-613bf1aff29e" (UID: "fddef710-1c6d-44cc-8184-613bf1aff29e"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:34:55 crc kubenswrapper[4854]: I1007 14:34:55.189546 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fddef710-1c6d-44cc-8184-613bf1aff29e-inventory" (OuterVolumeSpecName: "inventory") pod "fddef710-1c6d-44cc-8184-613bf1aff29e" (UID: "fddef710-1c6d-44cc-8184-613bf1aff29e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:34:55 crc kubenswrapper[4854]: I1007 14:34:55.197383 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fddef710-1c6d-44cc-8184-613bf1aff29e-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "fddef710-1c6d-44cc-8184-613bf1aff29e" (UID: "fddef710-1c6d-44cc-8184-613bf1aff29e"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:34:55 crc kubenswrapper[4854]: I1007 14:34:55.213893 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fddef710-1c6d-44cc-8184-613bf1aff29e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "fddef710-1c6d-44cc-8184-613bf1aff29e" (UID: "fddef710-1c6d-44cc-8184-613bf1aff29e"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:34:55 crc kubenswrapper[4854]: I1007 14:34:55.259586 4854 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fddef710-1c6d-44cc-8184-613bf1aff29e-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 14:34:55 crc kubenswrapper[4854]: I1007 14:34:55.259632 4854 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/fddef710-1c6d-44cc-8184-613bf1aff29e-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 07 14:34:55 crc kubenswrapper[4854]: I1007 14:34:55.259648 4854 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fddef710-1c6d-44cc-8184-613bf1aff29e-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:34:55 crc kubenswrapper[4854]: I1007 14:34:55.259660 4854 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fddef710-1c6d-44cc-8184-613bf1aff29e-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 14:34:55 crc kubenswrapper[4854]: I1007 14:34:55.259673 4854 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/fddef710-1c6d-44cc-8184-613bf1aff29e-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 07 14:34:55 crc kubenswrapper[4854]: I1007 14:34:55.259685 4854 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fddef710-1c6d-44cc-8184-613bf1aff29e-ceph\") on node \"crc\" DevicePath \"\"" Oct 07 14:34:55 crc kubenswrapper[4854]: I1007 14:34:55.259699 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4hkt\" (UniqueName: \"kubernetes.io/projected/fddef710-1c6d-44cc-8184-613bf1aff29e-kube-api-access-z4hkt\") on node \"crc\" DevicePath \"\"" Oct 07 14:34:55 crc kubenswrapper[4854]: I1007 14:34:55.548244 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-vvd45" event={"ID":"fddef710-1c6d-44cc-8184-613bf1aff29e","Type":"ContainerDied","Data":"ef2b78fe7ddaaee604678d082392425416c4e51be1606fd3caafa62c2ac2fb75"} Oct 07 14:34:55 crc kubenswrapper[4854]: I1007 14:34:55.548292 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef2b78fe7ddaaee604678d082392425416c4e51be1606fd3caafa62c2ac2fb75" Oct 07 14:34:55 crc kubenswrapper[4854]: I1007 14:34:55.548290 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-vvd45" Oct 07 14:34:55 crc kubenswrapper[4854]: I1007 14:34:55.700284 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-6h9d9"] Oct 07 14:34:55 crc kubenswrapper[4854]: E1007 14:34:55.700695 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fddef710-1c6d-44cc-8184-613bf1aff29e" containerName="neutron-metadata-openstack-openstack-cell1" Oct 07 14:34:55 crc kubenswrapper[4854]: I1007 14:34:55.700711 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="fddef710-1c6d-44cc-8184-613bf1aff29e" containerName="neutron-metadata-openstack-openstack-cell1" Oct 07 14:34:55 crc kubenswrapper[4854]: E1007 14:34:55.700734 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83914987-4ba3-4266-b960-8d3d17caa7a8" containerName="registry-server" Oct 07 14:34:55 crc kubenswrapper[4854]: I1007 14:34:55.700741 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="83914987-4ba3-4266-b960-8d3d17caa7a8" containerName="registry-server" Oct 07 14:34:55 crc kubenswrapper[4854]: E1007 14:34:55.700770 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83914987-4ba3-4266-b960-8d3d17caa7a8" containerName="extract-utilities" Oct 07 14:34:55 crc kubenswrapper[4854]: I1007 14:34:55.700776 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="83914987-4ba3-4266-b960-8d3d17caa7a8" containerName="extract-utilities" Oct 07 14:34:55 crc kubenswrapper[4854]: E1007 14:34:55.700788 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83914987-4ba3-4266-b960-8d3d17caa7a8" containerName="extract-content" Oct 07 14:34:55 crc kubenswrapper[4854]: I1007 14:34:55.700793 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="83914987-4ba3-4266-b960-8d3d17caa7a8" containerName="extract-content" Oct 07 14:34:55 crc kubenswrapper[4854]: I1007 14:34:55.700980 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="fddef710-1c6d-44cc-8184-613bf1aff29e" containerName="neutron-metadata-openstack-openstack-cell1" Oct 07 14:34:55 crc kubenswrapper[4854]: I1007 14:34:55.700999 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="83914987-4ba3-4266-b960-8d3d17caa7a8" containerName="registry-server" Oct 07 14:34:55 crc kubenswrapper[4854]: I1007 14:34:55.701745 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-6h9d9" Oct 07 14:34:55 crc kubenswrapper[4854]: I1007 14:34:55.719713 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-6h9d9"] Oct 07 14:34:55 crc kubenswrapper[4854]: I1007 14:34:55.720404 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 07 14:34:55 crc kubenswrapper[4854]: I1007 14:34:55.720722 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Oct 07 14:34:55 crc kubenswrapper[4854]: I1007 14:34:55.720963 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 14:34:55 crc kubenswrapper[4854]: I1007 14:34:55.721112 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 07 14:34:55 crc kubenswrapper[4854]: I1007 14:34:55.721297 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-n7cf5" Oct 07 14:34:55 crc kubenswrapper[4854]: I1007 14:34:55.772976 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/42bf907e-8047-4f86-99df-41a920bec529-ssh-key\") pod \"libvirt-openstack-openstack-cell1-6h9d9\" (UID: \"42bf907e-8047-4f86-99df-41a920bec529\") " pod="openstack/libvirt-openstack-openstack-cell1-6h9d9" Oct 07 14:34:55 crc kubenswrapper[4854]: I1007 14:34:55.773032 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42bf907e-8047-4f86-99df-41a920bec529-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-6h9d9\" (UID: \"42bf907e-8047-4f86-99df-41a920bec529\") " pod="openstack/libvirt-openstack-openstack-cell1-6h9d9" Oct 07 14:34:55 crc kubenswrapper[4854]: I1007 14:34:55.773228 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvrs7\" (UniqueName: \"kubernetes.io/projected/42bf907e-8047-4f86-99df-41a920bec529-kube-api-access-vvrs7\") pod \"libvirt-openstack-openstack-cell1-6h9d9\" (UID: \"42bf907e-8047-4f86-99df-41a920bec529\") " pod="openstack/libvirt-openstack-openstack-cell1-6h9d9" Oct 07 14:34:55 crc kubenswrapper[4854]: I1007 14:34:55.773403 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/42bf907e-8047-4f86-99df-41a920bec529-ceph\") pod \"libvirt-openstack-openstack-cell1-6h9d9\" (UID: \"42bf907e-8047-4f86-99df-41a920bec529\") " pod="openstack/libvirt-openstack-openstack-cell1-6h9d9" Oct 07 14:34:55 crc kubenswrapper[4854]: I1007 14:34:55.773431 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/42bf907e-8047-4f86-99df-41a920bec529-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-6h9d9\" (UID: \"42bf907e-8047-4f86-99df-41a920bec529\") " pod="openstack/libvirt-openstack-openstack-cell1-6h9d9" Oct 07 14:34:55 crc kubenswrapper[4854]: I1007 14:34:55.773714 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/42bf907e-8047-4f86-99df-41a920bec529-inventory\") pod 
\"libvirt-openstack-openstack-cell1-6h9d9\" (UID: \"42bf907e-8047-4f86-99df-41a920bec529\") " pod="openstack/libvirt-openstack-openstack-cell1-6h9d9" Oct 07 14:34:55 crc kubenswrapper[4854]: I1007 14:34:55.874774 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvrs7\" (UniqueName: \"kubernetes.io/projected/42bf907e-8047-4f86-99df-41a920bec529-kube-api-access-vvrs7\") pod \"libvirt-openstack-openstack-cell1-6h9d9\" (UID: \"42bf907e-8047-4f86-99df-41a920bec529\") " pod="openstack/libvirt-openstack-openstack-cell1-6h9d9" Oct 07 14:34:55 crc kubenswrapper[4854]: I1007 14:34:55.875098 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/42bf907e-8047-4f86-99df-41a920bec529-ceph\") pod \"libvirt-openstack-openstack-cell1-6h9d9\" (UID: \"42bf907e-8047-4f86-99df-41a920bec529\") " pod="openstack/libvirt-openstack-openstack-cell1-6h9d9" Oct 07 14:34:55 crc kubenswrapper[4854]: I1007 14:34:55.875192 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/42bf907e-8047-4f86-99df-41a920bec529-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-6h9d9\" (UID: \"42bf907e-8047-4f86-99df-41a920bec529\") " pod="openstack/libvirt-openstack-openstack-cell1-6h9d9" Oct 07 14:34:55 crc kubenswrapper[4854]: I1007 14:34:55.875373 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/42bf907e-8047-4f86-99df-41a920bec529-inventory\") pod \"libvirt-openstack-openstack-cell1-6h9d9\" (UID: \"42bf907e-8047-4f86-99df-41a920bec529\") " pod="openstack/libvirt-openstack-openstack-cell1-6h9d9" Oct 07 14:34:55 crc kubenswrapper[4854]: I1007 14:34:55.875540 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/42bf907e-8047-4f86-99df-41a920bec529-ssh-key\") pod \"libvirt-openstack-openstack-cell1-6h9d9\" (UID: \"42bf907e-8047-4f86-99df-41a920bec529\") " pod="openstack/libvirt-openstack-openstack-cell1-6h9d9" Oct 07 14:34:55 crc kubenswrapper[4854]: I1007 14:34:55.875607 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42bf907e-8047-4f86-99df-41a920bec529-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-6h9d9\" (UID: \"42bf907e-8047-4f86-99df-41a920bec529\") " pod="openstack/libvirt-openstack-openstack-cell1-6h9d9" Oct 07 14:34:55 crc kubenswrapper[4854]: I1007 14:34:55.883990 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42bf907e-8047-4f86-99df-41a920bec529-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-6h9d9\" (UID: \"42bf907e-8047-4f86-99df-41a920bec529\") " pod="openstack/libvirt-openstack-openstack-cell1-6h9d9" Oct 07 14:34:55 crc kubenswrapper[4854]: I1007 14:34:55.884428 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/42bf907e-8047-4f86-99df-41a920bec529-inventory\") pod \"libvirt-openstack-openstack-cell1-6h9d9\" (UID: \"42bf907e-8047-4f86-99df-41a920bec529\") " pod="openstack/libvirt-openstack-openstack-cell1-6h9d9" Oct 07 14:34:55 crc kubenswrapper[4854]: I1007 14:34:55.890250 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ssh-key\" (UniqueName: \"kubernetes.io/secret/42bf907e-8047-4f86-99df-41a920bec529-ssh-key\") pod \"libvirt-openstack-openstack-cell1-6h9d9\" (UID: \"42bf907e-8047-4f86-99df-41a920bec529\") " pod="openstack/libvirt-openstack-openstack-cell1-6h9d9" Oct 07 14:34:55 crc kubenswrapper[4854]: I1007 14:34:55.893131 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/42bf907e-8047-4f86-99df-41a920bec529-ceph\") pod \"libvirt-openstack-openstack-cell1-6h9d9\" (UID: \"42bf907e-8047-4f86-99df-41a920bec529\") " pod="openstack/libvirt-openstack-openstack-cell1-6h9d9" Oct 07 14:34:55 crc kubenswrapper[4854]: I1007 14:34:55.893900 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/42bf907e-8047-4f86-99df-41a920bec529-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-6h9d9\" (UID: \"42bf907e-8047-4f86-99df-41a920bec529\") " pod="openstack/libvirt-openstack-openstack-cell1-6h9d9" Oct 07 14:34:55 crc kubenswrapper[4854]: I1007 14:34:55.898401 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvrs7\" (UniqueName: \"kubernetes.io/projected/42bf907e-8047-4f86-99df-41a920bec529-kube-api-access-vvrs7\") pod \"libvirt-openstack-openstack-cell1-6h9d9\" (UID: \"42bf907e-8047-4f86-99df-41a920bec529\") " pod="openstack/libvirt-openstack-openstack-cell1-6h9d9" Oct 07 14:34:56 crc kubenswrapper[4854]: I1007 14:34:56.079538 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-6h9d9" Oct 07 14:34:56 crc kubenswrapper[4854]: I1007 14:34:56.700033 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-6h9d9"] Oct 07 14:34:56 crc kubenswrapper[4854]: W1007 14:34:56.703555 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod42bf907e_8047_4f86_99df_41a920bec529.slice/crio-bd2d0941d7196bdfaf72244cd9bf27dc5a542e7ecb912c0a33d9e8d9ee3ad134 WatchSource:0}: Error finding container bd2d0941d7196bdfaf72244cd9bf27dc5a542e7ecb912c0a33d9e8d9ee3ad134: Status 404 returned error can't find the container with id bd2d0941d7196bdfaf72244cd9bf27dc5a542e7ecb912c0a33d9e8d9ee3ad134 Oct 07 14:34:56 crc kubenswrapper[4854]: I1007 14:34:56.708728 4854 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 14:34:57 crc kubenswrapper[4854]: I1007 14:34:57.575675 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-6h9d9" event={"ID":"42bf907e-8047-4f86-99df-41a920bec529","Type":"ContainerStarted","Data":"bd2d0941d7196bdfaf72244cd9bf27dc5a542e7ecb912c0a33d9e8d9ee3ad134"} Oct 07 14:34:58 crc kubenswrapper[4854]: I1007 14:34:58.590573 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-6h9d9" event={"ID":"42bf907e-8047-4f86-99df-41a920bec529","Type":"ContainerStarted","Data":"afc4a2b5e47240e928af7ea1f798628bca912213252c3953ce8ee65e140973e5"} Oct 07 14:35:52 crc kubenswrapper[4854]: I1007 14:35:52.962514 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-openstack-openstack-cell1-6h9d9" podStartSLOduration=56.679416378 podStartE2EDuration="57.962494566s" podCreationTimestamp="2025-10-07 14:34:55 +0000 UTC" firstStartedPulling="2025-10-07 14:34:56.708263537 +0000 UTC 
m=+7812.696095832" lastFinishedPulling="2025-10-07 14:34:57.991341725 +0000 UTC m=+7813.979174020" observedRunningTime="2025-10-07 14:34:58.612217294 +0000 UTC m=+7814.600049579" watchObservedRunningTime="2025-10-07 14:35:52.962494566 +0000 UTC m=+7868.950326831" Oct 07 14:35:52 crc kubenswrapper[4854]: I1007 14:35:52.975854 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-x5hrp"] Oct 07 14:35:52 crc kubenswrapper[4854]: I1007 14:35:52.978469 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x5hrp" Oct 07 14:35:53 crc kubenswrapper[4854]: I1007 14:35:53.000257 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x5hrp"] Oct 07 14:35:53 crc kubenswrapper[4854]: I1007 14:35:53.132019 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcg2s\" (UniqueName: \"kubernetes.io/projected/3822e1e1-e2d3-448b-9739-cda4b84e04b6-kube-api-access-kcg2s\") pod \"community-operators-x5hrp\" (UID: \"3822e1e1-e2d3-448b-9739-cda4b84e04b6\") " pod="openshift-marketplace/community-operators-x5hrp" Oct 07 14:35:53 crc kubenswrapper[4854]: I1007 14:35:53.132094 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3822e1e1-e2d3-448b-9739-cda4b84e04b6-catalog-content\") pod \"community-operators-x5hrp\" (UID: \"3822e1e1-e2d3-448b-9739-cda4b84e04b6\") " pod="openshift-marketplace/community-operators-x5hrp" Oct 07 14:35:53 crc kubenswrapper[4854]: I1007 14:35:53.132136 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3822e1e1-e2d3-448b-9739-cda4b84e04b6-utilities\") pod \"community-operators-x5hrp\" (UID: \"3822e1e1-e2d3-448b-9739-cda4b84e04b6\") " pod="openshift-marketplace/community-operators-x5hrp" Oct 07 14:35:53 crc kubenswrapper[4854]: I1007 14:35:53.234528 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcg2s\" (UniqueName: \"kubernetes.io/projected/3822e1e1-e2d3-448b-9739-cda4b84e04b6-kube-api-access-kcg2s\") pod \"community-operators-x5hrp\" (UID: \"3822e1e1-e2d3-448b-9739-cda4b84e04b6\") " pod="openshift-marketplace/community-operators-x5hrp" Oct 07 14:35:53 crc kubenswrapper[4854]: I1007 14:35:53.234637 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3822e1e1-e2d3-448b-9739-cda4b84e04b6-catalog-content\") pod \"community-operators-x5hrp\" (UID: \"3822e1e1-e2d3-448b-9739-cda4b84e04b6\") " pod="openshift-marketplace/community-operators-x5hrp" Oct 07 14:35:53 crc kubenswrapper[4854]: I1007 14:35:53.234687 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3822e1e1-e2d3-448b-9739-cda4b84e04b6-utilities\") pod \"community-operators-x5hrp\" (UID: \"3822e1e1-e2d3-448b-9739-cda4b84e04b6\") " pod="openshift-marketplace/community-operators-x5hrp" Oct 07 14:35:53 crc kubenswrapper[4854]: I1007 14:35:53.235218 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3822e1e1-e2d3-448b-9739-cda4b84e04b6-catalog-content\") pod \"community-operators-x5hrp\" (UID: 
\"3822e1e1-e2d3-448b-9739-cda4b84e04b6\") " pod="openshift-marketplace/community-operators-x5hrp" Oct 07 14:35:53 crc kubenswrapper[4854]: I1007 14:35:53.235285 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3822e1e1-e2d3-448b-9739-cda4b84e04b6-utilities\") pod \"community-operators-x5hrp\" (UID: \"3822e1e1-e2d3-448b-9739-cda4b84e04b6\") " pod="openshift-marketplace/community-operators-x5hrp" Oct 07 14:35:53 crc kubenswrapper[4854]: I1007 14:35:53.256926 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcg2s\" (UniqueName: \"kubernetes.io/projected/3822e1e1-e2d3-448b-9739-cda4b84e04b6-kube-api-access-kcg2s\") pod \"community-operators-x5hrp\" (UID: \"3822e1e1-e2d3-448b-9739-cda4b84e04b6\") " pod="openshift-marketplace/community-operators-x5hrp" Oct 07 14:35:53 crc kubenswrapper[4854]: I1007 14:35:53.305520 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x5hrp" Oct 07 14:35:53 crc kubenswrapper[4854]: I1007 14:35:53.860427 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x5hrp"] Oct 07 14:35:54 crc kubenswrapper[4854]: I1007 14:35:54.171455 4854 generic.go:334] "Generic (PLEG): container finished" podID="3822e1e1-e2d3-448b-9739-cda4b84e04b6" containerID="e19c96288fd9303eabbc0e0b06c1aab5ef8fd5565a81500cba250fc82a0aa40e" exitCode=0 Oct 07 14:35:54 crc kubenswrapper[4854]: I1007 14:35:54.171499 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x5hrp" event={"ID":"3822e1e1-e2d3-448b-9739-cda4b84e04b6","Type":"ContainerDied","Data":"e19c96288fd9303eabbc0e0b06c1aab5ef8fd5565a81500cba250fc82a0aa40e"} Oct 07 14:35:54 crc kubenswrapper[4854]: I1007 14:35:54.171523 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x5hrp" event={"ID":"3822e1e1-e2d3-448b-9739-cda4b84e04b6","Type":"ContainerStarted","Data":"ff60b93a61fb694df0a73c26788ac40eff9e5a2537aaa465f69a866ee84de8f6"} Oct 07 14:35:56 crc kubenswrapper[4854]: I1007 14:35:56.194409 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x5hrp" event={"ID":"3822e1e1-e2d3-448b-9739-cda4b84e04b6","Type":"ContainerStarted","Data":"b513a1564a9d6b516074a4d568486b5a2e8b36eda5fd741d9c0fce41c83a2e0b"} Oct 07 14:35:57 crc kubenswrapper[4854]: I1007 14:35:57.208462 4854 generic.go:334] "Generic (PLEG): container finished" podID="3822e1e1-e2d3-448b-9739-cda4b84e04b6" containerID="b513a1564a9d6b516074a4d568486b5a2e8b36eda5fd741d9c0fce41c83a2e0b" exitCode=0 Oct 07 14:35:57 crc kubenswrapper[4854]: I1007 14:35:57.208566 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x5hrp" event={"ID":"3822e1e1-e2d3-448b-9739-cda4b84e04b6","Type":"ContainerDied","Data":"b513a1564a9d6b516074a4d568486b5a2e8b36eda5fd741d9c0fce41c83a2e0b"} Oct 07 14:35:59 crc kubenswrapper[4854]: I1007 14:35:59.238570 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x5hrp" event={"ID":"3822e1e1-e2d3-448b-9739-cda4b84e04b6","Type":"ContainerStarted","Data":"497091e1c0a7821a260c8146ac11ae4f4df93e3bc59b27680827f8b4ecf718f9"} Oct 07 14:35:59 crc kubenswrapper[4854]: I1007 14:35:59.256730 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/community-operators-x5hrp" podStartSLOduration=3.413662418 podStartE2EDuration="7.256709747s" podCreationTimestamp="2025-10-07 14:35:52 +0000 UTC" firstStartedPulling="2025-10-07 14:35:54.173309972 +0000 UTC m=+7870.161142227" lastFinishedPulling="2025-10-07 14:35:58.016357291 +0000 UTC m=+7874.004189556" observedRunningTime="2025-10-07 14:35:59.254412021 +0000 UTC m=+7875.242244286" watchObservedRunningTime="2025-10-07 14:35:59.256709747 +0000 UTC m=+7875.244542002" Oct 07 14:36:03 crc kubenswrapper[4854]: I1007 14:36:03.305803 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-x5hrp" Oct 07 14:36:03 crc kubenswrapper[4854]: I1007 14:36:03.306433 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-x5hrp" Oct 07 14:36:03 crc kubenswrapper[4854]: I1007 14:36:03.364383 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-x5hrp" Oct 07 14:36:04 crc kubenswrapper[4854]: I1007 14:36:04.372966 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-x5hrp" Oct 07 14:36:04 crc kubenswrapper[4854]: I1007 14:36:04.421801 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-x5hrp"] Oct 07 14:36:06 crc kubenswrapper[4854]: I1007 14:36:06.316052 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-x5hrp" podUID="3822e1e1-e2d3-448b-9739-cda4b84e04b6" containerName="registry-server" containerID="cri-o://497091e1c0a7821a260c8146ac11ae4f4df93e3bc59b27680827f8b4ecf718f9" gracePeriod=2 Oct 07 14:36:06 crc kubenswrapper[4854]: I1007 14:36:06.844169 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x5hrp" Oct 07 14:36:06 crc kubenswrapper[4854]: I1007 14:36:06.927975 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3822e1e1-e2d3-448b-9739-cda4b84e04b6-catalog-content\") pod \"3822e1e1-e2d3-448b-9739-cda4b84e04b6\" (UID: \"3822e1e1-e2d3-448b-9739-cda4b84e04b6\") " Oct 07 14:36:06 crc kubenswrapper[4854]: I1007 14:36:06.928028 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kcg2s\" (UniqueName: \"kubernetes.io/projected/3822e1e1-e2d3-448b-9739-cda4b84e04b6-kube-api-access-kcg2s\") pod \"3822e1e1-e2d3-448b-9739-cda4b84e04b6\" (UID: \"3822e1e1-e2d3-448b-9739-cda4b84e04b6\") " Oct 07 14:36:06 crc kubenswrapper[4854]: I1007 14:36:06.928128 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3822e1e1-e2d3-448b-9739-cda4b84e04b6-utilities\") pod \"3822e1e1-e2d3-448b-9739-cda4b84e04b6\" (UID: \"3822e1e1-e2d3-448b-9739-cda4b84e04b6\") " Oct 07 14:36:06 crc kubenswrapper[4854]: I1007 14:36:06.928960 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3822e1e1-e2d3-448b-9739-cda4b84e04b6-utilities" (OuterVolumeSpecName: "utilities") pod "3822e1e1-e2d3-448b-9739-cda4b84e04b6" (UID: "3822e1e1-e2d3-448b-9739-cda4b84e04b6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:36:06 crc kubenswrapper[4854]: I1007 14:36:06.934522 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3822e1e1-e2d3-448b-9739-cda4b84e04b6-kube-api-access-kcg2s" (OuterVolumeSpecName: "kube-api-access-kcg2s") pod "3822e1e1-e2d3-448b-9739-cda4b84e04b6" (UID: "3822e1e1-e2d3-448b-9739-cda4b84e04b6"). InnerVolumeSpecName "kube-api-access-kcg2s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:36:06 crc kubenswrapper[4854]: I1007 14:36:06.978320 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3822e1e1-e2d3-448b-9739-cda4b84e04b6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3822e1e1-e2d3-448b-9739-cda4b84e04b6" (UID: "3822e1e1-e2d3-448b-9739-cda4b84e04b6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:36:07 crc kubenswrapper[4854]: I1007 14:36:07.030182 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3822e1e1-e2d3-448b-9739-cda4b84e04b6-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 14:36:07 crc kubenswrapper[4854]: I1007 14:36:07.030224 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3822e1e1-e2d3-448b-9739-cda4b84e04b6-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 14:36:07 crc kubenswrapper[4854]: I1007 14:36:07.030239 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kcg2s\" (UniqueName: \"kubernetes.io/projected/3822e1e1-e2d3-448b-9739-cda4b84e04b6-kube-api-access-kcg2s\") on node \"crc\" DevicePath \"\"" Oct 07 14:36:07 crc kubenswrapper[4854]: I1007 14:36:07.326726 4854 generic.go:334] "Generic (PLEG): container finished" podID="3822e1e1-e2d3-448b-9739-cda4b84e04b6" containerID="497091e1c0a7821a260c8146ac11ae4f4df93e3bc59b27680827f8b4ecf718f9" exitCode=0 Oct 07 14:36:07 crc kubenswrapper[4854]: I1007 14:36:07.326777 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x5hrp" event={"ID":"3822e1e1-e2d3-448b-9739-cda4b84e04b6","Type":"ContainerDied","Data":"497091e1c0a7821a260c8146ac11ae4f4df93e3bc59b27680827f8b4ecf718f9"} Oct 07 14:36:07 crc kubenswrapper[4854]: I1007 14:36:07.326806 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x5hrp" event={"ID":"3822e1e1-e2d3-448b-9739-cda4b84e04b6","Type":"ContainerDied","Data":"ff60b93a61fb694df0a73c26788ac40eff9e5a2537aaa465f69a866ee84de8f6"} Oct 07 14:36:07 crc kubenswrapper[4854]: I1007 14:36:07.326834 4854 scope.go:117] "RemoveContainer" containerID="497091e1c0a7821a260c8146ac11ae4f4df93e3bc59b27680827f8b4ecf718f9" Oct 07 14:36:07 crc kubenswrapper[4854]: I1007 14:36:07.326981 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-x5hrp" Oct 07 14:36:07 crc kubenswrapper[4854]: I1007 14:36:07.372936 4854 scope.go:117] "RemoveContainer" containerID="b513a1564a9d6b516074a4d568486b5a2e8b36eda5fd741d9c0fce41c83a2e0b" Oct 07 14:36:07 crc kubenswrapper[4854]: I1007 14:36:07.381525 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-x5hrp"] Oct 07 14:36:07 crc kubenswrapper[4854]: I1007 14:36:07.400228 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-x5hrp"] Oct 07 14:36:07 crc kubenswrapper[4854]: I1007 14:36:07.451788 4854 scope.go:117] "RemoveContainer" containerID="e19c96288fd9303eabbc0e0b06c1aab5ef8fd5565a81500cba250fc82a0aa40e" Oct 07 14:36:07 crc kubenswrapper[4854]: I1007 14:36:07.477163 4854 scope.go:117] "RemoveContainer" containerID="497091e1c0a7821a260c8146ac11ae4f4df93e3bc59b27680827f8b4ecf718f9" Oct 07 14:36:07 crc kubenswrapper[4854]: E1007 14:36:07.477714 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"497091e1c0a7821a260c8146ac11ae4f4df93e3bc59b27680827f8b4ecf718f9\": container with ID starting with 497091e1c0a7821a260c8146ac11ae4f4df93e3bc59b27680827f8b4ecf718f9 not found: ID does not exist" containerID="497091e1c0a7821a260c8146ac11ae4f4df93e3bc59b27680827f8b4ecf718f9" Oct 07 14:36:07 crc kubenswrapper[4854]: I1007 14:36:07.477787 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"497091e1c0a7821a260c8146ac11ae4f4df93e3bc59b27680827f8b4ecf718f9"} err="failed to get container status \"497091e1c0a7821a260c8146ac11ae4f4df93e3bc59b27680827f8b4ecf718f9\": rpc error: code = NotFound desc = could not find container \"497091e1c0a7821a260c8146ac11ae4f4df93e3bc59b27680827f8b4ecf718f9\": container with ID starting with 497091e1c0a7821a260c8146ac11ae4f4df93e3bc59b27680827f8b4ecf718f9 not found: ID does not exist" Oct 07 14:36:07 crc kubenswrapper[4854]: I1007 14:36:07.477821 4854 scope.go:117] "RemoveContainer" containerID="b513a1564a9d6b516074a4d568486b5a2e8b36eda5fd741d9c0fce41c83a2e0b" Oct 07 14:36:07 crc kubenswrapper[4854]: E1007 14:36:07.478432 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b513a1564a9d6b516074a4d568486b5a2e8b36eda5fd741d9c0fce41c83a2e0b\": container with ID starting with b513a1564a9d6b516074a4d568486b5a2e8b36eda5fd741d9c0fce41c83a2e0b not found: ID does not exist" containerID="b513a1564a9d6b516074a4d568486b5a2e8b36eda5fd741d9c0fce41c83a2e0b" Oct 07 14:36:07 crc kubenswrapper[4854]: I1007 14:36:07.479033 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b513a1564a9d6b516074a4d568486b5a2e8b36eda5fd741d9c0fce41c83a2e0b"} err="failed to get container status \"b513a1564a9d6b516074a4d568486b5a2e8b36eda5fd741d9c0fce41c83a2e0b\": rpc error: code = NotFound desc = could not find container \"b513a1564a9d6b516074a4d568486b5a2e8b36eda5fd741d9c0fce41c83a2e0b\": container with ID starting with b513a1564a9d6b516074a4d568486b5a2e8b36eda5fd741d9c0fce41c83a2e0b not found: ID does not exist" Oct 07 14:36:07 crc kubenswrapper[4854]: I1007 14:36:07.479068 4854 scope.go:117] "RemoveContainer" containerID="e19c96288fd9303eabbc0e0b06c1aab5ef8fd5565a81500cba250fc82a0aa40e" Oct 07 14:36:07 crc kubenswrapper[4854]: E1007 14:36:07.479407 4854 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"e19c96288fd9303eabbc0e0b06c1aab5ef8fd5565a81500cba250fc82a0aa40e\": container with ID starting with e19c96288fd9303eabbc0e0b06c1aab5ef8fd5565a81500cba250fc82a0aa40e not found: ID does not exist" containerID="e19c96288fd9303eabbc0e0b06c1aab5ef8fd5565a81500cba250fc82a0aa40e" Oct 07 14:36:07 crc kubenswrapper[4854]: I1007 14:36:07.479434 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e19c96288fd9303eabbc0e0b06c1aab5ef8fd5565a81500cba250fc82a0aa40e"} err="failed to get container status \"e19c96288fd9303eabbc0e0b06c1aab5ef8fd5565a81500cba250fc82a0aa40e\": rpc error: code = NotFound desc = could not find container \"e19c96288fd9303eabbc0e0b06c1aab5ef8fd5565a81500cba250fc82a0aa40e\": container with ID starting with e19c96288fd9303eabbc0e0b06c1aab5ef8fd5565a81500cba250fc82a0aa40e not found: ID does not exist" Oct 07 14:36:08 crc kubenswrapper[4854]: I1007 14:36:08.718264 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3822e1e1-e2d3-448b-9739-cda4b84e04b6" path="/var/lib/kubelet/pods/3822e1e1-e2d3-448b-9739-cda4b84e04b6/volumes" Oct 07 14:36:10 crc kubenswrapper[4854]: I1007 14:36:10.807670 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:36:10 crc kubenswrapper[4854]: I1007 14:36:10.808298 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:36:40 crc kubenswrapper[4854]: I1007 14:36:40.808114 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:36:40 crc kubenswrapper[4854]: I1007 14:36:40.808900 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:37:10 crc kubenswrapper[4854]: I1007 14:37:10.807699 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:37:10 crc kubenswrapper[4854]: I1007 14:37:10.808186 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:37:10 crc kubenswrapper[4854]: I1007 14:37:10.808226 4854 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" Oct 07 14:37:10 crc kubenswrapper[4854]: I1007 14:37:10.808913 4854 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dc81969983bbce8baa8681e21516942005959dc51371ccf54e48acb511d0a582"} pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 14:37:10 crc kubenswrapper[4854]: I1007 14:37:10.809021 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" containerID="cri-o://dc81969983bbce8baa8681e21516942005959dc51371ccf54e48acb511d0a582" gracePeriod=600 Oct 07 14:37:11 crc kubenswrapper[4854]: I1007 14:37:11.066110 4854 generic.go:334] "Generic (PLEG): container finished" podID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerID="dc81969983bbce8baa8681e21516942005959dc51371ccf54e48acb511d0a582" exitCode=0 Oct 07 14:37:11 crc kubenswrapper[4854]: I1007 14:37:11.066177 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" event={"ID":"40b8b82d-cfd5-41d7-8673-5774db092c85","Type":"ContainerDied","Data":"dc81969983bbce8baa8681e21516942005959dc51371ccf54e48acb511d0a582"} Oct 07 14:37:11 crc kubenswrapper[4854]: I1007 14:37:11.066216 4854 scope.go:117] "RemoveContainer" containerID="3678c6fd88eab7cb7d631e93558a19c5557cf9245aa72ce39e1a1bf4e1c6125a" Oct 07 14:37:12 crc kubenswrapper[4854]: I1007 14:37:12.079517 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" event={"ID":"40b8b82d-cfd5-41d7-8673-5774db092c85","Type":"ContainerStarted","Data":"390e57b5d82f9be820f7fa4a2a67b06934642922575e8b6606f5dbf6a1870462"} Oct 07 14:39:10 crc kubenswrapper[4854]: I1007 14:39:10.721652 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vtghm"] Oct 07 14:39:10 crc kubenswrapper[4854]: E1007 14:39:10.722856 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3822e1e1-e2d3-448b-9739-cda4b84e04b6" containerName="extract-content" Oct 07 14:39:10 crc kubenswrapper[4854]: I1007 14:39:10.722877 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="3822e1e1-e2d3-448b-9739-cda4b84e04b6" containerName="extract-content" Oct 07 14:39:10 crc kubenswrapper[4854]: E1007 14:39:10.722919 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3822e1e1-e2d3-448b-9739-cda4b84e04b6" containerName="extract-utilities" Oct 07 14:39:10 crc kubenswrapper[4854]: I1007 14:39:10.722931 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="3822e1e1-e2d3-448b-9739-cda4b84e04b6" containerName="extract-utilities" Oct 07 14:39:10 crc kubenswrapper[4854]: E1007 14:39:10.722971 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3822e1e1-e2d3-448b-9739-cda4b84e04b6" containerName="registry-server" Oct 07 14:39:10 crc kubenswrapper[4854]: I1007 14:39:10.722984 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="3822e1e1-e2d3-448b-9739-cda4b84e04b6" containerName="registry-server" Oct 07 14:39:10 crc kubenswrapper[4854]: I1007 14:39:10.723400 4854 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="3822e1e1-e2d3-448b-9739-cda4b84e04b6" containerName="registry-server" Oct 07 14:39:10 crc kubenswrapper[4854]: I1007 14:39:10.726082 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vtghm" Oct 07 14:39:10 crc kubenswrapper[4854]: I1007 14:39:10.734953 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vtghm"] Oct 07 14:39:10 crc kubenswrapper[4854]: I1007 14:39:10.792751 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b62f8fb-f286-45dc-b54f-19976ee57bfe-utilities\") pod \"certified-operators-vtghm\" (UID: \"7b62f8fb-f286-45dc-b54f-19976ee57bfe\") " pod="openshift-marketplace/certified-operators-vtghm" Oct 07 14:39:10 crc kubenswrapper[4854]: I1007 14:39:10.793080 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b62f8fb-f286-45dc-b54f-19976ee57bfe-catalog-content\") pod \"certified-operators-vtghm\" (UID: \"7b62f8fb-f286-45dc-b54f-19976ee57bfe\") " pod="openshift-marketplace/certified-operators-vtghm" Oct 07 14:39:10 crc kubenswrapper[4854]: I1007 14:39:10.793324 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqfq7\" (UniqueName: \"kubernetes.io/projected/7b62f8fb-f286-45dc-b54f-19976ee57bfe-kube-api-access-kqfq7\") pod \"certified-operators-vtghm\" (UID: \"7b62f8fb-f286-45dc-b54f-19976ee57bfe\") " pod="openshift-marketplace/certified-operators-vtghm" Oct 07 14:39:10 crc kubenswrapper[4854]: I1007 14:39:10.895604 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b62f8fb-f286-45dc-b54f-19976ee57bfe-utilities\") pod \"certified-operators-vtghm\" (UID: \"7b62f8fb-f286-45dc-b54f-19976ee57bfe\") " pod="openshift-marketplace/certified-operators-vtghm" Oct 07 14:39:10 crc kubenswrapper[4854]: I1007 14:39:10.895792 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b62f8fb-f286-45dc-b54f-19976ee57bfe-catalog-content\") pod \"certified-operators-vtghm\" (UID: \"7b62f8fb-f286-45dc-b54f-19976ee57bfe\") " pod="openshift-marketplace/certified-operators-vtghm" Oct 07 14:39:10 crc kubenswrapper[4854]: I1007 14:39:10.895887 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqfq7\" (UniqueName: \"kubernetes.io/projected/7b62f8fb-f286-45dc-b54f-19976ee57bfe-kube-api-access-kqfq7\") pod \"certified-operators-vtghm\" (UID: \"7b62f8fb-f286-45dc-b54f-19976ee57bfe\") " pod="openshift-marketplace/certified-operators-vtghm" Oct 07 14:39:10 crc kubenswrapper[4854]: I1007 14:39:10.896391 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b62f8fb-f286-45dc-b54f-19976ee57bfe-utilities\") pod \"certified-operators-vtghm\" (UID: \"7b62f8fb-f286-45dc-b54f-19976ee57bfe\") " pod="openshift-marketplace/certified-operators-vtghm" Oct 07 14:39:10 crc kubenswrapper[4854]: I1007 14:39:10.896427 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b62f8fb-f286-45dc-b54f-19976ee57bfe-catalog-content\") pod \"certified-operators-vtghm\" (UID: 
\"7b62f8fb-f286-45dc-b54f-19976ee57bfe\") " pod="openshift-marketplace/certified-operators-vtghm" Oct 07 14:39:10 crc kubenswrapper[4854]: I1007 14:39:10.919967 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqfq7\" (UniqueName: \"kubernetes.io/projected/7b62f8fb-f286-45dc-b54f-19976ee57bfe-kube-api-access-kqfq7\") pod \"certified-operators-vtghm\" (UID: \"7b62f8fb-f286-45dc-b54f-19976ee57bfe\") " pod="openshift-marketplace/certified-operators-vtghm" Oct 07 14:39:11 crc kubenswrapper[4854]: I1007 14:39:11.056859 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vtghm" Oct 07 14:39:11 crc kubenswrapper[4854]: I1007 14:39:11.631462 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vtghm"] Oct 07 14:39:12 crc kubenswrapper[4854]: I1007 14:39:12.437672 4854 generic.go:334] "Generic (PLEG): container finished" podID="7b62f8fb-f286-45dc-b54f-19976ee57bfe" containerID="7e033bddfbec70c1dd44efa703487704ea676fce783234a45c92c77c247d06ef" exitCode=0 Oct 07 14:39:12 crc kubenswrapper[4854]: I1007 14:39:12.437994 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vtghm" event={"ID":"7b62f8fb-f286-45dc-b54f-19976ee57bfe","Type":"ContainerDied","Data":"7e033bddfbec70c1dd44efa703487704ea676fce783234a45c92c77c247d06ef"} Oct 07 14:39:12 crc kubenswrapper[4854]: I1007 14:39:12.438076 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vtghm" event={"ID":"7b62f8fb-f286-45dc-b54f-19976ee57bfe","Type":"ContainerStarted","Data":"000d4d9eb4a5dcbca4c2555307b831ce453ce84ea31c94432364394cbc4fc27a"} Oct 07 14:39:13 crc kubenswrapper[4854]: I1007 14:39:13.453691 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vtghm" event={"ID":"7b62f8fb-f286-45dc-b54f-19976ee57bfe","Type":"ContainerStarted","Data":"1482339e9c408b1a9338d0f49c89ecd0535dc76f950cefbd8fcf7d2edcbf2ba7"} Oct 07 14:39:14 crc kubenswrapper[4854]: I1007 14:39:14.467233 4854 generic.go:334] "Generic (PLEG): container finished" podID="7b62f8fb-f286-45dc-b54f-19976ee57bfe" containerID="1482339e9c408b1a9338d0f49c89ecd0535dc76f950cefbd8fcf7d2edcbf2ba7" exitCode=0 Oct 07 14:39:14 crc kubenswrapper[4854]: I1007 14:39:14.467284 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vtghm" event={"ID":"7b62f8fb-f286-45dc-b54f-19976ee57bfe","Type":"ContainerDied","Data":"1482339e9c408b1a9338d0f49c89ecd0535dc76f950cefbd8fcf7d2edcbf2ba7"} Oct 07 14:39:16 crc kubenswrapper[4854]: I1007 14:39:16.492765 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vtghm" event={"ID":"7b62f8fb-f286-45dc-b54f-19976ee57bfe","Type":"ContainerStarted","Data":"cb065fda8d5a61d56bbe439a3599c5ad998976869f7a158de62f0f142ac81d95"} Oct 07 14:39:16 crc kubenswrapper[4854]: I1007 14:39:16.513968 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vtghm" podStartSLOduration=3.363868516 podStartE2EDuration="6.513943247s" podCreationTimestamp="2025-10-07 14:39:10 +0000 UTC" firstStartedPulling="2025-10-07 14:39:12.441009131 +0000 UTC m=+8068.428841386" lastFinishedPulling="2025-10-07 14:39:15.591083862 +0000 UTC m=+8071.578916117" observedRunningTime="2025-10-07 14:39:16.506881954 +0000 UTC m=+8072.494714209" 
watchObservedRunningTime="2025-10-07 14:39:16.513943247 +0000 UTC m=+8072.501775522" Oct 07 14:39:21 crc kubenswrapper[4854]: I1007 14:39:21.057746 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vtghm" Oct 07 14:39:21 crc kubenswrapper[4854]: I1007 14:39:21.092597 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vtghm" Oct 07 14:39:21 crc kubenswrapper[4854]: I1007 14:39:21.159885 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vtghm" Oct 07 14:39:21 crc kubenswrapper[4854]: I1007 14:39:21.588831 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vtghm" Oct 07 14:39:21 crc kubenswrapper[4854]: I1007 14:39:21.638725 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vtghm"] Oct 07 14:39:23 crc kubenswrapper[4854]: I1007 14:39:23.558871 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vtghm" podUID="7b62f8fb-f286-45dc-b54f-19976ee57bfe" containerName="registry-server" containerID="cri-o://cb065fda8d5a61d56bbe439a3599c5ad998976869f7a158de62f0f142ac81d95" gracePeriod=2 Oct 07 14:39:24 crc kubenswrapper[4854]: I1007 14:39:24.128700 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vtghm" Oct 07 14:39:24 crc kubenswrapper[4854]: I1007 14:39:24.192173 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqfq7\" (UniqueName: \"kubernetes.io/projected/7b62f8fb-f286-45dc-b54f-19976ee57bfe-kube-api-access-kqfq7\") pod \"7b62f8fb-f286-45dc-b54f-19976ee57bfe\" (UID: \"7b62f8fb-f286-45dc-b54f-19976ee57bfe\") " Oct 07 14:39:24 crc kubenswrapper[4854]: I1007 14:39:24.192246 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b62f8fb-f286-45dc-b54f-19976ee57bfe-utilities\") pod \"7b62f8fb-f286-45dc-b54f-19976ee57bfe\" (UID: \"7b62f8fb-f286-45dc-b54f-19976ee57bfe\") " Oct 07 14:39:24 crc kubenswrapper[4854]: I1007 14:39:24.192331 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b62f8fb-f286-45dc-b54f-19976ee57bfe-catalog-content\") pod \"7b62f8fb-f286-45dc-b54f-19976ee57bfe\" (UID: \"7b62f8fb-f286-45dc-b54f-19976ee57bfe\") " Oct 07 14:39:24 crc kubenswrapper[4854]: I1007 14:39:24.193460 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b62f8fb-f286-45dc-b54f-19976ee57bfe-utilities" (OuterVolumeSpecName: "utilities") pod "7b62f8fb-f286-45dc-b54f-19976ee57bfe" (UID: "7b62f8fb-f286-45dc-b54f-19976ee57bfe"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:39:24 crc kubenswrapper[4854]: I1007 14:39:24.197616 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b62f8fb-f286-45dc-b54f-19976ee57bfe-kube-api-access-kqfq7" (OuterVolumeSpecName: "kube-api-access-kqfq7") pod "7b62f8fb-f286-45dc-b54f-19976ee57bfe" (UID: "7b62f8fb-f286-45dc-b54f-19976ee57bfe"). InnerVolumeSpecName "kube-api-access-kqfq7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:39:24 crc kubenswrapper[4854]: I1007 14:39:24.254502 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b62f8fb-f286-45dc-b54f-19976ee57bfe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7b62f8fb-f286-45dc-b54f-19976ee57bfe" (UID: "7b62f8fb-f286-45dc-b54f-19976ee57bfe"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:39:24 crc kubenswrapper[4854]: I1007 14:39:24.295174 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqfq7\" (UniqueName: \"kubernetes.io/projected/7b62f8fb-f286-45dc-b54f-19976ee57bfe-kube-api-access-kqfq7\") on node \"crc\" DevicePath \"\"" Oct 07 14:39:24 crc kubenswrapper[4854]: I1007 14:39:24.295220 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b62f8fb-f286-45dc-b54f-19976ee57bfe-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 14:39:24 crc kubenswrapper[4854]: I1007 14:39:24.295234 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b62f8fb-f286-45dc-b54f-19976ee57bfe-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 14:39:24 crc kubenswrapper[4854]: I1007 14:39:24.578930 4854 generic.go:334] "Generic (PLEG): container finished" podID="7b62f8fb-f286-45dc-b54f-19976ee57bfe" containerID="cb065fda8d5a61d56bbe439a3599c5ad998976869f7a158de62f0f142ac81d95" exitCode=0 Oct 07 14:39:24 crc kubenswrapper[4854]: I1007 14:39:24.579036 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vtghm" event={"ID":"7b62f8fb-f286-45dc-b54f-19976ee57bfe","Type":"ContainerDied","Data":"cb065fda8d5a61d56bbe439a3599c5ad998976869f7a158de62f0f142ac81d95"} Oct 07 14:39:24 crc kubenswrapper[4854]: I1007 14:39:24.581993 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vtghm" event={"ID":"7b62f8fb-f286-45dc-b54f-19976ee57bfe","Type":"ContainerDied","Data":"000d4d9eb4a5dcbca4c2555307b831ce453ce84ea31c94432364394cbc4fc27a"} Oct 07 14:39:24 crc kubenswrapper[4854]: I1007 14:39:24.579057 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vtghm" Oct 07 14:39:24 crc kubenswrapper[4854]: I1007 14:39:24.582258 4854 scope.go:117] "RemoveContainer" containerID="cb065fda8d5a61d56bbe439a3599c5ad998976869f7a158de62f0f142ac81d95" Oct 07 14:39:24 crc kubenswrapper[4854]: I1007 14:39:24.616251 4854 scope.go:117] "RemoveContainer" containerID="1482339e9c408b1a9338d0f49c89ecd0535dc76f950cefbd8fcf7d2edcbf2ba7" Oct 07 14:39:24 crc kubenswrapper[4854]: I1007 14:39:24.622027 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vtghm"] Oct 07 14:39:24 crc kubenswrapper[4854]: I1007 14:39:24.631921 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vtghm"] Oct 07 14:39:24 crc kubenswrapper[4854]: I1007 14:39:24.717840 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b62f8fb-f286-45dc-b54f-19976ee57bfe" path="/var/lib/kubelet/pods/7b62f8fb-f286-45dc-b54f-19976ee57bfe/volumes" Oct 07 14:39:24 crc kubenswrapper[4854]: I1007 14:39:24.720058 4854 scope.go:117] "RemoveContainer" containerID="7e033bddfbec70c1dd44efa703487704ea676fce783234a45c92c77c247d06ef" Oct 07 14:39:24 crc kubenswrapper[4854]: I1007 14:39:24.749283 4854 scope.go:117] "RemoveContainer" containerID="cb065fda8d5a61d56bbe439a3599c5ad998976869f7a158de62f0f142ac81d95" Oct 07 14:39:24 crc kubenswrapper[4854]: E1007 14:39:24.749816 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb065fda8d5a61d56bbe439a3599c5ad998976869f7a158de62f0f142ac81d95\": container with ID starting with cb065fda8d5a61d56bbe439a3599c5ad998976869f7a158de62f0f142ac81d95 not found: ID does not exist" containerID="cb065fda8d5a61d56bbe439a3599c5ad998976869f7a158de62f0f142ac81d95" Oct 07 14:39:24 crc kubenswrapper[4854]: I1007 14:39:24.749875 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb065fda8d5a61d56bbe439a3599c5ad998976869f7a158de62f0f142ac81d95"} err="failed to get container status \"cb065fda8d5a61d56bbe439a3599c5ad998976869f7a158de62f0f142ac81d95\": rpc error: code = NotFound desc = could not find container \"cb065fda8d5a61d56bbe439a3599c5ad998976869f7a158de62f0f142ac81d95\": container with ID starting with cb065fda8d5a61d56bbe439a3599c5ad998976869f7a158de62f0f142ac81d95 not found: ID does not exist" Oct 07 14:39:24 crc kubenswrapper[4854]: I1007 14:39:24.749908 4854 scope.go:117] "RemoveContainer" containerID="1482339e9c408b1a9338d0f49c89ecd0535dc76f950cefbd8fcf7d2edcbf2ba7" Oct 07 14:39:24 crc kubenswrapper[4854]: E1007 14:39:24.751370 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1482339e9c408b1a9338d0f49c89ecd0535dc76f950cefbd8fcf7d2edcbf2ba7\": container with ID starting with 1482339e9c408b1a9338d0f49c89ecd0535dc76f950cefbd8fcf7d2edcbf2ba7 not found: ID does not exist" containerID="1482339e9c408b1a9338d0f49c89ecd0535dc76f950cefbd8fcf7d2edcbf2ba7" Oct 07 14:39:24 crc kubenswrapper[4854]: I1007 14:39:24.751408 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1482339e9c408b1a9338d0f49c89ecd0535dc76f950cefbd8fcf7d2edcbf2ba7"} err="failed to get container status \"1482339e9c408b1a9338d0f49c89ecd0535dc76f950cefbd8fcf7d2edcbf2ba7\": rpc error: code = NotFound desc = could not find container 
\"1482339e9c408b1a9338d0f49c89ecd0535dc76f950cefbd8fcf7d2edcbf2ba7\": container with ID starting with 1482339e9c408b1a9338d0f49c89ecd0535dc76f950cefbd8fcf7d2edcbf2ba7 not found: ID does not exist" Oct 07 14:39:24 crc kubenswrapper[4854]: I1007 14:39:24.751433 4854 scope.go:117] "RemoveContainer" containerID="7e033bddfbec70c1dd44efa703487704ea676fce783234a45c92c77c247d06ef" Oct 07 14:39:24 crc kubenswrapper[4854]: E1007 14:39:24.751807 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e033bddfbec70c1dd44efa703487704ea676fce783234a45c92c77c247d06ef\": container with ID starting with 7e033bddfbec70c1dd44efa703487704ea676fce783234a45c92c77c247d06ef not found: ID does not exist" containerID="7e033bddfbec70c1dd44efa703487704ea676fce783234a45c92c77c247d06ef" Oct 07 14:39:24 crc kubenswrapper[4854]: I1007 14:39:24.751837 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e033bddfbec70c1dd44efa703487704ea676fce783234a45c92c77c247d06ef"} err="failed to get container status \"7e033bddfbec70c1dd44efa703487704ea676fce783234a45c92c77c247d06ef\": rpc error: code = NotFound desc = could not find container \"7e033bddfbec70c1dd44efa703487704ea676fce783234a45c92c77c247d06ef\": container with ID starting with 7e033bddfbec70c1dd44efa703487704ea676fce783234a45c92c77c247d06ef not found: ID does not exist" Oct 07 14:39:40 crc kubenswrapper[4854]: I1007 14:39:40.807409 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:39:40 crc kubenswrapper[4854]: I1007 14:39:40.808118 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:39:43 crc kubenswrapper[4854]: I1007 14:39:43.932460 4854 generic.go:334] "Generic (PLEG): container finished" podID="42bf907e-8047-4f86-99df-41a920bec529" containerID="afc4a2b5e47240e928af7ea1f798628bca912213252c3953ce8ee65e140973e5" exitCode=0 Oct 07 14:39:43 crc kubenswrapper[4854]: I1007 14:39:43.933343 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-6h9d9" event={"ID":"42bf907e-8047-4f86-99df-41a920bec529","Type":"ContainerDied","Data":"afc4a2b5e47240e928af7ea1f798628bca912213252c3953ce8ee65e140973e5"} Oct 07 14:39:45 crc kubenswrapper[4854]: I1007 14:39:45.516571 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-6h9d9" Oct 07 14:39:45 crc kubenswrapper[4854]: I1007 14:39:45.609557 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/42bf907e-8047-4f86-99df-41a920bec529-ssh-key\") pod \"42bf907e-8047-4f86-99df-41a920bec529\" (UID: \"42bf907e-8047-4f86-99df-41a920bec529\") " Oct 07 14:39:45 crc kubenswrapper[4854]: I1007 14:39:45.609627 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/42bf907e-8047-4f86-99df-41a920bec529-libvirt-secret-0\") pod \"42bf907e-8047-4f86-99df-41a920bec529\" (UID: \"42bf907e-8047-4f86-99df-41a920bec529\") " Oct 07 14:39:45 crc kubenswrapper[4854]: I1007 14:39:45.609654 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42bf907e-8047-4f86-99df-41a920bec529-libvirt-combined-ca-bundle\") pod \"42bf907e-8047-4f86-99df-41a920bec529\" (UID: \"42bf907e-8047-4f86-99df-41a920bec529\") " Oct 07 14:39:45 crc kubenswrapper[4854]: I1007 14:39:45.609788 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvrs7\" (UniqueName: \"kubernetes.io/projected/42bf907e-8047-4f86-99df-41a920bec529-kube-api-access-vvrs7\") pod \"42bf907e-8047-4f86-99df-41a920bec529\" (UID: \"42bf907e-8047-4f86-99df-41a920bec529\") " Oct 07 14:39:45 crc kubenswrapper[4854]: I1007 14:39:45.609897 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/42bf907e-8047-4f86-99df-41a920bec529-ceph\") pod \"42bf907e-8047-4f86-99df-41a920bec529\" (UID: \"42bf907e-8047-4f86-99df-41a920bec529\") " Oct 07 14:39:45 crc kubenswrapper[4854]: I1007 14:39:45.609956 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/42bf907e-8047-4f86-99df-41a920bec529-inventory\") pod \"42bf907e-8047-4f86-99df-41a920bec529\" (UID: \"42bf907e-8047-4f86-99df-41a920bec529\") " Oct 07 14:39:45 crc kubenswrapper[4854]: I1007 14:39:45.616993 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42bf907e-8047-4f86-99df-41a920bec529-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "42bf907e-8047-4f86-99df-41a920bec529" (UID: "42bf907e-8047-4f86-99df-41a920bec529"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:39:45 crc kubenswrapper[4854]: I1007 14:39:45.619977 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42bf907e-8047-4f86-99df-41a920bec529-ceph" (OuterVolumeSpecName: "ceph") pod "42bf907e-8047-4f86-99df-41a920bec529" (UID: "42bf907e-8047-4f86-99df-41a920bec529"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:39:45 crc kubenswrapper[4854]: I1007 14:39:45.620514 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42bf907e-8047-4f86-99df-41a920bec529-kube-api-access-vvrs7" (OuterVolumeSpecName: "kube-api-access-vvrs7") pod "42bf907e-8047-4f86-99df-41a920bec529" (UID: "42bf907e-8047-4f86-99df-41a920bec529"). InnerVolumeSpecName "kube-api-access-vvrs7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:39:45 crc kubenswrapper[4854]: I1007 14:39:45.642264 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42bf907e-8047-4f86-99df-41a920bec529-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "42bf907e-8047-4f86-99df-41a920bec529" (UID: "42bf907e-8047-4f86-99df-41a920bec529"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:39:45 crc kubenswrapper[4854]: I1007 14:39:45.648380 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42bf907e-8047-4f86-99df-41a920bec529-inventory" (OuterVolumeSpecName: "inventory") pod "42bf907e-8047-4f86-99df-41a920bec529" (UID: "42bf907e-8047-4f86-99df-41a920bec529"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:39:45 crc kubenswrapper[4854]: I1007 14:39:45.651483 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42bf907e-8047-4f86-99df-41a920bec529-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "42bf907e-8047-4f86-99df-41a920bec529" (UID: "42bf907e-8047-4f86-99df-41a920bec529"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:39:45 crc kubenswrapper[4854]: I1007 14:39:45.713271 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvrs7\" (UniqueName: \"kubernetes.io/projected/42bf907e-8047-4f86-99df-41a920bec529-kube-api-access-vvrs7\") on node \"crc\" DevicePath \"\"" Oct 07 14:39:45 crc kubenswrapper[4854]: I1007 14:39:45.713484 4854 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/42bf907e-8047-4f86-99df-41a920bec529-ceph\") on node \"crc\" DevicePath \"\"" Oct 07 14:39:45 crc kubenswrapper[4854]: I1007 14:39:45.713611 4854 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/42bf907e-8047-4f86-99df-41a920bec529-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 14:39:45 crc kubenswrapper[4854]: I1007 14:39:45.713726 4854 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/42bf907e-8047-4f86-99df-41a920bec529-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 14:39:45 crc kubenswrapper[4854]: I1007 14:39:45.713838 4854 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/42bf907e-8047-4f86-99df-41a920bec529-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Oct 07 14:39:45 crc kubenswrapper[4854]: I1007 14:39:45.713952 4854 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42bf907e-8047-4f86-99df-41a920bec529-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:39:45 crc kubenswrapper[4854]: I1007 14:39:45.963650 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-6h9d9" event={"ID":"42bf907e-8047-4f86-99df-41a920bec529","Type":"ContainerDied","Data":"bd2d0941d7196bdfaf72244cd9bf27dc5a542e7ecb912c0a33d9e8d9ee3ad134"} Oct 07 14:39:45 crc kubenswrapper[4854]: I1007 14:39:45.963732 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd2d0941d7196bdfaf72244cd9bf27dc5a542e7ecb912c0a33d9e8d9ee3ad134" Oct 07 14:39:45 crc kubenswrapper[4854]: I1007 14:39:45.963842 4854 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-6h9d9" Oct 07 14:39:46 crc kubenswrapper[4854]: I1007 14:39:46.091659 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-ktmw7"] Oct 07 14:39:46 crc kubenswrapper[4854]: E1007 14:39:46.092213 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42bf907e-8047-4f86-99df-41a920bec529" containerName="libvirt-openstack-openstack-cell1" Oct 07 14:39:46 crc kubenswrapper[4854]: I1007 14:39:46.092231 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="42bf907e-8047-4f86-99df-41a920bec529" containerName="libvirt-openstack-openstack-cell1" Oct 07 14:39:46 crc kubenswrapper[4854]: E1007 14:39:46.092248 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b62f8fb-f286-45dc-b54f-19976ee57bfe" containerName="extract-utilities" Oct 07 14:39:46 crc kubenswrapper[4854]: I1007 14:39:46.092257 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b62f8fb-f286-45dc-b54f-19976ee57bfe" containerName="extract-utilities" Oct 07 14:39:46 crc kubenswrapper[4854]: E1007 14:39:46.092309 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b62f8fb-f286-45dc-b54f-19976ee57bfe" containerName="extract-content" Oct 07 14:39:46 crc kubenswrapper[4854]: I1007 14:39:46.092320 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b62f8fb-f286-45dc-b54f-19976ee57bfe" containerName="extract-content" Oct 07 14:39:46 crc kubenswrapper[4854]: E1007 14:39:46.092346 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b62f8fb-f286-45dc-b54f-19976ee57bfe" containerName="registry-server" Oct 07 14:39:46 crc kubenswrapper[4854]: I1007 14:39:46.092354 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b62f8fb-f286-45dc-b54f-19976ee57bfe" containerName="registry-server" Oct 07 14:39:46 crc kubenswrapper[4854]: I1007 14:39:46.092595 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="42bf907e-8047-4f86-99df-41a920bec529" containerName="libvirt-openstack-openstack-cell1" Oct 07 14:39:46 crc kubenswrapper[4854]: I1007 14:39:46.092632 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b62f8fb-f286-45dc-b54f-19976ee57bfe" containerName="registry-server" Oct 07 14:39:46 crc kubenswrapper[4854]: I1007 14:39:46.093611 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-ktmw7" Oct 07 14:39:46 crc kubenswrapper[4854]: I1007 14:39:46.095759 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 14:39:46 crc kubenswrapper[4854]: I1007 14:39:46.096275 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 07 14:39:46 crc kubenswrapper[4854]: I1007 14:39:46.097564 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Oct 07 14:39:46 crc kubenswrapper[4854]: I1007 14:39:46.097755 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-n7cf5" Oct 07 14:39:46 crc kubenswrapper[4854]: I1007 14:39:46.098967 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Oct 07 14:39:46 crc kubenswrapper[4854]: I1007 14:39:46.101952 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 07 14:39:46 crc kubenswrapper[4854]: I1007 14:39:46.102116 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config" Oct 07 14:39:46 crc kubenswrapper[4854]: I1007 14:39:46.107806 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-ktmw7"] Oct 07 14:39:46 crc kubenswrapper[4854]: I1007 14:39:46.229333 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e9d7777b-b10c-44e6-970c-34dec79d193e-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-ktmw7\" (UID: \"e9d7777b-b10c-44e6-970c-34dec79d193e\") " pod="openstack/nova-cell1-openstack-openstack-cell1-ktmw7" Oct 07 14:39:46 crc kubenswrapper[4854]: I1007 14:39:46.229517 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9d7777b-b10c-44e6-970c-34dec79d193e-inventory\") pod \"nova-cell1-openstack-openstack-cell1-ktmw7\" (UID: \"e9d7777b-b10c-44e6-970c-34dec79d193e\") " pod="openstack/nova-cell1-openstack-openstack-cell1-ktmw7" Oct 07 14:39:46 crc kubenswrapper[4854]: I1007 14:39:46.229589 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/e9d7777b-b10c-44e6-970c-34dec79d193e-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-ktmw7\" (UID: \"e9d7777b-b10c-44e6-970c-34dec79d193e\") " pod="openstack/nova-cell1-openstack-openstack-cell1-ktmw7" Oct 07 14:39:46 crc kubenswrapper[4854]: I1007 14:39:46.229672 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e9d7777b-b10c-44e6-970c-34dec79d193e-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-ktmw7\" (UID: \"e9d7777b-b10c-44e6-970c-34dec79d193e\") " pod="openstack/nova-cell1-openstack-openstack-cell1-ktmw7" Oct 07 14:39:46 crc kubenswrapper[4854]: I1007 14:39:46.229734 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: 
\"kubernetes.io/configmap/e9d7777b-b10c-44e6-970c-34dec79d193e-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-ktmw7\" (UID: \"e9d7777b-b10c-44e6-970c-34dec79d193e\") " pod="openstack/nova-cell1-openstack-openstack-cell1-ktmw7" Oct 07 14:39:46 crc kubenswrapper[4854]: I1007 14:39:46.230005 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e9d7777b-b10c-44e6-970c-34dec79d193e-ceph\") pod \"nova-cell1-openstack-openstack-cell1-ktmw7\" (UID: \"e9d7777b-b10c-44e6-970c-34dec79d193e\") " pod="openstack/nova-cell1-openstack-openstack-cell1-ktmw7" Oct 07 14:39:46 crc kubenswrapper[4854]: I1007 14:39:46.230143 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9d7777b-b10c-44e6-970c-34dec79d193e-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-ktmw7\" (UID: \"e9d7777b-b10c-44e6-970c-34dec79d193e\") " pod="openstack/nova-cell1-openstack-openstack-cell1-ktmw7" Oct 07 14:39:46 crc kubenswrapper[4854]: I1007 14:39:46.230416 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e9d7777b-b10c-44e6-970c-34dec79d193e-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-ktmw7\" (UID: \"e9d7777b-b10c-44e6-970c-34dec79d193e\") " pod="openstack/nova-cell1-openstack-openstack-cell1-ktmw7" Oct 07 14:39:46 crc kubenswrapper[4854]: I1007 14:39:46.230461 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e9d7777b-b10c-44e6-970c-34dec79d193e-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-ktmw7\" (UID: \"e9d7777b-b10c-44e6-970c-34dec79d193e\") " pod="openstack/nova-cell1-openstack-openstack-cell1-ktmw7" Oct 07 14:39:46 crc kubenswrapper[4854]: I1007 14:39:46.230634 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e9d7777b-b10c-44e6-970c-34dec79d193e-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-ktmw7\" (UID: \"e9d7777b-b10c-44e6-970c-34dec79d193e\") " pod="openstack/nova-cell1-openstack-openstack-cell1-ktmw7" Oct 07 14:39:46 crc kubenswrapper[4854]: I1007 14:39:46.230725 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqr85\" (UniqueName: \"kubernetes.io/projected/e9d7777b-b10c-44e6-970c-34dec79d193e-kube-api-access-xqr85\") pod \"nova-cell1-openstack-openstack-cell1-ktmw7\" (UID: \"e9d7777b-b10c-44e6-970c-34dec79d193e\") " pod="openstack/nova-cell1-openstack-openstack-cell1-ktmw7" Oct 07 14:39:46 crc kubenswrapper[4854]: I1007 14:39:46.333199 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/e9d7777b-b10c-44e6-970c-34dec79d193e-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-ktmw7\" (UID: \"e9d7777b-b10c-44e6-970c-34dec79d193e\") " pod="openstack/nova-cell1-openstack-openstack-cell1-ktmw7" Oct 07 14:39:46 crc kubenswrapper[4854]: I1007 14:39:46.333663 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/e9d7777b-b10c-44e6-970c-34dec79d193e-ceph\") pod \"nova-cell1-openstack-openstack-cell1-ktmw7\" (UID: \"e9d7777b-b10c-44e6-970c-34dec79d193e\") " pod="openstack/nova-cell1-openstack-openstack-cell1-ktmw7" Oct 07 14:39:46 crc kubenswrapper[4854]: I1007 14:39:46.333709 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9d7777b-b10c-44e6-970c-34dec79d193e-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-ktmw7\" (UID: \"e9d7777b-b10c-44e6-970c-34dec79d193e\") " pod="openstack/nova-cell1-openstack-openstack-cell1-ktmw7" Oct 07 14:39:46 crc kubenswrapper[4854]: I1007 14:39:46.333825 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e9d7777b-b10c-44e6-970c-34dec79d193e-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-ktmw7\" (UID: \"e9d7777b-b10c-44e6-970c-34dec79d193e\") " pod="openstack/nova-cell1-openstack-openstack-cell1-ktmw7" Oct 07 14:39:46 crc kubenswrapper[4854]: I1007 14:39:46.333863 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e9d7777b-b10c-44e6-970c-34dec79d193e-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-ktmw7\" (UID: \"e9d7777b-b10c-44e6-970c-34dec79d193e\") " pod="openstack/nova-cell1-openstack-openstack-cell1-ktmw7" Oct 07 14:39:46 crc kubenswrapper[4854]: I1007 14:39:46.333918 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e9d7777b-b10c-44e6-970c-34dec79d193e-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-ktmw7\" (UID: \"e9d7777b-b10c-44e6-970c-34dec79d193e\") " pod="openstack/nova-cell1-openstack-openstack-cell1-ktmw7" Oct 07 14:39:46 crc kubenswrapper[4854]: I1007 14:39:46.333960 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqr85\" (UniqueName: \"kubernetes.io/projected/e9d7777b-b10c-44e6-970c-34dec79d193e-kube-api-access-xqr85\") pod \"nova-cell1-openstack-openstack-cell1-ktmw7\" (UID: \"e9d7777b-b10c-44e6-970c-34dec79d193e\") " pod="openstack/nova-cell1-openstack-openstack-cell1-ktmw7" Oct 07 14:39:46 crc kubenswrapper[4854]: I1007 14:39:46.334070 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e9d7777b-b10c-44e6-970c-34dec79d193e-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-ktmw7\" (UID: \"e9d7777b-b10c-44e6-970c-34dec79d193e\") " pod="openstack/nova-cell1-openstack-openstack-cell1-ktmw7" Oct 07 14:39:46 crc kubenswrapper[4854]: I1007 14:39:46.334126 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9d7777b-b10c-44e6-970c-34dec79d193e-inventory\") pod \"nova-cell1-openstack-openstack-cell1-ktmw7\" (UID: \"e9d7777b-b10c-44e6-970c-34dec79d193e\") " pod="openstack/nova-cell1-openstack-openstack-cell1-ktmw7" Oct 07 14:39:46 crc kubenswrapper[4854]: I1007 14:39:46.334185 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/e9d7777b-b10c-44e6-970c-34dec79d193e-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-ktmw7\" 
(UID: \"e9d7777b-b10c-44e6-970c-34dec79d193e\") " pod="openstack/nova-cell1-openstack-openstack-cell1-ktmw7" Oct 07 14:39:46 crc kubenswrapper[4854]: I1007 14:39:46.334211 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e9d7777b-b10c-44e6-970c-34dec79d193e-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-ktmw7\" (UID: \"e9d7777b-b10c-44e6-970c-34dec79d193e\") " pod="openstack/nova-cell1-openstack-openstack-cell1-ktmw7" Oct 07 14:39:46 crc kubenswrapper[4854]: I1007 14:39:46.335167 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/e9d7777b-b10c-44e6-970c-34dec79d193e-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-ktmw7\" (UID: \"e9d7777b-b10c-44e6-970c-34dec79d193e\") " pod="openstack/nova-cell1-openstack-openstack-cell1-ktmw7" Oct 07 14:39:46 crc kubenswrapper[4854]: I1007 14:39:46.335919 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/e9d7777b-b10c-44e6-970c-34dec79d193e-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-ktmw7\" (UID: \"e9d7777b-b10c-44e6-970c-34dec79d193e\") " pod="openstack/nova-cell1-openstack-openstack-cell1-ktmw7" Oct 07 14:39:46 crc kubenswrapper[4854]: I1007 14:39:46.339491 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e9d7777b-b10c-44e6-970c-34dec79d193e-ceph\") pod \"nova-cell1-openstack-openstack-cell1-ktmw7\" (UID: \"e9d7777b-b10c-44e6-970c-34dec79d193e\") " pod="openstack/nova-cell1-openstack-openstack-cell1-ktmw7" Oct 07 14:39:46 crc kubenswrapper[4854]: I1007 14:39:46.339730 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e9d7777b-b10c-44e6-970c-34dec79d193e-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-ktmw7\" (UID: \"e9d7777b-b10c-44e6-970c-34dec79d193e\") " pod="openstack/nova-cell1-openstack-openstack-cell1-ktmw7" Oct 07 14:39:46 crc kubenswrapper[4854]: I1007 14:39:46.340055 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e9d7777b-b10c-44e6-970c-34dec79d193e-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-ktmw7\" (UID: \"e9d7777b-b10c-44e6-970c-34dec79d193e\") " pod="openstack/nova-cell1-openstack-openstack-cell1-ktmw7" Oct 07 14:39:46 crc kubenswrapper[4854]: I1007 14:39:46.340772 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e9d7777b-b10c-44e6-970c-34dec79d193e-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-ktmw7\" (UID: \"e9d7777b-b10c-44e6-970c-34dec79d193e\") " pod="openstack/nova-cell1-openstack-openstack-cell1-ktmw7" Oct 07 14:39:46 crc kubenswrapper[4854]: I1007 14:39:46.343091 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e9d7777b-b10c-44e6-970c-34dec79d193e-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-ktmw7\" (UID: \"e9d7777b-b10c-44e6-970c-34dec79d193e\") " pod="openstack/nova-cell1-openstack-openstack-cell1-ktmw7" Oct 07 14:39:46 crc kubenswrapper[4854]: I1007 14:39:46.343954 4854 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e9d7777b-b10c-44e6-970c-34dec79d193e-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-ktmw7\" (UID: \"e9d7777b-b10c-44e6-970c-34dec79d193e\") " pod="openstack/nova-cell1-openstack-openstack-cell1-ktmw7" Oct 07 14:39:46 crc kubenswrapper[4854]: I1007 14:39:46.344973 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9d7777b-b10c-44e6-970c-34dec79d193e-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-ktmw7\" (UID: \"e9d7777b-b10c-44e6-970c-34dec79d193e\") " pod="openstack/nova-cell1-openstack-openstack-cell1-ktmw7" Oct 07 14:39:46 crc kubenswrapper[4854]: I1007 14:39:46.350237 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9d7777b-b10c-44e6-970c-34dec79d193e-inventory\") pod \"nova-cell1-openstack-openstack-cell1-ktmw7\" (UID: \"e9d7777b-b10c-44e6-970c-34dec79d193e\") " pod="openstack/nova-cell1-openstack-openstack-cell1-ktmw7" Oct 07 14:39:46 crc kubenswrapper[4854]: I1007 14:39:46.351929 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqr85\" (UniqueName: \"kubernetes.io/projected/e9d7777b-b10c-44e6-970c-34dec79d193e-kube-api-access-xqr85\") pod \"nova-cell1-openstack-openstack-cell1-ktmw7\" (UID: \"e9d7777b-b10c-44e6-970c-34dec79d193e\") " pod="openstack/nova-cell1-openstack-openstack-cell1-ktmw7" Oct 07 14:39:46 crc kubenswrapper[4854]: I1007 14:39:46.412392 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-ktmw7" Oct 07 14:39:47 crc kubenswrapper[4854]: I1007 14:39:47.012383 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-ktmw7"] Oct 07 14:39:47 crc kubenswrapper[4854]: I1007 14:39:47.993387 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-ktmw7" event={"ID":"e9d7777b-b10c-44e6-970c-34dec79d193e","Type":"ContainerStarted","Data":"338252bd6c736f94c71ff7c96d4e0a8c7e02ee9fe19a315c9df6c4dc5c988609"} Oct 07 14:39:47 crc kubenswrapper[4854]: I1007 14:39:47.993868 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-ktmw7" event={"ID":"e9d7777b-b10c-44e6-970c-34dec79d193e","Type":"ContainerStarted","Data":"b508ef17c93fca8bd637e871c209ee9b64e3c3856a881a1b32f9ebe7bbf0ac5c"} Oct 07 14:39:48 crc kubenswrapper[4854]: I1007 14:39:48.018610 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-openstack-cell1-ktmw7" podStartSLOduration=1.5825269560000002 podStartE2EDuration="2.018588992s" podCreationTimestamp="2025-10-07 14:39:46 +0000 UTC" firstStartedPulling="2025-10-07 14:39:47.014961662 +0000 UTC m=+8103.002793957" lastFinishedPulling="2025-10-07 14:39:47.451023728 +0000 UTC m=+8103.438855993" observedRunningTime="2025-10-07 14:39:48.012236788 +0000 UTC m=+8104.000069043" watchObservedRunningTime="2025-10-07 14:39:48.018588992 +0000 UTC m=+8104.006421247" Oct 07 14:40:10 crc kubenswrapper[4854]: I1007 14:40:10.807620 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:40:10 crc kubenswrapper[4854]: I1007 14:40:10.808193 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:40:40 crc kubenswrapper[4854]: I1007 14:40:40.807451 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:40:40 crc kubenswrapper[4854]: I1007 14:40:40.808008 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:40:40 crc kubenswrapper[4854]: I1007 14:40:40.808064 4854 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" Oct 07 14:40:40 crc kubenswrapper[4854]: I1007 14:40:40.808840 4854 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"390e57b5d82f9be820f7fa4a2a67b06934642922575e8b6606f5dbf6a1870462"} pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 14:40:40 crc kubenswrapper[4854]: I1007 14:40:40.808902 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" containerID="cri-o://390e57b5d82f9be820f7fa4a2a67b06934642922575e8b6606f5dbf6a1870462" gracePeriod=600 Oct 07 14:40:40 crc kubenswrapper[4854]: E1007 14:40:40.938962 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:40:41 crc kubenswrapper[4854]: I1007 14:40:41.602307 4854 generic.go:334] "Generic (PLEG): container finished" podID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerID="390e57b5d82f9be820f7fa4a2a67b06934642922575e8b6606f5dbf6a1870462" exitCode=0 Oct 07 14:40:41 crc kubenswrapper[4854]: I1007 14:40:41.602362 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" event={"ID":"40b8b82d-cfd5-41d7-8673-5774db092c85","Type":"ContainerDied","Data":"390e57b5d82f9be820f7fa4a2a67b06934642922575e8b6606f5dbf6a1870462"} Oct 07 14:40:41 crc kubenswrapper[4854]: I1007 14:40:41.602400 4854 scope.go:117] "RemoveContainer" containerID="dc81969983bbce8baa8681e21516942005959dc51371ccf54e48acb511d0a582" Oct 07 
14:40:41 crc kubenswrapper[4854]: I1007 14:40:41.603139 4854 scope.go:117] "RemoveContainer" containerID="390e57b5d82f9be820f7fa4a2a67b06934642922575e8b6606f5dbf6a1870462" Oct 07 14:40:41 crc kubenswrapper[4854]: E1007 14:40:41.603527 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:40:55 crc kubenswrapper[4854]: I1007 14:40:55.703036 4854 scope.go:117] "RemoveContainer" containerID="390e57b5d82f9be820f7fa4a2a67b06934642922575e8b6606f5dbf6a1870462" Oct 07 14:40:55 crc kubenswrapper[4854]: E1007 14:40:55.703863 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:41:06 crc kubenswrapper[4854]: I1007 14:41:06.703700 4854 scope.go:117] "RemoveContainer" containerID="390e57b5d82f9be820f7fa4a2a67b06934642922575e8b6606f5dbf6a1870462" Oct 07 14:41:06 crc kubenswrapper[4854]: E1007 14:41:06.705210 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:41:19 crc kubenswrapper[4854]: I1007 14:41:19.702483 4854 scope.go:117] "RemoveContainer" containerID="390e57b5d82f9be820f7fa4a2a67b06934642922575e8b6606f5dbf6a1870462" Oct 07 14:41:19 crc kubenswrapper[4854]: E1007 14:41:19.704083 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:41:34 crc kubenswrapper[4854]: I1007 14:41:34.709568 4854 scope.go:117] "RemoveContainer" containerID="390e57b5d82f9be820f7fa4a2a67b06934642922575e8b6606f5dbf6a1870462" Oct 07 14:41:34 crc kubenswrapper[4854]: E1007 14:41:34.710361 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:41:49 crc kubenswrapper[4854]: I1007 14:41:49.702543 4854 scope.go:117] "RemoveContainer" containerID="390e57b5d82f9be820f7fa4a2a67b06934642922575e8b6606f5dbf6a1870462" Oct 07 14:41:49 crc 
kubenswrapper[4854]: E1007 14:41:49.703357 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:42:00 crc kubenswrapper[4854]: I1007 14:42:00.702923 4854 scope.go:117] "RemoveContainer" containerID="390e57b5d82f9be820f7fa4a2a67b06934642922575e8b6606f5dbf6a1870462" Oct 07 14:42:00 crc kubenswrapper[4854]: E1007 14:42:00.703683 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:42:12 crc kubenswrapper[4854]: I1007 14:42:12.706524 4854 scope.go:117] "RemoveContainer" containerID="390e57b5d82f9be820f7fa4a2a67b06934642922575e8b6606f5dbf6a1870462" Oct 07 14:42:12 crc kubenswrapper[4854]: E1007 14:42:12.707999 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:42:26 crc kubenswrapper[4854]: I1007 14:42:26.704043 4854 scope.go:117] "RemoveContainer" containerID="390e57b5d82f9be820f7fa4a2a67b06934642922575e8b6606f5dbf6a1870462" Oct 07 14:42:26 crc kubenswrapper[4854]: E1007 14:42:26.705264 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:42:40 crc kubenswrapper[4854]: I1007 14:42:40.704122 4854 scope.go:117] "RemoveContainer" containerID="390e57b5d82f9be820f7fa4a2a67b06934642922575e8b6606f5dbf6a1870462" Oct 07 14:42:40 crc kubenswrapper[4854]: E1007 14:42:40.705574 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:42:52 crc kubenswrapper[4854]: I1007 14:42:52.703440 4854 scope.go:117] "RemoveContainer" containerID="390e57b5d82f9be820f7fa4a2a67b06934642922575e8b6606f5dbf6a1870462" Oct 07 14:42:52 crc kubenswrapper[4854]: E1007 14:42:52.704582 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:43:04 crc kubenswrapper[4854]: I1007 14:43:04.719899 4854 scope.go:117] "RemoveContainer" containerID="390e57b5d82f9be820f7fa4a2a67b06934642922575e8b6606f5dbf6a1870462" Oct 07 14:43:04 crc kubenswrapper[4854]: E1007 14:43:04.722428 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:43:19 crc kubenswrapper[4854]: I1007 14:43:19.703119 4854 scope.go:117] "RemoveContainer" containerID="390e57b5d82f9be820f7fa4a2a67b06934642922575e8b6606f5dbf6a1870462" Oct 07 14:43:19 crc kubenswrapper[4854]: E1007 14:43:19.704055 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:43:19 crc kubenswrapper[4854]: I1007 14:43:19.977612 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vcm84"] Oct 07 14:43:19 crc kubenswrapper[4854]: I1007 14:43:19.979837 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vcm84" Oct 07 14:43:19 crc kubenswrapper[4854]: I1007 14:43:19.994862 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/085d10f5-39c8-469c-beef-02158cccf12b-catalog-content\") pod \"redhat-operators-vcm84\" (UID: \"085d10f5-39c8-469c-beef-02158cccf12b\") " pod="openshift-marketplace/redhat-operators-vcm84" Oct 07 14:43:19 crc kubenswrapper[4854]: I1007 14:43:19.995295 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzz45\" (UniqueName: \"kubernetes.io/projected/085d10f5-39c8-469c-beef-02158cccf12b-kube-api-access-kzz45\") pod \"redhat-operators-vcm84\" (UID: \"085d10f5-39c8-469c-beef-02158cccf12b\") " pod="openshift-marketplace/redhat-operators-vcm84" Oct 07 14:43:19 crc kubenswrapper[4854]: I1007 14:43:19.995584 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/085d10f5-39c8-469c-beef-02158cccf12b-utilities\") pod \"redhat-operators-vcm84\" (UID: \"085d10f5-39c8-469c-beef-02158cccf12b\") " pod="openshift-marketplace/redhat-operators-vcm84" Oct 07 14:43:20 crc kubenswrapper[4854]: I1007 14:43:20.002484 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vcm84"] Oct 07 14:43:20 crc kubenswrapper[4854]: I1007 14:43:20.097294 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/085d10f5-39c8-469c-beef-02158cccf12b-utilities\") pod \"redhat-operators-vcm84\" (UID: \"085d10f5-39c8-469c-beef-02158cccf12b\") " pod="openshift-marketplace/redhat-operators-vcm84" Oct 07 14:43:20 crc kubenswrapper[4854]: I1007 14:43:20.097381 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/085d10f5-39c8-469c-beef-02158cccf12b-catalog-content\") pod \"redhat-operators-vcm84\" (UID: \"085d10f5-39c8-469c-beef-02158cccf12b\") " pod="openshift-marketplace/redhat-operators-vcm84" Oct 07 14:43:20 crc kubenswrapper[4854]: I1007 14:43:20.097524 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzz45\" (UniqueName: \"kubernetes.io/projected/085d10f5-39c8-469c-beef-02158cccf12b-kube-api-access-kzz45\") pod \"redhat-operators-vcm84\" (UID: \"085d10f5-39c8-469c-beef-02158cccf12b\") " pod="openshift-marketplace/redhat-operators-vcm84" Oct 07 14:43:20 crc kubenswrapper[4854]: I1007 14:43:20.098115 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/085d10f5-39c8-469c-beef-02158cccf12b-utilities\") pod \"redhat-operators-vcm84\" (UID: \"085d10f5-39c8-469c-beef-02158cccf12b\") " pod="openshift-marketplace/redhat-operators-vcm84" Oct 07 14:43:20 crc kubenswrapper[4854]: I1007 14:43:20.098254 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/085d10f5-39c8-469c-beef-02158cccf12b-catalog-content\") pod \"redhat-operators-vcm84\" (UID: \"085d10f5-39c8-469c-beef-02158cccf12b\") " pod="openshift-marketplace/redhat-operators-vcm84" Oct 07 14:43:20 crc kubenswrapper[4854]: I1007 14:43:20.122847 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-kzz45\" (UniqueName: \"kubernetes.io/projected/085d10f5-39c8-469c-beef-02158cccf12b-kube-api-access-kzz45\") pod \"redhat-operators-vcm84\" (UID: \"085d10f5-39c8-469c-beef-02158cccf12b\") " pod="openshift-marketplace/redhat-operators-vcm84" Oct 07 14:43:20 crc kubenswrapper[4854]: I1007 14:43:20.305413 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vcm84" Oct 07 14:43:20 crc kubenswrapper[4854]: I1007 14:43:20.819676 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vcm84"] Oct 07 14:43:21 crc kubenswrapper[4854]: I1007 14:43:21.541654 4854 generic.go:334] "Generic (PLEG): container finished" podID="085d10f5-39c8-469c-beef-02158cccf12b" containerID="6a0b06b409fe17c0e3f931a2ff8f1a04cfbf2c24e5a6bc4d0a6e892c2e2e9403" exitCode=0 Oct 07 14:43:21 crc kubenswrapper[4854]: I1007 14:43:21.541722 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vcm84" event={"ID":"085d10f5-39c8-469c-beef-02158cccf12b","Type":"ContainerDied","Data":"6a0b06b409fe17c0e3f931a2ff8f1a04cfbf2c24e5a6bc4d0a6e892c2e2e9403"} Oct 07 14:43:21 crc kubenswrapper[4854]: I1007 14:43:21.541977 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vcm84" event={"ID":"085d10f5-39c8-469c-beef-02158cccf12b","Type":"ContainerStarted","Data":"c69a91624618048fd814c9e1a1821a719b24b392da3ba952bd7612ec0d45705a"} Oct 07 14:43:21 crc kubenswrapper[4854]: I1007 14:43:21.544714 4854 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 14:43:23 crc kubenswrapper[4854]: I1007 14:43:23.582568 4854 generic.go:334] "Generic (PLEG): container finished" podID="085d10f5-39c8-469c-beef-02158cccf12b" containerID="2f239bc4f059138cbe7bef20b32ab8bae7cad725d92cbb87460c8dadddf61fef" exitCode=0 Oct 07 14:43:23 crc kubenswrapper[4854]: I1007 14:43:23.582767 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vcm84" event={"ID":"085d10f5-39c8-469c-beef-02158cccf12b","Type":"ContainerDied","Data":"2f239bc4f059138cbe7bef20b32ab8bae7cad725d92cbb87460c8dadddf61fef"} Oct 07 14:43:24 crc kubenswrapper[4854]: I1007 14:43:24.602426 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vcm84" event={"ID":"085d10f5-39c8-469c-beef-02158cccf12b","Type":"ContainerStarted","Data":"72ecb663d15baca7405b21ceac12c46b0dcb83fb112f0490d25f48721048b2b7"} Oct 07 14:43:24 crc kubenswrapper[4854]: I1007 14:43:24.623715 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vcm84" podStartSLOduration=3.151039177 podStartE2EDuration="5.623697051s" podCreationTimestamp="2025-10-07 14:43:19 +0000 UTC" firstStartedPulling="2025-10-07 14:43:21.544442545 +0000 UTC m=+8317.532274810" lastFinishedPulling="2025-10-07 14:43:24.017100429 +0000 UTC m=+8320.004932684" observedRunningTime="2025-10-07 14:43:24.620687474 +0000 UTC m=+8320.608519759" watchObservedRunningTime="2025-10-07 14:43:24.623697051 +0000 UTC m=+8320.611529306" Oct 07 14:43:30 crc kubenswrapper[4854]: I1007 14:43:30.305562 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vcm84" Oct 07 14:43:30 crc kubenswrapper[4854]: I1007 14:43:30.305992 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-vcm84" Oct 07 14:43:30 crc kubenswrapper[4854]: I1007 14:43:30.382087 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vcm84" Oct 07 14:43:30 crc kubenswrapper[4854]: I1007 14:43:30.720994 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vcm84" Oct 07 14:43:30 crc kubenswrapper[4854]: I1007 14:43:30.786088 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vcm84"] Oct 07 14:43:31 crc kubenswrapper[4854]: I1007 14:43:31.666409 4854 generic.go:334] "Generic (PLEG): container finished" podID="e9d7777b-b10c-44e6-970c-34dec79d193e" containerID="338252bd6c736f94c71ff7c96d4e0a8c7e02ee9fe19a315c9df6c4dc5c988609" exitCode=0 Oct 07 14:43:31 crc kubenswrapper[4854]: I1007 14:43:31.666501 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-ktmw7" event={"ID":"e9d7777b-b10c-44e6-970c-34dec79d193e","Type":"ContainerDied","Data":"338252bd6c736f94c71ff7c96d4e0a8c7e02ee9fe19a315c9df6c4dc5c988609"} Oct 07 14:43:31 crc kubenswrapper[4854]: I1007 14:43:31.703017 4854 scope.go:117] "RemoveContainer" containerID="390e57b5d82f9be820f7fa4a2a67b06934642922575e8b6606f5dbf6a1870462" Oct 07 14:43:31 crc kubenswrapper[4854]: E1007 14:43:31.703324 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:43:32 crc kubenswrapper[4854]: I1007 14:43:32.676048 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vcm84" podUID="085d10f5-39c8-469c-beef-02158cccf12b" containerName="registry-server" containerID="cri-o://72ecb663d15baca7405b21ceac12c46b0dcb83fb112f0490d25f48721048b2b7" gracePeriod=2 Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.292913 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-ktmw7" Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.300997 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vcm84" Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.408691 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e9d7777b-b10c-44e6-970c-34dec79d193e-ssh-key\") pod \"e9d7777b-b10c-44e6-970c-34dec79d193e\" (UID: \"e9d7777b-b10c-44e6-970c-34dec79d193e\") " Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.408775 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzz45\" (UniqueName: \"kubernetes.io/projected/085d10f5-39c8-469c-beef-02158cccf12b-kube-api-access-kzz45\") pod \"085d10f5-39c8-469c-beef-02158cccf12b\" (UID: \"085d10f5-39c8-469c-beef-02158cccf12b\") " Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.408803 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e9d7777b-b10c-44e6-970c-34dec79d193e-nova-migration-ssh-key-0\") pod \"e9d7777b-b10c-44e6-970c-34dec79d193e\" (UID: \"e9d7777b-b10c-44e6-970c-34dec79d193e\") " Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.408866 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9d7777b-b10c-44e6-970c-34dec79d193e-nova-cell1-combined-ca-bundle\") pod \"e9d7777b-b10c-44e6-970c-34dec79d193e\" (UID: \"e9d7777b-b10c-44e6-970c-34dec79d193e\") " Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.408939 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e9d7777b-b10c-44e6-970c-34dec79d193e-nova-cell1-compute-config-0\") pod \"e9d7777b-b10c-44e6-970c-34dec79d193e\" (UID: \"e9d7777b-b10c-44e6-970c-34dec79d193e\") " Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.408961 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqr85\" (UniqueName: \"kubernetes.io/projected/e9d7777b-b10c-44e6-970c-34dec79d193e-kube-api-access-xqr85\") pod \"e9d7777b-b10c-44e6-970c-34dec79d193e\" (UID: \"e9d7777b-b10c-44e6-970c-34dec79d193e\") " Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.409005 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/e9d7777b-b10c-44e6-970c-34dec79d193e-nova-cells-global-config-1\") pod \"e9d7777b-b10c-44e6-970c-34dec79d193e\" (UID: \"e9d7777b-b10c-44e6-970c-34dec79d193e\") " Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.409035 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e9d7777b-b10c-44e6-970c-34dec79d193e-nova-migration-ssh-key-1\") pod \"e9d7777b-b10c-44e6-970c-34dec79d193e\" (UID: \"e9d7777b-b10c-44e6-970c-34dec79d193e\") " Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.409091 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/085d10f5-39c8-469c-beef-02158cccf12b-utilities\") pod \"085d10f5-39c8-469c-beef-02158cccf12b\" (UID: \"085d10f5-39c8-469c-beef-02158cccf12b\") " Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.409292 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/e9d7777b-b10c-44e6-970c-34dec79d193e-inventory\") pod \"e9d7777b-b10c-44e6-970c-34dec79d193e\" (UID: \"e9d7777b-b10c-44e6-970c-34dec79d193e\") " Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.409308 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e9d7777b-b10c-44e6-970c-34dec79d193e-ceph\") pod \"e9d7777b-b10c-44e6-970c-34dec79d193e\" (UID: \"e9d7777b-b10c-44e6-970c-34dec79d193e\") " Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.409340 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e9d7777b-b10c-44e6-970c-34dec79d193e-nova-cell1-compute-config-1\") pod \"e9d7777b-b10c-44e6-970c-34dec79d193e\" (UID: \"e9d7777b-b10c-44e6-970c-34dec79d193e\") " Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.409381 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/e9d7777b-b10c-44e6-970c-34dec79d193e-nova-cells-global-config-0\") pod \"e9d7777b-b10c-44e6-970c-34dec79d193e\" (UID: \"e9d7777b-b10c-44e6-970c-34dec79d193e\") " Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.409451 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/085d10f5-39c8-469c-beef-02158cccf12b-catalog-content\") pod \"085d10f5-39c8-469c-beef-02158cccf12b\" (UID: \"085d10f5-39c8-469c-beef-02158cccf12b\") " Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.410334 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/085d10f5-39c8-469c-beef-02158cccf12b-utilities" (OuterVolumeSpecName: "utilities") pod "085d10f5-39c8-469c-beef-02158cccf12b" (UID: "085d10f5-39c8-469c-beef-02158cccf12b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.415612 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9d7777b-b10c-44e6-970c-34dec79d193e-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "e9d7777b-b10c-44e6-970c-34dec79d193e" (UID: "e9d7777b-b10c-44e6-970c-34dec79d193e"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.415988 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9d7777b-b10c-44e6-970c-34dec79d193e-kube-api-access-xqr85" (OuterVolumeSpecName: "kube-api-access-xqr85") pod "e9d7777b-b10c-44e6-970c-34dec79d193e" (UID: "e9d7777b-b10c-44e6-970c-34dec79d193e"). InnerVolumeSpecName "kube-api-access-xqr85". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.418343 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9d7777b-b10c-44e6-970c-34dec79d193e-ceph" (OuterVolumeSpecName: "ceph") pod "e9d7777b-b10c-44e6-970c-34dec79d193e" (UID: "e9d7777b-b10c-44e6-970c-34dec79d193e"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.421530 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/085d10f5-39c8-469c-beef-02158cccf12b-kube-api-access-kzz45" (OuterVolumeSpecName: "kube-api-access-kzz45") pod "085d10f5-39c8-469c-beef-02158cccf12b" (UID: "085d10f5-39c8-469c-beef-02158cccf12b"). InnerVolumeSpecName "kube-api-access-kzz45". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.446345 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9d7777b-b10c-44e6-970c-34dec79d193e-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "e9d7777b-b10c-44e6-970c-34dec79d193e" (UID: "e9d7777b-b10c-44e6-970c-34dec79d193e"). InnerVolumeSpecName "nova-cells-global-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.451347 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9d7777b-b10c-44e6-970c-34dec79d193e-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "e9d7777b-b10c-44e6-970c-34dec79d193e" (UID: "e9d7777b-b10c-44e6-970c-34dec79d193e"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.454366 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9d7777b-b10c-44e6-970c-34dec79d193e-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "e9d7777b-b10c-44e6-970c-34dec79d193e" (UID: "e9d7777b-b10c-44e6-970c-34dec79d193e"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.459408 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9d7777b-b10c-44e6-970c-34dec79d193e-nova-cells-global-config-1" (OuterVolumeSpecName: "nova-cells-global-config-1") pod "e9d7777b-b10c-44e6-970c-34dec79d193e" (UID: "e9d7777b-b10c-44e6-970c-34dec79d193e"). InnerVolumeSpecName "nova-cells-global-config-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.460763 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9d7777b-b10c-44e6-970c-34dec79d193e-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "e9d7777b-b10c-44e6-970c-34dec79d193e" (UID: "e9d7777b-b10c-44e6-970c-34dec79d193e"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.467414 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9d7777b-b10c-44e6-970c-34dec79d193e-inventory" (OuterVolumeSpecName: "inventory") pod "e9d7777b-b10c-44e6-970c-34dec79d193e" (UID: "e9d7777b-b10c-44e6-970c-34dec79d193e"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.467453 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9d7777b-b10c-44e6-970c-34dec79d193e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e9d7777b-b10c-44e6-970c-34dec79d193e" (UID: "e9d7777b-b10c-44e6-970c-34dec79d193e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.469980 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9d7777b-b10c-44e6-970c-34dec79d193e-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "e9d7777b-b10c-44e6-970c-34dec79d193e" (UID: "e9d7777b-b10c-44e6-970c-34dec79d193e"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.501988 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/085d10f5-39c8-469c-beef-02158cccf12b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "085d10f5-39c8-469c-beef-02158cccf12b" (UID: "085d10f5-39c8-469c-beef-02158cccf12b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.512256 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/085d10f5-39c8-469c-beef-02158cccf12b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.512288 4854 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e9d7777b-b10c-44e6-970c-34dec79d193e-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.512304 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzz45\" (UniqueName: \"kubernetes.io/projected/085d10f5-39c8-469c-beef-02158cccf12b-kube-api-access-kzz45\") on node \"crc\" DevicePath \"\"" Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.512321 4854 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e9d7777b-b10c-44e6-970c-34dec79d193e-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.512333 4854 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9d7777b-b10c-44e6-970c-34dec79d193e-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.512344 4854 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e9d7777b-b10c-44e6-970c-34dec79d193e-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.512356 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqr85\" (UniqueName: \"kubernetes.io/projected/e9d7777b-b10c-44e6-970c-34dec79d193e-kube-api-access-xqr85\") on node \"crc\" DevicePath \"\"" Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.512367 4854 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-1\" (UniqueName: 
\"kubernetes.io/configmap/e9d7777b-b10c-44e6-970c-34dec79d193e-nova-cells-global-config-1\") on node \"crc\" DevicePath \"\"" Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.512379 4854 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e9d7777b-b10c-44e6-970c-34dec79d193e-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.512390 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/085d10f5-39c8-469c-beef-02158cccf12b-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.512401 4854 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e9d7777b-b10c-44e6-970c-34dec79d193e-ceph\") on node \"crc\" DevicePath \"\"" Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.512413 4854 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9d7777b-b10c-44e6-970c-34dec79d193e-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.512424 4854 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e9d7777b-b10c-44e6-970c-34dec79d193e-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.512436 4854 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/e9d7777b-b10c-44e6-970c-34dec79d193e-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.687016 4854 generic.go:334] "Generic (PLEG): container finished" podID="085d10f5-39c8-469c-beef-02158cccf12b" containerID="72ecb663d15baca7405b21ceac12c46b0dcb83fb112f0490d25f48721048b2b7" exitCode=0 Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.687081 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vcm84" Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.688216 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vcm84" event={"ID":"085d10f5-39c8-469c-beef-02158cccf12b","Type":"ContainerDied","Data":"72ecb663d15baca7405b21ceac12c46b0dcb83fb112f0490d25f48721048b2b7"} Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.688390 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vcm84" event={"ID":"085d10f5-39c8-469c-beef-02158cccf12b","Type":"ContainerDied","Data":"c69a91624618048fd814c9e1a1821a719b24b392da3ba952bd7612ec0d45705a"} Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.688417 4854 scope.go:117] "RemoveContainer" containerID="72ecb663d15baca7405b21ceac12c46b0dcb83fb112f0490d25f48721048b2b7" Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.689625 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-ktmw7" event={"ID":"e9d7777b-b10c-44e6-970c-34dec79d193e","Type":"ContainerDied","Data":"b508ef17c93fca8bd637e871c209ee9b64e3c3856a881a1b32f9ebe7bbf0ac5c"} Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.689657 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-ktmw7" Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.689665 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b508ef17c93fca8bd637e871c209ee9b64e3c3856a881a1b32f9ebe7bbf0ac5c" Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.715876 4854 scope.go:117] "RemoveContainer" containerID="2f239bc4f059138cbe7bef20b32ab8bae7cad725d92cbb87460c8dadddf61fef" Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.751238 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vcm84"] Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.762572 4854 scope.go:117] "RemoveContainer" containerID="6a0b06b409fe17c0e3f931a2ff8f1a04cfbf2c24e5a6bc4d0a6e892c2e2e9403" Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.771440 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vcm84"] Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.792355 4854 scope.go:117] "RemoveContainer" containerID="72ecb663d15baca7405b21ceac12c46b0dcb83fb112f0490d25f48721048b2b7" Oct 07 14:43:33 crc kubenswrapper[4854]: E1007 14:43:33.792923 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72ecb663d15baca7405b21ceac12c46b0dcb83fb112f0490d25f48721048b2b7\": container with ID starting with 72ecb663d15baca7405b21ceac12c46b0dcb83fb112f0490d25f48721048b2b7 not found: ID does not exist" containerID="72ecb663d15baca7405b21ceac12c46b0dcb83fb112f0490d25f48721048b2b7" Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.792971 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72ecb663d15baca7405b21ceac12c46b0dcb83fb112f0490d25f48721048b2b7"} err="failed to get container status \"72ecb663d15baca7405b21ceac12c46b0dcb83fb112f0490d25f48721048b2b7\": rpc error: code = NotFound desc = could not find container \"72ecb663d15baca7405b21ceac12c46b0dcb83fb112f0490d25f48721048b2b7\": container with ID starting with 72ecb663d15baca7405b21ceac12c46b0dcb83fb112f0490d25f48721048b2b7 not found: ID does not exist" Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.793002 4854 scope.go:117] "RemoveContainer" containerID="2f239bc4f059138cbe7bef20b32ab8bae7cad725d92cbb87460c8dadddf61fef" Oct 07 14:43:33 crc kubenswrapper[4854]: E1007 14:43:33.793375 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f239bc4f059138cbe7bef20b32ab8bae7cad725d92cbb87460c8dadddf61fef\": container with ID starting with 2f239bc4f059138cbe7bef20b32ab8bae7cad725d92cbb87460c8dadddf61fef not found: ID does not exist" containerID="2f239bc4f059138cbe7bef20b32ab8bae7cad725d92cbb87460c8dadddf61fef" Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.793405 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f239bc4f059138cbe7bef20b32ab8bae7cad725d92cbb87460c8dadddf61fef"} err="failed to get container status \"2f239bc4f059138cbe7bef20b32ab8bae7cad725d92cbb87460c8dadddf61fef\": rpc error: code = NotFound desc = could not find container \"2f239bc4f059138cbe7bef20b32ab8bae7cad725d92cbb87460c8dadddf61fef\": container with ID starting with 2f239bc4f059138cbe7bef20b32ab8bae7cad725d92cbb87460c8dadddf61fef not found: ID does not exist" Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.793423 4854 
scope.go:117] "RemoveContainer" containerID="6a0b06b409fe17c0e3f931a2ff8f1a04cfbf2c24e5a6bc4d0a6e892c2e2e9403" Oct 07 14:43:33 crc kubenswrapper[4854]: E1007 14:43:33.793712 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a0b06b409fe17c0e3f931a2ff8f1a04cfbf2c24e5a6bc4d0a6e892c2e2e9403\": container with ID starting with 6a0b06b409fe17c0e3f931a2ff8f1a04cfbf2c24e5a6bc4d0a6e892c2e2e9403 not found: ID does not exist" containerID="6a0b06b409fe17c0e3f931a2ff8f1a04cfbf2c24e5a6bc4d0a6e892c2e2e9403" Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.793753 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a0b06b409fe17c0e3f931a2ff8f1a04cfbf2c24e5a6bc4d0a6e892c2e2e9403"} err="failed to get container status \"6a0b06b409fe17c0e3f931a2ff8f1a04cfbf2c24e5a6bc4d0a6e892c2e2e9403\": rpc error: code = NotFound desc = could not find container \"6a0b06b409fe17c0e3f931a2ff8f1a04cfbf2c24e5a6bc4d0a6e892c2e2e9403\": container with ID starting with 6a0b06b409fe17c0e3f931a2ff8f1a04cfbf2c24e5a6bc4d0a6e892c2e2e9403 not found: ID does not exist" Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.795613 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-csjlt"] Oct 07 14:43:33 crc kubenswrapper[4854]: E1007 14:43:33.796195 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9d7777b-b10c-44e6-970c-34dec79d193e" containerName="nova-cell1-openstack-openstack-cell1" Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.796223 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9d7777b-b10c-44e6-970c-34dec79d193e" containerName="nova-cell1-openstack-openstack-cell1" Oct 07 14:43:33 crc kubenswrapper[4854]: E1007 14:43:33.796250 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="085d10f5-39c8-469c-beef-02158cccf12b" containerName="extract-utilities" Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.796260 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="085d10f5-39c8-469c-beef-02158cccf12b" containerName="extract-utilities" Oct 07 14:43:33 crc kubenswrapper[4854]: E1007 14:43:33.796284 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="085d10f5-39c8-469c-beef-02158cccf12b" containerName="extract-content" Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.796292 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="085d10f5-39c8-469c-beef-02158cccf12b" containerName="extract-content" Oct 07 14:43:33 crc kubenswrapper[4854]: E1007 14:43:33.796311 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="085d10f5-39c8-469c-beef-02158cccf12b" containerName="registry-server" Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.796321 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="085d10f5-39c8-469c-beef-02158cccf12b" containerName="registry-server" Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.796578 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9d7777b-b10c-44e6-970c-34dec79d193e" containerName="nova-cell1-openstack-openstack-cell1" Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.796615 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="085d10f5-39c8-469c-beef-02158cccf12b" containerName="registry-server" Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.797720 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-csjlt" Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.800619 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.800908 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.801083 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-n7cf5" Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.801271 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.801514 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.806070 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-csjlt"] Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.934348 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktc5p\" (UniqueName: \"kubernetes.io/projected/41fb94cd-3209-4d7e-803b-85f122d3800b-kube-api-access-ktc5p\") pod \"telemetry-openstack-openstack-cell1-csjlt\" (UID: \"41fb94cd-3209-4d7e-803b-85f122d3800b\") " pod="openstack/telemetry-openstack-openstack-cell1-csjlt" Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.934811 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/41fb94cd-3209-4d7e-803b-85f122d3800b-ceph\") pod \"telemetry-openstack-openstack-cell1-csjlt\" (UID: \"41fb94cd-3209-4d7e-803b-85f122d3800b\") " pod="openstack/telemetry-openstack-openstack-cell1-csjlt" Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.934924 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/41fb94cd-3209-4d7e-803b-85f122d3800b-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-csjlt\" (UID: \"41fb94cd-3209-4d7e-803b-85f122d3800b\") " pod="openstack/telemetry-openstack-openstack-cell1-csjlt" Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.934985 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/41fb94cd-3209-4d7e-803b-85f122d3800b-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-csjlt\" (UID: \"41fb94cd-3209-4d7e-803b-85f122d3800b\") " pod="openstack/telemetry-openstack-openstack-cell1-csjlt" Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.935033 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/41fb94cd-3209-4d7e-803b-85f122d3800b-ssh-key\") pod \"telemetry-openstack-openstack-cell1-csjlt\" (UID: \"41fb94cd-3209-4d7e-803b-85f122d3800b\") " pod="openstack/telemetry-openstack-openstack-cell1-csjlt" Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.935102 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: 
\"kubernetes.io/secret/41fb94cd-3209-4d7e-803b-85f122d3800b-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-csjlt\" (UID: \"41fb94cd-3209-4d7e-803b-85f122d3800b\") " pod="openstack/telemetry-openstack-openstack-cell1-csjlt" Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.935384 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/41fb94cd-3209-4d7e-803b-85f122d3800b-inventory\") pod \"telemetry-openstack-openstack-cell1-csjlt\" (UID: \"41fb94cd-3209-4d7e-803b-85f122d3800b\") " pod="openstack/telemetry-openstack-openstack-cell1-csjlt" Oct 07 14:43:33 crc kubenswrapper[4854]: I1007 14:43:33.935438 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41fb94cd-3209-4d7e-803b-85f122d3800b-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-csjlt\" (UID: \"41fb94cd-3209-4d7e-803b-85f122d3800b\") " pod="openstack/telemetry-openstack-openstack-cell1-csjlt" Oct 07 14:43:34 crc kubenswrapper[4854]: I1007 14:43:34.036828 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/41fb94cd-3209-4d7e-803b-85f122d3800b-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-csjlt\" (UID: \"41fb94cd-3209-4d7e-803b-85f122d3800b\") " pod="openstack/telemetry-openstack-openstack-cell1-csjlt" Oct 07 14:43:34 crc kubenswrapper[4854]: I1007 14:43:34.036969 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/41fb94cd-3209-4d7e-803b-85f122d3800b-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-csjlt\" (UID: \"41fb94cd-3209-4d7e-803b-85f122d3800b\") " pod="openstack/telemetry-openstack-openstack-cell1-csjlt" Oct 07 14:43:34 crc kubenswrapper[4854]: I1007 14:43:34.037067 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/41fb94cd-3209-4d7e-803b-85f122d3800b-ssh-key\") pod \"telemetry-openstack-openstack-cell1-csjlt\" (UID: \"41fb94cd-3209-4d7e-803b-85f122d3800b\") " pod="openstack/telemetry-openstack-openstack-cell1-csjlt" Oct 07 14:43:34 crc kubenswrapper[4854]: I1007 14:43:34.037174 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/41fb94cd-3209-4d7e-803b-85f122d3800b-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-csjlt\" (UID: \"41fb94cd-3209-4d7e-803b-85f122d3800b\") " pod="openstack/telemetry-openstack-openstack-cell1-csjlt" Oct 07 14:43:34 crc kubenswrapper[4854]: I1007 14:43:34.037355 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/41fb94cd-3209-4d7e-803b-85f122d3800b-inventory\") pod \"telemetry-openstack-openstack-cell1-csjlt\" (UID: \"41fb94cd-3209-4d7e-803b-85f122d3800b\") " pod="openstack/telemetry-openstack-openstack-cell1-csjlt" Oct 07 14:43:34 crc kubenswrapper[4854]: I1007 14:43:34.037442 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/41fb94cd-3209-4d7e-803b-85f122d3800b-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-csjlt\" (UID: \"41fb94cd-3209-4d7e-803b-85f122d3800b\") " pod="openstack/telemetry-openstack-openstack-cell1-csjlt" Oct 07 14:43:34 crc kubenswrapper[4854]: I1007 14:43:34.037579 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktc5p\" (UniqueName: \"kubernetes.io/projected/41fb94cd-3209-4d7e-803b-85f122d3800b-kube-api-access-ktc5p\") pod \"telemetry-openstack-openstack-cell1-csjlt\" (UID: \"41fb94cd-3209-4d7e-803b-85f122d3800b\") " pod="openstack/telemetry-openstack-openstack-cell1-csjlt" Oct 07 14:43:34 crc kubenswrapper[4854]: I1007 14:43:34.037686 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/41fb94cd-3209-4d7e-803b-85f122d3800b-ceph\") pod \"telemetry-openstack-openstack-cell1-csjlt\" (UID: \"41fb94cd-3209-4d7e-803b-85f122d3800b\") " pod="openstack/telemetry-openstack-openstack-cell1-csjlt" Oct 07 14:43:34 crc kubenswrapper[4854]: I1007 14:43:34.043241 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/41fb94cd-3209-4d7e-803b-85f122d3800b-ssh-key\") pod \"telemetry-openstack-openstack-cell1-csjlt\" (UID: \"41fb94cd-3209-4d7e-803b-85f122d3800b\") " pod="openstack/telemetry-openstack-openstack-cell1-csjlt" Oct 07 14:43:34 crc kubenswrapper[4854]: I1007 14:43:34.043434 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/41fb94cd-3209-4d7e-803b-85f122d3800b-ceph\") pod \"telemetry-openstack-openstack-cell1-csjlt\" (UID: \"41fb94cd-3209-4d7e-803b-85f122d3800b\") " pod="openstack/telemetry-openstack-openstack-cell1-csjlt" Oct 07 14:43:34 crc kubenswrapper[4854]: I1007 14:43:34.044972 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/41fb94cd-3209-4d7e-803b-85f122d3800b-inventory\") pod \"telemetry-openstack-openstack-cell1-csjlt\" (UID: \"41fb94cd-3209-4d7e-803b-85f122d3800b\") " pod="openstack/telemetry-openstack-openstack-cell1-csjlt" Oct 07 14:43:34 crc kubenswrapper[4854]: I1007 14:43:34.048739 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/41fb94cd-3209-4d7e-803b-85f122d3800b-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-csjlt\" (UID: \"41fb94cd-3209-4d7e-803b-85f122d3800b\") " pod="openstack/telemetry-openstack-openstack-cell1-csjlt" Oct 07 14:43:34 crc kubenswrapper[4854]: I1007 14:43:34.048840 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/41fb94cd-3209-4d7e-803b-85f122d3800b-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-csjlt\" (UID: \"41fb94cd-3209-4d7e-803b-85f122d3800b\") " pod="openstack/telemetry-openstack-openstack-cell1-csjlt" Oct 07 14:43:34 crc kubenswrapper[4854]: I1007 14:43:34.049552 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/41fb94cd-3209-4d7e-803b-85f122d3800b-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-csjlt\" (UID: \"41fb94cd-3209-4d7e-803b-85f122d3800b\") " 
pod="openstack/telemetry-openstack-openstack-cell1-csjlt" Oct 07 14:43:34 crc kubenswrapper[4854]: I1007 14:43:34.049945 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41fb94cd-3209-4d7e-803b-85f122d3800b-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-csjlt\" (UID: \"41fb94cd-3209-4d7e-803b-85f122d3800b\") " pod="openstack/telemetry-openstack-openstack-cell1-csjlt" Oct 07 14:43:34 crc kubenswrapper[4854]: I1007 14:43:34.069452 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktc5p\" (UniqueName: \"kubernetes.io/projected/41fb94cd-3209-4d7e-803b-85f122d3800b-kube-api-access-ktc5p\") pod \"telemetry-openstack-openstack-cell1-csjlt\" (UID: \"41fb94cd-3209-4d7e-803b-85f122d3800b\") " pod="openstack/telemetry-openstack-openstack-cell1-csjlt" Oct 07 14:43:34 crc kubenswrapper[4854]: I1007 14:43:34.193063 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-csjlt" Oct 07 14:43:34 crc kubenswrapper[4854]: I1007 14:43:34.581779 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-csjlt"] Oct 07 14:43:34 crc kubenswrapper[4854]: W1007 14:43:34.586949 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41fb94cd_3209_4d7e_803b_85f122d3800b.slice/crio-e6ba914d0b8495aa05911e8dcaa34cc8149db3817a6719f31713bbb6ee824b2c WatchSource:0}: Error finding container e6ba914d0b8495aa05911e8dcaa34cc8149db3817a6719f31713bbb6ee824b2c: Status 404 returned error can't find the container with id e6ba914d0b8495aa05911e8dcaa34cc8149db3817a6719f31713bbb6ee824b2c Oct 07 14:43:34 crc kubenswrapper[4854]: I1007 14:43:34.720868 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="085d10f5-39c8-469c-beef-02158cccf12b" path="/var/lib/kubelet/pods/085d10f5-39c8-469c-beef-02158cccf12b/volumes" Oct 07 14:43:34 crc kubenswrapper[4854]: I1007 14:43:34.725391 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-csjlt" event={"ID":"41fb94cd-3209-4d7e-803b-85f122d3800b","Type":"ContainerStarted","Data":"e6ba914d0b8495aa05911e8dcaa34cc8149db3817a6719f31713bbb6ee824b2c"} Oct 07 14:43:35 crc kubenswrapper[4854]: I1007 14:43:35.718566 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-csjlt" event={"ID":"41fb94cd-3209-4d7e-803b-85f122d3800b","Type":"ContainerStarted","Data":"323b003372963da8f913442c3af5e94520ce56838bbdea4730bb42080f1ce0c5"} Oct 07 14:43:35 crc kubenswrapper[4854]: I1007 14:43:35.752793 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-openstack-openstack-cell1-csjlt" podStartSLOduration=2.168563231 podStartE2EDuration="2.752775476s" podCreationTimestamp="2025-10-07 14:43:33 +0000 UTC" firstStartedPulling="2025-10-07 14:43:34.591283245 +0000 UTC m=+8330.579115500" lastFinishedPulling="2025-10-07 14:43:35.17549548 +0000 UTC m=+8331.163327745" observedRunningTime="2025-10-07 14:43:35.751693785 +0000 UTC m=+8331.739526060" watchObservedRunningTime="2025-10-07 14:43:35.752775476 +0000 UTC m=+8331.740607731" Oct 07 14:43:43 crc kubenswrapper[4854]: I1007 14:43:43.702226 4854 scope.go:117] "RemoveContainer" containerID="390e57b5d82f9be820f7fa4a2a67b06934642922575e8b6606f5dbf6a1870462" Oct 
07 14:43:43 crc kubenswrapper[4854]: E1007 14:43:43.702832 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:43:58 crc kubenswrapper[4854]: I1007 14:43:58.704174 4854 scope.go:117] "RemoveContainer" containerID="390e57b5d82f9be820f7fa4a2a67b06934642922575e8b6606f5dbf6a1870462" Oct 07 14:43:58 crc kubenswrapper[4854]: E1007 14:43:58.705289 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:44:13 crc kubenswrapper[4854]: I1007 14:44:13.703592 4854 scope.go:117] "RemoveContainer" containerID="390e57b5d82f9be820f7fa4a2a67b06934642922575e8b6606f5dbf6a1870462" Oct 07 14:44:13 crc kubenswrapper[4854]: E1007 14:44:13.704562 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:44:28 crc kubenswrapper[4854]: I1007 14:44:28.703506 4854 scope.go:117] "RemoveContainer" containerID="390e57b5d82f9be820f7fa4a2a67b06934642922575e8b6606f5dbf6a1870462" Oct 07 14:44:28 crc kubenswrapper[4854]: E1007 14:44:28.704283 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:44:42 crc kubenswrapper[4854]: I1007 14:44:42.703817 4854 scope.go:117] "RemoveContainer" containerID="390e57b5d82f9be820f7fa4a2a67b06934642922575e8b6606f5dbf6a1870462" Oct 07 14:44:42 crc kubenswrapper[4854]: E1007 14:44:42.705116 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:44:55 crc kubenswrapper[4854]: I1007 14:44:55.703082 4854 scope.go:117] "RemoveContainer" containerID="390e57b5d82f9be820f7fa4a2a67b06934642922575e8b6606f5dbf6a1870462" Oct 07 14:44:55 crc kubenswrapper[4854]: E1007 14:44:55.703903 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:44:58 crc kubenswrapper[4854]: I1007 14:44:58.849034 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pwngw"] Oct 07 14:44:58 crc kubenswrapper[4854]: I1007 14:44:58.852721 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pwngw" Oct 07 14:44:58 crc kubenswrapper[4854]: I1007 14:44:58.859209 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pwngw"] Oct 07 14:44:58 crc kubenswrapper[4854]: I1007 14:44:58.952262 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25fb4487-40ff-4da4-bb16-2c35d61f27e0-utilities\") pod \"redhat-marketplace-pwngw\" (UID: \"25fb4487-40ff-4da4-bb16-2c35d61f27e0\") " pod="openshift-marketplace/redhat-marketplace-pwngw" Oct 07 14:44:58 crc kubenswrapper[4854]: I1007 14:44:58.952388 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25fb4487-40ff-4da4-bb16-2c35d61f27e0-catalog-content\") pod \"redhat-marketplace-pwngw\" (UID: \"25fb4487-40ff-4da4-bb16-2c35d61f27e0\") " pod="openshift-marketplace/redhat-marketplace-pwngw" Oct 07 14:44:58 crc kubenswrapper[4854]: I1007 14:44:58.952639 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzvpn\" (UniqueName: \"kubernetes.io/projected/25fb4487-40ff-4da4-bb16-2c35d61f27e0-kube-api-access-vzvpn\") pod \"redhat-marketplace-pwngw\" (UID: \"25fb4487-40ff-4da4-bb16-2c35d61f27e0\") " pod="openshift-marketplace/redhat-marketplace-pwngw" Oct 07 14:44:59 crc kubenswrapper[4854]: I1007 14:44:59.055109 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25fb4487-40ff-4da4-bb16-2c35d61f27e0-utilities\") pod \"redhat-marketplace-pwngw\" (UID: \"25fb4487-40ff-4da4-bb16-2c35d61f27e0\") " pod="openshift-marketplace/redhat-marketplace-pwngw" Oct 07 14:44:59 crc kubenswrapper[4854]: I1007 14:44:59.055254 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25fb4487-40ff-4da4-bb16-2c35d61f27e0-catalog-content\") pod \"redhat-marketplace-pwngw\" (UID: \"25fb4487-40ff-4da4-bb16-2c35d61f27e0\") " pod="openshift-marketplace/redhat-marketplace-pwngw" Oct 07 14:44:59 crc kubenswrapper[4854]: I1007 14:44:59.055383 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzvpn\" (UniqueName: \"kubernetes.io/projected/25fb4487-40ff-4da4-bb16-2c35d61f27e0-kube-api-access-vzvpn\") pod \"redhat-marketplace-pwngw\" (UID: \"25fb4487-40ff-4da4-bb16-2c35d61f27e0\") " pod="openshift-marketplace/redhat-marketplace-pwngw" Oct 07 14:44:59 crc kubenswrapper[4854]: I1007 14:44:59.055709 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25fb4487-40ff-4da4-bb16-2c35d61f27e0-utilities\") pod \"redhat-marketplace-pwngw\" (UID: 
\"25fb4487-40ff-4da4-bb16-2c35d61f27e0\") " pod="openshift-marketplace/redhat-marketplace-pwngw" Oct 07 14:44:59 crc kubenswrapper[4854]: I1007 14:44:59.055799 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25fb4487-40ff-4da4-bb16-2c35d61f27e0-catalog-content\") pod \"redhat-marketplace-pwngw\" (UID: \"25fb4487-40ff-4da4-bb16-2c35d61f27e0\") " pod="openshift-marketplace/redhat-marketplace-pwngw" Oct 07 14:44:59 crc kubenswrapper[4854]: I1007 14:44:59.078774 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzvpn\" (UniqueName: \"kubernetes.io/projected/25fb4487-40ff-4da4-bb16-2c35d61f27e0-kube-api-access-vzvpn\") pod \"redhat-marketplace-pwngw\" (UID: \"25fb4487-40ff-4da4-bb16-2c35d61f27e0\") " pod="openshift-marketplace/redhat-marketplace-pwngw" Oct 07 14:44:59 crc kubenswrapper[4854]: I1007 14:44:59.180206 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pwngw" Oct 07 14:44:59 crc kubenswrapper[4854]: I1007 14:44:59.658715 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pwngw"] Oct 07 14:45:00 crc kubenswrapper[4854]: I1007 14:45:00.159362 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330805-p5hwf"] Oct 07 14:45:00 crc kubenswrapper[4854]: I1007 14:45:00.161797 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330805-p5hwf" Oct 07 14:45:00 crc kubenswrapper[4854]: I1007 14:45:00.163696 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 07 14:45:00 crc kubenswrapper[4854]: I1007 14:45:00.164307 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 07 14:45:00 crc kubenswrapper[4854]: I1007 14:45:00.180885 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330805-p5hwf"] Oct 07 14:45:00 crc kubenswrapper[4854]: I1007 14:45:00.182365 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f2e60073-22ba-465a-bfe2-049b4a1e4630-secret-volume\") pod \"collect-profiles-29330805-p5hwf\" (UID: \"f2e60073-22ba-465a-bfe2-049b4a1e4630\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330805-p5hwf" Oct 07 14:45:00 crc kubenswrapper[4854]: I1007 14:45:00.182535 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgpnx\" (UniqueName: \"kubernetes.io/projected/f2e60073-22ba-465a-bfe2-049b4a1e4630-kube-api-access-pgpnx\") pod \"collect-profiles-29330805-p5hwf\" (UID: \"f2e60073-22ba-465a-bfe2-049b4a1e4630\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330805-p5hwf" Oct 07 14:45:00 crc kubenswrapper[4854]: I1007 14:45:00.182578 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f2e60073-22ba-465a-bfe2-049b4a1e4630-config-volume\") pod \"collect-profiles-29330805-p5hwf\" (UID: \"f2e60073-22ba-465a-bfe2-049b4a1e4630\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29330805-p5hwf" Oct 07 14:45:00 crc kubenswrapper[4854]: I1007 14:45:00.284541 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f2e60073-22ba-465a-bfe2-049b4a1e4630-secret-volume\") pod \"collect-profiles-29330805-p5hwf\" (UID: \"f2e60073-22ba-465a-bfe2-049b4a1e4630\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330805-p5hwf" Oct 07 14:45:00 crc kubenswrapper[4854]: I1007 14:45:00.284678 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgpnx\" (UniqueName: \"kubernetes.io/projected/f2e60073-22ba-465a-bfe2-049b4a1e4630-kube-api-access-pgpnx\") pod \"collect-profiles-29330805-p5hwf\" (UID: \"f2e60073-22ba-465a-bfe2-049b4a1e4630\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330805-p5hwf" Oct 07 14:45:00 crc kubenswrapper[4854]: I1007 14:45:00.284705 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f2e60073-22ba-465a-bfe2-049b4a1e4630-config-volume\") pod \"collect-profiles-29330805-p5hwf\" (UID: \"f2e60073-22ba-465a-bfe2-049b4a1e4630\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330805-p5hwf" Oct 07 14:45:00 crc kubenswrapper[4854]: I1007 14:45:00.285770 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f2e60073-22ba-465a-bfe2-049b4a1e4630-config-volume\") pod \"collect-profiles-29330805-p5hwf\" (UID: \"f2e60073-22ba-465a-bfe2-049b4a1e4630\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330805-p5hwf" Oct 07 14:45:00 crc kubenswrapper[4854]: I1007 14:45:00.299446 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f2e60073-22ba-465a-bfe2-049b4a1e4630-secret-volume\") pod \"collect-profiles-29330805-p5hwf\" (UID: \"f2e60073-22ba-465a-bfe2-049b4a1e4630\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330805-p5hwf" Oct 07 14:45:00 crc kubenswrapper[4854]: I1007 14:45:00.301341 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgpnx\" (UniqueName: \"kubernetes.io/projected/f2e60073-22ba-465a-bfe2-049b4a1e4630-kube-api-access-pgpnx\") pod \"collect-profiles-29330805-p5hwf\" (UID: \"f2e60073-22ba-465a-bfe2-049b4a1e4630\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330805-p5hwf" Oct 07 14:45:00 crc kubenswrapper[4854]: I1007 14:45:00.499566 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330805-p5hwf" Oct 07 14:45:00 crc kubenswrapper[4854]: I1007 14:45:00.647169 4854 generic.go:334] "Generic (PLEG): container finished" podID="25fb4487-40ff-4da4-bb16-2c35d61f27e0" containerID="08c642d9922d3fef7bd0b762788b54e933e12f0010ba8955045df04eacf72b96" exitCode=0 Oct 07 14:45:00 crc kubenswrapper[4854]: I1007 14:45:00.647331 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pwngw" event={"ID":"25fb4487-40ff-4da4-bb16-2c35d61f27e0","Type":"ContainerDied","Data":"08c642d9922d3fef7bd0b762788b54e933e12f0010ba8955045df04eacf72b96"} Oct 07 14:45:00 crc kubenswrapper[4854]: I1007 14:45:00.647436 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pwngw" event={"ID":"25fb4487-40ff-4da4-bb16-2c35d61f27e0","Type":"ContainerStarted","Data":"585c232e34b0c095d82ae9b690c55e8195d3cfc2de19e346568933dd23ffd2b2"} Oct 07 14:45:00 crc kubenswrapper[4854]: I1007 14:45:00.975093 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330805-p5hwf"] Oct 07 14:45:00 crc kubenswrapper[4854]: W1007 14:45:00.978782 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2e60073_22ba_465a_bfe2_049b4a1e4630.slice/crio-a14f11149232f7ec5d6958cd3bfe2d16d0a5123232d0a9e3762ddab410b59659 WatchSource:0}: Error finding container a14f11149232f7ec5d6958cd3bfe2d16d0a5123232d0a9e3762ddab410b59659: Status 404 returned error can't find the container with id a14f11149232f7ec5d6958cd3bfe2d16d0a5123232d0a9e3762ddab410b59659 Oct 07 14:45:01 crc kubenswrapper[4854]: I1007 14:45:01.664770 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pwngw" event={"ID":"25fb4487-40ff-4da4-bb16-2c35d61f27e0","Type":"ContainerStarted","Data":"5b03ee388c5d2b89e101cce224f7a5c59ee650ad24fe2d7720b1c30cf484e945"} Oct 07 14:45:01 crc kubenswrapper[4854]: I1007 14:45:01.668935 4854 generic.go:334] "Generic (PLEG): container finished" podID="f2e60073-22ba-465a-bfe2-049b4a1e4630" containerID="c925f7e29a3d89b67b8e9add5c9cc72dfb3d147496814df29266e74b3a318484" exitCode=0 Oct 07 14:45:01 crc kubenswrapper[4854]: I1007 14:45:01.668981 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330805-p5hwf" event={"ID":"f2e60073-22ba-465a-bfe2-049b4a1e4630","Type":"ContainerDied","Data":"c925f7e29a3d89b67b8e9add5c9cc72dfb3d147496814df29266e74b3a318484"} Oct 07 14:45:01 crc kubenswrapper[4854]: I1007 14:45:01.669001 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330805-p5hwf" event={"ID":"f2e60073-22ba-465a-bfe2-049b4a1e4630","Type":"ContainerStarted","Data":"a14f11149232f7ec5d6958cd3bfe2d16d0a5123232d0a9e3762ddab410b59659"} Oct 07 14:45:02 crc kubenswrapper[4854]: I1007 14:45:02.683120 4854 generic.go:334] "Generic (PLEG): container finished" podID="25fb4487-40ff-4da4-bb16-2c35d61f27e0" containerID="5b03ee388c5d2b89e101cce224f7a5c59ee650ad24fe2d7720b1c30cf484e945" exitCode=0 Oct 07 14:45:02 crc kubenswrapper[4854]: I1007 14:45:02.683972 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pwngw" 
event={"ID":"25fb4487-40ff-4da4-bb16-2c35d61f27e0","Type":"ContainerDied","Data":"5b03ee388c5d2b89e101cce224f7a5c59ee650ad24fe2d7720b1c30cf484e945"} Oct 07 14:45:03 crc kubenswrapper[4854]: I1007 14:45:03.092958 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330805-p5hwf" Oct 07 14:45:03 crc kubenswrapper[4854]: I1007 14:45:03.150441 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgpnx\" (UniqueName: \"kubernetes.io/projected/f2e60073-22ba-465a-bfe2-049b4a1e4630-kube-api-access-pgpnx\") pod \"f2e60073-22ba-465a-bfe2-049b4a1e4630\" (UID: \"f2e60073-22ba-465a-bfe2-049b4a1e4630\") " Oct 07 14:45:03 crc kubenswrapper[4854]: I1007 14:45:03.150821 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f2e60073-22ba-465a-bfe2-049b4a1e4630-config-volume\") pod \"f2e60073-22ba-465a-bfe2-049b4a1e4630\" (UID: \"f2e60073-22ba-465a-bfe2-049b4a1e4630\") " Oct 07 14:45:03 crc kubenswrapper[4854]: I1007 14:45:03.150895 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f2e60073-22ba-465a-bfe2-049b4a1e4630-secret-volume\") pod \"f2e60073-22ba-465a-bfe2-049b4a1e4630\" (UID: \"f2e60073-22ba-465a-bfe2-049b4a1e4630\") " Oct 07 14:45:03 crc kubenswrapper[4854]: I1007 14:45:03.152847 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2e60073-22ba-465a-bfe2-049b4a1e4630-config-volume" (OuterVolumeSpecName: "config-volume") pod "f2e60073-22ba-465a-bfe2-049b4a1e4630" (UID: "f2e60073-22ba-465a-bfe2-049b4a1e4630"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:45:03 crc kubenswrapper[4854]: I1007 14:45:03.163597 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2e60073-22ba-465a-bfe2-049b4a1e4630-kube-api-access-pgpnx" (OuterVolumeSpecName: "kube-api-access-pgpnx") pod "f2e60073-22ba-465a-bfe2-049b4a1e4630" (UID: "f2e60073-22ba-465a-bfe2-049b4a1e4630"). InnerVolumeSpecName "kube-api-access-pgpnx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:45:03 crc kubenswrapper[4854]: I1007 14:45:03.164496 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2e60073-22ba-465a-bfe2-049b4a1e4630-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f2e60073-22ba-465a-bfe2-049b4a1e4630" (UID: "f2e60073-22ba-465a-bfe2-049b4a1e4630"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:45:03 crc kubenswrapper[4854]: I1007 14:45:03.254173 4854 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f2e60073-22ba-465a-bfe2-049b4a1e4630-config-volume\") on node \"crc\" DevicePath \"\"" Oct 07 14:45:03 crc kubenswrapper[4854]: I1007 14:45:03.254446 4854 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f2e60073-22ba-465a-bfe2-049b4a1e4630-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 07 14:45:03 crc kubenswrapper[4854]: I1007 14:45:03.254574 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pgpnx\" (UniqueName: \"kubernetes.io/projected/f2e60073-22ba-465a-bfe2-049b4a1e4630-kube-api-access-pgpnx\") on node \"crc\" DevicePath \"\"" Oct 07 14:45:03 crc kubenswrapper[4854]: I1007 14:45:03.698719 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pwngw" event={"ID":"25fb4487-40ff-4da4-bb16-2c35d61f27e0","Type":"ContainerStarted","Data":"cc800f53b0eb5985daeacbc8937315e7df388f500ca79ab3996eca52ee048f6c"} Oct 07 14:45:03 crc kubenswrapper[4854]: I1007 14:45:03.700820 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330805-p5hwf" event={"ID":"f2e60073-22ba-465a-bfe2-049b4a1e4630","Type":"ContainerDied","Data":"a14f11149232f7ec5d6958cd3bfe2d16d0a5123232d0a9e3762ddab410b59659"} Oct 07 14:45:03 crc kubenswrapper[4854]: I1007 14:45:03.700878 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a14f11149232f7ec5d6958cd3bfe2d16d0a5123232d0a9e3762ddab410b59659" Oct 07 14:45:03 crc kubenswrapper[4854]: I1007 14:45:03.700904 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330805-p5hwf" Oct 07 14:45:03 crc kubenswrapper[4854]: I1007 14:45:03.732123 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pwngw" podStartSLOduration=3.200702389 podStartE2EDuration="5.73209722s" podCreationTimestamp="2025-10-07 14:44:58 +0000 UTC" firstStartedPulling="2025-10-07 14:45:00.649766115 +0000 UTC m=+8416.637598370" lastFinishedPulling="2025-10-07 14:45:03.181160946 +0000 UTC m=+8419.168993201" observedRunningTime="2025-10-07 14:45:03.725829479 +0000 UTC m=+8419.713661784" watchObservedRunningTime="2025-10-07 14:45:03.73209722 +0000 UTC m=+8419.719929485" Oct 07 14:45:04 crc kubenswrapper[4854]: I1007 14:45:04.178009 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330760-vtxjh"] Oct 07 14:45:04 crc kubenswrapper[4854]: I1007 14:45:04.188420 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330760-vtxjh"] Oct 07 14:45:04 crc kubenswrapper[4854]: I1007 14:45:04.714895 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a28dc48-91f4-4fc6-854d-058aab9daf21" path="/var/lib/kubelet/pods/6a28dc48-91f4-4fc6-854d-058aab9daf21/volumes" Oct 07 14:45:08 crc kubenswrapper[4854]: I1007 14:45:08.703920 4854 scope.go:117] "RemoveContainer" containerID="390e57b5d82f9be820f7fa4a2a67b06934642922575e8b6606f5dbf6a1870462" Oct 07 14:45:08 crc kubenswrapper[4854]: E1007 14:45:08.704947 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:45:09 crc kubenswrapper[4854]: I1007 14:45:09.182736 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pwngw" Oct 07 14:45:09 crc kubenswrapper[4854]: I1007 14:45:09.183660 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pwngw" Oct 07 14:45:09 crc kubenswrapper[4854]: I1007 14:45:09.254231 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pwngw" Oct 07 14:45:09 crc kubenswrapper[4854]: I1007 14:45:09.826821 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pwngw" Oct 07 14:45:15 crc kubenswrapper[4854]: I1007 14:45:15.212519 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pwngw"] Oct 07 14:45:15 crc kubenswrapper[4854]: I1007 14:45:15.213366 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pwngw" podUID="25fb4487-40ff-4da4-bb16-2c35d61f27e0" containerName="registry-server" containerID="cri-o://cc800f53b0eb5985daeacbc8937315e7df388f500ca79ab3996eca52ee048f6c" gracePeriod=2 Oct 07 14:45:15 crc kubenswrapper[4854]: I1007 14:45:15.666797 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pwngw" Oct 07 14:45:15 crc kubenswrapper[4854]: I1007 14:45:15.832453 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25fb4487-40ff-4da4-bb16-2c35d61f27e0-catalog-content\") pod \"25fb4487-40ff-4da4-bb16-2c35d61f27e0\" (UID: \"25fb4487-40ff-4da4-bb16-2c35d61f27e0\") " Oct 07 14:45:15 crc kubenswrapper[4854]: I1007 14:45:15.832645 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25fb4487-40ff-4da4-bb16-2c35d61f27e0-utilities\") pod \"25fb4487-40ff-4da4-bb16-2c35d61f27e0\" (UID: \"25fb4487-40ff-4da4-bb16-2c35d61f27e0\") " Oct 07 14:45:15 crc kubenswrapper[4854]: I1007 14:45:15.832679 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzvpn\" (UniqueName: \"kubernetes.io/projected/25fb4487-40ff-4da4-bb16-2c35d61f27e0-kube-api-access-vzvpn\") pod \"25fb4487-40ff-4da4-bb16-2c35d61f27e0\" (UID: \"25fb4487-40ff-4da4-bb16-2c35d61f27e0\") " Oct 07 14:45:15 crc kubenswrapper[4854]: I1007 14:45:15.834980 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25fb4487-40ff-4da4-bb16-2c35d61f27e0-utilities" (OuterVolumeSpecName: "utilities") pod "25fb4487-40ff-4da4-bb16-2c35d61f27e0" (UID: "25fb4487-40ff-4da4-bb16-2c35d61f27e0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:45:15 crc kubenswrapper[4854]: I1007 14:45:15.853114 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25fb4487-40ff-4da4-bb16-2c35d61f27e0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "25fb4487-40ff-4da4-bb16-2c35d61f27e0" (UID: "25fb4487-40ff-4da4-bb16-2c35d61f27e0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:45:15 crc kubenswrapper[4854]: I1007 14:45:15.859504 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25fb4487-40ff-4da4-bb16-2c35d61f27e0-kube-api-access-vzvpn" (OuterVolumeSpecName: "kube-api-access-vzvpn") pod "25fb4487-40ff-4da4-bb16-2c35d61f27e0" (UID: "25fb4487-40ff-4da4-bb16-2c35d61f27e0"). InnerVolumeSpecName "kube-api-access-vzvpn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:45:15 crc kubenswrapper[4854]: I1007 14:45:15.861644 4854 generic.go:334] "Generic (PLEG): container finished" podID="25fb4487-40ff-4da4-bb16-2c35d61f27e0" containerID="cc800f53b0eb5985daeacbc8937315e7df388f500ca79ab3996eca52ee048f6c" exitCode=0 Oct 07 14:45:15 crc kubenswrapper[4854]: I1007 14:45:15.861686 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pwngw" event={"ID":"25fb4487-40ff-4da4-bb16-2c35d61f27e0","Type":"ContainerDied","Data":"cc800f53b0eb5985daeacbc8937315e7df388f500ca79ab3996eca52ee048f6c"} Oct 07 14:45:15 crc kubenswrapper[4854]: I1007 14:45:15.861715 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pwngw" event={"ID":"25fb4487-40ff-4da4-bb16-2c35d61f27e0","Type":"ContainerDied","Data":"585c232e34b0c095d82ae9b690c55e8195d3cfc2de19e346568933dd23ffd2b2"} Oct 07 14:45:15 crc kubenswrapper[4854]: I1007 14:45:15.861745 4854 scope.go:117] "RemoveContainer" containerID="cc800f53b0eb5985daeacbc8937315e7df388f500ca79ab3996eca52ee048f6c" Oct 07 14:45:15 crc kubenswrapper[4854]: I1007 14:45:15.861741 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pwngw" Oct 07 14:45:15 crc kubenswrapper[4854]: I1007 14:45:15.915222 4854 scope.go:117] "RemoveContainer" containerID="5b03ee388c5d2b89e101cce224f7a5c59ee650ad24fe2d7720b1c30cf484e945" Oct 07 14:45:15 crc kubenswrapper[4854]: I1007 14:45:15.923964 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pwngw"] Oct 07 14:45:15 crc kubenswrapper[4854]: I1007 14:45:15.935230 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25fb4487-40ff-4da4-bb16-2c35d61f27e0-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 14:45:15 crc kubenswrapper[4854]: I1007 14:45:15.935276 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25fb4487-40ff-4da4-bb16-2c35d61f27e0-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 14:45:15 crc kubenswrapper[4854]: I1007 14:45:15.935289 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzvpn\" (UniqueName: \"kubernetes.io/projected/25fb4487-40ff-4da4-bb16-2c35d61f27e0-kube-api-access-vzvpn\") on node \"crc\" DevicePath \"\"" Oct 07 14:45:15 crc kubenswrapper[4854]: I1007 14:45:15.936805 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pwngw"] Oct 07 14:45:15 crc kubenswrapper[4854]: I1007 14:45:15.940340 4854 scope.go:117] "RemoveContainer" containerID="08c642d9922d3fef7bd0b762788b54e933e12f0010ba8955045df04eacf72b96" Oct 07 14:45:15 crc kubenswrapper[4854]: I1007 14:45:15.995112 4854 scope.go:117] "RemoveContainer" containerID="cc800f53b0eb5985daeacbc8937315e7df388f500ca79ab3996eca52ee048f6c" Oct 07 14:45:15 crc kubenswrapper[4854]: E1007 14:45:15.996291 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc800f53b0eb5985daeacbc8937315e7df388f500ca79ab3996eca52ee048f6c\": container with ID starting with cc800f53b0eb5985daeacbc8937315e7df388f500ca79ab3996eca52ee048f6c not found: ID does not exist" containerID="cc800f53b0eb5985daeacbc8937315e7df388f500ca79ab3996eca52ee048f6c" Oct 07 14:45:15 crc kubenswrapper[4854]: I1007 14:45:15.996331 4854 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc800f53b0eb5985daeacbc8937315e7df388f500ca79ab3996eca52ee048f6c"} err="failed to get container status \"cc800f53b0eb5985daeacbc8937315e7df388f500ca79ab3996eca52ee048f6c\": rpc error: code = NotFound desc = could not find container \"cc800f53b0eb5985daeacbc8937315e7df388f500ca79ab3996eca52ee048f6c\": container with ID starting with cc800f53b0eb5985daeacbc8937315e7df388f500ca79ab3996eca52ee048f6c not found: ID does not exist" Oct 07 14:45:15 crc kubenswrapper[4854]: I1007 14:45:15.996359 4854 scope.go:117] "RemoveContainer" containerID="5b03ee388c5d2b89e101cce224f7a5c59ee650ad24fe2d7720b1c30cf484e945" Oct 07 14:45:15 crc kubenswrapper[4854]: E1007 14:45:15.996744 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b03ee388c5d2b89e101cce224f7a5c59ee650ad24fe2d7720b1c30cf484e945\": container with ID starting with 5b03ee388c5d2b89e101cce224f7a5c59ee650ad24fe2d7720b1c30cf484e945 not found: ID does not exist" containerID="5b03ee388c5d2b89e101cce224f7a5c59ee650ad24fe2d7720b1c30cf484e945" Oct 07 14:45:15 crc kubenswrapper[4854]: I1007 14:45:15.996775 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b03ee388c5d2b89e101cce224f7a5c59ee650ad24fe2d7720b1c30cf484e945"} err="failed to get container status \"5b03ee388c5d2b89e101cce224f7a5c59ee650ad24fe2d7720b1c30cf484e945\": rpc error: code = NotFound desc = could not find container \"5b03ee388c5d2b89e101cce224f7a5c59ee650ad24fe2d7720b1c30cf484e945\": container with ID starting with 5b03ee388c5d2b89e101cce224f7a5c59ee650ad24fe2d7720b1c30cf484e945 not found: ID does not exist" Oct 07 14:45:15 crc kubenswrapper[4854]: I1007 14:45:15.996793 4854 scope.go:117] "RemoveContainer" containerID="08c642d9922d3fef7bd0b762788b54e933e12f0010ba8955045df04eacf72b96" Oct 07 14:45:15 crc kubenswrapper[4854]: E1007 14:45:15.997181 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08c642d9922d3fef7bd0b762788b54e933e12f0010ba8955045df04eacf72b96\": container with ID starting with 08c642d9922d3fef7bd0b762788b54e933e12f0010ba8955045df04eacf72b96 not found: ID does not exist" containerID="08c642d9922d3fef7bd0b762788b54e933e12f0010ba8955045df04eacf72b96" Oct 07 14:45:15 crc kubenswrapper[4854]: I1007 14:45:15.997913 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08c642d9922d3fef7bd0b762788b54e933e12f0010ba8955045df04eacf72b96"} err="failed to get container status \"08c642d9922d3fef7bd0b762788b54e933e12f0010ba8955045df04eacf72b96\": rpc error: code = NotFound desc = could not find container \"08c642d9922d3fef7bd0b762788b54e933e12f0010ba8955045df04eacf72b96\": container with ID starting with 08c642d9922d3fef7bd0b762788b54e933e12f0010ba8955045df04eacf72b96 not found: ID does not exist" Oct 07 14:45:16 crc kubenswrapper[4854]: I1007 14:45:16.714974 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25fb4487-40ff-4da4-bb16-2c35d61f27e0" path="/var/lib/kubelet/pods/25fb4487-40ff-4da4-bb16-2c35d61f27e0/volumes" Oct 07 14:45:23 crc kubenswrapper[4854]: I1007 14:45:23.710707 4854 scope.go:117] "RemoveContainer" containerID="390e57b5d82f9be820f7fa4a2a67b06934642922575e8b6606f5dbf6a1870462" Oct 07 14:45:23 crc kubenswrapper[4854]: E1007 14:45:23.713879 4854 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:45:28 crc kubenswrapper[4854]: I1007 14:45:28.773667 4854 scope.go:117] "RemoveContainer" containerID="f84e1c192cd00627c4033c462e4e78c3cd6da4dcf31eeb4d0396300e389306fd" Oct 07 14:45:36 crc kubenswrapper[4854]: I1007 14:45:36.702823 4854 scope.go:117] "RemoveContainer" containerID="390e57b5d82f9be820f7fa4a2a67b06934642922575e8b6606f5dbf6a1870462" Oct 07 14:45:36 crc kubenswrapper[4854]: E1007 14:45:36.703554 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:45:47 crc kubenswrapper[4854]: I1007 14:45:47.709495 4854 scope.go:117] "RemoveContainer" containerID="390e57b5d82f9be820f7fa4a2a67b06934642922575e8b6606f5dbf6a1870462" Oct 07 14:45:48 crc kubenswrapper[4854]: I1007 14:45:48.262077 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" event={"ID":"40b8b82d-cfd5-41d7-8673-5774db092c85","Type":"ContainerStarted","Data":"3a01cf9fd789340dd73e9f8543d33a8348cf73c39951103b77b25055e06fb867"} Oct 07 14:46:01 crc kubenswrapper[4854]: I1007 14:46:01.086210 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kzgbz"] Oct 07 14:46:01 crc kubenswrapper[4854]: E1007 14:46:01.087167 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25fb4487-40ff-4da4-bb16-2c35d61f27e0" containerName="registry-server" Oct 07 14:46:01 crc kubenswrapper[4854]: I1007 14:46:01.087182 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="25fb4487-40ff-4da4-bb16-2c35d61f27e0" containerName="registry-server" Oct 07 14:46:01 crc kubenswrapper[4854]: E1007 14:46:01.087209 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25fb4487-40ff-4da4-bb16-2c35d61f27e0" containerName="extract-content" Oct 07 14:46:01 crc kubenswrapper[4854]: I1007 14:46:01.087216 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="25fb4487-40ff-4da4-bb16-2c35d61f27e0" containerName="extract-content" Oct 07 14:46:01 crc kubenswrapper[4854]: E1007 14:46:01.087235 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2e60073-22ba-465a-bfe2-049b4a1e4630" containerName="collect-profiles" Oct 07 14:46:01 crc kubenswrapper[4854]: I1007 14:46:01.087243 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2e60073-22ba-465a-bfe2-049b4a1e4630" containerName="collect-profiles" Oct 07 14:46:01 crc kubenswrapper[4854]: E1007 14:46:01.087264 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25fb4487-40ff-4da4-bb16-2c35d61f27e0" containerName="extract-utilities" Oct 07 14:46:01 crc kubenswrapper[4854]: I1007 14:46:01.087272 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="25fb4487-40ff-4da4-bb16-2c35d61f27e0" containerName="extract-utilities" Oct 07 14:46:01 crc kubenswrapper[4854]: I1007 
14:46:01.087495 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="25fb4487-40ff-4da4-bb16-2c35d61f27e0" containerName="registry-server" Oct 07 14:46:01 crc kubenswrapper[4854]: I1007 14:46:01.087512 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2e60073-22ba-465a-bfe2-049b4a1e4630" containerName="collect-profiles" Oct 07 14:46:01 crc kubenswrapper[4854]: I1007 14:46:01.089380 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kzgbz" Oct 07 14:46:01 crc kubenswrapper[4854]: I1007 14:46:01.121472 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kzgbz"] Oct 07 14:46:01 crc kubenswrapper[4854]: I1007 14:46:01.144475 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85cf44bc-e281-4b64-b72a-0160f5eb107b-catalog-content\") pod \"community-operators-kzgbz\" (UID: \"85cf44bc-e281-4b64-b72a-0160f5eb107b\") " pod="openshift-marketplace/community-operators-kzgbz" Oct 07 14:46:01 crc kubenswrapper[4854]: I1007 14:46:01.144620 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6h2k\" (UniqueName: \"kubernetes.io/projected/85cf44bc-e281-4b64-b72a-0160f5eb107b-kube-api-access-p6h2k\") pod \"community-operators-kzgbz\" (UID: \"85cf44bc-e281-4b64-b72a-0160f5eb107b\") " pod="openshift-marketplace/community-operators-kzgbz" Oct 07 14:46:01 crc kubenswrapper[4854]: I1007 14:46:01.144679 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85cf44bc-e281-4b64-b72a-0160f5eb107b-utilities\") pod \"community-operators-kzgbz\" (UID: \"85cf44bc-e281-4b64-b72a-0160f5eb107b\") " pod="openshift-marketplace/community-operators-kzgbz" Oct 07 14:46:01 crc kubenswrapper[4854]: I1007 14:46:01.247472 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6h2k\" (UniqueName: \"kubernetes.io/projected/85cf44bc-e281-4b64-b72a-0160f5eb107b-kube-api-access-p6h2k\") pod \"community-operators-kzgbz\" (UID: \"85cf44bc-e281-4b64-b72a-0160f5eb107b\") " pod="openshift-marketplace/community-operators-kzgbz" Oct 07 14:46:01 crc kubenswrapper[4854]: I1007 14:46:01.247566 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85cf44bc-e281-4b64-b72a-0160f5eb107b-utilities\") pod \"community-operators-kzgbz\" (UID: \"85cf44bc-e281-4b64-b72a-0160f5eb107b\") " pod="openshift-marketplace/community-operators-kzgbz" Oct 07 14:46:01 crc kubenswrapper[4854]: I1007 14:46:01.247658 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85cf44bc-e281-4b64-b72a-0160f5eb107b-catalog-content\") pod \"community-operators-kzgbz\" (UID: \"85cf44bc-e281-4b64-b72a-0160f5eb107b\") " pod="openshift-marketplace/community-operators-kzgbz" Oct 07 14:46:01 crc kubenswrapper[4854]: I1007 14:46:01.248168 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85cf44bc-e281-4b64-b72a-0160f5eb107b-utilities\") pod \"community-operators-kzgbz\" (UID: \"85cf44bc-e281-4b64-b72a-0160f5eb107b\") " pod="openshift-marketplace/community-operators-kzgbz" Oct 07 
14:46:01 crc kubenswrapper[4854]: I1007 14:46:01.248230 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85cf44bc-e281-4b64-b72a-0160f5eb107b-catalog-content\") pod \"community-operators-kzgbz\" (UID: \"85cf44bc-e281-4b64-b72a-0160f5eb107b\") " pod="openshift-marketplace/community-operators-kzgbz" Oct 07 14:46:01 crc kubenswrapper[4854]: I1007 14:46:01.268224 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6h2k\" (UniqueName: \"kubernetes.io/projected/85cf44bc-e281-4b64-b72a-0160f5eb107b-kube-api-access-p6h2k\") pod \"community-operators-kzgbz\" (UID: \"85cf44bc-e281-4b64-b72a-0160f5eb107b\") " pod="openshift-marketplace/community-operators-kzgbz" Oct 07 14:46:01 crc kubenswrapper[4854]: I1007 14:46:01.422659 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kzgbz" Oct 07 14:46:02 crc kubenswrapper[4854]: I1007 14:46:02.029095 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kzgbz"] Oct 07 14:46:02 crc kubenswrapper[4854]: I1007 14:46:02.415601 4854 generic.go:334] "Generic (PLEG): container finished" podID="85cf44bc-e281-4b64-b72a-0160f5eb107b" containerID="85b20322af3315fabb7b54fe87faa2f85fb8cdc1dc646035b72a490ccbfd3381" exitCode=0 Oct 07 14:46:02 crc kubenswrapper[4854]: I1007 14:46:02.415816 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kzgbz" event={"ID":"85cf44bc-e281-4b64-b72a-0160f5eb107b","Type":"ContainerDied","Data":"85b20322af3315fabb7b54fe87faa2f85fb8cdc1dc646035b72a490ccbfd3381"} Oct 07 14:46:02 crc kubenswrapper[4854]: I1007 14:46:02.415887 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kzgbz" event={"ID":"85cf44bc-e281-4b64-b72a-0160f5eb107b","Type":"ContainerStarted","Data":"85747e8ef5d8b61a63961b39f84aed8e012d57042ed23e184684c929bf32403e"} Oct 07 14:46:03 crc kubenswrapper[4854]: I1007 14:46:03.435221 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kzgbz" event={"ID":"85cf44bc-e281-4b64-b72a-0160f5eb107b","Type":"ContainerStarted","Data":"5420b2d58a196621f4d1141e21b23cc00c7362a60b719c6e99ff78ff9d14eb6a"} Oct 07 14:46:04 crc kubenswrapper[4854]: I1007 14:46:04.451352 4854 generic.go:334] "Generic (PLEG): container finished" podID="85cf44bc-e281-4b64-b72a-0160f5eb107b" containerID="5420b2d58a196621f4d1141e21b23cc00c7362a60b719c6e99ff78ff9d14eb6a" exitCode=0 Oct 07 14:46:04 crc kubenswrapper[4854]: I1007 14:46:04.451422 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kzgbz" event={"ID":"85cf44bc-e281-4b64-b72a-0160f5eb107b","Type":"ContainerDied","Data":"5420b2d58a196621f4d1141e21b23cc00c7362a60b719c6e99ff78ff9d14eb6a"} Oct 07 14:46:05 crc kubenswrapper[4854]: I1007 14:46:05.468403 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kzgbz" event={"ID":"85cf44bc-e281-4b64-b72a-0160f5eb107b","Type":"ContainerStarted","Data":"eabb95203bac43f7d289b61d355847d0a54481ae1397e8271b5eb9c8b725a4fe"} Oct 07 14:46:05 crc kubenswrapper[4854]: I1007 14:46:05.487182 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kzgbz" podStartSLOduration=2.012135047 podStartE2EDuration="4.48716787s" 
podCreationTimestamp="2025-10-07 14:46:01 +0000 UTC" firstStartedPulling="2025-10-07 14:46:02.418347584 +0000 UTC m=+8478.406179839" lastFinishedPulling="2025-10-07 14:46:04.893380397 +0000 UTC m=+8480.881212662" observedRunningTime="2025-10-07 14:46:05.485863852 +0000 UTC m=+8481.473696107" watchObservedRunningTime="2025-10-07 14:46:05.48716787 +0000 UTC m=+8481.475000125" Oct 07 14:46:11 crc kubenswrapper[4854]: I1007 14:46:11.423990 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kzgbz" Oct 07 14:46:11 crc kubenswrapper[4854]: I1007 14:46:11.424633 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kzgbz" Oct 07 14:46:11 crc kubenswrapper[4854]: I1007 14:46:11.490407 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kzgbz" Oct 07 14:46:11 crc kubenswrapper[4854]: I1007 14:46:11.581439 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kzgbz" Oct 07 14:46:11 crc kubenswrapper[4854]: I1007 14:46:11.725371 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kzgbz"] Oct 07 14:46:13 crc kubenswrapper[4854]: I1007 14:46:13.548897 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kzgbz" podUID="85cf44bc-e281-4b64-b72a-0160f5eb107b" containerName="registry-server" containerID="cri-o://eabb95203bac43f7d289b61d355847d0a54481ae1397e8271b5eb9c8b725a4fe" gracePeriod=2 Oct 07 14:46:14 crc kubenswrapper[4854]: I1007 14:46:14.083478 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kzgbz" Oct 07 14:46:14 crc kubenswrapper[4854]: I1007 14:46:14.135448 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85cf44bc-e281-4b64-b72a-0160f5eb107b-catalog-content\") pod \"85cf44bc-e281-4b64-b72a-0160f5eb107b\" (UID: \"85cf44bc-e281-4b64-b72a-0160f5eb107b\") " Oct 07 14:46:14 crc kubenswrapper[4854]: I1007 14:46:14.135641 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6h2k\" (UniqueName: \"kubernetes.io/projected/85cf44bc-e281-4b64-b72a-0160f5eb107b-kube-api-access-p6h2k\") pod \"85cf44bc-e281-4b64-b72a-0160f5eb107b\" (UID: \"85cf44bc-e281-4b64-b72a-0160f5eb107b\") " Oct 07 14:46:14 crc kubenswrapper[4854]: I1007 14:46:14.135694 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85cf44bc-e281-4b64-b72a-0160f5eb107b-utilities\") pod \"85cf44bc-e281-4b64-b72a-0160f5eb107b\" (UID: \"85cf44bc-e281-4b64-b72a-0160f5eb107b\") " Oct 07 14:46:14 crc kubenswrapper[4854]: I1007 14:46:14.137463 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85cf44bc-e281-4b64-b72a-0160f5eb107b-utilities" (OuterVolumeSpecName: "utilities") pod "85cf44bc-e281-4b64-b72a-0160f5eb107b" (UID: "85cf44bc-e281-4b64-b72a-0160f5eb107b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:46:14 crc kubenswrapper[4854]: I1007 14:46:14.143378 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85cf44bc-e281-4b64-b72a-0160f5eb107b-kube-api-access-p6h2k" (OuterVolumeSpecName: "kube-api-access-p6h2k") pod "85cf44bc-e281-4b64-b72a-0160f5eb107b" (UID: "85cf44bc-e281-4b64-b72a-0160f5eb107b"). InnerVolumeSpecName "kube-api-access-p6h2k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:46:14 crc kubenswrapper[4854]: I1007 14:46:14.196107 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85cf44bc-e281-4b64-b72a-0160f5eb107b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "85cf44bc-e281-4b64-b72a-0160f5eb107b" (UID: "85cf44bc-e281-4b64-b72a-0160f5eb107b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:46:14 crc kubenswrapper[4854]: I1007 14:46:14.238654 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85cf44bc-e281-4b64-b72a-0160f5eb107b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 14:46:14 crc kubenswrapper[4854]: I1007 14:46:14.238706 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6h2k\" (UniqueName: \"kubernetes.io/projected/85cf44bc-e281-4b64-b72a-0160f5eb107b-kube-api-access-p6h2k\") on node \"crc\" DevicePath \"\"" Oct 07 14:46:14 crc kubenswrapper[4854]: I1007 14:46:14.238722 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85cf44bc-e281-4b64-b72a-0160f5eb107b-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 14:46:14 crc kubenswrapper[4854]: I1007 14:46:14.567588 4854 generic.go:334] "Generic (PLEG): container finished" podID="85cf44bc-e281-4b64-b72a-0160f5eb107b" containerID="eabb95203bac43f7d289b61d355847d0a54481ae1397e8271b5eb9c8b725a4fe" exitCode=0 Oct 07 14:46:14 crc kubenswrapper[4854]: I1007 14:46:14.567754 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kzgbz" event={"ID":"85cf44bc-e281-4b64-b72a-0160f5eb107b","Type":"ContainerDied","Data":"eabb95203bac43f7d289b61d355847d0a54481ae1397e8271b5eb9c8b725a4fe"} Oct 07 14:46:14 crc kubenswrapper[4854]: I1007 14:46:14.567833 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kzgbz" Oct 07 14:46:14 crc kubenswrapper[4854]: I1007 14:46:14.567861 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kzgbz" event={"ID":"85cf44bc-e281-4b64-b72a-0160f5eb107b","Type":"ContainerDied","Data":"85747e8ef5d8b61a63961b39f84aed8e012d57042ed23e184684c929bf32403e"} Oct 07 14:46:14 crc kubenswrapper[4854]: I1007 14:46:14.567883 4854 scope.go:117] "RemoveContainer" containerID="eabb95203bac43f7d289b61d355847d0a54481ae1397e8271b5eb9c8b725a4fe" Oct 07 14:46:14 crc kubenswrapper[4854]: I1007 14:46:14.590441 4854 scope.go:117] "RemoveContainer" containerID="5420b2d58a196621f4d1141e21b23cc00c7362a60b719c6e99ff78ff9d14eb6a" Oct 07 14:46:14 crc kubenswrapper[4854]: I1007 14:46:14.618817 4854 scope.go:117] "RemoveContainer" containerID="85b20322af3315fabb7b54fe87faa2f85fb8cdc1dc646035b72a490ccbfd3381" Oct 07 14:46:14 crc kubenswrapper[4854]: I1007 14:46:14.618944 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kzgbz"] Oct 07 14:46:14 crc kubenswrapper[4854]: I1007 14:46:14.629230 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kzgbz"] Oct 07 14:46:14 crc kubenswrapper[4854]: I1007 14:46:14.665600 4854 scope.go:117] "RemoveContainer" containerID="eabb95203bac43f7d289b61d355847d0a54481ae1397e8271b5eb9c8b725a4fe" Oct 07 14:46:14 crc kubenswrapper[4854]: E1007 14:46:14.665993 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eabb95203bac43f7d289b61d355847d0a54481ae1397e8271b5eb9c8b725a4fe\": container with ID starting with eabb95203bac43f7d289b61d355847d0a54481ae1397e8271b5eb9c8b725a4fe not found: ID does not exist" containerID="eabb95203bac43f7d289b61d355847d0a54481ae1397e8271b5eb9c8b725a4fe" Oct 07 14:46:14 crc kubenswrapper[4854]: I1007 14:46:14.666033 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eabb95203bac43f7d289b61d355847d0a54481ae1397e8271b5eb9c8b725a4fe"} err="failed to get container status \"eabb95203bac43f7d289b61d355847d0a54481ae1397e8271b5eb9c8b725a4fe\": rpc error: code = NotFound desc = could not find container \"eabb95203bac43f7d289b61d355847d0a54481ae1397e8271b5eb9c8b725a4fe\": container with ID starting with eabb95203bac43f7d289b61d355847d0a54481ae1397e8271b5eb9c8b725a4fe not found: ID does not exist" Oct 07 14:46:14 crc kubenswrapper[4854]: I1007 14:46:14.666059 4854 scope.go:117] "RemoveContainer" containerID="5420b2d58a196621f4d1141e21b23cc00c7362a60b719c6e99ff78ff9d14eb6a" Oct 07 14:46:14 crc kubenswrapper[4854]: E1007 14:46:14.666815 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5420b2d58a196621f4d1141e21b23cc00c7362a60b719c6e99ff78ff9d14eb6a\": container with ID starting with 5420b2d58a196621f4d1141e21b23cc00c7362a60b719c6e99ff78ff9d14eb6a not found: ID does not exist" containerID="5420b2d58a196621f4d1141e21b23cc00c7362a60b719c6e99ff78ff9d14eb6a" Oct 07 14:46:14 crc kubenswrapper[4854]: I1007 14:46:14.666858 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5420b2d58a196621f4d1141e21b23cc00c7362a60b719c6e99ff78ff9d14eb6a"} err="failed to get container status \"5420b2d58a196621f4d1141e21b23cc00c7362a60b719c6e99ff78ff9d14eb6a\": rpc error: code = NotFound desc = could not find 
container \"5420b2d58a196621f4d1141e21b23cc00c7362a60b719c6e99ff78ff9d14eb6a\": container with ID starting with 5420b2d58a196621f4d1141e21b23cc00c7362a60b719c6e99ff78ff9d14eb6a not found: ID does not exist" Oct 07 14:46:14 crc kubenswrapper[4854]: I1007 14:46:14.666883 4854 scope.go:117] "RemoveContainer" containerID="85b20322af3315fabb7b54fe87faa2f85fb8cdc1dc646035b72a490ccbfd3381" Oct 07 14:46:14 crc kubenswrapper[4854]: E1007 14:46:14.667267 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85b20322af3315fabb7b54fe87faa2f85fb8cdc1dc646035b72a490ccbfd3381\": container with ID starting with 85b20322af3315fabb7b54fe87faa2f85fb8cdc1dc646035b72a490ccbfd3381 not found: ID does not exist" containerID="85b20322af3315fabb7b54fe87faa2f85fb8cdc1dc646035b72a490ccbfd3381" Oct 07 14:46:14 crc kubenswrapper[4854]: I1007 14:46:14.667300 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85b20322af3315fabb7b54fe87faa2f85fb8cdc1dc646035b72a490ccbfd3381"} err="failed to get container status \"85b20322af3315fabb7b54fe87faa2f85fb8cdc1dc646035b72a490ccbfd3381\": rpc error: code = NotFound desc = could not find container \"85b20322af3315fabb7b54fe87faa2f85fb8cdc1dc646035b72a490ccbfd3381\": container with ID starting with 85b20322af3315fabb7b54fe87faa2f85fb8cdc1dc646035b72a490ccbfd3381 not found: ID does not exist" Oct 07 14:46:14 crc kubenswrapper[4854]: I1007 14:46:14.738597 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85cf44bc-e281-4b64-b72a-0160f5eb107b" path="/var/lib/kubelet/pods/85cf44bc-e281-4b64-b72a-0160f5eb107b/volumes" Oct 07 14:48:10 crc kubenswrapper[4854]: I1007 14:48:10.808588 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:48:10 crc kubenswrapper[4854]: I1007 14:48:10.809240 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:48:17 crc kubenswrapper[4854]: I1007 14:48:17.975542 4854 generic.go:334] "Generic (PLEG): container finished" podID="41fb94cd-3209-4d7e-803b-85f122d3800b" containerID="323b003372963da8f913442c3af5e94520ce56838bbdea4730bb42080f1ce0c5" exitCode=0 Oct 07 14:48:17 crc kubenswrapper[4854]: I1007 14:48:17.975619 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-csjlt" event={"ID":"41fb94cd-3209-4d7e-803b-85f122d3800b","Type":"ContainerDied","Data":"323b003372963da8f913442c3af5e94520ce56838bbdea4730bb42080f1ce0c5"} Oct 07 14:48:19 crc kubenswrapper[4854]: I1007 14:48:19.384143 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-csjlt" Oct 07 14:48:19 crc kubenswrapper[4854]: I1007 14:48:19.556344 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/41fb94cd-3209-4d7e-803b-85f122d3800b-ceilometer-compute-config-data-0\") pod \"41fb94cd-3209-4d7e-803b-85f122d3800b\" (UID: \"41fb94cd-3209-4d7e-803b-85f122d3800b\") " Oct 07 14:48:19 crc kubenswrapper[4854]: I1007 14:48:19.556602 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41fb94cd-3209-4d7e-803b-85f122d3800b-telemetry-combined-ca-bundle\") pod \"41fb94cd-3209-4d7e-803b-85f122d3800b\" (UID: \"41fb94cd-3209-4d7e-803b-85f122d3800b\") " Oct 07 14:48:19 crc kubenswrapper[4854]: I1007 14:48:19.556647 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/41fb94cd-3209-4d7e-803b-85f122d3800b-inventory\") pod \"41fb94cd-3209-4d7e-803b-85f122d3800b\" (UID: \"41fb94cd-3209-4d7e-803b-85f122d3800b\") " Oct 07 14:48:19 crc kubenswrapper[4854]: I1007 14:48:19.556674 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/41fb94cd-3209-4d7e-803b-85f122d3800b-ceph\") pod \"41fb94cd-3209-4d7e-803b-85f122d3800b\" (UID: \"41fb94cd-3209-4d7e-803b-85f122d3800b\") " Oct 07 14:48:19 crc kubenswrapper[4854]: I1007 14:48:19.556715 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/41fb94cd-3209-4d7e-803b-85f122d3800b-ssh-key\") pod \"41fb94cd-3209-4d7e-803b-85f122d3800b\" (UID: \"41fb94cd-3209-4d7e-803b-85f122d3800b\") " Oct 07 14:48:19 crc kubenswrapper[4854]: I1007 14:48:19.556755 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/41fb94cd-3209-4d7e-803b-85f122d3800b-ceilometer-compute-config-data-2\") pod \"41fb94cd-3209-4d7e-803b-85f122d3800b\" (UID: \"41fb94cd-3209-4d7e-803b-85f122d3800b\") " Oct 07 14:48:19 crc kubenswrapper[4854]: I1007 14:48:19.556795 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktc5p\" (UniqueName: \"kubernetes.io/projected/41fb94cd-3209-4d7e-803b-85f122d3800b-kube-api-access-ktc5p\") pod \"41fb94cd-3209-4d7e-803b-85f122d3800b\" (UID: \"41fb94cd-3209-4d7e-803b-85f122d3800b\") " Oct 07 14:48:19 crc kubenswrapper[4854]: I1007 14:48:19.556841 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/41fb94cd-3209-4d7e-803b-85f122d3800b-ceilometer-compute-config-data-1\") pod \"41fb94cd-3209-4d7e-803b-85f122d3800b\" (UID: \"41fb94cd-3209-4d7e-803b-85f122d3800b\") " Oct 07 14:48:19 crc kubenswrapper[4854]: I1007 14:48:19.562815 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41fb94cd-3209-4d7e-803b-85f122d3800b-kube-api-access-ktc5p" (OuterVolumeSpecName: "kube-api-access-ktc5p") pod "41fb94cd-3209-4d7e-803b-85f122d3800b" (UID: "41fb94cd-3209-4d7e-803b-85f122d3800b"). InnerVolumeSpecName "kube-api-access-ktc5p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:48:19 crc kubenswrapper[4854]: I1007 14:48:19.562978 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41fb94cd-3209-4d7e-803b-85f122d3800b-ceph" (OuterVolumeSpecName: "ceph") pod "41fb94cd-3209-4d7e-803b-85f122d3800b" (UID: "41fb94cd-3209-4d7e-803b-85f122d3800b"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:48:19 crc kubenswrapper[4854]: I1007 14:48:19.564843 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41fb94cd-3209-4d7e-803b-85f122d3800b-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "41fb94cd-3209-4d7e-803b-85f122d3800b" (UID: "41fb94cd-3209-4d7e-803b-85f122d3800b"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:48:19 crc kubenswrapper[4854]: I1007 14:48:19.589595 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41fb94cd-3209-4d7e-803b-85f122d3800b-inventory" (OuterVolumeSpecName: "inventory") pod "41fb94cd-3209-4d7e-803b-85f122d3800b" (UID: "41fb94cd-3209-4d7e-803b-85f122d3800b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:48:19 crc kubenswrapper[4854]: I1007 14:48:19.591026 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41fb94cd-3209-4d7e-803b-85f122d3800b-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "41fb94cd-3209-4d7e-803b-85f122d3800b" (UID: "41fb94cd-3209-4d7e-803b-85f122d3800b"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:48:19 crc kubenswrapper[4854]: I1007 14:48:19.596120 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41fb94cd-3209-4d7e-803b-85f122d3800b-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "41fb94cd-3209-4d7e-803b-85f122d3800b" (UID: "41fb94cd-3209-4d7e-803b-85f122d3800b"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:48:19 crc kubenswrapper[4854]: I1007 14:48:19.607303 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41fb94cd-3209-4d7e-803b-85f122d3800b-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "41fb94cd-3209-4d7e-803b-85f122d3800b" (UID: "41fb94cd-3209-4d7e-803b-85f122d3800b"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:48:19 crc kubenswrapper[4854]: I1007 14:48:19.613853 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41fb94cd-3209-4d7e-803b-85f122d3800b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "41fb94cd-3209-4d7e-803b-85f122d3800b" (UID: "41fb94cd-3209-4d7e-803b-85f122d3800b"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:48:19 crc kubenswrapper[4854]: I1007 14:48:19.659706 4854 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/41fb94cd-3209-4d7e-803b-85f122d3800b-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 14:48:19 crc kubenswrapper[4854]: I1007 14:48:19.659736 4854 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/41fb94cd-3209-4d7e-803b-85f122d3800b-ceph\") on node \"crc\" DevicePath \"\"" Oct 07 14:48:19 crc kubenswrapper[4854]: I1007 14:48:19.659745 4854 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/41fb94cd-3209-4d7e-803b-85f122d3800b-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 14:48:19 crc kubenswrapper[4854]: I1007 14:48:19.659755 4854 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/41fb94cd-3209-4d7e-803b-85f122d3800b-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Oct 07 14:48:19 crc kubenswrapper[4854]: I1007 14:48:19.659767 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktc5p\" (UniqueName: \"kubernetes.io/projected/41fb94cd-3209-4d7e-803b-85f122d3800b-kube-api-access-ktc5p\") on node \"crc\" DevicePath \"\"" Oct 07 14:48:19 crc kubenswrapper[4854]: I1007 14:48:19.659778 4854 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/41fb94cd-3209-4d7e-803b-85f122d3800b-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Oct 07 14:48:19 crc kubenswrapper[4854]: I1007 14:48:19.659787 4854 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/41fb94cd-3209-4d7e-803b-85f122d3800b-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Oct 07 14:48:19 crc kubenswrapper[4854]: I1007 14:48:19.659795 4854 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41fb94cd-3209-4d7e-803b-85f122d3800b-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:48:19 crc kubenswrapper[4854]: I1007 14:48:19.998624 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-csjlt" event={"ID":"41fb94cd-3209-4d7e-803b-85f122d3800b","Type":"ContainerDied","Data":"e6ba914d0b8495aa05911e8dcaa34cc8149db3817a6719f31713bbb6ee824b2c"} Oct 07 14:48:19 crc kubenswrapper[4854]: I1007 14:48:19.998708 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6ba914d0b8495aa05911e8dcaa34cc8149db3817a6719f31713bbb6ee824b2c" Oct 07 14:48:19 crc kubenswrapper[4854]: I1007 14:48:19.998820 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-csjlt" Oct 07 14:48:20 crc kubenswrapper[4854]: I1007 14:48:20.091439 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-zrspg"] Oct 07 14:48:20 crc kubenswrapper[4854]: E1007 14:48:20.091864 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85cf44bc-e281-4b64-b72a-0160f5eb107b" containerName="extract-utilities" Oct 07 14:48:20 crc kubenswrapper[4854]: I1007 14:48:20.091883 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="85cf44bc-e281-4b64-b72a-0160f5eb107b" containerName="extract-utilities" Oct 07 14:48:20 crc kubenswrapper[4854]: E1007 14:48:20.091916 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41fb94cd-3209-4d7e-803b-85f122d3800b" containerName="telemetry-openstack-openstack-cell1" Oct 07 14:48:20 crc kubenswrapper[4854]: I1007 14:48:20.091924 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="41fb94cd-3209-4d7e-803b-85f122d3800b" containerName="telemetry-openstack-openstack-cell1" Oct 07 14:48:20 crc kubenswrapper[4854]: E1007 14:48:20.091945 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85cf44bc-e281-4b64-b72a-0160f5eb107b" containerName="extract-content" Oct 07 14:48:20 crc kubenswrapper[4854]: I1007 14:48:20.091951 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="85cf44bc-e281-4b64-b72a-0160f5eb107b" containerName="extract-content" Oct 07 14:48:20 crc kubenswrapper[4854]: E1007 14:48:20.091963 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85cf44bc-e281-4b64-b72a-0160f5eb107b" containerName="registry-server" Oct 07 14:48:20 crc kubenswrapper[4854]: I1007 14:48:20.091971 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="85cf44bc-e281-4b64-b72a-0160f5eb107b" containerName="registry-server" Oct 07 14:48:20 crc kubenswrapper[4854]: I1007 14:48:20.092192 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="41fb94cd-3209-4d7e-803b-85f122d3800b" containerName="telemetry-openstack-openstack-cell1" Oct 07 14:48:20 crc kubenswrapper[4854]: I1007 14:48:20.092212 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="85cf44bc-e281-4b64-b72a-0160f5eb107b" containerName="registry-server" Oct 07 14:48:20 crc kubenswrapper[4854]: I1007 14:48:20.093162 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-zrspg" Oct 07 14:48:20 crc kubenswrapper[4854]: I1007 14:48:20.095886 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 07 14:48:20 crc kubenswrapper[4854]: I1007 14:48:20.096575 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 07 14:48:20 crc kubenswrapper[4854]: I1007 14:48:20.096600 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-sriov-agent-neutron-config" Oct 07 14:48:20 crc kubenswrapper[4854]: I1007 14:48:20.098520 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 14:48:20 crc kubenswrapper[4854]: I1007 14:48:20.101763 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-n7cf5" Oct 07 14:48:20 crc kubenswrapper[4854]: I1007 14:48:20.106485 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-zrspg"] Oct 07 14:48:20 crc kubenswrapper[4854]: I1007 14:48:20.271996 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bfe44bd6-7978-41f7-afa8-bdfcafad2b49-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-zrspg\" (UID: \"bfe44bd6-7978-41f7-afa8-bdfcafad2b49\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-zrspg" Oct 07 14:48:20 crc kubenswrapper[4854]: I1007 14:48:20.272069 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfe44bd6-7978-41f7-afa8-bdfcafad2b49-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-zrspg\" (UID: \"bfe44bd6-7978-41f7-afa8-bdfcafad2b49\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-zrspg" Oct 07 14:48:20 crc kubenswrapper[4854]: I1007 14:48:20.272431 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/bfe44bd6-7978-41f7-afa8-bdfcafad2b49-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-zrspg\" (UID: \"bfe44bd6-7978-41f7-afa8-bdfcafad2b49\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-zrspg" Oct 07 14:48:20 crc kubenswrapper[4854]: I1007 14:48:20.272550 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bfe44bd6-7978-41f7-afa8-bdfcafad2b49-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-zrspg\" (UID: \"bfe44bd6-7978-41f7-afa8-bdfcafad2b49\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-zrspg" Oct 07 14:48:20 crc kubenswrapper[4854]: I1007 14:48:20.273027 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bfe44bd6-7978-41f7-afa8-bdfcafad2b49-ssh-key\") pod \"neutron-sriov-openstack-openstack-cell1-zrspg\" (UID: \"bfe44bd6-7978-41f7-afa8-bdfcafad2b49\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-zrspg" Oct 07 14:48:20 crc kubenswrapper[4854]: I1007 14:48:20.273310 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-l7xq2\" (UniqueName: \"kubernetes.io/projected/bfe44bd6-7978-41f7-afa8-bdfcafad2b49-kube-api-access-l7xq2\") pod \"neutron-sriov-openstack-openstack-cell1-zrspg\" (UID: \"bfe44bd6-7978-41f7-afa8-bdfcafad2b49\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-zrspg" Oct 07 14:48:20 crc kubenswrapper[4854]: I1007 14:48:20.375921 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7xq2\" (UniqueName: \"kubernetes.io/projected/bfe44bd6-7978-41f7-afa8-bdfcafad2b49-kube-api-access-l7xq2\") pod \"neutron-sriov-openstack-openstack-cell1-zrspg\" (UID: \"bfe44bd6-7978-41f7-afa8-bdfcafad2b49\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-zrspg" Oct 07 14:48:20 crc kubenswrapper[4854]: I1007 14:48:20.375982 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bfe44bd6-7978-41f7-afa8-bdfcafad2b49-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-zrspg\" (UID: \"bfe44bd6-7978-41f7-afa8-bdfcafad2b49\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-zrspg" Oct 07 14:48:20 crc kubenswrapper[4854]: I1007 14:48:20.376015 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfe44bd6-7978-41f7-afa8-bdfcafad2b49-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-zrspg\" (UID: \"bfe44bd6-7978-41f7-afa8-bdfcafad2b49\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-zrspg" Oct 07 14:48:20 crc kubenswrapper[4854]: I1007 14:48:20.376087 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/bfe44bd6-7978-41f7-afa8-bdfcafad2b49-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-zrspg\" (UID: \"bfe44bd6-7978-41f7-afa8-bdfcafad2b49\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-zrspg" Oct 07 14:48:20 crc kubenswrapper[4854]: I1007 14:48:20.376110 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bfe44bd6-7978-41f7-afa8-bdfcafad2b49-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-zrspg\" (UID: \"bfe44bd6-7978-41f7-afa8-bdfcafad2b49\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-zrspg" Oct 07 14:48:20 crc kubenswrapper[4854]: I1007 14:48:20.376221 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bfe44bd6-7978-41f7-afa8-bdfcafad2b49-ssh-key\") pod \"neutron-sriov-openstack-openstack-cell1-zrspg\" (UID: \"bfe44bd6-7978-41f7-afa8-bdfcafad2b49\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-zrspg" Oct 07 14:48:20 crc kubenswrapper[4854]: I1007 14:48:20.381268 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bfe44bd6-7978-41f7-afa8-bdfcafad2b49-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-zrspg\" (UID: \"bfe44bd6-7978-41f7-afa8-bdfcafad2b49\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-zrspg" Oct 07 14:48:20 crc kubenswrapper[4854]: I1007 14:48:20.381555 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfe44bd6-7978-41f7-afa8-bdfcafad2b49-neutron-sriov-combined-ca-bundle\") pod 
\"neutron-sriov-openstack-openstack-cell1-zrspg\" (UID: \"bfe44bd6-7978-41f7-afa8-bdfcafad2b49\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-zrspg" Oct 07 14:48:20 crc kubenswrapper[4854]: I1007 14:48:20.382200 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bfe44bd6-7978-41f7-afa8-bdfcafad2b49-ssh-key\") pod \"neutron-sriov-openstack-openstack-cell1-zrspg\" (UID: \"bfe44bd6-7978-41f7-afa8-bdfcafad2b49\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-zrspg" Oct 07 14:48:20 crc kubenswrapper[4854]: I1007 14:48:20.382313 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bfe44bd6-7978-41f7-afa8-bdfcafad2b49-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-zrspg\" (UID: \"bfe44bd6-7978-41f7-afa8-bdfcafad2b49\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-zrspg" Oct 07 14:48:20 crc kubenswrapper[4854]: I1007 14:48:20.382557 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/bfe44bd6-7978-41f7-afa8-bdfcafad2b49-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-zrspg\" (UID: \"bfe44bd6-7978-41f7-afa8-bdfcafad2b49\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-zrspg" Oct 07 14:48:20 crc kubenswrapper[4854]: I1007 14:48:20.396115 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7xq2\" (UniqueName: \"kubernetes.io/projected/bfe44bd6-7978-41f7-afa8-bdfcafad2b49-kube-api-access-l7xq2\") pod \"neutron-sriov-openstack-openstack-cell1-zrspg\" (UID: \"bfe44bd6-7978-41f7-afa8-bdfcafad2b49\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-zrspg" Oct 07 14:48:20 crc kubenswrapper[4854]: I1007 14:48:20.459487 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-zrspg" Oct 07 14:48:21 crc kubenswrapper[4854]: I1007 14:48:21.030275 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-zrspg"] Oct 07 14:48:22 crc kubenswrapper[4854]: I1007 14:48:22.018780 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-zrspg" event={"ID":"bfe44bd6-7978-41f7-afa8-bdfcafad2b49","Type":"ContainerStarted","Data":"19777fbbef9b90828d4d4e9508466fb01b139a0e108c5db1331e1dd01433aa0e"} Oct 07 14:48:22 crc kubenswrapper[4854]: I1007 14:48:22.019539 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-zrspg" event={"ID":"bfe44bd6-7978-41f7-afa8-bdfcafad2b49","Type":"ContainerStarted","Data":"ab94f774c40a5b750b78aa43a96db350e4201f9719f95c82c3a524377d69b609"} Oct 07 14:48:22 crc kubenswrapper[4854]: I1007 14:48:22.048392 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-sriov-openstack-openstack-cell1-zrspg" podStartSLOduration=1.5157121120000001 podStartE2EDuration="2.048376359s" podCreationTimestamp="2025-10-07 14:48:20 +0000 UTC" firstStartedPulling="2025-10-07 14:48:21.037741065 +0000 UTC m=+8617.025573310" lastFinishedPulling="2025-10-07 14:48:21.570405292 +0000 UTC m=+8617.558237557" observedRunningTime="2025-10-07 14:48:22.046163725 +0000 UTC m=+8618.033995990" watchObservedRunningTime="2025-10-07 14:48:22.048376359 +0000 UTC m=+8618.036208614" Oct 07 14:48:40 crc kubenswrapper[4854]: I1007 14:48:40.808426 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:48:40 crc kubenswrapper[4854]: I1007 14:48:40.809024 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:49:10 crc kubenswrapper[4854]: I1007 14:49:10.808351 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:49:10 crc kubenswrapper[4854]: I1007 14:49:10.809105 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:49:10 crc kubenswrapper[4854]: I1007 14:49:10.809164 4854 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" Oct 07 14:49:10 crc kubenswrapper[4854]: I1007 14:49:10.810218 4854 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"3a01cf9fd789340dd73e9f8543d33a8348cf73c39951103b77b25055e06fb867"} pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 14:49:10 crc kubenswrapper[4854]: I1007 14:49:10.810280 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" containerID="cri-o://3a01cf9fd789340dd73e9f8543d33a8348cf73c39951103b77b25055e06fb867" gracePeriod=600 Oct 07 14:49:11 crc kubenswrapper[4854]: I1007 14:49:11.623004 4854 generic.go:334] "Generic (PLEG): container finished" podID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerID="3a01cf9fd789340dd73e9f8543d33a8348cf73c39951103b77b25055e06fb867" exitCode=0 Oct 07 14:49:11 crc kubenswrapper[4854]: I1007 14:49:11.624053 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" event={"ID":"40b8b82d-cfd5-41d7-8673-5774db092c85","Type":"ContainerDied","Data":"3a01cf9fd789340dd73e9f8543d33a8348cf73c39951103b77b25055e06fb867"} Oct 07 14:49:11 crc kubenswrapper[4854]: I1007 14:49:11.624128 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" event={"ID":"40b8b82d-cfd5-41d7-8673-5774db092c85","Type":"ContainerStarted","Data":"e54b3b075b488a857a2b079c8bf74c15dbd3a9b0c33ba12b65add9ef54f2f5a0"} Oct 07 14:49:11 crc kubenswrapper[4854]: I1007 14:49:11.624166 4854 scope.go:117] "RemoveContainer" containerID="390e57b5d82f9be820f7fa4a2a67b06934642922575e8b6606f5dbf6a1870462" Oct 07 14:49:48 crc kubenswrapper[4854]: I1007 14:49:48.802251 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fsdch"] Oct 07 14:49:48 crc kubenswrapper[4854]: I1007 14:49:48.805241 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fsdch" Oct 07 14:49:48 crc kubenswrapper[4854]: I1007 14:49:48.814580 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fsdch"] Oct 07 14:49:48 crc kubenswrapper[4854]: I1007 14:49:48.899314 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8nsw\" (UniqueName: \"kubernetes.io/projected/df01e860-3743-4cc3-898e-d15f5a7a2203-kube-api-access-m8nsw\") pod \"certified-operators-fsdch\" (UID: \"df01e860-3743-4cc3-898e-d15f5a7a2203\") " pod="openshift-marketplace/certified-operators-fsdch" Oct 07 14:49:48 crc kubenswrapper[4854]: I1007 14:49:48.899365 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df01e860-3743-4cc3-898e-d15f5a7a2203-utilities\") pod \"certified-operators-fsdch\" (UID: \"df01e860-3743-4cc3-898e-d15f5a7a2203\") " pod="openshift-marketplace/certified-operators-fsdch" Oct 07 14:49:48 crc kubenswrapper[4854]: I1007 14:49:48.899625 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df01e860-3743-4cc3-898e-d15f5a7a2203-catalog-content\") pod \"certified-operators-fsdch\" (UID: \"df01e860-3743-4cc3-898e-d15f5a7a2203\") " pod="openshift-marketplace/certified-operators-fsdch" Oct 07 14:49:49 crc kubenswrapper[4854]: I1007 14:49:49.001328 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8nsw\" (UniqueName: \"kubernetes.io/projected/df01e860-3743-4cc3-898e-d15f5a7a2203-kube-api-access-m8nsw\") pod \"certified-operators-fsdch\" (UID: \"df01e860-3743-4cc3-898e-d15f5a7a2203\") " pod="openshift-marketplace/certified-operators-fsdch" Oct 07 14:49:49 crc kubenswrapper[4854]: I1007 14:49:49.001388 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df01e860-3743-4cc3-898e-d15f5a7a2203-utilities\") pod \"certified-operators-fsdch\" (UID: \"df01e860-3743-4cc3-898e-d15f5a7a2203\") " pod="openshift-marketplace/certified-operators-fsdch" Oct 07 14:49:49 crc kubenswrapper[4854]: I1007 14:49:49.001449 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df01e860-3743-4cc3-898e-d15f5a7a2203-catalog-content\") pod \"certified-operators-fsdch\" (UID: \"df01e860-3743-4cc3-898e-d15f5a7a2203\") " pod="openshift-marketplace/certified-operators-fsdch" Oct 07 14:49:49 crc kubenswrapper[4854]: I1007 14:49:49.001890 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df01e860-3743-4cc3-898e-d15f5a7a2203-utilities\") pod \"certified-operators-fsdch\" (UID: \"df01e860-3743-4cc3-898e-d15f5a7a2203\") " pod="openshift-marketplace/certified-operators-fsdch" Oct 07 14:49:49 crc kubenswrapper[4854]: I1007 14:49:49.001929 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df01e860-3743-4cc3-898e-d15f5a7a2203-catalog-content\") pod \"certified-operators-fsdch\" (UID: \"df01e860-3743-4cc3-898e-d15f5a7a2203\") " pod="openshift-marketplace/certified-operators-fsdch" Oct 07 14:49:49 crc kubenswrapper[4854]: I1007 14:49:49.022230 4854 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-m8nsw\" (UniqueName: \"kubernetes.io/projected/df01e860-3743-4cc3-898e-d15f5a7a2203-kube-api-access-m8nsw\") pod \"certified-operators-fsdch\" (UID: \"df01e860-3743-4cc3-898e-d15f5a7a2203\") " pod="openshift-marketplace/certified-operators-fsdch" Oct 07 14:49:49 crc kubenswrapper[4854]: I1007 14:49:49.122172 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fsdch" Oct 07 14:49:49 crc kubenswrapper[4854]: I1007 14:49:49.777625 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fsdch"] Oct 07 14:49:50 crc kubenswrapper[4854]: I1007 14:49:50.035248 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fsdch" event={"ID":"df01e860-3743-4cc3-898e-d15f5a7a2203","Type":"ContainerStarted","Data":"f5df867341fa08f3bb8b574cba0e399cf83440343b65630fa7dfd53739a61aa5"} Oct 07 14:49:50 crc kubenswrapper[4854]: I1007 14:49:50.035397 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fsdch" event={"ID":"df01e860-3743-4cc3-898e-d15f5a7a2203","Type":"ContainerStarted","Data":"0369321121d1d380e9c7e82c45a89b1bc0d038ee73f72d4270d79536761a85c4"} Oct 07 14:49:51 crc kubenswrapper[4854]: I1007 14:49:51.046005 4854 generic.go:334] "Generic (PLEG): container finished" podID="df01e860-3743-4cc3-898e-d15f5a7a2203" containerID="f5df867341fa08f3bb8b574cba0e399cf83440343b65630fa7dfd53739a61aa5" exitCode=0 Oct 07 14:49:51 crc kubenswrapper[4854]: I1007 14:49:51.046214 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fsdch" event={"ID":"df01e860-3743-4cc3-898e-d15f5a7a2203","Type":"ContainerDied","Data":"f5df867341fa08f3bb8b574cba0e399cf83440343b65630fa7dfd53739a61aa5"} Oct 07 14:49:51 crc kubenswrapper[4854]: I1007 14:49:51.049232 4854 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 14:49:53 crc kubenswrapper[4854]: I1007 14:49:53.076555 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fsdch" event={"ID":"df01e860-3743-4cc3-898e-d15f5a7a2203","Type":"ContainerStarted","Data":"1ff2496486c57fba50fbd2468d688c50cf8a51ec45fba0782019d7a587739736"} Oct 07 14:49:54 crc kubenswrapper[4854]: I1007 14:49:54.086952 4854 generic.go:334] "Generic (PLEG): container finished" podID="df01e860-3743-4cc3-898e-d15f5a7a2203" containerID="1ff2496486c57fba50fbd2468d688c50cf8a51ec45fba0782019d7a587739736" exitCode=0 Oct 07 14:49:54 crc kubenswrapper[4854]: I1007 14:49:54.087016 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fsdch" event={"ID":"df01e860-3743-4cc3-898e-d15f5a7a2203","Type":"ContainerDied","Data":"1ff2496486c57fba50fbd2468d688c50cf8a51ec45fba0782019d7a587739736"} Oct 07 14:49:55 crc kubenswrapper[4854]: I1007 14:49:55.103119 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fsdch" event={"ID":"df01e860-3743-4cc3-898e-d15f5a7a2203","Type":"ContainerStarted","Data":"112c673e39c1b3d4b73324af2b50c600e646d1bdbf0e3c1af95da3922072af8d"} Oct 07 14:49:55 crc kubenswrapper[4854]: I1007 14:49:55.131391 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fsdch" podStartSLOduration=3.5053875850000003 podStartE2EDuration="7.131361634s" 
podCreationTimestamp="2025-10-07 14:49:48 +0000 UTC" firstStartedPulling="2025-10-07 14:49:51.048984811 +0000 UTC m=+8707.036817066" lastFinishedPulling="2025-10-07 14:49:54.67495886 +0000 UTC m=+8710.662791115" observedRunningTime="2025-10-07 14:49:55.124253489 +0000 UTC m=+8711.112085764" watchObservedRunningTime="2025-10-07 14:49:55.131361634 +0000 UTC m=+8711.119193889" Oct 07 14:49:59 crc kubenswrapper[4854]: I1007 14:49:59.123249 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fsdch" Oct 07 14:49:59 crc kubenswrapper[4854]: I1007 14:49:59.124330 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fsdch" Oct 07 14:49:59 crc kubenswrapper[4854]: I1007 14:49:59.191612 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fsdch" Oct 07 14:49:59 crc kubenswrapper[4854]: I1007 14:49:59.264724 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fsdch" Oct 07 14:49:59 crc kubenswrapper[4854]: I1007 14:49:59.449650 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fsdch"] Oct 07 14:50:01 crc kubenswrapper[4854]: I1007 14:50:01.161213 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fsdch" podUID="df01e860-3743-4cc3-898e-d15f5a7a2203" containerName="registry-server" containerID="cri-o://112c673e39c1b3d4b73324af2b50c600e646d1bdbf0e3c1af95da3922072af8d" gracePeriod=2 Oct 07 14:50:01 crc kubenswrapper[4854]: I1007 14:50:01.718129 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fsdch" Oct 07 14:50:01 crc kubenswrapper[4854]: I1007 14:50:01.809918 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df01e860-3743-4cc3-898e-d15f5a7a2203-utilities\") pod \"df01e860-3743-4cc3-898e-d15f5a7a2203\" (UID: \"df01e860-3743-4cc3-898e-d15f5a7a2203\") " Oct 07 14:50:01 crc kubenswrapper[4854]: I1007 14:50:01.810029 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df01e860-3743-4cc3-898e-d15f5a7a2203-catalog-content\") pod \"df01e860-3743-4cc3-898e-d15f5a7a2203\" (UID: \"df01e860-3743-4cc3-898e-d15f5a7a2203\") " Oct 07 14:50:01 crc kubenswrapper[4854]: I1007 14:50:01.810069 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8nsw\" (UniqueName: \"kubernetes.io/projected/df01e860-3743-4cc3-898e-d15f5a7a2203-kube-api-access-m8nsw\") pod \"df01e860-3743-4cc3-898e-d15f5a7a2203\" (UID: \"df01e860-3743-4cc3-898e-d15f5a7a2203\") " Oct 07 14:50:01 crc kubenswrapper[4854]: I1007 14:50:01.810975 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df01e860-3743-4cc3-898e-d15f5a7a2203-utilities" (OuterVolumeSpecName: "utilities") pod "df01e860-3743-4cc3-898e-d15f5a7a2203" (UID: "df01e860-3743-4cc3-898e-d15f5a7a2203"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:50:01 crc kubenswrapper[4854]: I1007 14:50:01.812117 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df01e860-3743-4cc3-898e-d15f5a7a2203-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 14:50:01 crc kubenswrapper[4854]: I1007 14:50:01.814930 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df01e860-3743-4cc3-898e-d15f5a7a2203-kube-api-access-m8nsw" (OuterVolumeSpecName: "kube-api-access-m8nsw") pod "df01e860-3743-4cc3-898e-d15f5a7a2203" (UID: "df01e860-3743-4cc3-898e-d15f5a7a2203"). InnerVolumeSpecName "kube-api-access-m8nsw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:50:01 crc kubenswrapper[4854]: I1007 14:50:01.856590 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df01e860-3743-4cc3-898e-d15f5a7a2203-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "df01e860-3743-4cc3-898e-d15f5a7a2203" (UID: "df01e860-3743-4cc3-898e-d15f5a7a2203"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:50:01 crc kubenswrapper[4854]: I1007 14:50:01.914705 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df01e860-3743-4cc3-898e-d15f5a7a2203-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 14:50:01 crc kubenswrapper[4854]: I1007 14:50:01.914744 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8nsw\" (UniqueName: \"kubernetes.io/projected/df01e860-3743-4cc3-898e-d15f5a7a2203-kube-api-access-m8nsw\") on node \"crc\" DevicePath \"\"" Oct 07 14:50:02 crc kubenswrapper[4854]: I1007 14:50:02.172919 4854 generic.go:334] "Generic (PLEG): container finished" podID="df01e860-3743-4cc3-898e-d15f5a7a2203" containerID="112c673e39c1b3d4b73324af2b50c600e646d1bdbf0e3c1af95da3922072af8d" exitCode=0 Oct 07 14:50:02 crc kubenswrapper[4854]: I1007 14:50:02.172983 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fsdch" event={"ID":"df01e860-3743-4cc3-898e-d15f5a7a2203","Type":"ContainerDied","Data":"112c673e39c1b3d4b73324af2b50c600e646d1bdbf0e3c1af95da3922072af8d"} Oct 07 14:50:02 crc kubenswrapper[4854]: I1007 14:50:02.172995 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fsdch" Oct 07 14:50:02 crc kubenswrapper[4854]: I1007 14:50:02.173030 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fsdch" event={"ID":"df01e860-3743-4cc3-898e-d15f5a7a2203","Type":"ContainerDied","Data":"0369321121d1d380e9c7e82c45a89b1bc0d038ee73f72d4270d79536761a85c4"} Oct 07 14:50:02 crc kubenswrapper[4854]: I1007 14:50:02.173049 4854 scope.go:117] "RemoveContainer" containerID="112c673e39c1b3d4b73324af2b50c600e646d1bdbf0e3c1af95da3922072af8d" Oct 07 14:50:02 crc kubenswrapper[4854]: I1007 14:50:02.206399 4854 scope.go:117] "RemoveContainer" containerID="1ff2496486c57fba50fbd2468d688c50cf8a51ec45fba0782019d7a587739736" Oct 07 14:50:02 crc kubenswrapper[4854]: I1007 14:50:02.220773 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fsdch"] Oct 07 14:50:02 crc kubenswrapper[4854]: I1007 14:50:02.233416 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fsdch"] Oct 07 14:50:02 crc kubenswrapper[4854]: I1007 14:50:02.243136 4854 scope.go:117] "RemoveContainer" containerID="f5df867341fa08f3bb8b574cba0e399cf83440343b65630fa7dfd53739a61aa5" Oct 07 14:50:02 crc kubenswrapper[4854]: I1007 14:50:02.297778 4854 scope.go:117] "RemoveContainer" containerID="112c673e39c1b3d4b73324af2b50c600e646d1bdbf0e3c1af95da3922072af8d" Oct 07 14:50:02 crc kubenswrapper[4854]: E1007 14:50:02.298279 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"112c673e39c1b3d4b73324af2b50c600e646d1bdbf0e3c1af95da3922072af8d\": container with ID starting with 112c673e39c1b3d4b73324af2b50c600e646d1bdbf0e3c1af95da3922072af8d not found: ID does not exist" containerID="112c673e39c1b3d4b73324af2b50c600e646d1bdbf0e3c1af95da3922072af8d" Oct 07 14:50:02 crc kubenswrapper[4854]: I1007 14:50:02.298316 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"112c673e39c1b3d4b73324af2b50c600e646d1bdbf0e3c1af95da3922072af8d"} err="failed to get container status \"112c673e39c1b3d4b73324af2b50c600e646d1bdbf0e3c1af95da3922072af8d\": rpc error: code = NotFound desc = could not find container \"112c673e39c1b3d4b73324af2b50c600e646d1bdbf0e3c1af95da3922072af8d\": container with ID starting with 112c673e39c1b3d4b73324af2b50c600e646d1bdbf0e3c1af95da3922072af8d not found: ID does not exist" Oct 07 14:50:02 crc kubenswrapper[4854]: I1007 14:50:02.298339 4854 scope.go:117] "RemoveContainer" containerID="1ff2496486c57fba50fbd2468d688c50cf8a51ec45fba0782019d7a587739736" Oct 07 14:50:02 crc kubenswrapper[4854]: E1007 14:50:02.298693 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ff2496486c57fba50fbd2468d688c50cf8a51ec45fba0782019d7a587739736\": container with ID starting with 1ff2496486c57fba50fbd2468d688c50cf8a51ec45fba0782019d7a587739736 not found: ID does not exist" containerID="1ff2496486c57fba50fbd2468d688c50cf8a51ec45fba0782019d7a587739736" Oct 07 14:50:02 crc kubenswrapper[4854]: I1007 14:50:02.298720 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ff2496486c57fba50fbd2468d688c50cf8a51ec45fba0782019d7a587739736"} err="failed to get container status \"1ff2496486c57fba50fbd2468d688c50cf8a51ec45fba0782019d7a587739736\": rpc error: code = NotFound desc = could not find 
container \"1ff2496486c57fba50fbd2468d688c50cf8a51ec45fba0782019d7a587739736\": container with ID starting with 1ff2496486c57fba50fbd2468d688c50cf8a51ec45fba0782019d7a587739736 not found: ID does not exist" Oct 07 14:50:02 crc kubenswrapper[4854]: I1007 14:50:02.298737 4854 scope.go:117] "RemoveContainer" containerID="f5df867341fa08f3bb8b574cba0e399cf83440343b65630fa7dfd53739a61aa5" Oct 07 14:50:02 crc kubenswrapper[4854]: E1007 14:50:02.299014 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5df867341fa08f3bb8b574cba0e399cf83440343b65630fa7dfd53739a61aa5\": container with ID starting with f5df867341fa08f3bb8b574cba0e399cf83440343b65630fa7dfd53739a61aa5 not found: ID does not exist" containerID="f5df867341fa08f3bb8b574cba0e399cf83440343b65630fa7dfd53739a61aa5" Oct 07 14:50:02 crc kubenswrapper[4854]: I1007 14:50:02.299051 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5df867341fa08f3bb8b574cba0e399cf83440343b65630fa7dfd53739a61aa5"} err="failed to get container status \"f5df867341fa08f3bb8b574cba0e399cf83440343b65630fa7dfd53739a61aa5\": rpc error: code = NotFound desc = could not find container \"f5df867341fa08f3bb8b574cba0e399cf83440343b65630fa7dfd53739a61aa5\": container with ID starting with f5df867341fa08f3bb8b574cba0e399cf83440343b65630fa7dfd53739a61aa5 not found: ID does not exist" Oct 07 14:50:02 crc kubenswrapper[4854]: I1007 14:50:02.718719 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df01e860-3743-4cc3-898e-d15f5a7a2203" path="/var/lib/kubelet/pods/df01e860-3743-4cc3-898e-d15f5a7a2203/volumes" Oct 07 14:51:40 crc kubenswrapper[4854]: I1007 14:51:40.808181 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:51:40 crc kubenswrapper[4854]: I1007 14:51:40.808769 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:51:58 crc kubenswrapper[4854]: I1007 14:51:58.594049 4854 generic.go:334] "Generic (PLEG): container finished" podID="bfe44bd6-7978-41f7-afa8-bdfcafad2b49" containerID="19777fbbef9b90828d4d4e9508466fb01b139a0e108c5db1331e1dd01433aa0e" exitCode=0 Oct 07 14:51:58 crc kubenswrapper[4854]: I1007 14:51:58.594164 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-zrspg" event={"ID":"bfe44bd6-7978-41f7-afa8-bdfcafad2b49","Type":"ContainerDied","Data":"19777fbbef9b90828d4d4e9508466fb01b139a0e108c5db1331e1dd01433aa0e"} Oct 07 14:52:00 crc kubenswrapper[4854]: I1007 14:52:00.133359 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-zrspg" Oct 07 14:52:00 crc kubenswrapper[4854]: I1007 14:52:00.229997 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfe44bd6-7978-41f7-afa8-bdfcafad2b49-neutron-sriov-combined-ca-bundle\") pod \"bfe44bd6-7978-41f7-afa8-bdfcafad2b49\" (UID: \"bfe44bd6-7978-41f7-afa8-bdfcafad2b49\") " Oct 07 14:52:00 crc kubenswrapper[4854]: I1007 14:52:00.230114 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bfe44bd6-7978-41f7-afa8-bdfcafad2b49-ceph\") pod \"bfe44bd6-7978-41f7-afa8-bdfcafad2b49\" (UID: \"bfe44bd6-7978-41f7-afa8-bdfcafad2b49\") " Oct 07 14:52:00 crc kubenswrapper[4854]: I1007 14:52:00.230183 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7xq2\" (UniqueName: \"kubernetes.io/projected/bfe44bd6-7978-41f7-afa8-bdfcafad2b49-kube-api-access-l7xq2\") pod \"bfe44bd6-7978-41f7-afa8-bdfcafad2b49\" (UID: \"bfe44bd6-7978-41f7-afa8-bdfcafad2b49\") " Oct 07 14:52:00 crc kubenswrapper[4854]: I1007 14:52:00.230345 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bfe44bd6-7978-41f7-afa8-bdfcafad2b49-inventory\") pod \"bfe44bd6-7978-41f7-afa8-bdfcafad2b49\" (UID: \"bfe44bd6-7978-41f7-afa8-bdfcafad2b49\") " Oct 07 14:52:00 crc kubenswrapper[4854]: I1007 14:52:00.230425 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bfe44bd6-7978-41f7-afa8-bdfcafad2b49-ssh-key\") pod \"bfe44bd6-7978-41f7-afa8-bdfcafad2b49\" (UID: \"bfe44bd6-7978-41f7-afa8-bdfcafad2b49\") " Oct 07 14:52:00 crc kubenswrapper[4854]: I1007 14:52:00.230448 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/bfe44bd6-7978-41f7-afa8-bdfcafad2b49-neutron-sriov-agent-neutron-config-0\") pod \"bfe44bd6-7978-41f7-afa8-bdfcafad2b49\" (UID: \"bfe44bd6-7978-41f7-afa8-bdfcafad2b49\") " Oct 07 14:52:00 crc kubenswrapper[4854]: I1007 14:52:00.235732 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfe44bd6-7978-41f7-afa8-bdfcafad2b49-kube-api-access-l7xq2" (OuterVolumeSpecName: "kube-api-access-l7xq2") pod "bfe44bd6-7978-41f7-afa8-bdfcafad2b49" (UID: "bfe44bd6-7978-41f7-afa8-bdfcafad2b49"). InnerVolumeSpecName "kube-api-access-l7xq2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:52:00 crc kubenswrapper[4854]: I1007 14:52:00.236125 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfe44bd6-7978-41f7-afa8-bdfcafad2b49-ceph" (OuterVolumeSpecName: "ceph") pod "bfe44bd6-7978-41f7-afa8-bdfcafad2b49" (UID: "bfe44bd6-7978-41f7-afa8-bdfcafad2b49"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:52:00 crc kubenswrapper[4854]: I1007 14:52:00.236181 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfe44bd6-7978-41f7-afa8-bdfcafad2b49-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "bfe44bd6-7978-41f7-afa8-bdfcafad2b49" (UID: "bfe44bd6-7978-41f7-afa8-bdfcafad2b49"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:52:00 crc kubenswrapper[4854]: I1007 14:52:00.262666 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfe44bd6-7978-41f7-afa8-bdfcafad2b49-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "bfe44bd6-7978-41f7-afa8-bdfcafad2b49" (UID: "bfe44bd6-7978-41f7-afa8-bdfcafad2b49"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:52:00 crc kubenswrapper[4854]: I1007 14:52:00.264816 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfe44bd6-7978-41f7-afa8-bdfcafad2b49-inventory" (OuterVolumeSpecName: "inventory") pod "bfe44bd6-7978-41f7-afa8-bdfcafad2b49" (UID: "bfe44bd6-7978-41f7-afa8-bdfcafad2b49"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:52:00 crc kubenswrapper[4854]: I1007 14:52:00.272064 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfe44bd6-7978-41f7-afa8-bdfcafad2b49-neutron-sriov-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-sriov-agent-neutron-config-0") pod "bfe44bd6-7978-41f7-afa8-bdfcafad2b49" (UID: "bfe44bd6-7978-41f7-afa8-bdfcafad2b49"). InnerVolumeSpecName "neutron-sriov-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:52:00 crc kubenswrapper[4854]: I1007 14:52:00.333095 4854 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bfe44bd6-7978-41f7-afa8-bdfcafad2b49-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 14:52:00 crc kubenswrapper[4854]: I1007 14:52:00.333140 4854 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bfe44bd6-7978-41f7-afa8-bdfcafad2b49-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 14:52:00 crc kubenswrapper[4854]: I1007 14:52:00.333179 4854 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/bfe44bd6-7978-41f7-afa8-bdfcafad2b49-neutron-sriov-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 07 14:52:00 crc kubenswrapper[4854]: I1007 14:52:00.333193 4854 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfe44bd6-7978-41f7-afa8-bdfcafad2b49-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:52:00 crc kubenswrapper[4854]: I1007 14:52:00.333207 4854 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bfe44bd6-7978-41f7-afa8-bdfcafad2b49-ceph\") on node \"crc\" DevicePath \"\"" Oct 07 14:52:00 crc kubenswrapper[4854]: I1007 14:52:00.333220 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7xq2\" (UniqueName: \"kubernetes.io/projected/bfe44bd6-7978-41f7-afa8-bdfcafad2b49-kube-api-access-l7xq2\") on node \"crc\" DevicePath \"\"" Oct 07 14:52:00 crc kubenswrapper[4854]: I1007 14:52:00.618437 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-zrspg" event={"ID":"bfe44bd6-7978-41f7-afa8-bdfcafad2b49","Type":"ContainerDied","Data":"ab94f774c40a5b750b78aa43a96db350e4201f9719f95c82c3a524377d69b609"} Oct 07 14:52:00 crc kubenswrapper[4854]: I1007 14:52:00.618825 4854 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="ab94f774c40a5b750b78aa43a96db350e4201f9719f95c82c3a524377d69b609" Oct 07 14:52:00 crc kubenswrapper[4854]: I1007 14:52:00.618489 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-zrspg" Oct 07 14:52:00 crc kubenswrapper[4854]: I1007 14:52:00.727474 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-ht5md"] Oct 07 14:52:00 crc kubenswrapper[4854]: E1007 14:52:00.728142 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfe44bd6-7978-41f7-afa8-bdfcafad2b49" containerName="neutron-sriov-openstack-openstack-cell1" Oct 07 14:52:00 crc kubenswrapper[4854]: I1007 14:52:00.728177 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfe44bd6-7978-41f7-afa8-bdfcafad2b49" containerName="neutron-sriov-openstack-openstack-cell1" Oct 07 14:52:00 crc kubenswrapper[4854]: E1007 14:52:00.728234 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df01e860-3743-4cc3-898e-d15f5a7a2203" containerName="extract-content" Oct 07 14:52:00 crc kubenswrapper[4854]: I1007 14:52:00.728245 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="df01e860-3743-4cc3-898e-d15f5a7a2203" containerName="extract-content" Oct 07 14:52:00 crc kubenswrapper[4854]: E1007 14:52:00.728265 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df01e860-3743-4cc3-898e-d15f5a7a2203" containerName="registry-server" Oct 07 14:52:00 crc kubenswrapper[4854]: I1007 14:52:00.728273 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="df01e860-3743-4cc3-898e-d15f5a7a2203" containerName="registry-server" Oct 07 14:52:00 crc kubenswrapper[4854]: E1007 14:52:00.728334 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df01e860-3743-4cc3-898e-d15f5a7a2203" containerName="extract-utilities" Oct 07 14:52:00 crc kubenswrapper[4854]: I1007 14:52:00.728343 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="df01e860-3743-4cc3-898e-d15f5a7a2203" containerName="extract-utilities" Oct 07 14:52:00 crc kubenswrapper[4854]: I1007 14:52:00.728671 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="df01e860-3743-4cc3-898e-d15f5a7a2203" containerName="registry-server" Oct 07 14:52:00 crc kubenswrapper[4854]: I1007 14:52:00.728726 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfe44bd6-7978-41f7-afa8-bdfcafad2b49" containerName="neutron-sriov-openstack-openstack-cell1" Oct 07 14:52:00 crc kubenswrapper[4854]: I1007 14:52:00.729952 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-ht5md" Oct 07 14:52:00 crc kubenswrapper[4854]: I1007 14:52:00.732462 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 14:52:00 crc kubenswrapper[4854]: I1007 14:52:00.732604 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-dhcp-agent-neutron-config" Oct 07 14:52:00 crc kubenswrapper[4854]: I1007 14:52:00.733842 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-n7cf5" Oct 07 14:52:00 crc kubenswrapper[4854]: I1007 14:52:00.734175 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 07 14:52:00 crc kubenswrapper[4854]: I1007 14:52:00.734875 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 07 14:52:00 crc kubenswrapper[4854]: I1007 14:52:00.740831 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-ht5md"] Oct 07 14:52:00 crc kubenswrapper[4854]: E1007 14:52:00.760205 4854 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbfe44bd6_7978_41f7_afa8_bdfcafad2b49.slice\": RecentStats: unable to find data in memory cache]" Oct 07 14:52:00 crc kubenswrapper[4854]: I1007 14:52:00.869504 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d7ff666b-5204-4865-9ba1-924d47a82480-ssh-key\") pod \"neutron-dhcp-openstack-openstack-cell1-ht5md\" (UID: \"d7ff666b-5204-4865-9ba1-924d47a82480\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-ht5md" Oct 07 14:52:00 crc kubenswrapper[4854]: I1007 14:52:00.869583 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7746h\" (UniqueName: \"kubernetes.io/projected/d7ff666b-5204-4865-9ba1-924d47a82480-kube-api-access-7746h\") pod \"neutron-dhcp-openstack-openstack-cell1-ht5md\" (UID: \"d7ff666b-5204-4865-9ba1-924d47a82480\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-ht5md" Oct 07 14:52:00 crc kubenswrapper[4854]: I1007 14:52:00.870794 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d7ff666b-5204-4865-9ba1-924d47a82480-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-ht5md\" (UID: \"d7ff666b-5204-4865-9ba1-924d47a82480\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-ht5md" Oct 07 14:52:00 crc kubenswrapper[4854]: I1007 14:52:00.871614 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d7ff666b-5204-4865-9ba1-924d47a82480-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-ht5md\" (UID: \"d7ff666b-5204-4865-9ba1-924d47a82480\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-ht5md" Oct 07 14:52:00 crc kubenswrapper[4854]: I1007 14:52:00.871888 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d7ff666b-5204-4865-9ba1-924d47a82480-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-ht5md\" (UID: \"d7ff666b-5204-4865-9ba1-924d47a82480\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-ht5md" Oct 07 14:52:00 crc kubenswrapper[4854]: I1007 14:52:00.872064 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7ff666b-5204-4865-9ba1-924d47a82480-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-ht5md\" (UID: \"d7ff666b-5204-4865-9ba1-924d47a82480\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-ht5md" Oct 07 14:52:00 crc kubenswrapper[4854]: I1007 14:52:00.973872 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d7ff666b-5204-4865-9ba1-924d47a82480-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-ht5md\" (UID: \"d7ff666b-5204-4865-9ba1-924d47a82480\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-ht5md" Oct 07 14:52:00 crc kubenswrapper[4854]: I1007 14:52:00.974003 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7ff666b-5204-4865-9ba1-924d47a82480-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-ht5md\" (UID: \"d7ff666b-5204-4865-9ba1-924d47a82480\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-ht5md" Oct 07 14:52:00 crc kubenswrapper[4854]: I1007 14:52:00.974084 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7ff666b-5204-4865-9ba1-924d47a82480-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-ht5md\" (UID: \"d7ff666b-5204-4865-9ba1-924d47a82480\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-ht5md" Oct 07 14:52:00 crc kubenswrapper[4854]: I1007 14:52:00.974127 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d7ff666b-5204-4865-9ba1-924d47a82480-ssh-key\") pod \"neutron-dhcp-openstack-openstack-cell1-ht5md\" (UID: \"d7ff666b-5204-4865-9ba1-924d47a82480\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-ht5md" Oct 07 14:52:00 crc kubenswrapper[4854]: I1007 14:52:00.974195 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7746h\" (UniqueName: \"kubernetes.io/projected/d7ff666b-5204-4865-9ba1-924d47a82480-kube-api-access-7746h\") pod \"neutron-dhcp-openstack-openstack-cell1-ht5md\" (UID: \"d7ff666b-5204-4865-9ba1-924d47a82480\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-ht5md" Oct 07 14:52:00 crc kubenswrapper[4854]: I1007 14:52:00.974236 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d7ff666b-5204-4865-9ba1-924d47a82480-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-ht5md\" (UID: \"d7ff666b-5204-4865-9ba1-924d47a82480\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-ht5md" Oct 07 14:52:00 crc kubenswrapper[4854]: I1007 14:52:00.979767 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d7ff666b-5204-4865-9ba1-924d47a82480-neutron-dhcp-agent-neutron-config-0\") pod 
\"neutron-dhcp-openstack-openstack-cell1-ht5md\" (UID: \"d7ff666b-5204-4865-9ba1-924d47a82480\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-ht5md" Oct 07 14:52:00 crc kubenswrapper[4854]: I1007 14:52:00.980292 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7ff666b-5204-4865-9ba1-924d47a82480-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-ht5md\" (UID: \"d7ff666b-5204-4865-9ba1-924d47a82480\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-ht5md" Oct 07 14:52:00 crc kubenswrapper[4854]: I1007 14:52:00.980363 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d7ff666b-5204-4865-9ba1-924d47a82480-ssh-key\") pod \"neutron-dhcp-openstack-openstack-cell1-ht5md\" (UID: \"d7ff666b-5204-4865-9ba1-924d47a82480\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-ht5md" Oct 07 14:52:00 crc kubenswrapper[4854]: I1007 14:52:00.981828 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d7ff666b-5204-4865-9ba1-924d47a82480-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-ht5md\" (UID: \"d7ff666b-5204-4865-9ba1-924d47a82480\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-ht5md" Oct 07 14:52:00 crc kubenswrapper[4854]: I1007 14:52:00.993886 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7ff666b-5204-4865-9ba1-924d47a82480-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-ht5md\" (UID: \"d7ff666b-5204-4865-9ba1-924d47a82480\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-ht5md" Oct 07 14:52:00 crc kubenswrapper[4854]: I1007 14:52:00.997950 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7746h\" (UniqueName: \"kubernetes.io/projected/d7ff666b-5204-4865-9ba1-924d47a82480-kube-api-access-7746h\") pod \"neutron-dhcp-openstack-openstack-cell1-ht5md\" (UID: \"d7ff666b-5204-4865-9ba1-924d47a82480\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-ht5md" Oct 07 14:52:01 crc kubenswrapper[4854]: I1007 14:52:01.069861 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-ht5md" Oct 07 14:52:01 crc kubenswrapper[4854]: I1007 14:52:01.679193 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-ht5md"] Oct 07 14:52:02 crc kubenswrapper[4854]: I1007 14:52:02.643485 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-ht5md" event={"ID":"d7ff666b-5204-4865-9ba1-924d47a82480","Type":"ContainerStarted","Data":"2c13735ce6c5f6837f1844e2147d3c2871d2143cac732b6a2126aa2a3f53f969"} Oct 07 14:52:03 crc kubenswrapper[4854]: I1007 14:52:03.653860 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-ht5md" event={"ID":"d7ff666b-5204-4865-9ba1-924d47a82480","Type":"ContainerStarted","Data":"1f2755ae8f04081a476f44c4844dbb09b5beff1078663ccbed5d552c24f6177f"} Oct 07 14:52:03 crc kubenswrapper[4854]: I1007 14:52:03.708895 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-dhcp-openstack-openstack-cell1-ht5md" podStartSLOduration=3.022019425 podStartE2EDuration="3.708869075s" podCreationTimestamp="2025-10-07 14:52:00 +0000 UTC" firstStartedPulling="2025-10-07 14:52:01.683585833 +0000 UTC m=+8837.671418098" lastFinishedPulling="2025-10-07 14:52:02.370435493 +0000 UTC m=+8838.358267748" observedRunningTime="2025-10-07 14:52:03.687800187 +0000 UTC m=+8839.675632472" watchObservedRunningTime="2025-10-07 14:52:03.708869075 +0000 UTC m=+8839.696701320" Oct 07 14:52:10 crc kubenswrapper[4854]: I1007 14:52:10.808129 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:52:10 crc kubenswrapper[4854]: I1007 14:52:10.809213 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:52:40 crc kubenswrapper[4854]: I1007 14:52:40.807602 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:52:40 crc kubenswrapper[4854]: I1007 14:52:40.808279 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:52:40 crc kubenswrapper[4854]: I1007 14:52:40.808344 4854 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" Oct 07 14:52:40 crc kubenswrapper[4854]: I1007 14:52:40.809754 4854 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e54b3b075b488a857a2b079c8bf74c15dbd3a9b0c33ba12b65add9ef54f2f5a0"} 
pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 14:52:40 crc kubenswrapper[4854]: I1007 14:52:40.809866 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" containerID="cri-o://e54b3b075b488a857a2b079c8bf74c15dbd3a9b0c33ba12b65add9ef54f2f5a0" gracePeriod=600 Oct 07 14:52:40 crc kubenswrapper[4854]: E1007 14:52:40.958518 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:52:41 crc kubenswrapper[4854]: I1007 14:52:41.090641 4854 generic.go:334] "Generic (PLEG): container finished" podID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerID="e54b3b075b488a857a2b079c8bf74c15dbd3a9b0c33ba12b65add9ef54f2f5a0" exitCode=0 Oct 07 14:52:41 crc kubenswrapper[4854]: I1007 14:52:41.090689 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" event={"ID":"40b8b82d-cfd5-41d7-8673-5774db092c85","Type":"ContainerDied","Data":"e54b3b075b488a857a2b079c8bf74c15dbd3a9b0c33ba12b65add9ef54f2f5a0"} Oct 07 14:52:41 crc kubenswrapper[4854]: I1007 14:52:41.090729 4854 scope.go:117] "RemoveContainer" containerID="3a01cf9fd789340dd73e9f8543d33a8348cf73c39951103b77b25055e06fb867" Oct 07 14:52:41 crc kubenswrapper[4854]: I1007 14:52:41.091476 4854 scope.go:117] "RemoveContainer" containerID="e54b3b075b488a857a2b079c8bf74c15dbd3a9b0c33ba12b65add9ef54f2f5a0" Oct 07 14:52:41 crc kubenswrapper[4854]: E1007 14:52:41.091938 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:52:52 crc kubenswrapper[4854]: I1007 14:52:52.704263 4854 scope.go:117] "RemoveContainer" containerID="e54b3b075b488a857a2b079c8bf74c15dbd3a9b0c33ba12b65add9ef54f2f5a0" Oct 07 14:52:52 crc kubenswrapper[4854]: E1007 14:52:52.705975 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:53:06 crc kubenswrapper[4854]: I1007 14:53:06.704557 4854 scope.go:117] "RemoveContainer" containerID="e54b3b075b488a857a2b079c8bf74c15dbd3a9b0c33ba12b65add9ef54f2f5a0" Oct 07 14:53:06 crc kubenswrapper[4854]: E1007 14:53:06.705315 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:53:17 crc kubenswrapper[4854]: I1007 14:53:17.703212 4854 scope.go:117] "RemoveContainer" containerID="e54b3b075b488a857a2b079c8bf74c15dbd3a9b0c33ba12b65add9ef54f2f5a0" Oct 07 14:53:17 crc kubenswrapper[4854]: E1007 14:53:17.704294 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:53:28 crc kubenswrapper[4854]: I1007 14:53:28.703527 4854 scope.go:117] "RemoveContainer" containerID="e54b3b075b488a857a2b079c8bf74c15dbd3a9b0c33ba12b65add9ef54f2f5a0" Oct 07 14:53:28 crc kubenswrapper[4854]: E1007 14:53:28.704312 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:53:40 crc kubenswrapper[4854]: I1007 14:53:40.703819 4854 scope.go:117] "RemoveContainer" containerID="e54b3b075b488a857a2b079c8bf74c15dbd3a9b0c33ba12b65add9ef54f2f5a0" Oct 07 14:53:40 crc kubenswrapper[4854]: E1007 14:53:40.706400 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:53:55 crc kubenswrapper[4854]: I1007 14:53:55.704428 4854 scope.go:117] "RemoveContainer" containerID="e54b3b075b488a857a2b079c8bf74c15dbd3a9b0c33ba12b65add9ef54f2f5a0" Oct 07 14:53:55 crc kubenswrapper[4854]: E1007 14:53:55.705601 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:54:08 crc kubenswrapper[4854]: I1007 14:54:08.703469 4854 scope.go:117] "RemoveContainer" containerID="e54b3b075b488a857a2b079c8bf74c15dbd3a9b0c33ba12b65add9ef54f2f5a0" Oct 07 14:54:08 crc kubenswrapper[4854]: E1007 14:54:08.719578 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:54:22 crc kubenswrapper[4854]: I1007 14:54:22.703433 4854 scope.go:117] "RemoveContainer" containerID="e54b3b075b488a857a2b079c8bf74c15dbd3a9b0c33ba12b65add9ef54f2f5a0" Oct 07 14:54:22 crc kubenswrapper[4854]: E1007 14:54:22.704376 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:54:34 crc kubenswrapper[4854]: I1007 14:54:34.709736 4854 scope.go:117] "RemoveContainer" containerID="e54b3b075b488a857a2b079c8bf74c15dbd3a9b0c33ba12b65add9ef54f2f5a0" Oct 07 14:54:34 crc kubenswrapper[4854]: E1007 14:54:34.710709 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:54:46 crc kubenswrapper[4854]: I1007 14:54:46.582888 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pmdqf"] Oct 07 14:54:46 crc kubenswrapper[4854]: I1007 14:54:46.599658 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pmdqf" Oct 07 14:54:46 crc kubenswrapper[4854]: I1007 14:54:46.685307 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pmdqf"] Oct 07 14:54:46 crc kubenswrapper[4854]: I1007 14:54:46.767476 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a85cc1f7-60d6-4c65-ac0c-2584ae44dad4-catalog-content\") pod \"redhat-operators-pmdqf\" (UID: \"a85cc1f7-60d6-4c65-ac0c-2584ae44dad4\") " pod="openshift-marketplace/redhat-operators-pmdqf" Oct 07 14:54:46 crc kubenswrapper[4854]: I1007 14:54:46.767551 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a85cc1f7-60d6-4c65-ac0c-2584ae44dad4-utilities\") pod \"redhat-operators-pmdqf\" (UID: \"a85cc1f7-60d6-4c65-ac0c-2584ae44dad4\") " pod="openshift-marketplace/redhat-operators-pmdqf" Oct 07 14:54:46 crc kubenswrapper[4854]: I1007 14:54:46.767566 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrkft\" (UniqueName: \"kubernetes.io/projected/a85cc1f7-60d6-4c65-ac0c-2584ae44dad4-kube-api-access-nrkft\") pod \"redhat-operators-pmdqf\" (UID: \"a85cc1f7-60d6-4c65-ac0c-2584ae44dad4\") " pod="openshift-marketplace/redhat-operators-pmdqf" Oct 07 14:54:46 crc kubenswrapper[4854]: I1007 14:54:46.871972 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a85cc1f7-60d6-4c65-ac0c-2584ae44dad4-catalog-content\") pod \"redhat-operators-pmdqf\" (UID: \"a85cc1f7-60d6-4c65-ac0c-2584ae44dad4\") " 
pod="openshift-marketplace/redhat-operators-pmdqf" Oct 07 14:54:46 crc kubenswrapper[4854]: I1007 14:54:46.872034 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a85cc1f7-60d6-4c65-ac0c-2584ae44dad4-utilities\") pod \"redhat-operators-pmdqf\" (UID: \"a85cc1f7-60d6-4c65-ac0c-2584ae44dad4\") " pod="openshift-marketplace/redhat-operators-pmdqf" Oct 07 14:54:46 crc kubenswrapper[4854]: I1007 14:54:46.872052 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrkft\" (UniqueName: \"kubernetes.io/projected/a85cc1f7-60d6-4c65-ac0c-2584ae44dad4-kube-api-access-nrkft\") pod \"redhat-operators-pmdqf\" (UID: \"a85cc1f7-60d6-4c65-ac0c-2584ae44dad4\") " pod="openshift-marketplace/redhat-operators-pmdqf" Oct 07 14:54:46 crc kubenswrapper[4854]: I1007 14:54:46.874423 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a85cc1f7-60d6-4c65-ac0c-2584ae44dad4-catalog-content\") pod \"redhat-operators-pmdqf\" (UID: \"a85cc1f7-60d6-4c65-ac0c-2584ae44dad4\") " pod="openshift-marketplace/redhat-operators-pmdqf" Oct 07 14:54:46 crc kubenswrapper[4854]: I1007 14:54:46.874722 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a85cc1f7-60d6-4c65-ac0c-2584ae44dad4-utilities\") pod \"redhat-operators-pmdqf\" (UID: \"a85cc1f7-60d6-4c65-ac0c-2584ae44dad4\") " pod="openshift-marketplace/redhat-operators-pmdqf" Oct 07 14:54:46 crc kubenswrapper[4854]: I1007 14:54:46.903808 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrkft\" (UniqueName: \"kubernetes.io/projected/a85cc1f7-60d6-4c65-ac0c-2584ae44dad4-kube-api-access-nrkft\") pod \"redhat-operators-pmdqf\" (UID: \"a85cc1f7-60d6-4c65-ac0c-2584ae44dad4\") " pod="openshift-marketplace/redhat-operators-pmdqf" Oct 07 14:54:46 crc kubenswrapper[4854]: I1007 14:54:46.986321 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pmdqf" Oct 07 14:54:47 crc kubenswrapper[4854]: I1007 14:54:47.469938 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pmdqf"] Oct 07 14:54:47 crc kubenswrapper[4854]: W1007 14:54:47.490514 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda85cc1f7_60d6_4c65_ac0c_2584ae44dad4.slice/crio-3f1eb64abac07f745c90e5cb48f3b02e7c6f4dc2edd9404f1dabaa6e5c9ac88f WatchSource:0}: Error finding container 3f1eb64abac07f745c90e5cb48f3b02e7c6f4dc2edd9404f1dabaa6e5c9ac88f: Status 404 returned error can't find the container with id 3f1eb64abac07f745c90e5cb48f3b02e7c6f4dc2edd9404f1dabaa6e5c9ac88f Oct 07 14:54:47 crc kubenswrapper[4854]: I1007 14:54:47.681431 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pmdqf" event={"ID":"a85cc1f7-60d6-4c65-ac0c-2584ae44dad4","Type":"ContainerStarted","Data":"3f1eb64abac07f745c90e5cb48f3b02e7c6f4dc2edd9404f1dabaa6e5c9ac88f"} Oct 07 14:54:48 crc kubenswrapper[4854]: I1007 14:54:48.694209 4854 generic.go:334] "Generic (PLEG): container finished" podID="a85cc1f7-60d6-4c65-ac0c-2584ae44dad4" containerID="286ad5c795ef3b1023a2274713d26098e7622da8db08b509bed1997501eb1a09" exitCode=0 Oct 07 14:54:48 crc kubenswrapper[4854]: I1007 14:54:48.694262 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pmdqf" event={"ID":"a85cc1f7-60d6-4c65-ac0c-2584ae44dad4","Type":"ContainerDied","Data":"286ad5c795ef3b1023a2274713d26098e7622da8db08b509bed1997501eb1a09"} Oct 07 14:54:48 crc kubenswrapper[4854]: I1007 14:54:48.703524 4854 scope.go:117] "RemoveContainer" containerID="e54b3b075b488a857a2b079c8bf74c15dbd3a9b0c33ba12b65add9ef54f2f5a0" Oct 07 14:54:48 crc kubenswrapper[4854]: E1007 14:54:48.704376 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:54:50 crc kubenswrapper[4854]: I1007 14:54:50.727675 4854 generic.go:334] "Generic (PLEG): container finished" podID="a85cc1f7-60d6-4c65-ac0c-2584ae44dad4" containerID="e9bea8675c5456b4e21f7720323759bf7d22ab4d24245654fdf4a5e6d86ff966" exitCode=0 Oct 07 14:54:50 crc kubenswrapper[4854]: I1007 14:54:50.727766 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pmdqf" event={"ID":"a85cc1f7-60d6-4c65-ac0c-2584ae44dad4","Type":"ContainerDied","Data":"e9bea8675c5456b4e21f7720323759bf7d22ab4d24245654fdf4a5e6d86ff966"} Oct 07 14:54:51 crc kubenswrapper[4854]: I1007 14:54:51.745303 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pmdqf" event={"ID":"a85cc1f7-60d6-4c65-ac0c-2584ae44dad4","Type":"ContainerStarted","Data":"4047d306af6b2602b115264b1c59f7350817ef60f3acfd84f0ad7b19accf59f6"} Oct 07 14:54:51 crc kubenswrapper[4854]: I1007 14:54:51.771016 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pmdqf" podStartSLOduration=3.027136964 podStartE2EDuration="5.770990932s" podCreationTimestamp="2025-10-07 14:54:46 +0000 UTC" 
firstStartedPulling="2025-10-07 14:54:48.696552534 +0000 UTC m=+9004.684384829" lastFinishedPulling="2025-10-07 14:54:51.440406512 +0000 UTC m=+9007.428238797" observedRunningTime="2025-10-07 14:54:51.768057957 +0000 UTC m=+9007.755890212" watchObservedRunningTime="2025-10-07 14:54:51.770990932 +0000 UTC m=+9007.758823187" Oct 07 14:54:56 crc kubenswrapper[4854]: I1007 14:54:56.986505 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pmdqf" Oct 07 14:54:56 crc kubenswrapper[4854]: I1007 14:54:56.987232 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pmdqf" Oct 07 14:54:57 crc kubenswrapper[4854]: I1007 14:54:57.062861 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pmdqf" Oct 07 14:54:57 crc kubenswrapper[4854]: I1007 14:54:57.886360 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pmdqf" Oct 07 14:54:57 crc kubenswrapper[4854]: I1007 14:54:57.939278 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pmdqf"] Oct 07 14:54:59 crc kubenswrapper[4854]: I1007 14:54:59.847111 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pmdqf" podUID="a85cc1f7-60d6-4c65-ac0c-2584ae44dad4" containerName="registry-server" containerID="cri-o://4047d306af6b2602b115264b1c59f7350817ef60f3acfd84f0ad7b19accf59f6" gracePeriod=2 Oct 07 14:55:00 crc kubenswrapper[4854]: I1007 14:55:00.575594 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pmdqf" Oct 07 14:55:00 crc kubenswrapper[4854]: I1007 14:55:00.684888 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrkft\" (UniqueName: \"kubernetes.io/projected/a85cc1f7-60d6-4c65-ac0c-2584ae44dad4-kube-api-access-nrkft\") pod \"a85cc1f7-60d6-4c65-ac0c-2584ae44dad4\" (UID: \"a85cc1f7-60d6-4c65-ac0c-2584ae44dad4\") " Oct 07 14:55:00 crc kubenswrapper[4854]: I1007 14:55:00.685456 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a85cc1f7-60d6-4c65-ac0c-2584ae44dad4-utilities\") pod \"a85cc1f7-60d6-4c65-ac0c-2584ae44dad4\" (UID: \"a85cc1f7-60d6-4c65-ac0c-2584ae44dad4\") " Oct 07 14:55:00 crc kubenswrapper[4854]: I1007 14:55:00.685542 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a85cc1f7-60d6-4c65-ac0c-2584ae44dad4-catalog-content\") pod \"a85cc1f7-60d6-4c65-ac0c-2584ae44dad4\" (UID: \"a85cc1f7-60d6-4c65-ac0c-2584ae44dad4\") " Oct 07 14:55:00 crc kubenswrapper[4854]: I1007 14:55:00.686385 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a85cc1f7-60d6-4c65-ac0c-2584ae44dad4-utilities" (OuterVolumeSpecName: "utilities") pod "a85cc1f7-60d6-4c65-ac0c-2584ae44dad4" (UID: "a85cc1f7-60d6-4c65-ac0c-2584ae44dad4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:55:00 crc kubenswrapper[4854]: I1007 14:55:00.691219 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a85cc1f7-60d6-4c65-ac0c-2584ae44dad4-kube-api-access-nrkft" (OuterVolumeSpecName: "kube-api-access-nrkft") pod "a85cc1f7-60d6-4c65-ac0c-2584ae44dad4" (UID: "a85cc1f7-60d6-4c65-ac0c-2584ae44dad4"). InnerVolumeSpecName "kube-api-access-nrkft". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:55:00 crc kubenswrapper[4854]: I1007 14:55:00.779271 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a85cc1f7-60d6-4c65-ac0c-2584ae44dad4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a85cc1f7-60d6-4c65-ac0c-2584ae44dad4" (UID: "a85cc1f7-60d6-4c65-ac0c-2584ae44dad4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:55:00 crc kubenswrapper[4854]: I1007 14:55:00.787745 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a85cc1f7-60d6-4c65-ac0c-2584ae44dad4-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 14:55:00 crc kubenswrapper[4854]: I1007 14:55:00.787782 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a85cc1f7-60d6-4c65-ac0c-2584ae44dad4-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 14:55:00 crc kubenswrapper[4854]: I1007 14:55:00.787799 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrkft\" (UniqueName: \"kubernetes.io/projected/a85cc1f7-60d6-4c65-ac0c-2584ae44dad4-kube-api-access-nrkft\") on node \"crc\" DevicePath \"\"" Oct 07 14:55:00 crc kubenswrapper[4854]: I1007 14:55:00.858107 4854 generic.go:334] "Generic (PLEG): container finished" podID="a85cc1f7-60d6-4c65-ac0c-2584ae44dad4" containerID="4047d306af6b2602b115264b1c59f7350817ef60f3acfd84f0ad7b19accf59f6" exitCode=0 Oct 07 14:55:00 crc kubenswrapper[4854]: I1007 14:55:00.858172 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pmdqf" event={"ID":"a85cc1f7-60d6-4c65-ac0c-2584ae44dad4","Type":"ContainerDied","Data":"4047d306af6b2602b115264b1c59f7350817ef60f3acfd84f0ad7b19accf59f6"} Oct 07 14:55:00 crc kubenswrapper[4854]: I1007 14:55:00.859436 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pmdqf" event={"ID":"a85cc1f7-60d6-4c65-ac0c-2584ae44dad4","Type":"ContainerDied","Data":"3f1eb64abac07f745c90e5cb48f3b02e7c6f4dc2edd9404f1dabaa6e5c9ac88f"} Oct 07 14:55:00 crc kubenswrapper[4854]: I1007 14:55:00.858233 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pmdqf" Oct 07 14:55:00 crc kubenswrapper[4854]: I1007 14:55:00.859578 4854 scope.go:117] "RemoveContainer" containerID="4047d306af6b2602b115264b1c59f7350817ef60f3acfd84f0ad7b19accf59f6" Oct 07 14:55:00 crc kubenswrapper[4854]: I1007 14:55:00.887121 4854 scope.go:117] "RemoveContainer" containerID="e9bea8675c5456b4e21f7720323759bf7d22ab4d24245654fdf4a5e6d86ff966" Oct 07 14:55:00 crc kubenswrapper[4854]: I1007 14:55:00.907695 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pmdqf"] Oct 07 14:55:00 crc kubenswrapper[4854]: I1007 14:55:00.917167 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pmdqf"] Oct 07 14:55:00 crc kubenswrapper[4854]: I1007 14:55:00.921861 4854 scope.go:117] "RemoveContainer" containerID="286ad5c795ef3b1023a2274713d26098e7622da8db08b509bed1997501eb1a09" Oct 07 14:55:00 crc kubenswrapper[4854]: I1007 14:55:00.955940 4854 scope.go:117] "RemoveContainer" containerID="4047d306af6b2602b115264b1c59f7350817ef60f3acfd84f0ad7b19accf59f6" Oct 07 14:55:00 crc kubenswrapper[4854]: E1007 14:55:00.956670 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4047d306af6b2602b115264b1c59f7350817ef60f3acfd84f0ad7b19accf59f6\": container with ID starting with 4047d306af6b2602b115264b1c59f7350817ef60f3acfd84f0ad7b19accf59f6 not found: ID does not exist" containerID="4047d306af6b2602b115264b1c59f7350817ef60f3acfd84f0ad7b19accf59f6" Oct 07 14:55:00 crc kubenswrapper[4854]: I1007 14:55:00.956744 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4047d306af6b2602b115264b1c59f7350817ef60f3acfd84f0ad7b19accf59f6"} err="failed to get container status \"4047d306af6b2602b115264b1c59f7350817ef60f3acfd84f0ad7b19accf59f6\": rpc error: code = NotFound desc = could not find container \"4047d306af6b2602b115264b1c59f7350817ef60f3acfd84f0ad7b19accf59f6\": container with ID starting with 4047d306af6b2602b115264b1c59f7350817ef60f3acfd84f0ad7b19accf59f6 not found: ID does not exist" Oct 07 14:55:00 crc kubenswrapper[4854]: I1007 14:55:00.956789 4854 scope.go:117] "RemoveContainer" containerID="e9bea8675c5456b4e21f7720323759bf7d22ab4d24245654fdf4a5e6d86ff966" Oct 07 14:55:00 crc kubenswrapper[4854]: E1007 14:55:00.957208 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9bea8675c5456b4e21f7720323759bf7d22ab4d24245654fdf4a5e6d86ff966\": container with ID starting with e9bea8675c5456b4e21f7720323759bf7d22ab4d24245654fdf4a5e6d86ff966 not found: ID does not exist" containerID="e9bea8675c5456b4e21f7720323759bf7d22ab4d24245654fdf4a5e6d86ff966" Oct 07 14:55:00 crc kubenswrapper[4854]: I1007 14:55:00.957244 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9bea8675c5456b4e21f7720323759bf7d22ab4d24245654fdf4a5e6d86ff966"} err="failed to get container status \"e9bea8675c5456b4e21f7720323759bf7d22ab4d24245654fdf4a5e6d86ff966\": rpc error: code = NotFound desc = could not find container \"e9bea8675c5456b4e21f7720323759bf7d22ab4d24245654fdf4a5e6d86ff966\": container with ID starting with e9bea8675c5456b4e21f7720323759bf7d22ab4d24245654fdf4a5e6d86ff966 not found: ID does not exist" Oct 07 14:55:00 crc kubenswrapper[4854]: I1007 14:55:00.957266 4854 scope.go:117] "RemoveContainer" 
containerID="286ad5c795ef3b1023a2274713d26098e7622da8db08b509bed1997501eb1a09" Oct 07 14:55:00 crc kubenswrapper[4854]: E1007 14:55:00.957719 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"286ad5c795ef3b1023a2274713d26098e7622da8db08b509bed1997501eb1a09\": container with ID starting with 286ad5c795ef3b1023a2274713d26098e7622da8db08b509bed1997501eb1a09 not found: ID does not exist" containerID="286ad5c795ef3b1023a2274713d26098e7622da8db08b509bed1997501eb1a09" Oct 07 14:55:00 crc kubenswrapper[4854]: I1007 14:55:00.957801 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"286ad5c795ef3b1023a2274713d26098e7622da8db08b509bed1997501eb1a09"} err="failed to get container status \"286ad5c795ef3b1023a2274713d26098e7622da8db08b509bed1997501eb1a09\": rpc error: code = NotFound desc = could not find container \"286ad5c795ef3b1023a2274713d26098e7622da8db08b509bed1997501eb1a09\": container with ID starting with 286ad5c795ef3b1023a2274713d26098e7622da8db08b509bed1997501eb1a09 not found: ID does not exist" Oct 07 14:55:02 crc kubenswrapper[4854]: I1007 14:55:02.718140 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a85cc1f7-60d6-4c65-ac0c-2584ae44dad4" path="/var/lib/kubelet/pods/a85cc1f7-60d6-4c65-ac0c-2584ae44dad4/volumes" Oct 07 14:55:03 crc kubenswrapper[4854]: I1007 14:55:03.703688 4854 scope.go:117] "RemoveContainer" containerID="e54b3b075b488a857a2b079c8bf74c15dbd3a9b0c33ba12b65add9ef54f2f5a0" Oct 07 14:55:03 crc kubenswrapper[4854]: E1007 14:55:03.704635 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:55:14 crc kubenswrapper[4854]: I1007 14:55:14.713653 4854 scope.go:117] "RemoveContainer" containerID="e54b3b075b488a857a2b079c8bf74c15dbd3a9b0c33ba12b65add9ef54f2f5a0" Oct 07 14:55:14 crc kubenswrapper[4854]: E1007 14:55:14.714405 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:55:26 crc kubenswrapper[4854]: I1007 14:55:26.707488 4854 scope.go:117] "RemoveContainer" containerID="e54b3b075b488a857a2b079c8bf74c15dbd3a9b0c33ba12b65add9ef54f2f5a0" Oct 07 14:55:26 crc kubenswrapper[4854]: E1007 14:55:26.708196 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:55:38 crc kubenswrapper[4854]: I1007 14:55:38.703407 4854 scope.go:117] "RemoveContainer" 
containerID="e54b3b075b488a857a2b079c8bf74c15dbd3a9b0c33ba12b65add9ef54f2f5a0" Oct 07 14:55:38 crc kubenswrapper[4854]: E1007 14:55:38.704295 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:55:48 crc kubenswrapper[4854]: I1007 14:55:48.380688 4854 generic.go:334] "Generic (PLEG): container finished" podID="d7ff666b-5204-4865-9ba1-924d47a82480" containerID="1f2755ae8f04081a476f44c4844dbb09b5beff1078663ccbed5d552c24f6177f" exitCode=0 Oct 07 14:55:48 crc kubenswrapper[4854]: I1007 14:55:48.380789 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-ht5md" event={"ID":"d7ff666b-5204-4865-9ba1-924d47a82480","Type":"ContainerDied","Data":"1f2755ae8f04081a476f44c4844dbb09b5beff1078663ccbed5d552c24f6177f"} Oct 07 14:55:49 crc kubenswrapper[4854]: I1007 14:55:49.702743 4854 scope.go:117] "RemoveContainer" containerID="e54b3b075b488a857a2b079c8bf74c15dbd3a9b0c33ba12b65add9ef54f2f5a0" Oct 07 14:55:49 crc kubenswrapper[4854]: E1007 14:55:49.703500 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:55:49 crc kubenswrapper[4854]: I1007 14:55:49.899136 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-ht5md" Oct 07 14:55:49 crc kubenswrapper[4854]: I1007 14:55:49.998247 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d7ff666b-5204-4865-9ba1-924d47a82480-ceph\") pod \"d7ff666b-5204-4865-9ba1-924d47a82480\" (UID: \"d7ff666b-5204-4865-9ba1-924d47a82480\") " Oct 07 14:55:49 crc kubenswrapper[4854]: I1007 14:55:49.998321 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7746h\" (UniqueName: \"kubernetes.io/projected/d7ff666b-5204-4865-9ba1-924d47a82480-kube-api-access-7746h\") pod \"d7ff666b-5204-4865-9ba1-924d47a82480\" (UID: \"d7ff666b-5204-4865-9ba1-924d47a82480\") " Oct 07 14:55:49 crc kubenswrapper[4854]: I1007 14:55:49.998359 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d7ff666b-5204-4865-9ba1-924d47a82480-ssh-key\") pod \"d7ff666b-5204-4865-9ba1-924d47a82480\" (UID: \"d7ff666b-5204-4865-9ba1-924d47a82480\") " Oct 07 14:55:49 crc kubenswrapper[4854]: I1007 14:55:49.998426 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7ff666b-5204-4865-9ba1-924d47a82480-inventory\") pod \"d7ff666b-5204-4865-9ba1-924d47a82480\" (UID: \"d7ff666b-5204-4865-9ba1-924d47a82480\") " Oct 07 14:55:49 crc kubenswrapper[4854]: I1007 14:55:49.998445 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d7ff666b-5204-4865-9ba1-924d47a82480-neutron-dhcp-agent-neutron-config-0\") pod \"d7ff666b-5204-4865-9ba1-924d47a82480\" (UID: \"d7ff666b-5204-4865-9ba1-924d47a82480\") " Oct 07 14:55:49 crc kubenswrapper[4854]: I1007 14:55:49.998509 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7ff666b-5204-4865-9ba1-924d47a82480-neutron-dhcp-combined-ca-bundle\") pod \"d7ff666b-5204-4865-9ba1-924d47a82480\" (UID: \"d7ff666b-5204-4865-9ba1-924d47a82480\") " Oct 07 14:55:50 crc kubenswrapper[4854]: I1007 14:55:50.003634 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7ff666b-5204-4865-9ba1-924d47a82480-ceph" (OuterVolumeSpecName: "ceph") pod "d7ff666b-5204-4865-9ba1-924d47a82480" (UID: "d7ff666b-5204-4865-9ba1-924d47a82480"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:55:50 crc kubenswrapper[4854]: I1007 14:55:50.003743 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7ff666b-5204-4865-9ba1-924d47a82480-kube-api-access-7746h" (OuterVolumeSpecName: "kube-api-access-7746h") pod "d7ff666b-5204-4865-9ba1-924d47a82480" (UID: "d7ff666b-5204-4865-9ba1-924d47a82480"). InnerVolumeSpecName "kube-api-access-7746h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:55:50 crc kubenswrapper[4854]: I1007 14:55:50.008357 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7ff666b-5204-4865-9ba1-924d47a82480-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "d7ff666b-5204-4865-9ba1-924d47a82480" (UID: "d7ff666b-5204-4865-9ba1-924d47a82480"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:55:50 crc kubenswrapper[4854]: I1007 14:55:50.027930 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7ff666b-5204-4865-9ba1-924d47a82480-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d7ff666b-5204-4865-9ba1-924d47a82480" (UID: "d7ff666b-5204-4865-9ba1-924d47a82480"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:55:50 crc kubenswrapper[4854]: I1007 14:55:50.030777 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7ff666b-5204-4865-9ba1-924d47a82480-neutron-dhcp-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-dhcp-agent-neutron-config-0") pod "d7ff666b-5204-4865-9ba1-924d47a82480" (UID: "d7ff666b-5204-4865-9ba1-924d47a82480"). InnerVolumeSpecName "neutron-dhcp-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:55:50 crc kubenswrapper[4854]: I1007 14:55:50.038337 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7ff666b-5204-4865-9ba1-924d47a82480-inventory" (OuterVolumeSpecName: "inventory") pod "d7ff666b-5204-4865-9ba1-924d47a82480" (UID: "d7ff666b-5204-4865-9ba1-924d47a82480"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:55:50 crc kubenswrapper[4854]: I1007 14:55:50.101058 4854 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d7ff666b-5204-4865-9ba1-924d47a82480-ceph\") on node \"crc\" DevicePath \"\"" Oct 07 14:55:50 crc kubenswrapper[4854]: I1007 14:55:50.101096 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7746h\" (UniqueName: \"kubernetes.io/projected/d7ff666b-5204-4865-9ba1-924d47a82480-kube-api-access-7746h\") on node \"crc\" DevicePath \"\"" Oct 07 14:55:50 crc kubenswrapper[4854]: I1007 14:55:50.101108 4854 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d7ff666b-5204-4865-9ba1-924d47a82480-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 14:55:50 crc kubenswrapper[4854]: I1007 14:55:50.101115 4854 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7ff666b-5204-4865-9ba1-924d47a82480-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 14:55:50 crc kubenswrapper[4854]: I1007 14:55:50.101126 4854 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d7ff666b-5204-4865-9ba1-924d47a82480-neutron-dhcp-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 07 14:55:50 crc kubenswrapper[4854]: I1007 14:55:50.101136 4854 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7ff666b-5204-4865-9ba1-924d47a82480-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:55:50 crc kubenswrapper[4854]: I1007 14:55:50.400414 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-ht5md" event={"ID":"d7ff666b-5204-4865-9ba1-924d47a82480","Type":"ContainerDied","Data":"2c13735ce6c5f6837f1844e2147d3c2871d2143cac732b6a2126aa2a3f53f969"} Oct 07 14:55:50 crc kubenswrapper[4854]: I1007 14:55:50.400453 4854 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="2c13735ce6c5f6837f1844e2147d3c2871d2143cac732b6a2126aa2a3f53f969" Oct 07 14:55:50 crc kubenswrapper[4854]: I1007 14:55:50.400505 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-ht5md" Oct 07 14:56:01 crc kubenswrapper[4854]: I1007 14:56:01.703516 4854 scope.go:117] "RemoveContainer" containerID="e54b3b075b488a857a2b079c8bf74c15dbd3a9b0c33ba12b65add9ef54f2f5a0" Oct 07 14:56:01 crc kubenswrapper[4854]: E1007 14:56:01.704787 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:56:06 crc kubenswrapper[4854]: I1007 14:56:06.047963 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 07 14:56:06 crc kubenswrapper[4854]: I1007 14:56:06.049583 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="840cffec-0fba-4a84-8d36-4c0cc26cadff" containerName="nova-cell0-conductor-conductor" containerID="cri-o://1c7fea179c8c721785e4fd31cd968128bd001e619047793a11f3cbc54a0a86f7" gracePeriod=30 Oct 07 14:56:06 crc kubenswrapper[4854]: I1007 14:56:06.084070 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 07 14:56:06 crc kubenswrapper[4854]: I1007 14:56:06.084448 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="3619fac8-bf60-4908-a9ef-bb5af339f530" containerName="nova-cell1-conductor-conductor" containerID="cri-o://0748a0f6c6ff32f97816e5d32c085f155fba2d7e63486d88aea7d29d50169426" gracePeriod=30 Oct 07 14:56:06 crc kubenswrapper[4854]: E1007 14:56:06.472215 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1c7fea179c8c721785e4fd31cd968128bd001e619047793a11f3cbc54a0a86f7" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 07 14:56:06 crc kubenswrapper[4854]: E1007 14:56:06.474450 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1c7fea179c8c721785e4fd31cd968128bd001e619047793a11f3cbc54a0a86f7" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 07 14:56:06 crc kubenswrapper[4854]: E1007 14:56:06.475890 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1c7fea179c8c721785e4fd31cd968128bd001e619047793a11f3cbc54a0a86f7" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 07 14:56:06 crc kubenswrapper[4854]: E1007 14:56:06.476035 4854 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="840cffec-0fba-4a84-8d36-4c0cc26cadff" 
containerName="nova-cell0-conductor-conductor" Oct 07 14:56:06 crc kubenswrapper[4854]: I1007 14:56:06.665245 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-llp86"] Oct 07 14:56:06 crc kubenswrapper[4854]: E1007 14:56:06.665764 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7ff666b-5204-4865-9ba1-924d47a82480" containerName="neutron-dhcp-openstack-openstack-cell1" Oct 07 14:56:06 crc kubenswrapper[4854]: I1007 14:56:06.665793 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7ff666b-5204-4865-9ba1-924d47a82480" containerName="neutron-dhcp-openstack-openstack-cell1" Oct 07 14:56:06 crc kubenswrapper[4854]: E1007 14:56:06.665831 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a85cc1f7-60d6-4c65-ac0c-2584ae44dad4" containerName="extract-utilities" Oct 07 14:56:06 crc kubenswrapper[4854]: I1007 14:56:06.665840 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="a85cc1f7-60d6-4c65-ac0c-2584ae44dad4" containerName="extract-utilities" Oct 07 14:56:06 crc kubenswrapper[4854]: E1007 14:56:06.665860 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a85cc1f7-60d6-4c65-ac0c-2584ae44dad4" containerName="registry-server" Oct 07 14:56:06 crc kubenswrapper[4854]: I1007 14:56:06.665867 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="a85cc1f7-60d6-4c65-ac0c-2584ae44dad4" containerName="registry-server" Oct 07 14:56:06 crc kubenswrapper[4854]: E1007 14:56:06.665891 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a85cc1f7-60d6-4c65-ac0c-2584ae44dad4" containerName="extract-content" Oct 07 14:56:06 crc kubenswrapper[4854]: I1007 14:56:06.665898 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="a85cc1f7-60d6-4c65-ac0c-2584ae44dad4" containerName="extract-content" Oct 07 14:56:06 crc kubenswrapper[4854]: I1007 14:56:06.666141 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="a85cc1f7-60d6-4c65-ac0c-2584ae44dad4" containerName="registry-server" Oct 07 14:56:06 crc kubenswrapper[4854]: I1007 14:56:06.666186 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7ff666b-5204-4865-9ba1-924d47a82480" containerName="neutron-dhcp-openstack-openstack-cell1" Oct 07 14:56:06 crc kubenswrapper[4854]: I1007 14:56:06.668524 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-llp86" Oct 07 14:56:06 crc kubenswrapper[4854]: I1007 14:56:06.684552 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-llp86"] Oct 07 14:56:06 crc kubenswrapper[4854]: I1007 14:56:06.806910 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e636ae9-5753-49ab-b4ab-010572562e7d-catalog-content\") pod \"redhat-marketplace-llp86\" (UID: \"5e636ae9-5753-49ab-b4ab-010572562e7d\") " pod="openshift-marketplace/redhat-marketplace-llp86" Oct 07 14:56:06 crc kubenswrapper[4854]: I1007 14:56:06.807071 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e636ae9-5753-49ab-b4ab-010572562e7d-utilities\") pod \"redhat-marketplace-llp86\" (UID: \"5e636ae9-5753-49ab-b4ab-010572562e7d\") " pod="openshift-marketplace/redhat-marketplace-llp86" Oct 07 14:56:06 crc kubenswrapper[4854]: I1007 14:56:06.807140 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpgfv\" (UniqueName: \"kubernetes.io/projected/5e636ae9-5753-49ab-b4ab-010572562e7d-kube-api-access-zpgfv\") pod \"redhat-marketplace-llp86\" (UID: \"5e636ae9-5753-49ab-b4ab-010572562e7d\") " pod="openshift-marketplace/redhat-marketplace-llp86" Oct 07 14:56:06 crc kubenswrapper[4854]: I1007 14:56:06.909736 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e636ae9-5753-49ab-b4ab-010572562e7d-utilities\") pod \"redhat-marketplace-llp86\" (UID: \"5e636ae9-5753-49ab-b4ab-010572562e7d\") " pod="openshift-marketplace/redhat-marketplace-llp86" Oct 07 14:56:06 crc kubenswrapper[4854]: I1007 14:56:06.909840 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpgfv\" (UniqueName: \"kubernetes.io/projected/5e636ae9-5753-49ab-b4ab-010572562e7d-kube-api-access-zpgfv\") pod \"redhat-marketplace-llp86\" (UID: \"5e636ae9-5753-49ab-b4ab-010572562e7d\") " pod="openshift-marketplace/redhat-marketplace-llp86" Oct 07 14:56:06 crc kubenswrapper[4854]: I1007 14:56:06.909992 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e636ae9-5753-49ab-b4ab-010572562e7d-catalog-content\") pod \"redhat-marketplace-llp86\" (UID: \"5e636ae9-5753-49ab-b4ab-010572562e7d\") " pod="openshift-marketplace/redhat-marketplace-llp86" Oct 07 14:56:06 crc kubenswrapper[4854]: I1007 14:56:06.910278 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e636ae9-5753-49ab-b4ab-010572562e7d-utilities\") pod \"redhat-marketplace-llp86\" (UID: \"5e636ae9-5753-49ab-b4ab-010572562e7d\") " pod="openshift-marketplace/redhat-marketplace-llp86" Oct 07 14:56:06 crc kubenswrapper[4854]: I1007 14:56:06.910575 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e636ae9-5753-49ab-b4ab-010572562e7d-catalog-content\") pod \"redhat-marketplace-llp86\" (UID: \"5e636ae9-5753-49ab-b4ab-010572562e7d\") " pod="openshift-marketplace/redhat-marketplace-llp86" Oct 07 14:56:06 crc kubenswrapper[4854]: I1007 14:56:06.931937 4854 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-zpgfv\" (UniqueName: \"kubernetes.io/projected/5e636ae9-5753-49ab-b4ab-010572562e7d-kube-api-access-zpgfv\") pod \"redhat-marketplace-llp86\" (UID: \"5e636ae9-5753-49ab-b4ab-010572562e7d\") " pod="openshift-marketplace/redhat-marketplace-llp86" Oct 07 14:56:06 crc kubenswrapper[4854]: I1007 14:56:06.993955 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-llp86" Oct 07 14:56:07 crc kubenswrapper[4854]: I1007 14:56:07.443046 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-llp86"] Oct 07 14:56:07 crc kubenswrapper[4854]: W1007 14:56:07.525915 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e636ae9_5753_49ab_b4ab_010572562e7d.slice/crio-f5eff11b12657bd838b61f46395c57fc43279b9aa07dece6feefc0677bb197f4 WatchSource:0}: Error finding container f5eff11b12657bd838b61f46395c57fc43279b9aa07dece6feefc0677bb197f4: Status 404 returned error can't find the container with id f5eff11b12657bd838b61f46395c57fc43279b9aa07dece6feefc0677bb197f4 Oct 07 14:56:07 crc kubenswrapper[4854]: I1007 14:56:07.611332 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-llp86" event={"ID":"5e636ae9-5753-49ab-b4ab-010572562e7d","Type":"ContainerStarted","Data":"f5eff11b12657bd838b61f46395c57fc43279b9aa07dece6feefc0677bb197f4"} Oct 07 14:56:07 crc kubenswrapper[4854]: E1007 14:56:07.964711 4854 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e636ae9_5753_49ab_b4ab_010572562e7d.slice/crio-d1365f5057f1dbe38c5d9f99934fd8d1efbc973d423227cfcef2e9f1b6542d03.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e636ae9_5753_49ab_b4ab_010572562e7d.slice/crio-conmon-d1365f5057f1dbe38c5d9f99934fd8d1efbc973d423227cfcef2e9f1b6542d03.scope\": RecentStats: unable to find data in memory cache]" Oct 07 14:56:08 crc kubenswrapper[4854]: I1007 14:56:08.026975 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 07 14:56:08 crc kubenswrapper[4854]: I1007 14:56:08.027262 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f3779dac-2010-491e-b67c-9e54dc96802e" containerName="nova-api-log" containerID="cri-o://11763ec9a98b94affeaab04f64e454d4111a5c368421f72492b914777447ad35" gracePeriod=30 Oct 07 14:56:08 crc kubenswrapper[4854]: I1007 14:56:08.027412 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f3779dac-2010-491e-b67c-9e54dc96802e" containerName="nova-api-api" containerID="cri-o://5b7f6ddecbcef7acbabcf085221c456d859aa45dac3b562ade0d34ef256b53bc" gracePeriod=30 Oct 07 14:56:08 crc kubenswrapper[4854]: I1007 14:56:08.080445 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 14:56:08 crc kubenswrapper[4854]: I1007 14:56:08.080649 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="6ccb0960-6b05-4548-9d24-65538a53bac0" containerName="nova-scheduler-scheduler" containerID="cri-o://1388830712f06a409d1bb2a6a13fb394eba9c882cd3cf368287304ee41f9a32f" gracePeriod=30 Oct 07 14:56:08 crc kubenswrapper[4854]: 
I1007 14:56:08.102133 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 14:56:08 crc kubenswrapper[4854]: I1007 14:56:08.102874 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="54138e60-1a8c-4f4d-8179-35716959b0b2" containerName="nova-metadata-log" containerID="cri-o://576dc60cb46b84846dadae86fbc29c26242b9882698ba05c1721911ee044989b" gracePeriod=30 Oct 07 14:56:08 crc kubenswrapper[4854]: I1007 14:56:08.102953 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="54138e60-1a8c-4f4d-8179-35716959b0b2" containerName="nova-metadata-metadata" containerID="cri-o://db3ad8cc3b3119aba55866de81a6fcb5996998bd79b7382a5d19db51b3aab5b7" gracePeriod=30 Oct 07 14:56:08 crc kubenswrapper[4854]: I1007 14:56:08.632972 4854 generic.go:334] "Generic (PLEG): container finished" podID="f3779dac-2010-491e-b67c-9e54dc96802e" containerID="11763ec9a98b94affeaab04f64e454d4111a5c368421f72492b914777447ad35" exitCode=143 Oct 07 14:56:08 crc kubenswrapper[4854]: I1007 14:56:08.633091 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f3779dac-2010-491e-b67c-9e54dc96802e","Type":"ContainerDied","Data":"11763ec9a98b94affeaab04f64e454d4111a5c368421f72492b914777447ad35"} Oct 07 14:56:08 crc kubenswrapper[4854]: I1007 14:56:08.635304 4854 generic.go:334] "Generic (PLEG): container finished" podID="54138e60-1a8c-4f4d-8179-35716959b0b2" containerID="576dc60cb46b84846dadae86fbc29c26242b9882698ba05c1721911ee044989b" exitCode=143 Oct 07 14:56:08 crc kubenswrapper[4854]: I1007 14:56:08.635428 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"54138e60-1a8c-4f4d-8179-35716959b0b2","Type":"ContainerDied","Data":"576dc60cb46b84846dadae86fbc29c26242b9882698ba05c1721911ee044989b"} Oct 07 14:56:08 crc kubenswrapper[4854]: I1007 14:56:08.637126 4854 generic.go:334] "Generic (PLEG): container finished" podID="5e636ae9-5753-49ab-b4ab-010572562e7d" containerID="d1365f5057f1dbe38c5d9f99934fd8d1efbc973d423227cfcef2e9f1b6542d03" exitCode=0 Oct 07 14:56:08 crc kubenswrapper[4854]: I1007 14:56:08.637190 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-llp86" event={"ID":"5e636ae9-5753-49ab-b4ab-010572562e7d","Type":"ContainerDied","Data":"d1365f5057f1dbe38c5d9f99934fd8d1efbc973d423227cfcef2e9f1b6542d03"} Oct 07 14:56:08 crc kubenswrapper[4854]: I1007 14:56:08.640775 4854 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 14:56:09 crc kubenswrapper[4854]: I1007 14:56:09.065385 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qr776"] Oct 07 14:56:09 crc kubenswrapper[4854]: I1007 14:56:09.068544 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qr776" Oct 07 14:56:09 crc kubenswrapper[4854]: I1007 14:56:09.084190 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qr776"] Oct 07 14:56:09 crc kubenswrapper[4854]: I1007 14:56:09.193389 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/337ac98e-97c5-4988-957c-95552f5ffb41-utilities\") pod \"community-operators-qr776\" (UID: \"337ac98e-97c5-4988-957c-95552f5ffb41\") " pod="openshift-marketplace/community-operators-qr776" Oct 07 14:56:09 crc kubenswrapper[4854]: I1007 14:56:09.193476 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/337ac98e-97c5-4988-957c-95552f5ffb41-catalog-content\") pod \"community-operators-qr776\" (UID: \"337ac98e-97c5-4988-957c-95552f5ffb41\") " pod="openshift-marketplace/community-operators-qr776" Oct 07 14:56:09 crc kubenswrapper[4854]: I1007 14:56:09.193645 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2t477\" (UniqueName: \"kubernetes.io/projected/337ac98e-97c5-4988-957c-95552f5ffb41-kube-api-access-2t477\") pod \"community-operators-qr776\" (UID: \"337ac98e-97c5-4988-957c-95552f5ffb41\") " pod="openshift-marketplace/community-operators-qr776" Oct 07 14:56:09 crc kubenswrapper[4854]: I1007 14:56:09.307455 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2t477\" (UniqueName: \"kubernetes.io/projected/337ac98e-97c5-4988-957c-95552f5ffb41-kube-api-access-2t477\") pod \"community-operators-qr776\" (UID: \"337ac98e-97c5-4988-957c-95552f5ffb41\") " pod="openshift-marketplace/community-operators-qr776" Oct 07 14:56:09 crc kubenswrapper[4854]: I1007 14:56:09.307765 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/337ac98e-97c5-4988-957c-95552f5ffb41-utilities\") pod \"community-operators-qr776\" (UID: \"337ac98e-97c5-4988-957c-95552f5ffb41\") " pod="openshift-marketplace/community-operators-qr776" Oct 07 14:56:09 crc kubenswrapper[4854]: I1007 14:56:09.307804 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/337ac98e-97c5-4988-957c-95552f5ffb41-catalog-content\") pod \"community-operators-qr776\" (UID: \"337ac98e-97c5-4988-957c-95552f5ffb41\") " pod="openshift-marketplace/community-operators-qr776" Oct 07 14:56:09 crc kubenswrapper[4854]: I1007 14:56:09.308733 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/337ac98e-97c5-4988-957c-95552f5ffb41-utilities\") pod \"community-operators-qr776\" (UID: \"337ac98e-97c5-4988-957c-95552f5ffb41\") " pod="openshift-marketplace/community-operators-qr776" Oct 07 14:56:09 crc kubenswrapper[4854]: I1007 14:56:09.308862 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/337ac98e-97c5-4988-957c-95552f5ffb41-catalog-content\") pod \"community-operators-qr776\" (UID: \"337ac98e-97c5-4988-957c-95552f5ffb41\") " pod="openshift-marketplace/community-operators-qr776" Oct 07 14:56:09 crc kubenswrapper[4854]: I1007 14:56:09.326563 4854 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-2t477\" (UniqueName: \"kubernetes.io/projected/337ac98e-97c5-4988-957c-95552f5ffb41-kube-api-access-2t477\") pod \"community-operators-qr776\" (UID: \"337ac98e-97c5-4988-957c-95552f5ffb41\") " pod="openshift-marketplace/community-operators-qr776" Oct 07 14:56:09 crc kubenswrapper[4854]: I1007 14:56:09.398767 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qr776" Oct 07 14:56:09 crc kubenswrapper[4854]: E1007 14:56:09.652584 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1388830712f06a409d1bb2a6a13fb394eba9c882cd3cf368287304ee41f9a32f is running failed: container process not found" containerID="1388830712f06a409d1bb2a6a13fb394eba9c882cd3cf368287304ee41f9a32f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 07 14:56:09 crc kubenswrapper[4854]: E1007 14:56:09.654642 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1388830712f06a409d1bb2a6a13fb394eba9c882cd3cf368287304ee41f9a32f is running failed: container process not found" containerID="1388830712f06a409d1bb2a6a13fb394eba9c882cd3cf368287304ee41f9a32f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 07 14:56:09 crc kubenswrapper[4854]: E1007 14:56:09.655316 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1388830712f06a409d1bb2a6a13fb394eba9c882cd3cf368287304ee41f9a32f is running failed: container process not found" containerID="1388830712f06a409d1bb2a6a13fb394eba9c882cd3cf368287304ee41f9a32f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 07 14:56:09 crc kubenswrapper[4854]: E1007 14:56:09.655367 4854 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1388830712f06a409d1bb2a6a13fb394eba9c882cd3cf368287304ee41f9a32f is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="6ccb0960-6b05-4548-9d24-65538a53bac0" containerName="nova-scheduler-scheduler" Oct 07 14:56:09 crc kubenswrapper[4854]: I1007 14:56:09.671324 4854 generic.go:334] "Generic (PLEG): container finished" podID="6ccb0960-6b05-4548-9d24-65538a53bac0" containerID="1388830712f06a409d1bb2a6a13fb394eba9c882cd3cf368287304ee41f9a32f" exitCode=0 Oct 07 14:56:09 crc kubenswrapper[4854]: I1007 14:56:09.671392 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6ccb0960-6b05-4548-9d24-65538a53bac0","Type":"ContainerDied","Data":"1388830712f06a409d1bb2a6a13fb394eba9c882cd3cf368287304ee41f9a32f"} Oct 07 14:56:09 crc kubenswrapper[4854]: I1007 14:56:09.945204 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 14:56:10 crc kubenswrapper[4854]: I1007 14:56:10.024316 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ccb0960-6b05-4548-9d24-65538a53bac0-combined-ca-bundle\") pod \"6ccb0960-6b05-4548-9d24-65538a53bac0\" (UID: \"6ccb0960-6b05-4548-9d24-65538a53bac0\") " Oct 07 14:56:10 crc kubenswrapper[4854]: I1007 14:56:10.024389 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6kww\" (UniqueName: \"kubernetes.io/projected/6ccb0960-6b05-4548-9d24-65538a53bac0-kube-api-access-m6kww\") pod \"6ccb0960-6b05-4548-9d24-65538a53bac0\" (UID: \"6ccb0960-6b05-4548-9d24-65538a53bac0\") " Oct 07 14:56:10 crc kubenswrapper[4854]: I1007 14:56:10.024634 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ccb0960-6b05-4548-9d24-65538a53bac0-config-data\") pod \"6ccb0960-6b05-4548-9d24-65538a53bac0\" (UID: \"6ccb0960-6b05-4548-9d24-65538a53bac0\") " Oct 07 14:56:10 crc kubenswrapper[4854]: I1007 14:56:10.030050 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ccb0960-6b05-4548-9d24-65538a53bac0-kube-api-access-m6kww" (OuterVolumeSpecName: "kube-api-access-m6kww") pod "6ccb0960-6b05-4548-9d24-65538a53bac0" (UID: "6ccb0960-6b05-4548-9d24-65538a53bac0"). InnerVolumeSpecName "kube-api-access-m6kww". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:56:10 crc kubenswrapper[4854]: I1007 14:56:10.057857 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ccb0960-6b05-4548-9d24-65538a53bac0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6ccb0960-6b05-4548-9d24-65538a53bac0" (UID: "6ccb0960-6b05-4548-9d24-65538a53bac0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:56:10 crc kubenswrapper[4854]: I1007 14:56:10.077456 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qr776"] Oct 07 14:56:10 crc kubenswrapper[4854]: I1007 14:56:10.080553 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ccb0960-6b05-4548-9d24-65538a53bac0-config-data" (OuterVolumeSpecName: "config-data") pod "6ccb0960-6b05-4548-9d24-65538a53bac0" (UID: "6ccb0960-6b05-4548-9d24-65538a53bac0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:56:10 crc kubenswrapper[4854]: W1007 14:56:10.083920 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod337ac98e_97c5_4988_957c_95552f5ffb41.slice/crio-df98d0d3672d67259db24fe3106afe68c69f82271513e3cad9ef7a817ec97dbd WatchSource:0}: Error finding container df98d0d3672d67259db24fe3106afe68c69f82271513e3cad9ef7a817ec97dbd: Status 404 returned error can't find the container with id df98d0d3672d67259db24fe3106afe68c69f82271513e3cad9ef7a817ec97dbd Oct 07 14:56:10 crc kubenswrapper[4854]: I1007 14:56:10.126539 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ccb0960-6b05-4548-9d24-65538a53bac0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:56:10 crc kubenswrapper[4854]: I1007 14:56:10.126569 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m6kww\" (UniqueName: \"kubernetes.io/projected/6ccb0960-6b05-4548-9d24-65538a53bac0-kube-api-access-m6kww\") on node \"crc\" DevicePath \"\"" Oct 07 14:56:10 crc kubenswrapper[4854]: I1007 14:56:10.126579 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ccb0960-6b05-4548-9d24-65538a53bac0-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:56:10 crc kubenswrapper[4854]: E1007 14:56:10.459253 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0748a0f6c6ff32f97816e5d32c085f155fba2d7e63486d88aea7d29d50169426" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 07 14:56:10 crc kubenswrapper[4854]: E1007 14:56:10.462206 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0748a0f6c6ff32f97816e5d32c085f155fba2d7e63486d88aea7d29d50169426" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 07 14:56:10 crc kubenswrapper[4854]: E1007 14:56:10.463851 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0748a0f6c6ff32f97816e5d32c085f155fba2d7e63486d88aea7d29d50169426" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 07 14:56:10 crc kubenswrapper[4854]: E1007 14:56:10.463920 4854 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="3619fac8-bf60-4908-a9ef-bb5af339f530" containerName="nova-cell1-conductor-conductor" Oct 07 14:56:10 crc kubenswrapper[4854]: I1007 14:56:10.683244 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6ccb0960-6b05-4548-9d24-65538a53bac0","Type":"ContainerDied","Data":"25e06ef36a4269fdf10ef198957a8ade6ecc8a8f037d073a2c0979b49f08fe20"} Oct 07 14:56:10 crc kubenswrapper[4854]: I1007 14:56:10.683319 4854 scope.go:117] "RemoveContainer" containerID="1388830712f06a409d1bb2a6a13fb394eba9c882cd3cf368287304ee41f9a32f" Oct 07 14:56:10 crc kubenswrapper[4854]: I1007 14:56:10.684462 4854 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 14:56:10 crc kubenswrapper[4854]: I1007 14:56:10.686480 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qr776" event={"ID":"337ac98e-97c5-4988-957c-95552f5ffb41","Type":"ContainerStarted","Data":"df98d0d3672d67259db24fe3106afe68c69f82271513e3cad9ef7a817ec97dbd"} Oct 07 14:56:10 crc kubenswrapper[4854]: I1007 14:56:10.759679 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 14:56:10 crc kubenswrapper[4854]: I1007 14:56:10.774486 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 14:56:10 crc kubenswrapper[4854]: I1007 14:56:10.786103 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 14:56:10 crc kubenswrapper[4854]: E1007 14:56:10.786634 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ccb0960-6b05-4548-9d24-65538a53bac0" containerName="nova-scheduler-scheduler" Oct 07 14:56:10 crc kubenswrapper[4854]: I1007 14:56:10.786652 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ccb0960-6b05-4548-9d24-65538a53bac0" containerName="nova-scheduler-scheduler" Oct 07 14:56:10 crc kubenswrapper[4854]: I1007 14:56:10.786895 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ccb0960-6b05-4548-9d24-65538a53bac0" containerName="nova-scheduler-scheduler" Oct 07 14:56:10 crc kubenswrapper[4854]: I1007 14:56:10.787795 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 14:56:10 crc kubenswrapper[4854]: I1007 14:56:10.790247 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 07 14:56:10 crc kubenswrapper[4854]: I1007 14:56:10.796406 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 14:56:10 crc kubenswrapper[4854]: I1007 14:56:10.856012 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhw9b\" (UniqueName: \"kubernetes.io/projected/2c745615-bc65-4a81-9cd0-04aeb6dc7dd1-kube-api-access-rhw9b\") pod \"nova-scheduler-0\" (UID: \"2c745615-bc65-4a81-9cd0-04aeb6dc7dd1\") " pod="openstack/nova-scheduler-0" Oct 07 14:56:10 crc kubenswrapper[4854]: I1007 14:56:10.856163 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c745615-bc65-4a81-9cd0-04aeb6dc7dd1-config-data\") pod \"nova-scheduler-0\" (UID: \"2c745615-bc65-4a81-9cd0-04aeb6dc7dd1\") " pod="openstack/nova-scheduler-0" Oct 07 14:56:10 crc kubenswrapper[4854]: I1007 14:56:10.856232 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c745615-bc65-4a81-9cd0-04aeb6dc7dd1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2c745615-bc65-4a81-9cd0-04aeb6dc7dd1\") " pod="openstack/nova-scheduler-0" Oct 07 14:56:10 crc kubenswrapper[4854]: I1007 14:56:10.958303 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhw9b\" (UniqueName: \"kubernetes.io/projected/2c745615-bc65-4a81-9cd0-04aeb6dc7dd1-kube-api-access-rhw9b\") pod \"nova-scheduler-0\" (UID: \"2c745615-bc65-4a81-9cd0-04aeb6dc7dd1\") " pod="openstack/nova-scheduler-0" Oct 07 14:56:10 crc 
kubenswrapper[4854]: I1007 14:56:10.958995 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c745615-bc65-4a81-9cd0-04aeb6dc7dd1-config-data\") pod \"nova-scheduler-0\" (UID: \"2c745615-bc65-4a81-9cd0-04aeb6dc7dd1\") " pod="openstack/nova-scheduler-0" Oct 07 14:56:10 crc kubenswrapper[4854]: I1007 14:56:10.959261 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c745615-bc65-4a81-9cd0-04aeb6dc7dd1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2c745615-bc65-4a81-9cd0-04aeb6dc7dd1\") " pod="openstack/nova-scheduler-0" Oct 07 14:56:10 crc kubenswrapper[4854]: I1007 14:56:10.969663 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c745615-bc65-4a81-9cd0-04aeb6dc7dd1-config-data\") pod \"nova-scheduler-0\" (UID: \"2c745615-bc65-4a81-9cd0-04aeb6dc7dd1\") " pod="openstack/nova-scheduler-0" Oct 07 14:56:10 crc kubenswrapper[4854]: I1007 14:56:10.974141 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c745615-bc65-4a81-9cd0-04aeb6dc7dd1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2c745615-bc65-4a81-9cd0-04aeb6dc7dd1\") " pod="openstack/nova-scheduler-0" Oct 07 14:56:10 crc kubenswrapper[4854]: I1007 14:56:10.979479 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhw9b\" (UniqueName: \"kubernetes.io/projected/2c745615-bc65-4a81-9cd0-04aeb6dc7dd1-kube-api-access-rhw9b\") pod \"nova-scheduler-0\" (UID: \"2c745615-bc65-4a81-9cd0-04aeb6dc7dd1\") " pod="openstack/nova-scheduler-0" Oct 07 14:56:11 crc kubenswrapper[4854]: I1007 14:56:11.108321 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 14:56:11 crc kubenswrapper[4854]: E1007 14:56:11.468757 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1c7fea179c8c721785e4fd31cd968128bd001e619047793a11f3cbc54a0a86f7 is running failed: container process not found" containerID="1c7fea179c8c721785e4fd31cd968128bd001e619047793a11f3cbc54a0a86f7" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 07 14:56:11 crc kubenswrapper[4854]: E1007 14:56:11.469189 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1c7fea179c8c721785e4fd31cd968128bd001e619047793a11f3cbc54a0a86f7 is running failed: container process not found" containerID="1c7fea179c8c721785e4fd31cd968128bd001e619047793a11f3cbc54a0a86f7" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 07 14:56:11 crc kubenswrapper[4854]: E1007 14:56:11.469517 4854 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1c7fea179c8c721785e4fd31cd968128bd001e619047793a11f3cbc54a0a86f7 is running failed: container process not found" containerID="1c7fea179c8c721785e4fd31cd968128bd001e619047793a11f3cbc54a0a86f7" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 07 14:56:11 crc kubenswrapper[4854]: E1007 14:56:11.469641 4854 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1c7fea179c8c721785e4fd31cd968128bd001e619047793a11f3cbc54a0a86f7 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="840cffec-0fba-4a84-8d36-4c0cc26cadff" containerName="nova-cell0-conductor-conductor" Oct 07 14:56:11 crc kubenswrapper[4854]: I1007 14:56:11.543783 4854 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="54138e60-1a8c-4f4d-8179-35716959b0b2" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.87:8775/\": read tcp 10.217.0.2:48228->10.217.1.87:8775: read: connection reset by peer" Oct 07 14:56:11 crc kubenswrapper[4854]: I1007 14:56:11.543824 4854 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="54138e60-1a8c-4f4d-8179-35716959b0b2" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.87:8775/\": read tcp 10.217.0.2:48238->10.217.1.87:8775: read: connection reset by peer" Oct 07 14:56:11 crc kubenswrapper[4854]: I1007 14:56:11.708568 4854 generic.go:334] "Generic (PLEG): container finished" podID="54138e60-1a8c-4f4d-8179-35716959b0b2" containerID="db3ad8cc3b3119aba55866de81a6fcb5996998bd79b7382a5d19db51b3aab5b7" exitCode=0 Oct 07 14:56:11 crc kubenswrapper[4854]: I1007 14:56:11.708831 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"54138e60-1a8c-4f4d-8179-35716959b0b2","Type":"ContainerDied","Data":"db3ad8cc3b3119aba55866de81a6fcb5996998bd79b7382a5d19db51b3aab5b7"} Oct 07 14:56:11 crc kubenswrapper[4854]: I1007 14:56:11.710564 4854 generic.go:334] "Generic (PLEG): container finished" podID="840cffec-0fba-4a84-8d36-4c0cc26cadff" containerID="1c7fea179c8c721785e4fd31cd968128bd001e619047793a11f3cbc54a0a86f7" exitCode=0 Oct 07 14:56:11 crc kubenswrapper[4854]: I1007 14:56:11.710614 4854 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"840cffec-0fba-4a84-8d36-4c0cc26cadff","Type":"ContainerDied","Data":"1c7fea179c8c721785e4fd31cd968128bd001e619047793a11f3cbc54a0a86f7"} Oct 07 14:56:11 crc kubenswrapper[4854]: I1007 14:56:11.717865 4854 generic.go:334] "Generic (PLEG): container finished" podID="f3779dac-2010-491e-b67c-9e54dc96802e" containerID="5b7f6ddecbcef7acbabcf085221c456d859aa45dac3b562ade0d34ef256b53bc" exitCode=0 Oct 07 14:56:11 crc kubenswrapper[4854]: I1007 14:56:11.717923 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f3779dac-2010-491e-b67c-9e54dc96802e","Type":"ContainerDied","Data":"5b7f6ddecbcef7acbabcf085221c456d859aa45dac3b562ade0d34ef256b53bc"} Oct 07 14:56:11 crc kubenswrapper[4854]: I1007 14:56:11.719683 4854 generic.go:334] "Generic (PLEG): container finished" podID="337ac98e-97c5-4988-957c-95552f5ffb41" containerID="2099e2c54421dd9496b5836aea45e5a8273b50002edfe92f8dc317146c6104ec" exitCode=0 Oct 07 14:56:11 crc kubenswrapper[4854]: I1007 14:56:11.719728 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qr776" event={"ID":"337ac98e-97c5-4988-957c-95552f5ffb41","Type":"ContainerDied","Data":"2099e2c54421dd9496b5836aea45e5a8273b50002edfe92f8dc317146c6104ec"} Oct 07 14:56:11 crc kubenswrapper[4854]: I1007 14:56:11.726074 4854 generic.go:334] "Generic (PLEG): container finished" podID="3619fac8-bf60-4908-a9ef-bb5af339f530" containerID="0748a0f6c6ff32f97816e5d32c085f155fba2d7e63486d88aea7d29d50169426" exitCode=0 Oct 07 14:56:11 crc kubenswrapper[4854]: I1007 14:56:11.726130 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"3619fac8-bf60-4908-a9ef-bb5af339f530","Type":"ContainerDied","Data":"0748a0f6c6ff32f97816e5d32c085f155fba2d7e63486d88aea7d29d50169426"} Oct 07 14:56:12 crc kubenswrapper[4854]: I1007 14:56:12.299264 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 14:56:12 crc kubenswrapper[4854]: I1007 14:56:12.390445 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4znxl\" (UniqueName: \"kubernetes.io/projected/54138e60-1a8c-4f4d-8179-35716959b0b2-kube-api-access-4znxl\") pod \"54138e60-1a8c-4f4d-8179-35716959b0b2\" (UID: \"54138e60-1a8c-4f4d-8179-35716959b0b2\") " Oct 07 14:56:12 crc kubenswrapper[4854]: I1007 14:56:12.390717 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54138e60-1a8c-4f4d-8179-35716959b0b2-config-data\") pod \"54138e60-1a8c-4f4d-8179-35716959b0b2\" (UID: \"54138e60-1a8c-4f4d-8179-35716959b0b2\") " Oct 07 14:56:12 crc kubenswrapper[4854]: I1007 14:56:12.390803 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54138e60-1a8c-4f4d-8179-35716959b0b2-logs\") pod \"54138e60-1a8c-4f4d-8179-35716959b0b2\" (UID: \"54138e60-1a8c-4f4d-8179-35716959b0b2\") " Oct 07 14:56:12 crc kubenswrapper[4854]: I1007 14:56:12.390838 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54138e60-1a8c-4f4d-8179-35716959b0b2-combined-ca-bundle\") pod \"54138e60-1a8c-4f4d-8179-35716959b0b2\" (UID: \"54138e60-1a8c-4f4d-8179-35716959b0b2\") " Oct 07 14:56:12 crc kubenswrapper[4854]: I1007 14:56:12.396511 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54138e60-1a8c-4f4d-8179-35716959b0b2-logs" (OuterVolumeSpecName: "logs") pod "54138e60-1a8c-4f4d-8179-35716959b0b2" (UID: "54138e60-1a8c-4f4d-8179-35716959b0b2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:56:12 crc kubenswrapper[4854]: I1007 14:56:12.403283 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54138e60-1a8c-4f4d-8179-35716959b0b2-kube-api-access-4znxl" (OuterVolumeSpecName: "kube-api-access-4znxl") pod "54138e60-1a8c-4f4d-8179-35716959b0b2" (UID: "54138e60-1a8c-4f4d-8179-35716959b0b2"). InnerVolumeSpecName "kube-api-access-4znxl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:56:12 crc kubenswrapper[4854]: I1007 14:56:12.428779 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54138e60-1a8c-4f4d-8179-35716959b0b2-config-data" (OuterVolumeSpecName: "config-data") pod "54138e60-1a8c-4f4d-8179-35716959b0b2" (UID: "54138e60-1a8c-4f4d-8179-35716959b0b2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:56:12 crc kubenswrapper[4854]: I1007 14:56:12.493453 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54138e60-1a8c-4f4d-8179-35716959b0b2-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:56:12 crc kubenswrapper[4854]: I1007 14:56:12.493480 4854 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54138e60-1a8c-4f4d-8179-35716959b0b2-logs\") on node \"crc\" DevicePath \"\"" Oct 07 14:56:12 crc kubenswrapper[4854]: I1007 14:56:12.493494 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4znxl\" (UniqueName: \"kubernetes.io/projected/54138e60-1a8c-4f4d-8179-35716959b0b2-kube-api-access-4znxl\") on node \"crc\" DevicePath \"\"" Oct 07 14:56:12 crc kubenswrapper[4854]: I1007 14:56:12.632909 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 14:56:12 crc kubenswrapper[4854]: I1007 14:56:12.694333 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54138e60-1a8c-4f4d-8179-35716959b0b2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "54138e60-1a8c-4f4d-8179-35716959b0b2" (UID: "54138e60-1a8c-4f4d-8179-35716959b0b2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:56:12 crc kubenswrapper[4854]: I1007 14:56:12.696713 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54138e60-1a8c-4f4d-8179-35716959b0b2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:56:12 crc kubenswrapper[4854]: I1007 14:56:12.702960 4854 scope.go:117] "RemoveContainer" containerID="e54b3b075b488a857a2b079c8bf74c15dbd3a9b0c33ba12b65add9ef54f2f5a0" Oct 07 14:56:12 crc kubenswrapper[4854]: E1007 14:56:12.703288 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:56:12 crc kubenswrapper[4854]: I1007 14:56:12.716472 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ccb0960-6b05-4548-9d24-65538a53bac0" path="/var/lib/kubelet/pods/6ccb0960-6b05-4548-9d24-65538a53bac0/volumes" Oct 07 14:56:12 crc kubenswrapper[4854]: I1007 14:56:12.744213 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"3619fac8-bf60-4908-a9ef-bb5af339f530","Type":"ContainerDied","Data":"c412d7a263468f393580bb9a4d8377fcd8bbb60e81d2912fd0d0a39981e65b47"} Oct 07 14:56:12 crc kubenswrapper[4854]: I1007 14:56:12.744252 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c412d7a263468f393580bb9a4d8377fcd8bbb60e81d2912fd0d0a39981e65b47" Oct 07 14:56:12 crc kubenswrapper[4854]: I1007 14:56:12.749726 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"54138e60-1a8c-4f4d-8179-35716959b0b2","Type":"ContainerDied","Data":"1d19d093543669e298353b699bdccde259e990288064bca6ece4b76897388f04"} Oct 07 14:56:12 crc kubenswrapper[4854]: I1007 14:56:12.749749 4854 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 14:56:12 crc kubenswrapper[4854]: I1007 14:56:12.750486 4854 scope.go:117] "RemoveContainer" containerID="db3ad8cc3b3119aba55866de81a6fcb5996998bd79b7382a5d19db51b3aab5b7" Oct 07 14:56:12 crc kubenswrapper[4854]: I1007 14:56:12.767203 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"840cffec-0fba-4a84-8d36-4c0cc26cadff","Type":"ContainerDied","Data":"38e96b0c3bf2de8356a585173b43064d5621456d59e140af636ecbd7eca371be"} Oct 07 14:56:12 crc kubenswrapper[4854]: I1007 14:56:12.767246 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38e96b0c3bf2de8356a585173b43064d5621456d59e140af636ecbd7eca371be" Oct 07 14:56:12 crc kubenswrapper[4854]: I1007 14:56:12.773269 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2c745615-bc65-4a81-9cd0-04aeb6dc7dd1","Type":"ContainerStarted","Data":"32d3189012999b852fc4b43e0a58144c9794cd9f1845f5489db492b5d095cf4a"} Oct 07 14:56:12 crc kubenswrapper[4854]: I1007 14:56:12.776515 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-llp86" event={"ID":"5e636ae9-5753-49ab-b4ab-010572562e7d","Type":"ContainerStarted","Data":"12434ac2dcfdbb02cc08c43aa8fb724cbd34495f5f5d219f3fdbd6199822c058"} Oct 07 14:56:12 crc kubenswrapper[4854]: I1007 14:56:12.790449 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 07 14:56:12 crc kubenswrapper[4854]: I1007 14:56:12.809481 4854 scope.go:117] "RemoveContainer" containerID="576dc60cb46b84846dadae86fbc29c26242b9882698ba05c1721911ee044989b" Oct 07 14:56:12 crc kubenswrapper[4854]: I1007 14:56:12.837242 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 07 14:56:12 crc kubenswrapper[4854]: I1007 14:56:12.921367 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3619fac8-bf60-4908-a9ef-bb5af339f530-config-data\") pod \"3619fac8-bf60-4908-a9ef-bb5af339f530\" (UID: \"3619fac8-bf60-4908-a9ef-bb5af339f530\") " Oct 07 14:56:12 crc kubenswrapper[4854]: I1007 14:56:12.921468 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/840cffec-0fba-4a84-8d36-4c0cc26cadff-config-data\") pod \"840cffec-0fba-4a84-8d36-4c0cc26cadff\" (UID: \"840cffec-0fba-4a84-8d36-4c0cc26cadff\") " Oct 07 14:56:12 crc kubenswrapper[4854]: I1007 14:56:12.921545 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xw5jq\" (UniqueName: \"kubernetes.io/projected/840cffec-0fba-4a84-8d36-4c0cc26cadff-kube-api-access-xw5jq\") pod \"840cffec-0fba-4a84-8d36-4c0cc26cadff\" (UID: \"840cffec-0fba-4a84-8d36-4c0cc26cadff\") " Oct 07 14:56:12 crc kubenswrapper[4854]: I1007 14:56:12.921700 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtph2\" (UniqueName: \"kubernetes.io/projected/3619fac8-bf60-4908-a9ef-bb5af339f530-kube-api-access-jtph2\") pod \"3619fac8-bf60-4908-a9ef-bb5af339f530\" (UID: \"3619fac8-bf60-4908-a9ef-bb5af339f530\") " Oct 07 14:56:12 crc kubenswrapper[4854]: I1007 14:56:12.921788 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3619fac8-bf60-4908-a9ef-bb5af339f530-combined-ca-bundle\") pod \"3619fac8-bf60-4908-a9ef-bb5af339f530\" (UID: \"3619fac8-bf60-4908-a9ef-bb5af339f530\") " Oct 07 14:56:12 crc kubenswrapper[4854]: I1007 14:56:12.921889 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/840cffec-0fba-4a84-8d36-4c0cc26cadff-combined-ca-bundle\") pod \"840cffec-0fba-4a84-8d36-4c0cc26cadff\" (UID: \"840cffec-0fba-4a84-8d36-4c0cc26cadff\") " Oct 07 14:56:12 crc kubenswrapper[4854]: I1007 14:56:12.928774 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/840cffec-0fba-4a84-8d36-4c0cc26cadff-kube-api-access-xw5jq" (OuterVolumeSpecName: "kube-api-access-xw5jq") pod "840cffec-0fba-4a84-8d36-4c0cc26cadff" (UID: "840cffec-0fba-4a84-8d36-4c0cc26cadff"). InnerVolumeSpecName "kube-api-access-xw5jq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:56:12 crc kubenswrapper[4854]: I1007 14:56:12.935918 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 07 14:56:12 crc kubenswrapper[4854]: I1007 14:56:12.945371 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 14:56:12 crc kubenswrapper[4854]: I1007 14:56:12.947007 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3619fac8-bf60-4908-a9ef-bb5af339f530-kube-api-access-jtph2" (OuterVolumeSpecName: "kube-api-access-jtph2") pod "3619fac8-bf60-4908-a9ef-bb5af339f530" (UID: "3619fac8-bf60-4908-a9ef-bb5af339f530"). InnerVolumeSpecName "kube-api-access-jtph2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:56:12 crc kubenswrapper[4854]: I1007 14:56:12.973952 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 14:56:12 crc kubenswrapper[4854]: I1007 14:56:12.975569 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/840cffec-0fba-4a84-8d36-4c0cc26cadff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "840cffec-0fba-4a84-8d36-4c0cc26cadff" (UID: "840cffec-0fba-4a84-8d36-4c0cc26cadff"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:56:12 crc kubenswrapper[4854]: I1007 14:56:12.979189 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/840cffec-0fba-4a84-8d36-4c0cc26cadff-config-data" (OuterVolumeSpecName: "config-data") pod "840cffec-0fba-4a84-8d36-4c0cc26cadff" (UID: "840cffec-0fba-4a84-8d36-4c0cc26cadff"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:56:12 crc kubenswrapper[4854]: I1007 14:56:12.990505 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3619fac8-bf60-4908-a9ef-bb5af339f530-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3619fac8-bf60-4908-a9ef-bb5af339f530" (UID: "3619fac8-bf60-4908-a9ef-bb5af339f530"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:56:12 crc kubenswrapper[4854]: I1007 14:56:12.995253 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 07 14:56:12 crc kubenswrapper[4854]: E1007 14:56:12.995846 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3619fac8-bf60-4908-a9ef-bb5af339f530" containerName="nova-cell1-conductor-conductor" Oct 07 14:56:12 crc kubenswrapper[4854]: I1007 14:56:12.995878 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="3619fac8-bf60-4908-a9ef-bb5af339f530" containerName="nova-cell1-conductor-conductor" Oct 07 14:56:12 crc kubenswrapper[4854]: E1007 14:56:12.995904 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3779dac-2010-491e-b67c-9e54dc96802e" containerName="nova-api-api" Oct 07 14:56:12 crc kubenswrapper[4854]: I1007 14:56:12.995910 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3779dac-2010-491e-b67c-9e54dc96802e" containerName="nova-api-api" Oct 07 14:56:12 crc kubenswrapper[4854]: E1007 14:56:12.995918 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54138e60-1a8c-4f4d-8179-35716959b0b2" containerName="nova-metadata-log" Oct 07 14:56:12 crc kubenswrapper[4854]: I1007 14:56:12.995941 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="54138e60-1a8c-4f4d-8179-35716959b0b2" containerName="nova-metadata-log" Oct 07 14:56:12 crc kubenswrapper[4854]: E1007 14:56:12.995969 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="840cffec-0fba-4a84-8d36-4c0cc26cadff" containerName="nova-cell0-conductor-conductor" Oct 07 14:56:12 crc kubenswrapper[4854]: I1007 14:56:12.995975 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="840cffec-0fba-4a84-8d36-4c0cc26cadff" containerName="nova-cell0-conductor-conductor" Oct 07 14:56:12 crc kubenswrapper[4854]: E1007 14:56:12.995991 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3779dac-2010-491e-b67c-9e54dc96802e" containerName="nova-api-log" Oct 07 14:56:12 crc 
kubenswrapper[4854]: I1007 14:56:12.995996 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3779dac-2010-491e-b67c-9e54dc96802e" containerName="nova-api-log" Oct 07 14:56:12 crc kubenswrapper[4854]: E1007 14:56:12.996029 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54138e60-1a8c-4f4d-8179-35716959b0b2" containerName="nova-metadata-metadata" Oct 07 14:56:12 crc kubenswrapper[4854]: I1007 14:56:12.996036 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="54138e60-1a8c-4f4d-8179-35716959b0b2" containerName="nova-metadata-metadata" Oct 07 14:56:12 crc kubenswrapper[4854]: I1007 14:56:12.996330 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="54138e60-1a8c-4f4d-8179-35716959b0b2" containerName="nova-metadata-log" Oct 07 14:56:12 crc kubenswrapper[4854]: I1007 14:56:12.996348 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3779dac-2010-491e-b67c-9e54dc96802e" containerName="nova-api-api" Oct 07 14:56:12 crc kubenswrapper[4854]: I1007 14:56:12.996361 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3779dac-2010-491e-b67c-9e54dc96802e" containerName="nova-api-log" Oct 07 14:56:12 crc kubenswrapper[4854]: I1007 14:56:12.996375 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="840cffec-0fba-4a84-8d36-4c0cc26cadff" containerName="nova-cell0-conductor-conductor" Oct 07 14:56:12 crc kubenswrapper[4854]: I1007 14:56:12.996404 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="3619fac8-bf60-4908-a9ef-bb5af339f530" containerName="nova-cell1-conductor-conductor" Oct 07 14:56:12 crc kubenswrapper[4854]: I1007 14:56:12.996415 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="54138e60-1a8c-4f4d-8179-35716959b0b2" containerName="nova-metadata-metadata" Oct 07 14:56:12 crc kubenswrapper[4854]: I1007 14:56:12.998017 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 14:56:13 crc kubenswrapper[4854]: I1007 14:56:13.004320 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 07 14:56:13 crc kubenswrapper[4854]: I1007 14:56:13.005538 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3619fac8-bf60-4908-a9ef-bb5af339f530-config-data" (OuterVolumeSpecName: "config-data") pod "3619fac8-bf60-4908-a9ef-bb5af339f530" (UID: "3619fac8-bf60-4908-a9ef-bb5af339f530"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:56:13 crc kubenswrapper[4854]: I1007 14:56:13.023455 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lxl4\" (UniqueName: \"kubernetes.io/projected/f3779dac-2010-491e-b67c-9e54dc96802e-kube-api-access-5lxl4\") pod \"f3779dac-2010-491e-b67c-9e54dc96802e\" (UID: \"f3779dac-2010-491e-b67c-9e54dc96802e\") " Oct 07 14:56:13 crc kubenswrapper[4854]: I1007 14:56:13.023614 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3779dac-2010-491e-b67c-9e54dc96802e-config-data\") pod \"f3779dac-2010-491e-b67c-9e54dc96802e\" (UID: \"f3779dac-2010-491e-b67c-9e54dc96802e\") " Oct 07 14:56:13 crc kubenswrapper[4854]: I1007 14:56:13.023791 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3779dac-2010-491e-b67c-9e54dc96802e-logs\") pod \"f3779dac-2010-491e-b67c-9e54dc96802e\" (UID: \"f3779dac-2010-491e-b67c-9e54dc96802e\") " Oct 07 14:56:13 crc kubenswrapper[4854]: I1007 14:56:13.023857 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3779dac-2010-491e-b67c-9e54dc96802e-combined-ca-bundle\") pod \"f3779dac-2010-491e-b67c-9e54dc96802e\" (UID: \"f3779dac-2010-491e-b67c-9e54dc96802e\") " Oct 07 14:56:13 crc kubenswrapper[4854]: I1007 14:56:13.024303 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3619fac8-bf60-4908-a9ef-bb5af339f530-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:56:13 crc kubenswrapper[4854]: I1007 14:56:13.024317 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/840cffec-0fba-4a84-8d36-4c0cc26cadff-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:56:13 crc kubenswrapper[4854]: I1007 14:56:13.024326 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xw5jq\" (UniqueName: \"kubernetes.io/projected/840cffec-0fba-4a84-8d36-4c0cc26cadff-kube-api-access-xw5jq\") on node \"crc\" DevicePath \"\"" Oct 07 14:56:13 crc kubenswrapper[4854]: I1007 14:56:13.024335 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtph2\" (UniqueName: \"kubernetes.io/projected/3619fac8-bf60-4908-a9ef-bb5af339f530-kube-api-access-jtph2\") on node \"crc\" DevicePath \"\"" Oct 07 14:56:13 crc kubenswrapper[4854]: I1007 14:56:13.024343 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3619fac8-bf60-4908-a9ef-bb5af339f530-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:56:13 crc kubenswrapper[4854]: I1007 14:56:13.024353 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/840cffec-0fba-4a84-8d36-4c0cc26cadff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:56:13 crc kubenswrapper[4854]: I1007 14:56:13.024536 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3779dac-2010-491e-b67c-9e54dc96802e-logs" (OuterVolumeSpecName: "logs") pod "f3779dac-2010-491e-b67c-9e54dc96802e" (UID: "f3779dac-2010-491e-b67c-9e54dc96802e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:56:13 crc kubenswrapper[4854]: I1007 14:56:13.029055 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3779dac-2010-491e-b67c-9e54dc96802e-kube-api-access-5lxl4" (OuterVolumeSpecName: "kube-api-access-5lxl4") pod "f3779dac-2010-491e-b67c-9e54dc96802e" (UID: "f3779dac-2010-491e-b67c-9e54dc96802e"). InnerVolumeSpecName "kube-api-access-5lxl4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:56:13 crc kubenswrapper[4854]: I1007 14:56:13.049213 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 14:56:13 crc kubenswrapper[4854]: I1007 14:56:13.076536 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3779dac-2010-491e-b67c-9e54dc96802e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f3779dac-2010-491e-b67c-9e54dc96802e" (UID: "f3779dac-2010-491e-b67c-9e54dc96802e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:56:13 crc kubenswrapper[4854]: I1007 14:56:13.082968 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3779dac-2010-491e-b67c-9e54dc96802e-config-data" (OuterVolumeSpecName: "config-data") pod "f3779dac-2010-491e-b67c-9e54dc96802e" (UID: "f3779dac-2010-491e-b67c-9e54dc96802e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:56:13 crc kubenswrapper[4854]: I1007 14:56:13.127033 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3cd66cc-3ccf-4f55-ae63-f6874ca0b10c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f3cd66cc-3ccf-4f55-ae63-f6874ca0b10c\") " pod="openstack/nova-metadata-0" Oct 07 14:56:13 crc kubenswrapper[4854]: I1007 14:56:13.127079 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3cd66cc-3ccf-4f55-ae63-f6874ca0b10c-config-data\") pod \"nova-metadata-0\" (UID: \"f3cd66cc-3ccf-4f55-ae63-f6874ca0b10c\") " pod="openstack/nova-metadata-0" Oct 07 14:56:13 crc kubenswrapper[4854]: I1007 14:56:13.127154 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxp5p\" (UniqueName: \"kubernetes.io/projected/f3cd66cc-3ccf-4f55-ae63-f6874ca0b10c-kube-api-access-dxp5p\") pod \"nova-metadata-0\" (UID: \"f3cd66cc-3ccf-4f55-ae63-f6874ca0b10c\") " pod="openstack/nova-metadata-0" Oct 07 14:56:13 crc kubenswrapper[4854]: I1007 14:56:13.127249 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3cd66cc-3ccf-4f55-ae63-f6874ca0b10c-logs\") pod \"nova-metadata-0\" (UID: \"f3cd66cc-3ccf-4f55-ae63-f6874ca0b10c\") " pod="openstack/nova-metadata-0" Oct 07 14:56:13 crc kubenswrapper[4854]: I1007 14:56:13.127316 4854 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3779dac-2010-491e-b67c-9e54dc96802e-logs\") on node \"crc\" DevicePath \"\"" Oct 07 14:56:13 crc kubenswrapper[4854]: I1007 14:56:13.127327 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3779dac-2010-491e-b67c-9e54dc96802e-combined-ca-bundle\") on 
node \"crc\" DevicePath \"\"" Oct 07 14:56:13 crc kubenswrapper[4854]: I1007 14:56:13.127338 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lxl4\" (UniqueName: \"kubernetes.io/projected/f3779dac-2010-491e-b67c-9e54dc96802e-kube-api-access-5lxl4\") on node \"crc\" DevicePath \"\"" Oct 07 14:56:13 crc kubenswrapper[4854]: I1007 14:56:13.127346 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3779dac-2010-491e-b67c-9e54dc96802e-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:56:13 crc kubenswrapper[4854]: I1007 14:56:13.231261 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxp5p\" (UniqueName: \"kubernetes.io/projected/f3cd66cc-3ccf-4f55-ae63-f6874ca0b10c-kube-api-access-dxp5p\") pod \"nova-metadata-0\" (UID: \"f3cd66cc-3ccf-4f55-ae63-f6874ca0b10c\") " pod="openstack/nova-metadata-0" Oct 07 14:56:13 crc kubenswrapper[4854]: I1007 14:56:13.231909 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3cd66cc-3ccf-4f55-ae63-f6874ca0b10c-logs\") pod \"nova-metadata-0\" (UID: \"f3cd66cc-3ccf-4f55-ae63-f6874ca0b10c\") " pod="openstack/nova-metadata-0" Oct 07 14:56:13 crc kubenswrapper[4854]: I1007 14:56:13.232196 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3cd66cc-3ccf-4f55-ae63-f6874ca0b10c-config-data\") pod \"nova-metadata-0\" (UID: \"f3cd66cc-3ccf-4f55-ae63-f6874ca0b10c\") " pod="openstack/nova-metadata-0" Oct 07 14:56:13 crc kubenswrapper[4854]: I1007 14:56:13.232358 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3cd66cc-3ccf-4f55-ae63-f6874ca0b10c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f3cd66cc-3ccf-4f55-ae63-f6874ca0b10c\") " pod="openstack/nova-metadata-0" Oct 07 14:56:13 crc kubenswrapper[4854]: I1007 14:56:13.234272 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3cd66cc-3ccf-4f55-ae63-f6874ca0b10c-logs\") pod \"nova-metadata-0\" (UID: \"f3cd66cc-3ccf-4f55-ae63-f6874ca0b10c\") " pod="openstack/nova-metadata-0" Oct 07 14:56:13 crc kubenswrapper[4854]: I1007 14:56:13.236845 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3cd66cc-3ccf-4f55-ae63-f6874ca0b10c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f3cd66cc-3ccf-4f55-ae63-f6874ca0b10c\") " pod="openstack/nova-metadata-0" Oct 07 14:56:13 crc kubenswrapper[4854]: I1007 14:56:13.239278 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3cd66cc-3ccf-4f55-ae63-f6874ca0b10c-config-data\") pod \"nova-metadata-0\" (UID: \"f3cd66cc-3ccf-4f55-ae63-f6874ca0b10c\") " pod="openstack/nova-metadata-0" Oct 07 14:56:13 crc kubenswrapper[4854]: I1007 14:56:13.251508 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxp5p\" (UniqueName: \"kubernetes.io/projected/f3cd66cc-3ccf-4f55-ae63-f6874ca0b10c-kube-api-access-dxp5p\") pod \"nova-metadata-0\" (UID: \"f3cd66cc-3ccf-4f55-ae63-f6874ca0b10c\") " pod="openstack/nova-metadata-0" Oct 07 14:56:13 crc kubenswrapper[4854]: I1007 14:56:13.322249 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 14:56:13 crc kubenswrapper[4854]: I1007 14:56:13.789430 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2c745615-bc65-4a81-9cd0-04aeb6dc7dd1","Type":"ContainerStarted","Data":"8816f9049bd0c2df28e37b3032d2201c4390028668ef93c1887d26e3cb648162"} Oct 07 14:56:13 crc kubenswrapper[4854]: I1007 14:56:13.794236 4854 generic.go:334] "Generic (PLEG): container finished" podID="5e636ae9-5753-49ab-b4ab-010572562e7d" containerID="12434ac2dcfdbb02cc08c43aa8fb724cbd34495f5f5d219f3fdbd6199822c058" exitCode=0 Oct 07 14:56:13 crc kubenswrapper[4854]: I1007 14:56:13.794289 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-llp86" event={"ID":"5e636ae9-5753-49ab-b4ab-010572562e7d","Type":"ContainerDied","Data":"12434ac2dcfdbb02cc08c43aa8fb724cbd34495f5f5d219f3fdbd6199822c058"} Oct 07 14:56:13 crc kubenswrapper[4854]: I1007 14:56:13.809916 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 07 14:56:13 crc kubenswrapper[4854]: I1007 14:56:13.811228 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f3779dac-2010-491e-b67c-9e54dc96802e","Type":"ContainerDied","Data":"df9e3db0ac52b6bbc01b374026bfaa4eeb4cadfc26ac4e427952e0237bb2b3f7"} Oct 07 14:56:13 crc kubenswrapper[4854]: I1007 14:56:13.811286 4854 scope.go:117] "RemoveContainer" containerID="5b7f6ddecbcef7acbabcf085221c456d859aa45dac3b562ade0d34ef256b53bc" Oct 07 14:56:13 crc kubenswrapper[4854]: I1007 14:56:13.815497 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.8154752480000003 podStartE2EDuration="3.815475248s" podCreationTimestamp="2025-10-07 14:56:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:56:13.807989461 +0000 UTC m=+9089.795821716" watchObservedRunningTime="2025-10-07 14:56:13.815475248 +0000 UTC m=+9089.803307503" Oct 07 14:56:13 crc kubenswrapper[4854]: I1007 14:56:13.818407 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 07 14:56:13 crc kubenswrapper[4854]: I1007 14:56:13.818401 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 07 14:56:13 crc kubenswrapper[4854]: I1007 14:56:13.852444 4854 scope.go:117] "RemoveContainer" containerID="11763ec9a98b94affeaab04f64e454d4111a5c368421f72492b914777447ad35" Oct 07 14:56:13 crc kubenswrapper[4854]: I1007 14:56:13.902783 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 07 14:56:13 crc kubenswrapper[4854]: I1007 14:56:13.914122 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 07 14:56:13 crc kubenswrapper[4854]: I1007 14:56:13.929616 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 07 14:56:13 crc kubenswrapper[4854]: I1007 14:56:13.940355 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 07 14:56:13 crc kubenswrapper[4854]: I1007 14:56:13.956194 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 07 14:56:13 crc kubenswrapper[4854]: I1007 14:56:13.957675 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 07 14:56:13 crc kubenswrapper[4854]: I1007 14:56:13.962908 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 07 14:56:13 crc kubenswrapper[4854]: I1007 14:56:13.965484 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 07 14:56:13 crc kubenswrapper[4854]: I1007 14:56:13.980808 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 07 14:56:13 crc kubenswrapper[4854]: I1007 14:56:13.988815 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 07 14:56:13 crc kubenswrapper[4854]: I1007 14:56:13.990984 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 07 14:56:13 crc kubenswrapper[4854]: I1007 14:56:13.995925 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 07 14:56:14 crc kubenswrapper[4854]: I1007 14:56:13.998390 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 07 14:56:14 crc kubenswrapper[4854]: I1007 14:56:14.009219 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 07 14:56:14 crc kubenswrapper[4854]: I1007 14:56:14.019271 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 07 14:56:14 crc kubenswrapper[4854]: I1007 14:56:14.020609 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 07 14:56:14 crc kubenswrapper[4854]: I1007 14:56:14.025541 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 07 14:56:14 crc kubenswrapper[4854]: I1007 14:56:14.029190 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 07 14:56:14 crc kubenswrapper[4854]: I1007 14:56:14.059092 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0825000-748f-40a8-a3b8-c1009d4b9f9e-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e0825000-748f-40a8-a3b8-c1009d4b9f9e\") " pod="openstack/nova-cell1-conductor-0" Oct 07 14:56:14 crc kubenswrapper[4854]: I1007 14:56:14.059193 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f891daf4-fdec-4042-a6d7-e2b6519d69d4-logs\") pod \"nova-api-0\" (UID: \"f891daf4-fdec-4042-a6d7-e2b6519d69d4\") " pod="openstack/nova-api-0" Oct 07 14:56:14 crc kubenswrapper[4854]: I1007 14:56:14.059230 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0825000-748f-40a8-a3b8-c1009d4b9f9e-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"e0825000-748f-40a8-a3b8-c1009d4b9f9e\") " pod="openstack/nova-cell1-conductor-0" Oct 07 14:56:14 crc kubenswrapper[4854]: I1007 14:56:14.059249 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsz72\" (UniqueName: \"kubernetes.io/projected/f891daf4-fdec-4042-a6d7-e2b6519d69d4-kube-api-access-tsz72\") pod \"nova-api-0\" (UID: \"f891daf4-fdec-4042-a6d7-e2b6519d69d4\") " pod="openstack/nova-api-0" Oct 07 14:56:14 crc kubenswrapper[4854]: I1007 14:56:14.059304 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-577b7\" (UniqueName: \"kubernetes.io/projected/e0825000-748f-40a8-a3b8-c1009d4b9f9e-kube-api-access-577b7\") pod \"nova-cell1-conductor-0\" (UID: \"e0825000-748f-40a8-a3b8-c1009d4b9f9e\") " pod="openstack/nova-cell1-conductor-0" Oct 07 14:56:14 crc kubenswrapper[4854]: I1007 14:56:14.059359 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f891daf4-fdec-4042-a6d7-e2b6519d69d4-config-data\") pod \"nova-api-0\" (UID: \"f891daf4-fdec-4042-a6d7-e2b6519d69d4\") " pod="openstack/nova-api-0" Oct 07 14:56:14 crc kubenswrapper[4854]: I1007 14:56:14.059386 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f891daf4-fdec-4042-a6d7-e2b6519d69d4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f891daf4-fdec-4042-a6d7-e2b6519d69d4\") " pod="openstack/nova-api-0" Oct 07 14:56:14 crc kubenswrapper[4854]: I1007 14:56:14.161408 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f891daf4-fdec-4042-a6d7-e2b6519d69d4-config-data\") pod \"nova-api-0\" (UID: \"f891daf4-fdec-4042-a6d7-e2b6519d69d4\") " pod="openstack/nova-api-0" Oct 07 14:56:14 crc kubenswrapper[4854]: I1007 14:56:14.161473 4854 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f891daf4-fdec-4042-a6d7-e2b6519d69d4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f891daf4-fdec-4042-a6d7-e2b6519d69d4\") " pod="openstack/nova-api-0" Oct 07 14:56:14 crc kubenswrapper[4854]: I1007 14:56:14.161507 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b78f233-a649-4acd-a7fd-da9e1932d230-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"2b78f233-a649-4acd-a7fd-da9e1932d230\") " pod="openstack/nova-cell0-conductor-0" Oct 07 14:56:14 crc kubenswrapper[4854]: I1007 14:56:14.161526 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n59lp\" (UniqueName: \"kubernetes.io/projected/2b78f233-a649-4acd-a7fd-da9e1932d230-kube-api-access-n59lp\") pod \"nova-cell0-conductor-0\" (UID: \"2b78f233-a649-4acd-a7fd-da9e1932d230\") " pod="openstack/nova-cell0-conductor-0" Oct 07 14:56:14 crc kubenswrapper[4854]: I1007 14:56:14.161557 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0825000-748f-40a8-a3b8-c1009d4b9f9e-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e0825000-748f-40a8-a3b8-c1009d4b9f9e\") " pod="openstack/nova-cell1-conductor-0" Oct 07 14:56:14 crc kubenswrapper[4854]: I1007 14:56:14.161636 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b78f233-a649-4acd-a7fd-da9e1932d230-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"2b78f233-a649-4acd-a7fd-da9e1932d230\") " pod="openstack/nova-cell0-conductor-0" Oct 07 14:56:14 crc kubenswrapper[4854]: I1007 14:56:14.161703 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f891daf4-fdec-4042-a6d7-e2b6519d69d4-logs\") pod \"nova-api-0\" (UID: \"f891daf4-fdec-4042-a6d7-e2b6519d69d4\") " pod="openstack/nova-api-0" Oct 07 14:56:14 crc kubenswrapper[4854]: I1007 14:56:14.161749 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0825000-748f-40a8-a3b8-c1009d4b9f9e-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"e0825000-748f-40a8-a3b8-c1009d4b9f9e\") " pod="openstack/nova-cell1-conductor-0" Oct 07 14:56:14 crc kubenswrapper[4854]: I1007 14:56:14.161792 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsz72\" (UniqueName: \"kubernetes.io/projected/f891daf4-fdec-4042-a6d7-e2b6519d69d4-kube-api-access-tsz72\") pod \"nova-api-0\" (UID: \"f891daf4-fdec-4042-a6d7-e2b6519d69d4\") " pod="openstack/nova-api-0" Oct 07 14:56:14 crc kubenswrapper[4854]: I1007 14:56:14.161888 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-577b7\" (UniqueName: \"kubernetes.io/projected/e0825000-748f-40a8-a3b8-c1009d4b9f9e-kube-api-access-577b7\") pod \"nova-cell1-conductor-0\" (UID: \"e0825000-748f-40a8-a3b8-c1009d4b9f9e\") " pod="openstack/nova-cell1-conductor-0" Oct 07 14:56:14 crc kubenswrapper[4854]: I1007 14:56:14.162603 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f891daf4-fdec-4042-a6d7-e2b6519d69d4-logs\") pod \"nova-api-0\" (UID: 
\"f891daf4-fdec-4042-a6d7-e2b6519d69d4\") " pod="openstack/nova-api-0" Oct 07 14:56:14 crc kubenswrapper[4854]: I1007 14:56:14.168535 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f891daf4-fdec-4042-a6d7-e2b6519d69d4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f891daf4-fdec-4042-a6d7-e2b6519d69d4\") " pod="openstack/nova-api-0" Oct 07 14:56:14 crc kubenswrapper[4854]: I1007 14:56:14.168955 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f891daf4-fdec-4042-a6d7-e2b6519d69d4-config-data\") pod \"nova-api-0\" (UID: \"f891daf4-fdec-4042-a6d7-e2b6519d69d4\") " pod="openstack/nova-api-0" Oct 07 14:56:14 crc kubenswrapper[4854]: I1007 14:56:14.169262 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0825000-748f-40a8-a3b8-c1009d4b9f9e-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e0825000-748f-40a8-a3b8-c1009d4b9f9e\") " pod="openstack/nova-cell1-conductor-0" Oct 07 14:56:14 crc kubenswrapper[4854]: I1007 14:56:14.169983 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0825000-748f-40a8-a3b8-c1009d4b9f9e-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"e0825000-748f-40a8-a3b8-c1009d4b9f9e\") " pod="openstack/nova-cell1-conductor-0" Oct 07 14:56:14 crc kubenswrapper[4854]: I1007 14:56:14.177470 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 14:56:14 crc kubenswrapper[4854]: I1007 14:56:14.183644 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsz72\" (UniqueName: \"kubernetes.io/projected/f891daf4-fdec-4042-a6d7-e2b6519d69d4-kube-api-access-tsz72\") pod \"nova-api-0\" (UID: \"f891daf4-fdec-4042-a6d7-e2b6519d69d4\") " pod="openstack/nova-api-0" Oct 07 14:56:14 crc kubenswrapper[4854]: I1007 14:56:14.184159 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-577b7\" (UniqueName: \"kubernetes.io/projected/e0825000-748f-40a8-a3b8-c1009d4b9f9e-kube-api-access-577b7\") pod \"nova-cell1-conductor-0\" (UID: \"e0825000-748f-40a8-a3b8-c1009d4b9f9e\") " pod="openstack/nova-cell1-conductor-0" Oct 07 14:56:14 crc kubenswrapper[4854]: I1007 14:56:14.266103 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b78f233-a649-4acd-a7fd-da9e1932d230-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"2b78f233-a649-4acd-a7fd-da9e1932d230\") " pod="openstack/nova-cell0-conductor-0" Oct 07 14:56:14 crc kubenswrapper[4854]: I1007 14:56:14.266537 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n59lp\" (UniqueName: \"kubernetes.io/projected/2b78f233-a649-4acd-a7fd-da9e1932d230-kube-api-access-n59lp\") pod \"nova-cell0-conductor-0\" (UID: \"2b78f233-a649-4acd-a7fd-da9e1932d230\") " pod="openstack/nova-cell0-conductor-0" Oct 07 14:56:14 crc kubenswrapper[4854]: I1007 14:56:14.266603 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b78f233-a649-4acd-a7fd-da9e1932d230-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"2b78f233-a649-4acd-a7fd-da9e1932d230\") " pod="openstack/nova-cell0-conductor-0" Oct 07 14:56:14 crc 
kubenswrapper[4854]: I1007 14:56:14.270168 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b78f233-a649-4acd-a7fd-da9e1932d230-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"2b78f233-a649-4acd-a7fd-da9e1932d230\") " pod="openstack/nova-cell0-conductor-0" Oct 07 14:56:14 crc kubenswrapper[4854]: I1007 14:56:14.271990 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b78f233-a649-4acd-a7fd-da9e1932d230-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"2b78f233-a649-4acd-a7fd-da9e1932d230\") " pod="openstack/nova-cell0-conductor-0" Oct 07 14:56:14 crc kubenswrapper[4854]: I1007 14:56:14.281399 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n59lp\" (UniqueName: \"kubernetes.io/projected/2b78f233-a649-4acd-a7fd-da9e1932d230-kube-api-access-n59lp\") pod \"nova-cell0-conductor-0\" (UID: \"2b78f233-a649-4acd-a7fd-da9e1932d230\") " pod="openstack/nova-cell0-conductor-0" Oct 07 14:56:14 crc kubenswrapper[4854]: I1007 14:56:14.293995 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 07 14:56:14 crc kubenswrapper[4854]: I1007 14:56:14.325305 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 07 14:56:14 crc kubenswrapper[4854]: I1007 14:56:14.353938 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 07 14:56:14 crc kubenswrapper[4854]: I1007 14:56:14.729024 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3619fac8-bf60-4908-a9ef-bb5af339f530" path="/var/lib/kubelet/pods/3619fac8-bf60-4908-a9ef-bb5af339f530/volumes" Oct 07 14:56:14 crc kubenswrapper[4854]: I1007 14:56:14.729931 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54138e60-1a8c-4f4d-8179-35716959b0b2" path="/var/lib/kubelet/pods/54138e60-1a8c-4f4d-8179-35716959b0b2/volumes" Oct 07 14:56:14 crc kubenswrapper[4854]: I1007 14:56:14.732531 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="840cffec-0fba-4a84-8d36-4c0cc26cadff" path="/var/lib/kubelet/pods/840cffec-0fba-4a84-8d36-4c0cc26cadff/volumes" Oct 07 14:56:14 crc kubenswrapper[4854]: I1007 14:56:14.733303 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3779dac-2010-491e-b67c-9e54dc96802e" path="/var/lib/kubelet/pods/f3779dac-2010-491e-b67c-9e54dc96802e/volumes" Oct 07 14:56:14 crc kubenswrapper[4854]: I1007 14:56:14.826711 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 07 14:56:14 crc kubenswrapper[4854]: I1007 14:56:14.831455 4854 generic.go:334] "Generic (PLEG): container finished" podID="337ac98e-97c5-4988-957c-95552f5ffb41" containerID="8f859ac6265c8de10ba0828d10ef060cd33e846da6a2eddfe4eae2fb91cc1320" exitCode=0 Oct 07 14:56:14 crc kubenswrapper[4854]: I1007 14:56:14.831896 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qr776" event={"ID":"337ac98e-97c5-4988-957c-95552f5ffb41","Type":"ContainerDied","Data":"8f859ac6265c8de10ba0828d10ef060cd33e846da6a2eddfe4eae2fb91cc1320"} Oct 07 14:56:14 crc kubenswrapper[4854]: I1007 14:56:14.837359 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"f3cd66cc-3ccf-4f55-ae63-f6874ca0b10c","Type":"ContainerStarted","Data":"b820de7c26ac09c72e40a636e5a07b8f2d289f754e807108ffff560836a152b1"} Oct 07 14:56:14 crc kubenswrapper[4854]: I1007 14:56:14.837407 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f3cd66cc-3ccf-4f55-ae63-f6874ca0b10c","Type":"ContainerStarted","Data":"4e3ba48b6866effce7d8db690ced26cb701b19a66f1edcf2b3e5a59e8511b9f7"} Oct 07 14:56:14 crc kubenswrapper[4854]: I1007 14:56:14.913756 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 07 14:56:14 crc kubenswrapper[4854]: I1007 14:56:14.994685 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 07 14:56:15 crc kubenswrapper[4854]: W1007 14:56:15.028857 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b78f233_a649_4acd_a7fd_da9e1932d230.slice/crio-a690b27edc9ef943b00e6a41d07bcfba4812b1d088ddbc953f331dbaa29e2d1a WatchSource:0}: Error finding container a690b27edc9ef943b00e6a41d07bcfba4812b1d088ddbc953f331dbaa29e2d1a: Status 404 returned error can't find the container with id a690b27edc9ef943b00e6a41d07bcfba4812b1d088ddbc953f331dbaa29e2d1a Oct 07 14:56:15 crc kubenswrapper[4854]: I1007 14:56:15.859724 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f891daf4-fdec-4042-a6d7-e2b6519d69d4","Type":"ContainerStarted","Data":"e960f63b4c3b194b8f81a8b9cc619ae5807b432d3732fcd715c8c5d1811e95bc"} Oct 07 14:56:15 crc kubenswrapper[4854]: I1007 14:56:15.860269 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f891daf4-fdec-4042-a6d7-e2b6519d69d4","Type":"ContainerStarted","Data":"c4992655db4f7d6c0a6a9292be198ab52f5c49448edeb2d136f30669ce54b9b8"} Oct 07 14:56:15 crc kubenswrapper[4854]: I1007 14:56:15.860281 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f891daf4-fdec-4042-a6d7-e2b6519d69d4","Type":"ContainerStarted","Data":"08b5e812519948ffd6d92a40646e0f0f919b9078bd746aab8fd61ba799f2fd68"} Oct 07 14:56:15 crc kubenswrapper[4854]: I1007 14:56:15.862368 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"2b78f233-a649-4acd-a7fd-da9e1932d230","Type":"ContainerStarted","Data":"08468ee4fd2aad102838063d98d4d4df9ec3efb1617ef26e75e33a1e456a65fd"} Oct 07 14:56:15 crc kubenswrapper[4854]: I1007 14:56:15.862408 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"2b78f233-a649-4acd-a7fd-da9e1932d230","Type":"ContainerStarted","Data":"a690b27edc9ef943b00e6a41d07bcfba4812b1d088ddbc953f331dbaa29e2d1a"} Oct 07 14:56:15 crc kubenswrapper[4854]: I1007 14:56:15.862849 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 07 14:56:15 crc kubenswrapper[4854]: I1007 14:56:15.865552 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-llp86" event={"ID":"5e636ae9-5753-49ab-b4ab-010572562e7d","Type":"ContainerStarted","Data":"29b5cf9ffce5ffa759c5afdc3d9ca25ee9e6a16ce74fa15304399d1d58bfe910"} Oct 07 14:56:15 crc kubenswrapper[4854]: I1007 14:56:15.869624 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"f3cd66cc-3ccf-4f55-ae63-f6874ca0b10c","Type":"ContainerStarted","Data":"051572bc3c5d5e935331ac225750984a0a6b3f6bda834a34d411110e3ce7409f"} Oct 07 14:56:15 crc kubenswrapper[4854]: I1007 14:56:15.873343 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qr776" event={"ID":"337ac98e-97c5-4988-957c-95552f5ffb41","Type":"ContainerStarted","Data":"5c9cf0e836a2522e6209fe738c739a44f659d109762df30d3d69b28524196ede"} Oct 07 14:56:15 crc kubenswrapper[4854]: I1007 14:56:15.876156 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"e0825000-748f-40a8-a3b8-c1009d4b9f9e","Type":"ContainerStarted","Data":"1402857d0c06241c1810fe228a267feb872f98328cd83ca528ce340c11846d55"} Oct 07 14:56:15 crc kubenswrapper[4854]: I1007 14:56:15.876196 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"e0825000-748f-40a8-a3b8-c1009d4b9f9e","Type":"ContainerStarted","Data":"48fb89fe2dbccc67dda5c18d754b23db2f82de5212b9400a56e0af8a55fface9"} Oct 07 14:56:15 crc kubenswrapper[4854]: I1007 14:56:15.876306 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 07 14:56:15 crc kubenswrapper[4854]: I1007 14:56:15.892794 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.892771181 podStartE2EDuration="2.892771181s" podCreationTimestamp="2025-10-07 14:56:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:56:15.880765485 +0000 UTC m=+9091.868597740" watchObservedRunningTime="2025-10-07 14:56:15.892771181 +0000 UTC m=+9091.880603436" Oct 07 14:56:15 crc kubenswrapper[4854]: I1007 14:56:15.908319 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.90829894 podStartE2EDuration="2.90829894s" podCreationTimestamp="2025-10-07 14:56:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:56:15.904559342 +0000 UTC m=+9091.892391607" watchObservedRunningTime="2025-10-07 14:56:15.90829894 +0000 UTC m=+9091.896131195" Oct 07 14:56:15 crc kubenswrapper[4854]: I1007 14:56:15.959638 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.9596178220000002 podStartE2EDuration="3.959617822s" podCreationTimestamp="2025-10-07 14:56:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:56:15.927576107 +0000 UTC m=+9091.915408352" watchObservedRunningTime="2025-10-07 14:56:15.959617822 +0000 UTC m=+9091.947450077" Oct 07 14:56:15 crc kubenswrapper[4854]: I1007 14:56:15.961847 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-llp86" podStartSLOduration=3.705567098 podStartE2EDuration="9.961837787s" podCreationTimestamp="2025-10-07 14:56:06 +0000 UTC" firstStartedPulling="2025-10-07 14:56:08.640325868 +0000 UTC m=+9084.628158123" lastFinishedPulling="2025-10-07 14:56:14.896596557 +0000 UTC m=+9090.884428812" observedRunningTime="2025-10-07 14:56:15.960688513 +0000 UTC m=+9091.948520768" watchObservedRunningTime="2025-10-07 14:56:15.961837787 +0000 
UTC m=+9091.949670042" Oct 07 14:56:15 crc kubenswrapper[4854]: I1007 14:56:15.981065 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.981043471 podStartE2EDuration="2.981043471s" podCreationTimestamp="2025-10-07 14:56:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:56:15.976650314 +0000 UTC m=+9091.964482579" watchObservedRunningTime="2025-10-07 14:56:15.981043471 +0000 UTC m=+9091.968875726" Oct 07 14:56:15 crc kubenswrapper[4854]: I1007 14:56:15.999493 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qr776" podStartSLOduration=3.237690051 podStartE2EDuration="6.999467644s" podCreationTimestamp="2025-10-07 14:56:09 +0000 UTC" firstStartedPulling="2025-10-07 14:56:11.722637184 +0000 UTC m=+9087.710469439" lastFinishedPulling="2025-10-07 14:56:15.484414777 +0000 UTC m=+9091.472247032" observedRunningTime="2025-10-07 14:56:15.996388575 +0000 UTC m=+9091.984220850" watchObservedRunningTime="2025-10-07 14:56:15.999467644 +0000 UTC m=+9091.987299899" Oct 07 14:56:16 crc kubenswrapper[4854]: I1007 14:56:16.109376 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 07 14:56:16 crc kubenswrapper[4854]: I1007 14:56:16.996004 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-llp86" Oct 07 14:56:16 crc kubenswrapper[4854]: I1007 14:56:16.996324 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-llp86" Oct 07 14:56:17 crc kubenswrapper[4854]: I1007 14:56:17.056045 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-llp86" Oct 07 14:56:18 crc kubenswrapper[4854]: I1007 14:56:18.323399 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 07 14:56:18 crc kubenswrapper[4854]: I1007 14:56:18.323739 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 07 14:56:19 crc kubenswrapper[4854]: I1007 14:56:19.399610 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qr776" Oct 07 14:56:19 crc kubenswrapper[4854]: I1007 14:56:19.399992 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qr776" Oct 07 14:56:19 crc kubenswrapper[4854]: I1007 14:56:19.466761 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qr776" Oct 07 14:56:21 crc kubenswrapper[4854]: I1007 14:56:21.109212 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 07 14:56:21 crc kubenswrapper[4854]: I1007 14:56:21.145388 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 07 14:56:21 crc kubenswrapper[4854]: I1007 14:56:21.964341 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 07 14:56:23 crc kubenswrapper[4854]: I1007 14:56:23.322527 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 07 14:56:23 crc 
kubenswrapper[4854]: I1007 14:56:23.323427 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 07 14:56:24 crc kubenswrapper[4854]: I1007 14:56:24.327044 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 07 14:56:24 crc kubenswrapper[4854]: I1007 14:56:24.327476 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 07 14:56:24 crc kubenswrapper[4854]: I1007 14:56:24.330398 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 07 14:56:24 crc kubenswrapper[4854]: I1007 14:56:24.405350 4854 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="f3cd66cc-3ccf-4f55-ae63-f6874ca0b10c" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.194:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 07 14:56:24 crc kubenswrapper[4854]: I1007 14:56:24.405611 4854 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="f3cd66cc-3ccf-4f55-ae63-f6874ca0b10c" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.194:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 07 14:56:24 crc kubenswrapper[4854]: I1007 14:56:24.409226 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 07 14:56:24 crc kubenswrapper[4854]: I1007 14:56:24.709506 4854 scope.go:117] "RemoveContainer" containerID="e54b3b075b488a857a2b079c8bf74c15dbd3a9b0c33ba12b65add9ef54f2f5a0" Oct 07 14:56:24 crc kubenswrapper[4854]: E1007 14:56:24.709835 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:56:25 crc kubenswrapper[4854]: I1007 14:56:25.409374 4854 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f891daf4-fdec-4042-a6d7-e2b6519d69d4" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.196:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 07 14:56:25 crc kubenswrapper[4854]: I1007 14:56:25.409822 4854 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f891daf4-fdec-4042-a6d7-e2b6519d69d4" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.196:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 07 14:56:27 crc kubenswrapper[4854]: I1007 14:56:27.084045 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-llp86" Oct 07 14:56:27 crc kubenswrapper[4854]: I1007 14:56:27.139009 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-llp86"] Oct 07 14:56:27 crc kubenswrapper[4854]: I1007 14:56:27.999048 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-llp86" 
podUID="5e636ae9-5753-49ab-b4ab-010572562e7d" containerName="registry-server" containerID="cri-o://29b5cf9ffce5ffa759c5afdc3d9ca25ee9e6a16ce74fa15304399d1d58bfe910" gracePeriod=2 Oct 07 14:56:28 crc kubenswrapper[4854]: I1007 14:56:28.572000 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-llp86" Oct 07 14:56:28 crc kubenswrapper[4854]: I1007 14:56:28.638495 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpgfv\" (UniqueName: \"kubernetes.io/projected/5e636ae9-5753-49ab-b4ab-010572562e7d-kube-api-access-zpgfv\") pod \"5e636ae9-5753-49ab-b4ab-010572562e7d\" (UID: \"5e636ae9-5753-49ab-b4ab-010572562e7d\") " Oct 07 14:56:28 crc kubenswrapper[4854]: I1007 14:56:28.639006 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e636ae9-5753-49ab-b4ab-010572562e7d-catalog-content\") pod \"5e636ae9-5753-49ab-b4ab-010572562e7d\" (UID: \"5e636ae9-5753-49ab-b4ab-010572562e7d\") " Oct 07 14:56:28 crc kubenswrapper[4854]: I1007 14:56:28.639283 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e636ae9-5753-49ab-b4ab-010572562e7d-utilities\") pod \"5e636ae9-5753-49ab-b4ab-010572562e7d\" (UID: \"5e636ae9-5753-49ab-b4ab-010572562e7d\") " Oct 07 14:56:28 crc kubenswrapper[4854]: I1007 14:56:28.640122 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e636ae9-5753-49ab-b4ab-010572562e7d-utilities" (OuterVolumeSpecName: "utilities") pod "5e636ae9-5753-49ab-b4ab-010572562e7d" (UID: "5e636ae9-5753-49ab-b4ab-010572562e7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:56:28 crc kubenswrapper[4854]: I1007 14:56:28.647502 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e636ae9-5753-49ab-b4ab-010572562e7d-kube-api-access-zpgfv" (OuterVolumeSpecName: "kube-api-access-zpgfv") pod "5e636ae9-5753-49ab-b4ab-010572562e7d" (UID: "5e636ae9-5753-49ab-b4ab-010572562e7d"). InnerVolumeSpecName "kube-api-access-zpgfv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:56:28 crc kubenswrapper[4854]: I1007 14:56:28.650397 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e636ae9-5753-49ab-b4ab-010572562e7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5e636ae9-5753-49ab-b4ab-010572562e7d" (UID: "5e636ae9-5753-49ab-b4ab-010572562e7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:56:28 crc kubenswrapper[4854]: I1007 14:56:28.742020 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e636ae9-5753-49ab-b4ab-010572562e7d-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 14:56:28 crc kubenswrapper[4854]: I1007 14:56:28.742054 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zpgfv\" (UniqueName: \"kubernetes.io/projected/5e636ae9-5753-49ab-b4ab-010572562e7d-kube-api-access-zpgfv\") on node \"crc\" DevicePath \"\"" Oct 07 14:56:28 crc kubenswrapper[4854]: I1007 14:56:28.742063 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e636ae9-5753-49ab-b4ab-010572562e7d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 14:56:29 crc kubenswrapper[4854]: I1007 14:56:29.013449 4854 generic.go:334] "Generic (PLEG): container finished" podID="5e636ae9-5753-49ab-b4ab-010572562e7d" containerID="29b5cf9ffce5ffa759c5afdc3d9ca25ee9e6a16ce74fa15304399d1d58bfe910" exitCode=0 Oct 07 14:56:29 crc kubenswrapper[4854]: I1007 14:56:29.013506 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-llp86" event={"ID":"5e636ae9-5753-49ab-b4ab-010572562e7d","Type":"ContainerDied","Data":"29b5cf9ffce5ffa759c5afdc3d9ca25ee9e6a16ce74fa15304399d1d58bfe910"} Oct 07 14:56:29 crc kubenswrapper[4854]: I1007 14:56:29.013516 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-llp86" Oct 07 14:56:29 crc kubenswrapper[4854]: I1007 14:56:29.013537 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-llp86" event={"ID":"5e636ae9-5753-49ab-b4ab-010572562e7d","Type":"ContainerDied","Data":"f5eff11b12657bd838b61f46395c57fc43279b9aa07dece6feefc0677bb197f4"} Oct 07 14:56:29 crc kubenswrapper[4854]: I1007 14:56:29.013559 4854 scope.go:117] "RemoveContainer" containerID="29b5cf9ffce5ffa759c5afdc3d9ca25ee9e6a16ce74fa15304399d1d58bfe910" Oct 07 14:56:29 crc kubenswrapper[4854]: I1007 14:56:29.047493 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-llp86"] Oct 07 14:56:29 crc kubenswrapper[4854]: I1007 14:56:29.054680 4854 scope.go:117] "RemoveContainer" containerID="12434ac2dcfdbb02cc08c43aa8fb724cbd34495f5f5d219f3fdbd6199822c058" Oct 07 14:56:29 crc kubenswrapper[4854]: I1007 14:56:29.061106 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-llp86"] Oct 07 14:56:29 crc kubenswrapper[4854]: I1007 14:56:29.094020 4854 scope.go:117] "RemoveContainer" containerID="d1365f5057f1dbe38c5d9f99934fd8d1efbc973d423227cfcef2e9f1b6542d03" Oct 07 14:56:29 crc kubenswrapper[4854]: I1007 14:56:29.140226 4854 scope.go:117] "RemoveContainer" containerID="29b5cf9ffce5ffa759c5afdc3d9ca25ee9e6a16ce74fa15304399d1d58bfe910" Oct 07 14:56:29 crc kubenswrapper[4854]: E1007 14:56:29.141511 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29b5cf9ffce5ffa759c5afdc3d9ca25ee9e6a16ce74fa15304399d1d58bfe910\": container with ID starting with 29b5cf9ffce5ffa759c5afdc3d9ca25ee9e6a16ce74fa15304399d1d58bfe910 not found: ID does not exist" containerID="29b5cf9ffce5ffa759c5afdc3d9ca25ee9e6a16ce74fa15304399d1d58bfe910" Oct 07 14:56:29 crc kubenswrapper[4854]: I1007 14:56:29.141587 4854 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29b5cf9ffce5ffa759c5afdc3d9ca25ee9e6a16ce74fa15304399d1d58bfe910"} err="failed to get container status \"29b5cf9ffce5ffa759c5afdc3d9ca25ee9e6a16ce74fa15304399d1d58bfe910\": rpc error: code = NotFound desc = could not find container \"29b5cf9ffce5ffa759c5afdc3d9ca25ee9e6a16ce74fa15304399d1d58bfe910\": container with ID starting with 29b5cf9ffce5ffa759c5afdc3d9ca25ee9e6a16ce74fa15304399d1d58bfe910 not found: ID does not exist" Oct 07 14:56:29 crc kubenswrapper[4854]: I1007 14:56:29.141620 4854 scope.go:117] "RemoveContainer" containerID="12434ac2dcfdbb02cc08c43aa8fb724cbd34495f5f5d219f3fdbd6199822c058" Oct 07 14:56:29 crc kubenswrapper[4854]: E1007 14:56:29.142169 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12434ac2dcfdbb02cc08c43aa8fb724cbd34495f5f5d219f3fdbd6199822c058\": container with ID starting with 12434ac2dcfdbb02cc08c43aa8fb724cbd34495f5f5d219f3fdbd6199822c058 not found: ID does not exist" containerID="12434ac2dcfdbb02cc08c43aa8fb724cbd34495f5f5d219f3fdbd6199822c058" Oct 07 14:56:29 crc kubenswrapper[4854]: I1007 14:56:29.142211 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12434ac2dcfdbb02cc08c43aa8fb724cbd34495f5f5d219f3fdbd6199822c058"} err="failed to get container status \"12434ac2dcfdbb02cc08c43aa8fb724cbd34495f5f5d219f3fdbd6199822c058\": rpc error: code = NotFound desc = could not find container \"12434ac2dcfdbb02cc08c43aa8fb724cbd34495f5f5d219f3fdbd6199822c058\": container with ID starting with 12434ac2dcfdbb02cc08c43aa8fb724cbd34495f5f5d219f3fdbd6199822c058 not found: ID does not exist" Oct 07 14:56:29 crc kubenswrapper[4854]: I1007 14:56:29.142236 4854 scope.go:117] "RemoveContainer" containerID="d1365f5057f1dbe38c5d9f99934fd8d1efbc973d423227cfcef2e9f1b6542d03" Oct 07 14:56:29 crc kubenswrapper[4854]: E1007 14:56:29.142602 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1365f5057f1dbe38c5d9f99934fd8d1efbc973d423227cfcef2e9f1b6542d03\": container with ID starting with d1365f5057f1dbe38c5d9f99934fd8d1efbc973d423227cfcef2e9f1b6542d03 not found: ID does not exist" containerID="d1365f5057f1dbe38c5d9f99934fd8d1efbc973d423227cfcef2e9f1b6542d03" Oct 07 14:56:29 crc kubenswrapper[4854]: I1007 14:56:29.142637 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1365f5057f1dbe38c5d9f99934fd8d1efbc973d423227cfcef2e9f1b6542d03"} err="failed to get container status \"d1365f5057f1dbe38c5d9f99934fd8d1efbc973d423227cfcef2e9f1b6542d03\": rpc error: code = NotFound desc = could not find container \"d1365f5057f1dbe38c5d9f99934fd8d1efbc973d423227cfcef2e9f1b6542d03\": container with ID starting with d1365f5057f1dbe38c5d9f99934fd8d1efbc973d423227cfcef2e9f1b6542d03 not found: ID does not exist" Oct 07 14:56:29 crc kubenswrapper[4854]: I1007 14:56:29.173416 4854 scope.go:117] "RemoveContainer" containerID="1c7fea179c8c721785e4fd31cd968128bd001e619047793a11f3cbc54a0a86f7" Oct 07 14:56:29 crc kubenswrapper[4854]: I1007 14:56:29.212277 4854 scope.go:117] "RemoveContainer" containerID="0748a0f6c6ff32f97816e5d32c085f155fba2d7e63486d88aea7d29d50169426" Oct 07 14:56:29 crc kubenswrapper[4854]: I1007 14:56:29.450165 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qr776" Oct 07 
14:56:30 crc kubenswrapper[4854]: I1007 14:56:30.722384 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e636ae9-5753-49ab-b4ab-010572562e7d" path="/var/lib/kubelet/pods/5e636ae9-5753-49ab-b4ab-010572562e7d/volumes" Oct 07 14:56:31 crc kubenswrapper[4854]: I1007 14:56:31.720163 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qr776"] Oct 07 14:56:31 crc kubenswrapper[4854]: I1007 14:56:31.720684 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qr776" podUID="337ac98e-97c5-4988-957c-95552f5ffb41" containerName="registry-server" containerID="cri-o://5c9cf0e836a2522e6209fe738c739a44f659d109762df30d3d69b28524196ede" gracePeriod=2 Oct 07 14:56:32 crc kubenswrapper[4854]: I1007 14:56:32.063249 4854 generic.go:334] "Generic (PLEG): container finished" podID="337ac98e-97c5-4988-957c-95552f5ffb41" containerID="5c9cf0e836a2522e6209fe738c739a44f659d109762df30d3d69b28524196ede" exitCode=0 Oct 07 14:56:32 crc kubenswrapper[4854]: I1007 14:56:32.063348 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qr776" event={"ID":"337ac98e-97c5-4988-957c-95552f5ffb41","Type":"ContainerDied","Data":"5c9cf0e836a2522e6209fe738c739a44f659d109762df30d3d69b28524196ede"} Oct 07 14:56:32 crc kubenswrapper[4854]: I1007 14:56:32.302559 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qr776" Oct 07 14:56:32 crc kubenswrapper[4854]: I1007 14:56:32.420180 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/337ac98e-97c5-4988-957c-95552f5ffb41-catalog-content\") pod \"337ac98e-97c5-4988-957c-95552f5ffb41\" (UID: \"337ac98e-97c5-4988-957c-95552f5ffb41\") " Oct 07 14:56:32 crc kubenswrapper[4854]: I1007 14:56:32.420382 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/337ac98e-97c5-4988-957c-95552f5ffb41-utilities\") pod \"337ac98e-97c5-4988-957c-95552f5ffb41\" (UID: \"337ac98e-97c5-4988-957c-95552f5ffb41\") " Oct 07 14:56:32 crc kubenswrapper[4854]: I1007 14:56:32.420411 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2t477\" (UniqueName: \"kubernetes.io/projected/337ac98e-97c5-4988-957c-95552f5ffb41-kube-api-access-2t477\") pod \"337ac98e-97c5-4988-957c-95552f5ffb41\" (UID: \"337ac98e-97c5-4988-957c-95552f5ffb41\") " Oct 07 14:56:32 crc kubenswrapper[4854]: I1007 14:56:32.421758 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/337ac98e-97c5-4988-957c-95552f5ffb41-utilities" (OuterVolumeSpecName: "utilities") pod "337ac98e-97c5-4988-957c-95552f5ffb41" (UID: "337ac98e-97c5-4988-957c-95552f5ffb41"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:56:32 crc kubenswrapper[4854]: I1007 14:56:32.422305 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/337ac98e-97c5-4988-957c-95552f5ffb41-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 14:56:32 crc kubenswrapper[4854]: I1007 14:56:32.426300 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/337ac98e-97c5-4988-957c-95552f5ffb41-kube-api-access-2t477" (OuterVolumeSpecName: "kube-api-access-2t477") pod "337ac98e-97c5-4988-957c-95552f5ffb41" (UID: "337ac98e-97c5-4988-957c-95552f5ffb41"). InnerVolumeSpecName "kube-api-access-2t477". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:56:32 crc kubenswrapper[4854]: I1007 14:56:32.478049 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/337ac98e-97c5-4988-957c-95552f5ffb41-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "337ac98e-97c5-4988-957c-95552f5ffb41" (UID: "337ac98e-97c5-4988-957c-95552f5ffb41"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:56:32 crc kubenswrapper[4854]: I1007 14:56:32.524976 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2t477\" (UniqueName: \"kubernetes.io/projected/337ac98e-97c5-4988-957c-95552f5ffb41-kube-api-access-2t477\") on node \"crc\" DevicePath \"\"" Oct 07 14:56:32 crc kubenswrapper[4854]: I1007 14:56:32.525346 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/337ac98e-97c5-4988-957c-95552f5ffb41-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 14:56:33 crc kubenswrapper[4854]: I1007 14:56:33.078881 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qr776" event={"ID":"337ac98e-97c5-4988-957c-95552f5ffb41","Type":"ContainerDied","Data":"df98d0d3672d67259db24fe3106afe68c69f82271513e3cad9ef7a817ec97dbd"} Oct 07 14:56:33 crc kubenswrapper[4854]: I1007 14:56:33.079364 4854 scope.go:117] "RemoveContainer" containerID="5c9cf0e836a2522e6209fe738c739a44f659d109762df30d3d69b28524196ede" Oct 07 14:56:33 crc kubenswrapper[4854]: I1007 14:56:33.079172 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qr776" Oct 07 14:56:33 crc kubenswrapper[4854]: I1007 14:56:33.121872 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qr776"] Oct 07 14:56:33 crc kubenswrapper[4854]: I1007 14:56:33.131503 4854 scope.go:117] "RemoveContainer" containerID="8f859ac6265c8de10ba0828d10ef060cd33e846da6a2eddfe4eae2fb91cc1320" Oct 07 14:56:33 crc kubenswrapper[4854]: I1007 14:56:33.135685 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qr776"] Oct 07 14:56:33 crc kubenswrapper[4854]: I1007 14:56:33.156625 4854 scope.go:117] "RemoveContainer" containerID="2099e2c54421dd9496b5836aea45e5a8273b50002edfe92f8dc317146c6104ec" Oct 07 14:56:33 crc kubenswrapper[4854]: I1007 14:56:33.326131 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 07 14:56:33 crc kubenswrapper[4854]: I1007 14:56:33.327140 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 07 14:56:33 crc kubenswrapper[4854]: I1007 14:56:33.328841 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 07 14:56:34 crc kubenswrapper[4854]: I1007 14:56:34.099108 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 07 14:56:34 crc kubenswrapper[4854]: I1007 14:56:34.332668 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 07 14:56:34 crc kubenswrapper[4854]: I1007 14:56:34.333163 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 07 14:56:34 crc kubenswrapper[4854]: I1007 14:56:34.334481 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 07 14:56:34 crc kubenswrapper[4854]: I1007 14:56:34.335765 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 07 14:56:34 crc kubenswrapper[4854]: I1007 14:56:34.719192 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="337ac98e-97c5-4988-957c-95552f5ffb41" path="/var/lib/kubelet/pods/337ac98e-97c5-4988-957c-95552f5ffb41/volumes" Oct 07 14:56:35 crc kubenswrapper[4854]: I1007 14:56:35.105953 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 07 14:56:35 crc kubenswrapper[4854]: I1007 14:56:35.113112 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 07 14:56:36 crc kubenswrapper[4854]: I1007 14:56:36.195427 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp"] Oct 07 14:56:36 crc kubenswrapper[4854]: E1007 14:56:36.195865 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e636ae9-5753-49ab-b4ab-010572562e7d" containerName="extract-utilities" Oct 07 14:56:36 crc kubenswrapper[4854]: I1007 14:56:36.195877 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e636ae9-5753-49ab-b4ab-010572562e7d" containerName="extract-utilities" Oct 07 14:56:36 crc kubenswrapper[4854]: E1007 14:56:36.195898 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e636ae9-5753-49ab-b4ab-010572562e7d" containerName="extract-content" Oct 07 14:56:36 crc kubenswrapper[4854]: I1007 14:56:36.195904 4854 
state_mem.go:107] "Deleted CPUSet assignment" podUID="5e636ae9-5753-49ab-b4ab-010572562e7d" containerName="extract-content" Oct 07 14:56:36 crc kubenswrapper[4854]: E1007 14:56:36.195913 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="337ac98e-97c5-4988-957c-95552f5ffb41" containerName="extract-content" Oct 07 14:56:36 crc kubenswrapper[4854]: I1007 14:56:36.195919 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="337ac98e-97c5-4988-957c-95552f5ffb41" containerName="extract-content" Oct 07 14:56:36 crc kubenswrapper[4854]: E1007 14:56:36.195930 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e636ae9-5753-49ab-b4ab-010572562e7d" containerName="registry-server" Oct 07 14:56:36 crc kubenswrapper[4854]: I1007 14:56:36.195936 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e636ae9-5753-49ab-b4ab-010572562e7d" containerName="registry-server" Oct 07 14:56:36 crc kubenswrapper[4854]: E1007 14:56:36.195959 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="337ac98e-97c5-4988-957c-95552f5ffb41" containerName="registry-server" Oct 07 14:56:36 crc kubenswrapper[4854]: I1007 14:56:36.195965 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="337ac98e-97c5-4988-957c-95552f5ffb41" containerName="registry-server" Oct 07 14:56:36 crc kubenswrapper[4854]: E1007 14:56:36.195975 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="337ac98e-97c5-4988-957c-95552f5ffb41" containerName="extract-utilities" Oct 07 14:56:36 crc kubenswrapper[4854]: I1007 14:56:36.195980 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="337ac98e-97c5-4988-957c-95552f5ffb41" containerName="extract-utilities" Oct 07 14:56:36 crc kubenswrapper[4854]: I1007 14:56:36.196191 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e636ae9-5753-49ab-b4ab-010572562e7d" containerName="registry-server" Oct 07 14:56:36 crc kubenswrapper[4854]: I1007 14:56:36.196213 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="337ac98e-97c5-4988-957c-95552f5ffb41" containerName="registry-server" Oct 07 14:56:36 crc kubenswrapper[4854]: I1007 14:56:36.197184 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp" Oct 07 14:56:36 crc kubenswrapper[4854]: I1007 14:56:36.199694 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Oct 07 14:56:36 crc kubenswrapper[4854]: I1007 14:56:36.200922 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 07 14:56:36 crc kubenswrapper[4854]: I1007 14:56:36.201212 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config" Oct 07 14:56:36 crc kubenswrapper[4854]: I1007 14:56:36.201444 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 14:56:36 crc kubenswrapper[4854]: I1007 14:56:36.201646 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 07 14:56:36 crc kubenswrapper[4854]: I1007 14:56:36.201776 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-n7cf5" Oct 07 14:56:36 crc kubenswrapper[4854]: I1007 14:56:36.201905 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Oct 07 14:56:36 crc kubenswrapper[4854]: I1007 14:56:36.212104 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp"] Oct 07 14:56:36 crc kubenswrapper[4854]: I1007 14:56:36.328386 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/0913c01c-83f0-4041-a160-3ab1f63d15f3-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp\" (UID: \"0913c01c-83f0-4041-a160-3ab1f63d15f3\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp" Oct 07 14:56:36 crc kubenswrapper[4854]: I1007 14:56:36.328728 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/0913c01c-83f0-4041-a160-3ab1f63d15f3-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp\" (UID: \"0913c01c-83f0-4041-a160-3ab1f63d15f3\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp" Oct 07 14:56:36 crc kubenswrapper[4854]: I1007 14:56:36.328770 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0913c01c-83f0-4041-a160-3ab1f63d15f3-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp\" (UID: \"0913c01c-83f0-4041-a160-3ab1f63d15f3\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp" Oct 07 14:56:36 crc kubenswrapper[4854]: I1007 14:56:36.328786 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0913c01c-83f0-4041-a160-3ab1f63d15f3-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp\" (UID: \"0913c01c-83f0-4041-a160-3ab1f63d15f3\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp" Oct 07 14:56:36 crc kubenswrapper[4854]: I1007 14:56:36.328816 4854 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/0913c01c-83f0-4041-a160-3ab1f63d15f3-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp\" (UID: \"0913c01c-83f0-4041-a160-3ab1f63d15f3\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp" Oct 07 14:56:36 crc kubenswrapper[4854]: I1007 14:56:36.328842 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/0913c01c-83f0-4041-a160-3ab1f63d15f3-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp\" (UID: \"0913c01c-83f0-4041-a160-3ab1f63d15f3\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp" Oct 07 14:56:36 crc kubenswrapper[4854]: I1007 14:56:36.328873 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/0913c01c-83f0-4041-a160-3ab1f63d15f3-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp\" (UID: \"0913c01c-83f0-4041-a160-3ab1f63d15f3\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp" Oct 07 14:56:36 crc kubenswrapper[4854]: I1007 14:56:36.328911 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqc78\" (UniqueName: \"kubernetes.io/projected/0913c01c-83f0-4041-a160-3ab1f63d15f3-kube-api-access-rqc78\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp\" (UID: \"0913c01c-83f0-4041-a160-3ab1f63d15f3\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp" Oct 07 14:56:36 crc kubenswrapper[4854]: I1007 14:56:36.328933 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0913c01c-83f0-4041-a160-3ab1f63d15f3-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp\" (UID: \"0913c01c-83f0-4041-a160-3ab1f63d15f3\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp" Oct 07 14:56:36 crc kubenswrapper[4854]: I1007 14:56:36.328960 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0913c01c-83f0-4041-a160-3ab1f63d15f3-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp\" (UID: \"0913c01c-83f0-4041-a160-3ab1f63d15f3\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp" Oct 07 14:56:36 crc kubenswrapper[4854]: I1007 14:56:36.328980 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0913c01c-83f0-4041-a160-3ab1f63d15f3-ssh-key\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp\" (UID: \"0913c01c-83f0-4041-a160-3ab1f63d15f3\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp" Oct 07 14:56:36 crc kubenswrapper[4854]: I1007 14:56:36.431223 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqc78\" (UniqueName: 
\"kubernetes.io/projected/0913c01c-83f0-4041-a160-3ab1f63d15f3-kube-api-access-rqc78\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp\" (UID: \"0913c01c-83f0-4041-a160-3ab1f63d15f3\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp" Oct 07 14:56:36 crc kubenswrapper[4854]: I1007 14:56:36.431287 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0913c01c-83f0-4041-a160-3ab1f63d15f3-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp\" (UID: \"0913c01c-83f0-4041-a160-3ab1f63d15f3\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp" Oct 07 14:56:36 crc kubenswrapper[4854]: I1007 14:56:36.431326 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0913c01c-83f0-4041-a160-3ab1f63d15f3-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp\" (UID: \"0913c01c-83f0-4041-a160-3ab1f63d15f3\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp" Oct 07 14:56:36 crc kubenswrapper[4854]: I1007 14:56:36.431354 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0913c01c-83f0-4041-a160-3ab1f63d15f3-ssh-key\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp\" (UID: \"0913c01c-83f0-4041-a160-3ab1f63d15f3\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp" Oct 07 14:56:36 crc kubenswrapper[4854]: I1007 14:56:36.431478 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/0913c01c-83f0-4041-a160-3ab1f63d15f3-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp\" (UID: \"0913c01c-83f0-4041-a160-3ab1f63d15f3\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp" Oct 07 14:56:36 crc kubenswrapper[4854]: I1007 14:56:36.431539 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/0913c01c-83f0-4041-a160-3ab1f63d15f3-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp\" (UID: \"0913c01c-83f0-4041-a160-3ab1f63d15f3\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp" Oct 07 14:56:36 crc kubenswrapper[4854]: I1007 14:56:36.431587 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0913c01c-83f0-4041-a160-3ab1f63d15f3-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp\" (UID: \"0913c01c-83f0-4041-a160-3ab1f63d15f3\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp" Oct 07 14:56:36 crc kubenswrapper[4854]: I1007 14:56:36.431610 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0913c01c-83f0-4041-a160-3ab1f63d15f3-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp\" (UID: \"0913c01c-83f0-4041-a160-3ab1f63d15f3\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp" Oct 07 
14:56:36 crc kubenswrapper[4854]: I1007 14:56:36.431653 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/0913c01c-83f0-4041-a160-3ab1f63d15f3-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp\" (UID: \"0913c01c-83f0-4041-a160-3ab1f63d15f3\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp" Oct 07 14:56:36 crc kubenswrapper[4854]: I1007 14:56:36.431688 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/0913c01c-83f0-4041-a160-3ab1f63d15f3-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp\" (UID: \"0913c01c-83f0-4041-a160-3ab1f63d15f3\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp" Oct 07 14:56:36 crc kubenswrapper[4854]: I1007 14:56:36.431730 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/0913c01c-83f0-4041-a160-3ab1f63d15f3-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp\" (UID: \"0913c01c-83f0-4041-a160-3ab1f63d15f3\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp" Oct 07 14:56:36 crc kubenswrapper[4854]: I1007 14:56:36.433364 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/0913c01c-83f0-4041-a160-3ab1f63d15f3-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp\" (UID: \"0913c01c-83f0-4041-a160-3ab1f63d15f3\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp" Oct 07 14:56:36 crc kubenswrapper[4854]: I1007 14:56:36.433390 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/0913c01c-83f0-4041-a160-3ab1f63d15f3-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp\" (UID: \"0913c01c-83f0-4041-a160-3ab1f63d15f3\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp" Oct 07 14:56:36 crc kubenswrapper[4854]: I1007 14:56:36.438079 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/0913c01c-83f0-4041-a160-3ab1f63d15f3-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp\" (UID: \"0913c01c-83f0-4041-a160-3ab1f63d15f3\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp" Oct 07 14:56:36 crc kubenswrapper[4854]: I1007 14:56:36.438139 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/0913c01c-83f0-4041-a160-3ab1f63d15f3-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp\" (UID: \"0913c01c-83f0-4041-a160-3ab1f63d15f3\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp" Oct 07 14:56:36 crc kubenswrapper[4854]: I1007 14:56:36.439791 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0913c01c-83f0-4041-a160-3ab1f63d15f3-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp\" (UID: \"0913c01c-83f0-4041-a160-3ab1f63d15f3\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp" Oct 07 14:56:36 crc kubenswrapper[4854]: I1007 14:56:36.439815 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0913c01c-83f0-4041-a160-3ab1f63d15f3-ssh-key\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp\" (UID: \"0913c01c-83f0-4041-a160-3ab1f63d15f3\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp" Oct 07 14:56:36 crc kubenswrapper[4854]: I1007 14:56:36.441204 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0913c01c-83f0-4041-a160-3ab1f63d15f3-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp\" (UID: \"0913c01c-83f0-4041-a160-3ab1f63d15f3\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp" Oct 07 14:56:36 crc kubenswrapper[4854]: I1007 14:56:36.441506 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0913c01c-83f0-4041-a160-3ab1f63d15f3-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp\" (UID: \"0913c01c-83f0-4041-a160-3ab1f63d15f3\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp" Oct 07 14:56:36 crc kubenswrapper[4854]: I1007 14:56:36.441970 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0913c01c-83f0-4041-a160-3ab1f63d15f3-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp\" (UID: \"0913c01c-83f0-4041-a160-3ab1f63d15f3\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp" Oct 07 14:56:36 crc kubenswrapper[4854]: I1007 14:56:36.442013 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/0913c01c-83f0-4041-a160-3ab1f63d15f3-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp\" (UID: \"0913c01c-83f0-4041-a160-3ab1f63d15f3\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp" Oct 07 14:56:36 crc kubenswrapper[4854]: I1007 14:56:36.459447 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqc78\" (UniqueName: \"kubernetes.io/projected/0913c01c-83f0-4041-a160-3ab1f63d15f3-kube-api-access-rqc78\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp\" (UID: \"0913c01c-83f0-4041-a160-3ab1f63d15f3\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp" Oct 07 14:56:36 crc kubenswrapper[4854]: I1007 14:56:36.538572 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp" Oct 07 14:56:37 crc kubenswrapper[4854]: I1007 14:56:37.060206 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp"] Oct 07 14:56:37 crc kubenswrapper[4854]: I1007 14:56:37.122398 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp" event={"ID":"0913c01c-83f0-4041-a160-3ab1f63d15f3","Type":"ContainerStarted","Data":"f80022350a753bcd7d3cb8d2736e51fd965479e83a0a5eb478bb6fb03c9f3e65"} Oct 07 14:56:38 crc kubenswrapper[4854]: I1007 14:56:38.134237 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp" event={"ID":"0913c01c-83f0-4041-a160-3ab1f63d15f3","Type":"ContainerStarted","Data":"9abf41c25013d4683beb5c40160ef5cfb17da4b90416eefd014da2938b13523a"} Oct 07 14:56:38 crc kubenswrapper[4854]: I1007 14:56:38.160220 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp" podStartSLOduration=1.677707047 podStartE2EDuration="2.160201565s" podCreationTimestamp="2025-10-07 14:56:36 +0000 UTC" firstStartedPulling="2025-10-07 14:56:37.057688407 +0000 UTC m=+9113.045520672" lastFinishedPulling="2025-10-07 14:56:37.540182935 +0000 UTC m=+9113.528015190" observedRunningTime="2025-10-07 14:56:38.152046159 +0000 UTC m=+9114.139878414" watchObservedRunningTime="2025-10-07 14:56:38.160201565 +0000 UTC m=+9114.148033820" Oct 07 14:56:38 crc kubenswrapper[4854]: I1007 14:56:38.703097 4854 scope.go:117] "RemoveContainer" containerID="e54b3b075b488a857a2b079c8bf74c15dbd3a9b0c33ba12b65add9ef54f2f5a0" Oct 07 14:56:38 crc kubenswrapper[4854]: E1007 14:56:38.703911 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:56:42 crc kubenswrapper[4854]: I1007 14:56:42.778313 4854 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod54138e60-1a8c-4f4d-8179-35716959b0b2"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod54138e60-1a8c-4f4d-8179-35716959b0b2] : Timed out while waiting for systemd to remove kubepods-besteffort-pod54138e60_1a8c_4f4d_8179_35716959b0b2.slice" Oct 07 14:56:42 crc kubenswrapper[4854]: I1007 14:56:42.896246 4854 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-api-0" podUID="f3779dac-2010-491e-b67c-9e54dc96802e" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.86:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 07 14:56:42 crc kubenswrapper[4854]: I1007 14:56:42.896243 4854 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-api-0" podUID="f3779dac-2010-491e-b67c-9e54dc96802e" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.86:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 07 14:56:53 crc kubenswrapper[4854]: I1007 
14:56:53.702767 4854 scope.go:117] "RemoveContainer" containerID="e54b3b075b488a857a2b079c8bf74c15dbd3a9b0c33ba12b65add9ef54f2f5a0" Oct 07 14:56:53 crc kubenswrapper[4854]: E1007 14:56:53.704543 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:57:05 crc kubenswrapper[4854]: I1007 14:57:05.703463 4854 scope.go:117] "RemoveContainer" containerID="e54b3b075b488a857a2b079c8bf74c15dbd3a9b0c33ba12b65add9ef54f2f5a0" Oct 07 14:57:05 crc kubenswrapper[4854]: E1007 14:57:05.704627 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:57:18 crc kubenswrapper[4854]: I1007 14:57:18.702874 4854 scope.go:117] "RemoveContainer" containerID="e54b3b075b488a857a2b079c8bf74c15dbd3a9b0c33ba12b65add9ef54f2f5a0" Oct 07 14:57:18 crc kubenswrapper[4854]: E1007 14:57:18.703740 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:57:31 crc kubenswrapper[4854]: I1007 14:57:31.702682 4854 scope.go:117] "RemoveContainer" containerID="e54b3b075b488a857a2b079c8bf74c15dbd3a9b0c33ba12b65add9ef54f2f5a0" Oct 07 14:57:31 crc kubenswrapper[4854]: E1007 14:57:31.703760 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 14:57:42 crc kubenswrapper[4854]: I1007 14:57:42.702994 4854 scope.go:117] "RemoveContainer" containerID="e54b3b075b488a857a2b079c8bf74c15dbd3a9b0c33ba12b65add9ef54f2f5a0" Oct 07 14:57:42 crc kubenswrapper[4854]: I1007 14:57:42.976867 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" event={"ID":"40b8b82d-cfd5-41d7-8673-5774db092c85","Type":"ContainerStarted","Data":"58ac045a48df7ab0097355790ac73ebe4d593819b5da9bfc43d6c446f42d496c"} Oct 07 15:00:00 crc kubenswrapper[4854]: I1007 15:00:00.175792 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330820-pxfzz"] Oct 07 15:00:00 crc kubenswrapper[4854]: I1007 15:00:00.177791 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330820-pxfzz" Oct 07 15:00:00 crc kubenswrapper[4854]: I1007 15:00:00.182617 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 07 15:00:00 crc kubenswrapper[4854]: I1007 15:00:00.182790 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 07 15:00:00 crc kubenswrapper[4854]: I1007 15:00:00.185208 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/83ef8cc8-3195-431c-8ad1-d0f116629070-secret-volume\") pod \"collect-profiles-29330820-pxfzz\" (UID: \"83ef8cc8-3195-431c-8ad1-d0f116629070\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330820-pxfzz" Oct 07 15:00:00 crc kubenswrapper[4854]: I1007 15:00:00.185292 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/83ef8cc8-3195-431c-8ad1-d0f116629070-config-volume\") pod \"collect-profiles-29330820-pxfzz\" (UID: \"83ef8cc8-3195-431c-8ad1-d0f116629070\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330820-pxfzz" Oct 07 15:00:00 crc kubenswrapper[4854]: I1007 15:00:00.185314 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frv6l\" (UniqueName: \"kubernetes.io/projected/83ef8cc8-3195-431c-8ad1-d0f116629070-kube-api-access-frv6l\") pod \"collect-profiles-29330820-pxfzz\" (UID: \"83ef8cc8-3195-431c-8ad1-d0f116629070\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330820-pxfzz" Oct 07 15:00:00 crc kubenswrapper[4854]: I1007 15:00:00.193896 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330820-pxfzz"] Oct 07 15:00:00 crc kubenswrapper[4854]: I1007 15:00:00.286516 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/83ef8cc8-3195-431c-8ad1-d0f116629070-config-volume\") pod \"collect-profiles-29330820-pxfzz\" (UID: \"83ef8cc8-3195-431c-8ad1-d0f116629070\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330820-pxfzz" Oct 07 15:00:00 crc kubenswrapper[4854]: I1007 15:00:00.286808 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frv6l\" (UniqueName: \"kubernetes.io/projected/83ef8cc8-3195-431c-8ad1-d0f116629070-kube-api-access-frv6l\") pod \"collect-profiles-29330820-pxfzz\" (UID: \"83ef8cc8-3195-431c-8ad1-d0f116629070\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330820-pxfzz" Oct 07 15:00:00 crc kubenswrapper[4854]: I1007 15:00:00.286936 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/83ef8cc8-3195-431c-8ad1-d0f116629070-secret-volume\") pod \"collect-profiles-29330820-pxfzz\" (UID: \"83ef8cc8-3195-431c-8ad1-d0f116629070\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330820-pxfzz" Oct 07 15:00:00 crc kubenswrapper[4854]: I1007 15:00:00.287430 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/83ef8cc8-3195-431c-8ad1-d0f116629070-config-volume\") pod 
\"collect-profiles-29330820-pxfzz\" (UID: \"83ef8cc8-3195-431c-8ad1-d0f116629070\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330820-pxfzz" Oct 07 15:00:00 crc kubenswrapper[4854]: I1007 15:00:00.293445 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/83ef8cc8-3195-431c-8ad1-d0f116629070-secret-volume\") pod \"collect-profiles-29330820-pxfzz\" (UID: \"83ef8cc8-3195-431c-8ad1-d0f116629070\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330820-pxfzz" Oct 07 15:00:00 crc kubenswrapper[4854]: I1007 15:00:00.306821 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frv6l\" (UniqueName: \"kubernetes.io/projected/83ef8cc8-3195-431c-8ad1-d0f116629070-kube-api-access-frv6l\") pod \"collect-profiles-29330820-pxfzz\" (UID: \"83ef8cc8-3195-431c-8ad1-d0f116629070\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330820-pxfzz" Oct 07 15:00:00 crc kubenswrapper[4854]: I1007 15:00:00.502595 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330820-pxfzz" Oct 07 15:00:00 crc kubenswrapper[4854]: I1007 15:00:00.984632 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330820-pxfzz"] Oct 07 15:00:01 crc kubenswrapper[4854]: I1007 15:00:01.974870 4854 generic.go:334] "Generic (PLEG): container finished" podID="83ef8cc8-3195-431c-8ad1-d0f116629070" containerID="fa52cddbf972e4378195ab9db4e99af79b8d550c98847738040216e49087e47d" exitCode=0 Oct 07 15:00:01 crc kubenswrapper[4854]: I1007 15:00:01.975292 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330820-pxfzz" event={"ID":"83ef8cc8-3195-431c-8ad1-d0f116629070","Type":"ContainerDied","Data":"fa52cddbf972e4378195ab9db4e99af79b8d550c98847738040216e49087e47d"} Oct 07 15:00:01 crc kubenswrapper[4854]: I1007 15:00:01.975424 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330820-pxfzz" event={"ID":"83ef8cc8-3195-431c-8ad1-d0f116629070","Type":"ContainerStarted","Data":"a4c7d227068de02d6fd5379bc3d7868c4fa53e96e7fffd01e9359f72763a624f"} Oct 07 15:00:03 crc kubenswrapper[4854]: I1007 15:00:03.421457 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330820-pxfzz" Oct 07 15:00:03 crc kubenswrapper[4854]: I1007 15:00:03.506818 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/83ef8cc8-3195-431c-8ad1-d0f116629070-secret-volume\") pod \"83ef8cc8-3195-431c-8ad1-d0f116629070\" (UID: \"83ef8cc8-3195-431c-8ad1-d0f116629070\") " Oct 07 15:00:03 crc kubenswrapper[4854]: I1007 15:00:03.507042 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/83ef8cc8-3195-431c-8ad1-d0f116629070-config-volume\") pod \"83ef8cc8-3195-431c-8ad1-d0f116629070\" (UID: \"83ef8cc8-3195-431c-8ad1-d0f116629070\") " Oct 07 15:00:03 crc kubenswrapper[4854]: I1007 15:00:03.507443 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frv6l\" (UniqueName: \"kubernetes.io/projected/83ef8cc8-3195-431c-8ad1-d0f116629070-kube-api-access-frv6l\") pod \"83ef8cc8-3195-431c-8ad1-d0f116629070\" (UID: \"83ef8cc8-3195-431c-8ad1-d0f116629070\") " Oct 07 15:00:03 crc kubenswrapper[4854]: I1007 15:00:03.509261 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83ef8cc8-3195-431c-8ad1-d0f116629070-config-volume" (OuterVolumeSpecName: "config-volume") pod "83ef8cc8-3195-431c-8ad1-d0f116629070" (UID: "83ef8cc8-3195-431c-8ad1-d0f116629070"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 15:00:03 crc kubenswrapper[4854]: I1007 15:00:03.513275 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83ef8cc8-3195-431c-8ad1-d0f116629070-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "83ef8cc8-3195-431c-8ad1-d0f116629070" (UID: "83ef8cc8-3195-431c-8ad1-d0f116629070"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 15:00:03 crc kubenswrapper[4854]: I1007 15:00:03.513659 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83ef8cc8-3195-431c-8ad1-d0f116629070-kube-api-access-frv6l" (OuterVolumeSpecName: "kube-api-access-frv6l") pod "83ef8cc8-3195-431c-8ad1-d0f116629070" (UID: "83ef8cc8-3195-431c-8ad1-d0f116629070"). InnerVolumeSpecName "kube-api-access-frv6l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 15:00:03 crc kubenswrapper[4854]: I1007 15:00:03.609777 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frv6l\" (UniqueName: \"kubernetes.io/projected/83ef8cc8-3195-431c-8ad1-d0f116629070-kube-api-access-frv6l\") on node \"crc\" DevicePath \"\"" Oct 07 15:00:03 crc kubenswrapper[4854]: I1007 15:00:03.609823 4854 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/83ef8cc8-3195-431c-8ad1-d0f116629070-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 07 15:00:03 crc kubenswrapper[4854]: I1007 15:00:03.609843 4854 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/83ef8cc8-3195-431c-8ad1-d0f116629070-config-volume\") on node \"crc\" DevicePath \"\"" Oct 07 15:00:04 crc kubenswrapper[4854]: I1007 15:00:04.013355 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330820-pxfzz" event={"ID":"83ef8cc8-3195-431c-8ad1-d0f116629070","Type":"ContainerDied","Data":"a4c7d227068de02d6fd5379bc3d7868c4fa53e96e7fffd01e9359f72763a624f"} Oct 07 15:00:04 crc kubenswrapper[4854]: I1007 15:00:04.013396 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4c7d227068de02d6fd5379bc3d7868c4fa53e96e7fffd01e9359f72763a624f" Oct 07 15:00:04 crc kubenswrapper[4854]: I1007 15:00:04.013401 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330820-pxfzz" Oct 07 15:00:04 crc kubenswrapper[4854]: I1007 15:00:04.538594 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330775-wcn9z"] Oct 07 15:00:04 crc kubenswrapper[4854]: I1007 15:00:04.559601 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330775-wcn9z"] Oct 07 15:00:04 crc kubenswrapper[4854]: I1007 15:00:04.724278 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb83bc05-c755-4cac-9e66-cf1f0ecfaa6d" path="/var/lib/kubelet/pods/cb83bc05-c755-4cac-9e66-cf1f0ecfaa6d/volumes" Oct 07 15:00:10 crc kubenswrapper[4854]: I1007 15:00:10.808281 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 15:00:10 crc kubenswrapper[4854]: I1007 15:00:10.808820 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 15:00:29 crc kubenswrapper[4854]: I1007 15:00:29.454791 4854 scope.go:117] "RemoveContainer" containerID="b75a32137dae60a5b40000a02c61eb34fa49e9c81e786da904141ab3b376d898" Oct 07 15:00:40 crc kubenswrapper[4854]: I1007 15:00:40.807813 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Oct 07 15:00:40 crc kubenswrapper[4854]: I1007 15:00:40.808432 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 15:01:00 crc kubenswrapper[4854]: I1007 15:01:00.186222 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29330821-clqdr"] Oct 07 15:01:00 crc kubenswrapper[4854]: E1007 15:01:00.187218 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83ef8cc8-3195-431c-8ad1-d0f116629070" containerName="collect-profiles" Oct 07 15:01:00 crc kubenswrapper[4854]: I1007 15:01:00.187234 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="83ef8cc8-3195-431c-8ad1-d0f116629070" containerName="collect-profiles" Oct 07 15:01:00 crc kubenswrapper[4854]: I1007 15:01:00.187524 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="83ef8cc8-3195-431c-8ad1-d0f116629070" containerName="collect-profiles" Oct 07 15:01:00 crc kubenswrapper[4854]: I1007 15:01:00.188406 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29330821-clqdr" Oct 07 15:01:00 crc kubenswrapper[4854]: I1007 15:01:00.212360 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29330821-clqdr"] Oct 07 15:01:00 crc kubenswrapper[4854]: I1007 15:01:00.336880 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fn9zm\" (UniqueName: \"kubernetes.io/projected/5b5efd4b-3bc2-4b2f-864e-a5d64d84e593-kube-api-access-fn9zm\") pod \"keystone-cron-29330821-clqdr\" (UID: \"5b5efd4b-3bc2-4b2f-864e-a5d64d84e593\") " pod="openstack/keystone-cron-29330821-clqdr" Oct 07 15:01:00 crc kubenswrapper[4854]: I1007 15:01:00.337276 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b5efd4b-3bc2-4b2f-864e-a5d64d84e593-config-data\") pod \"keystone-cron-29330821-clqdr\" (UID: \"5b5efd4b-3bc2-4b2f-864e-a5d64d84e593\") " pod="openstack/keystone-cron-29330821-clqdr" Oct 07 15:01:00 crc kubenswrapper[4854]: I1007 15:01:00.337350 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5b5efd4b-3bc2-4b2f-864e-a5d64d84e593-fernet-keys\") pod \"keystone-cron-29330821-clqdr\" (UID: \"5b5efd4b-3bc2-4b2f-864e-a5d64d84e593\") " pod="openstack/keystone-cron-29330821-clqdr" Oct 07 15:01:00 crc kubenswrapper[4854]: I1007 15:01:00.337410 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b5efd4b-3bc2-4b2f-864e-a5d64d84e593-combined-ca-bundle\") pod \"keystone-cron-29330821-clqdr\" (UID: \"5b5efd4b-3bc2-4b2f-864e-a5d64d84e593\") " pod="openstack/keystone-cron-29330821-clqdr" Oct 07 15:01:00 crc kubenswrapper[4854]: I1007 15:01:00.439240 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b5efd4b-3bc2-4b2f-864e-a5d64d84e593-combined-ca-bundle\") pod \"keystone-cron-29330821-clqdr\" (UID: \"5b5efd4b-3bc2-4b2f-864e-a5d64d84e593\") " pod="openstack/keystone-cron-29330821-clqdr" Oct 07 
15:01:00 crc kubenswrapper[4854]: I1007 15:01:00.439448 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fn9zm\" (UniqueName: \"kubernetes.io/projected/5b5efd4b-3bc2-4b2f-864e-a5d64d84e593-kube-api-access-fn9zm\") pod \"keystone-cron-29330821-clqdr\" (UID: \"5b5efd4b-3bc2-4b2f-864e-a5d64d84e593\") " pod="openstack/keystone-cron-29330821-clqdr" Oct 07 15:01:00 crc kubenswrapper[4854]: I1007 15:01:00.439532 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b5efd4b-3bc2-4b2f-864e-a5d64d84e593-config-data\") pod \"keystone-cron-29330821-clqdr\" (UID: \"5b5efd4b-3bc2-4b2f-864e-a5d64d84e593\") " pod="openstack/keystone-cron-29330821-clqdr" Oct 07 15:01:00 crc kubenswrapper[4854]: I1007 15:01:00.439578 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5b5efd4b-3bc2-4b2f-864e-a5d64d84e593-fernet-keys\") pod \"keystone-cron-29330821-clqdr\" (UID: \"5b5efd4b-3bc2-4b2f-864e-a5d64d84e593\") " pod="openstack/keystone-cron-29330821-clqdr" Oct 07 15:01:00 crc kubenswrapper[4854]: I1007 15:01:00.446680 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5b5efd4b-3bc2-4b2f-864e-a5d64d84e593-fernet-keys\") pod \"keystone-cron-29330821-clqdr\" (UID: \"5b5efd4b-3bc2-4b2f-864e-a5d64d84e593\") " pod="openstack/keystone-cron-29330821-clqdr" Oct 07 15:01:00 crc kubenswrapper[4854]: I1007 15:01:00.446871 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b5efd4b-3bc2-4b2f-864e-a5d64d84e593-combined-ca-bundle\") pod \"keystone-cron-29330821-clqdr\" (UID: \"5b5efd4b-3bc2-4b2f-864e-a5d64d84e593\") " pod="openstack/keystone-cron-29330821-clqdr" Oct 07 15:01:00 crc kubenswrapper[4854]: I1007 15:01:00.451031 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b5efd4b-3bc2-4b2f-864e-a5d64d84e593-config-data\") pod \"keystone-cron-29330821-clqdr\" (UID: \"5b5efd4b-3bc2-4b2f-864e-a5d64d84e593\") " pod="openstack/keystone-cron-29330821-clqdr" Oct 07 15:01:00 crc kubenswrapper[4854]: I1007 15:01:00.463633 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fn9zm\" (UniqueName: \"kubernetes.io/projected/5b5efd4b-3bc2-4b2f-864e-a5d64d84e593-kube-api-access-fn9zm\") pod \"keystone-cron-29330821-clqdr\" (UID: \"5b5efd4b-3bc2-4b2f-864e-a5d64d84e593\") " pod="openstack/keystone-cron-29330821-clqdr" Oct 07 15:01:00 crc kubenswrapper[4854]: I1007 15:01:00.529307 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29330821-clqdr" Oct 07 15:01:01 crc kubenswrapper[4854]: I1007 15:01:01.020941 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29330821-clqdr"] Oct 07 15:01:01 crc kubenswrapper[4854]: W1007 15:01:01.553204 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b5efd4b_3bc2_4b2f_864e_a5d64d84e593.slice/crio-54c698f0a14d57f2dbe83fa34ef94d3630175070834a2ca2a006404fd805f147 WatchSource:0}: Error finding container 54c698f0a14d57f2dbe83fa34ef94d3630175070834a2ca2a006404fd805f147: Status 404 returned error can't find the container with id 54c698f0a14d57f2dbe83fa34ef94d3630175070834a2ca2a006404fd805f147 Oct 07 15:01:01 crc kubenswrapper[4854]: I1007 15:01:01.675122 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29330821-clqdr" event={"ID":"5b5efd4b-3bc2-4b2f-864e-a5d64d84e593","Type":"ContainerStarted","Data":"54c698f0a14d57f2dbe83fa34ef94d3630175070834a2ca2a006404fd805f147"} Oct 07 15:01:02 crc kubenswrapper[4854]: I1007 15:01:02.687475 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29330821-clqdr" event={"ID":"5b5efd4b-3bc2-4b2f-864e-a5d64d84e593","Type":"ContainerStarted","Data":"7c4d1e9886bd0cd70bf4f1bb7ed66682d6e44b06675f50fbb82315d1d67ae659"} Oct 07 15:01:02 crc kubenswrapper[4854]: I1007 15:01:02.717310 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29330821-clqdr" podStartSLOduration=2.7172867050000002 podStartE2EDuration="2.717286705s" podCreationTimestamp="2025-10-07 15:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 15:01:02.704421032 +0000 UTC m=+9378.692253287" watchObservedRunningTime="2025-10-07 15:01:02.717286705 +0000 UTC m=+9378.705118970" Oct 07 15:01:04 crc kubenswrapper[4854]: I1007 15:01:04.652575 4854 trace.go:236] Trace[1984443612]: "Calculate volume metrics of ovndbcluster-sb-etc-ovn for pod openstack/ovsdbserver-sb-0" (07-Oct-2025 15:01:03.625) (total time: 1026ms): Oct 07 15:01:04 crc kubenswrapper[4854]: Trace[1984443612]: [1.026842871s] [1.026842871s] END Oct 07 15:01:05 crc kubenswrapper[4854]: I1007 15:01:05.734494 4854 generic.go:334] "Generic (PLEG): container finished" podID="5b5efd4b-3bc2-4b2f-864e-a5d64d84e593" containerID="7c4d1e9886bd0cd70bf4f1bb7ed66682d6e44b06675f50fbb82315d1d67ae659" exitCode=0 Oct 07 15:01:05 crc kubenswrapper[4854]: I1007 15:01:05.734900 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29330821-clqdr" event={"ID":"5b5efd4b-3bc2-4b2f-864e-a5d64d84e593","Type":"ContainerDied","Data":"7c4d1e9886bd0cd70bf4f1bb7ed66682d6e44b06675f50fbb82315d1d67ae659"} Oct 07 15:01:07 crc kubenswrapper[4854]: I1007 15:01:07.186017 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29330821-clqdr" Oct 07 15:01:07 crc kubenswrapper[4854]: I1007 15:01:07.339611 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b5efd4b-3bc2-4b2f-864e-a5d64d84e593-combined-ca-bundle\") pod \"5b5efd4b-3bc2-4b2f-864e-a5d64d84e593\" (UID: \"5b5efd4b-3bc2-4b2f-864e-a5d64d84e593\") " Oct 07 15:01:07 crc kubenswrapper[4854]: I1007 15:01:07.339809 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fn9zm\" (UniqueName: \"kubernetes.io/projected/5b5efd4b-3bc2-4b2f-864e-a5d64d84e593-kube-api-access-fn9zm\") pod \"5b5efd4b-3bc2-4b2f-864e-a5d64d84e593\" (UID: \"5b5efd4b-3bc2-4b2f-864e-a5d64d84e593\") " Oct 07 15:01:07 crc kubenswrapper[4854]: I1007 15:01:07.339935 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b5efd4b-3bc2-4b2f-864e-a5d64d84e593-config-data\") pod \"5b5efd4b-3bc2-4b2f-864e-a5d64d84e593\" (UID: \"5b5efd4b-3bc2-4b2f-864e-a5d64d84e593\") " Oct 07 15:01:07 crc kubenswrapper[4854]: I1007 15:01:07.339994 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5b5efd4b-3bc2-4b2f-864e-a5d64d84e593-fernet-keys\") pod \"5b5efd4b-3bc2-4b2f-864e-a5d64d84e593\" (UID: \"5b5efd4b-3bc2-4b2f-864e-a5d64d84e593\") " Oct 07 15:01:07 crc kubenswrapper[4854]: I1007 15:01:07.345258 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b5efd4b-3bc2-4b2f-864e-a5d64d84e593-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "5b5efd4b-3bc2-4b2f-864e-a5d64d84e593" (UID: "5b5efd4b-3bc2-4b2f-864e-a5d64d84e593"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 15:01:07 crc kubenswrapper[4854]: I1007 15:01:07.349195 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b5efd4b-3bc2-4b2f-864e-a5d64d84e593-kube-api-access-fn9zm" (OuterVolumeSpecName: "kube-api-access-fn9zm") pod "5b5efd4b-3bc2-4b2f-864e-a5d64d84e593" (UID: "5b5efd4b-3bc2-4b2f-864e-a5d64d84e593"). InnerVolumeSpecName "kube-api-access-fn9zm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 15:01:07 crc kubenswrapper[4854]: I1007 15:01:07.390788 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b5efd4b-3bc2-4b2f-864e-a5d64d84e593-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5b5efd4b-3bc2-4b2f-864e-a5d64d84e593" (UID: "5b5efd4b-3bc2-4b2f-864e-a5d64d84e593"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 15:01:07 crc kubenswrapper[4854]: I1007 15:01:07.423210 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b5efd4b-3bc2-4b2f-864e-a5d64d84e593-config-data" (OuterVolumeSpecName: "config-data") pod "5b5efd4b-3bc2-4b2f-864e-a5d64d84e593" (UID: "5b5efd4b-3bc2-4b2f-864e-a5d64d84e593"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 15:01:07 crc kubenswrapper[4854]: I1007 15:01:07.442085 4854 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b5efd4b-3bc2-4b2f-864e-a5d64d84e593-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 15:01:07 crc kubenswrapper[4854]: I1007 15:01:07.442114 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fn9zm\" (UniqueName: \"kubernetes.io/projected/5b5efd4b-3bc2-4b2f-864e-a5d64d84e593-kube-api-access-fn9zm\") on node \"crc\" DevicePath \"\"" Oct 07 15:01:07 crc kubenswrapper[4854]: I1007 15:01:07.442126 4854 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b5efd4b-3bc2-4b2f-864e-a5d64d84e593-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 15:01:07 crc kubenswrapper[4854]: I1007 15:01:07.442136 4854 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5b5efd4b-3bc2-4b2f-864e-a5d64d84e593-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 07 15:01:07 crc kubenswrapper[4854]: I1007 15:01:07.761859 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29330821-clqdr" event={"ID":"5b5efd4b-3bc2-4b2f-864e-a5d64d84e593","Type":"ContainerDied","Data":"54c698f0a14d57f2dbe83fa34ef94d3630175070834a2ca2a006404fd805f147"} Oct 07 15:01:07 crc kubenswrapper[4854]: I1007 15:01:07.762390 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54c698f0a14d57f2dbe83fa34ef94d3630175070834a2ca2a006404fd805f147" Oct 07 15:01:07 crc kubenswrapper[4854]: I1007 15:01:07.762027 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29330821-clqdr" Oct 07 15:01:10 crc kubenswrapper[4854]: I1007 15:01:10.809624 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 15:01:10 crc kubenswrapper[4854]: I1007 15:01:10.810169 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 15:01:10 crc kubenswrapper[4854]: I1007 15:01:10.810215 4854 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" Oct 07 15:01:10 crc kubenswrapper[4854]: I1007 15:01:10.810974 4854 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"58ac045a48df7ab0097355790ac73ebe4d593819b5da9bfc43d6c446f42d496c"} pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 15:01:10 crc kubenswrapper[4854]: I1007 15:01:10.811019 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" 
containerID="cri-o://58ac045a48df7ab0097355790ac73ebe4d593819b5da9bfc43d6c446f42d496c" gracePeriod=600 Oct 07 15:01:11 crc kubenswrapper[4854]: I1007 15:01:11.803631 4854 generic.go:334] "Generic (PLEG): container finished" podID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerID="58ac045a48df7ab0097355790ac73ebe4d593819b5da9bfc43d6c446f42d496c" exitCode=0 Oct 07 15:01:11 crc kubenswrapper[4854]: I1007 15:01:11.803712 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" event={"ID":"40b8b82d-cfd5-41d7-8673-5774db092c85","Type":"ContainerDied","Data":"58ac045a48df7ab0097355790ac73ebe4d593819b5da9bfc43d6c446f42d496c"} Oct 07 15:01:11 crc kubenswrapper[4854]: I1007 15:01:11.804346 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" event={"ID":"40b8b82d-cfd5-41d7-8673-5774db092c85","Type":"ContainerStarted","Data":"c9aa2a3ee4561b0972ff5e54f922a75687719cdae51f1e918393112c2e5d05e1"} Oct 07 15:01:11 crc kubenswrapper[4854]: I1007 15:01:11.804372 4854 scope.go:117] "RemoveContainer" containerID="e54b3b075b488a857a2b079c8bf74c15dbd3a9b0c33ba12b65add9ef54f2f5a0" Oct 07 15:03:40 crc kubenswrapper[4854]: I1007 15:03:40.808508 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 15:03:40 crc kubenswrapper[4854]: I1007 15:03:40.809658 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 15:04:10 crc kubenswrapper[4854]: I1007 15:04:10.807610 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 15:04:10 crc kubenswrapper[4854]: I1007 15:04:10.808286 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 15:04:40 crc kubenswrapper[4854]: I1007 15:04:40.807627 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 15:04:40 crc kubenswrapper[4854]: I1007 15:04:40.808406 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 15:04:40 crc kubenswrapper[4854]: I1007 15:04:40.808464 4854 kubelet.go:2542] "SyncLoop 
(probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" Oct 07 15:04:40 crc kubenswrapper[4854]: I1007 15:04:40.809409 4854 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c9aa2a3ee4561b0972ff5e54f922a75687719cdae51f1e918393112c2e5d05e1"} pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 15:04:40 crc kubenswrapper[4854]: I1007 15:04:40.809466 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" containerID="cri-o://c9aa2a3ee4561b0972ff5e54f922a75687719cdae51f1e918393112c2e5d05e1" gracePeriod=600 Oct 07 15:04:40 crc kubenswrapper[4854]: E1007 15:04:40.987519 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 15:04:41 crc kubenswrapper[4854]: I1007 15:04:41.545382 4854 generic.go:334] "Generic (PLEG): container finished" podID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerID="c9aa2a3ee4561b0972ff5e54f922a75687719cdae51f1e918393112c2e5d05e1" exitCode=0 Oct 07 15:04:41 crc kubenswrapper[4854]: I1007 15:04:41.545389 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" event={"ID":"40b8b82d-cfd5-41d7-8673-5774db092c85","Type":"ContainerDied","Data":"c9aa2a3ee4561b0972ff5e54f922a75687719cdae51f1e918393112c2e5d05e1"} Oct 07 15:04:41 crc kubenswrapper[4854]: I1007 15:04:41.545849 4854 scope.go:117] "RemoveContainer" containerID="58ac045a48df7ab0097355790ac73ebe4d593819b5da9bfc43d6c446f42d496c" Oct 07 15:04:41 crc kubenswrapper[4854]: I1007 15:04:41.546584 4854 scope.go:117] "RemoveContainer" containerID="c9aa2a3ee4561b0972ff5e54f922a75687719cdae51f1e918393112c2e5d05e1" Oct 07 15:04:41 crc kubenswrapper[4854]: E1007 15:04:41.546893 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 15:04:55 crc kubenswrapper[4854]: I1007 15:04:55.702907 4854 scope.go:117] "RemoveContainer" containerID="c9aa2a3ee4561b0972ff5e54f922a75687719cdae51f1e918393112c2e5d05e1" Oct 07 15:04:55 crc kubenswrapper[4854]: E1007 15:04:55.703647 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 
15:05:09 crc kubenswrapper[4854]: I1007 15:05:09.703076 4854 scope.go:117] "RemoveContainer" containerID="c9aa2a3ee4561b0972ff5e54f922a75687719cdae51f1e918393112c2e5d05e1" Oct 07 15:05:09 crc kubenswrapper[4854]: E1007 15:05:09.703964 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 15:05:22 crc kubenswrapper[4854]: I1007 15:05:22.703556 4854 scope.go:117] "RemoveContainer" containerID="c9aa2a3ee4561b0972ff5e54f922a75687719cdae51f1e918393112c2e5d05e1" Oct 07 15:05:22 crc kubenswrapper[4854]: E1007 15:05:22.705001 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 15:05:24 crc kubenswrapper[4854]: I1007 15:05:24.738822 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-699n7"] Oct 07 15:05:24 crc kubenswrapper[4854]: E1007 15:05:24.739890 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b5efd4b-3bc2-4b2f-864e-a5d64d84e593" containerName="keystone-cron" Oct 07 15:05:24 crc kubenswrapper[4854]: I1007 15:05:24.739906 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b5efd4b-3bc2-4b2f-864e-a5d64d84e593" containerName="keystone-cron" Oct 07 15:05:24 crc kubenswrapper[4854]: I1007 15:05:24.740126 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b5efd4b-3bc2-4b2f-864e-a5d64d84e593" containerName="keystone-cron" Oct 07 15:05:24 crc kubenswrapper[4854]: I1007 15:05:24.741873 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-699n7" Oct 07 15:05:24 crc kubenswrapper[4854]: I1007 15:05:24.758444 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-699n7"] Oct 07 15:05:24 crc kubenswrapper[4854]: I1007 15:05:24.898997 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9aea35dc-aa3f-45d5-8046-23dd551c6c81-catalog-content\") pod \"redhat-operators-699n7\" (UID: \"9aea35dc-aa3f-45d5-8046-23dd551c6c81\") " pod="openshift-marketplace/redhat-operators-699n7" Oct 07 15:05:24 crc kubenswrapper[4854]: I1007 15:05:24.899106 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pcrb\" (UniqueName: \"kubernetes.io/projected/9aea35dc-aa3f-45d5-8046-23dd551c6c81-kube-api-access-4pcrb\") pod \"redhat-operators-699n7\" (UID: \"9aea35dc-aa3f-45d5-8046-23dd551c6c81\") " pod="openshift-marketplace/redhat-operators-699n7" Oct 07 15:05:24 crc kubenswrapper[4854]: I1007 15:05:24.899223 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9aea35dc-aa3f-45d5-8046-23dd551c6c81-utilities\") pod \"redhat-operators-699n7\" (UID: \"9aea35dc-aa3f-45d5-8046-23dd551c6c81\") " pod="openshift-marketplace/redhat-operators-699n7" Oct 07 15:05:25 crc kubenswrapper[4854]: I1007 15:05:25.000886 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pcrb\" (UniqueName: \"kubernetes.io/projected/9aea35dc-aa3f-45d5-8046-23dd551c6c81-kube-api-access-4pcrb\") pod \"redhat-operators-699n7\" (UID: \"9aea35dc-aa3f-45d5-8046-23dd551c6c81\") " pod="openshift-marketplace/redhat-operators-699n7" Oct 07 15:05:25 crc kubenswrapper[4854]: I1007 15:05:25.001017 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9aea35dc-aa3f-45d5-8046-23dd551c6c81-utilities\") pod \"redhat-operators-699n7\" (UID: \"9aea35dc-aa3f-45d5-8046-23dd551c6c81\") " pod="openshift-marketplace/redhat-operators-699n7" Oct 07 15:05:25 crc kubenswrapper[4854]: I1007 15:05:25.001127 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9aea35dc-aa3f-45d5-8046-23dd551c6c81-catalog-content\") pod \"redhat-operators-699n7\" (UID: \"9aea35dc-aa3f-45d5-8046-23dd551c6c81\") " pod="openshift-marketplace/redhat-operators-699n7" Oct 07 15:05:25 crc kubenswrapper[4854]: I1007 15:05:25.001597 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9aea35dc-aa3f-45d5-8046-23dd551c6c81-utilities\") pod \"redhat-operators-699n7\" (UID: \"9aea35dc-aa3f-45d5-8046-23dd551c6c81\") " pod="openshift-marketplace/redhat-operators-699n7" Oct 07 15:05:25 crc kubenswrapper[4854]: I1007 15:05:25.001625 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9aea35dc-aa3f-45d5-8046-23dd551c6c81-catalog-content\") pod \"redhat-operators-699n7\" (UID: \"9aea35dc-aa3f-45d5-8046-23dd551c6c81\") " pod="openshift-marketplace/redhat-operators-699n7" Oct 07 15:05:25 crc kubenswrapper[4854]: I1007 15:05:25.041531 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-4pcrb\" (UniqueName: \"kubernetes.io/projected/9aea35dc-aa3f-45d5-8046-23dd551c6c81-kube-api-access-4pcrb\") pod \"redhat-operators-699n7\" (UID: \"9aea35dc-aa3f-45d5-8046-23dd551c6c81\") " pod="openshift-marketplace/redhat-operators-699n7" Oct 07 15:05:25 crc kubenswrapper[4854]: I1007 15:05:25.085202 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-699n7" Oct 07 15:05:25 crc kubenswrapper[4854]: I1007 15:05:25.578744 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-699n7"] Oct 07 15:05:26 crc kubenswrapper[4854]: I1007 15:05:26.122751 4854 generic.go:334] "Generic (PLEG): container finished" podID="9aea35dc-aa3f-45d5-8046-23dd551c6c81" containerID="8ac00ea53245a7c41cc4db48270b66a6c66013b6771efc46d30728ab85ab39fe" exitCode=0 Oct 07 15:05:26 crc kubenswrapper[4854]: I1007 15:05:26.122837 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-699n7" event={"ID":"9aea35dc-aa3f-45d5-8046-23dd551c6c81","Type":"ContainerDied","Data":"8ac00ea53245a7c41cc4db48270b66a6c66013b6771efc46d30728ab85ab39fe"} Oct 07 15:05:26 crc kubenswrapper[4854]: I1007 15:05:26.123128 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-699n7" event={"ID":"9aea35dc-aa3f-45d5-8046-23dd551c6c81","Type":"ContainerStarted","Data":"4fc2434002434f7fa4827774f40cab431dcb71311511ea886d6739b12abd5060"} Oct 07 15:05:26 crc kubenswrapper[4854]: I1007 15:05:26.124756 4854 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 15:05:27 crc kubenswrapper[4854]: I1007 15:05:27.136345 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-699n7" event={"ID":"9aea35dc-aa3f-45d5-8046-23dd551c6c81","Type":"ContainerStarted","Data":"5892b5c0de2d37940f6cecebfca8e30a5b0614cd08b26c630de052fc99288abe"} Oct 07 15:05:28 crc kubenswrapper[4854]: I1007 15:05:28.155190 4854 generic.go:334] "Generic (PLEG): container finished" podID="9aea35dc-aa3f-45d5-8046-23dd551c6c81" containerID="5892b5c0de2d37940f6cecebfca8e30a5b0614cd08b26c630de052fc99288abe" exitCode=0 Oct 07 15:05:28 crc kubenswrapper[4854]: I1007 15:05:28.155263 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-699n7" event={"ID":"9aea35dc-aa3f-45d5-8046-23dd551c6c81","Type":"ContainerDied","Data":"5892b5c0de2d37940f6cecebfca8e30a5b0614cd08b26c630de052fc99288abe"} Oct 07 15:05:30 crc kubenswrapper[4854]: I1007 15:05:30.188223 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-699n7" event={"ID":"9aea35dc-aa3f-45d5-8046-23dd551c6c81","Type":"ContainerStarted","Data":"399fcd3e26e97acffb1e1b0107da1bbdad7ce0a7432c939244d040dc696cd7a0"} Oct 07 15:05:30 crc kubenswrapper[4854]: I1007 15:05:30.217525 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-699n7" podStartSLOduration=3.308999454 podStartE2EDuration="6.217505262s" podCreationTimestamp="2025-10-07 15:05:24 +0000 UTC" firstStartedPulling="2025-10-07 15:05:26.124546985 +0000 UTC m=+9642.112379240" lastFinishedPulling="2025-10-07 15:05:29.033052793 +0000 UTC m=+9645.020885048" observedRunningTime="2025-10-07 15:05:30.208209083 +0000 UTC m=+9646.196041348" watchObservedRunningTime="2025-10-07 15:05:30.217505262 +0000 UTC m=+9646.205337537" Oct 07 15:05:35 crc 
kubenswrapper[4854]: I1007 15:05:35.085868 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-699n7" Oct 07 15:05:35 crc kubenswrapper[4854]: I1007 15:05:35.086380 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-699n7" Oct 07 15:05:36 crc kubenswrapper[4854]: I1007 15:05:36.153559 4854 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-699n7" podUID="9aea35dc-aa3f-45d5-8046-23dd551c6c81" containerName="registry-server" probeResult="failure" output=< Oct 07 15:05:36 crc kubenswrapper[4854]: timeout: failed to connect service ":50051" within 1s Oct 07 15:05:36 crc kubenswrapper[4854]: > Oct 07 15:05:37 crc kubenswrapper[4854]: I1007 15:05:37.703645 4854 scope.go:117] "RemoveContainer" containerID="c9aa2a3ee4561b0972ff5e54f922a75687719cdae51f1e918393112c2e5d05e1" Oct 07 15:05:37 crc kubenswrapper[4854]: E1007 15:05:37.704300 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 15:05:46 crc kubenswrapper[4854]: I1007 15:05:46.161524 4854 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-699n7" podUID="9aea35dc-aa3f-45d5-8046-23dd551c6c81" containerName="registry-server" probeResult="failure" output=< Oct 07 15:05:46 crc kubenswrapper[4854]: timeout: failed to connect service ":50051" within 1s Oct 07 15:05:46 crc kubenswrapper[4854]: > Oct 07 15:05:48 crc kubenswrapper[4854]: I1007 15:05:48.704582 4854 scope.go:117] "RemoveContainer" containerID="c9aa2a3ee4561b0972ff5e54f922a75687719cdae51f1e918393112c2e5d05e1" Oct 07 15:05:48 crc kubenswrapper[4854]: E1007 15:05:48.705377 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 15:05:55 crc kubenswrapper[4854]: I1007 15:05:55.145119 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-699n7" Oct 07 15:05:55 crc kubenswrapper[4854]: I1007 15:05:55.198654 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-699n7" Oct 07 15:05:55 crc kubenswrapper[4854]: I1007 15:05:55.942672 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-699n7"] Oct 07 15:05:56 crc kubenswrapper[4854]: I1007 15:05:56.481335 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-699n7" podUID="9aea35dc-aa3f-45d5-8046-23dd551c6c81" containerName="registry-server" containerID="cri-o://399fcd3e26e97acffb1e1b0107da1bbdad7ce0a7432c939244d040dc696cd7a0" gracePeriod=2 Oct 07 15:05:57 crc kubenswrapper[4854]: I1007 15:05:57.495203 4854 generic.go:334] "Generic (PLEG): container 
finished" podID="9aea35dc-aa3f-45d5-8046-23dd551c6c81" containerID="399fcd3e26e97acffb1e1b0107da1bbdad7ce0a7432c939244d040dc696cd7a0" exitCode=0 Oct 07 15:05:57 crc kubenswrapper[4854]: I1007 15:05:57.495280 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-699n7" event={"ID":"9aea35dc-aa3f-45d5-8046-23dd551c6c81","Type":"ContainerDied","Data":"399fcd3e26e97acffb1e1b0107da1bbdad7ce0a7432c939244d040dc696cd7a0"} Oct 07 15:05:57 crc kubenswrapper[4854]: I1007 15:05:57.582811 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-699n7" Oct 07 15:05:57 crc kubenswrapper[4854]: I1007 15:05:57.672887 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9aea35dc-aa3f-45d5-8046-23dd551c6c81-catalog-content\") pod \"9aea35dc-aa3f-45d5-8046-23dd551c6c81\" (UID: \"9aea35dc-aa3f-45d5-8046-23dd551c6c81\") " Oct 07 15:05:57 crc kubenswrapper[4854]: I1007 15:05:57.673060 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4pcrb\" (UniqueName: \"kubernetes.io/projected/9aea35dc-aa3f-45d5-8046-23dd551c6c81-kube-api-access-4pcrb\") pod \"9aea35dc-aa3f-45d5-8046-23dd551c6c81\" (UID: \"9aea35dc-aa3f-45d5-8046-23dd551c6c81\") " Oct 07 15:05:57 crc kubenswrapper[4854]: I1007 15:05:57.673200 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9aea35dc-aa3f-45d5-8046-23dd551c6c81-utilities\") pod \"9aea35dc-aa3f-45d5-8046-23dd551c6c81\" (UID: \"9aea35dc-aa3f-45d5-8046-23dd551c6c81\") " Oct 07 15:05:57 crc kubenswrapper[4854]: I1007 15:05:57.674281 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9aea35dc-aa3f-45d5-8046-23dd551c6c81-utilities" (OuterVolumeSpecName: "utilities") pod "9aea35dc-aa3f-45d5-8046-23dd551c6c81" (UID: "9aea35dc-aa3f-45d5-8046-23dd551c6c81"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:05:57 crc kubenswrapper[4854]: I1007 15:05:57.679767 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9aea35dc-aa3f-45d5-8046-23dd551c6c81-kube-api-access-4pcrb" (OuterVolumeSpecName: "kube-api-access-4pcrb") pod "9aea35dc-aa3f-45d5-8046-23dd551c6c81" (UID: "9aea35dc-aa3f-45d5-8046-23dd551c6c81"). InnerVolumeSpecName "kube-api-access-4pcrb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 15:05:57 crc kubenswrapper[4854]: I1007 15:05:57.755769 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9aea35dc-aa3f-45d5-8046-23dd551c6c81-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9aea35dc-aa3f-45d5-8046-23dd551c6c81" (UID: "9aea35dc-aa3f-45d5-8046-23dd551c6c81"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:05:57 crc kubenswrapper[4854]: I1007 15:05:57.776341 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9aea35dc-aa3f-45d5-8046-23dd551c6c81-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 15:05:57 crc kubenswrapper[4854]: I1007 15:05:57.776371 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4pcrb\" (UniqueName: \"kubernetes.io/projected/9aea35dc-aa3f-45d5-8046-23dd551c6c81-kube-api-access-4pcrb\") on node \"crc\" DevicePath \"\"" Oct 07 15:05:57 crc kubenswrapper[4854]: I1007 15:05:57.776382 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9aea35dc-aa3f-45d5-8046-23dd551c6c81-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 15:05:58 crc kubenswrapper[4854]: I1007 15:05:58.519990 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-699n7" event={"ID":"9aea35dc-aa3f-45d5-8046-23dd551c6c81","Type":"ContainerDied","Data":"4fc2434002434f7fa4827774f40cab431dcb71311511ea886d6739b12abd5060"} Oct 07 15:05:58 crc kubenswrapper[4854]: I1007 15:05:58.520414 4854 scope.go:117] "RemoveContainer" containerID="399fcd3e26e97acffb1e1b0107da1bbdad7ce0a7432c939244d040dc696cd7a0" Oct 07 15:05:58 crc kubenswrapper[4854]: I1007 15:05:58.520048 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-699n7" Oct 07 15:05:58 crc kubenswrapper[4854]: I1007 15:05:58.559876 4854 scope.go:117] "RemoveContainer" containerID="5892b5c0de2d37940f6cecebfca8e30a5b0614cd08b26c630de052fc99288abe" Oct 07 15:05:58 crc kubenswrapper[4854]: I1007 15:05:58.563566 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-699n7"] Oct 07 15:05:58 crc kubenswrapper[4854]: I1007 15:05:58.574660 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-699n7"] Oct 07 15:05:58 crc kubenswrapper[4854]: I1007 15:05:58.584851 4854 scope.go:117] "RemoveContainer" containerID="8ac00ea53245a7c41cc4db48270b66a6c66013b6771efc46d30728ab85ab39fe" Oct 07 15:05:58 crc kubenswrapper[4854]: I1007 15:05:58.739754 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9aea35dc-aa3f-45d5-8046-23dd551c6c81" path="/var/lib/kubelet/pods/9aea35dc-aa3f-45d5-8046-23dd551c6c81/volumes" Oct 07 15:06:03 crc kubenswrapper[4854]: I1007 15:06:03.709755 4854 scope.go:117] "RemoveContainer" containerID="c9aa2a3ee4561b0972ff5e54f922a75687719cdae51f1e918393112c2e5d05e1" Oct 07 15:06:03 crc kubenswrapper[4854]: E1007 15:06:03.711264 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 15:06:16 crc kubenswrapper[4854]: I1007 15:06:16.703663 4854 scope.go:117] "RemoveContainer" containerID="c9aa2a3ee4561b0972ff5e54f922a75687719cdae51f1e918393112c2e5d05e1" Oct 07 15:06:16 crc kubenswrapper[4854]: E1007 15:06:16.704920 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 15:06:20 crc kubenswrapper[4854]: I1007 15:06:20.053200 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6j8ng"] Oct 07 15:06:20 crc kubenswrapper[4854]: E1007 15:06:20.056269 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9aea35dc-aa3f-45d5-8046-23dd551c6c81" containerName="extract-content" Oct 07 15:06:20 crc kubenswrapper[4854]: I1007 15:06:20.056475 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aea35dc-aa3f-45d5-8046-23dd551c6c81" containerName="extract-content" Oct 07 15:06:20 crc kubenswrapper[4854]: E1007 15:06:20.056622 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9aea35dc-aa3f-45d5-8046-23dd551c6c81" containerName="registry-server" Oct 07 15:06:20 crc kubenswrapper[4854]: I1007 15:06:20.056710 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aea35dc-aa3f-45d5-8046-23dd551c6c81" containerName="registry-server" Oct 07 15:06:20 crc kubenswrapper[4854]: E1007 15:06:20.056811 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9aea35dc-aa3f-45d5-8046-23dd551c6c81" containerName="extract-utilities" Oct 07 15:06:20 crc kubenswrapper[4854]: I1007 15:06:20.056902 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aea35dc-aa3f-45d5-8046-23dd551c6c81" containerName="extract-utilities" Oct 07 15:06:20 crc kubenswrapper[4854]: I1007 15:06:20.057414 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="9aea35dc-aa3f-45d5-8046-23dd551c6c81" containerName="registry-server" Oct 07 15:06:20 crc kubenswrapper[4854]: I1007 15:06:20.060435 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6j8ng" Oct 07 15:06:20 crc kubenswrapper[4854]: I1007 15:06:20.065112 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6j8ng"] Oct 07 15:06:20 crc kubenswrapper[4854]: I1007 15:06:20.163638 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb40dff0-cd99-417b-af61-738537703c0a-utilities\") pod \"redhat-marketplace-6j8ng\" (UID: \"cb40dff0-cd99-417b-af61-738537703c0a\") " pod="openshift-marketplace/redhat-marketplace-6j8ng" Oct 07 15:06:20 crc kubenswrapper[4854]: I1007 15:06:20.163727 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb40dff0-cd99-417b-af61-738537703c0a-catalog-content\") pod \"redhat-marketplace-6j8ng\" (UID: \"cb40dff0-cd99-417b-af61-738537703c0a\") " pod="openshift-marketplace/redhat-marketplace-6j8ng" Oct 07 15:06:20 crc kubenswrapper[4854]: I1007 15:06:20.163941 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fl6k\" (UniqueName: \"kubernetes.io/projected/cb40dff0-cd99-417b-af61-738537703c0a-kube-api-access-7fl6k\") pod \"redhat-marketplace-6j8ng\" (UID: \"cb40dff0-cd99-417b-af61-738537703c0a\") " pod="openshift-marketplace/redhat-marketplace-6j8ng" Oct 07 15:06:20 crc kubenswrapper[4854]: I1007 15:06:20.265892 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fl6k\" (UniqueName: \"kubernetes.io/projected/cb40dff0-cd99-417b-af61-738537703c0a-kube-api-access-7fl6k\") pod \"redhat-marketplace-6j8ng\" (UID: \"cb40dff0-cd99-417b-af61-738537703c0a\") " pod="openshift-marketplace/redhat-marketplace-6j8ng" Oct 07 15:06:20 crc kubenswrapper[4854]: I1007 15:06:20.266082 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb40dff0-cd99-417b-af61-738537703c0a-utilities\") pod \"redhat-marketplace-6j8ng\" (UID: \"cb40dff0-cd99-417b-af61-738537703c0a\") " pod="openshift-marketplace/redhat-marketplace-6j8ng" Oct 07 15:06:20 crc kubenswrapper[4854]: I1007 15:06:20.266170 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb40dff0-cd99-417b-af61-738537703c0a-catalog-content\") pod \"redhat-marketplace-6j8ng\" (UID: \"cb40dff0-cd99-417b-af61-738537703c0a\") " pod="openshift-marketplace/redhat-marketplace-6j8ng" Oct 07 15:06:20 crc kubenswrapper[4854]: I1007 15:06:20.266738 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb40dff0-cd99-417b-af61-738537703c0a-catalog-content\") pod \"redhat-marketplace-6j8ng\" (UID: \"cb40dff0-cd99-417b-af61-738537703c0a\") " pod="openshift-marketplace/redhat-marketplace-6j8ng" Oct 07 15:06:20 crc kubenswrapper[4854]: I1007 15:06:20.266819 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb40dff0-cd99-417b-af61-738537703c0a-utilities\") pod \"redhat-marketplace-6j8ng\" (UID: \"cb40dff0-cd99-417b-af61-738537703c0a\") " pod="openshift-marketplace/redhat-marketplace-6j8ng" Oct 07 15:06:20 crc kubenswrapper[4854]: I1007 15:06:20.286753 4854 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-7fl6k\" (UniqueName: \"kubernetes.io/projected/cb40dff0-cd99-417b-af61-738537703c0a-kube-api-access-7fl6k\") pod \"redhat-marketplace-6j8ng\" (UID: \"cb40dff0-cd99-417b-af61-738537703c0a\") " pod="openshift-marketplace/redhat-marketplace-6j8ng" Oct 07 15:06:20 crc kubenswrapper[4854]: I1007 15:06:20.399067 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6j8ng" Oct 07 15:06:20 crc kubenswrapper[4854]: I1007 15:06:20.885770 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6j8ng"] Oct 07 15:06:21 crc kubenswrapper[4854]: I1007 15:06:21.807249 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6j8ng" event={"ID":"cb40dff0-cd99-417b-af61-738537703c0a","Type":"ContainerStarted","Data":"85bed3f7082af65a7770e5c109985b9c86fdedbe3745896d9abf6e5b92ade5da"} Oct 07 15:06:22 crc kubenswrapper[4854]: I1007 15:06:22.820183 4854 generic.go:334] "Generic (PLEG): container finished" podID="cb40dff0-cd99-417b-af61-738537703c0a" containerID="b4eaf34e9614b9eb93ce292b3492d952218580825da8bbc2628dd397e5a551c6" exitCode=0 Oct 07 15:06:22 crc kubenswrapper[4854]: I1007 15:06:22.820247 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6j8ng" event={"ID":"cb40dff0-cd99-417b-af61-738537703c0a","Type":"ContainerDied","Data":"b4eaf34e9614b9eb93ce292b3492d952218580825da8bbc2628dd397e5a551c6"} Oct 07 15:06:24 crc kubenswrapper[4854]: I1007 15:06:24.852773 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6j8ng" event={"ID":"cb40dff0-cd99-417b-af61-738537703c0a","Type":"ContainerStarted","Data":"9f297e1cfc9a075833ac41e7355d3057d147bcc062eb7ccbb6c488aaf6d09378"} Oct 07 15:06:25 crc kubenswrapper[4854]: I1007 15:06:25.868436 4854 generic.go:334] "Generic (PLEG): container finished" podID="cb40dff0-cd99-417b-af61-738537703c0a" containerID="9f297e1cfc9a075833ac41e7355d3057d147bcc062eb7ccbb6c488aaf6d09378" exitCode=0 Oct 07 15:06:25 crc kubenswrapper[4854]: I1007 15:06:25.868482 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6j8ng" event={"ID":"cb40dff0-cd99-417b-af61-738537703c0a","Type":"ContainerDied","Data":"9f297e1cfc9a075833ac41e7355d3057d147bcc062eb7ccbb6c488aaf6d09378"} Oct 07 15:06:27 crc kubenswrapper[4854]: I1007 15:06:27.898616 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6j8ng" event={"ID":"cb40dff0-cd99-417b-af61-738537703c0a","Type":"ContainerStarted","Data":"c7013950baad942d3f7e86b21a1e3282da5cc05cb5b349f9a09f184256777226"} Oct 07 15:06:27 crc kubenswrapper[4854]: I1007 15:06:27.933999 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6j8ng" podStartSLOduration=4.092357703 podStartE2EDuration="7.933968235s" podCreationTimestamp="2025-10-07 15:06:20 +0000 UTC" firstStartedPulling="2025-10-07 15:06:22.822683545 +0000 UTC m=+9698.810515800" lastFinishedPulling="2025-10-07 15:06:26.664294037 +0000 UTC m=+9702.652126332" observedRunningTime="2025-10-07 15:06:27.924724088 +0000 UTC m=+9703.912556373" watchObservedRunningTime="2025-10-07 15:06:27.933968235 +0000 UTC m=+9703.921800530" Oct 07 15:06:29 crc kubenswrapper[4854]: I1007 15:06:29.703054 4854 scope.go:117] "RemoveContainer" 
containerID="c9aa2a3ee4561b0972ff5e54f922a75687719cdae51f1e918393112c2e5d05e1" Oct 07 15:06:29 crc kubenswrapper[4854]: E1007 15:06:29.703832 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 15:06:30 crc kubenswrapper[4854]: I1007 15:06:30.399801 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6j8ng" Oct 07 15:06:30 crc kubenswrapper[4854]: I1007 15:06:30.400276 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6j8ng" Oct 07 15:06:30 crc kubenswrapper[4854]: I1007 15:06:30.485247 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6j8ng" Oct 07 15:06:32 crc kubenswrapper[4854]: I1007 15:06:32.013943 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6j8ng" Oct 07 15:06:32 crc kubenswrapper[4854]: I1007 15:06:32.086541 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6j8ng"] Oct 07 15:06:33 crc kubenswrapper[4854]: I1007 15:06:33.976276 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6j8ng" podUID="cb40dff0-cd99-417b-af61-738537703c0a" containerName="registry-server" containerID="cri-o://c7013950baad942d3f7e86b21a1e3282da5cc05cb5b349f9a09f184256777226" gracePeriod=2 Oct 07 15:06:34 crc kubenswrapper[4854]: I1007 15:06:34.488799 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6j8ng" Oct 07 15:06:34 crc kubenswrapper[4854]: I1007 15:06:34.513410 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb40dff0-cd99-417b-af61-738537703c0a-utilities\") pod \"cb40dff0-cd99-417b-af61-738537703c0a\" (UID: \"cb40dff0-cd99-417b-af61-738537703c0a\") " Oct 07 15:06:34 crc kubenswrapper[4854]: I1007 15:06:34.514785 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb40dff0-cd99-417b-af61-738537703c0a-utilities" (OuterVolumeSpecName: "utilities") pod "cb40dff0-cd99-417b-af61-738537703c0a" (UID: "cb40dff0-cd99-417b-af61-738537703c0a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:06:34 crc kubenswrapper[4854]: I1007 15:06:34.515666 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7fl6k\" (UniqueName: \"kubernetes.io/projected/cb40dff0-cd99-417b-af61-738537703c0a-kube-api-access-7fl6k\") pod \"cb40dff0-cd99-417b-af61-738537703c0a\" (UID: \"cb40dff0-cd99-417b-af61-738537703c0a\") " Oct 07 15:06:34 crc kubenswrapper[4854]: I1007 15:06:34.515765 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb40dff0-cd99-417b-af61-738537703c0a-catalog-content\") pod \"cb40dff0-cd99-417b-af61-738537703c0a\" (UID: \"cb40dff0-cd99-417b-af61-738537703c0a\") " Oct 07 15:06:34 crc kubenswrapper[4854]: I1007 15:06:34.516374 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb40dff0-cd99-417b-af61-738537703c0a-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 15:06:34 crc kubenswrapper[4854]: I1007 15:06:34.534351 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb40dff0-cd99-417b-af61-738537703c0a-kube-api-access-7fl6k" (OuterVolumeSpecName: "kube-api-access-7fl6k") pod "cb40dff0-cd99-417b-af61-738537703c0a" (UID: "cb40dff0-cd99-417b-af61-738537703c0a"). InnerVolumeSpecName "kube-api-access-7fl6k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 15:06:34 crc kubenswrapper[4854]: I1007 15:06:34.550666 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb40dff0-cd99-417b-af61-738537703c0a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cb40dff0-cd99-417b-af61-738537703c0a" (UID: "cb40dff0-cd99-417b-af61-738537703c0a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:06:34 crc kubenswrapper[4854]: I1007 15:06:34.618257 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7fl6k\" (UniqueName: \"kubernetes.io/projected/cb40dff0-cd99-417b-af61-738537703c0a-kube-api-access-7fl6k\") on node \"crc\" DevicePath \"\"" Oct 07 15:06:34 crc kubenswrapper[4854]: I1007 15:06:34.618304 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb40dff0-cd99-417b-af61-738537703c0a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 15:06:34 crc kubenswrapper[4854]: I1007 15:06:34.992097 4854 generic.go:334] "Generic (PLEG): container finished" podID="cb40dff0-cd99-417b-af61-738537703c0a" containerID="c7013950baad942d3f7e86b21a1e3282da5cc05cb5b349f9a09f184256777226" exitCode=0 Oct 07 15:06:34 crc kubenswrapper[4854]: I1007 15:06:34.992228 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6j8ng" event={"ID":"cb40dff0-cd99-417b-af61-738537703c0a","Type":"ContainerDied","Data":"c7013950baad942d3f7e86b21a1e3282da5cc05cb5b349f9a09f184256777226"} Oct 07 15:06:34 crc kubenswrapper[4854]: I1007 15:06:34.992358 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6j8ng" event={"ID":"cb40dff0-cd99-417b-af61-738537703c0a","Type":"ContainerDied","Data":"85bed3f7082af65a7770e5c109985b9c86fdedbe3745896d9abf6e5b92ade5da"} Oct 07 15:06:34 crc kubenswrapper[4854]: I1007 15:06:34.992394 4854 scope.go:117] "RemoveContainer" containerID="c7013950baad942d3f7e86b21a1e3282da5cc05cb5b349f9a09f184256777226" Oct 07 15:06:34 crc kubenswrapper[4854]: I1007 15:06:34.992294 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6j8ng" Oct 07 15:06:35 crc kubenswrapper[4854]: I1007 15:06:35.042251 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6j8ng"] Oct 07 15:06:35 crc kubenswrapper[4854]: I1007 15:06:35.044372 4854 scope.go:117] "RemoveContainer" containerID="9f297e1cfc9a075833ac41e7355d3057d147bcc062eb7ccbb6c488aaf6d09378" Oct 07 15:06:35 crc kubenswrapper[4854]: I1007 15:06:35.053851 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6j8ng"] Oct 07 15:06:35 crc kubenswrapper[4854]: I1007 15:06:35.463737 4854 scope.go:117] "RemoveContainer" containerID="b4eaf34e9614b9eb93ce292b3492d952218580825da8bbc2628dd397e5a551c6" Oct 07 15:06:35 crc kubenswrapper[4854]: I1007 15:06:35.550019 4854 scope.go:117] "RemoveContainer" containerID="c7013950baad942d3f7e86b21a1e3282da5cc05cb5b349f9a09f184256777226" Oct 07 15:06:35 crc kubenswrapper[4854]: E1007 15:06:35.550609 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7013950baad942d3f7e86b21a1e3282da5cc05cb5b349f9a09f184256777226\": container with ID starting with c7013950baad942d3f7e86b21a1e3282da5cc05cb5b349f9a09f184256777226 not found: ID does not exist" containerID="c7013950baad942d3f7e86b21a1e3282da5cc05cb5b349f9a09f184256777226" Oct 07 15:06:35 crc kubenswrapper[4854]: I1007 15:06:35.550663 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7013950baad942d3f7e86b21a1e3282da5cc05cb5b349f9a09f184256777226"} err="failed to get container status \"c7013950baad942d3f7e86b21a1e3282da5cc05cb5b349f9a09f184256777226\": rpc error: code = NotFound desc = could not find container \"c7013950baad942d3f7e86b21a1e3282da5cc05cb5b349f9a09f184256777226\": container with ID starting with c7013950baad942d3f7e86b21a1e3282da5cc05cb5b349f9a09f184256777226 not found: ID does not exist" Oct 07 15:06:35 crc kubenswrapper[4854]: I1007 15:06:35.550694 4854 scope.go:117] "RemoveContainer" containerID="9f297e1cfc9a075833ac41e7355d3057d147bcc062eb7ccbb6c488aaf6d09378" Oct 07 15:06:35 crc kubenswrapper[4854]: E1007 15:06:35.551220 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f297e1cfc9a075833ac41e7355d3057d147bcc062eb7ccbb6c488aaf6d09378\": container with ID starting with 9f297e1cfc9a075833ac41e7355d3057d147bcc062eb7ccbb6c488aaf6d09378 not found: ID does not exist" containerID="9f297e1cfc9a075833ac41e7355d3057d147bcc062eb7ccbb6c488aaf6d09378" Oct 07 15:06:35 crc kubenswrapper[4854]: I1007 15:06:35.551291 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f297e1cfc9a075833ac41e7355d3057d147bcc062eb7ccbb6c488aaf6d09378"} err="failed to get container status \"9f297e1cfc9a075833ac41e7355d3057d147bcc062eb7ccbb6c488aaf6d09378\": rpc error: code = NotFound desc = could not find container \"9f297e1cfc9a075833ac41e7355d3057d147bcc062eb7ccbb6c488aaf6d09378\": container with ID starting with 9f297e1cfc9a075833ac41e7355d3057d147bcc062eb7ccbb6c488aaf6d09378 not found: ID does not exist" Oct 07 15:06:35 crc kubenswrapper[4854]: I1007 15:06:35.551342 4854 scope.go:117] "RemoveContainer" containerID="b4eaf34e9614b9eb93ce292b3492d952218580825da8bbc2628dd397e5a551c6" Oct 07 15:06:35 crc kubenswrapper[4854]: E1007 15:06:35.551642 4854 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"b4eaf34e9614b9eb93ce292b3492d952218580825da8bbc2628dd397e5a551c6\": container with ID starting with b4eaf34e9614b9eb93ce292b3492d952218580825da8bbc2628dd397e5a551c6 not found: ID does not exist" containerID="b4eaf34e9614b9eb93ce292b3492d952218580825da8bbc2628dd397e5a551c6" Oct 07 15:06:35 crc kubenswrapper[4854]: I1007 15:06:35.551678 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4eaf34e9614b9eb93ce292b3492d952218580825da8bbc2628dd397e5a551c6"} err="failed to get container status \"b4eaf34e9614b9eb93ce292b3492d952218580825da8bbc2628dd397e5a551c6\": rpc error: code = NotFound desc = could not find container \"b4eaf34e9614b9eb93ce292b3492d952218580825da8bbc2628dd397e5a551c6\": container with ID starting with b4eaf34e9614b9eb93ce292b3492d952218580825da8bbc2628dd397e5a551c6 not found: ID does not exist" Oct 07 15:06:36 crc kubenswrapper[4854]: I1007 15:06:36.717490 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb40dff0-cd99-417b-af61-738537703c0a" path="/var/lib/kubelet/pods/cb40dff0-cd99-417b-af61-738537703c0a/volumes" Oct 07 15:06:40 crc kubenswrapper[4854]: I1007 15:06:40.702823 4854 scope.go:117] "RemoveContainer" containerID="c9aa2a3ee4561b0972ff5e54f922a75687719cdae51f1e918393112c2e5d05e1" Oct 07 15:06:40 crc kubenswrapper[4854]: E1007 15:06:40.703970 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 15:06:51 crc kubenswrapper[4854]: I1007 15:06:51.703979 4854 scope.go:117] "RemoveContainer" containerID="c9aa2a3ee4561b0972ff5e54f922a75687719cdae51f1e918393112c2e5d05e1" Oct 07 15:06:51 crc kubenswrapper[4854]: E1007 15:06:51.704908 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 15:07:03 crc kubenswrapper[4854]: I1007 15:07:03.703863 4854 scope.go:117] "RemoveContainer" containerID="c9aa2a3ee4561b0972ff5e54f922a75687719cdae51f1e918393112c2e5d05e1" Oct 07 15:07:03 crc kubenswrapper[4854]: E1007 15:07:03.704877 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 15:07:12 crc kubenswrapper[4854]: I1007 15:07:12.447734 4854 generic.go:334] "Generic (PLEG): container finished" podID="0913c01c-83f0-4041-a160-3ab1f63d15f3" containerID="9abf41c25013d4683beb5c40160ef5cfb17da4b90416eefd014da2938b13523a" exitCode=0 Oct 07 15:07:12 crc kubenswrapper[4854]: I1007 15:07:12.447953 4854 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp" event={"ID":"0913c01c-83f0-4041-a160-3ab1f63d15f3","Type":"ContainerDied","Data":"9abf41c25013d4683beb5c40160ef5cfb17da4b90416eefd014da2938b13523a"} Oct 07 15:07:13 crc kubenswrapper[4854]: I1007 15:07:13.927949 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp" Oct 07 15:07:13 crc kubenswrapper[4854]: I1007 15:07:13.965096 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0913c01c-83f0-4041-a160-3ab1f63d15f3-nova-cell1-compute-config-0\") pod \"0913c01c-83f0-4041-a160-3ab1f63d15f3\" (UID: \"0913c01c-83f0-4041-a160-3ab1f63d15f3\") " Oct 07 15:07:13 crc kubenswrapper[4854]: I1007 15:07:13.965255 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0913c01c-83f0-4041-a160-3ab1f63d15f3-inventory\") pod \"0913c01c-83f0-4041-a160-3ab1f63d15f3\" (UID: \"0913c01c-83f0-4041-a160-3ab1f63d15f3\") " Oct 07 15:07:13 crc kubenswrapper[4854]: I1007 15:07:13.965299 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0913c01c-83f0-4041-a160-3ab1f63d15f3-ceph\") pod \"0913c01c-83f0-4041-a160-3ab1f63d15f3\" (UID: \"0913c01c-83f0-4041-a160-3ab1f63d15f3\") " Oct 07 15:07:13 crc kubenswrapper[4854]: I1007 15:07:13.965349 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/0913c01c-83f0-4041-a160-3ab1f63d15f3-nova-cells-global-config-0\") pod \"0913c01c-83f0-4041-a160-3ab1f63d15f3\" (UID: \"0913c01c-83f0-4041-a160-3ab1f63d15f3\") " Oct 07 15:07:13 crc kubenswrapper[4854]: I1007 15:07:13.965412 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/0913c01c-83f0-4041-a160-3ab1f63d15f3-nova-cells-global-config-1\") pod \"0913c01c-83f0-4041-a160-3ab1f63d15f3\" (UID: \"0913c01c-83f0-4041-a160-3ab1f63d15f3\") " Oct 07 15:07:13 crc kubenswrapper[4854]: I1007 15:07:13.965507 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/0913c01c-83f0-4041-a160-3ab1f63d15f3-nova-cell1-compute-config-1\") pod \"0913c01c-83f0-4041-a160-3ab1f63d15f3\" (UID: \"0913c01c-83f0-4041-a160-3ab1f63d15f3\") " Oct 07 15:07:13 crc kubenswrapper[4854]: I1007 15:07:13.965567 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/0913c01c-83f0-4041-a160-3ab1f63d15f3-nova-migration-ssh-key-1\") pod \"0913c01c-83f0-4041-a160-3ab1f63d15f3\" (UID: \"0913c01c-83f0-4041-a160-3ab1f63d15f3\") " Oct 07 15:07:13 crc kubenswrapper[4854]: I1007 15:07:13.965607 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0913c01c-83f0-4041-a160-3ab1f63d15f3-ssh-key\") pod \"0913c01c-83f0-4041-a160-3ab1f63d15f3\" (UID: \"0913c01c-83f0-4041-a160-3ab1f63d15f3\") " Oct 07 15:07:13 crc kubenswrapper[4854]: I1007 15:07:13.965682 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0913c01c-83f0-4041-a160-3ab1f63d15f3-nova-cell1-combined-ca-bundle\") pod \"0913c01c-83f0-4041-a160-3ab1f63d15f3\" (UID: \"0913c01c-83f0-4041-a160-3ab1f63d15f3\") " Oct 07 15:07:13 crc kubenswrapper[4854]: I1007 15:07:13.965851 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/0913c01c-83f0-4041-a160-3ab1f63d15f3-nova-migration-ssh-key-0\") pod \"0913c01c-83f0-4041-a160-3ab1f63d15f3\" (UID: \"0913c01c-83f0-4041-a160-3ab1f63d15f3\") " Oct 07 15:07:13 crc kubenswrapper[4854]: I1007 15:07:13.965919 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqc78\" (UniqueName: \"kubernetes.io/projected/0913c01c-83f0-4041-a160-3ab1f63d15f3-kube-api-access-rqc78\") pod \"0913c01c-83f0-4041-a160-3ab1f63d15f3\" (UID: \"0913c01c-83f0-4041-a160-3ab1f63d15f3\") " Oct 07 15:07:13 crc kubenswrapper[4854]: I1007 15:07:13.973827 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0913c01c-83f0-4041-a160-3ab1f63d15f3-ceph" (OuterVolumeSpecName: "ceph") pod "0913c01c-83f0-4041-a160-3ab1f63d15f3" (UID: "0913c01c-83f0-4041-a160-3ab1f63d15f3"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 15:07:13 crc kubenswrapper[4854]: I1007 15:07:13.978614 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0913c01c-83f0-4041-a160-3ab1f63d15f3-kube-api-access-rqc78" (OuterVolumeSpecName: "kube-api-access-rqc78") pod "0913c01c-83f0-4041-a160-3ab1f63d15f3" (UID: "0913c01c-83f0-4041-a160-3ab1f63d15f3"). InnerVolumeSpecName "kube-api-access-rqc78". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 15:07:13 crc kubenswrapper[4854]: I1007 15:07:13.990445 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0913c01c-83f0-4041-a160-3ab1f63d15f3-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "0913c01c-83f0-4041-a160-3ab1f63d15f3" (UID: "0913c01c-83f0-4041-a160-3ab1f63d15f3"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 15:07:14 crc kubenswrapper[4854]: I1007 15:07:14.000757 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0913c01c-83f0-4041-a160-3ab1f63d15f3-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "0913c01c-83f0-4041-a160-3ab1f63d15f3" (UID: "0913c01c-83f0-4041-a160-3ab1f63d15f3"). InnerVolumeSpecName "nova-cells-global-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 15:07:14 crc kubenswrapper[4854]: I1007 15:07:14.016722 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0913c01c-83f0-4041-a160-3ab1f63d15f3-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "0913c01c-83f0-4041-a160-3ab1f63d15f3" (UID: "0913c01c-83f0-4041-a160-3ab1f63d15f3"). InnerVolumeSpecName "nova-cell1-compute-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 15:07:14 crc kubenswrapper[4854]: I1007 15:07:14.019501 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0913c01c-83f0-4041-a160-3ab1f63d15f3-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "0913c01c-83f0-4041-a160-3ab1f63d15f3" (UID: "0913c01c-83f0-4041-a160-3ab1f63d15f3"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 15:07:14 crc kubenswrapper[4854]: I1007 15:07:14.029506 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0913c01c-83f0-4041-a160-3ab1f63d15f3-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "0913c01c-83f0-4041-a160-3ab1f63d15f3" (UID: "0913c01c-83f0-4041-a160-3ab1f63d15f3"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 15:07:14 crc kubenswrapper[4854]: I1007 15:07:14.031884 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0913c01c-83f0-4041-a160-3ab1f63d15f3-inventory" (OuterVolumeSpecName: "inventory") pod "0913c01c-83f0-4041-a160-3ab1f63d15f3" (UID: "0913c01c-83f0-4041-a160-3ab1f63d15f3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 15:07:14 crc kubenswrapper[4854]: I1007 15:07:14.034759 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0913c01c-83f0-4041-a160-3ab1f63d15f3-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0913c01c-83f0-4041-a160-3ab1f63d15f3" (UID: "0913c01c-83f0-4041-a160-3ab1f63d15f3"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 15:07:14 crc kubenswrapper[4854]: I1007 15:07:14.040782 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0913c01c-83f0-4041-a160-3ab1f63d15f3-nova-cells-global-config-1" (OuterVolumeSpecName: "nova-cells-global-config-1") pod "0913c01c-83f0-4041-a160-3ab1f63d15f3" (UID: "0913c01c-83f0-4041-a160-3ab1f63d15f3"). InnerVolumeSpecName "nova-cells-global-config-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 15:07:14 crc kubenswrapper[4854]: I1007 15:07:14.041658 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0913c01c-83f0-4041-a160-3ab1f63d15f3-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "0913c01c-83f0-4041-a160-3ab1f63d15f3" (UID: "0913c01c-83f0-4041-a160-3ab1f63d15f3"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 15:07:14 crc kubenswrapper[4854]: I1007 15:07:14.067925 4854 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/0913c01c-83f0-4041-a160-3ab1f63d15f3-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Oct 07 15:07:14 crc kubenswrapper[4854]: I1007 15:07:14.067954 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqc78\" (UniqueName: \"kubernetes.io/projected/0913c01c-83f0-4041-a160-3ab1f63d15f3-kube-api-access-rqc78\") on node \"crc\" DevicePath \"\"" Oct 07 15:07:14 crc kubenswrapper[4854]: I1007 15:07:14.067965 4854 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0913c01c-83f0-4041-a160-3ab1f63d15f3-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Oct 07 15:07:14 crc kubenswrapper[4854]: I1007 15:07:14.067977 4854 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0913c01c-83f0-4041-a160-3ab1f63d15f3-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 15:07:14 crc kubenswrapper[4854]: I1007 15:07:14.067986 4854 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0913c01c-83f0-4041-a160-3ab1f63d15f3-ceph\") on node \"crc\" DevicePath \"\"" Oct 07 15:07:14 crc kubenswrapper[4854]: I1007 15:07:14.067994 4854 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/0913c01c-83f0-4041-a160-3ab1f63d15f3-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Oct 07 15:07:14 crc kubenswrapper[4854]: I1007 15:07:14.068003 4854 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/0913c01c-83f0-4041-a160-3ab1f63d15f3-nova-cells-global-config-1\") on node \"crc\" DevicePath \"\"" Oct 07 15:07:14 crc kubenswrapper[4854]: I1007 15:07:14.068011 4854 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/0913c01c-83f0-4041-a160-3ab1f63d15f3-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Oct 07 15:07:14 crc kubenswrapper[4854]: I1007 15:07:14.068019 4854 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/0913c01c-83f0-4041-a160-3ab1f63d15f3-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Oct 07 15:07:14 crc kubenswrapper[4854]: I1007 15:07:14.068026 4854 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0913c01c-83f0-4041-a160-3ab1f63d15f3-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 15:07:14 crc kubenswrapper[4854]: I1007 15:07:14.068035 4854 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0913c01c-83f0-4041-a160-3ab1f63d15f3-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 15:07:14 crc kubenswrapper[4854]: I1007 15:07:14.474798 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp" event={"ID":"0913c01c-83f0-4041-a160-3ab1f63d15f3","Type":"ContainerDied","Data":"f80022350a753bcd7d3cb8d2736e51fd965479e83a0a5eb478bb6fb03c9f3e65"} Oct 07 15:07:14 crc kubenswrapper[4854]: I1007 15:07:14.474846 4854 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f80022350a753bcd7d3cb8d2736e51fd965479e83a0a5eb478bb6fb03c9f3e65" Oct 07 15:07:14 crc kubenswrapper[4854]: I1007 15:07:14.474895 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp" Oct 07 15:07:14 crc kubenswrapper[4854]: I1007 15:07:14.709370 4854 scope.go:117] "RemoveContainer" containerID="c9aa2a3ee4561b0972ff5e54f922a75687719cdae51f1e918393112c2e5d05e1" Oct 07 15:07:14 crc kubenswrapper[4854]: E1007 15:07:14.709887 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 15:07:27 crc kubenswrapper[4854]: I1007 15:07:27.703016 4854 scope.go:117] "RemoveContainer" containerID="c9aa2a3ee4561b0972ff5e54f922a75687719cdae51f1e918393112c2e5d05e1" Oct 07 15:07:27 crc kubenswrapper[4854]: E1007 15:07:27.705195 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 15:07:39 crc kubenswrapper[4854]: I1007 15:07:39.703346 4854 scope.go:117] "RemoveContainer" containerID="c9aa2a3ee4561b0972ff5e54f922a75687719cdae51f1e918393112c2e5d05e1" Oct 07 15:07:39 crc kubenswrapper[4854]: E1007 15:07:39.708308 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 15:07:51 crc kubenswrapper[4854]: I1007 15:07:51.702639 4854 scope.go:117] "RemoveContainer" containerID="c9aa2a3ee4561b0972ff5e54f922a75687719cdae51f1e918393112c2e5d05e1" Oct 07 15:07:51 crc kubenswrapper[4854]: E1007 15:07:51.703480 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 15:08:06 crc kubenswrapper[4854]: I1007 15:08:06.704102 4854 scope.go:117] "RemoveContainer" containerID="c9aa2a3ee4561b0972ff5e54f922a75687719cdae51f1e918393112c2e5d05e1" Oct 07 15:08:06 crc kubenswrapper[4854]: E1007 15:08:06.705086 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 15:08:17 crc kubenswrapper[4854]: I1007 15:08:17.703385 4854 scope.go:117] "RemoveContainer" containerID="c9aa2a3ee4561b0972ff5e54f922a75687719cdae51f1e918393112c2e5d05e1" Oct 07 15:08:17 crc kubenswrapper[4854]: E1007 15:08:17.704088 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 15:08:27 crc kubenswrapper[4854]: E1007 15:08:27.520261 4854 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.243:58108->38.102.83.243:43445: read tcp 38.102.83.243:58108->38.102.83.243:43445: read: connection reset by peer Oct 07 15:08:31 crc kubenswrapper[4854]: I1007 15:08:31.703698 4854 scope.go:117] "RemoveContainer" containerID="c9aa2a3ee4561b0972ff5e54f922a75687719cdae51f1e918393112c2e5d05e1" Oct 07 15:08:31 crc kubenswrapper[4854]: E1007 15:08:31.704479 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 15:08:42 crc kubenswrapper[4854]: I1007 15:08:42.217603 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rc62w"] Oct 07 15:08:42 crc kubenswrapper[4854]: E1007 15:08:42.218964 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb40dff0-cd99-417b-af61-738537703c0a" containerName="registry-server" Oct 07 15:08:42 crc kubenswrapper[4854]: I1007 15:08:42.218987 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb40dff0-cd99-417b-af61-738537703c0a" containerName="registry-server" Oct 07 15:08:42 crc kubenswrapper[4854]: E1007 15:08:42.219011 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb40dff0-cd99-417b-af61-738537703c0a" containerName="extract-content" Oct 07 15:08:42 crc kubenswrapper[4854]: I1007 15:08:42.219022 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb40dff0-cd99-417b-af61-738537703c0a" containerName="extract-content" Oct 07 15:08:42 crc kubenswrapper[4854]: E1007 15:08:42.219064 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb40dff0-cd99-417b-af61-738537703c0a" containerName="extract-utilities" Oct 07 15:08:42 crc kubenswrapper[4854]: I1007 15:08:42.219074 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb40dff0-cd99-417b-af61-738537703c0a" containerName="extract-utilities" Oct 07 15:08:42 crc kubenswrapper[4854]: E1007 15:08:42.219090 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0913c01c-83f0-4041-a160-3ab1f63d15f3" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Oct 07 15:08:42 crc kubenswrapper[4854]: I1007 15:08:42.219101 4854 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="0913c01c-83f0-4041-a160-3ab1f63d15f3" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Oct 07 15:08:42 crc kubenswrapper[4854]: I1007 15:08:42.219412 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="0913c01c-83f0-4041-a160-3ab1f63d15f3" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Oct 07 15:08:42 crc kubenswrapper[4854]: I1007 15:08:42.219451 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb40dff0-cd99-417b-af61-738537703c0a" containerName="registry-server" Oct 07 15:08:42 crc kubenswrapper[4854]: I1007 15:08:42.222384 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rc62w" Oct 07 15:08:42 crc kubenswrapper[4854]: I1007 15:08:42.234521 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rc62w"] Oct 07 15:08:42 crc kubenswrapper[4854]: I1007 15:08:42.397654 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rw9z4\" (UniqueName: \"kubernetes.io/projected/4a57db85-5ef6-44f4-9265-965e2626a116-kube-api-access-rw9z4\") pod \"certified-operators-rc62w\" (UID: \"4a57db85-5ef6-44f4-9265-965e2626a116\") " pod="openshift-marketplace/certified-operators-rc62w" Oct 07 15:08:42 crc kubenswrapper[4854]: I1007 15:08:42.397819 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a57db85-5ef6-44f4-9265-965e2626a116-catalog-content\") pod \"certified-operators-rc62w\" (UID: \"4a57db85-5ef6-44f4-9265-965e2626a116\") " pod="openshift-marketplace/certified-operators-rc62w" Oct 07 15:08:42 crc kubenswrapper[4854]: I1007 15:08:42.398054 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a57db85-5ef6-44f4-9265-965e2626a116-utilities\") pod \"certified-operators-rc62w\" (UID: \"4a57db85-5ef6-44f4-9265-965e2626a116\") " pod="openshift-marketplace/certified-operators-rc62w" Oct 07 15:08:42 crc kubenswrapper[4854]: I1007 15:08:42.416312 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hctn7"] Oct 07 15:08:42 crc kubenswrapper[4854]: I1007 15:08:42.418835 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hctn7" Oct 07 15:08:42 crc kubenswrapper[4854]: I1007 15:08:42.427934 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hctn7"] Oct 07 15:08:42 crc kubenswrapper[4854]: I1007 15:08:42.500293 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a57db85-5ef6-44f4-9265-965e2626a116-utilities\") pod \"certified-operators-rc62w\" (UID: \"4a57db85-5ef6-44f4-9265-965e2626a116\") " pod="openshift-marketplace/certified-operators-rc62w" Oct 07 15:08:42 crc kubenswrapper[4854]: I1007 15:08:42.500676 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rw9z4\" (UniqueName: \"kubernetes.io/projected/4a57db85-5ef6-44f4-9265-965e2626a116-kube-api-access-rw9z4\") pod \"certified-operators-rc62w\" (UID: \"4a57db85-5ef6-44f4-9265-965e2626a116\") " pod="openshift-marketplace/certified-operators-rc62w" Oct 07 15:08:42 crc kubenswrapper[4854]: I1007 15:08:42.500706 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3fcfef0-5261-4b36-8507-e2049441a06b-utilities\") pod \"community-operators-hctn7\" (UID: \"c3fcfef0-5261-4b36-8507-e2049441a06b\") " pod="openshift-marketplace/community-operators-hctn7" Oct 07 15:08:42 crc kubenswrapper[4854]: I1007 15:08:42.500755 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3fcfef0-5261-4b36-8507-e2049441a06b-catalog-content\") pod \"community-operators-hctn7\" (UID: \"c3fcfef0-5261-4b36-8507-e2049441a06b\") " pod="openshift-marketplace/community-operators-hctn7" Oct 07 15:08:42 crc kubenswrapper[4854]: I1007 15:08:42.500817 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a57db85-5ef6-44f4-9265-965e2626a116-catalog-content\") pod \"certified-operators-rc62w\" (UID: \"4a57db85-5ef6-44f4-9265-965e2626a116\") " pod="openshift-marketplace/certified-operators-rc62w" Oct 07 15:08:42 crc kubenswrapper[4854]: I1007 15:08:42.500860 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a57db85-5ef6-44f4-9265-965e2626a116-utilities\") pod \"certified-operators-rc62w\" (UID: \"4a57db85-5ef6-44f4-9265-965e2626a116\") " pod="openshift-marketplace/certified-operators-rc62w" Oct 07 15:08:42 crc kubenswrapper[4854]: I1007 15:08:42.500917 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxv58\" (UniqueName: \"kubernetes.io/projected/c3fcfef0-5261-4b36-8507-e2049441a06b-kube-api-access-vxv58\") pod \"community-operators-hctn7\" (UID: \"c3fcfef0-5261-4b36-8507-e2049441a06b\") " pod="openshift-marketplace/community-operators-hctn7" Oct 07 15:08:42 crc kubenswrapper[4854]: I1007 15:08:42.501214 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a57db85-5ef6-44f4-9265-965e2626a116-catalog-content\") pod \"certified-operators-rc62w\" (UID: \"4a57db85-5ef6-44f4-9265-965e2626a116\") " pod="openshift-marketplace/certified-operators-rc62w" Oct 07 15:08:42 crc kubenswrapper[4854]: I1007 15:08:42.526086 4854 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-rw9z4\" (UniqueName: \"kubernetes.io/projected/4a57db85-5ef6-44f4-9265-965e2626a116-kube-api-access-rw9z4\") pod \"certified-operators-rc62w\" (UID: \"4a57db85-5ef6-44f4-9265-965e2626a116\") " pod="openshift-marketplace/certified-operators-rc62w" Oct 07 15:08:42 crc kubenswrapper[4854]: I1007 15:08:42.555666 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rc62w" Oct 07 15:08:42 crc kubenswrapper[4854]: I1007 15:08:42.602433 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3fcfef0-5261-4b36-8507-e2049441a06b-utilities\") pod \"community-operators-hctn7\" (UID: \"c3fcfef0-5261-4b36-8507-e2049441a06b\") " pod="openshift-marketplace/community-operators-hctn7" Oct 07 15:08:42 crc kubenswrapper[4854]: I1007 15:08:42.602493 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3fcfef0-5261-4b36-8507-e2049441a06b-catalog-content\") pod \"community-operators-hctn7\" (UID: \"c3fcfef0-5261-4b36-8507-e2049441a06b\") " pod="openshift-marketplace/community-operators-hctn7" Oct 07 15:08:42 crc kubenswrapper[4854]: I1007 15:08:42.602539 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxv58\" (UniqueName: \"kubernetes.io/projected/c3fcfef0-5261-4b36-8507-e2049441a06b-kube-api-access-vxv58\") pod \"community-operators-hctn7\" (UID: \"c3fcfef0-5261-4b36-8507-e2049441a06b\") " pod="openshift-marketplace/community-operators-hctn7" Oct 07 15:08:42 crc kubenswrapper[4854]: I1007 15:08:42.603390 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3fcfef0-5261-4b36-8507-e2049441a06b-utilities\") pod \"community-operators-hctn7\" (UID: \"c3fcfef0-5261-4b36-8507-e2049441a06b\") " pod="openshift-marketplace/community-operators-hctn7" Oct 07 15:08:42 crc kubenswrapper[4854]: I1007 15:08:42.603607 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3fcfef0-5261-4b36-8507-e2049441a06b-catalog-content\") pod \"community-operators-hctn7\" (UID: \"c3fcfef0-5261-4b36-8507-e2049441a06b\") " pod="openshift-marketplace/community-operators-hctn7" Oct 07 15:08:42 crc kubenswrapper[4854]: I1007 15:08:42.621240 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxv58\" (UniqueName: \"kubernetes.io/projected/c3fcfef0-5261-4b36-8507-e2049441a06b-kube-api-access-vxv58\") pod \"community-operators-hctn7\" (UID: \"c3fcfef0-5261-4b36-8507-e2049441a06b\") " pod="openshift-marketplace/community-operators-hctn7" Oct 07 15:08:42 crc kubenswrapper[4854]: I1007 15:08:42.743782 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hctn7" Oct 07 15:08:43 crc kubenswrapper[4854]: I1007 15:08:43.180708 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rc62w"] Oct 07 15:08:43 crc kubenswrapper[4854]: I1007 15:08:43.398525 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hctn7"] Oct 07 15:08:43 crc kubenswrapper[4854]: I1007 15:08:43.447705 4854 generic.go:334] "Generic (PLEG): container finished" podID="4a57db85-5ef6-44f4-9265-965e2626a116" containerID="202ca976961d84cbb9da5fe55d760da9c07523dd642882369f108be0bf1881c1" exitCode=0 Oct 07 15:08:43 crc kubenswrapper[4854]: I1007 15:08:43.447752 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rc62w" event={"ID":"4a57db85-5ef6-44f4-9265-965e2626a116","Type":"ContainerDied","Data":"202ca976961d84cbb9da5fe55d760da9c07523dd642882369f108be0bf1881c1"} Oct 07 15:08:43 crc kubenswrapper[4854]: I1007 15:08:43.447779 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rc62w" event={"ID":"4a57db85-5ef6-44f4-9265-965e2626a116","Type":"ContainerStarted","Data":"5bc6c7cc3dd038b2d2417d874a361b0fd96a3763018f3ceff41fca44d69df487"} Oct 07 15:08:43 crc kubenswrapper[4854]: W1007 15:08:43.458388 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3fcfef0_5261_4b36_8507_e2049441a06b.slice/crio-824c25d01d54c80160edd4c91679142820b21bb570d55dee444489d1eeb8a865 WatchSource:0}: Error finding container 824c25d01d54c80160edd4c91679142820b21bb570d55dee444489d1eeb8a865: Status 404 returned error can't find the container with id 824c25d01d54c80160edd4c91679142820b21bb570d55dee444489d1eeb8a865 Oct 07 15:08:44 crc kubenswrapper[4854]: I1007 15:08:44.457801 4854 generic.go:334] "Generic (PLEG): container finished" podID="c3fcfef0-5261-4b36-8507-e2049441a06b" containerID="cbcfd2ea95162bf4eee69dccb0c749aa530ec037a03102317b2b856b89701cf2" exitCode=0 Oct 07 15:08:44 crc kubenswrapper[4854]: I1007 15:08:44.457862 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hctn7" event={"ID":"c3fcfef0-5261-4b36-8507-e2049441a06b","Type":"ContainerDied","Data":"cbcfd2ea95162bf4eee69dccb0c749aa530ec037a03102317b2b856b89701cf2"} Oct 07 15:08:44 crc kubenswrapper[4854]: I1007 15:08:44.457893 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hctn7" event={"ID":"c3fcfef0-5261-4b36-8507-e2049441a06b","Type":"ContainerStarted","Data":"824c25d01d54c80160edd4c91679142820b21bb570d55dee444489d1eeb8a865"} Oct 07 15:08:44 crc kubenswrapper[4854]: I1007 15:08:44.717815 4854 scope.go:117] "RemoveContainer" containerID="c9aa2a3ee4561b0972ff5e54f922a75687719cdae51f1e918393112c2e5d05e1" Oct 07 15:08:44 crc kubenswrapper[4854]: E1007 15:08:44.718669 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 15:08:52 crc kubenswrapper[4854]: I1007 15:08:52.541934 4854 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/community-operators-hctn7" event={"ID":"c3fcfef0-5261-4b36-8507-e2049441a06b","Type":"ContainerStarted","Data":"482dd1bf21d0dc91ea3252c574e856d2267f8429b11b9744e433ed81a165d1c7"} Oct 07 15:08:53 crc kubenswrapper[4854]: I1007 15:08:53.557843 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rc62w" event={"ID":"4a57db85-5ef6-44f4-9265-965e2626a116","Type":"ContainerStarted","Data":"7ebe91d2a51af28f4fd4aa723a88fd3c6fbc1b9656c9794ca8f5f7fcf533c38d"} Oct 07 15:08:54 crc kubenswrapper[4854]: I1007 15:08:54.579299 4854 generic.go:334] "Generic (PLEG): container finished" podID="c3fcfef0-5261-4b36-8507-e2049441a06b" containerID="482dd1bf21d0dc91ea3252c574e856d2267f8429b11b9744e433ed81a165d1c7" exitCode=0 Oct 07 15:08:54 crc kubenswrapper[4854]: I1007 15:08:54.579502 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hctn7" event={"ID":"c3fcfef0-5261-4b36-8507-e2049441a06b","Type":"ContainerDied","Data":"482dd1bf21d0dc91ea3252c574e856d2267f8429b11b9744e433ed81a165d1c7"} Oct 07 15:08:54 crc kubenswrapper[4854]: I1007 15:08:54.584285 4854 generic.go:334] "Generic (PLEG): container finished" podID="4a57db85-5ef6-44f4-9265-965e2626a116" containerID="7ebe91d2a51af28f4fd4aa723a88fd3c6fbc1b9656c9794ca8f5f7fcf533c38d" exitCode=0 Oct 07 15:08:54 crc kubenswrapper[4854]: I1007 15:08:54.584317 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rc62w" event={"ID":"4a57db85-5ef6-44f4-9265-965e2626a116","Type":"ContainerDied","Data":"7ebe91d2a51af28f4fd4aa723a88fd3c6fbc1b9656c9794ca8f5f7fcf533c38d"} Oct 07 15:08:56 crc kubenswrapper[4854]: I1007 15:08:56.703220 4854 scope.go:117] "RemoveContainer" containerID="c9aa2a3ee4561b0972ff5e54f922a75687719cdae51f1e918393112c2e5d05e1" Oct 07 15:08:56 crc kubenswrapper[4854]: E1007 15:08:56.703806 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 15:08:59 crc kubenswrapper[4854]: I1007 15:08:59.656138 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hctn7" event={"ID":"c3fcfef0-5261-4b36-8507-e2049441a06b","Type":"ContainerStarted","Data":"e372fdaa4c251ae56db592724a968b9fe4de2d1beb9e3f03b1893234b518af29"} Oct 07 15:08:59 crc kubenswrapper[4854]: I1007 15:08:59.658305 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rc62w" event={"ID":"4a57db85-5ef6-44f4-9265-965e2626a116","Type":"ContainerStarted","Data":"4be3ec3ec8921c1ab7aa72b8bef8cee90d70897acb8654ed1cdad1351f24303a"} Oct 07 15:08:59 crc kubenswrapper[4854]: I1007 15:08:59.687856 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hctn7" podStartSLOduration=3.190516842 podStartE2EDuration="17.687832221s" podCreationTimestamp="2025-10-07 15:08:42 +0000 UTC" firstStartedPulling="2025-10-07 15:08:44.459954778 +0000 UTC m=+9840.447787043" lastFinishedPulling="2025-10-07 15:08:58.957270167 +0000 UTC m=+9854.945102422" observedRunningTime="2025-10-07 15:08:59.678601974 +0000 UTC 
m=+9855.666434249" watchObservedRunningTime="2025-10-07 15:08:59.687832221 +0000 UTC m=+9855.675664486" Oct 07 15:08:59 crc kubenswrapper[4854]: I1007 15:08:59.706557 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rc62w" podStartSLOduration=2.637526136 podStartE2EDuration="17.706538653s" podCreationTimestamp="2025-10-07 15:08:42 +0000 UTC" firstStartedPulling="2025-10-07 15:08:43.450613855 +0000 UTC m=+9839.438446120" lastFinishedPulling="2025-10-07 15:08:58.519626332 +0000 UTC m=+9854.507458637" observedRunningTime="2025-10-07 15:08:59.698263493 +0000 UTC m=+9855.686095778" watchObservedRunningTime="2025-10-07 15:08:59.706538653 +0000 UTC m=+9855.694370908" Oct 07 15:09:02 crc kubenswrapper[4854]: I1007 15:09:02.557405 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rc62w" Oct 07 15:09:02 crc kubenswrapper[4854]: I1007 15:09:02.558228 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rc62w" Oct 07 15:09:02 crc kubenswrapper[4854]: I1007 15:09:02.608359 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rc62w" Oct 07 15:09:02 crc kubenswrapper[4854]: I1007 15:09:02.744959 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hctn7" Oct 07 15:09:02 crc kubenswrapper[4854]: I1007 15:09:02.745248 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hctn7" Oct 07 15:09:02 crc kubenswrapper[4854]: I1007 15:09:02.793669 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hctn7" Oct 07 15:09:08 crc kubenswrapper[4854]: I1007 15:09:08.703358 4854 scope.go:117] "RemoveContainer" containerID="c9aa2a3ee4561b0972ff5e54f922a75687719cdae51f1e918393112c2e5d05e1" Oct 07 15:09:08 crc kubenswrapper[4854]: E1007 15:09:08.704276 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 15:09:13 crc kubenswrapper[4854]: I1007 15:09:13.388868 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hctn7" Oct 07 15:09:13 crc kubenswrapper[4854]: I1007 15:09:13.395778 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rc62w" Oct 07 15:09:13 crc kubenswrapper[4854]: I1007 15:09:13.819484 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hctn7"] Oct 07 15:09:13 crc kubenswrapper[4854]: I1007 15:09:13.820029 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hctn7" podUID="c3fcfef0-5261-4b36-8507-e2049441a06b" containerName="registry-server" containerID="cri-o://e372fdaa4c251ae56db592724a968b9fe4de2d1beb9e3f03b1893234b518af29" gracePeriod=2 Oct 07 15:09:14 crc kubenswrapper[4854]: I1007 15:09:14.386138 4854 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hctn7" Oct 07 15:09:14 crc kubenswrapper[4854]: I1007 15:09:14.435336 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3fcfef0-5261-4b36-8507-e2049441a06b-utilities\") pod \"c3fcfef0-5261-4b36-8507-e2049441a06b\" (UID: \"c3fcfef0-5261-4b36-8507-e2049441a06b\") " Oct 07 15:09:14 crc kubenswrapper[4854]: I1007 15:09:14.435589 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxv58\" (UniqueName: \"kubernetes.io/projected/c3fcfef0-5261-4b36-8507-e2049441a06b-kube-api-access-vxv58\") pod \"c3fcfef0-5261-4b36-8507-e2049441a06b\" (UID: \"c3fcfef0-5261-4b36-8507-e2049441a06b\") " Oct 07 15:09:14 crc kubenswrapper[4854]: I1007 15:09:14.435752 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3fcfef0-5261-4b36-8507-e2049441a06b-catalog-content\") pod \"c3fcfef0-5261-4b36-8507-e2049441a06b\" (UID: \"c3fcfef0-5261-4b36-8507-e2049441a06b\") " Oct 07 15:09:14 crc kubenswrapper[4854]: I1007 15:09:14.436346 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3fcfef0-5261-4b36-8507-e2049441a06b-utilities" (OuterVolumeSpecName: "utilities") pod "c3fcfef0-5261-4b36-8507-e2049441a06b" (UID: "c3fcfef0-5261-4b36-8507-e2049441a06b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:09:14 crc kubenswrapper[4854]: I1007 15:09:14.437418 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3fcfef0-5261-4b36-8507-e2049441a06b-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 15:09:14 crc kubenswrapper[4854]: I1007 15:09:14.469402 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3fcfef0-5261-4b36-8507-e2049441a06b-kube-api-access-vxv58" (OuterVolumeSpecName: "kube-api-access-vxv58") pod "c3fcfef0-5261-4b36-8507-e2049441a06b" (UID: "c3fcfef0-5261-4b36-8507-e2049441a06b"). InnerVolumeSpecName "kube-api-access-vxv58". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 15:09:14 crc kubenswrapper[4854]: I1007 15:09:14.496826 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3fcfef0-5261-4b36-8507-e2049441a06b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c3fcfef0-5261-4b36-8507-e2049441a06b" (UID: "c3fcfef0-5261-4b36-8507-e2049441a06b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:09:14 crc kubenswrapper[4854]: I1007 15:09:14.539015 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxv58\" (UniqueName: \"kubernetes.io/projected/c3fcfef0-5261-4b36-8507-e2049441a06b-kube-api-access-vxv58\") on node \"crc\" DevicePath \"\"" Oct 07 15:09:14 crc kubenswrapper[4854]: I1007 15:09:14.539048 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3fcfef0-5261-4b36-8507-e2049441a06b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 15:09:14 crc kubenswrapper[4854]: I1007 15:09:14.828499 4854 generic.go:334] "Generic (PLEG): container finished" podID="c3fcfef0-5261-4b36-8507-e2049441a06b" containerID="e372fdaa4c251ae56db592724a968b9fe4de2d1beb9e3f03b1893234b518af29" exitCode=0 Oct 07 15:09:14 crc kubenswrapper[4854]: I1007 15:09:14.828538 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hctn7" event={"ID":"c3fcfef0-5261-4b36-8507-e2049441a06b","Type":"ContainerDied","Data":"e372fdaa4c251ae56db592724a968b9fe4de2d1beb9e3f03b1893234b518af29"} Oct 07 15:09:14 crc kubenswrapper[4854]: I1007 15:09:14.828563 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hctn7" event={"ID":"c3fcfef0-5261-4b36-8507-e2049441a06b","Type":"ContainerDied","Data":"824c25d01d54c80160edd4c91679142820b21bb570d55dee444489d1eeb8a865"} Oct 07 15:09:14 crc kubenswrapper[4854]: I1007 15:09:14.828580 4854 scope.go:117] "RemoveContainer" containerID="e372fdaa4c251ae56db592724a968b9fe4de2d1beb9e3f03b1893234b518af29" Oct 07 15:09:14 crc kubenswrapper[4854]: I1007 15:09:14.828919 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hctn7" Oct 07 15:09:14 crc kubenswrapper[4854]: I1007 15:09:14.901834 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rc62w"] Oct 07 15:09:14 crc kubenswrapper[4854]: I1007 15:09:14.907255 4854 scope.go:117] "RemoveContainer" containerID="482dd1bf21d0dc91ea3252c574e856d2267f8429b11b9744e433ed81a165d1c7" Oct 07 15:09:14 crc kubenswrapper[4854]: I1007 15:09:14.938066 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hctn7"] Oct 07 15:09:14 crc kubenswrapper[4854]: I1007 15:09:14.945524 4854 scope.go:117] "RemoveContainer" containerID="cbcfd2ea95162bf4eee69dccb0c749aa530ec037a03102317b2b856b89701cf2" Oct 07 15:09:14 crc kubenswrapper[4854]: I1007 15:09:14.950491 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hctn7"] Oct 07 15:09:14 crc kubenswrapper[4854]: I1007 15:09:14.985253 4854 scope.go:117] "RemoveContainer" containerID="e372fdaa4c251ae56db592724a968b9fe4de2d1beb9e3f03b1893234b518af29" Oct 07 15:09:14 crc kubenswrapper[4854]: E1007 15:09:14.986656 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e372fdaa4c251ae56db592724a968b9fe4de2d1beb9e3f03b1893234b518af29\": container with ID starting with e372fdaa4c251ae56db592724a968b9fe4de2d1beb9e3f03b1893234b518af29 not found: ID does not exist" containerID="e372fdaa4c251ae56db592724a968b9fe4de2d1beb9e3f03b1893234b518af29" Oct 07 15:09:14 crc kubenswrapper[4854]: I1007 15:09:14.986710 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e372fdaa4c251ae56db592724a968b9fe4de2d1beb9e3f03b1893234b518af29"} err="failed to get container status \"e372fdaa4c251ae56db592724a968b9fe4de2d1beb9e3f03b1893234b518af29\": rpc error: code = NotFound desc = could not find container \"e372fdaa4c251ae56db592724a968b9fe4de2d1beb9e3f03b1893234b518af29\": container with ID starting with e372fdaa4c251ae56db592724a968b9fe4de2d1beb9e3f03b1893234b518af29 not found: ID does not exist" Oct 07 15:09:14 crc kubenswrapper[4854]: I1007 15:09:14.986741 4854 scope.go:117] "RemoveContainer" containerID="482dd1bf21d0dc91ea3252c574e856d2267f8429b11b9744e433ed81a165d1c7" Oct 07 15:09:14 crc kubenswrapper[4854]: E1007 15:09:14.987836 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"482dd1bf21d0dc91ea3252c574e856d2267f8429b11b9744e433ed81a165d1c7\": container with ID starting with 482dd1bf21d0dc91ea3252c574e856d2267f8429b11b9744e433ed81a165d1c7 not found: ID does not exist" containerID="482dd1bf21d0dc91ea3252c574e856d2267f8429b11b9744e433ed81a165d1c7" Oct 07 15:09:14 crc kubenswrapper[4854]: I1007 15:09:14.987999 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"482dd1bf21d0dc91ea3252c574e856d2267f8429b11b9744e433ed81a165d1c7"} err="failed to get container status \"482dd1bf21d0dc91ea3252c574e856d2267f8429b11b9744e433ed81a165d1c7\": rpc error: code = NotFound desc = could not find container \"482dd1bf21d0dc91ea3252c574e856d2267f8429b11b9744e433ed81a165d1c7\": container with ID starting with 482dd1bf21d0dc91ea3252c574e856d2267f8429b11b9744e433ed81a165d1c7 not found: ID does not exist" Oct 07 15:09:14 crc kubenswrapper[4854]: I1007 15:09:14.988032 4854 scope.go:117] "RemoveContainer" 
containerID="cbcfd2ea95162bf4eee69dccb0c749aa530ec037a03102317b2b856b89701cf2" Oct 07 15:09:14 crc kubenswrapper[4854]: E1007 15:09:14.989970 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbcfd2ea95162bf4eee69dccb0c749aa530ec037a03102317b2b856b89701cf2\": container with ID starting with cbcfd2ea95162bf4eee69dccb0c749aa530ec037a03102317b2b856b89701cf2 not found: ID does not exist" containerID="cbcfd2ea95162bf4eee69dccb0c749aa530ec037a03102317b2b856b89701cf2" Oct 07 15:09:14 crc kubenswrapper[4854]: I1007 15:09:14.989997 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbcfd2ea95162bf4eee69dccb0c749aa530ec037a03102317b2b856b89701cf2"} err="failed to get container status \"cbcfd2ea95162bf4eee69dccb0c749aa530ec037a03102317b2b856b89701cf2\": rpc error: code = NotFound desc = could not find container \"cbcfd2ea95162bf4eee69dccb0c749aa530ec037a03102317b2b856b89701cf2\": container with ID starting with cbcfd2ea95162bf4eee69dccb0c749aa530ec037a03102317b2b856b89701cf2 not found: ID does not exist" Oct 07 15:09:15 crc kubenswrapper[4854]: I1007 15:09:15.220891 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rvktg"] Oct 07 15:09:15 crc kubenswrapper[4854]: I1007 15:09:15.221187 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rvktg" podUID="d9fd837b-e039-4782-8fa7-272f368cc9cd" containerName="registry-server" containerID="cri-o://6af263cee8cd6e2a56ca9b03164e63c28347aadfb6812058fc8e42504ee00bae" gracePeriod=2 Oct 07 15:09:15 crc kubenswrapper[4854]: I1007 15:09:15.840130 4854 generic.go:334] "Generic (PLEG): container finished" podID="d9fd837b-e039-4782-8fa7-272f368cc9cd" containerID="6af263cee8cd6e2a56ca9b03164e63c28347aadfb6812058fc8e42504ee00bae" exitCode=0 Oct 07 15:09:15 crc kubenswrapper[4854]: I1007 15:09:15.840194 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rvktg" event={"ID":"d9fd837b-e039-4782-8fa7-272f368cc9cd","Type":"ContainerDied","Data":"6af263cee8cd6e2a56ca9b03164e63c28347aadfb6812058fc8e42504ee00bae"} Oct 07 15:09:16 crc kubenswrapper[4854]: I1007 15:09:16.259288 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rvktg" Oct 07 15:09:16 crc kubenswrapper[4854]: I1007 15:09:16.311336 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9fd837b-e039-4782-8fa7-272f368cc9cd-catalog-content\") pod \"d9fd837b-e039-4782-8fa7-272f368cc9cd\" (UID: \"d9fd837b-e039-4782-8fa7-272f368cc9cd\") " Oct 07 15:09:16 crc kubenswrapper[4854]: I1007 15:09:16.311385 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9fd837b-e039-4782-8fa7-272f368cc9cd-utilities\") pod \"d9fd837b-e039-4782-8fa7-272f368cc9cd\" (UID: \"d9fd837b-e039-4782-8fa7-272f368cc9cd\") " Oct 07 15:09:16 crc kubenswrapper[4854]: I1007 15:09:16.311583 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bx6qf\" (UniqueName: \"kubernetes.io/projected/d9fd837b-e039-4782-8fa7-272f368cc9cd-kube-api-access-bx6qf\") pod \"d9fd837b-e039-4782-8fa7-272f368cc9cd\" (UID: \"d9fd837b-e039-4782-8fa7-272f368cc9cd\") " Oct 07 15:09:16 crc kubenswrapper[4854]: I1007 15:09:16.313449 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9fd837b-e039-4782-8fa7-272f368cc9cd-utilities" (OuterVolumeSpecName: "utilities") pod "d9fd837b-e039-4782-8fa7-272f368cc9cd" (UID: "d9fd837b-e039-4782-8fa7-272f368cc9cd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:09:16 crc kubenswrapper[4854]: I1007 15:09:16.328856 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9fd837b-e039-4782-8fa7-272f368cc9cd-kube-api-access-bx6qf" (OuterVolumeSpecName: "kube-api-access-bx6qf") pod "d9fd837b-e039-4782-8fa7-272f368cc9cd" (UID: "d9fd837b-e039-4782-8fa7-272f368cc9cd"). InnerVolumeSpecName "kube-api-access-bx6qf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 15:09:16 crc kubenswrapper[4854]: I1007 15:09:16.382578 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9fd837b-e039-4782-8fa7-272f368cc9cd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d9fd837b-e039-4782-8fa7-272f368cc9cd" (UID: "d9fd837b-e039-4782-8fa7-272f368cc9cd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:09:16 crc kubenswrapper[4854]: I1007 15:09:16.415508 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bx6qf\" (UniqueName: \"kubernetes.io/projected/d9fd837b-e039-4782-8fa7-272f368cc9cd-kube-api-access-bx6qf\") on node \"crc\" DevicePath \"\"" Oct 07 15:09:16 crc kubenswrapper[4854]: I1007 15:09:16.415549 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9fd837b-e039-4782-8fa7-272f368cc9cd-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 15:09:16 crc kubenswrapper[4854]: I1007 15:09:16.415562 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9fd837b-e039-4782-8fa7-272f368cc9cd-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 15:09:16 crc kubenswrapper[4854]: I1007 15:09:16.725024 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3fcfef0-5261-4b36-8507-e2049441a06b" path="/var/lib/kubelet/pods/c3fcfef0-5261-4b36-8507-e2049441a06b/volumes" Oct 07 15:09:16 crc kubenswrapper[4854]: I1007 15:09:16.857892 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rvktg" event={"ID":"d9fd837b-e039-4782-8fa7-272f368cc9cd","Type":"ContainerDied","Data":"3400c1de6b29ab77f339fc7237b3488743bb9ffa1bd34c0ca443fab73ccc23dd"} Oct 07 15:09:16 crc kubenswrapper[4854]: I1007 15:09:16.857967 4854 scope.go:117] "RemoveContainer" containerID="6af263cee8cd6e2a56ca9b03164e63c28347aadfb6812058fc8e42504ee00bae" Oct 07 15:09:16 crc kubenswrapper[4854]: I1007 15:09:16.857976 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rvktg" Oct 07 15:09:16 crc kubenswrapper[4854]: I1007 15:09:16.895928 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rvktg"] Oct 07 15:09:16 crc kubenswrapper[4854]: I1007 15:09:16.904714 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rvktg"] Oct 07 15:09:16 crc kubenswrapper[4854]: I1007 15:09:16.911086 4854 scope.go:117] "RemoveContainer" containerID="ce19bb1f5b13bd5c033c578b4cef57cf5881c876b7e2938850502d8660340225" Oct 07 15:09:16 crc kubenswrapper[4854]: I1007 15:09:16.987522 4854 scope.go:117] "RemoveContainer" containerID="72d3f007478b952194d08c4b1d7617d92263456a89b670ef5d8a3cac50d3a1a4" Oct 07 15:09:18 crc kubenswrapper[4854]: I1007 15:09:18.718596 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9fd837b-e039-4782-8fa7-272f368cc9cd" path="/var/lib/kubelet/pods/d9fd837b-e039-4782-8fa7-272f368cc9cd/volumes" Oct 07 15:09:20 crc kubenswrapper[4854]: I1007 15:09:20.702720 4854 scope.go:117] "RemoveContainer" containerID="c9aa2a3ee4561b0972ff5e54f922a75687719cdae51f1e918393112c2e5d05e1" Oct 07 15:09:20 crc kubenswrapper[4854]: E1007 15:09:20.703250 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 15:09:31 crc kubenswrapper[4854]: I1007 15:09:31.705006 4854 scope.go:117] "RemoveContainer" 
containerID="c9aa2a3ee4561b0972ff5e54f922a75687719cdae51f1e918393112c2e5d05e1" Oct 07 15:09:31 crc kubenswrapper[4854]: E1007 15:09:31.705734 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 15:09:43 crc kubenswrapper[4854]: I1007 15:09:43.703215 4854 scope.go:117] "RemoveContainer" containerID="c9aa2a3ee4561b0972ff5e54f922a75687719cdae51f1e918393112c2e5d05e1" Oct 07 15:09:44 crc kubenswrapper[4854]: I1007 15:09:44.161112 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" event={"ID":"40b8b82d-cfd5-41d7-8673-5774db092c85","Type":"ContainerStarted","Data":"6f17d4c8ac8a72864ae50a4f170092990db3a5ea9c1c86e99d2e15789436580e"} Oct 07 15:09:56 crc kubenswrapper[4854]: I1007 15:09:56.154283 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vnj8v/must-gather-nkk6l"] Oct 07 15:09:56 crc kubenswrapper[4854]: E1007 15:09:56.155390 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3fcfef0-5261-4b36-8507-e2049441a06b" containerName="extract-utilities" Oct 07 15:09:56 crc kubenswrapper[4854]: I1007 15:09:56.155405 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3fcfef0-5261-4b36-8507-e2049441a06b" containerName="extract-utilities" Oct 07 15:09:56 crc kubenswrapper[4854]: E1007 15:09:56.155448 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9fd837b-e039-4782-8fa7-272f368cc9cd" containerName="extract-content" Oct 07 15:09:56 crc kubenswrapper[4854]: I1007 15:09:56.155456 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9fd837b-e039-4782-8fa7-272f368cc9cd" containerName="extract-content" Oct 07 15:09:56 crc kubenswrapper[4854]: E1007 15:09:56.155472 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9fd837b-e039-4782-8fa7-272f368cc9cd" containerName="extract-utilities" Oct 07 15:09:56 crc kubenswrapper[4854]: I1007 15:09:56.155480 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9fd837b-e039-4782-8fa7-272f368cc9cd" containerName="extract-utilities" Oct 07 15:09:56 crc kubenswrapper[4854]: E1007 15:09:56.155494 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3fcfef0-5261-4b36-8507-e2049441a06b" containerName="registry-server" Oct 07 15:09:56 crc kubenswrapper[4854]: I1007 15:09:56.155501 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3fcfef0-5261-4b36-8507-e2049441a06b" containerName="registry-server" Oct 07 15:09:56 crc kubenswrapper[4854]: E1007 15:09:56.155517 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3fcfef0-5261-4b36-8507-e2049441a06b" containerName="extract-content" Oct 07 15:09:56 crc kubenswrapper[4854]: I1007 15:09:56.155525 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3fcfef0-5261-4b36-8507-e2049441a06b" containerName="extract-content" Oct 07 15:09:56 crc kubenswrapper[4854]: E1007 15:09:56.155539 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9fd837b-e039-4782-8fa7-272f368cc9cd" containerName="registry-server" Oct 07 15:09:56 crc kubenswrapper[4854]: I1007 15:09:56.155546 4854 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="d9fd837b-e039-4782-8fa7-272f368cc9cd" containerName="registry-server" Oct 07 15:09:56 crc kubenswrapper[4854]: I1007 15:09:56.155852 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3fcfef0-5261-4b36-8507-e2049441a06b" containerName="registry-server" Oct 07 15:09:56 crc kubenswrapper[4854]: I1007 15:09:56.155886 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9fd837b-e039-4782-8fa7-272f368cc9cd" containerName="registry-server" Oct 07 15:09:56 crc kubenswrapper[4854]: I1007 15:09:56.157529 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vnj8v/must-gather-nkk6l" Oct 07 15:09:56 crc kubenswrapper[4854]: I1007 15:09:56.160213 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-vnj8v"/"openshift-service-ca.crt" Oct 07 15:09:56 crc kubenswrapper[4854]: I1007 15:09:56.160283 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-vnj8v"/"kube-root-ca.crt" Oct 07 15:09:56 crc kubenswrapper[4854]: I1007 15:09:56.174564 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vnj8v/must-gather-nkk6l"] Oct 07 15:09:56 crc kubenswrapper[4854]: I1007 15:09:56.270138 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/615e70f0-2488-42bc-9983-aa68a2d699bd-must-gather-output\") pod \"must-gather-nkk6l\" (UID: \"615e70f0-2488-42bc-9983-aa68a2d699bd\") " pod="openshift-must-gather-vnj8v/must-gather-nkk6l" Oct 07 15:09:56 crc kubenswrapper[4854]: I1007 15:09:56.270235 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2d2x\" (UniqueName: \"kubernetes.io/projected/615e70f0-2488-42bc-9983-aa68a2d699bd-kube-api-access-t2d2x\") pod \"must-gather-nkk6l\" (UID: \"615e70f0-2488-42bc-9983-aa68a2d699bd\") " pod="openshift-must-gather-vnj8v/must-gather-nkk6l" Oct 07 15:09:56 crc kubenswrapper[4854]: I1007 15:09:56.372511 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/615e70f0-2488-42bc-9983-aa68a2d699bd-must-gather-output\") pod \"must-gather-nkk6l\" (UID: \"615e70f0-2488-42bc-9983-aa68a2d699bd\") " pod="openshift-must-gather-vnj8v/must-gather-nkk6l" Oct 07 15:09:56 crc kubenswrapper[4854]: I1007 15:09:56.372575 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2d2x\" (UniqueName: \"kubernetes.io/projected/615e70f0-2488-42bc-9983-aa68a2d699bd-kube-api-access-t2d2x\") pod \"must-gather-nkk6l\" (UID: \"615e70f0-2488-42bc-9983-aa68a2d699bd\") " pod="openshift-must-gather-vnj8v/must-gather-nkk6l" Oct 07 15:09:56 crc kubenswrapper[4854]: I1007 15:09:56.372992 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/615e70f0-2488-42bc-9983-aa68a2d699bd-must-gather-output\") pod \"must-gather-nkk6l\" (UID: \"615e70f0-2488-42bc-9983-aa68a2d699bd\") " pod="openshift-must-gather-vnj8v/must-gather-nkk6l" Oct 07 15:09:56 crc kubenswrapper[4854]: I1007 15:09:56.391074 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2d2x\" (UniqueName: \"kubernetes.io/projected/615e70f0-2488-42bc-9983-aa68a2d699bd-kube-api-access-t2d2x\") pod \"must-gather-nkk6l\" (UID: 
\"615e70f0-2488-42bc-9983-aa68a2d699bd\") " pod="openshift-must-gather-vnj8v/must-gather-nkk6l" Oct 07 15:09:56 crc kubenswrapper[4854]: I1007 15:09:56.484095 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vnj8v/must-gather-nkk6l" Oct 07 15:09:56 crc kubenswrapper[4854]: I1007 15:09:56.960763 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vnj8v/must-gather-nkk6l"] Oct 07 15:09:57 crc kubenswrapper[4854]: I1007 15:09:57.302305 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vnj8v/must-gather-nkk6l" event={"ID":"615e70f0-2488-42bc-9983-aa68a2d699bd","Type":"ContainerStarted","Data":"7aae8fc31def99bcef1531b831a5a845368e56c9987ed1a34e27c69442d9b067"} Oct 07 15:10:04 crc kubenswrapper[4854]: I1007 15:10:04.381528 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vnj8v/must-gather-nkk6l" event={"ID":"615e70f0-2488-42bc-9983-aa68a2d699bd","Type":"ContainerStarted","Data":"41bcd2afd5423398a6a7776d518eda9adedca10e4a8fd8d9799f4348681dd803"} Oct 07 15:10:04 crc kubenswrapper[4854]: I1007 15:10:04.382126 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vnj8v/must-gather-nkk6l" event={"ID":"615e70f0-2488-42bc-9983-aa68a2d699bd","Type":"ContainerStarted","Data":"356037d7fb2faab4a63b5ade74cb91cda4c4acd492dd1603aa97206836a3dca5"} Oct 07 15:10:04 crc kubenswrapper[4854]: I1007 15:10:04.394309 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vnj8v/must-gather-nkk6l" podStartSLOduration=2.048452616 podStartE2EDuration="8.39429056s" podCreationTimestamp="2025-10-07 15:09:56 +0000 UTC" firstStartedPulling="2025-10-07 15:09:56.962836901 +0000 UTC m=+9912.950669156" lastFinishedPulling="2025-10-07 15:10:03.308674845 +0000 UTC m=+9919.296507100" observedRunningTime="2025-10-07 15:10:04.392920691 +0000 UTC m=+9920.380752956" watchObservedRunningTime="2025-10-07 15:10:04.39429056 +0000 UTC m=+9920.382122815" Oct 07 15:10:08 crc kubenswrapper[4854]: I1007 15:10:08.019963 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vnj8v/crc-debug-pmhvt"] Oct 07 15:10:08 crc kubenswrapper[4854]: I1007 15:10:08.021671 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vnj8v/crc-debug-pmhvt" Oct 07 15:10:08 crc kubenswrapper[4854]: I1007 15:10:08.023389 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-vnj8v"/"default-dockercfg-k94sl" Oct 07 15:10:08 crc kubenswrapper[4854]: I1007 15:10:08.077747 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dbb4129d-96de-47ea-9c49-b52403398164-host\") pod \"crc-debug-pmhvt\" (UID: \"dbb4129d-96de-47ea-9c49-b52403398164\") " pod="openshift-must-gather-vnj8v/crc-debug-pmhvt" Oct 07 15:10:08 crc kubenswrapper[4854]: I1007 15:10:08.078069 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqfdn\" (UniqueName: \"kubernetes.io/projected/dbb4129d-96de-47ea-9c49-b52403398164-kube-api-access-vqfdn\") pod \"crc-debug-pmhvt\" (UID: \"dbb4129d-96de-47ea-9c49-b52403398164\") " pod="openshift-must-gather-vnj8v/crc-debug-pmhvt" Oct 07 15:10:08 crc kubenswrapper[4854]: I1007 15:10:08.180636 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dbb4129d-96de-47ea-9c49-b52403398164-host\") pod \"crc-debug-pmhvt\" (UID: \"dbb4129d-96de-47ea-9c49-b52403398164\") " pod="openshift-must-gather-vnj8v/crc-debug-pmhvt" Oct 07 15:10:08 crc kubenswrapper[4854]: I1007 15:10:08.180781 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqfdn\" (UniqueName: \"kubernetes.io/projected/dbb4129d-96de-47ea-9c49-b52403398164-kube-api-access-vqfdn\") pod \"crc-debug-pmhvt\" (UID: \"dbb4129d-96de-47ea-9c49-b52403398164\") " pod="openshift-must-gather-vnj8v/crc-debug-pmhvt" Oct 07 15:10:08 crc kubenswrapper[4854]: I1007 15:10:08.181247 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dbb4129d-96de-47ea-9c49-b52403398164-host\") pod \"crc-debug-pmhvt\" (UID: \"dbb4129d-96de-47ea-9c49-b52403398164\") " pod="openshift-must-gather-vnj8v/crc-debug-pmhvt" Oct 07 15:10:08 crc kubenswrapper[4854]: I1007 15:10:08.206941 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqfdn\" (UniqueName: \"kubernetes.io/projected/dbb4129d-96de-47ea-9c49-b52403398164-kube-api-access-vqfdn\") pod \"crc-debug-pmhvt\" (UID: \"dbb4129d-96de-47ea-9c49-b52403398164\") " pod="openshift-must-gather-vnj8v/crc-debug-pmhvt" Oct 07 15:10:08 crc kubenswrapper[4854]: I1007 15:10:08.345376 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vnj8v/crc-debug-pmhvt" Oct 07 15:10:08 crc kubenswrapper[4854]: I1007 15:10:08.424635 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vnj8v/crc-debug-pmhvt" event={"ID":"dbb4129d-96de-47ea-9c49-b52403398164","Type":"ContainerStarted","Data":"c8870d4f0911dcd78ba05479c15f1027106b03090f5a23ba5b11c690706b93a1"} Oct 07 15:10:22 crc kubenswrapper[4854]: I1007 15:10:22.592615 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vnj8v/crc-debug-pmhvt" event={"ID":"dbb4129d-96de-47ea-9c49-b52403398164","Type":"ContainerStarted","Data":"37ead8063b67b89ec34d379de1b368a4aea75cb313dd4dcd9dca44a01d5a82dd"} Oct 07 15:10:22 crc kubenswrapper[4854]: I1007 15:10:22.614949 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vnj8v/crc-debug-pmhvt" podStartSLOduration=0.700449957 podStartE2EDuration="14.614928252s" podCreationTimestamp="2025-10-07 15:10:08 +0000 UTC" firstStartedPulling="2025-10-07 15:10:08.394569181 +0000 UTC m=+9924.382401436" lastFinishedPulling="2025-10-07 15:10:22.309047456 +0000 UTC m=+9938.296879731" observedRunningTime="2025-10-07 15:10:22.606002613 +0000 UTC m=+9938.593834868" watchObservedRunningTime="2025-10-07 15:10:22.614928252 +0000 UTC m=+9938.602760507" Oct 07 15:11:52 crc kubenswrapper[4854]: I1007 15:11:52.861829 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_cc17a7f7-bac6-4b57-bf18-1f3110b14f29/init-config-reloader/0.log" Oct 07 15:11:53 crc kubenswrapper[4854]: I1007 15:11:53.168031 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_cc17a7f7-bac6-4b57-bf18-1f3110b14f29/init-config-reloader/0.log" Oct 07 15:11:53 crc kubenswrapper[4854]: I1007 15:11:53.478733 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_cc17a7f7-bac6-4b57-bf18-1f3110b14f29/config-reloader/0.log" Oct 07 15:11:53 crc kubenswrapper[4854]: I1007 15:11:53.569511 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_cc17a7f7-bac6-4b57-bf18-1f3110b14f29/alertmanager/0.log" Oct 07 15:11:53 crc kubenswrapper[4854]: I1007 15:11:53.849322 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_c233cebb-5c68-4c2f-b875-5c26e2af4d6b/aodh-api/0.log" Oct 07 15:11:54 crc kubenswrapper[4854]: I1007 15:11:54.028876 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_c233cebb-5c68-4c2f-b875-5c26e2af4d6b/aodh-evaluator/0.log" Oct 07 15:11:54 crc kubenswrapper[4854]: I1007 15:11:54.197534 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_c233cebb-5c68-4c2f-b875-5c26e2af4d6b/aodh-listener/0.log" Oct 07 15:11:54 crc kubenswrapper[4854]: I1007 15:11:54.387047 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_c233cebb-5c68-4c2f-b875-5c26e2af4d6b/aodh-notifier/0.log" Oct 07 15:11:54 crc kubenswrapper[4854]: I1007 15:11:54.619923 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-574d8f6594-5smkj_745402b7-4980-4af2-9c25-3195444f8960/barbican-api/0.log" Oct 07 15:11:54 crc kubenswrapper[4854]: I1007 15:11:54.758182 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-574d8f6594-5smkj_745402b7-4980-4af2-9c25-3195444f8960/barbican-api-log/0.log" Oct 07 15:11:54 crc 
kubenswrapper[4854]: I1007 15:11:54.963523 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6c6fc899c8-gbvnz_43fe2335-824f-4cad-bfff-ea8487237d61/barbican-keystone-listener/0.log" Oct 07 15:11:55 crc kubenswrapper[4854]: I1007 15:11:55.107326 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6c6fc899c8-gbvnz_43fe2335-824f-4cad-bfff-ea8487237d61/barbican-keystone-listener-log/0.log" Oct 07 15:11:55 crc kubenswrapper[4854]: I1007 15:11:55.157354 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6cb9795ff9-628qg_d7636878-b435-49bd-850a-c610d62c62fe/barbican-worker/0.log" Oct 07 15:11:55 crc kubenswrapper[4854]: I1007 15:11:55.482318 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6cb9795ff9-628qg_d7636878-b435-49bd-850a-c610d62c62fe/barbican-worker-log/0.log" Oct 07 15:11:55 crc kubenswrapper[4854]: I1007 15:11:55.614744 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-openstack-openstack-cell1-wsj8v_8727c9f1-ed17-4e2e-9124-bcbe7efbfcfa/bootstrap-openstack-openstack-cell1/0.log" Oct 07 15:11:55 crc kubenswrapper[4854]: I1007 15:11:55.801102 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_adaa0e6d-633e-469a-9327-ceb526997466/ceilometer-central-agent/0.log" Oct 07 15:11:55 crc kubenswrapper[4854]: I1007 15:11:55.856823 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_adaa0e6d-633e-469a-9327-ceb526997466/ceilometer-notification-agent/0.log" Oct 07 15:11:55 crc kubenswrapper[4854]: I1007 15:11:55.973268 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_adaa0e6d-633e-469a-9327-ceb526997466/proxy-httpd/0.log" Oct 07 15:11:55 crc kubenswrapper[4854]: I1007 15:11:55.973714 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_adaa0e6d-633e-469a-9327-ceb526997466/sg-core/0.log" Oct 07 15:11:56 crc kubenswrapper[4854]: I1007 15:11:56.167030 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-openstack-openstack-cell1-ql58q_85d924b2-2173-4e12-b922-28d8b0a2ef2e/ceph-client-openstack-openstack-cell1/0.log" Oct 07 15:11:56 crc kubenswrapper[4854]: I1007 15:11:56.344091 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_9ebc9353-0fbe-4d1f-8e95-7b3a716adc28/cinder-api/0.log" Oct 07 15:11:56 crc kubenswrapper[4854]: I1007 15:11:56.367920 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_9ebc9353-0fbe-4d1f-8e95-7b3a716adc28/cinder-api-log/0.log" Oct 07 15:11:56 crc kubenswrapper[4854]: I1007 15:11:56.628369 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_c03852fe-4f34-4fff-b7a4-7063ce3d2f29/cinder-backup/0.log" Oct 07 15:11:56 crc kubenswrapper[4854]: I1007 15:11:56.693920 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_c03852fe-4f34-4fff-b7a4-7063ce3d2f29/probe/0.log" Oct 07 15:11:56 crc kubenswrapper[4854]: I1007 15:11:56.821573 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_5a6c31e6-c617-401c-a6c5-c76f945460b7/cinder-scheduler/0.log" Oct 07 15:11:56 crc kubenswrapper[4854]: I1007 15:11:56.947334 4854 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-scheduler-0_5a6c31e6-c617-401c-a6c5-c76f945460b7/probe/0.log" Oct 07 15:11:57 crc kubenswrapper[4854]: I1007 15:11:57.090090 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_c22041bc-4e60-4e0c-8209-856fb1e2ba7a/cinder-volume/0.log" Oct 07 15:11:57 crc kubenswrapper[4854]: I1007 15:11:57.211762 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_c22041bc-4e60-4e0c-8209-856fb1e2ba7a/probe/0.log" Oct 07 15:11:57 crc kubenswrapper[4854]: I1007 15:11:57.348461 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-openstack-openstack-cell1-2tv87_ead9f298-61ce-4835-b285-6df1dd26b9e5/configure-network-openstack-openstack-cell1/0.log" Oct 07 15:11:57 crc kubenswrapper[4854]: I1007 15:11:57.585519 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-openstack-openstack-cell1-w4qv8_702bf0e1-c6e1-4718-8520-bf22b1aa913f/configure-os-openstack-openstack-cell1/0.log" Oct 07 15:11:57 crc kubenswrapper[4854]: I1007 15:11:57.705741 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-57d9d5d775-vnbhg_e84d3bb1-eb1a-4ced-9dd8-02d1cdc8a7df/init/0.log" Oct 07 15:11:57 crc kubenswrapper[4854]: I1007 15:11:57.850055 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-57d9d5d775-vnbhg_e84d3bb1-eb1a-4ced-9dd8-02d1cdc8a7df/init/0.log" Oct 07 15:11:57 crc kubenswrapper[4854]: I1007 15:11:57.916767 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-57d9d5d775-vnbhg_e84d3bb1-eb1a-4ced-9dd8-02d1cdc8a7df/dnsmasq-dns/0.log" Oct 07 15:11:58 crc kubenswrapper[4854]: I1007 15:11:58.051551 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-openstack-openstack-cell1-7kss6_7ab6fdb3-f2ee-4c72-bc19-2a62eb51b14a/download-cache-openstack-openstack-cell1/0.log" Oct 07 15:11:58 crc kubenswrapper[4854]: I1007 15:11:58.198823 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_cf9aa2b1-2b05-44ab-acfc-09927fda7603/glance-httpd/0.log" Oct 07 15:11:58 crc kubenswrapper[4854]: I1007 15:11:58.939520 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_cf9aa2b1-2b05-44ab-acfc-09927fda7603/glance-log/0.log" Oct 07 15:11:58 crc kubenswrapper[4854]: I1007 15:11:58.967796 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_7c13c39f-8f80-4b98-884e-0bde905ab6f9/glance-httpd/0.log" Oct 07 15:11:59 crc kubenswrapper[4854]: I1007 15:11:59.124662 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_7c13c39f-8f80-4b98-884e-0bde905ab6f9/glance-log/0.log" Oct 07 15:11:59 crc kubenswrapper[4854]: I1007 15:11:59.282209 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-6df4b7cd4d-c88k6_847a28d3-c688-4de9-8e03-5956cfdc1dd2/heat-api/0.log" Oct 07 15:11:59 crc kubenswrapper[4854]: I1007 15:11:59.522717 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-fcbd8b89f-8c6fd_4072ba4e-6d99-4149-be5d-fe68ccfd5622/heat-cfnapi/0.log" Oct 07 15:11:59 crc kubenswrapper[4854]: I1007 15:11:59.657044 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-79b95db96d-7jdbk_9f632003-d94c-4443-ab13-a0a3f1b50647/heat-engine/0.log" Oct 07 
15:11:59 crc kubenswrapper[4854]: I1007 15:11:59.872917 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6fbd55998f-s8xh4_1d5b602a-fb2f-4b4a-8170-d64ee1e29f27/horizon/0.log" Oct 07 15:12:00 crc kubenswrapper[4854]: I1007 15:12:00.000717 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6fbd55998f-s8xh4_1d5b602a-fb2f-4b4a-8170-d64ee1e29f27/horizon-log/0.log" Oct 07 15:12:00 crc kubenswrapper[4854]: I1007 15:12:00.079632 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-openstack-openstack-cell1-kspw4_0549aa20-8f5f-4a3a-8c24-767fb2c69c65/install-certs-openstack-openstack-cell1/0.log" Oct 07 15:12:00 crc kubenswrapper[4854]: I1007 15:12:00.723783 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-openstack-openstack-cell1-hldrq_67034203-e483-4499-a262-82cd715d1459/install-os-openstack-openstack-cell1/0.log" Oct 07 15:12:01 crc kubenswrapper[4854]: I1007 15:12:01.044657 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29330761-b8lkv_52399450-0564-41c2-86d5-f9b533025254/keystone-cron/0.log" Oct 07 15:12:01 crc kubenswrapper[4854]: I1007 15:12:01.118517 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-845977846b-69rp6_ae5b44ef-2398-488e-bcef-c88d09cea90a/keystone-api/0.log" Oct 07 15:12:01 crc kubenswrapper[4854]: I1007 15:12:01.253482 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29330821-clqdr_5b5efd4b-3bc2-4b2f-864e-a5d64d84e593/keystone-cron/0.log" Oct 07 15:12:01 crc kubenswrapper[4854]: I1007 15:12:01.449355 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_c33e1e79-872b-4328-a544-f34779689934/kube-state-metrics/0.log" Oct 07 15:12:01 crc kubenswrapper[4854]: I1007 15:12:01.610708 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-openstack-openstack-cell1-6h9d9_42bf907e-8047-4f86-99df-41a920bec529/libvirt-openstack-openstack-cell1/0.log" Oct 07 15:12:01 crc kubenswrapper[4854]: I1007 15:12:01.821135 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_f2308954-dc7b-4806-8cb0-171b0fc0de08/manila-api-log/0.log" Oct 07 15:12:01 crc kubenswrapper[4854]: I1007 15:12:01.857548 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_f2308954-dc7b-4806-8cb0-171b0fc0de08/manila-api/0.log" Oct 07 15:12:02 crc kubenswrapper[4854]: I1007 15:12:02.069434 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_660e5edd-fac1-47c5-855e-d1f5dc5aa455/manila-scheduler/0.log" Oct 07 15:12:02 crc kubenswrapper[4854]: I1007 15:12:02.077467 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_660e5edd-fac1-47c5-855e-d1f5dc5aa455/probe/0.log" Oct 07 15:12:02 crc kubenswrapper[4854]: I1007 15:12:02.265598 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_d6d3a535-80ee-43a0-8f03-30206d07d28c/probe/0.log" Oct 07 15:12:02 crc kubenswrapper[4854]: I1007 15:12:02.297557 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_d6d3a535-80ee-43a0-8f03-30206d07d28c/manila-share/0.log" Oct 07 15:12:02 crc kubenswrapper[4854]: I1007 15:12:02.466405 4854 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_mariadb-copy-data_991f7c53-8762-472e-b968-3b1a8fc55d8c/adoption/0.log" Oct 07 15:12:02 crc kubenswrapper[4854]: I1007 15:12:02.871048 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6d49dfff85-kb4jj_8fe6746e-d7d8-41be-bb1a-63f0aa67044a/neutron-api/0.log" Oct 07 15:12:03 crc kubenswrapper[4854]: I1007 15:12:03.027420 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6d49dfff85-kb4jj_8fe6746e-d7d8-41be-bb1a-63f0aa67044a/neutron-httpd/0.log" Oct 07 15:12:03 crc kubenswrapper[4854]: I1007 15:12:03.552779 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-dhcp-openstack-openstack-cell1-ht5md_d7ff666b-5204-4865-9ba1-924d47a82480/neutron-dhcp-openstack-openstack-cell1/0.log" Oct 07 15:12:03 crc kubenswrapper[4854]: I1007 15:12:03.801624 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-openstack-openstack-cell1-vvd45_fddef710-1c6d-44cc-8184-613bf1aff29e/neutron-metadata-openstack-openstack-cell1/0.log" Oct 07 15:12:04 crc kubenswrapper[4854]: I1007 15:12:04.108894 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-sriov-openstack-openstack-cell1-zrspg_bfe44bd6-7978-41f7-afa8-bdfcafad2b49/neutron-sriov-openstack-openstack-cell1/0.log" Oct 07 15:12:04 crc kubenswrapper[4854]: I1007 15:12:04.495878 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_f891daf4-fdec-4042-a6d7-e2b6519d69d4/nova-api-api/0.log" Oct 07 15:12:04 crc kubenswrapper[4854]: I1007 15:12:04.602418 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_f891daf4-fdec-4042-a6d7-e2b6519d69d4/nova-api-log/0.log" Oct 07 15:12:05 crc kubenswrapper[4854]: I1007 15:12:05.048680 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_2b78f233-a649-4acd-a7fd-da9e1932d230/nova-cell0-conductor-conductor/0.log" Oct 07 15:12:05 crc kubenswrapper[4854]: I1007 15:12:05.349584 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_e0825000-748f-40a8-a3b8-c1009d4b9f9e/nova-cell1-conductor-conductor/0.log" Oct 07 15:12:05 crc kubenswrapper[4854]: I1007 15:12:05.701459 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_5fa2efaa-eac2-4dbe-8e97-0c053d3f3d92/nova-cell1-novncproxy-novncproxy/0.log" Oct 07 15:12:05 crc kubenswrapper[4854]: I1007 15:12:05.875274 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_277cd1b6-6ae4-48d5-9a5b-a6c314a11464/memcached/0.log" Oct 07 15:12:06 crc kubenswrapper[4854]: I1007 15:12:06.078125 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqxctp_0913c01c-83f0-4041-a160-3ab1f63d15f3/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1/0.log" Oct 07 15:12:06 crc kubenswrapper[4854]: I1007 15:12:06.276691 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-openstack-cell1-ktmw7_e9d7777b-b10c-44e6-970c-34dec79d193e/nova-cell1-openstack-openstack-cell1/0.log" Oct 07 15:12:06 crc kubenswrapper[4854]: I1007 15:12:06.553916 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_f3cd66cc-3ccf-4f55-ae63-f6874ca0b10c/nova-metadata-metadata/0.log" Oct 07 15:12:06 crc kubenswrapper[4854]: I1007 15:12:06.575905 4854 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_nova-metadata-0_f3cd66cc-3ccf-4f55-ae63-f6874ca0b10c/nova-metadata-log/0.log" Oct 07 15:12:06 crc kubenswrapper[4854]: I1007 15:12:06.815787 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_2c745615-bc65-4a81-9cd0-04aeb6dc7dd1/nova-scheduler-scheduler/0.log" Oct 07 15:12:06 crc kubenswrapper[4854]: I1007 15:12:06.828272 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-7ff49dfc98-fd4k5_ff25e0e6-68d6-4c15-8ba6-8582764830ce/init/0.log" Oct 07 15:12:07 crc kubenswrapper[4854]: I1007 15:12:07.003992 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-7ff49dfc98-fd4k5_ff25e0e6-68d6-4c15-8ba6-8582764830ce/init/0.log" Oct 07 15:12:07 crc kubenswrapper[4854]: I1007 15:12:07.065740 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-7ff49dfc98-fd4k5_ff25e0e6-68d6-4c15-8ba6-8582764830ce/octavia-api-provider-agent/0.log" Oct 07 15:12:07 crc kubenswrapper[4854]: I1007 15:12:07.204383 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-7ff49dfc98-fd4k5_ff25e0e6-68d6-4c15-8ba6-8582764830ce/octavia-api/0.log" Oct 07 15:12:07 crc kubenswrapper[4854]: I1007 15:12:07.239113 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-z8lh4_7ec333a0-79da-4419-b190-d49fe761f40e/init/0.log" Oct 07 15:12:07 crc kubenswrapper[4854]: I1007 15:12:07.578619 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-z8lh4_7ec333a0-79da-4419-b190-d49fe761f40e/init/0.log" Oct 07 15:12:07 crc kubenswrapper[4854]: I1007 15:12:07.607425 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-z8lh4_7ec333a0-79da-4419-b190-d49fe761f40e/octavia-healthmanager/0.log" Oct 07 15:12:07 crc kubenswrapper[4854]: I1007 15:12:07.724161 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-qgrhl_d6d5f8c4-8b4c-4691-81c1-22a3e9985ba0/init/0.log" Oct 07 15:12:07 crc kubenswrapper[4854]: I1007 15:12:07.934986 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-qgrhl_d6d5f8c4-8b4c-4691-81c1-22a3e9985ba0/octavia-housekeeping/0.log" Oct 07 15:12:08 crc kubenswrapper[4854]: I1007 15:12:08.044558 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-qgrhl_d6d5f8c4-8b4c-4691-81c1-22a3e9985ba0/init/0.log" Oct 07 15:12:08 crc kubenswrapper[4854]: I1007 15:12:08.117948 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-c8jtv_cd7a15ef-3061-4eef-b5ec-f5933a93a797/init/0.log" Oct 07 15:12:08 crc kubenswrapper[4854]: I1007 15:12:08.304039 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-c8jtv_cd7a15ef-3061-4eef-b5ec-f5933a93a797/octavia-rsyslog/0.log" Oct 07 15:12:08 crc kubenswrapper[4854]: I1007 15:12:08.351827 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-c8jtv_cd7a15ef-3061-4eef-b5ec-f5933a93a797/init/0.log" Oct 07 15:12:08 crc kubenswrapper[4854]: I1007 15:12:08.477171 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-9dv8x_c4b77d11-39cd-4b6b-bfe9-39c06b5ac986/init/0.log" Oct 07 15:12:09 crc kubenswrapper[4854]: I1007 15:12:09.187481 4854 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_octavia-worker-9dv8x_c4b77d11-39cd-4b6b-bfe9-39c06b5ac986/octavia-worker/0.log" Oct 07 15:12:09 crc kubenswrapper[4854]: I1007 15:12:09.386923 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-9dv8x_c4b77d11-39cd-4b6b-bfe9-39c06b5ac986/init/0.log" Oct 07 15:12:09 crc kubenswrapper[4854]: I1007 15:12:09.503681 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_c5a86825-ec56-46fe-9e53-98d5d66dc2a2/mysql-bootstrap/0.log" Oct 07 15:12:09 crc kubenswrapper[4854]: I1007 15:12:09.649854 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_c5a86825-ec56-46fe-9e53-98d5d66dc2a2/mysql-bootstrap/0.log" Oct 07 15:12:09 crc kubenswrapper[4854]: I1007 15:12:09.675139 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_c5a86825-ec56-46fe-9e53-98d5d66dc2a2/galera/0.log" Oct 07 15:12:09 crc kubenswrapper[4854]: I1007 15:12:09.856311 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_5f5e5df6-7e72-4352-bb73-c30a9d3841dc/mysql-bootstrap/0.log" Oct 07 15:12:09 crc kubenswrapper[4854]: I1007 15:12:09.947103 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_5f5e5df6-7e72-4352-bb73-c30a9d3841dc/mysql-bootstrap/0.log" Oct 07 15:12:10 crc kubenswrapper[4854]: I1007 15:12:10.019334 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_5f5e5df6-7e72-4352-bb73-c30a9d3841dc/galera/0.log" Oct 07 15:12:10 crc kubenswrapper[4854]: I1007 15:12:10.165660 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_00a568f6-04ab-4d3c-a92f-6e4d35532950/openstackclient/0.log" Oct 07 15:12:10 crc kubenswrapper[4854]: I1007 15:12:10.356597 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-5cddp_3d9803b6-b7f2-461c-9d3b-1fb1f39839e9/ovn-controller/0.log" Oct 07 15:12:10 crc kubenswrapper[4854]: I1007 15:12:10.808238 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 15:12:10 crc kubenswrapper[4854]: I1007 15:12:10.808311 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 15:12:11 crc kubenswrapper[4854]: I1007 15:12:11.115047 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-xvl8g_953c368f-f670-44f8-b0ee-62a86bb2f5a9/ovsdb-server-init/0.log" Oct 07 15:12:11 crc kubenswrapper[4854]: I1007 15:12:11.169186 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-chjfz_149c8d6e-1ee3-4211-af06-36a5eea10742/openstack-network-exporter/0.log" Oct 07 15:12:11 crc kubenswrapper[4854]: I1007 15:12:11.326395 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-xvl8g_953c368f-f670-44f8-b0ee-62a86bb2f5a9/ovs-vswitchd/0.log" Oct 07 15:12:11 crc kubenswrapper[4854]: I1007 15:12:11.385554 4854 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-xvl8g_953c368f-f670-44f8-b0ee-62a86bb2f5a9/ovsdb-server/0.log" Oct 07 15:12:11 crc kubenswrapper[4854]: I1007 15:12:11.386785 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-xvl8g_953c368f-f670-44f8-b0ee-62a86bb2f5a9/ovsdb-server-init/0.log" Oct 07 15:12:11 crc kubenswrapper[4854]: I1007 15:12:11.556783 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-copy-data_257e4531-5661-4bad-a586-900a88cca502/adoption/0.log" Oct 07 15:12:11 crc kubenswrapper[4854]: I1007 15:12:11.773820 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_62719138-e7a7-4238-ad62-01b9fbed8739/ovn-northd/0.log" Oct 07 15:12:11 crc kubenswrapper[4854]: I1007 15:12:11.817006 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_62719138-e7a7-4238-ad62-01b9fbed8739/openstack-network-exporter/0.log" Oct 07 15:12:12 crc kubenswrapper[4854]: I1007 15:12:12.094709 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-openstack-openstack-cell1-255n2_be6c7b9e-28b6-490c-8e3d-c919b557df3c/ovn-openstack-openstack-cell1/0.log" Oct 07 15:12:12 crc kubenswrapper[4854]: I1007 15:12:12.106662 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_6a9504f6-d3a8-4e28-b54e-d1c4446d39aa/openstack-network-exporter/0.log" Oct 07 15:12:12 crc kubenswrapper[4854]: I1007 15:12:12.259537 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_6a9504f6-d3a8-4e28-b54e-d1c4446d39aa/ovsdbserver-nb/0.log" Oct 07 15:12:12 crc kubenswrapper[4854]: I1007 15:12:12.312409 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_1eccb89a-97d3-4621-9472-734665cd23c8/openstack-network-exporter/0.log" Oct 07 15:12:12 crc kubenswrapper[4854]: I1007 15:12:12.516117 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_1eccb89a-97d3-4621-9472-734665cd23c8/ovsdbserver-nb/0.log" Oct 07 15:12:12 crc kubenswrapper[4854]: I1007 15:12:12.559911 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_d940085c-87b0-4156-89ee-5ec89b1a7168/openstack-network-exporter/0.log" Oct 07 15:12:12 crc kubenswrapper[4854]: I1007 15:12:12.719370 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_d940085c-87b0-4156-89ee-5ec89b1a7168/ovsdbserver-nb/0.log" Oct 07 15:12:12 crc kubenswrapper[4854]: I1007 15:12:12.793873 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_c8eb4bfe-e9a2-412b-a166-431476bbcc10/openstack-network-exporter/0.log" Oct 07 15:12:12 crc kubenswrapper[4854]: I1007 15:12:12.918898 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_c8eb4bfe-e9a2-412b-a166-431476bbcc10/ovsdbserver-sb/0.log" Oct 07 15:12:13 crc kubenswrapper[4854]: I1007 15:12:13.007992 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_a35c4fc5-bf24-44b2-be7a-0da329e40adc/openstack-network-exporter/0.log" Oct 07 15:12:13 crc kubenswrapper[4854]: I1007 15:12:13.094390 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_a35c4fc5-bf24-44b2-be7a-0da329e40adc/ovsdbserver-sb/0.log" Oct 07 15:12:13 crc kubenswrapper[4854]: I1007 15:12:13.262645 4854 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-sb-2_02e41274-64df-46d9-bec2-4645006646c3/openstack-network-exporter/0.log" Oct 07 15:12:13 crc kubenswrapper[4854]: I1007 15:12:13.278622 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_02e41274-64df-46d9-bec2-4645006646c3/ovsdbserver-sb/0.log" Oct 07 15:12:13 crc kubenswrapper[4854]: I1007 15:12:13.645852 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-bc8c5c6c6-8plcb_803e95e2-2ec6-4a78-8083-e327c47e478f/placement-api/0.log" Oct 07 15:12:13 crc kubenswrapper[4854]: I1007 15:12:13.660185 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-bc8c5c6c6-8plcb_803e95e2-2ec6-4a78-8083-e327c47e478f/placement-log/0.log" Oct 07 15:12:13 crc kubenswrapper[4854]: I1007 15:12:13.876799 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_pre-adoption-validation-openstack-pre-adoption-openstack-cmjvnh_e2cc8cba-1b03-4817-9930-1a31f1971d9a/pre-adoption-validation-openstack-pre-adoption-openstack-cell1/0.log" Oct 07 15:12:13 crc kubenswrapper[4854]: I1007 15:12:13.977998 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_3edb391a-9ddf-4fed-bc20-51d79f783380/init-config-reloader/0.log" Oct 07 15:12:14 crc kubenswrapper[4854]: I1007 15:12:14.177430 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_3edb391a-9ddf-4fed-bc20-51d79f783380/prometheus/0.log" Oct 07 15:12:14 crc kubenswrapper[4854]: I1007 15:12:14.185604 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_3edb391a-9ddf-4fed-bc20-51d79f783380/init-config-reloader/0.log" Oct 07 15:12:14 crc kubenswrapper[4854]: I1007 15:12:14.202535 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_3edb391a-9ddf-4fed-bc20-51d79f783380/config-reloader/0.log" Oct 07 15:12:14 crc kubenswrapper[4854]: I1007 15:12:14.381852 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_d2c1bc67-9713-443b-90af-57a362c1d358/setup-container/0.log" Oct 07 15:12:14 crc kubenswrapper[4854]: I1007 15:12:14.384597 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_3edb391a-9ddf-4fed-bc20-51d79f783380/thanos-sidecar/0.log" Oct 07 15:12:14 crc kubenswrapper[4854]: I1007 15:12:14.624359 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_d2c1bc67-9713-443b-90af-57a362c1d358/setup-container/0.log" Oct 07 15:12:14 crc kubenswrapper[4854]: I1007 15:12:14.816419 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_463af3df-b7a7-45fe-a892-19b29608505d/setup-container/0.log" Oct 07 15:12:14 crc kubenswrapper[4854]: I1007 15:12:14.849750 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_d2c1bc67-9713-443b-90af-57a362c1d358/rabbitmq/0.log" Oct 07 15:12:15 crc kubenswrapper[4854]: I1007 15:12:15.036743 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_463af3df-b7a7-45fe-a892-19b29608505d/setup-container/0.log" Oct 07 15:12:15 crc kubenswrapper[4854]: I1007 15:12:15.047538 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_463af3df-b7a7-45fe-a892-19b29608505d/rabbitmq/0.log" Oct 07 15:12:15 crc kubenswrapper[4854]: I1007 
15:12:15.199348 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-openstack-openstack-cell1-2mkt5_c94ab764-14e7-4100-8179-c00c088a611d/reboot-os-openstack-openstack-cell1/0.log" Oct 07 15:12:15 crc kubenswrapper[4854]: I1007 15:12:15.316943 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-openstack-openstack-cell1-vxvr4_7acd6b6d-b64d-48a9-b71e-1ff81fffa78a/run-os-openstack-openstack-cell1/0.log" Oct 07 15:12:15 crc kubenswrapper[4854]: I1007 15:12:15.443996 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-openstack-d6j9r_9955b3eb-74c8-43b6-bc6f-8b6a8021a3b4/ssh-known-hosts-openstack/0.log" Oct 07 15:12:15 crc kubenswrapper[4854]: I1007 15:12:15.653182 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-openstack-openstack-cell1-csjlt_41fb94cd-3209-4d7e-803b-85f122d3800b/telemetry-openstack-openstack-cell1/0.log" Oct 07 15:12:15 crc kubenswrapper[4854]: I1007 15:12:15.859975 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tripleo-cleanup-tripleo-cleanup-openstack-cell1-528sm_17027554-19b0-44c4-8798-f3f0025605ca/tripleo-cleanup-tripleo-cleanup-openstack-cell1/0.log" Oct 07 15:12:15 crc kubenswrapper[4854]: I1007 15:12:15.943559 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-openstack-openstack-cell1-zxx2n_b6281fe5-e710-49a7-88e0-07c925cffc8e/validate-network-openstack-openstack-cell1/0.log" Oct 07 15:12:40 crc kubenswrapper[4854]: I1007 15:12:40.808608 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 15:12:40 crc kubenswrapper[4854]: I1007 15:12:40.809259 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 15:12:57 crc kubenswrapper[4854]: I1007 15:12:57.362427 4854 generic.go:334] "Generic (PLEG): container finished" podID="dbb4129d-96de-47ea-9c49-b52403398164" containerID="37ead8063b67b89ec34d379de1b368a4aea75cb313dd4dcd9dca44a01d5a82dd" exitCode=0 Oct 07 15:12:57 crc kubenswrapper[4854]: I1007 15:12:57.362519 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vnj8v/crc-debug-pmhvt" event={"ID":"dbb4129d-96de-47ea-9c49-b52403398164","Type":"ContainerDied","Data":"37ead8063b67b89ec34d379de1b368a4aea75cb313dd4dcd9dca44a01d5a82dd"} Oct 07 15:12:58 crc kubenswrapper[4854]: I1007 15:12:58.875055 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vnj8v/crc-debug-pmhvt" Oct 07 15:12:58 crc kubenswrapper[4854]: I1007 15:12:58.916307 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vnj8v/crc-debug-pmhvt"] Oct 07 15:12:58 crc kubenswrapper[4854]: I1007 15:12:58.926091 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vnj8v/crc-debug-pmhvt"] Oct 07 15:12:59 crc kubenswrapper[4854]: I1007 15:12:59.056783 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqfdn\" (UniqueName: \"kubernetes.io/projected/dbb4129d-96de-47ea-9c49-b52403398164-kube-api-access-vqfdn\") pod \"dbb4129d-96de-47ea-9c49-b52403398164\" (UID: \"dbb4129d-96de-47ea-9c49-b52403398164\") " Oct 07 15:12:59 crc kubenswrapper[4854]: I1007 15:12:59.056888 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dbb4129d-96de-47ea-9c49-b52403398164-host\") pod \"dbb4129d-96de-47ea-9c49-b52403398164\" (UID: \"dbb4129d-96de-47ea-9c49-b52403398164\") " Oct 07 15:12:59 crc kubenswrapper[4854]: I1007 15:12:59.056970 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dbb4129d-96de-47ea-9c49-b52403398164-host" (OuterVolumeSpecName: "host") pod "dbb4129d-96de-47ea-9c49-b52403398164" (UID: "dbb4129d-96de-47ea-9c49-b52403398164"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 15:12:59 crc kubenswrapper[4854]: I1007 15:12:59.058813 4854 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dbb4129d-96de-47ea-9c49-b52403398164-host\") on node \"crc\" DevicePath \"\"" Oct 07 15:12:59 crc kubenswrapper[4854]: I1007 15:12:59.066463 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbb4129d-96de-47ea-9c49-b52403398164-kube-api-access-vqfdn" (OuterVolumeSpecName: "kube-api-access-vqfdn") pod "dbb4129d-96de-47ea-9c49-b52403398164" (UID: "dbb4129d-96de-47ea-9c49-b52403398164"). InnerVolumeSpecName "kube-api-access-vqfdn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 15:12:59 crc kubenswrapper[4854]: I1007 15:12:59.179486 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqfdn\" (UniqueName: \"kubernetes.io/projected/dbb4129d-96de-47ea-9c49-b52403398164-kube-api-access-vqfdn\") on node \"crc\" DevicePath \"\"" Oct 07 15:12:59 crc kubenswrapper[4854]: I1007 15:12:59.393428 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8870d4f0911dcd78ba05479c15f1027106b03090f5a23ba5b11c690706b93a1" Oct 07 15:12:59 crc kubenswrapper[4854]: I1007 15:12:59.393553 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vnj8v/crc-debug-pmhvt" Oct 07 15:13:00 crc kubenswrapper[4854]: I1007 15:13:00.137663 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vnj8v/crc-debug-rlk8w"] Oct 07 15:13:00 crc kubenswrapper[4854]: E1007 15:13:00.138228 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbb4129d-96de-47ea-9c49-b52403398164" containerName="container-00" Oct 07 15:13:00 crc kubenswrapper[4854]: I1007 15:13:00.138245 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbb4129d-96de-47ea-9c49-b52403398164" containerName="container-00" Oct 07 15:13:00 crc kubenswrapper[4854]: I1007 15:13:00.138491 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbb4129d-96de-47ea-9c49-b52403398164" containerName="container-00" Oct 07 15:13:00 crc kubenswrapper[4854]: I1007 15:13:00.139603 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vnj8v/crc-debug-rlk8w" Oct 07 15:13:00 crc kubenswrapper[4854]: I1007 15:13:00.144551 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-vnj8v"/"default-dockercfg-k94sl" Oct 07 15:13:00 crc kubenswrapper[4854]: I1007 15:13:00.304127 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47jlt\" (UniqueName: \"kubernetes.io/projected/befac674-71c9-4faf-9dcf-52aee7c4fdbc-kube-api-access-47jlt\") pod \"crc-debug-rlk8w\" (UID: \"befac674-71c9-4faf-9dcf-52aee7c4fdbc\") " pod="openshift-must-gather-vnj8v/crc-debug-rlk8w" Oct 07 15:13:00 crc kubenswrapper[4854]: I1007 15:13:00.304572 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/befac674-71c9-4faf-9dcf-52aee7c4fdbc-host\") pod \"crc-debug-rlk8w\" (UID: \"befac674-71c9-4faf-9dcf-52aee7c4fdbc\") " pod="openshift-must-gather-vnj8v/crc-debug-rlk8w" Oct 07 15:13:00 crc kubenswrapper[4854]: I1007 15:13:00.406924 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47jlt\" (UniqueName: \"kubernetes.io/projected/befac674-71c9-4faf-9dcf-52aee7c4fdbc-kube-api-access-47jlt\") pod \"crc-debug-rlk8w\" (UID: \"befac674-71c9-4faf-9dcf-52aee7c4fdbc\") " pod="openshift-must-gather-vnj8v/crc-debug-rlk8w" Oct 07 15:13:00 crc kubenswrapper[4854]: I1007 15:13:00.407247 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/befac674-71c9-4faf-9dcf-52aee7c4fdbc-host\") pod \"crc-debug-rlk8w\" (UID: \"befac674-71c9-4faf-9dcf-52aee7c4fdbc\") " pod="openshift-must-gather-vnj8v/crc-debug-rlk8w" Oct 07 15:13:00 crc kubenswrapper[4854]: I1007 15:13:00.407398 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/befac674-71c9-4faf-9dcf-52aee7c4fdbc-host\") pod \"crc-debug-rlk8w\" (UID: \"befac674-71c9-4faf-9dcf-52aee7c4fdbc\") " pod="openshift-must-gather-vnj8v/crc-debug-rlk8w" Oct 07 15:13:00 crc kubenswrapper[4854]: I1007 15:13:00.440535 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47jlt\" (UniqueName: \"kubernetes.io/projected/befac674-71c9-4faf-9dcf-52aee7c4fdbc-kube-api-access-47jlt\") pod \"crc-debug-rlk8w\" (UID: \"befac674-71c9-4faf-9dcf-52aee7c4fdbc\") " pod="openshift-must-gather-vnj8v/crc-debug-rlk8w" Oct 07 15:13:00 crc kubenswrapper[4854]: I1007 
15:13:00.466518 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vnj8v/crc-debug-rlk8w" Oct 07 15:13:00 crc kubenswrapper[4854]: I1007 15:13:00.723022 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbb4129d-96de-47ea-9c49-b52403398164" path="/var/lib/kubelet/pods/dbb4129d-96de-47ea-9c49-b52403398164/volumes" Oct 07 15:13:01 crc kubenswrapper[4854]: I1007 15:13:01.420716 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vnj8v/crc-debug-rlk8w" event={"ID":"befac674-71c9-4faf-9dcf-52aee7c4fdbc","Type":"ContainerStarted","Data":"946ec129de886251554ef4fc2234100cd1256244d7d1282e99e83013cf40bad7"} Oct 07 15:13:01 crc kubenswrapper[4854]: I1007 15:13:01.420778 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vnj8v/crc-debug-rlk8w" event={"ID":"befac674-71c9-4faf-9dcf-52aee7c4fdbc","Type":"ContainerStarted","Data":"6f8aba338aabb7304dbf25a4147aa7e4490a5a37a1267a34d6f317df06f8fab3"} Oct 07 15:13:01 crc kubenswrapper[4854]: I1007 15:13:01.435666 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vnj8v/crc-debug-rlk8w" podStartSLOduration=1.435641455 podStartE2EDuration="1.435641455s" podCreationTimestamp="2025-10-07 15:13:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 15:13:01.43479435 +0000 UTC m=+10097.422626615" watchObservedRunningTime="2025-10-07 15:13:01.435641455 +0000 UTC m=+10097.423473720" Oct 07 15:13:03 crc kubenswrapper[4854]: I1007 15:13:03.471920 4854 generic.go:334] "Generic (PLEG): container finished" podID="befac674-71c9-4faf-9dcf-52aee7c4fdbc" containerID="946ec129de886251554ef4fc2234100cd1256244d7d1282e99e83013cf40bad7" exitCode=0 Oct 07 15:13:03 crc kubenswrapper[4854]: I1007 15:13:03.471982 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vnj8v/crc-debug-rlk8w" event={"ID":"befac674-71c9-4faf-9dcf-52aee7c4fdbc","Type":"ContainerDied","Data":"946ec129de886251554ef4fc2234100cd1256244d7d1282e99e83013cf40bad7"} Oct 07 15:13:04 crc kubenswrapper[4854]: I1007 15:13:04.609195 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vnj8v/crc-debug-rlk8w" Oct 07 15:13:04 crc kubenswrapper[4854]: I1007 15:13:04.719678 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/befac674-71c9-4faf-9dcf-52aee7c4fdbc-host\") pod \"befac674-71c9-4faf-9dcf-52aee7c4fdbc\" (UID: \"befac674-71c9-4faf-9dcf-52aee7c4fdbc\") " Oct 07 15:13:04 crc kubenswrapper[4854]: I1007 15:13:04.719862 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/befac674-71c9-4faf-9dcf-52aee7c4fdbc-host" (OuterVolumeSpecName: "host") pod "befac674-71c9-4faf-9dcf-52aee7c4fdbc" (UID: "befac674-71c9-4faf-9dcf-52aee7c4fdbc"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 15:13:04 crc kubenswrapper[4854]: I1007 15:13:04.719914 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47jlt\" (UniqueName: \"kubernetes.io/projected/befac674-71c9-4faf-9dcf-52aee7c4fdbc-kube-api-access-47jlt\") pod \"befac674-71c9-4faf-9dcf-52aee7c4fdbc\" (UID: \"befac674-71c9-4faf-9dcf-52aee7c4fdbc\") " Oct 07 15:13:04 crc kubenswrapper[4854]: I1007 15:13:04.720651 4854 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/befac674-71c9-4faf-9dcf-52aee7c4fdbc-host\") on node \"crc\" DevicePath \"\"" Oct 07 15:13:04 crc kubenswrapper[4854]: I1007 15:13:04.733402 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/befac674-71c9-4faf-9dcf-52aee7c4fdbc-kube-api-access-47jlt" (OuterVolumeSpecName: "kube-api-access-47jlt") pod "befac674-71c9-4faf-9dcf-52aee7c4fdbc" (UID: "befac674-71c9-4faf-9dcf-52aee7c4fdbc"). InnerVolumeSpecName "kube-api-access-47jlt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 15:13:04 crc kubenswrapper[4854]: I1007 15:13:04.821697 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-47jlt\" (UniqueName: \"kubernetes.io/projected/befac674-71c9-4faf-9dcf-52aee7c4fdbc-kube-api-access-47jlt\") on node \"crc\" DevicePath \"\"" Oct 07 15:13:05 crc kubenswrapper[4854]: I1007 15:13:05.495296 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vnj8v/crc-debug-rlk8w" event={"ID":"befac674-71c9-4faf-9dcf-52aee7c4fdbc","Type":"ContainerDied","Data":"6f8aba338aabb7304dbf25a4147aa7e4490a5a37a1267a34d6f317df06f8fab3"} Oct 07 15:13:05 crc kubenswrapper[4854]: I1007 15:13:05.495337 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f8aba338aabb7304dbf25a4147aa7e4490a5a37a1267a34d6f317df06f8fab3" Oct 07 15:13:05 crc kubenswrapper[4854]: I1007 15:13:05.495382 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vnj8v/crc-debug-rlk8w" Oct 07 15:13:10 crc kubenswrapper[4854]: I1007 15:13:10.810653 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 15:13:10 crc kubenswrapper[4854]: I1007 15:13:10.811262 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 15:13:10 crc kubenswrapper[4854]: I1007 15:13:10.811309 4854 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" Oct 07 15:13:10 crc kubenswrapper[4854]: I1007 15:13:10.812119 4854 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6f17d4c8ac8a72864ae50a4f170092990db3a5ea9c1c86e99d2e15789436580e"} pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 15:13:10 crc kubenswrapper[4854]: I1007 15:13:10.812179 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" containerID="cri-o://6f17d4c8ac8a72864ae50a4f170092990db3a5ea9c1c86e99d2e15789436580e" gracePeriod=600 Oct 07 15:13:11 crc kubenswrapper[4854]: I1007 15:13:11.577113 4854 generic.go:334] "Generic (PLEG): container finished" podID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerID="6f17d4c8ac8a72864ae50a4f170092990db3a5ea9c1c86e99d2e15789436580e" exitCode=0 Oct 07 15:13:11 crc kubenswrapper[4854]: I1007 15:13:11.577175 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" event={"ID":"40b8b82d-cfd5-41d7-8673-5774db092c85","Type":"ContainerDied","Data":"6f17d4c8ac8a72864ae50a4f170092990db3a5ea9c1c86e99d2e15789436580e"} Oct 07 15:13:11 crc kubenswrapper[4854]: I1007 15:13:11.577475 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" event={"ID":"40b8b82d-cfd5-41d7-8673-5774db092c85","Type":"ContainerStarted","Data":"c98097bfda43ae85e4184a4e0b7522f850f1927291ab2449b125477b5dc770e9"} Oct 07 15:13:11 crc kubenswrapper[4854]: I1007 15:13:11.577497 4854 scope.go:117] "RemoveContainer" containerID="c9aa2a3ee4561b0972ff5e54f922a75687719cdae51f1e918393112c2e5d05e1" Oct 07 15:13:11 crc kubenswrapper[4854]: I1007 15:13:11.810771 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vnj8v/crc-debug-rlk8w"] Oct 07 15:13:11 crc kubenswrapper[4854]: I1007 15:13:11.819375 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vnj8v/crc-debug-rlk8w"] Oct 07 15:13:12 crc kubenswrapper[4854]: I1007 15:13:12.714145 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="befac674-71c9-4faf-9dcf-52aee7c4fdbc" path="/var/lib/kubelet/pods/befac674-71c9-4faf-9dcf-52aee7c4fdbc/volumes" 
Oct 07 15:13:12 crc kubenswrapper[4854]: I1007 15:13:12.986325 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vnj8v/crc-debug-h94s7"] Oct 07 15:13:12 crc kubenswrapper[4854]: E1007 15:13:12.986765 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="befac674-71c9-4faf-9dcf-52aee7c4fdbc" containerName="container-00" Oct 07 15:13:12 crc kubenswrapper[4854]: I1007 15:13:12.986778 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="befac674-71c9-4faf-9dcf-52aee7c4fdbc" containerName="container-00" Oct 07 15:13:12 crc kubenswrapper[4854]: I1007 15:13:12.986964 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="befac674-71c9-4faf-9dcf-52aee7c4fdbc" containerName="container-00" Oct 07 15:13:12 crc kubenswrapper[4854]: I1007 15:13:12.987756 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vnj8v/crc-debug-h94s7" Oct 07 15:13:12 crc kubenswrapper[4854]: I1007 15:13:12.991856 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-vnj8v"/"default-dockercfg-k94sl" Oct 07 15:13:13 crc kubenswrapper[4854]: I1007 15:13:13.098045 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b235b828-28b3-4ed7-b5f9-c6665a5bae56-host\") pod \"crc-debug-h94s7\" (UID: \"b235b828-28b3-4ed7-b5f9-c6665a5bae56\") " pod="openshift-must-gather-vnj8v/crc-debug-h94s7" Oct 07 15:13:13 crc kubenswrapper[4854]: I1007 15:13:13.098098 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctgt4\" (UniqueName: \"kubernetes.io/projected/b235b828-28b3-4ed7-b5f9-c6665a5bae56-kube-api-access-ctgt4\") pod \"crc-debug-h94s7\" (UID: \"b235b828-28b3-4ed7-b5f9-c6665a5bae56\") " pod="openshift-must-gather-vnj8v/crc-debug-h94s7" Oct 07 15:13:13 crc kubenswrapper[4854]: I1007 15:13:13.200105 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b235b828-28b3-4ed7-b5f9-c6665a5bae56-host\") pod \"crc-debug-h94s7\" (UID: \"b235b828-28b3-4ed7-b5f9-c6665a5bae56\") " pod="openshift-must-gather-vnj8v/crc-debug-h94s7" Oct 07 15:13:13 crc kubenswrapper[4854]: I1007 15:13:13.200192 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctgt4\" (UniqueName: \"kubernetes.io/projected/b235b828-28b3-4ed7-b5f9-c6665a5bae56-kube-api-access-ctgt4\") pod \"crc-debug-h94s7\" (UID: \"b235b828-28b3-4ed7-b5f9-c6665a5bae56\") " pod="openshift-must-gather-vnj8v/crc-debug-h94s7" Oct 07 15:13:13 crc kubenswrapper[4854]: I1007 15:13:13.200218 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b235b828-28b3-4ed7-b5f9-c6665a5bae56-host\") pod \"crc-debug-h94s7\" (UID: \"b235b828-28b3-4ed7-b5f9-c6665a5bae56\") " pod="openshift-must-gather-vnj8v/crc-debug-h94s7" Oct 07 15:13:13 crc kubenswrapper[4854]: I1007 15:13:13.225844 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctgt4\" (UniqueName: \"kubernetes.io/projected/b235b828-28b3-4ed7-b5f9-c6665a5bae56-kube-api-access-ctgt4\") pod \"crc-debug-h94s7\" (UID: \"b235b828-28b3-4ed7-b5f9-c6665a5bae56\") " pod="openshift-must-gather-vnj8v/crc-debug-h94s7" Oct 07 15:13:13 crc kubenswrapper[4854]: I1007 15:13:13.304382 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vnj8v/crc-debug-h94s7" Oct 07 15:13:13 crc kubenswrapper[4854]: W1007 15:13:13.334313 4854 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb235b828_28b3_4ed7_b5f9_c6665a5bae56.slice/crio-fce6026bb916f7ce259aef172bc201d37740be57fdf971f6d88cf5e1a45d7159 WatchSource:0}: Error finding container fce6026bb916f7ce259aef172bc201d37740be57fdf971f6d88cf5e1a45d7159: Status 404 returned error can't find the container with id fce6026bb916f7ce259aef172bc201d37740be57fdf971f6d88cf5e1a45d7159 Oct 07 15:13:13 crc kubenswrapper[4854]: I1007 15:13:13.604073 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vnj8v/crc-debug-h94s7" event={"ID":"b235b828-28b3-4ed7-b5f9-c6665a5bae56","Type":"ContainerStarted","Data":"801b703c9b10e1f9ac6da404e03c648a8febeb6c2d328b1783447dc5f3dfd9eb"} Oct 07 15:13:13 crc kubenswrapper[4854]: I1007 15:13:13.604360 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vnj8v/crc-debug-h94s7" event={"ID":"b235b828-28b3-4ed7-b5f9-c6665a5bae56","Type":"ContainerStarted","Data":"fce6026bb916f7ce259aef172bc201d37740be57fdf971f6d88cf5e1a45d7159"} Oct 07 15:13:13 crc kubenswrapper[4854]: I1007 15:13:13.626939 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vnj8v/crc-debug-h94s7" podStartSLOduration=1.626922976 podStartE2EDuration="1.626922976s" podCreationTimestamp="2025-10-07 15:13:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 15:13:13.618286406 +0000 UTC m=+10109.606118671" watchObservedRunningTime="2025-10-07 15:13:13.626922976 +0000 UTC m=+10109.614755231" Oct 07 15:13:14 crc kubenswrapper[4854]: I1007 15:13:14.619381 4854 generic.go:334] "Generic (PLEG): container finished" podID="b235b828-28b3-4ed7-b5f9-c6665a5bae56" containerID="801b703c9b10e1f9ac6da404e03c648a8febeb6c2d328b1783447dc5f3dfd9eb" exitCode=0 Oct 07 15:13:14 crc kubenswrapper[4854]: I1007 15:13:14.619451 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vnj8v/crc-debug-h94s7" event={"ID":"b235b828-28b3-4ed7-b5f9-c6665a5bae56","Type":"ContainerDied","Data":"801b703c9b10e1f9ac6da404e03c648a8febeb6c2d328b1783447dc5f3dfd9eb"} Oct 07 15:13:15 crc kubenswrapper[4854]: I1007 15:13:15.755780 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vnj8v/crc-debug-h94s7" Oct 07 15:13:15 crc kubenswrapper[4854]: I1007 15:13:15.797338 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vnj8v/crc-debug-h94s7"] Oct 07 15:13:15 crc kubenswrapper[4854]: I1007 15:13:15.807768 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vnj8v/crc-debug-h94s7"] Oct 07 15:13:15 crc kubenswrapper[4854]: I1007 15:13:15.855461 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b235b828-28b3-4ed7-b5f9-c6665a5bae56-host\") pod \"b235b828-28b3-4ed7-b5f9-c6665a5bae56\" (UID: \"b235b828-28b3-4ed7-b5f9-c6665a5bae56\") " Oct 07 15:13:15 crc kubenswrapper[4854]: I1007 15:13:15.855593 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b235b828-28b3-4ed7-b5f9-c6665a5bae56-host" (OuterVolumeSpecName: "host") pod "b235b828-28b3-4ed7-b5f9-c6665a5bae56" (UID: "b235b828-28b3-4ed7-b5f9-c6665a5bae56"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 15:13:15 crc kubenswrapper[4854]: I1007 15:13:15.855770 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctgt4\" (UniqueName: \"kubernetes.io/projected/b235b828-28b3-4ed7-b5f9-c6665a5bae56-kube-api-access-ctgt4\") pod \"b235b828-28b3-4ed7-b5f9-c6665a5bae56\" (UID: \"b235b828-28b3-4ed7-b5f9-c6665a5bae56\") " Oct 07 15:13:15 crc kubenswrapper[4854]: I1007 15:13:15.856397 4854 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b235b828-28b3-4ed7-b5f9-c6665a5bae56-host\") on node \"crc\" DevicePath \"\"" Oct 07 15:13:15 crc kubenswrapper[4854]: I1007 15:13:15.862818 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b235b828-28b3-4ed7-b5f9-c6665a5bae56-kube-api-access-ctgt4" (OuterVolumeSpecName: "kube-api-access-ctgt4") pod "b235b828-28b3-4ed7-b5f9-c6665a5bae56" (UID: "b235b828-28b3-4ed7-b5f9-c6665a5bae56"). InnerVolumeSpecName "kube-api-access-ctgt4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 15:13:15 crc kubenswrapper[4854]: I1007 15:13:15.958021 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctgt4\" (UniqueName: \"kubernetes.io/projected/b235b828-28b3-4ed7-b5f9-c6665a5bae56-kube-api-access-ctgt4\") on node \"crc\" DevicePath \"\"" Oct 07 15:13:16 crc kubenswrapper[4854]: I1007 15:13:16.643439 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fce6026bb916f7ce259aef172bc201d37740be57fdf971f6d88cf5e1a45d7159" Oct 07 15:13:16 crc kubenswrapper[4854]: I1007 15:13:16.643503 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vnj8v/crc-debug-h94s7" Oct 07 15:13:16 crc kubenswrapper[4854]: I1007 15:13:16.758183 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b235b828-28b3-4ed7-b5f9-c6665a5bae56" path="/var/lib/kubelet/pods/b235b828-28b3-4ed7-b5f9-c6665a5bae56/volumes" Oct 07 15:13:32 crc kubenswrapper[4854]: I1007 15:13:32.125775 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_03190830a20b4f4d382164e60d93db76834db5b34babf3517d759ffbdb8qswb_0ba19497-e9e9-4587-b726-cadfb140df77/util/0.log" Oct 07 15:13:32 crc kubenswrapper[4854]: I1007 15:13:32.937494 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_03190830a20b4f4d382164e60d93db76834db5b34babf3517d759ffbdb8qswb_0ba19497-e9e9-4587-b726-cadfb140df77/util/0.log" Oct 07 15:13:32 crc kubenswrapper[4854]: I1007 15:13:32.980859 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_03190830a20b4f4d382164e60d93db76834db5b34babf3517d759ffbdb8qswb_0ba19497-e9e9-4587-b726-cadfb140df77/pull/0.log" Oct 07 15:13:33 crc kubenswrapper[4854]: I1007 15:13:33.014965 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_03190830a20b4f4d382164e60d93db76834db5b34babf3517d759ffbdb8qswb_0ba19497-e9e9-4587-b726-cadfb140df77/pull/0.log" Oct 07 15:13:33 crc kubenswrapper[4854]: I1007 15:13:33.180767 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_03190830a20b4f4d382164e60d93db76834db5b34babf3517d759ffbdb8qswb_0ba19497-e9e9-4587-b726-cadfb140df77/util/0.log" Oct 07 15:13:33 crc kubenswrapper[4854]: I1007 15:13:33.196727 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_03190830a20b4f4d382164e60d93db76834db5b34babf3517d759ffbdb8qswb_0ba19497-e9e9-4587-b726-cadfb140df77/pull/0.log" Oct 07 15:13:33 crc kubenswrapper[4854]: I1007 15:13:33.200100 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_03190830a20b4f4d382164e60d93db76834db5b34babf3517d759ffbdb8qswb_0ba19497-e9e9-4587-b726-cadfb140df77/extract/0.log" Oct 07 15:13:33 crc kubenswrapper[4854]: I1007 15:13:33.411004 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-58c4cd55f4-bstfk_c345db1e-94bb-4650-80af-e0c3dac97dbe/kube-rbac-proxy/0.log" Oct 07 15:13:33 crc kubenswrapper[4854]: I1007 15:13:33.509813 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-58c4cd55f4-bstfk_c345db1e-94bb-4650-80af-e0c3dac97dbe/manager/0.log" Oct 07 15:13:33 crc kubenswrapper[4854]: I1007 15:13:33.524416 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-7d4d4f8d-6tg4x_b81a8a7a-ce97-4020-9cad-038a81ea3f79/kube-rbac-proxy/0.log" Oct 07 15:13:33 crc kubenswrapper[4854]: I1007 15:13:33.712374 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-7d4d4f8d-6tg4x_b81a8a7a-ce97-4020-9cad-038a81ea3f79/manager/0.log" Oct 07 15:13:33 crc kubenswrapper[4854]: I1007 15:13:33.726048 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-75dfd9b554-jw8gn_52f4d849-2233-4984-8afb-4ccf15d94914/kube-rbac-proxy/0.log" Oct 07 15:13:33 crc kubenswrapper[4854]: I1007 15:13:33.752727 4854 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_designate-operator-controller-manager-75dfd9b554-jw8gn_52f4d849-2233-4984-8afb-4ccf15d94914/manager/0.log" Oct 07 15:13:33 crc kubenswrapper[4854]: I1007 15:13:33.911033 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5dc44df7d5-ml6xv_6e979fa2-9b10-4bd5-9adc-8d9e116da401/kube-rbac-proxy/0.log" Oct 07 15:13:34 crc kubenswrapper[4854]: I1007 15:13:34.036114 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5dc44df7d5-ml6xv_6e979fa2-9b10-4bd5-9adc-8d9e116da401/manager/0.log" Oct 07 15:13:34 crc kubenswrapper[4854]: I1007 15:13:34.121244 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-54b4974c45-hqdgf_3ddf497a-0424-4168-ae55-47a94d4d5124/kube-rbac-proxy/0.log" Oct 07 15:13:34 crc kubenswrapper[4854]: I1007 15:13:34.198522 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-54b4974c45-hqdgf_3ddf497a-0424-4168-ae55-47a94d4d5124/manager/0.log" Oct 07 15:13:34 crc kubenswrapper[4854]: I1007 15:13:34.265058 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-76d5b87f47-lknk5_d09c6570-0b86-42a3-aa24-cd139b85c0fb/kube-rbac-proxy/0.log" Oct 07 15:13:34 crc kubenswrapper[4854]: I1007 15:13:34.350934 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-76d5b87f47-lknk5_d09c6570-0b86-42a3-aa24-cd139b85c0fb/manager/0.log" Oct 07 15:13:35 crc kubenswrapper[4854]: I1007 15:13:35.207551 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-658588b8c9-5n2wq_9f33763b-0578-4fa9-8d46-51202dfb0b12/kube-rbac-proxy/0.log" Oct 07 15:13:35 crc kubenswrapper[4854]: I1007 15:13:35.291614 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-649675d675-h4lkl_80562dd3-4b06-4e44-88e5-50febc56fa3d/kube-rbac-proxy/0.log" Oct 07 15:13:35 crc kubenswrapper[4854]: I1007 15:13:35.410697 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-649675d675-h4lkl_80562dd3-4b06-4e44-88e5-50febc56fa3d/manager/0.log" Oct 07 15:13:35 crc kubenswrapper[4854]: I1007 15:13:35.435517 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-658588b8c9-5n2wq_9f33763b-0578-4fa9-8d46-51202dfb0b12/manager/0.log" Oct 07 15:13:35 crc kubenswrapper[4854]: I1007 15:13:35.462742 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7b5ccf6d9c-pnm28_b4d43b9e-cdd5-4513-aa57-002a29687247/kube-rbac-proxy/0.log" Oct 07 15:13:35 crc kubenswrapper[4854]: I1007 15:13:35.609917 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7b5ccf6d9c-pnm28_b4d43b9e-cdd5-4513-aa57-002a29687247/manager/0.log" Oct 07 15:13:35 crc kubenswrapper[4854]: I1007 15:13:35.676991 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-65d89cfd9f-ct8rs_c7263a64-2cf7-404c-b690-d2042b34d0cd/manager/0.log" Oct 07 15:13:35 crc kubenswrapper[4854]: I1007 15:13:35.685939 4854 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-65d89cfd9f-ct8rs_c7263a64-2cf7-404c-b690-d2042b34d0cd/kube-rbac-proxy/0.log" Oct 07 15:13:35 crc kubenswrapper[4854]: I1007 15:13:35.834506 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6cd6d7bdf5-nvh92_47a57505-60e3-4139-95d9-426eb48e4e56/kube-rbac-proxy/0.log" Oct 07 15:13:35 crc kubenswrapper[4854]: I1007 15:13:35.903757 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-8d984cc4d-4774x_1f529c42-e29a-4dfb-972b-5e143c2589b7/kube-rbac-proxy/0.log" Oct 07 15:13:35 crc kubenswrapper[4854]: I1007 15:13:35.910475 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6cd6d7bdf5-nvh92_47a57505-60e3-4139-95d9-426eb48e4e56/manager/0.log" Oct 07 15:13:36 crc kubenswrapper[4854]: I1007 15:13:36.019682 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-8d984cc4d-4774x_1f529c42-e29a-4dfb-972b-5e143c2589b7/manager/0.log" Oct 07 15:13:36 crc kubenswrapper[4854]: I1007 15:13:36.086287 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-7c7fc454ff-xdhf9_dee05e6c-eda8-4059-a9b0-88b3ab2eb219/kube-rbac-proxy/0.log" Oct 07 15:13:36 crc kubenswrapper[4854]: I1007 15:13:36.299635 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7468f855d8-kcmxb_101bbc87-6377-436d-b179-e0368ce39e68/kube-rbac-proxy/0.log" Oct 07 15:13:36 crc kubenswrapper[4854]: I1007 15:13:36.338165 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-7c7fc454ff-xdhf9_dee05e6c-eda8-4059-a9b0-88b3ab2eb219/manager/0.log" Oct 07 15:13:36 crc kubenswrapper[4854]: I1007 15:13:36.397213 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7468f855d8-kcmxb_101bbc87-6377-436d-b179-e0368ce39e68/manager/0.log" Oct 07 15:13:36 crc kubenswrapper[4854]: I1007 15:13:36.432042 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5dfbbd665cf9n6c_f66ed5a7-8fda-4f43-bf10-c1709f30a858/kube-rbac-proxy/0.log" Oct 07 15:13:36 crc kubenswrapper[4854]: I1007 15:13:36.493719 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5dfbbd665cf9n6c_f66ed5a7-8fda-4f43-bf10-c1709f30a858/manager/0.log" Oct 07 15:13:36 crc kubenswrapper[4854]: I1007 15:13:36.640325 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6559cd6d74-82w2g_3b140fde-b7d7-4a32-b80e-0cfa788c09b5/kube-rbac-proxy/0.log" Oct 07 15:13:36 crc kubenswrapper[4854]: I1007 15:13:36.688310 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-746fd59886-q46bc_07a9d82f-5beb-479b-8d8f-9650a3cb1a4e/kube-rbac-proxy/0.log" Oct 07 15:13:36 crc kubenswrapper[4854]: I1007 15:13:36.889612 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-746fd59886-q46bc_07a9d82f-5beb-479b-8d8f-9650a3cb1a4e/operator/0.log" Oct 07 15:13:36 crc 
kubenswrapper[4854]: I1007 15:13:36.896953 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-tlx58_b71069c1-735b-4896-94f3-82913ac9dbf0/registry-server/0.log" Oct 07 15:13:37 crc kubenswrapper[4854]: I1007 15:13:37.138483 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-6d8b6f9b9-sc7j5_35f2082e-767b-4de0-9c71-76d1d1cb020a/kube-rbac-proxy/0.log" Oct 07 15:13:37 crc kubenswrapper[4854]: I1007 15:13:37.219346 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-6d8b6f9b9-sc7j5_35f2082e-767b-4de0-9c71-76d1d1cb020a/manager/0.log" Oct 07 15:13:37 crc kubenswrapper[4854]: I1007 15:13:37.263767 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-54689d9f88-m6qn8_d3b199b3-50a7-4e60-b377-101e8c0d1882/kube-rbac-proxy/0.log" Oct 07 15:13:37 crc kubenswrapper[4854]: I1007 15:13:37.429687 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-54689d9f88-m6qn8_d3b199b3-50a7-4e60-b377-101e8c0d1882/manager/0.log" Oct 07 15:13:37 crc kubenswrapper[4854]: I1007 15:13:37.470357 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-7nqjj_e7938e36-fecf-4df1-9e1d-886f84e4c597/operator/0.log" Oct 07 15:13:37 crc kubenswrapper[4854]: I1007 15:13:37.623096 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6859f9b676-x5n7x_602ad7aa-3b22-476b-849c-c1138b5e6223/kube-rbac-proxy/0.log" Oct 07 15:13:37 crc kubenswrapper[4854]: I1007 15:13:37.712701 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6859f9b676-x5n7x_602ad7aa-3b22-476b-849c-c1138b5e6223/manager/0.log" Oct 07 15:13:37 crc kubenswrapper[4854]: I1007 15:13:37.919654 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5d4d74dd89-d4s4h_bae9e78e-d7e2-4b0b-851b-0705610a640b/kube-rbac-proxy/0.log" Oct 07 15:13:38 crc kubenswrapper[4854]: I1007 15:13:38.205394 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd5cb47d7-vvcpc_3cf5fba8-8e19-4c34-bea9-5b910302d68f/kube-rbac-proxy/0.log" Oct 07 15:13:38 crc kubenswrapper[4854]: I1007 15:13:38.376139 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd5cb47d7-vvcpc_3cf5fba8-8e19-4c34-bea9-5b910302d68f/manager/0.log" Oct 07 15:13:38 crc kubenswrapper[4854]: I1007 15:13:38.420685 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6cbc6dd547-dkw22_68f7e193-d05e-4ae9-a3b8-c075b608dfa9/kube-rbac-proxy/0.log" Oct 07 15:13:38 crc kubenswrapper[4854]: I1007 15:13:38.457909 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5d4d74dd89-d4s4h_bae9e78e-d7e2-4b0b-851b-0705610a640b/manager/0.log" Oct 07 15:13:38 crc kubenswrapper[4854]: I1007 15:13:38.560912 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6cbc6dd547-dkw22_68f7e193-d05e-4ae9-a3b8-c075b608dfa9/manager/0.log" Oct 07 15:13:38 crc 
kubenswrapper[4854]: I1007 15:13:38.789094 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6559cd6d74-82w2g_3b140fde-b7d7-4a32-b80e-0cfa788c09b5/manager/0.log" Oct 07 15:13:56 crc kubenswrapper[4854]: I1007 15:13:56.150430 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-dc2nk_c9afb6d4-a946-4c1c-995e-330f39a1f346/control-plane-machine-set-operator/0.log" Oct 07 15:13:56 crc kubenswrapper[4854]: I1007 15:13:56.349293 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-tsdfx_a4d3106d-58e9-4cb6-bcfd-6e151b16969b/kube-rbac-proxy/0.log" Oct 07 15:13:56 crc kubenswrapper[4854]: I1007 15:13:56.372060 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-tsdfx_a4d3106d-58e9-4cb6-bcfd-6e151b16969b/machine-api-operator/0.log" Oct 07 15:14:09 crc kubenswrapper[4854]: I1007 15:14:09.315685 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-7d4cc89fcb-fvqd8_3d3237da-c856-4707-956e-2a25e381506e/cert-manager-controller/0.log" Oct 07 15:14:09 crc kubenswrapper[4854]: I1007 15:14:09.444752 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7d9f95dbf-8gdkc_17874387-19ac-41f2-b359-1407fb8f09dc/cert-manager-cainjector/0.log" Oct 07 15:14:09 crc kubenswrapper[4854]: I1007 15:14:09.482620 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-d969966f-qgclp_33fd4b49-dcb7-4ebd-a5ca-15ba83c90e3a/cert-manager-webhook/0.log" Oct 07 15:14:21 crc kubenswrapper[4854]: I1007 15:14:21.468471 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-h2m6m_20300568-d59a-4c2e-9311-56d1c20419c2/nmstate-console-plugin/0.log" Oct 07 15:14:21 crc kubenswrapper[4854]: I1007 15:14:21.580002 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-xx54b_173b0967-6286-4bff-94b6-b77502d681c3/nmstate-handler/0.log" Oct 07 15:14:21 crc kubenswrapper[4854]: I1007 15:14:21.649652 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-gs6dc_2beeb70d-afc7-4e87-b8a3-3d0398207f60/kube-rbac-proxy/0.log" Oct 07 15:14:21 crc kubenswrapper[4854]: I1007 15:14:21.703441 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-gs6dc_2beeb70d-afc7-4e87-b8a3-3d0398207f60/nmstate-metrics/0.log" Oct 07 15:14:21 crc kubenswrapper[4854]: I1007 15:14:21.820722 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-jp5sx_c77584c6-4cc6-4e7b-89b4-368d9a8775f5/nmstate-operator/0.log" Oct 07 15:14:21 crc kubenswrapper[4854]: I1007 15:14:21.858992 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-4zcnr_b7917a01-6d98-45d9-b647-605da890ce27/nmstate-webhook/0.log" Oct 07 15:14:36 crc kubenswrapper[4854]: I1007 15:14:36.955433 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-8szdq_660bca3f-29ea-4ead-a16f-6287792ef72b/kube-rbac-proxy/0.log" Oct 07 15:14:37 crc kubenswrapper[4854]: I1007 15:14:37.229028 4854 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-9d94j_e36001a5-79ba-4f9b-9e5a-012494f505f7/cp-frr-files/0.log" Oct 07 15:14:37 crc kubenswrapper[4854]: I1007 15:14:37.411323 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9d94j_e36001a5-79ba-4f9b-9e5a-012494f505f7/cp-frr-files/0.log" Oct 07 15:14:37 crc kubenswrapper[4854]: I1007 15:14:37.446912 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9d94j_e36001a5-79ba-4f9b-9e5a-012494f505f7/cp-reloader/0.log" Oct 07 15:14:37 crc kubenswrapper[4854]: I1007 15:14:37.511771 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9d94j_e36001a5-79ba-4f9b-9e5a-012494f505f7/cp-metrics/0.log" Oct 07 15:14:37 crc kubenswrapper[4854]: I1007 15:14:37.571542 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-8szdq_660bca3f-29ea-4ead-a16f-6287792ef72b/controller/0.log" Oct 07 15:14:37 crc kubenswrapper[4854]: I1007 15:14:37.600641 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9d94j_e36001a5-79ba-4f9b-9e5a-012494f505f7/cp-reloader/0.log" Oct 07 15:14:37 crc kubenswrapper[4854]: I1007 15:14:37.843065 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9d94j_e36001a5-79ba-4f9b-9e5a-012494f505f7/cp-reloader/0.log" Oct 07 15:14:37 crc kubenswrapper[4854]: I1007 15:14:37.847204 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9d94j_e36001a5-79ba-4f9b-9e5a-012494f505f7/cp-metrics/0.log" Oct 07 15:14:37 crc kubenswrapper[4854]: I1007 15:14:37.878813 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9d94j_e36001a5-79ba-4f9b-9e5a-012494f505f7/cp-metrics/0.log" Oct 07 15:14:37 crc kubenswrapper[4854]: I1007 15:14:37.884912 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9d94j_e36001a5-79ba-4f9b-9e5a-012494f505f7/cp-frr-files/0.log" Oct 07 15:14:38 crc kubenswrapper[4854]: I1007 15:14:38.085582 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9d94j_e36001a5-79ba-4f9b-9e5a-012494f505f7/cp-reloader/0.log" Oct 07 15:14:38 crc kubenswrapper[4854]: I1007 15:14:38.109791 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9d94j_e36001a5-79ba-4f9b-9e5a-012494f505f7/cp-metrics/0.log" Oct 07 15:14:38 crc kubenswrapper[4854]: I1007 15:14:38.118788 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9d94j_e36001a5-79ba-4f9b-9e5a-012494f505f7/controller/0.log" Oct 07 15:14:38 crc kubenswrapper[4854]: I1007 15:14:38.124305 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9d94j_e36001a5-79ba-4f9b-9e5a-012494f505f7/cp-frr-files/0.log" Oct 07 15:14:38 crc kubenswrapper[4854]: I1007 15:14:38.331959 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9d94j_e36001a5-79ba-4f9b-9e5a-012494f505f7/frr-metrics/0.log" Oct 07 15:14:38 crc kubenswrapper[4854]: I1007 15:14:38.340796 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9d94j_e36001a5-79ba-4f9b-9e5a-012494f505f7/kube-rbac-proxy-frr/0.log" Oct 07 15:14:38 crc kubenswrapper[4854]: I1007 15:14:38.376140 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9d94j_e36001a5-79ba-4f9b-9e5a-012494f505f7/kube-rbac-proxy/0.log" Oct 07 15:14:38 crc 
kubenswrapper[4854]: I1007 15:14:38.515656 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9d94j_e36001a5-79ba-4f9b-9e5a-012494f505f7/reloader/0.log" Oct 07 15:14:38 crc kubenswrapper[4854]: I1007 15:14:38.591286 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-dsv2d_7dd0aa9d-7e88-4764-8cea-332b935e11ea/frr-k8s-webhook-server/0.log" Oct 07 15:14:38 crc kubenswrapper[4854]: I1007 15:14:38.878717 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-86f7bb879c-vrdbp_182c4b25-b992-4471-9264-fe61313b869d/manager/0.log" Oct 07 15:14:38 crc kubenswrapper[4854]: I1007 15:14:38.992294 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-6b7dfdfc77-wf276_1cb7ac39-9e71-4037-990d-1672438371c7/webhook-server/0.log" Oct 07 15:14:39 crc kubenswrapper[4854]: I1007 15:14:39.171573 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-w4gkv_2206fc76-d0a8-4943-9efe-5378e7ee73f6/kube-rbac-proxy/0.log" Oct 07 15:14:41 crc kubenswrapper[4854]: I1007 15:14:41.035745 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-w4gkv_2206fc76-d0a8-4943-9efe-5378e7ee73f6/speaker/0.log" Oct 07 15:14:42 crc kubenswrapper[4854]: I1007 15:14:42.062977 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9d94j_e36001a5-79ba-4f9b-9e5a-012494f505f7/frr/0.log" Oct 07 15:14:53 crc kubenswrapper[4854]: I1007 15:14:53.751121 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69kdrmz_718dcecb-56b4-49cc-8992-d8d4c6447602/util/0.log" Oct 07 15:14:53 crc kubenswrapper[4854]: I1007 15:14:53.931292 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69kdrmz_718dcecb-56b4-49cc-8992-d8d4c6447602/pull/0.log" Oct 07 15:14:53 crc kubenswrapper[4854]: I1007 15:14:53.963896 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69kdrmz_718dcecb-56b4-49cc-8992-d8d4c6447602/pull/0.log" Oct 07 15:14:53 crc kubenswrapper[4854]: I1007 15:14:53.967293 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69kdrmz_718dcecb-56b4-49cc-8992-d8d4c6447602/util/0.log" Oct 07 15:14:54 crc kubenswrapper[4854]: I1007 15:14:54.093161 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69kdrmz_718dcecb-56b4-49cc-8992-d8d4c6447602/util/0.log" Oct 07 15:14:54 crc kubenswrapper[4854]: I1007 15:14:54.143960 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69kdrmz_718dcecb-56b4-49cc-8992-d8d4c6447602/extract/0.log" Oct 07 15:14:54 crc kubenswrapper[4854]: I1007 15:14:54.185423 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69kdrmz_718dcecb-56b4-49cc-8992-d8d4c6447602/pull/0.log" Oct 07 15:14:54 crc kubenswrapper[4854]: I1007 15:14:54.315279 4854 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bnct6_ee4688b0-4708-49b7-8e15-a18a343a9a98/util/0.log" Oct 07 15:14:54 crc kubenswrapper[4854]: I1007 15:14:54.526815 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bnct6_ee4688b0-4708-49b7-8e15-a18a343a9a98/pull/0.log" Oct 07 15:14:54 crc kubenswrapper[4854]: I1007 15:14:54.527051 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bnct6_ee4688b0-4708-49b7-8e15-a18a343a9a98/pull/0.log" Oct 07 15:14:54 crc kubenswrapper[4854]: I1007 15:14:54.529425 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bnct6_ee4688b0-4708-49b7-8e15-a18a343a9a98/util/0.log" Oct 07 15:14:54 crc kubenswrapper[4854]: I1007 15:14:54.694525 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bnct6_ee4688b0-4708-49b7-8e15-a18a343a9a98/util/0.log" Oct 07 15:14:54 crc kubenswrapper[4854]: I1007 15:14:54.717949 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bnct6_ee4688b0-4708-49b7-8e15-a18a343a9a98/pull/0.log" Oct 07 15:14:54 crc kubenswrapper[4854]: I1007 15:14:54.745918 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bnct6_ee4688b0-4708-49b7-8e15-a18a343a9a98/extract/0.log" Oct 07 15:14:54 crc kubenswrapper[4854]: I1007 15:14:54.902344 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcwl64_8b37d062-3006-456c-8cea-ca674cc3ce32/util/0.log" Oct 07 15:14:55 crc kubenswrapper[4854]: I1007 15:14:55.039435 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcwl64_8b37d062-3006-456c-8cea-ca674cc3ce32/util/0.log" Oct 07 15:14:55 crc kubenswrapper[4854]: I1007 15:14:55.057783 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcwl64_8b37d062-3006-456c-8cea-ca674cc3ce32/pull/0.log" Oct 07 15:14:55 crc kubenswrapper[4854]: I1007 15:14:55.061510 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcwl64_8b37d062-3006-456c-8cea-ca674cc3ce32/pull/0.log" Oct 07 15:14:55 crc kubenswrapper[4854]: I1007 15:14:55.232002 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcwl64_8b37d062-3006-456c-8cea-ca674cc3ce32/util/0.log" Oct 07 15:14:55 crc kubenswrapper[4854]: I1007 15:14:55.232831 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcwl64_8b37d062-3006-456c-8cea-ca674cc3ce32/pull/0.log" Oct 07 15:14:55 crc kubenswrapper[4854]: I1007 15:14:55.255253 4854 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcwl64_8b37d062-3006-456c-8cea-ca674cc3ce32/extract/0.log" Oct 07 15:14:55 crc kubenswrapper[4854]: I1007 15:14:55.385055 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rc62w_4a57db85-5ef6-44f4-9265-965e2626a116/extract-utilities/0.log" Oct 07 15:14:55 crc kubenswrapper[4854]: I1007 15:14:55.608597 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rc62w_4a57db85-5ef6-44f4-9265-965e2626a116/extract-utilities/0.log" Oct 07 15:14:55 crc kubenswrapper[4854]: I1007 15:14:55.615427 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rc62w_4a57db85-5ef6-44f4-9265-965e2626a116/extract-content/0.log" Oct 07 15:14:55 crc kubenswrapper[4854]: I1007 15:14:55.626994 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rc62w_4a57db85-5ef6-44f4-9265-965e2626a116/extract-content/0.log" Oct 07 15:14:55 crc kubenswrapper[4854]: I1007 15:14:55.769593 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rc62w_4a57db85-5ef6-44f4-9265-965e2626a116/extract-content/0.log" Oct 07 15:14:55 crc kubenswrapper[4854]: I1007 15:14:55.781438 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rc62w_4a57db85-5ef6-44f4-9265-965e2626a116/extract-utilities/0.log" Oct 07 15:14:55 crc kubenswrapper[4854]: I1007 15:14:55.982450 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rc62w_4a57db85-5ef6-44f4-9265-965e2626a116/registry-server/0.log" Oct 07 15:14:55 crc kubenswrapper[4854]: I1007 15:14:55.998415 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ttrh6_539f954c-bee8-4661-af57-1ba452d3dddb/extract-utilities/0.log" Oct 07 15:14:56 crc kubenswrapper[4854]: I1007 15:14:56.290520 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ttrh6_539f954c-bee8-4661-af57-1ba452d3dddb/extract-content/0.log" Oct 07 15:14:56 crc kubenswrapper[4854]: I1007 15:14:56.290674 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ttrh6_539f954c-bee8-4661-af57-1ba452d3dddb/extract-utilities/0.log" Oct 07 15:14:56 crc kubenswrapper[4854]: I1007 15:14:56.297839 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ttrh6_539f954c-bee8-4661-af57-1ba452d3dddb/extract-content/0.log" Oct 07 15:14:56 crc kubenswrapper[4854]: I1007 15:14:56.434867 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ttrh6_539f954c-bee8-4661-af57-1ba452d3dddb/extract-utilities/0.log" Oct 07 15:14:56 crc kubenswrapper[4854]: I1007 15:14:56.478646 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ttrh6_539f954c-bee8-4661-af57-1ba452d3dddb/extract-content/0.log" Oct 07 15:14:56 crc kubenswrapper[4854]: I1007 15:14:56.695122 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nz6f_6153b740-17ba-4004-8da8-002b8717dcc2/util/0.log" Oct 07 15:14:56 crc kubenswrapper[4854]: I1007 15:14:56.830917 4854 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nz6f_6153b740-17ba-4004-8da8-002b8717dcc2/util/0.log" Oct 07 15:14:56 crc kubenswrapper[4854]: I1007 15:14:56.866780 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nz6f_6153b740-17ba-4004-8da8-002b8717dcc2/pull/0.log" Oct 07 15:14:56 crc kubenswrapper[4854]: I1007 15:14:56.905774 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nz6f_6153b740-17ba-4004-8da8-002b8717dcc2/pull/0.log" Oct 07 15:14:57 crc kubenswrapper[4854]: I1007 15:14:57.175386 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nz6f_6153b740-17ba-4004-8da8-002b8717dcc2/extract/0.log" Oct 07 15:14:57 crc kubenswrapper[4854]: I1007 15:14:57.200973 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nz6f_6153b740-17ba-4004-8da8-002b8717dcc2/pull/0.log" Oct 07 15:14:57 crc kubenswrapper[4854]: I1007 15:14:57.215355 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nz6f_6153b740-17ba-4004-8da8-002b8717dcc2/util/0.log" Oct 07 15:14:57 crc kubenswrapper[4854]: I1007 15:14:57.391290 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fddbx_7078ad02-317f-4f7f-a11b-7c1d24adac56/extract-utilities/0.log" Oct 07 15:14:57 crc kubenswrapper[4854]: I1007 15:14:57.406094 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-bplpj_edab8120-5f75-4055-8538-bba0045cd1f2/marketplace-operator/0.log" Oct 07 15:14:57 crc kubenswrapper[4854]: I1007 15:14:57.655007 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fddbx_7078ad02-317f-4f7f-a11b-7c1d24adac56/extract-utilities/0.log" Oct 07 15:14:57 crc kubenswrapper[4854]: I1007 15:14:57.691489 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fddbx_7078ad02-317f-4f7f-a11b-7c1d24adac56/extract-content/0.log" Oct 07 15:14:57 crc kubenswrapper[4854]: I1007 15:14:57.698441 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fddbx_7078ad02-317f-4f7f-a11b-7c1d24adac56/extract-content/0.log" Oct 07 15:14:57 crc kubenswrapper[4854]: I1007 15:14:57.930823 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fddbx_7078ad02-317f-4f7f-a11b-7c1d24adac56/extract-content/0.log" Oct 07 15:14:57 crc kubenswrapper[4854]: I1007 15:14:57.953593 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fddbx_7078ad02-317f-4f7f-a11b-7c1d24adac56/extract-utilities/0.log" Oct 07 15:14:58 crc kubenswrapper[4854]: I1007 15:14:58.178816 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qtms9_9299714f-83d3-487f-9173-f750d4b5f185/extract-utilities/0.log" Oct 07 15:14:58 crc kubenswrapper[4854]: I1007 15:14:58.332431 4854 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-qtms9_9299714f-83d3-487f-9173-f750d4b5f185/extract-content/0.log" Oct 07 15:14:58 crc kubenswrapper[4854]: I1007 15:14:58.375952 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qtms9_9299714f-83d3-487f-9173-f750d4b5f185/extract-content/0.log" Oct 07 15:14:58 crc kubenswrapper[4854]: I1007 15:14:58.415671 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fddbx_7078ad02-317f-4f7f-a11b-7c1d24adac56/registry-server/0.log" Oct 07 15:14:58 crc kubenswrapper[4854]: I1007 15:14:58.425246 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qtms9_9299714f-83d3-487f-9173-f750d4b5f185/extract-utilities/0.log" Oct 07 15:14:58 crc kubenswrapper[4854]: I1007 15:14:58.559761 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qtms9_9299714f-83d3-487f-9173-f750d4b5f185/extract-content/0.log" Oct 07 15:14:58 crc kubenswrapper[4854]: I1007 15:14:58.565027 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qtms9_9299714f-83d3-487f-9173-f750d4b5f185/extract-utilities/0.log" Oct 07 15:14:58 crc kubenswrapper[4854]: I1007 15:14:58.775443 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ttrh6_539f954c-bee8-4661-af57-1ba452d3dddb/registry-server/0.log" Oct 07 15:14:59 crc kubenswrapper[4854]: I1007 15:14:59.217935 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qtms9_9299714f-83d3-487f-9173-f750d4b5f185/registry-server/0.log" Oct 07 15:15:00 crc kubenswrapper[4854]: I1007 15:15:00.182358 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330835-wv9s7"] Oct 07 15:15:00 crc kubenswrapper[4854]: E1007 15:15:00.183090 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b235b828-28b3-4ed7-b5f9-c6665a5bae56" containerName="container-00" Oct 07 15:15:00 crc kubenswrapper[4854]: I1007 15:15:00.183112 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="b235b828-28b3-4ed7-b5f9-c6665a5bae56" containerName="container-00" Oct 07 15:15:00 crc kubenswrapper[4854]: I1007 15:15:00.183397 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="b235b828-28b3-4ed7-b5f9-c6665a5bae56" containerName="container-00" Oct 07 15:15:00 crc kubenswrapper[4854]: I1007 15:15:00.184174 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330835-wv9s7" Oct 07 15:15:00 crc kubenswrapper[4854]: I1007 15:15:00.186901 4854 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 07 15:15:00 crc kubenswrapper[4854]: I1007 15:15:00.187126 4854 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 07 15:15:00 crc kubenswrapper[4854]: I1007 15:15:00.205328 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330835-wv9s7"] Oct 07 15:15:00 crc kubenswrapper[4854]: I1007 15:15:00.275706 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b2961b4d-2fb1-4a37-a001-bfc54322a7ee-config-volume\") pod \"collect-profiles-29330835-wv9s7\" (UID: \"b2961b4d-2fb1-4a37-a001-bfc54322a7ee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330835-wv9s7" Oct 07 15:15:00 crc kubenswrapper[4854]: I1007 15:15:00.275749 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clgv7\" (UniqueName: \"kubernetes.io/projected/b2961b4d-2fb1-4a37-a001-bfc54322a7ee-kube-api-access-clgv7\") pod \"collect-profiles-29330835-wv9s7\" (UID: \"b2961b4d-2fb1-4a37-a001-bfc54322a7ee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330835-wv9s7" Oct 07 15:15:00 crc kubenswrapper[4854]: I1007 15:15:00.275952 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b2961b4d-2fb1-4a37-a001-bfc54322a7ee-secret-volume\") pod \"collect-profiles-29330835-wv9s7\" (UID: \"b2961b4d-2fb1-4a37-a001-bfc54322a7ee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330835-wv9s7" Oct 07 15:15:00 crc kubenswrapper[4854]: I1007 15:15:00.377581 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b2961b4d-2fb1-4a37-a001-bfc54322a7ee-secret-volume\") pod \"collect-profiles-29330835-wv9s7\" (UID: \"b2961b4d-2fb1-4a37-a001-bfc54322a7ee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330835-wv9s7" Oct 07 15:15:00 crc kubenswrapper[4854]: I1007 15:15:00.377674 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b2961b4d-2fb1-4a37-a001-bfc54322a7ee-config-volume\") pod \"collect-profiles-29330835-wv9s7\" (UID: \"b2961b4d-2fb1-4a37-a001-bfc54322a7ee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330835-wv9s7" Oct 07 15:15:00 crc kubenswrapper[4854]: I1007 15:15:00.377695 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clgv7\" (UniqueName: \"kubernetes.io/projected/b2961b4d-2fb1-4a37-a001-bfc54322a7ee-kube-api-access-clgv7\") pod \"collect-profiles-29330835-wv9s7\" (UID: \"b2961b4d-2fb1-4a37-a001-bfc54322a7ee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330835-wv9s7" Oct 07 15:15:00 crc kubenswrapper[4854]: I1007 15:15:00.378812 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b2961b4d-2fb1-4a37-a001-bfc54322a7ee-config-volume\") pod 
\"collect-profiles-29330835-wv9s7\" (UID: \"b2961b4d-2fb1-4a37-a001-bfc54322a7ee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330835-wv9s7" Oct 07 15:15:00 crc kubenswrapper[4854]: I1007 15:15:00.939549 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b2961b4d-2fb1-4a37-a001-bfc54322a7ee-secret-volume\") pod \"collect-profiles-29330835-wv9s7\" (UID: \"b2961b4d-2fb1-4a37-a001-bfc54322a7ee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330835-wv9s7" Oct 07 15:15:00 crc kubenswrapper[4854]: I1007 15:15:00.939838 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clgv7\" (UniqueName: \"kubernetes.io/projected/b2961b4d-2fb1-4a37-a001-bfc54322a7ee-kube-api-access-clgv7\") pod \"collect-profiles-29330835-wv9s7\" (UID: \"b2961b4d-2fb1-4a37-a001-bfc54322a7ee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330835-wv9s7" Oct 07 15:15:01 crc kubenswrapper[4854]: I1007 15:15:01.112911 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330835-wv9s7" Oct 07 15:15:02 crc kubenswrapper[4854]: I1007 15:15:02.069956 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330835-wv9s7"] Oct 07 15:15:02 crc kubenswrapper[4854]: I1007 15:15:02.771380 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330835-wv9s7" event={"ID":"b2961b4d-2fb1-4a37-a001-bfc54322a7ee","Type":"ContainerStarted","Data":"1e05778643f6ed7b3b538300cc1c532c3639e0d29498bb12ca20c2b18f8864d4"} Oct 07 15:15:02 crc kubenswrapper[4854]: I1007 15:15:02.771744 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330835-wv9s7" event={"ID":"b2961b4d-2fb1-4a37-a001-bfc54322a7ee","Type":"ContainerStarted","Data":"79ba92dc72c89628fbb5bc73c1cde92ad8b800ebc712ce0ce9a6a7662202d011"} Oct 07 15:15:02 crc kubenswrapper[4854]: I1007 15:15:02.792933 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29330835-wv9s7" podStartSLOduration=2.7929129059999998 podStartE2EDuration="2.792912906s" podCreationTimestamp="2025-10-07 15:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 15:15:02.786371036 +0000 UTC m=+10218.774203301" watchObservedRunningTime="2025-10-07 15:15:02.792912906 +0000 UTC m=+10218.780745171" Oct 07 15:15:03 crc kubenswrapper[4854]: I1007 15:15:03.783601 4854 generic.go:334] "Generic (PLEG): container finished" podID="b2961b4d-2fb1-4a37-a001-bfc54322a7ee" containerID="1e05778643f6ed7b3b538300cc1c532c3639e0d29498bb12ca20c2b18f8864d4" exitCode=0 Oct 07 15:15:03 crc kubenswrapper[4854]: I1007 15:15:03.783732 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330835-wv9s7" event={"ID":"b2961b4d-2fb1-4a37-a001-bfc54322a7ee","Type":"ContainerDied","Data":"1e05778643f6ed7b3b538300cc1c532c3639e0d29498bb12ca20c2b18f8864d4"} Oct 07 15:15:05 crc kubenswrapper[4854]: I1007 15:15:05.220277 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330835-wv9s7" Oct 07 15:15:05 crc kubenswrapper[4854]: I1007 15:15:05.325475 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b2961b4d-2fb1-4a37-a001-bfc54322a7ee-config-volume\") pod \"b2961b4d-2fb1-4a37-a001-bfc54322a7ee\" (UID: \"b2961b4d-2fb1-4a37-a001-bfc54322a7ee\") " Oct 07 15:15:05 crc kubenswrapper[4854]: I1007 15:15:05.325662 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b2961b4d-2fb1-4a37-a001-bfc54322a7ee-secret-volume\") pod \"b2961b4d-2fb1-4a37-a001-bfc54322a7ee\" (UID: \"b2961b4d-2fb1-4a37-a001-bfc54322a7ee\") " Oct 07 15:15:05 crc kubenswrapper[4854]: I1007 15:15:05.325736 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clgv7\" (UniqueName: \"kubernetes.io/projected/b2961b4d-2fb1-4a37-a001-bfc54322a7ee-kube-api-access-clgv7\") pod \"b2961b4d-2fb1-4a37-a001-bfc54322a7ee\" (UID: \"b2961b4d-2fb1-4a37-a001-bfc54322a7ee\") " Oct 07 15:15:05 crc kubenswrapper[4854]: I1007 15:15:05.326811 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2961b4d-2fb1-4a37-a001-bfc54322a7ee-config-volume" (OuterVolumeSpecName: "config-volume") pod "b2961b4d-2fb1-4a37-a001-bfc54322a7ee" (UID: "b2961b4d-2fb1-4a37-a001-bfc54322a7ee"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 15:15:05 crc kubenswrapper[4854]: I1007 15:15:05.326950 4854 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b2961b4d-2fb1-4a37-a001-bfc54322a7ee-config-volume\") on node \"crc\" DevicePath \"\"" Oct 07 15:15:05 crc kubenswrapper[4854]: I1007 15:15:05.334020 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2961b4d-2fb1-4a37-a001-bfc54322a7ee-kube-api-access-clgv7" (OuterVolumeSpecName: "kube-api-access-clgv7") pod "b2961b4d-2fb1-4a37-a001-bfc54322a7ee" (UID: "b2961b4d-2fb1-4a37-a001-bfc54322a7ee"). InnerVolumeSpecName "kube-api-access-clgv7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 15:15:05 crc kubenswrapper[4854]: I1007 15:15:05.335109 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2961b4d-2fb1-4a37-a001-bfc54322a7ee-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b2961b4d-2fb1-4a37-a001-bfc54322a7ee" (UID: "b2961b4d-2fb1-4a37-a001-bfc54322a7ee"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 15:15:05 crc kubenswrapper[4854]: I1007 15:15:05.429130 4854 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b2961b4d-2fb1-4a37-a001-bfc54322a7ee-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 07 15:15:05 crc kubenswrapper[4854]: I1007 15:15:05.429196 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clgv7\" (UniqueName: \"kubernetes.io/projected/b2961b4d-2fb1-4a37-a001-bfc54322a7ee-kube-api-access-clgv7\") on node \"crc\" DevicePath \"\"" Oct 07 15:15:05 crc kubenswrapper[4854]: I1007 15:15:05.803384 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330835-wv9s7" event={"ID":"b2961b4d-2fb1-4a37-a001-bfc54322a7ee","Type":"ContainerDied","Data":"79ba92dc72c89628fbb5bc73c1cde92ad8b800ebc712ce0ce9a6a7662202d011"} Oct 07 15:15:05 crc kubenswrapper[4854]: I1007 15:15:05.803423 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330835-wv9s7" Oct 07 15:15:05 crc kubenswrapper[4854]: I1007 15:15:05.803422 4854 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79ba92dc72c89628fbb5bc73c1cde92ad8b800ebc712ce0ce9a6a7662202d011" Oct 07 15:15:06 crc kubenswrapper[4854]: I1007 15:15:06.295281 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330790-cnh9t"] Oct 07 15:15:06 crc kubenswrapper[4854]: I1007 15:15:06.308029 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330790-cnh9t"] Oct 07 15:15:06 crc kubenswrapper[4854]: I1007 15:15:06.714645 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="daac9c87-5b60-4b84-9328-86291ae2ce3c" path="/var/lib/kubelet/pods/daac9c87-5b60-4b84-9328-86291ae2ce3c/volumes" Oct 07 15:15:12 crc kubenswrapper[4854]: I1007 15:15:12.038584 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-7c8cf85677-s6sn8_1dd6e29f-49fc-4584-9861-48432332df45/prometheus-operator/0.log" Oct 07 15:15:12 crc kubenswrapper[4854]: I1007 15:15:12.837773 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-895d8d84b-kv5q9_561c80f6-2f43-4922-8413-9f7ae61b0814/prometheus-operator-admission-webhook/0.log" Oct 07 15:15:12 crc kubenswrapper[4854]: I1007 15:15:12.884500 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-895d8d84b-ln6ld_3675a20e-0013-48cc-9adb-7baf69a36ac4/prometheus-operator-admission-webhook/0.log" Oct 07 15:15:13 crc kubenswrapper[4854]: I1007 15:15:13.046495 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-cc5f78dfc-2mdwc_1e1a5094-5947-48c7-ad37-e5e8de5c8459/operator/0.log" Oct 07 15:15:13 crc kubenswrapper[4854]: I1007 15:15:13.125440 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-54bc95c9fb-9h565_5a39c021-c124-4cc5-a90b-1a86531f6143/perses-operator/0.log" Oct 07 15:15:29 crc kubenswrapper[4854]: I1007 15:15:29.946652 4854 scope.go:117] "RemoveContainer" containerID="52cc3c9d1bcc4b7422e291044ef27c5f029123fabb04f2236ecc6b70f5dbc74d" Oct 07 15:15:39 crc kubenswrapper[4854]: E1007 
15:15:39.422801 4854 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.243:34312->38.102.83.243:43445: write tcp 38.102.83.243:34312->38.102.83.243:43445: write: broken pipe Oct 07 15:15:40 crc kubenswrapper[4854]: I1007 15:15:40.810623 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 15:15:40 crc kubenswrapper[4854]: I1007 15:15:40.811006 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 15:15:44 crc kubenswrapper[4854]: E1007 15:15:44.377395 4854 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.243:57254->38.102.83.243:43445: read tcp 38.102.83.243:57254->38.102.83.243:43445: read: connection reset by peer Oct 07 15:15:44 crc kubenswrapper[4854]: E1007 15:15:44.379300 4854 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.243:57254->38.102.83.243:43445: write tcp 38.102.83.243:57254->38.102.83.243:43445: write: broken pipe Oct 07 15:16:10 crc kubenswrapper[4854]: I1007 15:16:10.807761 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 15:16:10 crc kubenswrapper[4854]: I1007 15:16:10.808279 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 15:16:30 crc kubenswrapper[4854]: I1007 15:16:30.025643 4854 scope.go:117] "RemoveContainer" containerID="37ead8063b67b89ec34d379de1b368a4aea75cb313dd4dcd9dca44a01d5a82dd" Oct 07 15:16:40 crc kubenswrapper[4854]: I1007 15:16:40.808132 4854 patch_prober.go:28] interesting pod/machine-config-daemon-vbjnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 15:16:40 crc kubenswrapper[4854]: I1007 15:16:40.808900 4854 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 15:16:40 crc kubenswrapper[4854]: I1007 15:16:40.808959 4854 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" Oct 07 15:16:40 crc kubenswrapper[4854]: I1007 15:16:40.810186 4854 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"c98097bfda43ae85e4184a4e0b7522f850f1927291ab2449b125477b5dc770e9"} pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 15:16:40 crc kubenswrapper[4854]: I1007 15:16:40.810342 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerName="machine-config-daemon" containerID="cri-o://c98097bfda43ae85e4184a4e0b7522f850f1927291ab2449b125477b5dc770e9" gracePeriod=600 Oct 07 15:16:40 crc kubenswrapper[4854]: E1007 15:16:40.941846 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 15:16:41 crc kubenswrapper[4854]: I1007 15:16:41.941120 4854 generic.go:334] "Generic (PLEG): container finished" podID="40b8b82d-cfd5-41d7-8673-5774db092c85" containerID="c98097bfda43ae85e4184a4e0b7522f850f1927291ab2449b125477b5dc770e9" exitCode=0 Oct 07 15:16:41 crc kubenswrapper[4854]: I1007 15:16:41.941878 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" event={"ID":"40b8b82d-cfd5-41d7-8673-5774db092c85","Type":"ContainerDied","Data":"c98097bfda43ae85e4184a4e0b7522f850f1927291ab2449b125477b5dc770e9"} Oct 07 15:16:41 crc kubenswrapper[4854]: I1007 15:16:41.941957 4854 scope.go:117] "RemoveContainer" containerID="6f17d4c8ac8a72864ae50a4f170092990db3a5ea9c1c86e99d2e15789436580e" Oct 07 15:16:41 crc kubenswrapper[4854]: I1007 15:16:41.943804 4854 scope.go:117] "RemoveContainer" containerID="c98097bfda43ae85e4184a4e0b7522f850f1927291ab2449b125477b5dc770e9" Oct 07 15:16:41 crc kubenswrapper[4854]: E1007 15:16:41.944508 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 15:16:55 crc kubenswrapper[4854]: I1007 15:16:55.703316 4854 scope.go:117] "RemoveContainer" containerID="c98097bfda43ae85e4184a4e0b7522f850f1927291ab2449b125477b5dc770e9" Oct 07 15:16:55 crc kubenswrapper[4854]: E1007 15:16:55.705700 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 15:16:57 crc kubenswrapper[4854]: I1007 15:16:57.131101 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7f8rr"] Oct 07 15:16:57 crc kubenswrapper[4854]: E1007 15:16:57.137972 4854 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="b2961b4d-2fb1-4a37-a001-bfc54322a7ee" containerName="collect-profiles" Oct 07 15:16:57 crc kubenswrapper[4854]: I1007 15:16:57.138010 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2961b4d-2fb1-4a37-a001-bfc54322a7ee" containerName="collect-profiles" Oct 07 15:16:57 crc kubenswrapper[4854]: I1007 15:16:57.138451 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2961b4d-2fb1-4a37-a001-bfc54322a7ee" containerName="collect-profiles" Oct 07 15:16:57 crc kubenswrapper[4854]: I1007 15:16:57.140522 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7f8rr" Oct 07 15:16:57 crc kubenswrapper[4854]: I1007 15:16:57.151646 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7f8rr"] Oct 07 15:16:57 crc kubenswrapper[4854]: I1007 15:16:57.250979 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67498bab-a509-4a66-b955-40ccf9bbba66-utilities\") pod \"redhat-operators-7f8rr\" (UID: \"67498bab-a509-4a66-b955-40ccf9bbba66\") " pod="openshift-marketplace/redhat-operators-7f8rr" Oct 07 15:16:57 crc kubenswrapper[4854]: I1007 15:16:57.251035 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nf7ct\" (UniqueName: \"kubernetes.io/projected/67498bab-a509-4a66-b955-40ccf9bbba66-kube-api-access-nf7ct\") pod \"redhat-operators-7f8rr\" (UID: \"67498bab-a509-4a66-b955-40ccf9bbba66\") " pod="openshift-marketplace/redhat-operators-7f8rr" Oct 07 15:16:57 crc kubenswrapper[4854]: I1007 15:16:57.251100 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67498bab-a509-4a66-b955-40ccf9bbba66-catalog-content\") pod \"redhat-operators-7f8rr\" (UID: \"67498bab-a509-4a66-b955-40ccf9bbba66\") " pod="openshift-marketplace/redhat-operators-7f8rr" Oct 07 15:16:57 crc kubenswrapper[4854]: I1007 15:16:57.352981 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67498bab-a509-4a66-b955-40ccf9bbba66-utilities\") pod \"redhat-operators-7f8rr\" (UID: \"67498bab-a509-4a66-b955-40ccf9bbba66\") " pod="openshift-marketplace/redhat-operators-7f8rr" Oct 07 15:16:57 crc kubenswrapper[4854]: I1007 15:16:57.353064 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nf7ct\" (UniqueName: \"kubernetes.io/projected/67498bab-a509-4a66-b955-40ccf9bbba66-kube-api-access-nf7ct\") pod \"redhat-operators-7f8rr\" (UID: \"67498bab-a509-4a66-b955-40ccf9bbba66\") " pod="openshift-marketplace/redhat-operators-7f8rr" Oct 07 15:16:57 crc kubenswrapper[4854]: I1007 15:16:57.353187 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67498bab-a509-4a66-b955-40ccf9bbba66-catalog-content\") pod \"redhat-operators-7f8rr\" (UID: \"67498bab-a509-4a66-b955-40ccf9bbba66\") " pod="openshift-marketplace/redhat-operators-7f8rr" Oct 07 15:16:57 crc kubenswrapper[4854]: I1007 15:16:57.353658 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67498bab-a509-4a66-b955-40ccf9bbba66-utilities\") pod \"redhat-operators-7f8rr\" (UID: 
\"67498bab-a509-4a66-b955-40ccf9bbba66\") " pod="openshift-marketplace/redhat-operators-7f8rr" Oct 07 15:16:57 crc kubenswrapper[4854]: I1007 15:16:57.353818 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67498bab-a509-4a66-b955-40ccf9bbba66-catalog-content\") pod \"redhat-operators-7f8rr\" (UID: \"67498bab-a509-4a66-b955-40ccf9bbba66\") " pod="openshift-marketplace/redhat-operators-7f8rr" Oct 07 15:16:57 crc kubenswrapper[4854]: I1007 15:16:57.377121 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nf7ct\" (UniqueName: \"kubernetes.io/projected/67498bab-a509-4a66-b955-40ccf9bbba66-kube-api-access-nf7ct\") pod \"redhat-operators-7f8rr\" (UID: \"67498bab-a509-4a66-b955-40ccf9bbba66\") " pod="openshift-marketplace/redhat-operators-7f8rr" Oct 07 15:16:57 crc kubenswrapper[4854]: I1007 15:16:57.481704 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7f8rr" Oct 07 15:16:57 crc kubenswrapper[4854]: I1007 15:16:57.959577 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7f8rr"] Oct 07 15:16:58 crc kubenswrapper[4854]: I1007 15:16:58.162842 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7f8rr" event={"ID":"67498bab-a509-4a66-b955-40ccf9bbba66","Type":"ContainerStarted","Data":"5d3da1826ce11834dfd24defe1d4ce32028df835eb0948595d8d286777344ae4"} Oct 07 15:16:59 crc kubenswrapper[4854]: I1007 15:16:59.175027 4854 generic.go:334] "Generic (PLEG): container finished" podID="67498bab-a509-4a66-b955-40ccf9bbba66" containerID="a3f2afbb56e10c6fa35064ea378e153ea399e45bc3d7c00ea516490d1529e89a" exitCode=0 Oct 07 15:16:59 crc kubenswrapper[4854]: I1007 15:16:59.175267 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7f8rr" event={"ID":"67498bab-a509-4a66-b955-40ccf9bbba66","Type":"ContainerDied","Data":"a3f2afbb56e10c6fa35064ea378e153ea399e45bc3d7c00ea516490d1529e89a"} Oct 07 15:16:59 crc kubenswrapper[4854]: I1007 15:16:59.178074 4854 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 15:17:01 crc kubenswrapper[4854]: I1007 15:17:01.207348 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7f8rr" event={"ID":"67498bab-a509-4a66-b955-40ccf9bbba66","Type":"ContainerStarted","Data":"a60acadcf2face492eefd0e6780b2ccf1504cae3f54d4278fa711df697bf8ebd"} Oct 07 15:17:04 crc kubenswrapper[4854]: I1007 15:17:04.247606 4854 generic.go:334] "Generic (PLEG): container finished" podID="67498bab-a509-4a66-b955-40ccf9bbba66" containerID="a60acadcf2face492eefd0e6780b2ccf1504cae3f54d4278fa711df697bf8ebd" exitCode=0 Oct 07 15:17:04 crc kubenswrapper[4854]: I1007 15:17:04.247684 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7f8rr" event={"ID":"67498bab-a509-4a66-b955-40ccf9bbba66","Type":"ContainerDied","Data":"a60acadcf2face492eefd0e6780b2ccf1504cae3f54d4278fa711df697bf8ebd"} Oct 07 15:17:06 crc kubenswrapper[4854]: I1007 15:17:06.268729 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7f8rr" event={"ID":"67498bab-a509-4a66-b955-40ccf9bbba66","Type":"ContainerStarted","Data":"7e9cca937610d5ce793e0a35a981de787707a9c25528edac909ec7170a888d6d"} Oct 07 15:17:06 crc kubenswrapper[4854]: I1007 
15:17:06.295894 4854 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7f8rr" podStartSLOduration=3.347164226 podStartE2EDuration="9.295866774s" podCreationTimestamp="2025-10-07 15:16:57 +0000 UTC" firstStartedPulling="2025-10-07 15:16:59.177765247 +0000 UTC m=+10335.165597512" lastFinishedPulling="2025-10-07 15:17:05.126467805 +0000 UTC m=+10341.114300060" observedRunningTime="2025-10-07 15:17:06.289038906 +0000 UTC m=+10342.276871161" watchObservedRunningTime="2025-10-07 15:17:06.295866774 +0000 UTC m=+10342.283699049" Oct 07 15:17:07 crc kubenswrapper[4854]: I1007 15:17:07.482983 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7f8rr" Oct 07 15:17:07 crc kubenswrapper[4854]: I1007 15:17:07.483330 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7f8rr" Oct 07 15:17:08 crc kubenswrapper[4854]: I1007 15:17:08.540773 4854 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7f8rr" podUID="67498bab-a509-4a66-b955-40ccf9bbba66" containerName="registry-server" probeResult="failure" output=< Oct 07 15:17:08 crc kubenswrapper[4854]: timeout: failed to connect service ":50051" within 1s Oct 07 15:17:08 crc kubenswrapper[4854]: > Oct 07 15:17:09 crc kubenswrapper[4854]: I1007 15:17:09.703876 4854 scope.go:117] "RemoveContainer" containerID="c98097bfda43ae85e4184a4e0b7522f850f1927291ab2449b125477b5dc770e9" Oct 07 15:17:09 crc kubenswrapper[4854]: E1007 15:17:09.705314 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 15:17:17 crc kubenswrapper[4854]: I1007 15:17:17.545227 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7f8rr" Oct 07 15:17:17 crc kubenswrapper[4854]: I1007 15:17:17.615722 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7f8rr" Oct 07 15:17:17 crc kubenswrapper[4854]: I1007 15:17:17.786264 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7f8rr"] Oct 07 15:17:19 crc kubenswrapper[4854]: I1007 15:17:19.411777 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7f8rr" podUID="67498bab-a509-4a66-b955-40ccf9bbba66" containerName="registry-server" containerID="cri-o://7e9cca937610d5ce793e0a35a981de787707a9c25528edac909ec7170a888d6d" gracePeriod=2 Oct 07 15:17:19 crc kubenswrapper[4854]: I1007 15:17:19.952215 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7f8rr" Oct 07 15:17:20 crc kubenswrapper[4854]: I1007 15:17:20.022806 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67498bab-a509-4a66-b955-40ccf9bbba66-utilities\") pod \"67498bab-a509-4a66-b955-40ccf9bbba66\" (UID: \"67498bab-a509-4a66-b955-40ccf9bbba66\") " Oct 07 15:17:20 crc kubenswrapper[4854]: I1007 15:17:20.022937 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67498bab-a509-4a66-b955-40ccf9bbba66-catalog-content\") pod \"67498bab-a509-4a66-b955-40ccf9bbba66\" (UID: \"67498bab-a509-4a66-b955-40ccf9bbba66\") " Oct 07 15:17:20 crc kubenswrapper[4854]: I1007 15:17:20.022990 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nf7ct\" (UniqueName: \"kubernetes.io/projected/67498bab-a509-4a66-b955-40ccf9bbba66-kube-api-access-nf7ct\") pod \"67498bab-a509-4a66-b955-40ccf9bbba66\" (UID: \"67498bab-a509-4a66-b955-40ccf9bbba66\") " Oct 07 15:17:20 crc kubenswrapper[4854]: I1007 15:17:20.024255 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67498bab-a509-4a66-b955-40ccf9bbba66-utilities" (OuterVolumeSpecName: "utilities") pod "67498bab-a509-4a66-b955-40ccf9bbba66" (UID: "67498bab-a509-4a66-b955-40ccf9bbba66"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:17:20 crc kubenswrapper[4854]: I1007 15:17:20.029422 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67498bab-a509-4a66-b955-40ccf9bbba66-kube-api-access-nf7ct" (OuterVolumeSpecName: "kube-api-access-nf7ct") pod "67498bab-a509-4a66-b955-40ccf9bbba66" (UID: "67498bab-a509-4a66-b955-40ccf9bbba66"). InnerVolumeSpecName "kube-api-access-nf7ct". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 15:17:20 crc kubenswrapper[4854]: I1007 15:17:20.110240 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67498bab-a509-4a66-b955-40ccf9bbba66-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "67498bab-a509-4a66-b955-40ccf9bbba66" (UID: "67498bab-a509-4a66-b955-40ccf9bbba66"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:17:20 crc kubenswrapper[4854]: I1007 15:17:20.125855 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67498bab-a509-4a66-b955-40ccf9bbba66-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 15:17:20 crc kubenswrapper[4854]: I1007 15:17:20.125901 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67498bab-a509-4a66-b955-40ccf9bbba66-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 15:17:20 crc kubenswrapper[4854]: I1007 15:17:20.125915 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nf7ct\" (UniqueName: \"kubernetes.io/projected/67498bab-a509-4a66-b955-40ccf9bbba66-kube-api-access-nf7ct\") on node \"crc\" DevicePath \"\"" Oct 07 15:17:20 crc kubenswrapper[4854]: I1007 15:17:20.431438 4854 generic.go:334] "Generic (PLEG): container finished" podID="67498bab-a509-4a66-b955-40ccf9bbba66" containerID="7e9cca937610d5ce793e0a35a981de787707a9c25528edac909ec7170a888d6d" exitCode=0 Oct 07 15:17:20 crc kubenswrapper[4854]: I1007 15:17:20.431492 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7f8rr" event={"ID":"67498bab-a509-4a66-b955-40ccf9bbba66","Type":"ContainerDied","Data":"7e9cca937610d5ce793e0a35a981de787707a9c25528edac909ec7170a888d6d"} Oct 07 15:17:20 crc kubenswrapper[4854]: I1007 15:17:20.431522 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7f8rr" event={"ID":"67498bab-a509-4a66-b955-40ccf9bbba66","Type":"ContainerDied","Data":"5d3da1826ce11834dfd24defe1d4ce32028df835eb0948595d8d286777344ae4"} Oct 07 15:17:20 crc kubenswrapper[4854]: I1007 15:17:20.431545 4854 scope.go:117] "RemoveContainer" containerID="7e9cca937610d5ce793e0a35a981de787707a9c25528edac909ec7170a888d6d" Oct 07 15:17:20 crc kubenswrapper[4854]: I1007 15:17:20.431703 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7f8rr" Oct 07 15:17:20 crc kubenswrapper[4854]: I1007 15:17:20.468261 4854 scope.go:117] "RemoveContainer" containerID="a60acadcf2face492eefd0e6780b2ccf1504cae3f54d4278fa711df697bf8ebd" Oct 07 15:17:20 crc kubenswrapper[4854]: I1007 15:17:20.477868 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7f8rr"] Oct 07 15:17:20 crc kubenswrapper[4854]: I1007 15:17:20.488523 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7f8rr"] Oct 07 15:17:20 crc kubenswrapper[4854]: I1007 15:17:20.495384 4854 scope.go:117] "RemoveContainer" containerID="a3f2afbb56e10c6fa35064ea378e153ea399e45bc3d7c00ea516490d1529e89a" Oct 07 15:17:20 crc kubenswrapper[4854]: I1007 15:17:20.550663 4854 scope.go:117] "RemoveContainer" containerID="7e9cca937610d5ce793e0a35a981de787707a9c25528edac909ec7170a888d6d" Oct 07 15:17:20 crc kubenswrapper[4854]: E1007 15:17:20.551669 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e9cca937610d5ce793e0a35a981de787707a9c25528edac909ec7170a888d6d\": container with ID starting with 7e9cca937610d5ce793e0a35a981de787707a9c25528edac909ec7170a888d6d not found: ID does not exist" containerID="7e9cca937610d5ce793e0a35a981de787707a9c25528edac909ec7170a888d6d" Oct 07 15:17:20 crc kubenswrapper[4854]: I1007 15:17:20.551746 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e9cca937610d5ce793e0a35a981de787707a9c25528edac909ec7170a888d6d"} err="failed to get container status \"7e9cca937610d5ce793e0a35a981de787707a9c25528edac909ec7170a888d6d\": rpc error: code = NotFound desc = could not find container \"7e9cca937610d5ce793e0a35a981de787707a9c25528edac909ec7170a888d6d\": container with ID starting with 7e9cca937610d5ce793e0a35a981de787707a9c25528edac909ec7170a888d6d not found: ID does not exist" Oct 07 15:17:20 crc kubenswrapper[4854]: I1007 15:17:20.551780 4854 scope.go:117] "RemoveContainer" containerID="a60acadcf2face492eefd0e6780b2ccf1504cae3f54d4278fa711df697bf8ebd" Oct 07 15:17:20 crc kubenswrapper[4854]: E1007 15:17:20.552994 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a60acadcf2face492eefd0e6780b2ccf1504cae3f54d4278fa711df697bf8ebd\": container with ID starting with a60acadcf2face492eefd0e6780b2ccf1504cae3f54d4278fa711df697bf8ebd not found: ID does not exist" containerID="a60acadcf2face492eefd0e6780b2ccf1504cae3f54d4278fa711df697bf8ebd" Oct 07 15:17:20 crc kubenswrapper[4854]: I1007 15:17:20.553030 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a60acadcf2face492eefd0e6780b2ccf1504cae3f54d4278fa711df697bf8ebd"} err="failed to get container status \"a60acadcf2face492eefd0e6780b2ccf1504cae3f54d4278fa711df697bf8ebd\": rpc error: code = NotFound desc = could not find container \"a60acadcf2face492eefd0e6780b2ccf1504cae3f54d4278fa711df697bf8ebd\": container with ID starting with a60acadcf2face492eefd0e6780b2ccf1504cae3f54d4278fa711df697bf8ebd not found: ID does not exist" Oct 07 15:17:20 crc kubenswrapper[4854]: I1007 15:17:20.553074 4854 scope.go:117] "RemoveContainer" containerID="a3f2afbb56e10c6fa35064ea378e153ea399e45bc3d7c00ea516490d1529e89a" Oct 07 15:17:20 crc kubenswrapper[4854]: E1007 15:17:20.553684 4854 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"a3f2afbb56e10c6fa35064ea378e153ea399e45bc3d7c00ea516490d1529e89a\": container with ID starting with a3f2afbb56e10c6fa35064ea378e153ea399e45bc3d7c00ea516490d1529e89a not found: ID does not exist" containerID="a3f2afbb56e10c6fa35064ea378e153ea399e45bc3d7c00ea516490d1529e89a" Oct 07 15:17:20 crc kubenswrapper[4854]: I1007 15:17:20.553782 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3f2afbb56e10c6fa35064ea378e153ea399e45bc3d7c00ea516490d1529e89a"} err="failed to get container status \"a3f2afbb56e10c6fa35064ea378e153ea399e45bc3d7c00ea516490d1529e89a\": rpc error: code = NotFound desc = could not find container \"a3f2afbb56e10c6fa35064ea378e153ea399e45bc3d7c00ea516490d1529e89a\": container with ID starting with a3f2afbb56e10c6fa35064ea378e153ea399e45bc3d7c00ea516490d1529e89a not found: ID does not exist" Oct 07 15:17:20 crc kubenswrapper[4854]: I1007 15:17:20.722661 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67498bab-a509-4a66-b955-40ccf9bbba66" path="/var/lib/kubelet/pods/67498bab-a509-4a66-b955-40ccf9bbba66/volumes" Oct 07 15:17:23 crc kubenswrapper[4854]: I1007 15:17:23.703132 4854 scope.go:117] "RemoveContainer" containerID="c98097bfda43ae85e4184a4e0b7522f850f1927291ab2449b125477b5dc770e9" Oct 07 15:17:23 crc kubenswrapper[4854]: E1007 15:17:23.704870 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 15:17:38 crc kubenswrapper[4854]: I1007 15:17:38.703188 4854 scope.go:117] "RemoveContainer" containerID="c98097bfda43ae85e4184a4e0b7522f850f1927291ab2449b125477b5dc770e9" Oct 07 15:17:38 crc kubenswrapper[4854]: E1007 15:17:38.704430 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 15:17:52 crc kubenswrapper[4854]: I1007 15:17:52.703215 4854 scope.go:117] "RemoveContainer" containerID="c98097bfda43ae85e4184a4e0b7522f850f1927291ab2449b125477b5dc770e9" Oct 07 15:17:52 crc kubenswrapper[4854]: E1007 15:17:52.704031 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 15:17:58 crc kubenswrapper[4854]: I1007 15:17:58.917294 4854 generic.go:334] "Generic (PLEG): container finished" podID="615e70f0-2488-42bc-9983-aa68a2d699bd" containerID="356037d7fb2faab4a63b5ade74cb91cda4c4acd492dd1603aa97206836a3dca5" exitCode=0 Oct 07 15:17:58 crc kubenswrapper[4854]: I1007 15:17:58.917374 4854 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-must-gather-vnj8v/must-gather-nkk6l" event={"ID":"615e70f0-2488-42bc-9983-aa68a2d699bd","Type":"ContainerDied","Data":"356037d7fb2faab4a63b5ade74cb91cda4c4acd492dd1603aa97206836a3dca5"} Oct 07 15:17:58 crc kubenswrapper[4854]: I1007 15:17:58.918817 4854 scope.go:117] "RemoveContainer" containerID="356037d7fb2faab4a63b5ade74cb91cda4c4acd492dd1603aa97206836a3dca5" Oct 07 15:17:59 crc kubenswrapper[4854]: I1007 15:17:59.685100 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vnj8v_must-gather-nkk6l_615e70f0-2488-42bc-9983-aa68a2d699bd/gather/0.log" Oct 07 15:18:02 crc kubenswrapper[4854]: E1007 15:18:02.932968 4854 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.243:55524->38.102.83.243:43445: write tcp 38.102.83.243:55524->38.102.83.243:43445: write: broken pipe Oct 07 15:18:05 crc kubenswrapper[4854]: I1007 15:18:05.703734 4854 scope.go:117] "RemoveContainer" containerID="c98097bfda43ae85e4184a4e0b7522f850f1927291ab2449b125477b5dc770e9" Oct 07 15:18:05 crc kubenswrapper[4854]: E1007 15:18:05.704446 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 15:18:08 crc kubenswrapper[4854]: I1007 15:18:08.816723 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vnj8v/must-gather-nkk6l"] Oct 07 15:18:08 crc kubenswrapper[4854]: I1007 15:18:08.818249 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-vnj8v/must-gather-nkk6l" podUID="615e70f0-2488-42bc-9983-aa68a2d699bd" containerName="copy" containerID="cri-o://41bcd2afd5423398a6a7776d518eda9adedca10e4a8fd8d9799f4348681dd803" gracePeriod=2 Oct 07 15:18:08 crc kubenswrapper[4854]: I1007 15:18:08.829986 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vnj8v/must-gather-nkk6l"] Oct 07 15:18:09 crc kubenswrapper[4854]: I1007 15:18:09.058821 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vnj8v_must-gather-nkk6l_615e70f0-2488-42bc-9983-aa68a2d699bd/copy/0.log" Oct 07 15:18:09 crc kubenswrapper[4854]: I1007 15:18:09.059712 4854 generic.go:334] "Generic (PLEG): container finished" podID="615e70f0-2488-42bc-9983-aa68a2d699bd" containerID="41bcd2afd5423398a6a7776d518eda9adedca10e4a8fd8d9799f4348681dd803" exitCode=143 Oct 07 15:18:09 crc kubenswrapper[4854]: I1007 15:18:09.261086 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vnj8v_must-gather-nkk6l_615e70f0-2488-42bc-9983-aa68a2d699bd/copy/0.log" Oct 07 15:18:09 crc kubenswrapper[4854]: I1007 15:18:09.261513 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vnj8v/must-gather-nkk6l" Oct 07 15:18:09 crc kubenswrapper[4854]: I1007 15:18:09.421934 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/615e70f0-2488-42bc-9983-aa68a2d699bd-must-gather-output\") pod \"615e70f0-2488-42bc-9983-aa68a2d699bd\" (UID: \"615e70f0-2488-42bc-9983-aa68a2d699bd\") " Oct 07 15:18:09 crc kubenswrapper[4854]: I1007 15:18:09.422266 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2d2x\" (UniqueName: \"kubernetes.io/projected/615e70f0-2488-42bc-9983-aa68a2d699bd-kube-api-access-t2d2x\") pod \"615e70f0-2488-42bc-9983-aa68a2d699bd\" (UID: \"615e70f0-2488-42bc-9983-aa68a2d699bd\") " Oct 07 15:18:09 crc kubenswrapper[4854]: I1007 15:18:09.430950 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/615e70f0-2488-42bc-9983-aa68a2d699bd-kube-api-access-t2d2x" (OuterVolumeSpecName: "kube-api-access-t2d2x") pod "615e70f0-2488-42bc-9983-aa68a2d699bd" (UID: "615e70f0-2488-42bc-9983-aa68a2d699bd"). InnerVolumeSpecName "kube-api-access-t2d2x". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 15:18:09 crc kubenswrapper[4854]: I1007 15:18:09.526669 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2d2x\" (UniqueName: \"kubernetes.io/projected/615e70f0-2488-42bc-9983-aa68a2d699bd-kube-api-access-t2d2x\") on node \"crc\" DevicePath \"\"" Oct 07 15:18:09 crc kubenswrapper[4854]: I1007 15:18:09.668900 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/615e70f0-2488-42bc-9983-aa68a2d699bd-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "615e70f0-2488-42bc-9983-aa68a2d699bd" (UID: "615e70f0-2488-42bc-9983-aa68a2d699bd"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:18:09 crc kubenswrapper[4854]: I1007 15:18:09.734304 4854 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/615e70f0-2488-42bc-9983-aa68a2d699bd-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 07 15:18:10 crc kubenswrapper[4854]: I1007 15:18:10.074222 4854 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vnj8v_must-gather-nkk6l_615e70f0-2488-42bc-9983-aa68a2d699bd/copy/0.log" Oct 07 15:18:10 crc kubenswrapper[4854]: I1007 15:18:10.074692 4854 scope.go:117] "RemoveContainer" containerID="41bcd2afd5423398a6a7776d518eda9adedca10e4a8fd8d9799f4348681dd803" Oct 07 15:18:10 crc kubenswrapper[4854]: I1007 15:18:10.074777 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vnj8v/must-gather-nkk6l" Oct 07 15:18:10 crc kubenswrapper[4854]: I1007 15:18:10.115805 4854 scope.go:117] "RemoveContainer" containerID="356037d7fb2faab4a63b5ade74cb91cda4c4acd492dd1603aa97206836a3dca5" Oct 07 15:18:10 crc kubenswrapper[4854]: I1007 15:18:10.720477 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="615e70f0-2488-42bc-9983-aa68a2d699bd" path="/var/lib/kubelet/pods/615e70f0-2488-42bc-9983-aa68a2d699bd/volumes" Oct 07 15:18:16 crc kubenswrapper[4854]: I1007 15:18:16.702998 4854 scope.go:117] "RemoveContainer" containerID="c98097bfda43ae85e4184a4e0b7522f850f1927291ab2449b125477b5dc770e9" Oct 07 15:18:16 crc kubenswrapper[4854]: E1007 15:18:16.703768 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 15:18:30 crc kubenswrapper[4854]: I1007 15:18:30.702640 4854 scope.go:117] "RemoveContainer" containerID="c98097bfda43ae85e4184a4e0b7522f850f1927291ab2449b125477b5dc770e9" Oct 07 15:18:30 crc kubenswrapper[4854]: E1007 15:18:30.703327 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 15:18:45 crc kubenswrapper[4854]: I1007 15:18:45.703748 4854 scope.go:117] "RemoveContainer" containerID="c98097bfda43ae85e4184a4e0b7522f850f1927291ab2449b125477b5dc770e9" Oct 07 15:18:45 crc kubenswrapper[4854]: E1007 15:18:45.704549 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 15:18:56 crc kubenswrapper[4854]: I1007 15:18:56.649097 4854 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qcxnk"] Oct 07 15:18:56 crc kubenswrapper[4854]: E1007 15:18:56.650467 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="615e70f0-2488-42bc-9983-aa68a2d699bd" containerName="gather" Oct 07 15:18:56 crc kubenswrapper[4854]: I1007 15:18:56.650491 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="615e70f0-2488-42bc-9983-aa68a2d699bd" containerName="gather" Oct 07 15:18:56 crc kubenswrapper[4854]: E1007 15:18:56.650515 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67498bab-a509-4a66-b955-40ccf9bbba66" containerName="registry-server" Oct 07 15:18:56 crc kubenswrapper[4854]: I1007 15:18:56.650528 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="67498bab-a509-4a66-b955-40ccf9bbba66" containerName="registry-server" Oct 07 15:18:56 crc kubenswrapper[4854]: E1007 15:18:56.650553 4854 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="615e70f0-2488-42bc-9983-aa68a2d699bd" containerName="copy" Oct 07 15:18:56 crc kubenswrapper[4854]: I1007 15:18:56.650566 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="615e70f0-2488-42bc-9983-aa68a2d699bd" containerName="copy" Oct 07 15:18:56 crc kubenswrapper[4854]: E1007 15:18:56.650618 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67498bab-a509-4a66-b955-40ccf9bbba66" containerName="extract-utilities" Oct 07 15:18:56 crc kubenswrapper[4854]: I1007 15:18:56.650630 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="67498bab-a509-4a66-b955-40ccf9bbba66" containerName="extract-utilities" Oct 07 15:18:56 crc kubenswrapper[4854]: E1007 15:18:56.650665 4854 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67498bab-a509-4a66-b955-40ccf9bbba66" containerName="extract-content" Oct 07 15:18:56 crc kubenswrapper[4854]: I1007 15:18:56.650676 4854 state_mem.go:107] "Deleted CPUSet assignment" podUID="67498bab-a509-4a66-b955-40ccf9bbba66" containerName="extract-content" Oct 07 15:18:56 crc kubenswrapper[4854]: I1007 15:18:56.651008 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="67498bab-a509-4a66-b955-40ccf9bbba66" containerName="registry-server" Oct 07 15:18:56 crc kubenswrapper[4854]: I1007 15:18:56.651038 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="615e70f0-2488-42bc-9983-aa68a2d699bd" containerName="gather" Oct 07 15:18:56 crc kubenswrapper[4854]: I1007 15:18:56.651057 4854 memory_manager.go:354] "RemoveStaleState removing state" podUID="615e70f0-2488-42bc-9983-aa68a2d699bd" containerName="copy" Oct 07 15:18:56 crc kubenswrapper[4854]: I1007 15:18:56.653785 4854 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qcxnk" Oct 07 15:18:56 crc kubenswrapper[4854]: I1007 15:18:56.661683 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qcxnk"] Oct 07 15:18:56 crc kubenswrapper[4854]: I1007 15:18:56.816103 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40acbc45-96f6-432e-b66b-f21cd30b2f56-utilities\") pod \"community-operators-qcxnk\" (UID: \"40acbc45-96f6-432e-b66b-f21cd30b2f56\") " pod="openshift-marketplace/community-operators-qcxnk" Oct 07 15:18:56 crc kubenswrapper[4854]: I1007 15:18:56.816187 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40acbc45-96f6-432e-b66b-f21cd30b2f56-catalog-content\") pod \"community-operators-qcxnk\" (UID: \"40acbc45-96f6-432e-b66b-f21cd30b2f56\") " pod="openshift-marketplace/community-operators-qcxnk" Oct 07 15:18:56 crc kubenswrapper[4854]: I1007 15:18:56.816232 4854 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5c9f\" (UniqueName: \"kubernetes.io/projected/40acbc45-96f6-432e-b66b-f21cd30b2f56-kube-api-access-x5c9f\") pod \"community-operators-qcxnk\" (UID: \"40acbc45-96f6-432e-b66b-f21cd30b2f56\") " pod="openshift-marketplace/community-operators-qcxnk" Oct 07 15:18:56 crc kubenswrapper[4854]: I1007 15:18:56.918357 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40acbc45-96f6-432e-b66b-f21cd30b2f56-utilities\") pod \"community-operators-qcxnk\" (UID: \"40acbc45-96f6-432e-b66b-f21cd30b2f56\") " pod="openshift-marketplace/community-operators-qcxnk" Oct 07 15:18:56 crc kubenswrapper[4854]: I1007 15:18:56.918440 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40acbc45-96f6-432e-b66b-f21cd30b2f56-catalog-content\") pod \"community-operators-qcxnk\" (UID: \"40acbc45-96f6-432e-b66b-f21cd30b2f56\") " pod="openshift-marketplace/community-operators-qcxnk" Oct 07 15:18:56 crc kubenswrapper[4854]: I1007 15:18:56.918514 4854 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5c9f\" (UniqueName: \"kubernetes.io/projected/40acbc45-96f6-432e-b66b-f21cd30b2f56-kube-api-access-x5c9f\") pod \"community-operators-qcxnk\" (UID: \"40acbc45-96f6-432e-b66b-f21cd30b2f56\") " pod="openshift-marketplace/community-operators-qcxnk" Oct 07 15:18:56 crc kubenswrapper[4854]: I1007 15:18:56.919010 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40acbc45-96f6-432e-b66b-f21cd30b2f56-utilities\") pod \"community-operators-qcxnk\" (UID: \"40acbc45-96f6-432e-b66b-f21cd30b2f56\") " pod="openshift-marketplace/community-operators-qcxnk" Oct 07 15:18:56 crc kubenswrapper[4854]: I1007 15:18:56.919319 4854 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40acbc45-96f6-432e-b66b-f21cd30b2f56-catalog-content\") pod \"community-operators-qcxnk\" (UID: \"40acbc45-96f6-432e-b66b-f21cd30b2f56\") " pod="openshift-marketplace/community-operators-qcxnk" Oct 07 15:18:56 crc kubenswrapper[4854]: I1007 15:18:56.939907 4854 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-x5c9f\" (UniqueName: \"kubernetes.io/projected/40acbc45-96f6-432e-b66b-f21cd30b2f56-kube-api-access-x5c9f\") pod \"community-operators-qcxnk\" (UID: \"40acbc45-96f6-432e-b66b-f21cd30b2f56\") " pod="openshift-marketplace/community-operators-qcxnk" Oct 07 15:18:57 crc kubenswrapper[4854]: I1007 15:18:57.033963 4854 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qcxnk" Oct 07 15:18:57 crc kubenswrapper[4854]: I1007 15:18:57.576826 4854 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qcxnk"] Oct 07 15:18:57 crc kubenswrapper[4854]: I1007 15:18:57.661793 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qcxnk" event={"ID":"40acbc45-96f6-432e-b66b-f21cd30b2f56","Type":"ContainerStarted","Data":"e8cd8a0d85988846ad3957e379f084c5c0fe3d3db00dfdee66b13c930e8f8189"} Oct 07 15:18:58 crc kubenswrapper[4854]: I1007 15:18:58.679394 4854 generic.go:334] "Generic (PLEG): container finished" podID="40acbc45-96f6-432e-b66b-f21cd30b2f56" containerID="615e962efce733f389c6bdc2c5e2a29debd01bdfc535903651d7548f0c6eb7aa" exitCode=0 Oct 07 15:18:58 crc kubenswrapper[4854]: I1007 15:18:58.679738 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qcxnk" event={"ID":"40acbc45-96f6-432e-b66b-f21cd30b2f56","Type":"ContainerDied","Data":"615e962efce733f389c6bdc2c5e2a29debd01bdfc535903651d7548f0c6eb7aa"} Oct 07 15:18:58 crc kubenswrapper[4854]: I1007 15:18:58.703115 4854 scope.go:117] "RemoveContainer" containerID="c98097bfda43ae85e4184a4e0b7522f850f1927291ab2449b125477b5dc770e9" Oct 07 15:18:58 crc kubenswrapper[4854]: E1007 15:18:58.703711 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 15:18:59 crc kubenswrapper[4854]: I1007 15:18:59.691917 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qcxnk" event={"ID":"40acbc45-96f6-432e-b66b-f21cd30b2f56","Type":"ContainerStarted","Data":"522c49191b80f14c133daba0b9b70a9d4b43453d1f52f5025a3049a3f0f17a2a"} Oct 07 15:19:01 crc kubenswrapper[4854]: I1007 15:19:01.717292 4854 generic.go:334] "Generic (PLEG): container finished" podID="40acbc45-96f6-432e-b66b-f21cd30b2f56" containerID="522c49191b80f14c133daba0b9b70a9d4b43453d1f52f5025a3049a3f0f17a2a" exitCode=0 Oct 07 15:19:01 crc kubenswrapper[4854]: I1007 15:19:01.717389 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qcxnk" event={"ID":"40acbc45-96f6-432e-b66b-f21cd30b2f56","Type":"ContainerDied","Data":"522c49191b80f14c133daba0b9b70a9d4b43453d1f52f5025a3049a3f0f17a2a"} Oct 07 15:19:02 crc kubenswrapper[4854]: I1007 15:19:02.736362 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qcxnk" event={"ID":"40acbc45-96f6-432e-b66b-f21cd30b2f56","Type":"ContainerStarted","Data":"6ac8b0c7acc44c31e4779160545336a02eccc65d15374361fef0f5834006d43e"} Oct 07 15:19:02 crc kubenswrapper[4854]: I1007 15:19:02.764789 4854 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qcxnk" podStartSLOduration=3.289238306 podStartE2EDuration="6.764765123s" podCreationTimestamp="2025-10-07 15:18:56 +0000 UTC" firstStartedPulling="2025-10-07 15:18:58.683028033 +0000 UTC m=+10454.670860288" lastFinishedPulling="2025-10-07 15:19:02.15855485 +0000 UTC m=+10458.146387105" observedRunningTime="2025-10-07 15:19:02.760078817 +0000 UTC m=+10458.747911082" watchObservedRunningTime="2025-10-07 15:19:02.764765123 +0000 UTC m=+10458.752597368" Oct 07 15:19:07 crc kubenswrapper[4854]: I1007 15:19:07.034854 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qcxnk" Oct 07 15:19:07 crc kubenswrapper[4854]: I1007 15:19:07.035496 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qcxnk" Oct 07 15:19:07 crc kubenswrapper[4854]: I1007 15:19:07.113386 4854 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qcxnk" Oct 07 15:19:07 crc kubenswrapper[4854]: I1007 15:19:07.840600 4854 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qcxnk" Oct 07 15:19:07 crc kubenswrapper[4854]: I1007 15:19:07.888107 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qcxnk"] Oct 07 15:19:09 crc kubenswrapper[4854]: I1007 15:19:09.810405 4854 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qcxnk" podUID="40acbc45-96f6-432e-b66b-f21cd30b2f56" containerName="registry-server" containerID="cri-o://6ac8b0c7acc44c31e4779160545336a02eccc65d15374361fef0f5834006d43e" gracePeriod=2 Oct 07 15:19:10 crc kubenswrapper[4854]: I1007 15:19:10.409863 4854 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qcxnk" Oct 07 15:19:10 crc kubenswrapper[4854]: I1007 15:19:10.540279 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40acbc45-96f6-432e-b66b-f21cd30b2f56-catalog-content\") pod \"40acbc45-96f6-432e-b66b-f21cd30b2f56\" (UID: \"40acbc45-96f6-432e-b66b-f21cd30b2f56\") " Oct 07 15:19:10 crc kubenswrapper[4854]: I1007 15:19:10.540813 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40acbc45-96f6-432e-b66b-f21cd30b2f56-utilities\") pod \"40acbc45-96f6-432e-b66b-f21cd30b2f56\" (UID: \"40acbc45-96f6-432e-b66b-f21cd30b2f56\") " Oct 07 15:19:10 crc kubenswrapper[4854]: I1007 15:19:10.540921 4854 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5c9f\" (UniqueName: \"kubernetes.io/projected/40acbc45-96f6-432e-b66b-f21cd30b2f56-kube-api-access-x5c9f\") pod \"40acbc45-96f6-432e-b66b-f21cd30b2f56\" (UID: \"40acbc45-96f6-432e-b66b-f21cd30b2f56\") " Oct 07 15:19:10 crc kubenswrapper[4854]: I1007 15:19:10.542925 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40acbc45-96f6-432e-b66b-f21cd30b2f56-utilities" (OuterVolumeSpecName: "utilities") pod "40acbc45-96f6-432e-b66b-f21cd30b2f56" (UID: "40acbc45-96f6-432e-b66b-f21cd30b2f56"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:19:10 crc kubenswrapper[4854]: I1007 15:19:10.552422 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40acbc45-96f6-432e-b66b-f21cd30b2f56-kube-api-access-x5c9f" (OuterVolumeSpecName: "kube-api-access-x5c9f") pod "40acbc45-96f6-432e-b66b-f21cd30b2f56" (UID: "40acbc45-96f6-432e-b66b-f21cd30b2f56"). InnerVolumeSpecName "kube-api-access-x5c9f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 15:19:10 crc kubenswrapper[4854]: I1007 15:19:10.619802 4854 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40acbc45-96f6-432e-b66b-f21cd30b2f56-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "40acbc45-96f6-432e-b66b-f21cd30b2f56" (UID: "40acbc45-96f6-432e-b66b-f21cd30b2f56"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:19:10 crc kubenswrapper[4854]: I1007 15:19:10.643698 4854 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40acbc45-96f6-432e-b66b-f21cd30b2f56-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 15:19:10 crc kubenswrapper[4854]: I1007 15:19:10.643760 4854 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40acbc45-96f6-432e-b66b-f21cd30b2f56-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 15:19:10 crc kubenswrapper[4854]: I1007 15:19:10.643781 4854 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x5c9f\" (UniqueName: \"kubernetes.io/projected/40acbc45-96f6-432e-b66b-f21cd30b2f56-kube-api-access-x5c9f\") on node \"crc\" DevicePath \"\"" Oct 07 15:19:10 crc kubenswrapper[4854]: I1007 15:19:10.821320 4854 generic.go:334] "Generic (PLEG): container finished" podID="40acbc45-96f6-432e-b66b-f21cd30b2f56" containerID="6ac8b0c7acc44c31e4779160545336a02eccc65d15374361fef0f5834006d43e" exitCode=0 Oct 07 15:19:10 crc kubenswrapper[4854]: I1007 15:19:10.821370 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qcxnk" event={"ID":"40acbc45-96f6-432e-b66b-f21cd30b2f56","Type":"ContainerDied","Data":"6ac8b0c7acc44c31e4779160545336a02eccc65d15374361fef0f5834006d43e"} Oct 07 15:19:10 crc kubenswrapper[4854]: I1007 15:19:10.821401 4854 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qcxnk" event={"ID":"40acbc45-96f6-432e-b66b-f21cd30b2f56","Type":"ContainerDied","Data":"e8cd8a0d85988846ad3957e379f084c5c0fe3d3db00dfdee66b13c930e8f8189"} Oct 07 15:19:10 crc kubenswrapper[4854]: I1007 15:19:10.821420 4854 scope.go:117] "RemoveContainer" containerID="6ac8b0c7acc44c31e4779160545336a02eccc65d15374361fef0f5834006d43e" Oct 07 15:19:10 crc kubenswrapper[4854]: I1007 15:19:10.821586 4854 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qcxnk" Oct 07 15:19:10 crc kubenswrapper[4854]: I1007 15:19:10.857705 4854 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qcxnk"] Oct 07 15:19:10 crc kubenswrapper[4854]: I1007 15:19:10.864550 4854 scope.go:117] "RemoveContainer" containerID="522c49191b80f14c133daba0b9b70a9d4b43453d1f52f5025a3049a3f0f17a2a" Oct 07 15:19:10 crc kubenswrapper[4854]: I1007 15:19:10.873990 4854 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qcxnk"] Oct 07 15:19:10 crc kubenswrapper[4854]: I1007 15:19:10.890959 4854 scope.go:117] "RemoveContainer" containerID="615e962efce733f389c6bdc2c5e2a29debd01bdfc535903651d7548f0c6eb7aa" Oct 07 15:19:10 crc kubenswrapper[4854]: I1007 15:19:10.950471 4854 scope.go:117] "RemoveContainer" containerID="6ac8b0c7acc44c31e4779160545336a02eccc65d15374361fef0f5834006d43e" Oct 07 15:19:10 crc kubenswrapper[4854]: E1007 15:19:10.950985 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ac8b0c7acc44c31e4779160545336a02eccc65d15374361fef0f5834006d43e\": container with ID starting with 6ac8b0c7acc44c31e4779160545336a02eccc65d15374361fef0f5834006d43e not found: ID does not exist" containerID="6ac8b0c7acc44c31e4779160545336a02eccc65d15374361fef0f5834006d43e" Oct 07 15:19:10 crc kubenswrapper[4854]: I1007 15:19:10.951049 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ac8b0c7acc44c31e4779160545336a02eccc65d15374361fef0f5834006d43e"} err="failed to get container status \"6ac8b0c7acc44c31e4779160545336a02eccc65d15374361fef0f5834006d43e\": rpc error: code = NotFound desc = could not find container \"6ac8b0c7acc44c31e4779160545336a02eccc65d15374361fef0f5834006d43e\": container with ID starting with 6ac8b0c7acc44c31e4779160545336a02eccc65d15374361fef0f5834006d43e not found: ID does not exist" Oct 07 15:19:10 crc kubenswrapper[4854]: I1007 15:19:10.951086 4854 scope.go:117] "RemoveContainer" containerID="522c49191b80f14c133daba0b9b70a9d4b43453d1f52f5025a3049a3f0f17a2a" Oct 07 15:19:10 crc kubenswrapper[4854]: E1007 15:19:10.951638 4854 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"522c49191b80f14c133daba0b9b70a9d4b43453d1f52f5025a3049a3f0f17a2a\": container with ID starting with 522c49191b80f14c133daba0b9b70a9d4b43453d1f52f5025a3049a3f0f17a2a not found: ID does not exist" containerID="522c49191b80f14c133daba0b9b70a9d4b43453d1f52f5025a3049a3f0f17a2a" Oct 07 15:19:10 crc kubenswrapper[4854]: I1007 15:19:10.951679 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"522c49191b80f14c133daba0b9b70a9d4b43453d1f52f5025a3049a3f0f17a2a"} err="failed to get container status \"522c49191b80f14c133daba0b9b70a9d4b43453d1f52f5025a3049a3f0f17a2a\": rpc error: code = NotFound desc = could not find container \"522c49191b80f14c133daba0b9b70a9d4b43453d1f52f5025a3049a3f0f17a2a\": container with ID starting with 522c49191b80f14c133daba0b9b70a9d4b43453d1f52f5025a3049a3f0f17a2a not found: ID does not exist" Oct 07 15:19:10 crc kubenswrapper[4854]: I1007 15:19:10.951699 4854 scope.go:117] "RemoveContainer" containerID="615e962efce733f389c6bdc2c5e2a29debd01bdfc535903651d7548f0c6eb7aa" Oct 07 15:19:10 crc kubenswrapper[4854]: E1007 15:19:10.952029 4854 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"615e962efce733f389c6bdc2c5e2a29debd01bdfc535903651d7548f0c6eb7aa\": container with ID starting with 615e962efce733f389c6bdc2c5e2a29debd01bdfc535903651d7548f0c6eb7aa not found: ID does not exist" containerID="615e962efce733f389c6bdc2c5e2a29debd01bdfc535903651d7548f0c6eb7aa" Oct 07 15:19:10 crc kubenswrapper[4854]: I1007 15:19:10.952083 4854 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"615e962efce733f389c6bdc2c5e2a29debd01bdfc535903651d7548f0c6eb7aa"} err="failed to get container status \"615e962efce733f389c6bdc2c5e2a29debd01bdfc535903651d7548f0c6eb7aa\": rpc error: code = NotFound desc = could not find container \"615e962efce733f389c6bdc2c5e2a29debd01bdfc535903651d7548f0c6eb7aa\": container with ID starting with 615e962efce733f389c6bdc2c5e2a29debd01bdfc535903651d7548f0c6eb7aa not found: ID does not exist" Oct 07 15:19:11 crc kubenswrapper[4854]: I1007 15:19:11.703422 4854 scope.go:117] "RemoveContainer" containerID="c98097bfda43ae85e4184a4e0b7522f850f1927291ab2449b125477b5dc770e9" Oct 07 15:19:11 crc kubenswrapper[4854]: E1007 15:19:11.703898 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 15:19:12 crc kubenswrapper[4854]: I1007 15:19:12.721450 4854 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40acbc45-96f6-432e-b66b-f21cd30b2f56" path="/var/lib/kubelet/pods/40acbc45-96f6-432e-b66b-f21cd30b2f56/volumes" Oct 07 15:19:22 crc kubenswrapper[4854]: I1007 15:19:22.703208 4854 scope.go:117] "RemoveContainer" containerID="c98097bfda43ae85e4184a4e0b7522f850f1927291ab2449b125477b5dc770e9" Oct 07 15:19:22 crc kubenswrapper[4854]: E1007 15:19:22.704083 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 15:19:30 crc kubenswrapper[4854]: I1007 15:19:30.208640 4854 scope.go:117] "RemoveContainer" containerID="801b703c9b10e1f9ac6da404e03c648a8febeb6c2d328b1783447dc5f3dfd9eb" Oct 07 15:19:30 crc kubenswrapper[4854]: I1007 15:19:30.857030 4854 scope.go:117] "RemoveContainer" containerID="946ec129de886251554ef4fc2234100cd1256244d7d1282e99e83013cf40bad7" Oct 07 15:19:34 crc kubenswrapper[4854]: I1007 15:19:34.710746 4854 scope.go:117] "RemoveContainer" containerID="c98097bfda43ae85e4184a4e0b7522f850f1927291ab2449b125477b5dc770e9" Oct 07 15:19:34 crc kubenswrapper[4854]: E1007 15:19:34.712433 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" 
podUID="40b8b82d-cfd5-41d7-8673-5774db092c85" Oct 07 15:19:48 crc kubenswrapper[4854]: I1007 15:19:48.703586 4854 scope.go:117] "RemoveContainer" containerID="c98097bfda43ae85e4184a4e0b7522f850f1927291ab2449b125477b5dc770e9" Oct 07 15:19:48 crc kubenswrapper[4854]: E1007 15:19:48.705275 4854 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vbjnw_openshift-machine-config-operator(40b8b82d-cfd5-41d7-8673-5774db092c85)\"" pod="openshift-machine-config-operator/machine-config-daemon-vbjnw" podUID="40b8b82d-cfd5-41d7-8673-5774db092c85"